Mixed methods evaluation of a stream monitoring citizen science program
Abstract
Citizen science is defined as engaging the public in scientific research endeavors. Within this field, water monitoring remains one of the most popular branches. Evaluation of citizen science projects has shifted from programmatic outcomes to participant learning outcomes, but the field still struggles with common frameworks and appropriate methodologies. Blue Thumb is a water monitoring citizen science program centered in Oklahoma, United States, with the purpose of stream protection through education. In collaboration with Blue Thumb administrators, the purpose of this research was to better understand how volunteer experience in place-based and data-rich programs influences science learning, motivations, and data quality. This research used a multi-phase mixed-methods evaluation consisting of three distinct phases: 1) survey tools from the Cornell Lab of Ornithology to assess the impact of participation in Blue Thumb on individual learning outcomes, such as behavior and stewardship, interest, and motivation; 2) interviews about place-based motivations for volunteerism; and 3) correlation analysis of data quality with participants' self-efficacy and skills. Triangulation of the datasets reveals that Blue Thumb volunteers have high levels of self-reported pro-environmental behaviors, interest in science, and motivation for environmental action. Further, volunteers form attachments to the places being monitored, which in turn influences their motivation for continued participation. Finally, Blue Thumb volunteers had high levels of self-efficacy and skill due to programmatic support, and in turn produced accurate and reliable data at quality assurance events. Collectively, these studies address existing gaps in the literature concerning appropriate volunteer outcome evaluation methodology, the connection between place-based attachments and volunteer motivations in environmental monitoring, and volunteer perceptions of data quality.
This research supports the necessity of mixed methodologies in citizen science evaluation, as they are more insightful than survey items alone and provide context specific to individual programs. Future research will benefit from the inclusion of more robust and inclusive evaluation tools that consider volunteers' individual experiences and narratives.
Collections
- OSU Dissertations [11222]