**This blog entry originally appeared on the website oceanspaces.org.**

If the goal of citizen science is at least partially to educate volunteers about the subject at hand, then how can you expect new volunteers to take good data? This chicken-and-egg conundrum pops up periodically, often with a skeptical eye toward whether volunteer-collected data are good enough to contribute to science. On one hand, you want the volunteers to do and learn as much as possible; on the other hand, you want some credentialed people overseeing at least the more complicated parts of the protocol.

*Image: Rockfish caught as part of the California Collaborative Fisheries Program. Some species of rockfish are notoriously hard to identify, even by well-established experts, and exemplify the education-data quality nexus.*

This particular myth was suggested by Caren Cooper, of the Cornell Lab of Ornithology, as something to investigate. She suspected that in some cases the tradeoff is a myth, while in other cases it may be a very valid concern. The world of birding is an unusual vantage point from which to consider this question, because by and large birders are experts in their own right, keep excellent records, and have proven themselves worthy data collectors since the beginning of the Audubon citizen science programs early last century. For them, the balance tips in favor of good data – but what does the broader field think about this tradeoff?

The general consensus seems to be that the tradeoff is an active concern that needs to be addressed, but it is not necessarily an either/or situation. As Travis Windlehearth wrote in his dissertation about why and how museums engage with citizen science, “citizen science programs in museums are situated at the intersection of research and education, and as such may have the potential to unite disparate efforts to achieve larger institutional goals.”

At the very least, once volunteers have participated for a while, they learn the methods and content and can hopefully be trusted to contribute quality data. Sometimes this takes the form of a trial period, during which data is double-checked and professionals accompany volunteers on each data-collecting outing. This trial period is also where the tension between education and data quality is easiest to see.

However, there may be some ways to ease that training period without acting like Big Brother to the volunteers. Take, for example, the species observation data that so many citizen science programs rely on – instead of requiring a species identification alone and trusting the volunteer's skills, also require a picture. In a letter to the editor of Nature, Jeffrey Parsons and colleagues write that “participants should be given the option to report a sighting in terms of observed attributes, eliminating the need to force a (possibly incorrect) classification.” An expert can then look at the photos or descriptions and make sure the classifications are correct.
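
To make that concrete, here is a minimal sketch in Python of what such an observation record might look like. All field names and values are hypothetical, not drawn from any particular program; the point is simply that the classification fields are optional while the photo and observed attributes are not, so an unsure volunteer is never forced to guess.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Sighting:
    """One volunteer observation. The species fields are optional, so a
    participant can report what they actually saw (attributes + photo)
    without being forced into a possibly incorrect classification."""
    observer: str
    photo_url: str
    attributes: dict = field(default_factory=dict)
    reported_species: Optional[str] = None   # volunteer's ID, if offered
    verified_species: Optional[str] = None   # set later by an expert reviewer

# A volunteer unsure of the species reports attributes and a photo only:
s = Sighting(
    observer="volunteer_42",
    photo_url="https://example.org/obs/123.jpg",
    attributes={"color": "olive-brown", "length_cm": 30, "dorsal_fin": "spiny"},
)

# An expert later reviews the photo and attributes and records the ID
# (species shown here is just an illustrative example):
s.verified_species = "Sebastes carnatus"
```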

Sometimes the advantage of citizen science is the sheer amount of data that can be collected, and in that case the expert may not have enough time to double-check every record. Snapshot Serengeti, like many of its sister projects on Zooniverse, relies on volunteers to check themselves by having each image in the project identified by at least 10 people. Even in hard cases, the group can converge on the correct identification – and as the program organizer points out, even experts make mistakes like clicking the wrong button, so this method cleans up those logistical errors as well.
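
For a sense of how this kind of self-checking can work, here is a hedged sketch of a plurality-vote aggregator in Python. This is not Snapshot Serengeti's actual algorithm; the `min_votes` and `threshold` parameters are made-up knobs standing in for whatever rule a real project uses, with hard cases falling through to expert review.

```python
from collections import Counter

def consensus_id(votes, min_votes=10, threshold=0.6):
    """Aggregate independent volunteer classifications of one image.

    Accept the plurality answer once enough volunteers have voted and
    the leading answer passes an agreement threshold; otherwise flag
    the image for expert review. (Illustrative rule only.)
    """
    if len(votes) < min_votes:
        return None, "needs more votes"
    species, count = Counter(votes).most_common(1)[0]
    if count / len(votes) >= threshold:
        return species, "consensus"
    return species, "flag for expert review"

votes = ["zebra"] * 8 + ["wildebeest"] * 2   # ten independent classifications
print(consensus_id(votes))                   # ('zebra', 'consensus')
```

A nice side effect of this design is exactly the one the program organizer describes: a single mis-click is simply outvoted, so the aggregation step absorbs both honest misidentifications and logistical slips.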

The tradeoff rears its head again when deciding what to do once the data is collected and recorded. Data analysis can stymie even professionals, especially when it requires knowledge of particular statistical or technical methods. In almost all cases, a professional is needed here as a consultant or guide to the analysis process. Still, a few extra perspectives never hurt in this regard, and they can offer opportunities to interactively educate volunteers about additional parts of the scientific process.

The bottom line is that the tradeoff is very much a reality, but prioritizing education does not mean sacrificing data quality. It does mean being creative about methods and process to make sure both goals are met, and it will certainly require more staff time for oversight and volunteer training. Educational goals also take time to consciously foster, whether they are meeting state curriculum standards or public science literacy targets. Figuring out how those goals link to the activities of a citizen science program takes time as well, and likely requires a person dedicated to curriculum development to be effective. So the tradeoff is not education versus data quality, but a matter of where staff time and funding are directed.

We welcome perspectives and experiences from readers who have walked the line between education and data quality.
In addition, if there are myths about citizen science you’d like to see investigated in this monthly series, send them our way!