Do you see what I see? Quantifying inter-observer variability in an intertidal marine citizen science experiment

Hannah Earp, Siobhan Vye, Katrin Bohn, Michael Burrows, Jade Chenery, Stephanie Dickens, Charlotte Foster, Hannah Grist, Peter Lamont, Sarah Long, Zoe Morrall, Jacqueline Pocklington, Abigail Scott, Gordon Watson, Victoria West, Stuart Jenkins, Jane Delany, Heather Sugden

    Research output: Contribution to journal › Article › peer-review


    Abstract

    Citizen science represents an effective means of collecting ecological data; however, the quality and reliability of these data are often questioned. Quality assurance procedures are therefore important to determine the validity of citizen science data and to promote confidence in conclusions. Here, data generated by a marine citizen science project conducted at 12 sites across the United Kingdom were used to investigate whether a simple, low-taxonomic-resolution field-monitoring protocol allowed trained citizen scientists to generate data comparable to those of professional scientists. To do this, differences between field estimates of algal percentage cover generated by different observer units (i.e., trained citizen scientists, professional scientists, and combined units) and digitally derived baseline estimates were examined. The results show that, in the field, citizen scientists generated data similar to those of professional scientists, demonstrating that training, coupled with a simple, low-taxonomic-resolution protocol, can enable citizen scientists to produce robust datasets in which variability likely reflects ecological variation or change rather than observer variation. The results also show that, irrespective of observer unit, differences between field and digital baseline estimates of algal percentage cover were greatest in plots with medium levels of algal cover, highlighting that additional or enhanced training for all participants could be beneficial in this area. The approach presented can serve as a guide for existing and future projects with similar protocols to assess their data quality, to strengthen participant training and protocols, and ultimately to promote the incorporation of robust citizen science datasets into environmental research and management.
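    The comparison described above can be sketched in miniature: for each plot, take the difference between a field estimate of algal percentage cover and a digitally derived baseline estimate, then summarize those differences by observer unit. This is a hypothetical illustration only; the function name, data structure, and all numbers below are assumptions, not the study's actual data or analysis code.

    ```python
    def cover_differences(records):
        """Mean absolute field-vs-baseline difference per observer unit.

        records: iterable of (observer_unit, field_pct, baseline_pct) tuples,
        where percentages are field and digital-baseline algal cover estimates.
        """
        totals = {}
        for unit, field_pct, baseline_pct in records:
            diff = abs(field_pct - baseline_pct)
            total, count = totals.get(unit, (0.0, 0))
            totals[unit] = (total + diff, count + 1)
        return {unit: total / count for unit, (total, count) in totals.items()}

    # Illustrative values only (observer unit, field %, digital baseline %):
    sample = [
        ("citizen", 45.0, 50.0),
        ("citizen", 20.0, 18.0),
        ("professional", 47.0, 50.0),
        ("professional", 19.0, 18.0),
    ]
    print(cover_differences(sample))  # → {'citizen': 3.5, 'professional': 2.0}
    ```

    A smaller mean difference for a given observer unit indicates closer agreement with the digital baseline; comparing these summaries across units is one simple way to ask whether volunteer-generated data track professional data.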
    Original language: English
    Article number: 12
    Pages (from-to): 1-13
    Number of pages: 13
    Journal: Citizen Science: Theory and Practice
    Volume: 7
    Issue number: 1
    DOIs
    Publication status: Published - 4 May 2022

    Keywords

    • data verification
    • data accuracy
    • public participation
    • volunteer
    • Coral Point Count
    • temperate rocky shore

