
The CrowdWater game: A playful way to improve the accuracy of crowdsourced water level class data


Authors: Barbara Strobl aff001; Simon Etter aff001; Ilja van Meerveld aff001; Jan Seibert aff001, aff002
Affiliations: aff001 Department of Geography, University of Zurich, Zurich, Switzerland; aff002 Department of Aquatic Sciences and Assessment, Swedish University of Agricultural Sciences, Uppsala, Sweden
Published in: PLoS ONE 14(9), 2019
Category: Research Article
DOI: https://doi.org/10.1371/journal.pone.0222579

Abstract

Data quality control is important for any data collection program, and especially for citizen science projects, where errors are more likely to occur due to the human factor. Ideally, data quality control in citizen science projects is itself crowdsourced so that it can handle large amounts of data. Here we present the CrowdWater game as a gamified method to check crowdsourced water level class data that are submitted by citizen scientists through the CrowdWater app. The app uses a virtual staff gauge approach: a digital scale is added to the first picture taken at a site, and this scale is used for water level class observations at later times. In the game, participants classify water levels by comparing a new picture with the picture containing the virtual staff gauge. By March 2019, 153 people had played the CrowdWater game and 841 pictures had been classified. The average water level class of the game votes for each classified picture was compared to the water level class submitted through the app to determine whether the game can improve the quality of the data submitted through the app. For about 70% of the classified pictures, the water level class was the same in the CrowdWater app and game. For a quarter of the classified pictures, the value submitted through the app and the average game vote disagreed. Expert judgement suggests that for three quarters of these cases, the game-based average value was correct. These initial results indicate that the CrowdWater game helps to identify erroneous water level class observations from the CrowdWater app and provides a useful approach for crowdsourced data quality control. This study thus demonstrates the potential of gamified approaches for data quality control in citizen science projects.
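
The comparison described above can be sketched in a few lines of code: average the game votes for each picture, round to the nearest class, and flag pictures where this consensus disagrees with the class submitted through the app. The sketch below is a hypothetical illustration, not the paper's or the CrowdWater platform's actual implementation; the function name, the (picture_id, class) vote tuples, and the integer class encoding are all assumptions.

```python
# Hypothetical sketch of the vote-aggregation check described in the abstract.
# Assumptions (not from the paper): votes arrive as (picture_id, voted_class)
# tuples and water level classes are encoded as integers.
from collections import defaultdict

def flag_disagreements(game_votes, app_classes):
    """Return ids of pictures whose rounded mean game vote differs from the app class."""
    votes_by_picture = defaultdict(list)
    for picture_id, voted_class in game_votes:
        votes_by_picture[picture_id].append(voted_class)

    flagged = []
    for picture_id, votes in votes_by_picture.items():
        mean_vote = sum(votes) / len(votes)
        # Round the mean vote to the nearest class before comparing it to
        # the class that was submitted through the app.
        if round(mean_vote) != app_classes.get(picture_id):
            flagged.append(picture_id)
    return flagged

# Toy example: picture 2's game consensus (class 2) contradicts the app
# submission (class 0), so it would be flagged for closer inspection.
votes = [(1, 0), (1, 0), (1, 1), (2, 2), (2, 2), (2, 3)]
app = {1: 0, 2: 0}
print(flag_disagreements(votes, app))  # -> [2]
```

In practice, flagged observations of this kind could then be reviewed by experts, as was done for the disagreement cases reported above.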

Keywords:

Surveys – Peer review – Games – Scientists – Citizen science – Quality control – Computers – Tornadoes



