Quality and timing of crowd-based water level class observations
- Simon Etter,
- Barbara Strobl,
- Ilja van Meerveld,
- Jan Seibert
Ilja van Meerveld
University of Zurich, Faculty of Science

Abstract
Crowd-based hydrological observations can supplement existing monitoring
networks and allow data collection in regions where otherwise no data
would be available. In the citizen science project CrowdWater, repeated
water level observations using a virtual staff gauge approach result in
time series of water level classes. To investigate the quality of these
observations, we compared the water level class data with measured water
levels at a number of locations and assessed when these observations were
submitted. We analysed data for nine locations where
citizen scientists reported multiple observations using a smartphone app
and stream level data were also available. At twelve other locations,
signposts were set up to ask citizens to record observations on a form
that could be left in a letterbox. The results indicate that the quality
of the data collected with the app was higher than for the forms. A
possible explanation is that for each app location, most contributions
were made by a single person, whereas at the locations of the forms
almost every observation was made by a new contributor. On average, more
contributions were made between May and September than during the other
months. Observations were submitted for a range of flow conditions, with
a higher fraction of high flow observations for the data collected with
the app. Overall, the results are encouraging for citizen science
approaches in hydrology and demonstrate that the smartphone application
with its virtual staff gauge is a promising approach for crowd-based
water level class observations.

History
- 20 Feb 2020: Submitted to Hydrological Processes
- 21 Feb 2020: Submission Checks Completed
- 21 Feb 2020: Assigned to Editor
- 21 Feb 2020: Reviewer(s) Assigned
- 16 Apr 2020: Review(s) Completed, Editorial Evaluation Pending
- 16 Apr 2020: Editorial Decision: Revise Major
- 29 May 2020: 1st Revision Received
- 30 May 2020: Submission Checks Completed
- 30 May 2020: Assigned to Editor
- 30 May 2020: Reviewer(s) Assigned
- 02 Jul 2020: Review(s) Completed, Editorial Evaluation Pending
- 03 Jul 2020: Editorial Decision: Accept