The pervasiveness of sensor-rich mobile devices enables citizens to contribute to sensing and data collection efforts by participating in citizen science and community sensing. While millions of people could potentially contribute, most projects rely on a small crowd of dedicated volunteers, and many fail to garner widespread participation. Despite technical advances and a handful of success stories, the number of people actively engaged in participatory community sensing and citizen science projects today is a tiny percentage of the more than one hundred million Americans on the go each day, 58% of whom carry a smartphone. This collective human mobility is a valuable potential resource, yet we lack comprehensive tools and methods to utilize it. For people commuting to work or school, exercising, or spending time with friends, the effort required to contribute to a project can far exceed the perceived benefit; even opening an app and submitting a short report may discourage contributions. An alternative approach collects sensor data passively from device-carrying participants. While passive collection has proved useful for applications such as tracking traffic patterns, it cannot support tasks that require human sensing capabilities, e.g., recognizing cracks in the sidewalk or places with interesting architecture.

We take an alternative approach: low-effort interactions that let people contribute to physical crowdsourcing on the go, as part of their daily routines. We introduce TapShare, a mobile application that allows people to collectively track objects or events without looking at their phone, using physical gestures such as a double tap. Users first join an existing tracking effort or start their own.

TapShare’s main contribution is an eyes-free input modality that ties physical gesture recognition to location tracking so that an event can be reported with minimal effort. TapShare takes as input physical interactions, such as taps, that can be accurately detected using smartphone sensors. A user can make a report without looking at their phone, without taking out their phone, and without stopping. Compared to existing community sensing applications that require users to interact with a visual interface to make a report, we hypothesize that TapShare significantly lowers the time required per report. While the collected data is lower in fidelity (e.g., it consists only of event-location pairs, not photos or text descriptions), the ease and speed of reporting enables lightweight contributions within people’s existing routines, with the potential to broaden participation to a larger, mobile crowd. The main technical challenge was designing low-effort interactions suitable for data collection. We addressed this challenge through several design choices: building and testing gestures that can be reliably detected by mobile sensors, supporting fast contributions that do not disrupt the user’s mobility, and providing effective auditory and tactile feedback.
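As an illustration, here is a minimal sketch of the kind of gesture detection this design relies on: recognizing a double tap from a stream of accelerometer magnitude samples. The function name, spike threshold, and timing window below are our assumptions for illustration, not TapShare's actual detection parameters.

```python
# Hypothetical double-tap detector over accelerometer magnitude samples.
# Threshold and timing window values are illustrative assumptions.

def detect_double_tap(samples, threshold=2.5, min_gap=0.05, max_gap=0.5):
    """samples: list of (timestamp_sec, accel_magnitude_g) tuples.

    Returns True if two spikes above `threshold` occur between
    `min_gap` and `max_gap` seconds apart (a plausible double tap).
    """
    spikes = []
    above = False
    for t, mag in samples:
        if mag >= threshold and not above:
            spikes.append(t)  # rising edge of a spike
            above = True
        elif mag < threshold:
            above = False
    # Check consecutive spikes for double-tap timing.
    for t1, t2 in zip(spikes, spikes[1:]):
        if min_gap <= t2 - t1 <= max_gap:
            return True
    return False


# Two spikes 0.2 s apart register as a double tap; a single spike does not.
burst = [(0.9, 1.0), (1.0, 3.0), (1.05, 1.0), (1.2, 3.1), (1.3, 1.0)]
print(detect_double_tap(burst))
```

In a real iOS implementation this logic would run over Core Motion accelerometer updates; the sketch isolates only the timing heuristic.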

We performed a measurement study with TapShare using two benchmark tests: one evaluating the speed of making a report and another evaluating the accuracy of the collected location data. We found that reporting with TapShare took an average of two seconds, making it possible to report on the go, albeit at lower fidelity. We also found that the recorded locations fell within a reasonable error range given the iPhone's GPS accuracy. We additionally conducted a usability study with a standalone iOS application implementing TapShare. Participants downloaded TapShare onto their phones and tracked items or events of interest for five days. Log data showed that TapShare allowed users to make a large number of reports in a short period of time. People typically tracked items of personal significance while on the go and found the double-tapping gesture easy to learn, low-effort, and fun. Both studies showed that users can make reports accurately without stopping, with minimal effort, and without taking out their phone.
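To illustrate how one might compare recorded locations against a device's reported GPS accuracy, here is a hypothetical sketch using the haversine great-circle distance. The function names and the accuracy-check criterion are our assumptions for illustration, not the study's actual evaluation method.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def within_gps_error(reported, actual, horizontal_accuracy_m):
    """True if the reported (lat, lon) lies within the device's
    stated horizontal accuracy of the actual (lat, lon)."""
    return haversine_m(*reported, *actual) <= horizontal_accuracy_m


# A 0.0001-degree latitude offset is roughly 11 m, well inside a
# typical 20 m horizontal accuracy reading.
print(within_gps_error((40.0, -88.0), (40.0001, -88.0), 20.0))
```

On iOS, the `horizontal_accuracy_m` value would come from each location fix's reported horizontal accuracy (e.g., Core Location's `horizontalAccuracy`).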


Figure 1: Map of TapShare user reports during the pilot study.


Figure 2: TapShare users can add something to track or choose an existing effort to contribute to (Left). TapShare's reporting screen provides information about the reporting gesture and feedback (Center). Users can see their contributions alongside those of other users (Right).





Masters and Undergraduate Students

  • 🎓 Nicole Zhu
  • 🎓 Stephen Chan