Crowds as Sensors. In many scenarios, crowd members report on aspects of the physical world, effectively taking on the role of sensors. How do we build systems that capture this role of the crowd, in addition to its participants acting as actuators and controllers?
Crowds as Networks. In situations such as disaster management, we can model geographically co-located people as networks. The emerging field of network science will be useful for answering several interesting questions related to community detection, expertise identification, and communication routing.
Quality Assurance. How will crowdsourcing address the challenges of quality assurance while providing incentive frameworks that encourage honest contributions? An important consideration is the impact of quality assurance on the cost of the crowdsourcing solution. For example, in low-cost disaster-response settings where crowdsourcing is opened to masses of volunteers, the cost arising from quality and trust mechanisms must be kept under control.
Incentives. Incentives are key to the success or failure of a crowdsourcing activity. How do the incentives for an individual task differ from those for a group-collaborative activity?
Security and Privacy. The heterogeneity of wireless network protocols used by the large variety of network-connected hardware and software sensors that provide crowdsourcing data increases the risk of security compromises. Furthermore, crowdsourcing systems may gather, collate, and distribute personal information about individuals. It is essential that users have the means to retain control over the distribution and dissemination of their private information.
- A theme paper / journal publication with contributions from participants, distilling the requirements for a crowd computing platform, derived from discussions and emerging applications.
- Authors will be encouraged to incorporate feedback from the workshop and revise their submissions into chapters for a book on “Scientific Foundations of a Crowd Computing Platform”.
The First International Workshop on Ubiquitous Crowdsourcing, held at UbiComp 2010, attracted 12 participants (including 2 invited keynote speakers, one from a commercial lab in the USA and one from the Czech Republic), with an acceptance rate of 83% (5 of 6 submissions).
1. Invited talks (one industry and one academic keynote)
2. 6-9 papers, including research results, position papers, and experience reports from the deployment and usage of crowdsourcing services.
We estimate 15-20 attendees with interdisciplinary backgrounds, ranging from application-domain experts (e.g., healthcare, disaster management) to technical experts (e.g., security for ubiquitous systems), as well as software engineers and researchers from both academia and industry.
The acceptance ratio will not exceed 70%.
Registration will be open to all interested parties.
WORKSHOP ORGANIZATION TIMELINE
- May 2, 2011: Distribution of Workshop CFP
- June 25, 2011: Submission deadline
- July 1, 2011: Notification of acceptance
- July 11, 2011: Camera Ready Accepted Papers Due
- Sep 18, 2011: Ubicomp 2011 Workshop Program
EVALUATION OF SUBMISSIONS
Participants will be selected based on short 4-page papers and demonstrations on the aforementioned topics of interest. We will evaluate submissions on the following criteria, considering both theoretical and applied contributions:
1. Description of interesting, novel crowdsourcing applications.
2. Potential to generate stimulating discussions and spur interest within the research community.
3. Novelty and originality of the work.
4. Theoretical insight, as well as insights drawn from the deployment of real-world crowdsourcing systems.