HCOMP 2013: Palm Springs, CA, USA - Works in Progress / Demos
Human Computation and Crowdsourcing: Works in Progress and Demonstration Abstracts. An Adjunct to the Proceedings of the First AAAI Conference on Human Computation and Crowdsourcing, November 7-9, 2013, Palm Springs, CA, USA. AAAI Technical Report WS-13-18, AAAI 2013.
Works in Progress
- Omar Alonso, Catherine C. Marshall, Marc Najork: A Human-Centered Framework for Ensuring Reliability on Crowdsourced Labeling Tasks.
- Obinna Anya, Melissa Cefkin, Steve Dill, Robert Moore, Susan U. Stucky, Osarieme Omokaro: Making Crowdwork Work: Issues in Crowdsourcing for Organizations.
- Alya Asarina, Olga Simek: Using Crowdsourcing to Generate an Evaluation Dataset for Name Matching Technologies.
- Yukino Baba, Hisashi Kashima: Statistical Quality Estimation for General Crowdsourcing Tasks.
- Julien Bourdaillet, Shourya Roy, Gueyoung Jung, Yu-An Sun: Crowdsourcing Translation by Leveraging Tournament Selection and Lattice-Based String Alignment.
- L. Elisa Celis, Shourya Roy, Vivek Mishra: Lottery-Based Payment Mechanism for Microtasks.
- Isabel Cenamor, Tomás de la Rosa, Daniel Borrajo: OnDroad Planner: Building Tourist Plans Using Traveling Social Network Information.
- Eric Colson: Using Human and Machine Processing in Recommendation Systems.
- Siamak Faridani, Georg Buscher: LabelBoost: An Ensemble Model for Ground Truth Inference Using Boosted Trees.
- Siamak Faridani, Georg Buscher, Ya Xu: A Ground Truth Inference Model for Ordinal Crowd-Sourced Labels Using Hard Assignment Expectation Maximization.
- Sandy J. J. Gould, Anna Louise Cox, Duncan P. Brumby: Frequency and Duration of Self-Initiated Task-Switching in an Online Investigation of Interrupted Performance.
- Sandy J. J. Gould, Anna Louise Cox, Duncan P. Brumby, Sarah Wiseman: Assessing the Viability of Online Interruption Studies.
- Kshanti Greene, Thomas Young: Human Stigmergy in Augmented Environments.
- Daniel Haas, Matthew Greenstein, Kainar Kamalov, Adam Marcus, Marek Olszewski, Marc Piette: Reducing Error in Context-Sensitive Crowdsourced Tasks.
- Annika Hämäläinen, Fernando Pinto Moreira, Jairo Avelar, Daniela Braga, Miguel Sales Dias: Transcribing and Annotating Speech Corpora for Speech Recognition: A Three-Step Crowdsourcing Approach with Quality Control.
- Kotaro Hara, Jin Sun, Jonah Chazan, David W. Jacobs, Jon Froehlich: An Initial Study of Automatic Curb Ramp Detection with Crowdsourced Verification Using Google Street View Images.
- Srinivasan Iyengar, Shirish Karande, Sachin Lodha: English to Hindi Translation Protocols for an Enterprise Crowd.
- Andrey Kolobov, Mausam, Daniel S. Weld: Joint Crowdsourcing of Multiple Tasks.
- Markus Krause: GameLab: A Tool Suit to Support Designers of Systems with Homo Ludens in the Loop.
- Walter Stephen Lasecki, Jeffrey Philip Bigham: Automated Support for Collective Memory of Conversational Interactions.
- Kristina Lerman, Tad Hogg: Using Visibility to Control Collective Attention in Crowdsourcing.
- Mieke H. R. Leyssen, Jacco van Ossenbruggen, Arjen P. de Vries, Lynda Hardman: Manipulating Social Roles in a Tagging Environment.
- Christopher H. Lin, Mausam, Daniel S. Weld: Towards a Language for Non-Expert Specification of POMDPs for Crowdsourcing.
- Sarah K. K. Luger, Jeff Bowles: Two Methods for Measuring Question Difficulty and Discrimination in Incomplete Crowdsourced Data.
- Pallavi Manohar, Shourya Roy: Crowd, the Teaching Assistant: Educational Assessment Crowdsourcing.
- Toshiko Matsui, Yukino Baba, Toshihiro Kamishima, Hisashi Kashima: Crowdsourcing Quality Control for Item Ordering Tasks.
- Jonathan Mortensen, Mark A. Musen, Natalya Fridman Noy: Ontology Quality Assurance with the Crowd.
- Peter Organisciak, Jaime Teevan, Susan T. Dumais, Robert C. Miller, Adam Tauman Kalai: Personalized Human Computation.
- Satoshi Oyama, Yukino Baba, Yuko Sakurai, Hisashi Kashima: EM-Based Inference of True Labels Using Confidence Judgments.
- Lesandro Ponciano, Francisco Vilar Brasileiro, Guilherme Gadelha: Task Redundancy Strategy Based on Volunteers' Credibility for Volunteer Thinking Projects.
- Jeffrey M. Rzeszotarski, Ed H. Chi, Praveen K. Paritosh, Peng Dai: Inserting Micro-Breaks into Crowdsourcing Workflows.
- Harini Alagarai Sampath, Rajeev Rajeshuni, Bipin Indurkhya, Saraschandra Karanam, Koustuv Dasgupta: Effect of Task Presentation on the Performance of Crowd Workers - A Cognitive Study.
- Preetjot Singh, Walter S. Lasecki, Paulo Barelli, Jeffrey P. Bigham: HiveMind: Tuning Crowd Response with a Single Value.
- Evgueni N. Smirnov, Hua Zhang, Ralf Peeters, Nikolay I. Nikolaev, Maike Imkamp: Aggregating Human-Expert Opinions for Multi-Label Classification.
- Kartik Talamadupula, Subbarao Kambhampati, Yuheng Hu, Tuan Anh Nguyen, Hankz Hankui Zhuo: Herding the Crowd: Automated Planning for Crowdsourced Planning.
- Hilary K. Thorsen, Maria Cristina Pattuelli: Designing a Crowdsourcing Tool to Analyze Relationships Among Jazz Musicians: The Case of Linked Jazz 52nd Street.
- Beth Trushkowsky, Tim Kraska, Michael J. Franklin: A Framework for Adaptive Crowd Query Processing.
- Ming-Hung Wang, Kuan-Ta Chen, Shuo-Yang Wang, Chin-Laung Lei: Understanding Potential Microtask Workers for Paid Crowdsourcing.
- Shuo-Yang Wang, Ming-Hung Wang, Kuan-Ta Chen: Boosting OCR Accuracy Using Crowdsourcing.
- Michael Peter Weingert, Kate Larson: TrailView: Combining Gamification and Social Network Voting Mechanisms for Useful Data Collection.
- Ming Yin, Yiling Chen, Yuan Sun: Task Sequence Design: Evidence on Price and Difficulty.
Demonstrations
- Elizabeth Brem, Tyler Bick, Andrew W. Schriner, Daniel B. Oerther: Wanted: More Nails for the Hammer - An Investigation Into the Application of Human Computation.
- Lydia B. Chilton, Felicia Cordeiro, Daniel S. Weld, James A. Landay: Frenzy: A Platform for Friendsourcing.
- Stephen Dill, Robert Kern, Erika Flint, Melissa Cefkin: The Work Exchange: Peer-to-Peer Enterprise Crowdsourcing.
- Yi-Ching Huang, Chun-I Wang, Shih-Yuan Yu, Jane Yung-jen Hsu: In-HIT Example-Guided Annotation Aid for Crowdsourcing UI Components.
- Ravi Iyer: Crowdsourcing Objective Answers to Subjective Questions Online.
- Vasilis Kandylas, Omar Alonso, Shiroy Choksey, Kedar Rudre, Prashant Jaiswal: Automating Crowdsourcing Tasks in an Industrial Environment.
- Juho Kim, Haoqi Zhang, Paul André, Lydia B. Chilton, Anant P. Bhardwaj, David R. Karger, Steven P. Dow, Robert C. Miller: Cobi: Community-Informed Conference Scheduling.
- Edith Law, Conner Dalton, Nick Merrill, Albert Young, Krzysztof Z. Gajos: Curio: A Platform for Supporting Mixed-Expertise Crowdsourcing.
- Alex Limpaecher, Nicolas Feltman, Adrien Treuille, Michael F. Cohen: Real-Time Drawing Assistance through Crowdsourcing.
- Arfon M. Smith, Stuart Lynn, Chris J. Lintott: An Introduction to the Zooniverse.