About the Event
Crowdsourcing---outsourcing tasks to a crowd of workers (e.g. Amazon Mechanical Turk, peer grading in massive open online courses (MOOCs), scholarly peer review, and Yahoo Answers)---is a fast, cheap, and effective method for performing simple tasks, even at large scale. Two central problems in this area are:
1) Information Elicitation: how to design reward systems that incentivize high quality feedback from agents; and
2) Information Aggregation: how to aggregate the collected feedback to obtain a high quality forecast.
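As a toy illustration of the aggregation problem (not the method proposed in the thesis), the simplest baseline is to take a majority vote over the labels that workers submit for each task:

```python
from collections import Counter

def majority_vote(labels_by_task):
    """Aggregate crowd labels per task by majority vote.

    labels_by_task maps a task id to the list of labels workers gave it.
    Ties are broken by whichever label Counter encounters first.
    """
    return {task: Counter(labels).most_common(1)[0][0]
            for task, labels in labels_by_task.items()}

# Hypothetical worker responses for two image-labeling tasks.
votes = {
    "img1": ["cat", "cat", "dog"],
    "img2": ["dog", "dog", "dog"],
}
print(majority_vote(votes))  # {'img1': 'cat', 'img2': 'dog'}
```

Majority vote treats all workers as equally reliable; much of the literature discussed here improves on this by weighting or modeling worker quality, especially when no ground truth is available.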
Recently, great progress has been made in crowdsourcing, especially in settings where there is no ground truth. However, the techniques used in this literature are ad hoc and sometimes lack deep intuition, and the literature lacks a deep connection between information elicitation and information aggregation.
The combination of game theory and learning theory has recently produced innovative results (e.g. Generative Adversarial Networks). A central contention of this thesis is that combining game theory, information theory, and learning theory will yield a unified framework for both of the central problems in crowdsourcing. Thus, this thesis aims to build several innovative connections among game theory, information theory, and learning theory, and to use these connections to construct a unified framework and make new progress in crowdsourcing.