Tuesday, September 27, 2016
Week 4: 9/20/2016 - 9/27/2016
This week I finished my review of the paper "Gaze-tracked Crowdsourcing". The paper reinforced the importance of gaze tracking as an implicit feedback source that other methods (e.g., cursor tracking, clicking, and scrolling) cannot provide. While the main takeaway was using eye tracking to identify sense-distinguishing words to enrich data sets, the paper also offered insight into user confidence: the study was able to detect confidence in user answers, and from gaze-based reading behavior even got some indication of which answers would be correct. We discussed our own study and decided that user confidence is something we would also like to record and analyze for our tasks.

This week I also began some experiment design. We came up with our main task, which involves having users pick tags for various presented Stack Overflow postings. We are going to draw on C++ questions ranging from simple to complex. In the coming week I plan to continue building this list of candidate questions so we can decide on our final pool of tasks for the experiment.
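To make the task design concrete, here is a minimal sketch of how one trial in the tagging task might be represented and scored. All names and fields here are hypothetical (the study's actual schema had not been decided yet), and the question contents are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class TaggingTask:
    """One hypothetical trial: a Stack Overflow question plus tag choices."""
    question_title: str
    question_body: str
    candidate_tags: list        # tags shown to the participant
    difficulty: str             # e.g. "simple" through "complex"
    correct_tags: set = field(default_factory=set)  # the question's real SO tags

def score_response(task, chosen_tags):
    """Fraction of the participant's chosen tags that match the question's
    actual Stack Overflow tags -- a simple accuracy proxy, not the study's
    official measure."""
    if not chosen_tags:
        return 0.0
    return len(set(chosen_tags) & task.correct_tags) / len(chosen_tags)

# Example entry in a task pool (contents invented)
task = TaggingTask(
    question_title="Why does my std::vector iterator get invalidated?",
    question_body="...",
    candidate_tags=["c++", "iterators", "python", "stl"],
    difficulty="simple",
    correct_tags={"c++", "iterators", "stl"},
)
print(score_response(task, ["c++", "iterators"]))  # 1.0
```

Each task could then be paired with the participant's self-reported confidence and gaze data for later analysis.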