Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis

dc.contributor.author: Boichak, Olga
dc.contributor.author: Canzonetta, Jordan
dc.contributor.author: Sitaula, Niraj
dc.contributor.author: McKernan, Brian
dc.contributor.author: Taylor, Sarah
dc.contributor.author: Rossini, Patricia
dc.contributor.author: Clegg, Benjamin
dc.contributor.author: Kenski, Kate
dc.contributor.author: Martey, Rosa
dc.contributor.author: McCracken, Nancy
dc.contributor.author: Østerlund, Carsten
dc.contributor.author: Myers, Roc
dc.contributor.author: Folkestad, James
dc.contributor.author: Stromer-Galley, Jennifer
dc.date.accessioned: 2019-01-02T23:41:13Z
dc.date.available: 2019-01-02T23:41:13Z
dc.date.issued: 2019-01-08
dc.description.abstract: Information literacy encompasses a range of information evaluation skills used to make judgments. In the context of crowdsourcing, divergent evaluation criteria can introduce bias into collective judgments, and recent experiments have shown that crowd estimates can be swayed by social influence. This may be an unanticipated effect of media literacy training: encouraging readers to critically evaluate information falls short when their judgment criteria are unclear and vary among social groups. In this exploratory study, we investigate the criteria crowd workers use in reasoning through a task. We crowdsourced the evaluation of a variety of information sources, identifying multiple factors that may affect an individual's judgment as well as the accuracy of aggregated crowd estimates. Using a multi-method approach, we identify relationships between individual information assessment practices and analytical outcomes in crowds, and propose two analytic criteria, relevance and credibility, to optimize collective judgment in complex analytical tasks.
dc.format.extent: 10 pages
dc.identifier.doi: 10.24251/HICSS.2019.051
dc.identifier.isbn: 978-0-9981331-2-6
dc.identifier.uri: http://hdl.handle.net/10125/59482
dc.language.iso: eng
dc.relation.ispartof: Proceedings of the 52nd Hawaii International Conference on System Sciences
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Crowd-enhanced Technologies for Improving Reasoning and Solving Complex Problems
dc.subject: Collaboration Systems and Technologies
dc.subject: Credibility, crowdsourcing, information assessment, information literacy, relevance
dc.title: Beyond the Medium: Rethinking Information Literacy through Crowdsourced Analysis
dc.type: Conference Paper
dc.type.dcmi: Text

Files

Original bundle
Name: 0042.pdf
Size: 472.87 KB
Format: Adobe Portable Document Format