Author: Xia, Huichuan
Dates: 2021-12-24; 2022-01-04
ISBN: 978-0-9981331-5-7
URI: http://hdl.handle.net/10125/79885

Abstract: Crowd work platforms such as MTurk have been leveraged by academic scholars to conduct research and collect data. Although prior studies have discussed data quality and validity issues in crowd work via surveys and experiments, they have largely neglected to explore scholars' and, in particular, the IRB's ethical concerns in these respects. In this study, we interviewed 17 scholars across six disciplines and 15 IRB directors/analysts in the U.S. to fill this research gap. We identified common themes among our respondents but also discovered distinctive and even opposing views regarding approval rates, rejection, and internal and external research validity. Based on the findings, we discuss a potential Tragedy of the Commons concerning data quality deterioration, as well as disciplinary differences regarding validity in crowd work-based research. Finally, we advocate that the IRB's ethical concerns should be heard and respected.

Extent: 10 pages
Language: eng
License: Attribution-NonCommercial-NoDerivatives 4.0 International
Track: Crowdsourcing and Digital Workforce in the Gig Economy
Keywords: ethics; crowd work; crowd work-based research; data quality; research validity
Title: Tragedy of the Commons - A Critical Study of Data Quality and Validity Issues in Crowd Work-Based Research
Type: text
DOI: 10.24251/HICSS.2022.548