Given Enough Eyeballs, all Bugs are Shallow - A Literature Review for the Use of Crowdsourcing in Software Testing

dc.contributor.author Leicht, Niklas
dc.date.accessioned 2017-12-28T02:00:15Z
dc.date.available 2017-12-28T02:00:15Z
dc.date.issued 2018-01-03
dc.description.abstract In recent years, the use of crowdsourcing has gained considerable attention in the domain of software engineering. One key aspect of software development is the testing of software. The literature suggests that crowdsourced software testing (CST) is a reliable and feasible approach for many kinds of testing. Research on CST has made great strides; however, it is mostly unstructured and not linked to traditional software testing practice and terminology. By reviewing both traditional and crowdsourced software testing literature, this paper delivers two major contributions. First, it synthesizes the fields of crowdsourcing research and traditional software testing. Second, it gives a comprehensive overview of findings in CST research and classifies them into different software testing types.
dc.format.extent 10 pages
dc.identifier.doi 10.24251/HICSS.2018.515
dc.identifier.isbn 978-0-9981331-1-9
dc.identifier.uri http://hdl.handle.net/10125/50404
dc.language.iso eng
dc.relation.ispartof Proceedings of the 51st Hawaii International Conference on System Sciences
dc.rights Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject Crowd Science
dc.subject crowdsourcing, crowdtesting, literature review, software testing
dc.title Given Enough Eyeballs, all Bugs are Shallow - A Literature Review for the Use of Crowdsourcing in Software Testing
dc.type Conference Paper
dc.type.dcmi Text
Files
Original bundle:
Name: paper0517.pdf
Size: 434.95 KB
Format: Adobe Portable Document Format