Learn or Earn? - Intelligent Task Recommendation for Competitive Crowdsourced Software Development
dc.contributor.author | Karim, Muhammad Rezaul | |
dc.contributor.author | Yang, Ye | |
dc.contributor.author | Messinger, David | |
dc.contributor.author | Ruhe, Guenther | |
dc.date.accessioned | 2017-12-28T02:19:01Z | |
dc.date.available | 2017-12-28T02:19:01Z | |
dc.date.issued | 2018-01-03 | |
dc.description.abstract | Background: Competitive crowdsourced development encourages online software developers to register for tasks offered on a crowdsourcing platform and implement them in a competitive mode. Because a large number of tasks are uploaded daily, the competitive landscape changes continuously. Without appropriate decision support, online developers often make task decisions in an ad hoc and intuitive manner. Aims: To provide dynamic decision support for crowd developers in selecting the tasks that best fit their personal learning-versus-earning objectives, taking the actual competitive situation into account. Method: We propose a recommendation system called EX2 ("EX-Square") that combines explorative ("learn") and exploitative ("earn") search for tasks, based on a systematic analysis of workers' preference patterns, technology hotness, and projected winning chances. The implemented prototype provides dynamic recommendations that reflect task updates and competition dynamics at any given time. Results: Based on an evaluation of 4007 tasks monitored over a period of 2 years, we show that EX2 can explore and adjust task recommendations in response to context changes and individual learning preferences of workers. A survey conducted with 14 actual crowd workers shows that the intelligent decision support from EX2 is considered useful and valuable. Conclusions: With support from EX2, workers benefit from customized recommendations, and the platform provider gains a higher chance of covering the breadth of technology needs when recommendations are taken. | |
dc.format.extent | 10 pages | |
dc.identifier.doi | 10.24251/HICSS.2018.700 | |
dc.identifier.isbn | 978-0-9981331-1-9 | |
dc.identifier.uri | http://hdl.handle.net/10125/50589 | |
dc.language.iso | eng | |
dc.relation.ispartof | Proceedings of the 51st Hawaii International Conference on System Sciences | |
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | |
dc.rights.uri | https://creativecommons.org/licenses/by-nc-nd/4.0/ | |
dc.subject | Frontiers in AI and Software Engineering | |
dc.subject | Crowdsourced Software Development | |
dc.subject | Task Recommendations | |
dc.subject | Learn | |
dc.subject | Earn | |
dc.subject | Machine Learning | |
dc.title | Learn or Earn? - Intelligent Task Recommendation for Competitive Crowdsourced Software Development | |
dc.type | Conference Paper | |
dc.type.dcmi | Text |
Files
Original bundle
- Name: paper0702.pdf
- Size: 654.12 KB
- Format: Adobe Portable Document Format