Reliably recruiting participants who have programming skills is an ongoing challenge for empirical studies involving software development technologies, often leading to the use of crowdsourcing platforms and computer science (CS) students. In this work, we use five existing survey instruments to explore the programming skills, privacy and security attitudes, and secure development self-efficacy of university CS student participants and participants from four crowdsourcing platforms (Appen, Clickworker, MTurk, and Prolific).
We recruited 613 participants who claimed to have programming skills and assessed the recruitment channels with regard to cost, quality, programming skills, and privacy/security attitudes. We find that 27% of crowdsourcing participants, 40% of self-reported developers among crowdsourcing participants, and 89% of CS students answered all programming skill questions correctly. CS students are the cheapest recruitment channel and rate themselves lower than crowdsourcing participants in terms of secure development self-efficacy.
This dataset includes the data from the 613 participants whom we recruited for our study.
Tahaei, Mohammad; Vaniea, Kami. (2021). Recruiting Participants With Programming Skills: A Comparison of Four Crowdsourcing Platforms and a CS Student Mailing List [dataset]. University of Edinburgh, School of Informatics. https://doi.org/10.7488/ds/3184