Vivek Venkatachalam, Marcelo Nery Dos Santos, Harry Emil, Microsoft
Software test teams around the world are grappling with the problem of testing increasingly complex software with smaller budgets and tighter deadlines. In this tough environment, crowdsourced testing can (and does) play a critical role in the overall test mission of delivering a quality product to the customer.
At Microsoft, we firmly believe in using internal crowdsourced testing to help us reach high levels of test and scenario coverage. To achieve this goal, we use a dogfood program in which employees volunteer to use pre-release versions of our products and give us their feedback on a regular basis. However, for a volunteer-based crowdsourced testing effort to be truly effective, one needs the ability to direct the crowd to exercise certain scenarios more than others, and the ability to adjust this mix on demand. What if one could devise a mechanism that provides the right incentives for the crowd to adopt the desired behaviors in real time?
This paper describes how we conceived, designed, and implemented Application Monitor, a tool that runs on a user's machine and allows us to detect usage patterns of Microsoft Lync (the software product the authors worked on) in near real-time. The paper then describes a simple game we incorporated into the tool with the goal of making participation fun for the crowd. The game also gave us the ability to direct the crowd's efforts toward high-risk features by adjusting the incentives.
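The paper details the actual design; purely as an illustration of the incentive mechanism described above, the sketch below shows one way adjustable per-scenario point weights could drive such a game. Everything in it (the SCENARIO_POINTS table, the GameScore class, and the scenario names) is hypothetical and is not drawn from the Application Monitor implementation.

```python
# Minimal sketch (not the authors' implementation): a hypothetical
# incentive table mapping usage scenarios to point values. Raising a
# scenario's weight steers volunteer testers toward that scenario.

from collections import Counter

# Hypothetical scenario names and point weights; features considered
# high-risk for the current release earn more game points.
SCENARIO_POINTS = {
    "start_audio_call": 1,
    "join_conference": 2,
    "share_desktop": 3,   # assumed high-risk feature this release
}


class GameScore:
    """Tracks one volunteer's score from detected usage events."""

    def __init__(self) -> None:
        self.events = Counter()

    def record(self, scenario: str) -> int:
        """Record a usage event reported by the client-side monitor
        and return the points it earns under the current weights."""
        self.events[scenario] += 1
        return SCENARIO_POINTS.get(scenario, 0)

    def total(self) -> int:
        """Total score under whatever weights are currently in effect."""
        return sum(SCENARIO_POINTS.get(s, 0) * n for s, n in self.events.items())


if __name__ == "__main__":
    score = GameScore()
    for event in ["start_audio_call", "share_desktop", "share_desktop"]:
        score.record(event)
    print(score.total())  # 1 + 3 + 3 = 7
```

In a scheme like this, redirecting the crowd amounts to changing the weights: bumping the value of a newly risky scenario makes it the most rewarding thing for volunteers to exercise.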
One key learning was that even crowd behaviors that attempted to game the system still served the ultimate purpose: increased testing of specific scenarios. The paper discusses this and other takeaways, and points out key issues that other teams wishing to start similar efforts should consider.