Laura Bright & Anand Iyer, McAfee
Test automation poses many challenges. The automation tools and framework must be robust so that tests execute in a timely manner and results are accurate. The choice of automation tool is important: UI-based tools are useful for testing the end-user experience and exercising functionality not reachable through the back-end, but they can be slow and error-prone. Internally developed back-end tools are useful for testing underlying functionality and are robust to UI changes and unexpected errors, but they must themselves be thoroughly tested and maintained to ensure they cover all required functionality and report accurate results. Successfully automating a test suite therefore requires determining which tool is most appropriate for each test case and ensuring that the chosen tools are sufficiently accurate and robust to implement it. Seamlessly integrating automation tools and components to achieve full end-to-end automation presents additional challenges, as does establishing best practices and coordinating efforts across multiple geographic locations.
This poster presents our experiences addressing these challenges in an end-to-end automation framework. Our framework includes a back-end automation tool (VTAF), developed internally in C#, and a widely used UI automation tool, QTP. The framework integrates with MAGI, a company-wide test automation framework that automatically launches tests on new builds, builds the test rig, executes the tests, and reports results to a web server (a hypothetical sketch of this lifecycle appears after the list below). MAGI also includes a shared code base maintained by a core team and leveraged by all automation teams within the organization. We give an overview of the design of our automation framework and describe the processes we established to implement it. We treat the entire automation framework as a product and follow a well-defined software process for developing and maintaining this product, including all of the following:
- Automation auditing: We established guidelines to determine which test cases are automatable, and criteria to determine which automation tool (UI or back-end) is most appropriate for each one. Many factors affect the choice, including the capabilities of each tool and the functionality being tested; a hypothetical illustration of such a decision rule follows this list.
- Code and script reviews: All changes to the VTAF code are reviewed by other team members to verify their accuracy and confirm that best practices are followed. Test scripts are likewise reviewed by other QA engineers to ensure that they match the required conditions of the original test case.
- Documenting framework usage and best practices: All requirements and best practices for test scripts, along with coding standards for the VTAF tools, are documented and reviewed regularly by team members to ensure that everyone adheres to them.
- Reporting and tracking framework bugs and enhancements in Bugzilla: Because our testing tools are themselves products, any defects or enhancements are assigned to team members and resolved in a timely manner.
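The abstract does not give implementation details of the MAGI integration. Purely as an illustration, the C# sketch below shows one possible shape of the build-triggered lifecycle it describes (launch on a new build, set up the test rig, execute tests, report results). All type and method names here are invented for this sketch and are not MAGI's or VTAF's actual API.

```csharp
// Hypothetical sketch of a build-triggered test lifecycle in the style the
// abstract describes. None of these types come from MAGI or VTAF; they are
// invented here purely to illustrate the launch -> rig -> execute -> report flow.
using System;
using System.Collections.Generic;

interface IAutomatedTest
{
    string Name { get; }
    bool Execute(); // returns true if the test passed
}

class TestRunResult
{
    public string TestName;
    public bool Passed;
    public DateTime CompletedAt;
}

class AutomationPipeline
{
    // Entry point invoked when a new build is detected (e.g., by a CI hook).
    public void OnNewBuild(string buildId, IEnumerable<IAutomatedTest> suite)
    {
        SetUpTestRig(buildId); // deploy the new build onto the test rig
        var results = new List<TestRunResult>();
        foreach (var test in suite) // execute every test in the suite
        {
            results.Add(new TestRunResult
            {
                TestName = test.Name,
                Passed = test.Execute(),
                CompletedAt = DateTime.UtcNow
            });
        }
        ReportResults(buildId, results); // publish results to the web server
    }

    void SetUpTestRig(string buildId)
    {
        // provision machines and install the build under test
    }

    void ReportResults(string buildId, List<TestRunResult> results)
    {
        // e.g., POST a pass/fail summary to the reporting web server
        Console.WriteLine($"{buildId}: {results.Count} tests executed");
    }
}
```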
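The audit criteria themselves are not enumerated in the abstract. The following fragment is a minimal, hypothetical sketch of what a tool-selection rule might look like, using invented factors (UI-only behavior, availability of a back-end hook) as stand-ins for the real criteria.

```csharp
// Hypothetical audit rule for choosing an automation tool per test case.
// The factors below are plausible examples, not the team's actual criteria.
enum AutomationTool { None, BackEnd, Ui }

class TestCaseProfile
{
    public bool IsAutomatable;   // e.g., deterministic, no manual hardware steps
    public bool ExercisesUiOnly; // behavior observable only through the UI
    public bool HasBackEndHook;  // functionality reachable through back-end APIs
}

static class AutomationAudit
{
    public static AutomationTool Choose(TestCaseProfile tc)
    {
        if (!tc.IsAutomatable) return AutomationTool.None;    // remains a manual test
        if (tc.ExercisesUiOnly) return AutomationTool.Ui;     // QTP-style UI automation
        if (tc.HasBackEndHook) return AutomationTool.BackEnd; // faster, robust to UI changes
        return AutomationTool.Ui;                             // fall back to the UI path
    }
}
```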
We present the design of our integrated automation framework and detail each of these practices as it evolved alongside the framework. We summarize our experiences and lessons learned, and discuss future directions.