Documenting Exploratory Testing
Exploratory testing projects can have documentation requirements, just like any other project. Jonathan describes various ways we can create documentation on our exploratory testing projects, including guidance documents, test-coverage reports, and video software that helps create lightweight, powerful documentation.
The Top Testing Challenges—or Opportunities—We Face Today
Some people thrive on challenges, while others struggle with how to deal with them. Handled well, challenges can make us stronger in our passion, drive, and determination. In this video, Lloyd Roden describes some of the challenges we face today in software testing and how we can respond...
Lloyd Roden
How to Squeeze the Most Out of Your Automated Testing
Jonathan Lindo describes examples of automated test infrastructure that use both open source software and traditional software from independent software vendors. In addition, he discusses new techniques for extending the value of automated testing by transforming the process from defect finding to defect resolution, reducing the effort required to document, reproduce, and troubleshoot the defects that automated tests generate.
Branching to Distraction
Branching can be an effective solution for managing change, enabling parallel development and improved productivity. But working on a branch is also a distraction and can decrease agility, productivity, and code robustness. Learn when the value of working on a branch outweighs the cost.
Software Performance Testing: Beyond Record and Playback
Predictable software performance is crucial to the success of almost every enterprise system and, in some cases, to the success of the company itself. Before deploying systems into production, management is demanding comprehensive performance testing and reliable test results. This has created significant pressure on development managers and performance test engineers. Alim Sharif guides you through the basic steps for planning, creating, executing, and reporting performance tests. He explains how to actively involve stakeholders (application developers, database administrators, network engineers, IT infrastructure groups, and senior managers) to identify and resolve performance issues. Alim discusses how to maintain the balance among these stakeholders' interests during each step and demonstrates how to effectively lead the performance test effort.
Alim Sharif, Ultimate Software Group
The Test Manager's Dashboard: Making It Accurate and Relevant
Gathering and presenting clear information about quality (both product and process) may be the most important part of the test manager's job. Join Lloyd Roden as he challenges your current progress reports, probably full of difficult-to-understand numbers, and asks you to replace them with a custom Test Manager's Dashboard containing a series of graphs and charts with clear visual displays. Your dashboard needs to report quality and progress status that is accurate, useful, easily understood, predictive, and relevant. Learn about Lloyd's favorite dashboard graphs: test efficiency, risk progress, quality targets, and specific measures of the test team's well-being. Learn to correlate and interpret the various types of dashboard data to reveal the complete picture of project and test progress.
Lloyd Roden, Grove Consultants
Multi-level Testing in Agile Development
Before they could begin automated testing, test teams used to wait on the sidelines for developers to produce a stable user interface. Not anymore. Agile development environments and component-based applications challenge testers to contribute value earlier and continuously throughout development. Although most agile testers currently focus on unit and integration testing, they also need to test the application's business and service layers, all the way to the system level. Roi Carmel guides you step by step through these stages, describing which practice (GUI or non-GUI automated testing) is the right choice and why. The incorrect choice can lead to iteration delays, lower team productivity, and additional problems.
Roi Carmel, Hewlett-Packard
Handling Failures in Automated Acceptance Tests
One of the aims of automated functional testing is to run many tests and discover multiple errors in one execution of the test suite. However, when an automated test runs into unexpected behavior (system errors, wrong paths taken, incorrect data stored, and more), the test fails. When a test fails, additional errors, called inherited errors, can result, or the entire test can stop unintentionally. Either way, some portion of the system remains untested, and either the error must be corrected or the automation changed before proceeding. Alexandra Imrie describes proven approaches to ensure that as many tests as possible continue running despite the errors encountered. She begins by sharing a specific way of designing tests to minimize the disturbance an error causes. Using this test design as a foundation, Alex describes the strategies she employs for handling and recovering from error events that occur during automated functional tests.
Alexandra Imrie, BREDEX GmbH
End-to-End Testing: When the Middle Is Moving
State-of-the-art development technologies and methods have increased our ability to rapidly implement new systems that support continuously changing business needs. These include Web services and services that encapsulate legacy systems, as well as SOA, SaaS, cloud computing, agile practices, and new test sourcing options. Testers are being pushed to create suites of end-to-end tests in which all parts of the system are tested together. Ruud Teunissen explores ways to create end-to-end tests that are integrated, production-like, automated, and continuously running, and that cover the full application landscape. Ruud presents strategies and tools for developing, executing, and maintaining these tests, including issues surrounding the test environment and test data.
Ruud Teunissen, POLTEQ IT Services BV
Developing a Testing Center of Excellence
In spite of well-established testing processes, many organizations are still struggling to achieve consistent, reliable testing results. Are testing deliverables completed incorrectly? Is your organization slow to react to change? A Testing Center of Excellence (TCOE) provides oversight of testing efforts across the enterprise to help provide the best possible testing services and adapt more rapidly to innovations and challenges. Mona Lane shares the strategy Aetna followed to build a successful TCOE. Originally focused on one specific area (test tools), it has evolved and continues to expand to encompass all aspects of testing. She shares the checklists they've developed to review testing artifacts for consistency and describes how these reviews are helping Aetna improve quality.
Mona Lane, Aetna