Conference Presentations

The Zen of Software Testing: Discovering Your Inner Tester

Testing techniques and methods are usually based on models or theories: models derived from experience and theories drawn from science. An alternative approach is Zen, a Buddhist doctrine holding that enlightenment can be attained through direct intuitive insight. Zen is all about harmony and balance. Dawn Haynes believes that a Zen approach to testing can help you meld disparate testing practices and gain new insights into your test processes and your everyday testing activities. We've all had those "aha" moments, like when you just knew it was a buffer overflow problem and immediately found where it was located in the code. When we "Zen" it, we figure something out through meditation or a sudden flash of enlightenment. Join Dawn to learn the Zen way: apply the models and theories you currently use for testing, then apply your intuitive insights to discover the rest.

Dawn Haynes, PerfTestPlus, Inc.
Session-Based Exploratory Testing-With a Test

Session-based exploratory testing is an effective means to test when time is short and requirements are not clearly defined. Is it advisable to use session-based exploratory testing when the requirements are known and documented? How about when the test cases are already defined? What if half of the test team is unfamiliar with the software under test? The answers are yes, yes, and yes. Brenda Lee explains how her team modified the session-based exploratory testing approach to include requirements and test cases as part of its charter. In one instance, during a short seven-day test window the team validated 41 of 45 requirements, executed more than 200 test cases using 17 charters, and identified 15 new, significant issues. The team was able to present a high-level test summary to the customer only two days after the conclusion of system test. What did the customer say?
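The charter modification Brenda describes can be pictured as a simple record that folds requirement IDs and predefined test cases into an otherwise open-ended session. This is an illustrative sketch only; the field names and IDs are hypothetical, not from her team's actual tooling.

```python
# Hypothetical model of a session charter that carries requirements and
# predefined test cases alongside the exploratory mission.
from dataclasses import dataclass, field

@dataclass
class Charter:
    mission: str                  # the exploratory goal for this time box
    time_box_minutes: int
    requirements: list = field(default_factory=list)  # requirement IDs to validate
    test_cases: list = field(default_factory=list)    # predefined cases to execute
    issues_found: list = field(default_factory=list)  # logged as the session runs

charter = Charter(
    mission="Explore login and session handling for error recovery",
    time_box_minutes=90,
    requirements=["REQ-012", "REQ-013"],
    test_cases=["TC-101", "TC-102", "TC-103"],
)
charter.issues_found.append("Password reset link expires immediately")
```

Because each charter lists what it validated, rolling 17 such records up into a requirements-coverage summary for the customer becomes a simple aggregation.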

Brenda Lee, Parallax Inc.
Managing Keyword-Driven Testing

Keyword-driven test automation has become quite popular and has entered the mainstream of test automation. Although some hail it as a panacea, many companies using it in one form or another have been disappointed. Keyword-driven testing projects succeed only if they are managed well. This presentation is not about the keyword method itself. Instead, Hans Buwalda focuses on the management side: how to manage a keyword-driven project. What are the factors that indicate progress and success? What are the common risks for a keyword project? Hans shares insights he has gathered in countless keyword projects in many industries all over the world. Many of the lessons he presents were learned the hard way. Learn from Hans' successes and mistakes and become more successful with your keyword-driven automation.

  • The success factors and risks for keyword-based automation
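To ground the management discussion, here is a minimal sketch of the keyword method itself: test steps expressed as data rows, dispatched to registered actions. The action names and the test table are hypothetical, not taken from any particular tool.

```python
# Minimal keyword-driven test interpreter: tests are rows of
# (keyword, *arguments), executed against a registry of actions.
ACTIONS = {}

def action(name):
    """Register a function as a keyword action."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@action("enter")
def enter(state, field, value):
    state[field] = value

@action("check")
def check(state, field, expected):
    assert state[field] == expected, f"{field}: {state[field]!r} != {expected!r}"

def run_test(rows):
    """Execute one test table and return the resulting state."""
    state = {}
    for keyword, *args in rows:
        ACTIONS[keyword](state, *args)
    return state

test_table = [
    ("enter", "username", "alice"),
    ("check", "username", "alice"),
]
run_test(test_table)
```

The management challenges Hans describes follow from this split: test designers own the tables while automation engineers own the actions, so progress and risk must be tracked for both roles.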
Hans Buwalda, LogiGear Corporation
Holistic Test Analysis and Design

To test professionally and understand software risks fully, we need to know what our tests cover. Counting test cases is not enough; that's like sizing business requirements by counting program modules. Neil Thompson presents a test analysis and design method that integrates four key elements into a holistic approach: test items, testable features, test basis documents, and product risks. Testing standards and many textbooks have anaesthetized us into the delusion that test cases are simple and can easily be derived through basic techniques. This is false thinking. According to Neil, we must consider and prioritize all available test techniques, incorporating both exploratory techniques and new thinking into our testing. Join Neil to learn a holistic approach to test design and why more complete information traceability is needed.

  • The different types of coverage: logical and physical
Neil Thompson, Thompson Information Systems Consulting Ltd. and Mike Smith, Testing Solutions Group
The Top Ten Signs You Need to Improve Your Testing Process

Does this sound familiar? Patch #94 was just released for the application you shipped last month; your customers refuse to upgrade to the latest version until someone else tries it first; your project manager casually asks if the application was tested on Windows 98 because that's what your biggest customer uses. Robert Watkins discusses these and other signs of test process breakdowns. He then suggests ways to improve the testing process by making sure the testing activities are in line with the needs of all stakeholders (customers, business owners, support staff, developers, and testers). Find new ways to establish appropriate quality gates that everyone honors, enlist the best champion for your improvement efforts, and communicate the right information to the right people at the right time.

  • Improvements to mitigate or eliminate test process breakdowns
Robert Watkins, Metavante
STARWEST 2007: The Hard Truth about Offshore Testing

If you have been a test manager for longer than a week, you have probably experienced pressure from management to offshore some test activities to save money. However, most test professionals are unaware of the financial details surrounding offshoring and are only anecdotally aware of factors that should be considered before outsourcing. Jim Olsen shares his experiences and details about the total cost structures of offshoring test activities. He describes how to evaluate the maturity of your own test process and compute the true costs and potential savings of offshore testing. Learn what is needed to coordinate test practices at home with common offshore practices, how to measure and report progress, and when to escalate problems. Jim also shares practices for staffing and retention, including assessing cultural nuances and understanding foreign educational systems.

Jim Olsen, Dell Inc.
Mission Possible: An Exploratory Testing Experience

Interested in exploratory testing and its use on rich Internet applications, the new interactive side of the Web? Erik Petersen searched the Web to find some interesting and diverse systems to test using exploratory testing techniques. Watch Erik as he goes on a testing exploration in real time with volunteers from the audience. He demonstrates and discusses the testing approaches he uses every day, from the purely exploratory to more structured approaches suitable for teams. You'll be amazed, astounded, and probably confounded by some of Erik's demonstrations. Along the way, you'll learn a lot about exploratory testing and have some fun as well. Your mission, should you choose to accept it, is to try out your testing skills on the snappiest rich Internet applications the Web has to offer.

  • Key concepts in exploratory testing demonstrated
  • Learn to test rich Internet applications (RIAs)
Erik Petersen, Emprove
User Interface Testing with Microsoft Visual C#

Manually testing software with a complex user interface (UI) is time-consuming and expensive. Historically, the development and maintenance costs associated with automating UI testing have been very high. Vijay Upadya presents a case study on the approaches and methodologies his Microsoft Visual C# test team adopted to answer the testing challenges that have plagued them for years. Vijay explains how the test team worked with developers to design high levels of testability into Microsoft Visual Studio 2005. These testability features enabled the test team to design a highly robust and effective test suite that completely bypasses the UI. Join Vijay to find out how the team adopted data-driven testing below the UI and achieved dramatic cost reductions in developing and maintaining their tests.

  • How to bypass the user interface without compromising test effectiveness
  • Designs for software with high testability
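The idea of data-driven testing below the UI can be sketched in a few lines: rather than scripting clicks, the test exercises the underlying component directly with tabular data. This sketch is illustrative only; `parse_expression` is a hypothetical stand-in, not part of the Visual C# team's actual test suite.

```python
# Data-driven testing below the UI: bypass the interface and feed the
# underlying logic rows of (input, expected) pairs.

def parse_expression(text):
    """Toy stand-in for a backend API the UI would normally call."""
    left, op, right = text.split()
    a, b = int(left), int(right)
    return {"+": a + b, "-": a - b, "*": a * b}[op]

# Growing coverage means adding rows, not writing new automation code.
cases = [
    ("1 + 2", 3),
    ("7 - 5", 2),
    ("3 * 4", 12),
]

for text, expected in cases:
    actual = parse_expression(text)
    assert actual == expected, f"{text}: got {actual}, expected {expected}"
```

Because no test in this style touches the UI, cosmetic interface changes cannot break it, which is the source of the maintenance savings the abstract describes.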
Vijay Upadya, Microsoft Corporation
Taming the Code Monolith-A Tester's View

Many organizations have systems that are large, complex, undocumented, and very difficult to test. These systems often break in unexpected ways at critical times, and the problem is not limited to older legacy systems; even recently built Web sites are in this condition. Randy Rice explores strategies for testing these types of systems, which are often monolithic mountains of code. He describes methods he has used to understand and "refactor" them, breaking up a huge, complex codebase into something more testable and more maintainable. Randy describes how to build a set of tests that can be reused even as the system is being restructured. Find out how to perform regression, integration, and interoperability testing in this environment. See how new technologies such as service-oriented architecture (SOA) can help achieve better system structures, and learn when and where test automation fits into your plans.

Randy Rice, Rice Consulting Services Inc.
Selecting Mischief Makers: Vital Interviewing Skills

Much of testing is tedious: the focus on details, the repetitive execution of the same code, the detailed paperwork, the seemingly endless technical discussions, and the complex data analysis. All good testers have the skills and aptitude necessary to deal with these activities. However, great testers have one other characteristic: they are mischievous. As a hiring manager, detecting mischievous testers is a challenge you should pursue to build the best testing staff. How do you uncover a candidate's mischievous traits during the selection process? Résumés do not help, and phone interviews or email conversations are too easily misunderstood. The best chance you have for detecting mischief is during the interview. Andy explores the ways he identifies the clever people who make great testers and shares techniques that you can easily add to your interview process to find the best people for your team.

Andy Bozman, Orthodyne Electronics