Conference Presentations

Cross-platform Testing at Microsoft

Microsoft Office products have been available on Windows-based and Apple Mac personal computers for many years. Now, with the rapid proliferation of mobile device platforms, current testing strategies are harder to justify: it is not feasible to implement test suites from scratch for each new mobile platform that arrives in the market. Join Jean Hartmann in an exploration of the platform-agnostic testing space and learn about the cross-platform strategies, tools, and processes being developed within Microsoft Office. Jean presents examples to help you better understand the benefits, challenges, and trade-offs you must weigh when considering such approaches. To start developing these new strategies, tools, and processes, you’ll need to create portable tests in which testers define their core test scenarios once and then leverage them across different platforms and devices.

Jean Hartmann, Microsoft Corporation
Combinatorial Test Design: Beyond the Gee-whiz Numbers

Combinatorial Test Design (CTD) is a powerful technique for ensuring that your tests cover your test space thoroughly, at a depth that matches the level of risk. Although it is entertaining to contemplate the huge number of tests required to cover all combinations and compare it to the small number that CTD selects, there is much more to learn. Karen Rosengren takes you on a journey through a program she led inside IBM. Its objective was to minimize the number of tests being run; in the process, however, her team learned much more about their testing efforts. They found ways to measure the effectiveness of their testing and to clearly show the complexity of feature creep, and they discovered how their understanding of the test space drove better low-level designs in the product code and how the generated test designs created a stronger foundation for test automation. Join Karen to learn about these additional, and valuable, benefits of CTD.
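To make the "gee-whiz numbers" concrete, here is a hedged sketch of a greedy all-pairs generator — illustrative only, not IBM's tooling, and the parameter names and values are invented for the example:

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedy all-pairs selection: pick full tests until every pair of
    values across every pair of parameters is covered at least once.
    params: dict mapping parameter name -> list of values."""
    names = list(params)
    # Every (param, value) pair combination still needing coverage.
    uncovered = {
        ((a, va), (b, vb))
        for a, b in combinations(names, 2)
        for va in params[a]
        for vb in params[b]
    }
    tests = []
    while uncovered:
        best, best_gain = None, -1
        # Choose the candidate test that covers the most uncovered pairs.
        for values in product(*params.values()):
            candidate = dict(zip(names, values))
            gain = sum(
                ((a, candidate[a]), (b, candidate[b])) in uncovered
                for a, b in combinations(names, 2)
            )
            if gain > best_gain:
                best, best_gain = candidate, gain
        tests.append(best)
        uncovered -= {
            ((a, best[a]), (b, best[b]))
            for a, b in combinations(names, 2)
        }
    return tests

# Hypothetical test space: 3 * 2 * 3 = 18 exhaustive combinations.
params = {
    "os": ["windows", "mac", "ios"],
    "browser": ["edge", "safari"],
    "locale": ["en", "de", "fr"],
}
tests = pairwise_tests(params)
print(len(tests), "pairwise tests instead of 18 exhaustive combinations")
```

Real CTD tools use far more sophisticated algorithms and handle constraints between parameters, but even this toy greedy search covers all value pairs with far fewer tests than the full cartesian product.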

Karen Rosengren, IBM
The Many Flavors of Exploratory Testing

The concept of exploratory testing is evolving, and different interpretations and variations are emerging and maturing. These range from the pure and original ideas of James Bach, later expanded into session-based exploratory testing by Jon Bach, to the testing tours described by James Whittaker, to the many different ways test teams across the world have chosen to interpret exploratory testing in their own contexts. Though it appears simple, exploratory testing can be difficult to introduce into a traditional organization where testers are familiar only with executing scripted test cases and where the concept of exploration and creative testing may be somewhat foreign. At the same time, organizations must address the challenges of traceability and reporting as they move from traditional methods to a more exploratory approach.

Gitte Ottosen, Sogeti Denmark
The Tester's Role in Continuous Integration

If your software product is recompiled and integrated frequently, you can improve your testing by integrating automated tests into your continuous integration process. In many organizations, unit tests are run as part of continuous integration; however, that is not enough. During the continuous integration cycle, integrating all automated tests (system, integration, unit, and regression) is vital to finding defects quickly and provides a substantial return on investment. Ayal Cohen and Roi Carmel describe the types of tests needed, the pros and cons of each type, and how to choose which tests to execute according to code changes, business criticality, and execution history. Ayal and Roi also discuss the need for service virtualization, which lets you run your tests in an environment that has not yet been fully developed by providing virtual substitutes for the missing services.
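The selection criteria mentioned above (code changes, business criticality, execution history) can be sketched as a simple risk score for ranking tests within a bounded CI cycle. The weights, field names, and test data below are illustrative assumptions, not any vendor's actual algorithm:

```python
def risk_score(test, changed_modules,
               w_change=3.0, w_criticality=2.0, w_failures=1.0):
    """Rank a test for inclusion in a CI run.
    test: dict with 'modules' (set of modules it exercises),
    'criticality' (0-1 business importance), and
    'recent_failures' (failure count over recent runs).
    Higher scores run first; a cutoff bounds cycle time."""
    touches_change = bool(test["modules"] & changed_modules)
    return (w_change * touches_change
            + w_criticality * test["criticality"]
            + w_failures * min(test["recent_failures"], 5) / 5)

# Invented example suite.
suite = [
    {"name": "checkout_e2e", "modules": {"cart", "payment"},
     "criticality": 1.0, "recent_failures": 2},
    {"name": "profile_ui", "modules": {"account"},
     "criticality": 0.3, "recent_failures": 0},
    {"name": "payment_unit", "modules": {"payment"},
     "criticality": 0.8, "recent_failures": 5},
]
changed = {"payment"}  # modules touched by the current commit
ranked = sorted(suite, key=lambda t: risk_score(t, changed), reverse=True)
print([t["name"] for t in ranked])
```

A real pipeline would derive `changed_modules` from the commit diff and the failure history from its test-result database; the point of the sketch is only that all three criteria can feed one ordering.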

Ayal Cohen, HP
Testing a Business Intelligence/Data Warehouse Project

When an organization builds a data warehouse, critical business decisions are made on the basis of its data. But how do you know the data is accurate? What should you test, and how? Karen Johnson discusses how to test in the highly technical areas of data extraction, transformation, and loading. Stored procedures, triggers, and custom ETL (extract, transform, load) transactions often must be tested before the reports or dashboards from a business intelligence (BI) project can be tested. The volume of data is frequently so large that testing “all the data” is simply not possible, so choosing an appropriate test data set is often one of the most strategic decisions in BI testing. Karen shares stories from past BI projects and ideas on how to test data warehouse and business intelligence projects. Learn techniques for ensuring quality data in your vital databases.
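One common form of ETL verification in this space is source-to-target reconciliation: checking that row counts and control totals survive the load. This hedged sketch uses an in-memory SQLite database with invented table and column names, standing in for a real staging area and warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source (staging) and target (warehouse fact) tables.
cur.execute("CREATE TABLE staging_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE fact_orders (order_id INTEGER, amount_cents INTEGER)")
cur.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                [(1, 19.99), (2, 5.00), (3, 120.50)])

# A trivial "transform" during the load: dollars to integer cents.
cur.execute("INSERT INTO fact_orders "
            "SELECT order_id, CAST(ROUND(amount * 100) AS INTEGER) "
            "FROM staging_orders")

# Reconciliation: row counts and a control total must agree.
src_rows, src_total = cur.execute(
    "SELECT COUNT(*), ROUND(SUM(amount) * 100) FROM staging_orders").fetchone()
tgt_rows, tgt_total = cur.execute(
    "SELECT COUNT(*), SUM(amount_cents) FROM fact_orders").fetchone()
assert src_rows == tgt_rows, "row count drift between source and target"
assert int(src_total) == tgt_total, "control total drift"
print("reconciled:", tgt_rows, "rows,", tgt_total, "cents")
```

Counts and control totals are only a smoke test; when they disagree, targeted row-level comparisons on a sampled test data set (the strategic choice Karen highlights) narrow down where the transform went wrong.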

Karen Johnson, Software Test Management, Inc.
Adding Good User Experience Practices into Agile Development

Whose job is it to ensure that the user has a good experience with a new application? As agile processes are taught today, the user experience (UX) design practice is usually left out or, at best, described as an optional team role. However, the companies that build useful, usable, and desirable software know that UX is baked into the whole development process. Jeff Patton describes what user experience design is and isn’t, and how every person on the team has something to contribute. Hear concrete examples of how companies have adapted their UX practice to work well in an agile context and, along the way, discovered innovative UX practices that work even better there. Jeff explores pragmatic personas, guerrilla user research, design sketching, lightweight prototyping, and concept testing. Leave with valuable tips for adding UX practices and thinking to your agile process to help you deliver a good user experience.

Jeff Patton, Jeff Patton & Associates
Invest in a Testing Framework
Video

Topics covered: HP automated testing solutions, a modern application, component-based testing and why frameworks matter, automation evolution, the HP automated functional testing solution, organizational considerations, and resources.

Heather Taylor, HP Software
Visualization: Seeing Test Requirements in a New Light

Change is everywhere in software: feature enhancements, regulatory requirements, technology updates, redesigns, and reimplementations. How can we ensure that testers really understand requirements and business rules, and know what’s changing? Vijay Atmavilas shares how VeriSign began employing visual modeling and visual test design techniques to address these challenges. The new models used diagrams to highlight process flows, input types and combinations to highlight data, and scenarios to highlight usage. As a result, team members quickly increased their understanding of feature requirements and improved their testing. Learn how Vijay and his team employ visual models to identify undertested functional areas and to measure the coverage and effectiveness of their regression tests and suites.

Vijay Atmavilas, VeriSign Inc
Artful Testing: Learning from the Arts

At first glance, art and testing may seem like an odd couple. However, Glenford Myers combined both in his book The Art of Software Testing, though his "art" referred only to skill and mastery. More recently, Robert Austin and Lee Devin published Artful Making, which relates software development to the creation of a work of art. These authors inspired Zeger Van Hese to consider the idea of artful testing. Zeger investigates what happens when we combine and infuse testing with aesthetics. With some surprising examples, Zeger shows how the fine arts can support and complement our testing efforts. For instance, the tools art critics use in their critiques are valuable additions to the tester's toolbox, enabling testers to become more professional software critics.

Zeger Van Hese, CTG
Enhancing Collaboration through Acceptance Tests

Even though acceptance testing principles and tools are common today, teams often stumble during implementation. In the worst cases, acceptance tests start to feel like a burden rather than a boon. Paul Nelson guides you through common acceptance testing pitfalls and provides practical, “tested” solutions to keep your acceptance testing efforts on track. Starting with a typical example, Paul guides you through important principles that focus on collaboration with the business: getting the words right, managing the level of detail, isolating dependencies, and refactoring in safe steps. Paul explores common abstraction patterns and demonstrates examples using Cucumber, though the principles apply equally to other tools. Leave with renewed confidence in your ability to maintain control of your acceptance tests and make them the collaboration tool they should be.
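Cucumber itself uses Gherkin feature files, but the "manage the level of detail" principle is language-neutral. This hedged Python sketch, with an invented in-memory driver standing in for the system under test, shows a business-level step hiding the low-level actions beneath it:

```python
class CheckoutDriver:
    """Invented stand-in for the system under test; a real suite
    would drive a browser or an API behind this interface."""
    def __init__(self):
        self.cart = []
        self.orders = []

    # Low-level actions the business-facing step hides.
    def add_item(self, sku):
        self.cart.append(sku)

    def pay(self):
        self.orders.append(list(self.cart))
        self.cart = []

# Business-level step: reads like the Gherkin line
# "When the customer buys a book and a pen" rather than
# a script of clicks, so the words stay right as the UI changes.
def the_customer_buys(driver, *skus):
    for sku in skus:
        driver.add_item(sku)
    driver.pay()

driver = CheckoutDriver()
the_customer_buys(driver, "book", "pen")
print("order placed:", driver.orders[0])
```

Keeping tests at the business level and pushing mechanics into the driver is one of the abstraction patterns that lets an acceptance suite survive refactoring in safe steps.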

Paul Nelson, ThoughtWorks, Inc.


AgileConnection is a TechWell community.
