STARWEST 2010 - Software Testing Conference

PRESENTATIONS

Reducing the Testing Cycle Time through Process Analysis

Because system testing is usually what lies between development and release to the customer, and hopefully more business value or profit, every test team is asked to test faster and finish sooner. Reducing test duration can be especially difficult because many of the factors that drive the test schedule are beyond our control. John Ruberto tells the story of how his team cut the system test cycle time from twelve weeks to four. John shares how they first analyzed the overall test process to create a model of the test duration.

John Ruberto, Intuit Inc.
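
The abstract doesn't reproduce Ruberto's model, but a minimal sketch of the kind of duration model such an analysis might yield could look like the Python below. Every phase name, workload, and rate here is invented for illustration.

```python
# Hypothetical model of system test cycle time: total duration is the
# sum over activities of workload divided by throughput. None of these
# numbers come from Ruberto's talk.

PHASES = {
    # activity: (work items, items completed per day)
    "environment setup":     (1, 0.5),    # one setup taking 2 days
    "test execution":        (1200, 60),  # 1,200 cases at 60/day
    "defect fix and retest": (150, 10),   # 150 defects at 10/day
    "final regression pass": (400, 80),   # 400 cases at 80/day
}

def cycle_time_days(phases):
    """Total calendar days, assuming the activities run sequentially."""
    return sum(items / rate for items, rate in phases.values())

if __name__ == "__main__":
    for name, (items, rate) in PHASES.items():
        print(f"{name}: {items / rate:.1f} days")
    print(f"total: {cycle_time_days(PHASES):.1f} days")
```

A model like this earns its keep by exposing which term dominates the total, so that improvement effort goes to the activity that actually drives the calendar.
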
Requirements Based Testing on Agile Projects

If your agile project requires documented test case specifications and automated regression testing, this session is for you. Cause-effect graphing, a technique for modeling requirements to confirm that they are consistent, complete, and accurate, can be a valuable tool for testers within agile environments. Whether the source material is story cards, use cases, or lightly documented discussions, you can use cause-effect graphing to confirm user requirements and automatically generate robust test cases.

Richard Bender, Bender RBT, Inc.
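
Bender's RBT tooling builds and reduces the graph formally; as a rough illustration of the underlying idea only, the sketch below brute-forces every cause combination for a made-up requirement and records the expected effects, which is the raw material a real generator would then prune.

```python
from itertools import product

# Toy cause-effect graph for an invented ATM requirement: causes are
# boolean conditions, effects are boolean expressions over them.
CAUSES = ["valid_account", "sufficient_funds", "card_present"]

EFFECTS = {
    "dispense_cash": lambda c: (c["valid_account"]
                                and c["sufficient_funds"]
                                and c["card_present"]),
    "show_error":    lambda c: not (c["valid_account"]
                                    and c["sufficient_funds"]),
}

def generate_test_cases():
    """Enumerate every cause combination with its expected effects."""
    cases = []
    for values in product([False, True], repeat=len(CAUSES)):
        causes = dict(zip(CAUSES, values))
        expected = {name: rule(causes) for name, rule in EFFECTS.items()}
        cases.append((causes, expected))
    return cases

if __name__ == "__main__":
    for causes, expected in generate_test_cases():
        print(causes, "->", expected)
```

The point of the technique is that once requirements are expressed this way, inconsistencies show up as contradictory expected effects, and the test cases fall out mechanically.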

Software Performance Testing: Beyond Record and Playback

Predictable software performance is crucial to the success of almost every enterprise system and, in some cases, to the success of the company itself. Before deploying systems into production, management is demanding comprehensive performance testing and reliable test results. This has created significant pressure on development managers and performance test engineers. Alim Sharif guides you through the basic steps for planning, creating, executing, and reporting performance tests.

Alim Sharif, Ultimate Software Group
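
As a stdlib-only illustration of the execute-and-report steps, not Sharif's process, the sketch below drives a placeholder URL with a few virtual users and summarizes latency; the URL and all load figures are assumptions.

```python
import statistics
import threading
import time
import urllib.request

# Toy load generator: a few virtual users issue sequential requests and
# latencies are collected and summarized. TARGET_URL and the load
# figures are placeholders, not recommendations.
TARGET_URL = "http://localhost:8080/health"
VIRTUAL_USERS = 5
REQUESTS_PER_USER = 20

latencies = []
lock = threading.Lock()

def virtual_user():
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            urllib.request.urlopen(TARGET_URL, timeout=10).read()
        except OSError:
            continue  # this sketch counts only successful requests
        elapsed = time.perf_counter() - start
        with lock:
            latencies.append(elapsed)

threads = [threading.Thread(target=virtual_user)
           for _ in range(VIRTUAL_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

if latencies:
    latencies.sort()
    print(f"requests: {len(latencies)}")
    print(f"median:   {statistics.median(latencies) * 1000:.1f} ms")
    print(f"p95:      {latencies[int(0.95 * len(latencies))] * 1000:.1f} ms")
else:
    print("no successful requests")
```

Reporting percentiles rather than averages is the habit worth copying here: a healthy mean can hide a miserable tail.
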
STARWEST 2010 Keynote: Lightning Strikes the Keynotes

Lightning Talks have been a popular part of many STAR conferences over the years. If you’re not familiar with the concept, a Lightning Talk session consists of a series of five-minute talks by different presenters within one presentation period. For the speakers, Lightning Talks are the opportunity to deliver their single biggest-bang-for-the-buck idea in a rapid-fire presentation. For the first time, lightning has struck the STAR keynote presentations.

Lee Copeland, Software Quality Engineering

STARWEST 2010: Automating Embedded System Testing

Many testers believe that automating the testing of embedded and mobile phone-based systems is prohibitively difficult. By approaching the problem from a test design perspective and using that design to drive the automation initiative, William Coleman demystifies automated testing of embedded systems. He draws on experiences gained on a large-scale testing project for a leading smart-phone platform and a Windows CE embedded automotive testing platform.

William Coleman, LogiGear Corporation

STARWEST 2010: Quality Metrics for Testers: Evaluating Our Products, Evaluating Ourselves

At last, most businesses realize that a final system testing "phase" in the project cannot serve as the catch-all for software quality problems. Many organizations are changing development methodologies or creating organization-wide initiatives that drive quality techniques into all aspects of development. So, how do you know that a quality initiative is working or where the most improvement effort is needed?

Lee Copeland, Software Quality Engineering
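
The abstract names no specific metrics, but one commonly tracked example is defect removal efficiency, the fraction of all defects caught before release. The sketch below computes it from invented counts.

```python
# Defect removal efficiency (DRE) from per-phase defect counts.
# Phase names and numbers are invented for illustration.

defects_found = {
    "code review": 40,
    "unit test":   55,
    "system test": 25,
}
defects_escaped_to_production = 10

def defect_removal_efficiency(found, escaped):
    """Fraction of all known defects caught before release."""
    total_found = sum(found.values())
    return total_found / (total_found + escaped)

if __name__ == "__main__":
    for phase, count in defects_found.items():
        print(f"{phase}: {count} defects found")
    dre = defect_removal_efficiency(defects_found,
                                    defects_escaped_to_production)
    print(f"defect removal efficiency: {dre:.0%}")
```

Tracked release over release, a rising DRE is one signal that a quality initiative is taking hold, and the phase-by-phase counts hint at where the next improvement effort belongs.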

STARWEST 2010: Tour-based Testing: The Hacker's Landmark Tour

When visiting a new city, people often take an organized tour, going from landmark to landmark to get an overview of the town. Taking a “tour” of an application, going from function to function, is a good way to break down the testing effort into manageable chunks. Not only is this approach useful in functional testing, it’s also effective for security testing. Rafal Los takes you inside the hacker’s world, identifying the landmarks hackers target within applications and showing you how to identify the defects they seek out.

Rafal Los, Hewlett-Packard

State-driven Testing: An Innovation in UI Test Automation

Keyword-driven testing is an accepted UI test automation technique used by mature organizations to overcome the disadvantages of record/playback test automation. Unfortunately, keyword-driven testing has drawbacks in terms of maintenance and complexity because applications can easily require thousands of automation keywords. Navigating and constructing test cases from so many keywords is extremely cumbersome and can be impractical.

Dietmar Strasser, Borland (a Micro Focus company)
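
For readers unfamiliar with the technique being critiqued, here is a minimal keyword-driven interpreter, with invented keywords and a fake application standing in for a real UI driver. It shows why every new UI action adds another keyword, which is exactly the growth problem the session addresses.

```python
# Keyword-driven testing in miniature: test cases are data (rows of
# keyword + arguments), and each keyword maps to an action function.
# The keywords and the dict "application" here are illustrative only.

def open_page(app, url):
    app["page"] = url

def type_text(app, field, value):
    app.setdefault("fields", {})[field] = value

def assert_field(app, field, expected):
    actual = app.get("fields", {}).get(field)
    assert actual == expected, f"{field}: expected {expected!r}, got {actual!r}"

KEYWORDS = {
    "open_page": open_page,
    "type_text": type_text,
    "assert_field": assert_field,
}

# A test case is just rows of (keyword, arguments), editable without code.
LOGIN_TEST = [
    ("open_page", ["https://example.com/login"]),
    ("type_text", ["username", "alice"]),
    ("assert_field", ["username", "alice"]),
]

def run(test_case):
    app = {}  # stand-in for the application under test
    for keyword, args in test_case:
        KEYWORDS[keyword](app, *args)
    print("PASS")

if __name__ == "__main__":
    run(LOGIN_TEST)
```

With three keywords this is tidy; with thousands, finding the right keyword becomes the bottleneck, which is the scaling problem state-driven approaches aim to relieve.
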
Stick a Fork in It: Defining Done

It seems that developers have as many definitions of “done” as Eskimos have words for “snow.” But without a clear definition of done, it is difficult to gauge progress on a project. Menlo Innovations has a simple solution. Instead of developers declaring a story card or feature done on their own, they collaborate with the business analyst and testing teammates to determine when the application meets their requirements.

Tracy Beeson, Menlo Innovations

Streamlining the Developer-Tester Workflow

The islands that many development and test teams live on seem far apart at times. Developers become frustrated by defect reports with insufficient data to reproduce a problem; testers are constantly retesting the same application and having to reopen "fixed" defects. Chris Menegay focuses on two areas for improvement that will help both teams: a better build process to deliver a more testable application to the test team and a defect reporting process that delivers better defect data back to the developers.

Chris Menegay, Notion Solutions, Inc.
