STARWEST 2005 - Software Testing Conference
PRESENTATIONS
A Flight of Fancy - The Evolution of a Test Process for Spacecraft Software
The Johns Hopkins University Applied Physics Laboratory formed an embedded software group for producing space flight software. In addition to defining the process for developing and testing this software, the group had to quickly apply and adjust the new processes to a series of four spacecraft missions, starting in 2001, as resources were over-extended and schedules were compressed. Brenda Clyde shares highlights, complexities, and differences of testing these spacecraft missions in the last four years.
Brenda Clyde, Johns Hopkins University
A Spoon Full of Sugar Helps the Test Process Improvement Go Down
Test process improvement is the medicine many software organizations need to heal wounds caused by today's fast-paced software development lifecycles. But project and test managers are often like stubborn children who refuse to take their medicine even when it is for their own good. How do we get them to take it so the health of the project will improve? Just add a spoonful of sugar! Dion Johnson reveals approaches he has used to gain management buy-in for improvements and the implementation steps that have worked for him.
Dion Johnson, DiJohn IC, Inc.
Agile Testing of Embedded, Real-Time Systems
Until now, Agile development and testing concepts have been aimed largely at Web sites, interactive applications, and software packages where short production cycles are a must. With care, many of these same testing practices can work on embedded systems, in which long development cycles, no user interface, and regulatory requirements are the norm. Jon Hagar examines Agile testing practices you can implement within both hardware and software product domains.
Jon Hagar, Lockheed Martin
Aligning Testing Strategies with Corporate Goals
When developing a testing strategy, test managers normally review the business case for the project, study the new requirements, and consider what they know about the system under test. By also including a review of your organization's mission, values, and corporate goals, you will immediately stand out among your peers and at the same time improve the business value of testing. Stewart Noakes has worked with test managers at both large and small companies to help them align test strategies with corporate goals.
Stewart Noakes, Transition Consulting Ltd
Behavior Specification for Testing Embedded Systems
A behavior specification is a valuable engineering artifact for the design, review, and testing of embedded software. It is a black-box model defining all interactions between the system and its environment and the conditional, state-based causal relationships among them. Based on work by IEEE working group P1175, Dwayne Knirk describes a new reference model for specifying the behavior of computing systems and uses an embedded software control application to illustrate it.
Dwayne Knirk, Sandia National Laboratories
Bugs Shipped: Agile Versus eXtreme
Traditionally, acceptance testing is an end-of-development, final-stage test activity, often done ad hoc by users. Instead, with extreme acceptance testing, you can transform it into an iterative, automated practice that can be used by developers throughout the project. Marnie Hutcheson explains how turning the "acceptance testing" knob up to "ten" increases the ROI of testing throughout the project and why the practice of testing only at the end of a project fails to provide the timely feedback needed by developers and users.
Marnie Hutcheson, Ideva
Choosing Effective Test Metrics
Every software project can benefit from some sort of metrics, but industry studies show that 80 percent of software metrics initiatives fail. How do you know if you've selected the right set of test metrics and whether or not they support your organizational goals? Alan Page offers methods for determining effective and useful test metrics for software quality and individual effectiveness and presents new studies showing correlation between certain metrics and post-ship quality.
Alan Page, Microsoft Corporation
Deploy a Peerless Peer Review Process
Peer review programs are like parachutes: proper deployment is essential; otherwise, they will inevitably crash. When effectively implemented, peer reviews have a significant return on investment and result in greater product reliability.
Lee Sheiner, Georgia Tech Research Institute
Design and Optimize Test Cases from Use Cases
As part of developing software requirements, many project teams employ use cases to describe the human interactions with a system. Testers can use the same documents to optimize test case design. Learn the basics of use case writing and what you need to do to turn a use case into a test scenario. Find out how to extract test conditions and equivalence classes from use cases, build a test case matrix, and apply orthogonal array techniques to reduce the number of test cases needed (a small illustrative sketch follows this listing).
Ronald Rissel, Vanguard
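The reduction named in the abstract above can be sketched with a small, hypothetical example that is not material from the presentation: three parameters whose values stand in for equivalence classes derived from a use case. A greedy pairwise-covering heuristic is used here as a simple stand-in for a true orthogonal-array construction; the parameter names, values, and helper function are assumptions made only for illustration.

from itertools import combinations, product

# Hypothetical equivalence classes extracted from a use case (illustrative only).
parameters = {
    "browser": ["Chrome", "Firefox", "Safari"],
    "account": ["guest", "member", "admin"],
    "payment": ["card", "invoice"],
}
names = list(parameters)

# Every pair of values from two different parameters that must appear together
# in at least one selected test case for 2-way (pairwise) coverage.
required_pairs = {
    ((n1, v1), (n2, v2))
    for n1, n2 in combinations(names, 2)
    for v1 in parameters[n1]
    for v2 in parameters[n2]
}

def pairs_of(case):
    # All parameter-value pairs exercised by a single test case.
    return set(combinations(tuple(zip(names, case)), 2))

full_matrix = list(product(*parameters.values()))  # every combination: 3 * 3 * 2 = 18 cases

# Greedy covering: repeatedly pick the candidate case that covers the most
# still-uncovered pairs until every required pair is covered.
chosen, uncovered = [], set(required_pairs)
while uncovered:
    best = max(full_matrix, key=lambda c: len(pairs_of(c) & uncovered))
    chosen.append(best)
    uncovered -= pairs_of(best)

print("exhaustive matrix:", len(full_matrix), "cases")
print("pairwise-covering subset:", len(chosen), "cases")
for case in chosen:
    print(dict(zip(names, case)))

For these parameters the exhaustive matrix holds 18 cases, while the pairwise-covering subset is roughly half that size; the savings grow quickly as more parameters and values are added.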
Diagnosing Performance Problems in Web Server Applications
Many application performance failures are episodic, leading to frustrated users calling help desks, frantic troubleshooting of production systems, and rebooting systems. Often these failures are a result of subtle interactions between code and the configuration of multiple servers. On the other hand, well-designed applications should demonstrate gradual performance degradation and give advance warning of the need to add hardware capacity.
Ron Bodkin, Glassbox software