Conference Presentations

Deploy a Peerless Peer Review Process

Peer review programs are like parachutes: proper deployment is essential; otherwise, they will inevitably crash. When effectively implemented, peer reviews deliver a significant return on investment and result in greater product reliability. Lee Sheiner shares the key features for making peer reviews a value-added practice at Georgia Tech Research Institute, including selecting the proper type of review for each work product, identifying the right reviewers, focusing on early defect detection, using supporting tools, fostering an environment conducive to reviews, managing the review materials, and much more. Learn from Lee how they have cross-pollinated peer review methods across the organization and how successful peer reviews encourage project groups to "gel" and become highly productive teams.

Lee Sheiner, Georgia Tech Research Institute
STARWEST 2005: Testing Dialogues - Management Issues

As a test manager, are you struggling at work with a BIG test management issue or a personnel issue? If so, this session is for you. "Testing Dialogues - Management Issues" is a unique platform for you to share with and learn from test managers who have come to STARWEST from around the world. Facilitated by Esther Derby and Elisabeth Hendrickson, this double-track session takes on management issues: career paths for test managers, hiring, firing, executive buy-in, organization structures, and process improvement. You name it! Share your expertise and experiences, learn from others' challenges and successes, and generate new topics in real time. Discussions are structured in a framework so that participants will receive a summary of their work product after the conference.

Esther Derby, Esther Derby Associates Inc
Intelligence Testing: Techniques for Validating a Data Warehouse System

Many organizations have implemented information repositories, single-source data warehouses, to capture and provide key business intelligence information. Data warehouse testing presents unique challenges, including the absence of a user interface, constantly shifting user requirements, slowly changing data, a lack of user control over reporting tools, and a state of perpetual change in the applications supplying the data. Geoff Horne explores the different testing techniques you can apply to a data warehouse and how their usage differs from traditional application testing. Learn how to test a data warehouse even when the source systems are being developed at the same time, and how to identify when your testing is appropriate and adequate.

Geoff Horne, iSQA
A Spoonful of Sugar Helps the Test Process Improvement Go Down

Test process improvement is the medicine many software organizations need to heal the wounds caused by today's fast-paced software development lifecycles. But project and test managers are often like stubborn children who refuse to take their medicine even when it is for their own good. How do we get them to take it so the health of the project will improve? Just add a spoonful of sugar! Dion Johnson reveals the approaches he has used to gain management buy-in for improvements and the implementation steps that have worked for him. Learn about the test process improvement models that are available and some practical ways to implement them. Take away specific ideas for improving the health of your test organization.

Dion Johnson, DiJohn IC, Inc
Bugs Shipped: Agile Versus eXtreme

Traditionally, acceptance testing is an end-of-development, final-stage test activity, often done ad hoc by users. With extreme acceptance testing, you can instead transform it into an iterative, automated practice that developers can use throughout the project. Marnie Hutcheson explains how turning the "acceptance testing" knob up to "ten" increases the ROI of testing throughout the project and why testing only at the end of a project fails to provide the timely feedback that developers and users need. Learn how extreme acceptance testing fits into the flow of an Agile development project and how developers, testers, and customers benefit from this approach. See examples of the automated acceptance testing frameworks Avignon and FIT.
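For readers who have not seen FIT, here is a minimal sketch of how a FIT column fixture connects a tabular acceptance test to production code. The DiscountFixture class and its discount rule are hypothetical examples invented for illustration; only the fit.ColumnFixture base class comes from the real FIT framework, and the session itself may use different examples.

import fit.ColumnFixture;

// Hypothetical fixture: FIT reads an HTML test table whose first row names
// this class, binds each "orderTotal" cell to the public field, calls
// discount(), and marks the "discount()" cell pass or fail against the
// expected value in the table.
public class DiscountFixture extends ColumnFixture {
    public double orderTotal;          // input column

    public double discount() {         // expected-output column
        return orderTotal >= 100.0 ? orderTotal * 0.05 : 0.0;
    }
}

Because customers edit the table and developers rerun it on every build, the same acceptance test gives feedback throughout the project rather than only at the end.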

Marnie Hutcheson, Ideva
The Value-added Manager: Five Pragmatic Practices

What do great managers do that others don't? Great managers focus their efforts, increase their productivity, and develop their people. In this session, Esther Derby describes five pragmatic practices that managers can apply to improve both work results and worker satisfaction: give both positive and corrective feedback weekly, consciously decide what not to do, limit multitasking, develop people, and meet with staff individually and as a group every week. Esther says these ideas are not rocket science, but if you apply these five practices consistently, you will improve the value of your team to the organization and keep your sanity, too.

Esther Derby, Esther Derby Associates Inc
Going Wireless - Test Strategies for Mobile Applications

Testers face unique challenges with mobile applications. Not only do testers have to test the software for functional and performance correctness, but they also have to consider compatibility with innumerable combinations of devices and networks. Manish Mathuria discusses how test automation can be leveraged to tame this complex testing challenge in a highly competitive market. He offers a comprehensive perspective on the challenges, justifications, and requirements of doing test automation in the dynamic world of mobile applications. With specific examples from Brew and J2ME technologies, Manish demonstrates the essential components of a test automation framework using a real mobile application. See how test automation speeds the process leading to the certification that providers mandate.

Manish Mathuria, InfoStretch Corporation
STARWEST 2005: Testing Outside the Bachs: A Hands-On Exploratory Testing Workshop

Simply put, exploratory testing means designing your tests as you perform them. When it's done well, it's a fantastically productive and rewarding approach to testing. However, to do it well requires training, practice, and discipline. Lecture presentations about exploratory testing are a poor substitute for seeing it and doing it. So ... plan to bring your laptop to this session and test along with James Bach and Jon Bach as they demonstrate exploratory testing in a live testing workshop. Participate or just observe as exploratory testing is performed in real time with play-by-play and color commentary. Learn how to bring structure to this apparently unstructured testing method. See if you can find bugs that they do not find as you test "outside the Bachs"!

Jon Bach, Quardev Laboratories
A Flight of Fancy - The Evolution of a Test Process for Spacecraft Software

The Johns Hopkins University Applied Physics Laboratory formed an embedded software group to produce space flight software. In addition to defining the process for developing and testing this software, the group had to quickly apply and adjust the new processes to a series of four spacecraft missions, starting in 2001, while resources were overextended and schedules were compressed. Brenda Clyde shares the highlights, complexities, and differences of testing these spacecraft missions over the past four years. She describes the initial test process, the problems encountered during the test phase for each mission, the resolution of those problems, and the incorporation of the changes into the next mission. Learn about the challenges the Applied Physics Laboratory faced testing embedded software and the process it now has in place for testing its next spacecraft mission.

Brenda Clyde, Johns Hopkins University
Testing with Styles

Walt Disney is famous for characters like Mickey Mouse and Donald Duck, but there were three special characters he used as thinking tools. No, not Huey, Dewey, and Louie, Donald's nephews, but three special character styles: dreamer, realist, and spoiler. Walt often participated in meetings having adopted one of these styles. We can also use these styles to guide software development, reviews and testing, user-system interactions, and system-to-system interactions. For testers, the dreamer suggests positive testing to ensure the product works, the realist suggests negative testing in case the user makes mistakes, and the spoiler suggests illogical user actions or destructive testing to focus on unusual or malicious system use.

Erik Petersen, Emprove
