Conference Presentations

Automated Testing of Packaged Applications Upgrades

If you are responsible for Oracle Applications implementations, you probably understand the complexity and quality challenges of 11.5.x upgrades. You may also be struggling with how best to use automated testing for these upgrades. Find out about Mentora Group's experiences tackling these issues and the criteria they have developed for a successful project. Walk away from this session with specific insights for testing Oracle 11i Financials and Manufacturing, as well as general techniques applicable to testing any packaged ERP project.

Dan Downing, Mentora Inc
Code Reviews - An Improved, Lightweight Process

Formal code inspections work for some organizations, but are often considered too structured for many others. Thirumalesh Bhat discusses the pitfalls of formal inspections and presents a lightweight code review process that works at Microsoft. Learn the value of code reviews to improve the quality of both the test code and production code. Find out why test teams at Microsoft participate in all code reviews. Start a code review process in your organization, or improve the one that's in place now.

Thirumalesh Bhat, Microsoft
Combating the Test Schedule Killers

By focusing on three simple but often overlooked methods, David Petrella’s test team consistently stays on schedule and delivers the testing results his projects expect. Learn how to develop and employ Risk Assessment documents to define the scope of testing and identify areas that cannot be tested with available resources. Publish an Entrance Criteria document that defines what resources (hardware, software, data, etc.) are needed for a successful test project. Then, use the Code Freeze concept to ensure that the software does not keep changing right up to the day of delivery. Take away specific examples and helpful templates based on David’s experience using these methods on numerous projects.

David Petrella, SysTest Labs
Testing Dialogues: Technical Issues

Test professionals face a myriad of issues with immature development technologies, changing systems environments, increasingly complex applications, and 24/7 reliability demands. We must choose the right methodology and the best testing techniques to meet these challenges, all with a limited set of tools and not enough time. In this double-track session, you'll be able to ask for help from your peers, share your expertise with the group, and develop some new approaches to your biggest challenges. Johanna Rothman and Esther Derby facilitate this session, focusing on topics such as model-based testing, security testing, testing without requirements, testing in the XP/Agile world, and configuration management. Discussions are structured in a framework so that participants receive a summary of their work product after the conference.

Facilitated by Esther Derby and Johanna Rothman
Improving Testing In A Small Informal Organization

Many smaller organizations are looking for ways to improve their testing processes and approaches. They do not need complex assessment reports, change task forces, extensive implementation of guidelines, or expensive training programs. Solutions for smaller organizations often involve gradually upgrading test awareness and test process performance. Even introducing a supporting template and providing a little coaching helps considerably. Martin Pol shares his experiences with process improvement and outlines ways to improve testing in your small organization.

Martin Pol, POLTEQ IT Services BV
Configuration Test Automation using Virtual Machines

There are more than 7,000 possible configurations of operating systems, browsers, screen resolutions, and other unique characteristics in today’s computer environments. Learn about a flexible automation framework for functional configuration testing based on an approach developed by Plaxo, Inc. This approach runs multiple virtual operating systems, each with a pre-installed commercial automation tool, on a single Intel-based computer. The resulting framework scaled easily to new configurations under test and allowed the team to test more than 30 different configurations on one PC.

Vladimir Belorusets, Plaxo
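
The abstract above describes the framework only at a high level. As a rough illustration (not Plaxo's actual implementation), the Python sketch below enumerates configuration combinations and dispatches each one to a virtual machine; the configuration values and the vmctl command line are invented placeholders for whatever VM manager and automation tool you actually use.

    # Minimal sketch of configuration test automation across virtual machines.
    # The OS/browser/resolution values and the "vmctl" CLI are hypothetical.
    import itertools
    import subprocess

    OPERATING_SYSTEMS = ["win2000", "winxp", "linux-rh9"]
    BROWSERS = ["ie6", "mozilla", "opera7"]
    RESOLUTIONS = ["800x600", "1024x768"]

    def run_in_vm(os_name, browser, resolution):
        """Start the named VM, run the automation tool's suite inside it, stop the VM."""
        vm = f"{os_name}-{browser}"
        subprocess.run(["vmctl", "start", vm], check=True)   # hypothetical command
        result = subprocess.run(
            ["vmctl", "exec", vm, "run-tests", "--resolution", resolution]
        )
        subprocess.run(["vmctl", "stop", vm], check=True)
        return result.returncode == 0

    def main():
        failures = []
        # Iterate over every combination of OS, browser, and screen resolution.
        for os_name, browser, resolution in itertools.product(
            OPERATING_SYSTEMS, BROWSERS, RESOLUTIONS
        ):
            if not run_in_vm(os_name, browser, resolution):
                failures.append((os_name, browser, resolution))
        print(f"{len(failures)} failing configuration(s): {failures}")

    if __name__ == "__main__":
        main()
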
Establishing Bug Priority and Severity: The Elevator Parable

"How do you know when you're finished?" A key process in this assessment is making good bug severity and priority assignment. Robert Sabourin presents a fun, interactive parable that teaches an important lessonx0151assigning bug priority and severity is a business decision, not a technical one. By having clear rules for how you assign severity to bug and applying them consistent, you'll go a long way toward making the right business decisions. Learn how business context impacts bug priority and severity, and review real-world prioritization schemes used by leading organizations.

Robert Sabourin, AmiBug.com Inc
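
To make the idea of clear, consistently applied rules concrete, here is a small illustrative Python sketch; the severity categories, thresholds, and priority labels are invented for this example and are not Sabourin's scheme.

    # Illustrative only: one possible way to encode explicit severity/priority rules.
    # The categories and thresholds below are made up for this sketch.

    SEVERITY_RULES = {
        "data loss or corruption": 1,              # most severe technical impact
        "crash or hang": 2,
        "incorrect result, workaround exists": 3,
        "cosmetic": 4,
    }

    def assign_priority(severity, affects_release_goal, customers_blocked):
        """Priority is a business decision: technical severity is only one input."""
        if affects_release_goal or customers_blocked > 100:
            return "P1 - fix before release"
        if severity <= 2:
            return "P2 - fix in this release if schedule allows"
        return "P3 - defer"

    bug_severity = SEVERITY_RULES["crash or hang"]
    print(assign_priority(bug_severity, affects_release_goal=False, customers_blocked=250))
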
STARWEST 2003: Surviving The Top Ten Challenges of Software Test Automation

Although test automation has been around for years, many organizations still have difficulty making automated testing a reality. Even though organizations see the value in automated test tools, research shows that most perform more manual testing than automated testing. This presentation examines the big challenges in test automation and describes ways that successful organizations have overcome them. Learn how to apply these lessons in your organization to add value to your testing efforts. Topics include how to organize your team for test automation, how to find the right tools, how to get and keep management support, creative ways to train people to use test tools, how to manage people's expectations, how to control and maintain automated testware, and how to integrate tool usage into existing test processes.

Randy Rice, Rice Consulting Services Inc
Testing Web Services - A Big, Big Problem

Because Web services are almost completely dynamic, there is an increased chance of errors in applications using these services. In addition, applications often are more closely tied to business transactions, increasing the business risks whenever those errors do occur. By design, Web services allow access from anywhere in the world and provide their services in real time. Because a client's behavior cannot be controlled, Web services are vulnerable to many unexpected uses and unanticipated inputs, all of which can impact the functionality of the service. For testing Web services, traditional black-box approaches are not enough. Adam Kolawa explains how to test Web services by adopting white-box testing practices that look inside the software at its design, code, data communications, and control flows.

Adam Kolawa, ParaSoft Corporation
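
As a rough sketch of probing a service with unanticipated inputs, the Python example below sends invalid payloads to a hypothetical endpoint and reports how the service responds. The URL and payloads are invented, and this shows only the black-box client side that the white-box practices described above would supplement.

    # Probe a Web service with unexpected inputs; the endpoint URL is hypothetical.
    import json
    import urllib.error
    import urllib.request

    ENDPOINT = "http://example.com/api/order"   # stand-in for the service under test

    UNEXPECTED_INPUTS = [
        {"quantity": -1},        # negative quantity
        {"quantity": "ten"},     # wrong type
        {},                      # missing required fields
        {"quantity": 10**9},     # absurdly large value
    ]

    for payload in UNEXPECTED_INPUTS:
        request = urllib.request.Request(
            ENDPOINT,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                # A 2xx response to malformed input is suspicious: flag it for review.
                print(f"ACCEPTED (status {response.status}): {payload}")
        except urllib.error.HTTPError as err:
            # A 4xx rejection is the expected, well-behaved outcome.
            print(f"rejected with {err.code}: {payload}")
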
Improving Requirements Through Testing

Because testing, by some definitions, is ensuring that the observed results match the expected results, we often are highly dependent on the quality of the requirements when we test. Unfortunately, most software projects do not have sufficient requirements that pre-determine exactly what the results of all of the tests should be. So, what should testers do? In this talk, Richard Bender addresses the testing techniques you can use to improve the quality of requirements so that they are accurate, complete, unambiguous, and consistent. Learn how to validate requirements against objectives, how to extract the real requirements from domain experts, how to perform initial ambiguity reviews, and how to create a cause-effect graph to chart logical consistency within the requirements.

Richard Bender, Bender & Associates
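
To give a flavor of the cause-effect graphing mentioned above (the banking requirement below is invented purely for illustration), causes and effects can be expressed as Boolean relations and every cause combination checked for contradictions before any code is written.

    # Toy cause-effect check for an invented requirement:
    #   "A withdrawal is approved only if the card is valid AND the PIN is correct
    #    AND the amount does not exceed the balance."
    from itertools import product

    def effect_approved(card_valid, pin_correct, within_balance):
        return card_valid and pin_correct and within_balance

    def effect_card_retained(card_valid, pin_correct, within_balance):
        # Invented rule: retain the card when the PIN is wrong on a valid card.
        return card_valid and not pin_correct

    for causes in product([True, False], repeat=3):
        approved = effect_approved(*causes)
        retained = effect_card_retained(*causes)
        # Consistency check: no cause combination may trigger both
        # "approve withdrawal" and "retain card".
        assert not (approved and retained), f"Contradictory effects for causes {causes}"
        print(causes, "->", {"approved": approved, "card_retained": retained})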
