|
8-Fold Path of Stress Testing Using examples and results taken from several successful stress testing projects, Dan Downing illustrates an eight-stage methodology for planning, designing, executing, and evaluating the results of an automated load test. Learn the key activities, tools, resources, costs, deliverables, techniques, and challenges for each stage of this approach. Explore load testing concepts and terminology.
|
Dan Downing, Mentora, Inc.
|
|
Advanced Test Automation Using Programming Languages Companies setting up test automation projects quickly discover that the major automation tools on the market today cannot always accomplish everything needed to support a complete test automation project. While some companies can afford to purchase multiple tools, others rely on popular programming languages. Explore how testers can use these common programming languages and techniques to build test scripts and utilities to enhance and support their automation projects.
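As a minimal illustration of the kind of utility this session describes (a hypothetical sketch, not code from the presentation), a tester might build a small test runner in a general-purpose language such as Python to fill gaps left by a commercial tool:

```python
# Hypothetical sketch: a minimal test-runner utility of the kind a tester
# might build in a general-purpose language to supplement a commercial tool.

def run_tests(tests):
    """Run a list of (name, callable) test cases and report results."""
    results = {}
    for name, test in tests:
        try:
            test()
            results[name] = "PASS"
        except AssertionError as exc:
            results[name] = f"FAIL: {exc}"
    return results

# Example: two trivial checks standing in for application tests.
def check_addition():
    assert 2 + 2 == 4, "arithmetic regression"

def check_string():
    assert "test".upper() == "TEST"

if __name__ == "__main__":
    outcomes = run_tests([("addition", check_addition),
                          ("string", check_string)])
    for name, outcome in outcomes.items():
        print(f"{name}: {outcome}")
```

A utility like this can be extended with logging, data loading, or result reporting that a packaged tool may not provide out of the box.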
|
Mary Sweeney, Data Dimensions, Inc.
|
|
Performance Testing a Firewall Firewall products pose some interesting challenges. Learn about some of the issues--and their resolutions--that arose while planning the performance and characterization testing of a network security product (firewall). Based on lessons learned during this planning process, gain an understanding of the complexity of the issues and the broad range of requirements (from marketing, engineering, and customers) placed on the characterization project.
|
Howie Dow, Compaq Computer Corporation
|
|
Capture Replay: A Foolish Test Strategy Capture replay is not an effective strategy for test automation. This popular technique seems to enable unskilled testers to generate test scripts quickly. However, there are two reasons why this strategy is generally not effective. Learn why the capture replay concept is poorly suited for the realities of software development, and why this technology is often unreliable. Discover various architectures successfully used for automated testing.
|
Bret Pettichord, Independent Consultant
|
|
STARWEST 2000: Data-Driven Automated Testing Using XML Test automation is unavoidable for Web-based applications, where reduced time-to-market is the name of the game. Data-driven test cases allow the test automation engineer to develop automation once and run it many times with different conditions to test the system. Learn why XML--the markup language for documents containing structured information--is the best way to present the test data for automated testing. Explore the advantages and disadvantages of XML-based test data.
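The data-driven approach described above can be sketched as follows (a hypothetical example; the element names, test data, and `login` function are illustrative and not taken from the presentation):

```python
# Hypothetical sketch: one automated test routine driven by XML test data,
# so new cases are added by editing data rather than code.
import xml.etree.ElementTree as ET

TEST_DATA = """
<testcases>
  <case name="valid-login">
    <input user="alice" password="secret"/>
    <expected result="success"/>
  </case>
  <case name="bad-password">
    <input user="alice" password="wrong"/>
    <expected result="failure"/>
  </case>
</testcases>
"""

def login(user, password):
    # Stand-in for the system under test.
    return "success" if password == "secret" else "failure"

def run_data_driven_tests(xml_text):
    """Execute one login test per <case> element; return name -> pass/fail."""
    results = {}
    for case in ET.fromstring(xml_text).iter("case"):
        inp = case.find("input").attrib
        expected = case.find("expected").attrib["result"]
        actual = login(inp["user"], inp["password"])
        results[case.attrib["name"]] = (actual == expected)
    return results
```

Because the test data is structured XML, the same file can be validated, versioned, and shared across test scripts.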
|
Rutesh Shah, Arsin Corporation
|
|
Management of Test Case Aging Testing continues over a software product's lifecycle, but the test plans--particularly test cases and methods--undergo an evolution and aging as they mature in character, depth, and complexity. Based on analysis of a suite of tests and methods that have matured over a ten- to twenty-year period, this presentation examines testing from its initial stages through its maturity. Explore the impact of software trouble reports and change requests, including the impact of system usage on testing.
|
Jon Hagar, Lockheed Martin
|
|
Making a Business Case for Test Process Improvement Time-consuming and marginally effective test processes are unacceptable in today's marketplace. The high demands of eBusiness applications combined with the more challenging quality requirements on security, usability, and performance require adequate and more mature test solutions. Dedicated, practice-based process improvement models provide the frame of reference for continuous improvement of test processes. This is obvious to quality and testing professionals--but how do you convince management? Martin Pol discusses ways to obtain management buy-in for test process improvement, and provides case data from his experiences in improvement projects.
|
Martin Pol, POLTEQ IT Services B.V.
|
|
Testing in the Cold "Testing in the cold" refers to those times when you feel there is no commitment to testing and people or other circumstantial factors are not being cooperative. Hans Buwalda provides forty-five tips for testing in such a situation, including issues on commitment, politics, managing expectations, dependencies, difficulty of testing, motivation of participants, and practical issues and problems. Learn how to successfully "test in the cold" when circumstances appear to be working against you.
|
Hans Buwalda, CMG TestFrame Research Center
|
|
Validation and Component-Based Development Component-based development is the practice of constructing software applications from new or existing encapsulated language-independent modules. In his presentation, David Wood details a case study on the use of opaque-box testing, coupled with code coverage and pre-/post-conditions, to provide validated software components. Learn about component-based development and how to apply it to your projects.
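One way to picture the pre-/post-condition checking mentioned above (a hypothetical sketch; the decorator and the `integer_sqrt` component are illustrative, not from the case study) is to wrap a component's operation in explicit contract checks, validating its behavior without inspecting its internals:

```python
# Hypothetical sketch: opaque-box validation of an encapsulated component
# via explicit pre-/post-condition checks around its public operation.

def with_contract(pre, post):
    """Decorator asserting `pre(*args)` before and `post(result)` after."""
    def decorate(func):
        def wrapper(*args):
            assert pre(*args), "precondition violated"
            result = func(*args)
            assert post(result), "postcondition violated"
            return result
        return wrapper
    return decorate

@with_contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def integer_sqrt(x):
    """Component under test: largest n with n*n <= x."""
    n = 0
    while (n + 1) * (n + 1) <= x:
        n += 1
    return n
```

Paired with code coverage, such contract checks give evidence that a component behaves as specified across the inputs a test suite exercises.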
|
Rob Harris, Harris Corporation and David Wood, Applied Object Engineering
|
|
A Method for Test Machine Setup for Multiple Operating Systems Software testing is becoming more involved all the time: products comprise more components; automation tools are used more often; and testing is required on more than one operating system, version, or language. In his presentation, Rick Smith addresses this problem and presents a solution that is automated, flexible, efficient, and repeatable. Learn how to improve--and simplify--software testing efficiency in your organization.
|
Rick Smith, IBM
|