Conference Presentations

Getting Started with Test Driven Development

Test-driven (or test-first) development (TDD) is an excellent method for improving the quality of software applications. It forces the programmer to focus on ensuring that the behavior of the objects at the lowest level of the system is correct, and it provides a mechanism to ensure that future source code changes do not break existing behavior. Using C++ as the example language, Robert Walsh presents an overview of test-driven development and the available TDD testing frameworks, and demonstrates a project started from scratch using TDD (a minimal sketch of the test-first workflow appears after the list below). You can apply these concepts to other languages, including Java and Visual Basic. Learn how to overcome the initial hurdles many developers experience when starting out with TDD.

  • An introduction to test-driven development using C++ as the example language
  • The testing frameworks available for TDD
  • Programming tasks that are difficult to implement using TDD
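
A minimal sketch of the test-first workflow, in plain C++, is shown below. It is not taken from the presentation: the Stack class and its tests are invented for illustration, and a bare assert stands in for the TDD frameworks the session covers. The tests are written first; the class holds just enough code to make them pass.

    // Hypothetical example: tests specify the Stack's behavior before it exists.
    #include <cassert>
    #include <vector>

    class Stack {
    public:
        bool empty() const { return items_.empty(); }
        void push(int value) { items_.push_back(value); }
        int pop() { int v = items_.back(); items_.pop_back(); return v; }
    private:
        std::vector<int> items_;
    };

    void test_new_stack_is_empty() {
        Stack s;
        assert(s.empty());                        // behavior specified before it was coded
    }

    void test_pop_returns_last_pushed_value() {
        Stack s;
        s.push(42);
        assert(s.pop() == 42);
        assert(s.empty());
    }

    int main() {
        test_new_stack_is_empty();
        test_pop_returns_last_pushed_value();
        return 0;                                 // silence means both tests passed
    }
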
Robert Walsh, EnvisionWare, Inc.
Open Source Development Tools: Coping with Fear, Uncertainty, and Doubt

Using open source tools in a development and test environment can be a big relief for your budget. However, open source remains a foreign and often frightening concept for many developers and organizations. Today, open source options are available for all types of tools used in the development process. In this session, you will gain a better understanding of the tradeoffs between choosing open source and commercial tools. In addition, you will learn about the wide variety of open source tools available for many operating environments and how to locate the most robust ones. Danny Faught, who has actively evaluated open source tools as they have evolved over the last five years, provides an honest analysis of the benefits and difficulties you may encounter using these tools for development.

  • Open source tools to consider for you and your team
Danny Faught, Tejas Software Consulting
eXtreme Architecture and Design for Test

eXtreme programming emphasizes test-first coding: you write the tests before writing the implementation code. You can apply the same approach to design when developing a complex system, including an architecture that supports testing. To be successful, systems developed with agile methods must support a high level of testability and test automation. For large distributed systems, more sophisticated testing is needed to help determine which components may be contributing to failures. For such complex systems, you should architect the system for testing rather than add testing functionality as an afterthought. Ken Pugh presents a framework that employs polymorphic-style internal and external interface patterns to ease the work of testing and debugging. He also covers adding test-only functionality, test-only outputs, and test-only logging to interfaces.
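
As a rough illustration of the polymorphic substitution idea, the generic sketch below (not Ken Pugh's framework; all names are invented) shows a test-only implementation of an internal interface that captures test-only outputs and logging for later inspection:

    #include <iostream>
    #include <string>
    #include <vector>

    class MessageTransport {                          // internal interface the system codes against
    public:
        virtual ~MessageTransport() = default;
        virtual void send(const std::string& msg) = 0;
    };

    class TcpTransport : public MessageTransport {    // production implementation (stubbed here)
    public:
        void send(const std::string&) override { /* real network I/O would go here */ }
    };

    class RecordingTransport : public MessageTransport {  // test-only implementation
    public:
        void send(const std::string& msg) override {
            sent.push_back(msg);                              // test-only output for inspection
            std::cout << "[test log] sent: " << msg << '\n';  // test-only logging
        }
        std::vector<std::string> sent;
    };

    void publish_alert(MessageTransport& transport) {  // component under test
        transport.send("ALERT: threshold exceeded");
    }

    int main() {
        RecordingTransport recorder;
        publish_alert(recorder);                       // exercise against the test double
        return recorder.sent.size() == 1 ? 0 : 1;      // verify via the recorded output
    }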

Ken Pugh, Pugh-Killeen Associates
Design Testability and Service Level Measurements into Software

Design and architecture decisions made early in a project have a profound influence on the testability of an application. Although testing is a necessary and integral part of application development, architecture and design considerations rarely account for the impact of development design decisions on testability. In addition, build vs. buy, third-party controls, open source vs. proprietary, and other similar questions can greatly affect, both positively and negatively, an organization's ability to carry out automated functional and performance testing. If the software or service is delivered to a separate set of end users who then need to perform their own testing activities, the problems compound. Join Jay Weiser to find out about the important design and architecture decisions that will ensure more efficient and effective testability of your applications.
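
One generic way to make such measurements testable by design (an illustrative sketch only, not the session's material; all names are invented) is to route service-level timings through an interface that production monitoring and automated performance tests can each implement:

    #include <chrono>
    #include <iostream>
    #include <string>

    class LatencyCollector {                           // measurement seam designed in up front
    public:
        virtual ~LatencyCollector() = default;
        virtual void record(const std::string& operation, double millis) = 0;
    };

    class ConsoleCollector : public LatencyCollector { // one possible production sink
    public:
        void record(const std::string& operation, double millis) override {
            std::cout << operation << " took " << millis << " ms\n";
        }
    };

    void process_order(LatencyCollector& metrics) {    // business operation instrumented at design time
        auto start = std::chrono::steady_clock::now();
        // ... real order-processing work would go here ...
        auto elapsed = std::chrono::duration<double, std::milli>(
            std::chrono::steady_clock::now() - start);
        metrics.record("process_order", elapsed.count());
    }

    int main() {
        ConsoleCollector metrics;
        process_order(metrics);    // a performance test could supply its own collector instead
        return 0;
    }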

Jay Weiser, WorkSoft
Patterns for Writing Effective Use Cases

Use cases are a wonderfully simple concept: document a system's functional requirements by writing down scenarios about how using the system delivers value to its actors. However, writing effective use cases is more difficult than expected because you frequently must deal with difficult questions such as scope, the level of detail needed for different people and projects, how to describe external interfaces, stored data, and more. You need a source of objective criteria for judging use case quality and effectiveness. Fill this critical information gap with a pattern language that provides simple, elegant, and proven solutions to common problems in use case development. Take away these use case patterns and profit from the knowledge and experience of other successful use case writers. And develop a new vocabulary for describing the properties of quality use cases.

  • The "signs of quality" and properties of a good use case
Steve Adolph, WSA Consulting Inc.
Customer Focused Business Metrics throughout the SDLC

Focusing on the customer throughout the software development lifecycle (SDLC) is difficult to do. Teams often become mired in technical problems, internal resource limitations, or other issues. Following the customer mantra of "Faster! Better! Cheaper!", Steve Wrenn offers measurement and process techniques that he has used to deliver projects on time, on budget, and, most importantly, meeting customers' needs. By focusing on the development cycle from the outside in, his organization provides business-based metrics dashboards to monitor and adjust the project plan throughout the development project. Find out how this performance dashboard helps the team and the customer stay on course and drive directly to the targeted results. Discover an approach to determine what customers really want and match product development to customer expectations.

Steve Wrenn, Liberty Mutual Insurance Information Systems
In the Beginning ... Testing Web Services (.NET and Otherwise)

A Web service provides an interface for sending and receiving information, but it doesn't have a user interface. Instead, everything is done through requests to the service's methods. So how does one go about testing such an interface? Programmatically, that's how. In this presentation you'll be introduced to the concept of Web services and how they work. Tom Arnold even walks you through creating tests using Perl, Python, and VB-like languages (a minimal sketch of the idea appears after the list below). Anyone new to Web services testing is certain to find this presentation a crucial first step toward getting started down the right path.

  • Learn how to work with a Web service interface
  • Obtain approaches to writing scripts to exercise a Web service's API
  • Look at a completed harness for testing Web services
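
For a rough sense of what exercising a Web service programmatically can look like, here is a minimal C++ sketch using libcurl (the session itself builds its scripts in Perl, Python, and VB-like languages; the endpoint URL and expected response text are hypothetical placeholders):

    #include <curl/curl.h>
    #include <cassert>
    #include <string>

    // libcurl write callback: accumulate the response body into a std::string.
    static size_t collect(char* data, size_t size, size_t nmemb, void* out) {
        static_cast<std::string*>(out)->append(data, size * nmemb);
        return size * nmemb;
    }

    int main() {
        std::string body;
        CURL* curl = curl_easy_init();
        assert(curl != nullptr);

        // Invoke a (hypothetical) service method with a plain HTTP request.
        curl_easy_setopt(curl, CURLOPT_URL,
                         "http://example.com/StockService.asmx/GetQuote?symbol=MSFT");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, collect);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);
        CURLcode rc = curl_easy_perform(curl);

        long status = 0;
        curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &status);
        curl_easy_cleanup(curl);

        // The test itself: verify the transport succeeded and the payload looks right.
        assert(rc == CURLE_OK);
        assert(status == 200);
        assert(body.find("<Quote") != std::string::npos);
        return 0;
    }
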
Thomas Arnold, Xtend Development, Inc.
Using Test Oracles in Automation

Software test automation is often a difficult and complex process. The most familiar aspects of test automation are organizing and running test cases and capturing and verifying test results. A set of expected results is needed for each test case in order to check the test results, and verification against these expected results is often done using a mechanism called a test oracle. This talk describes the use of oracles in automated software verification and validation, covering several relevant characteristics of oracles along with their advantages, disadvantages, and implications for test automation (a minimal oracle example appears after the list below).

  • Learn why evaluating automated test results is not easy
  • Use test oracles as critical factors in making useful automated tests
  • Learn useful models for automated tests and test oracles
  • Learn five strategies for automated test oracles
  • See examples where different oracles have been used
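
One common oracle strategy is a trusted reference implementation that computes the expected result for automatically generated inputs. The generic sketch below is not taken from the presentation; the sort function under test is invented for illustration, and std::sort serves as the oracle:

    #include <algorithm>
    #include <cassert>
    #include <cstddef>
    #include <cstdlib>
    #include <vector>

    // System under test: a deliberately simple insertion sort.
    std::vector<int> my_sort(std::vector<int> v) {
        for (std::size_t i = 1; i < v.size(); ++i)
            for (std::size_t j = i; j > 0 && v[j] < v[j - 1]; --j)
                std::swap(v[j], v[j - 1]);
        return v;
    }

    int main() {
        for (int trial = 0; trial < 1000; ++trial) {
            std::vector<int> input(50);
            for (int& x : input) x = std::rand() % 100;   // generated test input

            std::vector<int> expected = input;
            std::sort(expected.begin(), expected.end());  // the oracle supplies expected results

            assert(my_sort(input) == expected);           // the verdict comes from the oracle
        }
        return 0;
    }
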
Douglas Hoffman, Software Quality Methods LLC
Avoiding Test Automation Pitfalls and Guaranteeing Return on Investment

Companies that have attempted to implement test automation for functional testing have discovered, usually the hard way, that it isn't easy. This presentation takes an in-depth look at the specific pitfalls companies encounter when implementing automated functional testing and offers real-world, proven best practices to avoid problems and guarantee long-term success. You'll learn about an ROI model successfully used by companies for automated testing efforts (a simplified illustration appears after the list below) and learn to identify the key areas on which to focus your test automation.

  • Utilize an ROI model for test automation planning and results measurement
  • Tips on how to avoid test automation failure
  • Find out when not to automate
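
For illustration only, here is a deliberately simplified break-even calculation of the kind an ROI model might formalize (the figures and the formula are made-up assumptions, not the model presented in the session):

    #include <iostream>

    int main() {
        // Made-up figures for illustration.
        const double manual_cost_per_run    = 4000.0;   // labor to run the regression suite by hand
        const double automation_build_cost  = 30000.0;  // one-time scripting effort
        const double automated_cost_per_run = 500.0;    // maintenance and review per automated run

        for (int runs = 1; runs <= 12; ++runs) {
            double manual    = manual_cost_per_run * runs;
            double automated = automation_build_cost + automated_cost_per_run * runs;
            double roi       = (manual - automated) / automated;  // net savings relative to investment
            std::cout << "After " << runs << " runs, ROI = " << roi << '\n';
        }
        return 0;
    }
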
Jeff Tatelman, TurnKey Solutions Corp
People Issues in Test Automation

This session uncovers some of the key people issues you'll likely be confronted with as a team leader or test manager when using test automation. Examine alternative team structures, automation roles, common problems, and tips for obtaining fast payback.

Lloyd Roden, Grove Consultants
