Conference Presentations

STARWEST 2005: Apprenticeships: A Forgotten Concept in Testing Training

The system of apprenticeship was first developed in the late Middle Ages. The uneducated and inexperienced were employed by a master craftsman in exchange for formal training in a particular craft. So why does apprenticeship seldom happen within software testing? Do we subconsciously believe that just about anyone can test software? Join Lloyd Roden and discover what apprenticeship training is and, even more importantly, what it is not. Learn how this practice can be easily adapted to suit software testing. Find out about the advantages and disadvantages of several apprenticeship models: Chief Tester, Hierarchical, Buddy, and Coterie. Drawing on his personal experiences, Lloyd shows how projects will benefit immediately from the rebirth of the apprenticeship system in your test team.

Lloyd Roden, Grove Consultants
STARWEST 2005: Lightning Talks: A Potpourri of 5-Minute Presentations

Lightning Talks are nine five-minute talks in a fifty-minute time period. Lightning Talks represent a much smaller investment of time than track speaking and offer the chance to try conference speaking without the heavy commitment. Lightning Talks are an opportunity to present your single biggest bang-for-the-buck idea quickly. Use this as an opportunity to give a first-time talk or to present a new topic for the first time. Maybe you just want to ask a question, invite people to help you with your project, boast about something you did, or tell a short cautionary story. These things are all interesting and worth talking about, but there might not be enough to say about them to fill a full track presentation. For more information on how to submit your Lightning Talk, visit www.sqe.com/lightningtalks.asp. Hurry! The deadline for submissions is October 1, 2005.

Erik Petersen, Emprove
Agile Testing of Embedded, Real-Time Systems

Until now, Agile development and testing concepts have been aimed largely at Web sites, interactive applications, and software packages where short production cycles are a must. With care, many of these same testing practices can work on embedded systems, in which long development cycles, no user interface, and regulatory requirements are the norm. Jon Hagar examines Agile testing practices you can implement within both hardware and software product domains. Learn to define the "user" for an embedded application, determine how much documentation is enough, and identify ways to perform early testing while the hardware remains in flux. Find out how to move from a more traditional embedded testing structure to embrace Agile concepts in your test practices.

Jon Hagar, Lockheed Martin
Open Source Test Automation Frameworks

Open source software has come a long way in the past few years. However, for automated testing there still are not many ready-made solutions. Testers often must spend their time building a test automation framework rather than working on test cases. Allen Hutchison describes the elements of an automated test framework and demonstrates a framework that you can quickly assemble from several open source software tools. He then explains how to put the pieces together with a scripting language such as Perl. Once you build the framework, you can improve and reuse it in future test projects. At the end of the presentation, Google will release the described framework as a new open source project that you can begin using immediately.
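
To make the idea of "glue" scripting concrete, here is a minimal sketch of the kind of driver layer such a framework typically provides: discover test scripts by naming convention, run each one, and summarize the results. It is written in Python for illustration (the talk mentions Perl), and the directory layout and naming convention are assumptions of this sketch, not details of the framework released at the presentation.

```python
# Minimal "glue" layer sketch for a test automation framework:
# discover executable test scripts, run each one, and summarize results.
# The tests/test_*.py convention is an assumption for illustration only.
import glob
import subprocess
import sys

def discover_tests(pattern="tests/test_*.py"):
    """Find test scripts by filename convention."""
    return sorted(glob.glob(pattern))

def run_test(path):
    """Run one test script as a subprocess; exit code 0 means pass."""
    result = subprocess.run([sys.executable, path], capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def main():
    results = []
    for path in discover_tests():
        passed, output = run_test(path)
        results.append((path, passed))
        print(f"{'PASS' if passed else 'FAIL'}  {path}")
        if not passed:
            print(output)
    failed = [p for p, ok in results if not ok]
    print(f"\n{len(results) - len(failed)} passed, {len(failed)} failed")
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(main())
```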

Allen Hutchison, Google
What the US Marine Corps Taught Me About Test Management

When we think of teams and teamwork, who epitomizes the team concept more than the US Marine Corps? From the Halls of Montezuma to the shores of Tripoli, or to the hallways of your company, success requires teamwork. Are you building, overhauling, or just wanting to improve your test team? Looking for new ideas and approaches to tackle testing obstacles? A crucial part of success in testing is the motivation and effectiveness of the test team. Even a team of the best testers in the world fails if they are not working together to accomplish their mission. Former USMC Sergeant Sean Buck shares techniques for building motivated and successful teams based on time-tested USMC principles. Analyze what successful teams do right, and apply it to your own test teams.

Sean Buck, The Capital Group Companies Inc
Globalization Testing

Globalization testing encompasses both internationalization testing and localization testing. Localization testing focuses on system details that must be modified for a particular location, region, or culture. These include language, appropriate idioms, currency formats, alphabetic sort order, left-to-right vs. right-to-left text display, date/time formats, and the appropriateness of clip art and photographs. The necessity of testing these is generally well understood. Terry Shidner points out that before localization testing is performed, the system must first undergo internationalization testing. This type of testing is important to verify the readiness of the system for the localization work. For example, has the system been implemented without embedded text strings? Has all text input, processing, and display been implemented with Unicode?
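
One of the internationalization checks mentioned above, looking for embedded text strings, lends itself to simple automation. The following sketch scans source files for hard-coded, user-facing string literals so they can be moved to external resource bundles; the regex heuristic, file extensions, and directory layout are assumptions for illustration, not part of the session material.

```python
# Illustrative internationalization check: flag hard-coded string literals
# in source files so they can be externalized for localization.
# The regex heuristic and file extensions are assumptions of this sketch.
import re
from pathlib import Path

STRING_LITERAL = re.compile(r'"([^"\\]{3,})"')  # naive: quoted text of 3+ chars

def find_embedded_strings(root=".", extensions=(".java", ".c", ".cpp")):
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix not in extensions:
            continue
        text = path.read_text(encoding="utf-8", errors="replace")
        for lineno, line in enumerate(text.splitlines(), 1):
            for match in STRING_LITERAL.finditer(line):
                findings.append((str(path), lineno, match.group(1)))
    return findings

if __name__ == "__main__":
    for filename, lineno, literal in find_embedded_strings():
        print(f"{filename}:{lineno}: embedded string {literal!r}")
```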

Terry Shidner, Symbio
Test-Driven Development - It's Not Just for Unit Testing

Test-driven development (TDD) is a new approach to software construction in which developers write automated unit tests before writing the code. These automated tests are always rerun after any code changes. Proponents assert that TDD delivers software that is easier to maintain and of higher quality than software built with traditional development approaches. Based on experiences gained from real-world projects employing TDD, Peter Zimmerer shares his view of TDD's advantages and disadvantages and how the TDD concept can be extended to all levels of testing. Learn how to use TDD practices that support preventive testing throughout development and result in new levels of cooperation between developers and testers. Take away practical approaches and hints for introducing and practicing test-driven development in your organization.
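
As a small illustration of the test-first rhythm described above, the unit tests below are written before the code they exercise and rerun after every change. The parse_version function is a hypothetical example chosen for this sketch, not an example from the presentation.

```python
# Tiny test-first illustration: the tests below are written before the
# implementation and rerun after every code change.
import unittest

def parse_version(text):
    """Split a dotted version string like '2.5.1' into a tuple of ints."""
    return tuple(int(part) for part in text.strip().split("."))

class ParseVersionTest(unittest.TestCase):
    def test_three_part_version(self):
        self.assertEqual(parse_version("2.5.1"), (2, 5, 1))

    def test_ignores_surrounding_whitespace(self):
        self.assertEqual(parse_version(" 1.0 \n"), (1, 0))

if __name__ == "__main__":
    unittest.main()
```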

Peter Zimmerer, Siemens
STARWEST 2005: Planning for Successful Test Automation

You have the automation tool. You have the right technical skills. You have the application experts at your disposal. It's time to jump in and start coding! Or is it? Many well-intentioned test automation efforts fail due to a lack of planning. Steve Walters describes his practical approach for developing an overall test automation strategy. Learn how to plan for automation success, select the right tests to automate, and prioritize them for a faster return on investment. By quickly eliminating poor automation candidates and using Steve's scorecard to assess the value of automating a test, you will be on the right track to achieving your automation goals. Take away a quantitative approach for deciding what to automate and what not to automate, along with the steps to develop a realistic plan and timeline for getting the job done.
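
To show what a quantitative scoring approach can look like in practice, here is a hypothetical automation-value scorecard. The criteria, weights, and cutoff are invented for this sketch and are not the scorecard presented in the session.

```python
# Hypothetical automation-value scorecard: criteria, weights, and the cutoff
# are illustrative assumptions, not the scorecard described in the session.
WEIGHTS = {
    "execution_frequency": 3,  # how often the test will be run
    "manual_effort": 2,        # time saved per manual run
    "stability": 2,            # how stable the feature under test is
    "automation_cost": -2,     # effort to build and maintain the script
}

def automation_score(ratings):
    """Weighted sum of 1-5 ratings for each criterion."""
    return sum(WEIGHTS[name] * ratings[name] for name in WEIGHTS)

candidates = {
    "login regression": {"execution_frequency": 5, "manual_effort": 4,
                         "stability": 5, "automation_cost": 2},
    "one-off data migration check": {"execution_frequency": 1, "manual_effort": 3,
                                     "stability": 2, "automation_cost": 4},
}

for name, ratings in sorted(candidates.items(),
                            key=lambda item: automation_score(item[1]),
                            reverse=True):
    score = automation_score(ratings)
    verdict = "automate" if score >= 20 else "keep manual"
    print(f"{name}: score {score} -> {verdict}")
```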

Steve Walters, Dell
STARWEST 2005: Testing Dialogues - Technical Issues

Is there an important technical test issue bothering you? Or, as a test engineer, are you looking for some career advice? If so, join experienced facilitators Esther Derby and Elisabeth Hendrickson for "Testing Dialogues - Technical Issues." Practice the power of group problem solving and develop novel approaches to solving your big problem. This double-track session takes on technical issues such as automation challenges, model-based testing, testing immature technologies, open source test tools, testing Web services, and career development. You name it! Share your expertise and experiences, learn from the challenges and successes of others, and generate new topics in real time. Discussions are structured in a framework so that participants receive a summary of their work product after the conference.

Esther Derby, Esther Derby Associates Inc
Aligning Testing Strategies with Corporate Goals

When developing a testing strategy, test managers normally review the business case for the project, study the new requirements, and consider what they know about the system under test. By also including a review of your organization's mission, values, and corporate goals, you will immediately stand out among your peers and at the same time improve the business value of testing. Stewart Noakes has worked with test managers at both large and small companies to help them align test strategies with corporate goals. Using case examples, Stewart describes how they used this process to guide their testing approach and demonstrates how this approach significantly increases the tangible and intangible ROI on testing. Learn to use your company's corporate goals to help you make the right decisions about what to test, how much to test, and, importantly, when to stop testing.

Stewart Noakes, Transition Consulting Ltd
