|
STAREAST 2006: Branch Out Using Classification Trees for Test Case Design Classification trees are a structured, visual approach to identifying and categorizing equivalence partitions for test objects and documenting test requirements so that anyone can understand them and quickly build test cases. Join Julie Gardiner to look at the fundamentals of classification trees and how they can be applied in both traditional and agile test and development environments. Using examples, Julie shows you how to use the classification tree technique, how it complements other testing techniques, and its value at every stage of testing. She demonstrates a classification tree editor, one of the free and commercial tools now available to aid in building, maintaining, and displaying classification trees.
- How to develop classification trees for test objects
- The benefits and rewards of using classification trees
- When and when not to use classification trees
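The technique above can be sketched in code. In a classification tree, each classification (an aspect of the test object) is split into equivalence classes, and a test case combines one class from each classification. The tree below is a purely hypothetical "file upload" example, not one from the session:

```python
from itertools import product

# Hypothetical classification tree for a "file upload" test object:
# each key is a classification, its values are the equivalence
# classes (leaves) identified for that aspect of the input.
tree = {
    "file size": ["empty", "small", "at limit", "over limit"],
    "file type": ["supported", "unsupported"],
    "connection": ["stable", "drops mid-upload"],
}

# A test case picks exactly one class per classification; the full
# combination table is the Cartesian product of the classes.
test_cases = [dict(zip(tree, combo)) for combo in product(*tree.values())]

print(len(test_cases))  # 4 * 2 * 2 = 16 candidate test cases
print(test_cases[0])
```

In practice a tester would prune this product down to a smaller set (for example, pairwise coverage), which is where a classification tree editor earns its keep.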
|
Julie Gardiner, QST Consultants Ltd.
|
|
Tester Skills for Moving Your Automation to the Next Level Job interviews for test automation engineers are often limited to, "How proficient are you with the tool vendor XYZ's scripting language?" This approach does little to help the hiring manager choose those individuals who are or will become highly skilled automation professionals. As a test engineer, you will need to acquire specialized knowledge and tool-independent capabilities to become a test automation expert. Join Dion Johnson as he identifies the core set of tool-independent competencies required of a successful automated software test engineer: automation framework design, programming and debugging skills, object model concepts, and automation methods based on the required quality attributes. Learn how you, as a hiring manager, can identify these skills, or find out how you personally can improve your skills to become a true test automation expert.
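One of the competencies the session names, automation framework design, can be illustrated with a minimal keyword-driven framework sketch. All names here are illustrative assumptions, not part of the session material; the point is that keywords map to plain functions, so test data stays independent of any vendor scripting language:

```python
# Registry mapping keyword names to action functions.
ACTIONS = {}

def keyword(name):
    """Decorator that registers a function as a test keyword."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@keyword("enter_text")
def enter_text(field, value, state):
    # Stand-in for driving a UI control through some tool's API.
    state[field] = value

@keyword("verify")
def verify(field, expected, state):
    assert state.get(field) == expected, f"{field!r} != {expected!r}"

def run(table):
    """Execute a test expressed as (keyword, arg1, arg2) rows."""
    state = {}
    for kw, arg1, arg2 in table:
        ACTIONS[kw](arg1, arg2, state)
    return state

result = run([
    ("enter_text", "username", "jdoe"),
    ("verify", "username", "jdoe"),
])
```

Because the test table contains no tool calls, swapping automation tools means rewriting only the keyword implementations, not the tests.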
|
Dion Johnson, DiJohn Innovative Consulting, Inc.
|
|
The Software Vulnerability Guide: Uncut and Uncensored Warning: This talk contains graphic examples of software failure . . . not suitable for the faint of heart. This "no holds barred" session arms testers with what they really need to know about finding serious security vulnerabilities. Herbert Thompson takes you on an illustrated tour of the top twelve security vulnerabilities in software and shows you how to find these flaws efficiently. Each vulnerability is brought to life through a live exploit followed by a look at the testing technique that would have exposed the bug. Testers and test managers will leave with a keen awareness of the major vulnerability types and the knowledge and insight to fundamentally improve the security of the applications they support and test.
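The testing technique behind sessions like this can be sketched as a data-driven hostile-input test. This example is not from the talk; the handler and payloads are assumptions chosen to show the pattern of feeding classic attack strings to an input handler and checking that nothing dangerous comes back:

```python
import html

def render_comment(user_input: str) -> str:
    # Naive handler under test; html.escape is the mitigation
    # whose presence the test verifies.
    return "<p>" + html.escape(user_input) + "</p>"

HOSTILE_INPUTS = [
    "' OR '1'='1",                 # SQL injection probe
    "<script>alert(1)</script>",   # cross-site scripting probe
    "%s%s%s%n",                    # format-string probe
    "A" * 10_000,                  # oversized-input probe
]

for payload in HOSTILE_INPUTS:
    out = render_comment(payload)
    # The payload must not reach the page as live markup.
    assert "<script>" not in out, "unescaped markup reached the page"
```

Each of the "top twelve" vulnerability classes has an analogous cheap probe; the craft is knowing which probe exposes which class.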
|
Herbert Thompson, Security Innovation LLC
|
|
STAREAST 2006: Testing Dialogues - Management Issues As a test manager, are you struggling at work with a BIG test management issue or a personnel issue? If so, this session is for you. "Testing Dialogues--Management Issues" is a unique platform for you to share with and learn from test managers who have come to STAREAST from around the world. Facilitated by Esther Derby and Johanna Rothman, this double-track session takes on management issues--career paths for test managers, hiring, firing, executive buy-in, organization structures, and process improvement. You name it! Share your expertise and experiences, learn from others’ challenges and successes, and generate new topics in real time. Discussions are structured in a framework so that participants will receive a summary of their work product after the conference.
|
Facilitated by Esther Derby and Johanna Rothman
|
|
Testing: The Big Picture If all testers put all their many skills in a pot, surely everyone would come away with something new to try out. Every tester can learn something from other testers. But can a tester learn something from a ski instructor? There is much to gain by examining and sharing industry best practices, but often much more can be gained by looking at problem-solving techniques from beyond the boundaries of the Testing/QA department. Presented as a series of analogies, Brian Bryson covers the critical success factors for organizations challenged with the development and deployment of quality software applications. He takes strategies and lessons from within and beyond the QA industry to provide you with a new perspective on addressing the challenges of quality assurance.
|
Brian Bryson, IBM Rational Software
|
|
Build Rules: A Management System for Complex Test Environments The interaction of many software components makes testing today's software solutions increasingly complex. The problem becomes especially difficult when the solution includes combinations of hardware, software, and multiple operating systems. To automate this process, Steven Hagerott's company developed "Build Rules," a Web-based application with inputs from their build management and test execution systems. Using logical rules about the builds, test engineers define the characteristics of the build solution points. To deliver the latest and greatest builds that meet the characteristics defined for each solution point, the system dynamically translates these rules into server-side nested SQL queries. Learn how their efficiency and accuracy have improved significantly, allowing test engineers to stay on track with many different build combinations and to communicate results to outside departments and customers.
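The rules-to-SQL translation described above can be sketched roughly as follows. The table and column names are assumptions for illustration, not the actual Build Rules schema; the shape of the query (an outer filter plus a nested subquery picking the latest matching build) is the idea:

```python
def rule_to_sql(rules):
    """Translate (column, operator, value) characteristics into a
    parameterized SQL query selecting the newest matching build.

    Hypothetical schema: a `builds` table with a monotonically
    increasing `build_id` plus one column per characteristic.
    """
    where = " AND ".join(f"{col} {op} %s" for col, op, _ in rules)
    params = [value for _, _, value in rules]
    sql = (
        "SELECT build_id FROM builds "
        f"WHERE {where} "
        "AND build_id = ("
        f"SELECT MAX(build_id) FROM builds WHERE {where})"
    )
    # The WHERE clause appears twice, so the parameters do too.
    return sql, params + params

sql, params = rule_to_sql([("os", "=", "linux"), ("passed_smoke", "=", True)])
```

Keeping values as placeholders rather than splicing them into the string is what lets a system like this run user-defined rules safely on the server side.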
|
Steve Hagerott, Engenio Storage Group, LSI Logic Corporation
|
|
Progressive Performance Testing: Adapting to Changing Conditions An inflexible approach to performance testing is a prelude to disaster. "What you see at the start isn't always what you get in the end," says Jeff Jewell. Based on his experience performance testing applications on numerous consulting projects, Jeff demonstrates the challenges you may face testing your applications and how to overcome these obstacles. Examples from these projects show how changing project conditions and information discovered in early tests caused the testing approach to change dramatically. Find out how hardware configuration, hardware performance, script variations, bandwidth, monitoring, and randomness can all affect the measurement of performance.
|
Jeff Jewell, ProtoTest LLC
|
|
Test Metrics in a CMMI Level 5 Organization As a CMMI® Level 5 company, Motorola Global Software Group is heavily involved in software verification and validation activities. Shalini Aiyaroo, senior software engineer at Motorola, shows how tracking specific testing metrics can serve as key indicators of the health of testing and how these metrics can be used to improve your testing practices. Find out how to track and measure phase screening effectiveness, fault density, and test execution productivity. Shalini describes the use of Software Reliability Engineering (SRE) and fault prediction models to measure test effectiveness and take corrective actions. By performing orthogonal defect classification (ODC) and escaped defect analysis, the group has found ways to improve test coverage.
CMMI® is a registered trademark of Carnegie Mellon University.
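Two of the metrics named above can be sketched using common industry definitions. Motorola's exact formulas are not given in the abstract, so treat these as conventional approximations rather than the group's actual calculations:

```python
def fault_density(defects_found, size_kloc):
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / size_kloc

def phase_screening_effectiveness(found_in_phase, escaped_from_phase):
    """Fraction of a phase's defects caught before the phase ended;
    escaped defects are those discovered in later phases or the field."""
    total = found_in_phase + escaped_from_phase
    return found_in_phase / total if total else 0.0

print(fault_density(45, 30))                  # 1.5 defects per KLOC
print(phase_screening_effectiveness(80, 20))  # 0.8
```

Tracked release over release, a falling screening effectiveness or a rising fault density is the kind of "health of testing" indicator the session describes.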
|
Shalini Aiyaroo, Motorola Malaysia Sdn. Bhd.
|
|
STAREAST 2006: Apprenticeships: A Forgotten Concept in Testing The system of apprenticeship was first developed in the late Middle Ages. The uneducated and inexperienced were employed by a master craftsman in exchange for formal training in a particular craft. So why does apprenticeship seldom happen within software testing? Do we subconsciously believe that just about anyone can test software? Join Lloyd Roden and discover what apprenticeship training is and, even more importantly, what it is not. Learn how this practice can be easily adapted to suit software testing. Find out about the advantages and disadvantages of several apprenticeship models: Chief Tester, Hierarchical, Buddy, and Coterie. With personal experiences to share, Lloyd shows how projects will benefit immediately with the rebirth of the apprenticeship system in your test team.
- Four apprenticeship models that can apply to software testers
- Measures of the benefits and return on investment of apprenticeships
|
Lloyd Roden, Grove Consultants
|
|
ISTQB Certification: Setting the Standard for Tester Professionalism Sandra Bourgeois has 25 years of experience as an IT professional: project manager, test manager, developer, and QA lead. She is a Director and Project Manager at MassMutual Financial Services in Springfield, Mass. For the past three years she has functioned as the senior IT Test Manager at MassMutual, working with a variety of large projects to identify and resolve testing roadblocks to project implementation. She also serves as a project manager and teaches classes on testing topics, focusing on Performance Testing. Most recently she has been the Performance Test Manager for several critical projects including Internet, Intranet, and technical upgrades. She has presented at the QAI International Conference. Her background also includes working as a social studies teacher and museum tour guide since graduating from Vassar College and UCLA.
|
Rex Black, Rex Black Consulting
|