|
Proving Our Worth: Quantifying the Value of Testing Over the years, experts have defined testing as a process of checking, exploring, evaluating, measuring, and improving. For a quarter of a century, we have focused on the internal process of testing while generally disregarding its real purpose: creating information that others on the project can use to improve product quality. Join Lee Copeland as he discusses why quantifying the value of testing is difficult work. Perhaps that’s why we concentrate so much on test process; it is much easier to explain. Lee identifies stakeholders for the information we create and presents a three-step approach to creating the information they need to make critical decisions. He shares key attributes of this information: accuracy, timeliness, completeness, relevancy, and more.
|
Lee Copeland, Software Quality Engineering
|
|
Creating Crucial Test Conversations Many test leaders believe that development, business, and management don't understand, support, or properly value our contributions. You know what? These test leaders are probably right! So, why do they feel that way? Bob Galen believes it’s our inability to communicate effectively, to sell ourselves, our abilities, our contributions, and our value to the organization. As testers, we believe that the work speaks for itself. Wrong! We must work harder to create the crucial conversations that communicate our value and impact. Bob shares specific techniques for holding context-based conversations, producing informative status reports, conducting attention-getting quality assessments, and delivering solid defect reports. Learn how to improve your communication skills so that key partners understand your role, value, and contributions.
|
Bob Galen, iContact
|
|
Avoid Failure with Acceptance Test-Driven Development One of the major challenges confronting traditional testers in agile environments is that requirements are incrementally defined rather than specified at the start. Testers must adapt to this new reality to survive and excel in agile development. C.V. Narayanan explains the Acceptance Test-Driven Development (ATDD) process that helps testers tackle this challenge. He describes how to create acceptance test checkpoints, develop regression tests for these checkpoints, and identify ways to mitigate risks with ATDD. Learn to map acceptance test cases against requirements in an incremental fashion and validate releases against acceptance checkpoints. See how to handle risks such as requirements churn and requirements that overflow into the next iteration. Using ATDD as the basis, learn new collaboration techniques that help unite testing and development toward the common goal of delivering high-quality systems.
|
C.V. Narayanan, Sonata Software Ltd.
|
|
Performance Testing SQL-based Applications Often, we discover the "real" software performance issues only after deploying the product in a production environment. Even though performance, scalability, stability, and reliability are expected of today's software, organizations often wait until the end of the development life cycle to uncover these limitations, resulting in late deliveries and even chaos. Alim Sharif embraces agile development's philosophies to explain how performance testers can identify and resolve software performance issues early and continue performance testing throughout the development process. Learn how to optimize the use of performance tuning tools such as SQL Profiler and MS PerfMon to identify and fix MS SQL Server, application, and Web server performance issues. Institute agile methods in your performance testing efforts to avoid that "Oh, no!" moment when the system goes live.
|
Alim Sharif, The Ultimate Software Group
|
|
Testing Lessons Learned from the Great Detectives Robert Sabourin shares what the great detectives have taught him about testing.
|
Robert Sabourin, AmiBug.com
|
|
Heuristics for Rapid Test Management Whether you are a tester or a test manager, Jon Bach believes you have little time to do the things you want to do. Even the things on your "absolutely must do" list are competing for your limited time. Jon has a list of what he calls "half-baked" ideas on how to cope. That is, these ideas are still in the oven, still being tested. In his roles as a tester and manager, Jon has learned that it's not about time management; it's really about energy management: where you focus your personal energy and direct your team’s energy. Jon shares ideas that have worked for him and some that have failed: Open-Book Testing, Dawn Patrols, Tester Show-and-Tell, Test Team Feud, and Color-Aided Design. Learn how these ideas may solve your problems with test execution, reporting, measurement, and management, all at low or no cost and all relatively easy to implement.
|
Jon Bach, Quardev, Inc.
|
|
How Google Tested Chrome Ever wish you could peek inside a big, high-tech company and see how they actually do testing? Well, now you can. Led by Sebastian Schiavone, Google's Chrome Test Team will detail everything they have done to test Google Chrome, both the browser and the netbook operating system, beginning with their process for test planning and how they design test automation. Sebastian and his team share their initial plans, automation efforts, and actual results in what is likely to be the most candid and honest assessment of internal testing practices ever presented. Learn what worked, what didn't work, and how they'd proceed if they had it all to do over again. Take away copies of Google's actual test artifacts and learn how to apply Google's test techniques to the product you are currently testing.
|
Sebastian Schiavone, Google
|
|
Implementing Agile Testing Once a company decides to move to an agile development methodology, questions invariably arise: How should we implement this methodology? What are the expected benefits and pitfalls? How does testing fit into this new approach? Join Robert Reff as he describes real-world experiences that helped his test team move from the design-code-test approach to a test-driven, agile development philosophy. Robert offers concrete advice on how to integrate testing, what testing activities to include or drop, and what to expect from both automation and exploratory testing. He describes possible practices, focus areas, and pitfalls, rather than the all-or-nothing approach often recommended by well-meaning experts.
|
Robert Reff, Thomson Reuters
|
|
Performance Testing Throughout the Life Cycle Even though it is easy to say that you should continuously test your application for performance during development, how do you really do it? What are the processes for testing performance early and often? What kinds of problems will you find at the different stages? Chris Patterson shares the tools and techniques he recently used during the development of a highly concurrent and highly scalable server that is shipping soon. Chris explores how developers and testers used common tools and frameworks to accelerate the start of performance testing during product development. Explore the challenges they faced while testing a version 1 product, including defining appropriate performance and scale goals, simulating concurrent user access patterns, and generating a real-world data set. Learn from his team's mistakes and their successes as Chris shares both the good and the bad of the process and results.
|
Chris Patterson, Microsoft
|
|
I Wouldn't Have Seen It If I Hadn't Believed It: Confirmation Bias in Testing "It ain't what we don't know that gives us trouble; it's what we know that ain't so." Will Rogers was talking about confirmation bias: the tendency to feel secure in our beliefs rather than to seek evidence that might challenge them. In testing, confirmation bias prompts us to stop a test too early, to choose tests that conform too closely to the happy path, or to ignore results that confound our expectations. As a result, defects have a chance to hide in our self-induced blind spots. We can't eliminate confirmation bias, but we can manage and control it by diversifying our models, our techniques, and our test teams. In this hands-on and eyes-on session, Michael Bolton presents a set of exercises, videos, and conversations that show testing biases in action. Discover some new tricks that can help you defend yourself and your testing clients from being too sure, too soon, and later ... sorry.
|
Michael Bolton, DevelopSense
|