Conference Presentations

Testing in Production: Which Version Wins?

Would your marketing department like to know which website feature will excite online customers to buy more products, return to your site again and again, and increase revenue and profits? Harish Narayan describes how his team uses risk-based testing and statistical test design to check features deployed with multiple website options as efficiently as possible. Vistaprint's measurement-focused marketing department requires live production tests of multiple web options (split runs in their jargon) that expose different features to different customer sessions; the option that "wins" is retained to maximize returns. Preproduction testing of split-run features, especially when multiple runs are deployed in every release, presented challenges for Vistaprint's testers.
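For readers unfamiliar with split runs, the sketch below illustrates the basic mechanics under stated assumptions: sessions are assigned deterministically to a variant, conversion rates are compared, and the higher rate "wins." The variant names, weights, and data are illustrative, not Vistaprint's actual implementation.

```python
# Minimal sketch of a split run: assign each customer session to a variant
# deterministically, then compare conversion rates to pick the "winner".
# Variant names and figures below are illustrative assumptions only.
import hashlib

VARIANTS = ["control", "new_feature"]  # hypothetical split-run options

def assign_variant(session_id: str) -> str:
    """Hash the session id so the same visitor always sees the same variant."""
    bucket = int(hashlib.sha256(session_id.encode()).hexdigest(), 16) % len(VARIANTS)
    return VARIANTS[bucket]

def pick_winner(results: dict[str, tuple[int, int]]) -> str:
    """results maps variant -> (conversions, sessions); return the higher rate.
    A real split run would also check statistical significance before deciding."""
    return max(results, key=lambda v: results[v][0] / results[v][1])

# Example: 2.4% vs. 3.1% conversion; the new feature "wins" this split run.
print(assign_variant("session-42"))
print(pick_winner({"control": (240, 10_000), "new_feature": (310, 10_000)}))
```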

Harish Narayan, Vistaprint
The Force of Test Automation in the Salesforce Cloud

What would happen if your company doubled or even tripled its number of releases and asked you to do the same with your testing? What if the number of developers doubled and your testing staff remained the same size? Would your test automation be capable of meeting the demand? How would you ensure that one hundred Scrum teams are investing enough in test automation? How would you triage hundreds of test failures each day? How would you validate each of more than one hundred releases to production per year? These are the questions Salesforce.com has had to answer during its twelve-year history. These are the challenges that led to the creation of its "test automation cloud." Chris Chen shares how Salesforce.com's test automation cloud works and gives you an inside look at the different technologies and methodologies they use today.
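As one hedged illustration of the triage problem mentioned above, the sketch below groups failures by a normalized error signature so a handful of clusters can be reviewed instead of hundreds of individual results. The failure records and normalization rule are assumptions for illustration, not Salesforce.com's system.

```python
# Minimal sketch of failure triage at scale: group daily test failures by a
# normalized error signature so similar failures are reviewed together.
import re
from collections import defaultdict

def signature(error_message: str) -> str:
    """Strip volatile details (ids, timings) so similar failures collide."""
    return re.sub(r"\d+", "N", error_message).strip().lower()

def triage(failures: list[dict]) -> dict[str, list[str]]:
    """Map each error signature to the tests that failed with it."""
    groups = defaultdict(list)
    for failure in failures:
        groups[signature(failure["error"])].append(failure["test"])
    return dict(groups)

failures = [
    {"test": "test_login", "error": "Timeout after 3000 ms"},
    {"test": "test_checkout", "error": "Timeout after 4500 ms"},
    {"test": "test_search", "error": "ElementNotFound: #results"},
]
print(triage(failures))  # two clusters instead of three separate failures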

Chris Chen, Salesforce.com
STARWEST 2011: Concurrent Testing Games: Developers and Testers Working Together

The best software development teams find ways for programmers and testers to work closely together to build quality into their software. These teams recognize that programmers and testers each bring their own unique strengths and perspectives to the project. However, working on agile teams requires unlearning many of the patterns that traditional development taught us. In this interactive session with Nate Oster, you learn how to use the agile practice of "concurrent testing" to overcome common "testing dysfunctions" by having programmers and testers work together, rather than against each other, to deliver quality results throughout an iteration. Join Nate and practice concurrent testing with games that demonstrate just how powerfully dysfunctional approaches can act against your best efforts and how agile techniques can help you escape the cycle of poor quality and late delivery.

Nate Oster, CodeSquads LLC
Pushing the Boundaries of User Experience Test Automation

Although full test automation of the user experience (UX) is impractical and unwise, there are approaches that can save you time and resources. At eBay, Julian Harty and his colleagues are finding new ways to automate as much of UX testing for eBay.com as is reasonably possible. Even with a highly complex, web-based application, they have found that automation finds many potential problems in the user experience, even in rich application scenarios. Julian shares a practical experience report of their successes together with the barriers and boundaries they discovered: detecting navigation issues, layout bugs, and problematic differences between the behavior of various web browsers. Learn from eBay's experiences why automated testing can be beguiling and, paradoxically, increase the chances of missing critical problems if you choose to rely mainly or even solely on the automated tests.
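To make the cross-browser layout check concrete, here is a minimal sketch that compares element bounding boxes captured in two browsers and flags differences beyond a tolerance. The snapshot data and threshold are illustrative assumptions; how the boxes are captured (for example, via a browser driver) is left out, and this is not eBay's actual tooling.

```python
# Minimal sketch of one UX-automation check: compare element bounding boxes
# captured in two browsers and flag layout drift beyond a pixel tolerance.
TOLERANCE = 5  # pixels of drift allowed before we call it a layout bug

def layout_diffs(snapshot_a: dict, snapshot_b: dict, tolerance: int = TOLERANCE) -> list[str]:
    """Return human-readable differences between two layout snapshots.
    Each snapshot maps an element name to its (x, y, width, height) box."""
    problems = []
    for element, box_a in snapshot_a.items():
        box_b = snapshot_b.get(element)
        if box_b is None:
            problems.append(f"{element}: missing in second browser")
            continue
        if any(abs(a - b) > tolerance for a, b in zip(box_a, box_b)):
            problems.append(f"{element}: {box_a} vs {box_b}")
    return problems

chrome_snapshot = {"search_box": (10, 80, 400, 32), "buy_button": (500, 600, 120, 40)}
firefox_snapshot = {"search_box": (10, 80, 400, 32), "buy_button": (500, 655, 120, 40)}
print(layout_diffs(chrome_snapshot, firefox_snapshot))  # flags the shifted button
```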

Julian Harty, eBay, Inc.
Managing Test Data in Large and Complex Web-based Systems

Are you testing an application or web site whose complexity has grown exponentially through the years? Is your test data efficiently and effectively supporting your test suites? Does the test data reside in systems not under your direct control? Learn how the WellsFargo.com test team integrated test data management processes and provisions to gain control over test data in their very large and complex web system environment. Join Ron Schioldager to explore the lifecycle of data, its relationship to effective testing, and how you can develop conditioned, trusted, and comprehensive test data for your systems. Learn about the tools Wells Fargo developed and employs today to support their test data management process, enabling them to maintain a shorter data maintenance cycle while improving their test reliability.

Ron Schioldager, Wells Fargo
Top Ten Disruptive Technologies You Must Understand

The consumerization of enterprise software applications is no longer on its way; it is here. Emerging technologies such as mobile apps, tablets, 4G, cloud computing, and HTML5 are impacting software engineering and testing organizations across all industries. Because these technologies enable sensitive data to be accessed through the web and on mobile devices, there is immense pressure to ensure that apps are reliable, scalable, private, and secure. Using real-world examples, Doron Reuveni identifies the top ten disruptive technologies that have transformed the software industry and outlines what they mean for the testing community now and in the future. The ways in which web and mobile apps are designed, developed, and delivered are changing dramatically, and the ways these apps are tested are being taxed and stretched to the breaking point.

Doron Reuveni, uTest
Cloud Computing: Powering the Future of Testing

With the advent of agile development processes, the expected cycle time for building and shipping quality software has been cut dramatically. Yet the IT infrastructure used for testing has remained largely the same at most companies. Testing teams often find themselves squeezed between the need for speed and their inadequate test infrastructure. Today, hundreds of companies are using cloud-based IT infrastructures to streamline, parallelize, and accelerate their testing cycles. Using real-world case studies, Sundar Raghavan shares how the cloud model can enable you to create multiple test environments, instantiate production-like virtual data centers, run multiple tests in parallel, and perform load tests almost at will. Sundar discusses how the cloud model reduces the cost and complexity of test harness setup and teardown, all without requiring you to change test tools or methodologies.
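As a hedged sketch of the parallel-execution idea, the snippet below fans a few test suites out across several hypothetical cloud environments instead of running them serially. The environment URLs and the run_suite stub are illustrative assumptions, not Skytap's API.

```python
# Minimal sketch: dispatch test suites to multiple cloud environments in parallel.
from concurrent.futures import ThreadPoolExecutor

ENVIRONMENTS = ["https://env-1.example.test", "https://env-2.example.test",
                "https://env-3.example.test"]  # hypothetical cloud instances
SUITES = ["smoke", "checkout", "search"]

def run_suite(suite: str, environment: str) -> str:
    """Stand-in for launching a real test suite against one environment."""
    return f"{suite} passed on {environment}"

with ThreadPoolExecutor(max_workers=len(ENVIRONMENTS)) as pool:
    futures = [pool.submit(run_suite, s, env) for s, env in zip(SUITES, ENVIRONMENTS)]
    for future in futures:
        print(future.result())
```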

Sundar Raghavan, Skytap
STARWEST 2011: Seven Key Factors for Agile Testing Success

What do testers need to do differently to be successful on an agile project? How can agile development teams employ testers’ skills and experience for maximum value to the project? Janet Gregory describes the seven key factors she has identified for testers to succeed on agile teams. She explains the whole-team approach of agile development that enables testers to do their job more effectively. Then, Janet explores the “agile testing mindset” that contributes to a tester’s success. She describes the different kinds of information that testers on an agile team need to obtain, create, and provide for the team and product owner. Learn the role that test automation plays in the fast-paced development within agile projects, including regression and acceptance tests. By adhering to core agile practices while keeping the bigger picture in mind, testers add significant value to and help ensure the success of agile projects.

Janet Gregory, DragonFire, Inc.
Test Automation Magic: Pushing the Frontiers

The evolutionary cycle of test automation appears to have hit a plateau. Krishna Iyer and Mukesh Mulchandani believe it is time to push the frontiers again for another cycle of improvements. Together, they describe how you can improve your test automation, with results that others will see as sheer magic. They describe a number of cutting-edge ideas, including automatic documentation of manual test cases, algorithms that select the best automation scripts to run when you don't have sufficient time to execute them all, visual modeling of test automation to create new scripts from existing ones in a fraction of the time, and automation frameworks that disappear after test cases are built. Krishna and Mukesh also challenge traditional automation ideas such as automating only when the application is stable.
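One hedged illustration of the script-selection idea: when there is not enough time to run every automated script, pick the subset with the best value per minute. The scoring (risk weight times recent failure rate, divided by runtime) and the sample suite below are assumptions for illustration, not the speakers' actual algorithm.

```python
# Minimal sketch: greedy test selection under a time budget, ranked by
# (risk * recent failure rate) per minute of runtime.
def select_tests(tests: list[dict], budget_minutes: float) -> list[str]:
    """Pick the highest-value scripts that fit within the time budget."""
    ranked = sorted(tests,
                    key=lambda t: t["risk"] * t["failure_rate"] / t["minutes"],
                    reverse=True)
    chosen, used = [], 0.0
    for t in ranked:
        if used + t["minutes"] <= budget_minutes:
            chosen.append(t["name"])
            used += t["minutes"]
    return chosen

suite = [
    {"name": "checkout_regression", "minutes": 30, "risk": 0.9, "failure_rate": 0.20},
    {"name": "search_smoke",        "minutes": 5,  "risk": 0.6, "failure_rate": 0.10},
    {"name": "profile_settings",    "minutes": 20, "risk": 0.3, "failure_rate": 0.02},
]
print(select_tests(suite, budget_minutes=40))  # fits the highest-value scripts first
```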

Krishna Iyer, ZenTEST Labs
STARWEST 2011: Lightning Strikes the Keynotes

Lightning Talks have been a very popular part of many STAR conferences throughout the years. If you’re not familiar with the concept, a Lightning Talk session consists of a series of five-minute talks by different presenters within one presentation period. For the speakers, Lightning Talks are the opportunity to deliver their single biggest-bang-for-the-buck idea in a rapid-fire presentation. And now, lightning has struck the STAR keynote presentations. Some of the experts in testing (Michael Bolton, Jennifer Bonine, Hans Buwalda, Lee Copeland, Dale Emery, Bob Galen, Julie Gardiner, Dorothy Graham, Jeff Payne, and Martin Pol) will each step up to the podium and give you their best shot of lightning. With no time to dither or vacillate, and hemming and hawing forbidden, you'll get ten keynote presentations for the price of one and have some fun at the same time.

Lee Copeland, Software Quality Engineering
