|
A Look at PerlClip Need to get the scoop on the latest software tools and trends? You've come to the right place. Get one reviewer's opinion of PerlClip.
|
|
|
Staying on the Critical Path Connect with an expert to learn how to work smarter and discover new ways to uncover more defects. Michael Bolton leads us further down the path to successful critical thinking by teaching us the kinds of questions we should ask to obtain the most useful information.
|
|
|
Bridging the Gap Between Disciplines: Software Testing and UX Design Two industry experts from very different worlds walk you through a Eureka! moment. Get their thoughts on how to build a strong, successful collaborative effort between two distinct disciplines—software testing and user experience (UX) design.
|
|
|
The Power of Predictability Turn to The Last Word, where software professionals who care about quality give you their opinions on hot topics. This month, Linda Hayes details why being able to predict the end state of your data at the beginning of a test is vital to achieving accurate results.
|
|
|
A Look at Subversion 1.2 Need to get the scoop on the latest software tools and trends? You've come to the right place. Get one reviewer's opinion of Subversion 1.2.
|
|
|
I Do Not Want a Bug Report Building relationships is important because trust allows us to share information more freely. In his article, Jason Yip explains why he'd rather have a face-to-face conversation about bugs instead of relying solely on a traditional bug report.
|
|
|
Web Services API Testing Traditionally, test engineers have had some type of visual user interface for testing client/server and Web applications. Web services, on the other hand, are completely without a user interface, providing only an application program interface (API). A Web service does not display a visual output for testing. Although this fact makes manual testing very difficult, Web services are ideal candidates for automated testing. As a result, some programming skills are almost certainly needed for testers who test Web services. What about testers with less technical skills? Learn about the challenges Papa Acquah faced with Web services testing: WSDL validation, unit testing, functional testing, client-side testing, and server testing. Find out how he used a test harness as well as existing commercial testing tools to accomplish his testing needs.
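The abstract itself contains no code, but the idea of testing a UI-less service through its contract can be sketched. The fragment below is a minimal, hypothetical illustration of one harness check the article alludes to (WSDL validation): the WSDL string, service name, and operation name are invented for the example and are not from Acquah's talk.

```python
import xml.etree.ElementTree as ET

# A hypothetical WSDL fragment; a real harness would fetch this from the
# live service endpoint rather than hard-code it.
SAMPLE_WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/"
                              name="QuoteService">
  <portType name="QuotePort">
    <operation name="GetQuote"/>
  </portType>
</definitions>"""

WSDL_NS = "{http://schemas.xmlsoap.org/wsdl/}"

def wsdl_operations(wsdl_text):
    """Parse a WSDL document and return the operation names it declares."""
    root = ET.fromstring(wsdl_text)
    return [op.get("name") for op in root.iter(WSDL_NS + "operation")]

# With no UI to drive, the WSDL is the contract the automated tests pin down:
assert wsdl_operations(SAMPLE_WSDL) == ["GetQuote"]
```

A fuller harness would go on to invoke each declared operation and assert on the SOAP response, which is where commercial tools typically take over.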
|
Papa Acquah, LexisNexis
|
|
Peanuts and Crackerjacks: What Baseball Taught Me about Metrics Because people can easily relate to a familiar paradigm, analogies are an excellent way to communicate complex data. Rob Sabourin uses baseball as an analogy to set up a series of status reports to manage test projects, share results with stakeholders, and measure test effectiveness. For test status, different audiences (test engineers, test leads and managers, development managers, customers, and senior management) need different information, different levels of detail, and different ways of looking at data. So, what "stats" would you put on the back of your Testing Bubble Gum card?
|
Robert Sabourin, AmiBug.com Inc
|
|
Face-off: Structured Testing vs. Exploratory Testing and Error Guessing Exploratory testing and error guessing are valuable functional testing techniques. Like all other methods, though, they have limitations, partly because they are based on the knowledge, experience, and intuition of the test engineer. If you primarily use unstructured approaches for testing, you risk wasting effort on redundant testing, testing in non-critical areas, and under-testing critical areas, all of which can lead to missed bugs or finding defects later in the cycle. BJ Rollison uses two case studies to demonstrate the limits of unstructured testing and how fundamental test design techniques and gray box test design can improve the effectiveness of your tests. You'll correctly prioritize critical areas and uncover serious issues earlier and with less effort. One case study tests Weinberg's triangle algorithm from the design requirements and a C# implementation.
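The triangle algorithm mentioned above is the classic test-design exercise (popularized by Weinberg and Myers): classify three side lengths as equilateral, isosceles, scalene, or invalid. The sketch below, in Python rather than the C# of the talk, shows how structured test design covers each equivalence class and boundary deliberately instead of probing ad hoc.

```python
def classify_triangle(a, b, c):
    """Classify side lengths per the classic Weinberg/Myers triangle problem."""
    if a <= 0 or b <= 0 or c <= 0:
        return "invalid"
    # Triangle inequality: each side must be shorter than the sum of the others.
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Structured test design: one case per equivalence class, plus boundaries.
cases = [
    ((3, 3, 3), "equilateral"),
    ((3, 3, 5), "isosceles"),
    ((3, 4, 5), "scalene"),
    ((1, 2, 3), "invalid"),   # boundary: degenerate triangle (a + b == c)
    ((0, 4, 5), "invalid"),   # boundary: zero-length side
]
for sides, expected in cases:
    assert classify_triangle(*sides) == expected, sides
```

An unstructured session might happily exercise the three valid classes yet never think to try the degenerate `a + b == c` boundary, which is exactly the kind of gap systematic technique closes.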
|
William Rollison, Microsoft Corporation
|
|
Developing an Error-Based Testing Strategy For more complete testing, you need to find and simulate possible error conditions in a system. Many methods throw exceptions when an error occurs. And although the application’s code catches many of these exceptions, an "unhandled" error condition could lead to unpredictable
events and big problems for customers. Rather than using only intuition to guide your error testing, join Christopher Shelley for a strategy to identify specific error conditions in your systems. Sharing sample code, he offers tips and hints for finding these often hidden defects in software. Then, he explains ways to devise specific error injection tests that expose these problems to the developers. Learn the skills you need to find unhandled exceptions within the source code and make sure that your code is exercised through all decision trees and error traps.
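Error injection of the sort the abstract describes can be sketched briefly. The example below is a hypothetical stand-in for Shelley's own samples: it injects file-system failures with `unittest.mock` to show both a handled exception and one that would escape unhandled.

```python
from unittest import mock

def load_settings(path, opener=open):
    """Read a settings file, falling back to defaults if it is missing.
    (Hypothetical code under test, not from the article.)"""
    try:
        with opener(path) as f:
            return f.read()
    except FileNotFoundError:
        return ""  # handled: a missing file falls back to empty settings
    # Note: a PermissionError here escapes unhandled -- exactly the kind
    # of condition error-injection testing is meant to surface.

# Inject the failure instead of waiting for it to occur naturally:
failing_open = mock.Mock(side_effect=FileNotFoundError)
assert load_settings("settings.ini", opener=failing_open) == ""

# The same technique exposes the unhandled path to the developers:
try:
    load_settings("settings.ini", opener=mock.Mock(side_effect=PermissionError))
    escaped = False
except PermissionError:
    escaped = True
assert escaped  # this error condition reaches the caller unhandled
```

Enumerating which exceptions each call site can raise, then injecting each one, is one systematic way to exercise every error trap rather than trusting intuition.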
|
Chris Shelley, Dell, Inc.
|