|
Better Software Conference & EXPO 2004: The Seven Habits of Highly Insecure Software Over the past few years, Herbert Thompson and his cohorts have scoured bug databases for the most malevolent and destructive security bugs ever to infest a released binary. Through that search they found that common characteristics and habits emerged: creating temporary files with secure data, trusting that system calls will succeed, foolishly relying on insecure third-party components, and many others. In this session, he offers a startling and even scary accounting of the top seven habits of insecure software. Take away a red-teaming strategy that has broken some of the world's most secure software under testing contracts with Microsoft, IBM, the US DoD, and many others. Use this approach to make your software more secure, and you can sleep better at night.
- The differences between security defects and other common errors
- An intimate understanding of security faults as seen by hackers
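A minimal sketch, not drawn from Thompson's actual examples, of two of the habits the abstract names: leaving sensitive data in a shared temp directory and trusting that a system call succeeded. All class and file names are illustrative.

```java
import java.io.File;
import java.io.IOException;

public class InsecureHabits {
    public static void main(String[] args) throws IOException {
        // Habit: temporary files with secure data. createTempFile places
        // the file in a directory that other local users may be able to read.
        File scratch = File.createTempFile("secrets-", ".tmp");

        // ... any sensitive data written here would persist on disk ...

        // Habit: trusting that system calls will succeed. delete() reports
        // failure only through its return value, which is easy to ignore.
        if (!scratch.delete()) {
            // Failing silently here would leave the secrets behind.
            System.err.println("cleanup failed: " + scratch.getAbsolutePath());
        }
    }
}
```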
|
Hugh Thompson, Florida Institute of Technology
|
|
Leverage Earned Value Management with Function Point Analysis In the Earned Value Management (EVM) approach, as work is performed, it is "earned" on the same basis it was planned: both the original plan and agreed-to changes. Today, more and more software projects are using this approach. Function Point Analysis has been shown to be a reliable method for measuring the size of computer software based on detailed requirements and specifications. Function points can be leveraged throughout the EVM process to establish cost and schedule baselines, control project scope over the lifecycle, and quantitatively assess percent complete. Ian Brown delves into the concepts of EVM as applied to software development and the key conditions necessary to profitably employ this management technology. Learn how companies are using function point analysis to strengthen their EVM practice.
- Earned Value Management applied to software development projects
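To make the arithmetic concrete, here is a hedged sketch of the standard EVM metrics with function points supplying the percent-complete figure; every number below is invented for illustration.

```java
public class EarnedValueSketch {
    public static void main(String[] args) {
        double totalFp     = 500;        // function points planned in total
        double deliveredFp = 200;        // function points completed so far
        double bac         = 1_000_000;  // budget at completion (dollars)
        double pv          = 450_000;    // planned value at this point in time
        double ac          = 420_000;    // actual cost incurred so far

        // Percent complete comes from measured size, not opinion.
        double ev = (deliveredFp / totalFp) * bac;   // earned value = 400,000

        double cpi = ev / ac;   // cost performance index, ~0.95 (over budget)
        double spi = ev / pv;   // schedule performance index, ~0.89 (behind)

        System.out.printf("EV=%.0f CPI=%.2f SPI=%.2f%n", ev, cpi, spi);
    }
}
```

Value is earned on the same basis it was planned: delivering 200 of 500 planned function points earns 40 percent of the budget at completion, regardless of hours spent.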
|
Ian Brown, Booz Allen Hamilton
|
|
Software Test Automation Spring 2003: Mission Made Possible: A Lightweight Test Automation Experience Using a challenging client engagement as a case study, Rex Black shows you how he and a team of test engineers created an integrated, automated unit, component, and integration testing harness, and a lightweight process for using it. The test harness supported both static and dynamic testing of a product that ran on multiple platforms. The test process allowed system development teams spread across three continents to test their own units before checking them into the code repository, while the capture of the tests provided automated integration testing and component regression testing going forward. He'll also explain the tools available to build such a testing harness and why his team chose the ones they did.
- Examine the benefits and challenges of implementing an integrated, automated component and integration testing process in a Java/EJB development environment
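The talk does not publish its harness, so the following is only a JUnit 3-style sketch (the idiom of the period) of the kind of unit test a developer would run before check-in; PriceCalculator is a hypothetical class under test, defined inline so the example compiles on its own.

```java
import junit.framework.TestCase;

public class PriceCalculatorTest extends TestCase {
    public void testDiscountIsAppliedExactlyOnce() {
        PriceCalculator calc = new PriceCalculator();
        // A 10% discount on 200.00 should yield 180.00.
        assertEquals(180.00, calc.applyDiscount(200.00, 0.10), 0.001);
    }
}

// Hypothetical class under test, included only to keep the sketch self-contained.
class PriceCalculator {
    double applyDiscount(double price, double rate) {
        return price * (1.0 - rate);
    }
}
```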
|
Rex Black, Rex Black Consulting Services, Inc.
|
|
A Formula for Test Automation Success: Finding the Right Mix of Skill Sets and Tools Not sure what elements to consider now that you're ready to embark on the mission of automating your testing? This session explores the key mix of skill sets, processes, and tools that can make or break any automation effort. The instructor shows you how to develop an informed set of priorities that can make all the difference in your effort's success and help you avoid project failure.
- Create better, more reusable tests to improve efficiency and effectiveness
- Increase the value and reputation of QA within your organization
- Establish a closer relationship with developers based on mutual respect
|
Gerd Weishaar, IBM Rational Software
|
|
Application Performance and Reliability Management - 24x7 Managing system performance and reliability has never been as significant, or as challenging, as it is now. These days, most organizations have multi-technology, multi-vendor, multi-tier environments. In other words, it’s a world rife with 24-hour, always-on complexity. Add to this the need for continual changes to react to shifts in business conditions, technology advances, and mixes of demands, and you have a recipe that calls for the highest level of performance and reliability possible. Getting there can seem next to impossible, but new concepts emerging from research labs are delivering usable products such as flexible computing, autonomic computing, and self-tuning systems. These possibilities have revolutionary potential for performance management.
- Examine recommended suites of tools and their limitations
- Look at the major innovations and trends, such as self-tuning systems
|
Ross Collard, Collard and Company
|
|
Testing Web Services: A Dose of Reality Web services truly have the potential to change the world! Along with the magic of Web services comes a dose of reality: for Web services to be the panacea the masses expect, quality is imperative. The old excuse of "not enough" resources or processes must be challenged. Testing is one aspect of ensuring that quality, but is it prudent to automate the testing of Web services? In this presentation, Theresa Lanowitz explores answers to these important questions:
- How are Web services tested today? What is real in 2003? Are we ready for test automation or should we conduct manual testing?
- What is the future direction of testing Web services? What is the outlook for 2005 and beyond?
- Who are the vendors making in-roads today? Who is laying the groundwork for the future?
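As a point of reference, a hedged sketch of the most basic automated Web service check: post a SOAP request over plain HTTP and assert on the reply. The endpoint, envelope, and expected element are all invented for illustration, not drawn from the talk.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebServiceSmokeTest {
    public static void main(String[] args) throws Exception {
        String envelope =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<soap:Body><getQuote symbol=\"IBM\"/></soap:Body></soap:Envelope>";

        URL url = new URL("http://example.com/quote");   // hypothetical service
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes("UTF-8"));
        }

        // The simplest useful assertions: the transport succeeded and the
        // body contains the response element we expect.
        if (conn.getResponseCode() != 200) {
            throw new AssertionError("HTTP " + conn.getResponseCode());
        }
        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder body = new StringBuilder();
        for (String line; (line = in.readLine()) != null; ) body.append(line);
        if (body.indexOf("getQuoteResponse") < 0) {
            throw new AssertionError("unexpected reply: " + body);
        }
        System.out.println("smoke test passed");
    }
}
```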
|
Theresa Lanowitz, Gartner, Inc.
|
|
Home-Brewed Test Automation: Approaches from Extreme Programming Projects Projects that use Extreme Programming (XP) often do not use commercial GUI test tools, finding it more useful to build their own support for test automation. This session explains the strategies they've used, which can actually cross over to any project where developers take responsibility for building support for automated testing. The XP community has already made an impact on the tools and practices for unit testing in the wider development community. The instructor reviews the potential impact on customer-perspective testing.
- Experiences in building in-house GUI test tools
- How and when to build and use test APIs
- Open-source tools to support these approaches
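One way to picture a home-grown test API, under the assumption (not stated in the abstract) that tests can reach the application's domain layer directly: the harness drives the same logic the GUI uses, with a one-method assertion helper. ShoppingCart and its methods are hypothetical.

```java
public class CartApiTest {
    // Home-grown micro-harness: failures are thrown, successes are silent.
    static void check(boolean condition, String message) {
        if (!condition) throw new AssertionError(message);
    }

    public static void main(String[] args) {
        ShoppingCart cart = new ShoppingCart();
        cart.add("book", 2);    // exercises the same logic the GUI would
        cart.add("pen", 1);

        check(cart.itemCount() == 3, "expected 3 items");
        cart.remove("pen");
        check(cart.itemCount() == 2, "expected 2 items after remove");
        System.out.println("all cart checks passed");
    }
}

// Minimal hypothetical domain class so the sketch compiles on its own.
class ShoppingCart {
    private final java.util.Map<String, Integer> items = new java.util.HashMap<>();
    void add(String sku, int qty) { items.merge(sku, qty, Integer::sum); }
    void remove(String sku)       { items.remove(sku); }
    int itemCount() {
        return items.values().stream().mapToInt(Integer::intValue).sum();
    }
}
```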
|
Bret Pettichord, Pettichord Consulting
|
|
Smaller-Scale Web Sites Need Performance Testing Too! Even a smaller-scale Web site requires careful planning and execution of performance tests. Making the critical decisions in a timely manner and identifying the performance goals are still prerequisites to a successful test. However, smaller sites don't necessarily have the resources required to do large-scale testing, so compromises have to be made. This requires good test planning. The instructor explains the testing of a small site looking to grow, as well as the successes and pitfalls of achieving reasonable goals.
- Define the test objectives; what's reasonable?
- Plan the test, then use tools effectively, weighing choices and tradeoffs
- Apply and understand the results
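A hedged sketch of the kind of compromise the session describes: when a commercial load tool is out of reach, a few threads, a target URL, and simple timing can still answer basic questions for a smaller site. The URL and the thread and request counts are placeholders.

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class MiniLoadTest {
    static final String TARGET = "http://example.com/";  // hypothetical site
    static final int THREADS = 5, REQUESTS_PER_THREAD = 20;

    public static void main(String[] args) throws InterruptedException {
        Thread[] users = new Thread[THREADS];
        for (int i = 0; i < THREADS; i++) {
            users[i] = new Thread(() -> {
                for (int r = 0; r < REQUESTS_PER_THREAD; r++) {
                    long start = System.currentTimeMillis();
                    try {
                        HttpURLConnection conn =
                            (HttpURLConnection) new URL(TARGET).openConnection();
                        conn.getResponseCode();   // forces the full round trip
                        long ms = System.currentTimeMillis() - start;
                        System.out.println("response in " + ms + " ms");
                    } catch (Exception e) {
                        System.out.println("request failed: " + e.getMessage());
                    }
                }
            });
            users[i].start();   // each thread stands in for one user
        }
        for (Thread user : users) user.join();   // wait for all simulated users
    }
}
```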
|
Dale Perry, Software Quality Engineering
|
|
Planned Chaos: Malicious Test Day In a test and verification organization, it can be easy to fall into predictable ruts and miss finding important defects. Use the creativity of your test team, developers, users, and managers to find those hidden bugs before the software goes into production. Ted Rivera details how his organization conceived of and now administers, evaluates, and benefits from periodic malicious test days. Learn ways to make your days of planned chaos productive, valuable, and, yes, even fun. Give both testers and non-testers an opportunity to find inventive ways to break your products and you'll get some surprising results.
- Recognize the danger of too much predictability and the results you can expect from a malicious test day
- Create and administer your own malicious test day
- Maximize the benefits of malicious test days
|
Ted Rivera, Tivoli/IBM Quality Assurance
|
|
Evaluating Test Plans Using Rubrics The phrase "test plan" means different things to different people. There is even more disagreement about what makes one test plan better than another. Bernie Berger makes the case for using multi-dimensional measurements to evaluate the quality of test plans. Walk away with a practical technique for systematically evaluating any complex structure such as a test plan. Learn how to qualitatively measure multiple dimensions of test planning and gain a context-neutral framework for ranking each dimension. You'll also find out why measurement of staff technical performance is often worse than no measurement at all and how to use this technique as an alternative to traditional practices. [This presentation is based on work at Software Test Managers Roundtable (STMR) #8, held in conjunction with the STAR conference.]
- Qualitatively evaluate complex structures, like test plans
- Ten dimensions of test planning
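A small illustrative sketch of rubric-style scoring as the abstract describes it: rate each dimension of a test plan on a fixed scale and report the profile side by side rather than collapsing it to a single pass/fail. The dimensions and scores below are invented examples, not the STMR #8 list.

```java
public class TestPlanRubric {
    public static void main(String[] args) {
        String[] dimensions = { "Scope", "Risk analysis", "Coverage",
                                "Schedule realism", "Clarity" };
        int[] scores = { 3, 2, 4, 1, 3 };   // each dimension rated 0..4

        int total = 0;
        for (int i = 0; i < dimensions.length; i++) {
            System.out.printf("%-16s %d/4%n", dimensions[i], scores[i]);
            total += scores[i];
        }
        // Report the profile and the aggregate; the profile is the point.
        System.out.printf("overall: %d/%d%n", total, dimensions.length * 4);
    }
}
```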
|
Bernie Berger, Test Assured Inc.
|