|
Internet of Things: Changing the Way We Test
The internet of things (IoT) brings connectivity to a range of previously non-internet-enabled physical devices and real-world objects. This shift has an impact on testing—changing what we test, when we test, and the way we test. For one thing, once you’re in the real world, the number of possible issues explodes due to environmental conditions. Just as a race car must adjust its tires for different track conditions, IoT devices must account for environmental factors such as temperature and humidity to prevent unanticipated failures. Jane Fraser believes that for the IoT to be successful, we must focus on developing testing methods, analytics tools, and SDKs that help teams automate activities such as checking connection strength and robustness, verifying mobile compatibility, and testing various hardware capabilities, including Wi-Fi, BTLE, radio, natural language processing technologies, and more.
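To make the idea of automating a connection-robustness check concrete, here is a minimal sketch. It is an invented illustration, not from any SDK the talk describes: the `ConnectionCheck` class, its method, and its parameters are all hypothetical.

```java
import java.util.function.BooleanSupplier;

// Hypothetical sketch of a connection-robustness check: retry a
// connectivity probe a bounded number of times with a fixed backoff,
// as a test harness for a flaky IoT link might do.
public class ConnectionCheck {

    // Returns true as soon as any probe succeeds, false if all attempts fail.
    public static boolean checkWithRetries(BooleanSupplier probe,
                                           int maxAttempts,
                                           long backoffMillis) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (probe.getAsBoolean()) {
                return true;
            }
            if (attempt < maxAttempts) {
                try {
                    Thread.sleep(backoffMillis);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return false;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Simulate a flaky link that only succeeds on the third probe.
        int[] calls = {0};
        boolean ok = checkWithRetries(() -> ++calls[0] >= 3, 5, 10);
        System.out.println(ok + " after " + calls[0] + " probes"); // true after 3 probes
    }
}
```

In a real environment the probe would open a socket or ping a gateway; here it is a lambda so the retry logic itself can be tested deterministically.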
|
Jane Fraser
|
|
Getting to Continuous Testing
Max Saperstone tells the story of how a health care company striving for continuous releases built up its automation to gain confidence in regular releases. Because no test automation existed initially, Max had a greenfield opportunity and, over the span of twelve months, developed more than two thousand test cases. A pipeline was created to verify the integrity of the automated tests and to build Docker containers for simplified test execution; these containers could be easily reused by developers and the DevOps team to verify the application. Join Max as he walks through the feedback loop that took application verification from hours to minutes. Max will share his choices of BDD tooling, integrated with WebDriver solutions, to verify the state of web and mobile applications.
|
Max Saperstone
|
|
Testing AI-Based Systems: A Gray-Box Approach
Testing artificial intelligence- and machine learning-based systems presents two key challenges. First, the same input can trigger different responses as the system learns and adapts to new conditions. Second, it tends to be difficult to determine exactly what the correct response of the system should be. Such system characteristics make test scenarios difficult to set up and reproduce and can cause us to lose confidence in test results. Yury Makedonov will explain how to test AI/ML-based systems by combining black box and white box testing techniques. His "gray box" testing approach leverages information obtained from directly accessing the AI’s internal system state. Yury will demonstrate the approach in the context of testing a simplified ML system, then discuss test data challenges for AI using pattern recognition as an example and share how data-handling techniques can be applied to testing AI.
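A toy example can make the gray-box idea concrete. The `AdaptiveClassifier` below is invented for this sketch (it is not Yury's system): because it learns, a pure black-box test sees the same input produce different answers over time, while a gray-box test can read the internal state and assert an invariant that holds regardless of learning.

```java
// Invented toy: a one-parameter classifier whose decision threshold
// drifts toward each observed input, so its behavior changes as it "learns".
public class AdaptiveClassifier {
    private double threshold;           // internal state a gray-box test can inspect
    private final double learningRate;

    public AdaptiveClassifier(double initialThreshold, double learningRate) {
        this.threshold = initialThreshold;
        this.learningRate = learningRate;
    }

    // Classify, then adapt: the same input may classify differently later.
    public boolean classify(double value) {
        boolean positive = value >= threshold;
        threshold += learningRate * (value - threshold); // the "learning" step
        return positive;
    }

    // Gray-box accessor: lets a test assert invariants on internal state,
    // e.g. the threshold never leaves the range of values it has seen.
    public double getThreshold() {
        return threshold;
    }

    public static void main(String[] args) {
        AdaptiveClassifier c = new AdaptiveClassifier(0.5, 0.2);
        System.out.println(c.classify(0.6)); // true: 0.6 is above the initial 0.5
        c.classify(0.9);                     // high inputs pull the threshold up
        c.classify(0.9);
        System.out.println(c.classify(0.6)); // false: same input, system has adapted
    }
}
```

The black-box assertions ("0.6 is positive") break as the system learns; the gray-box assertion (the threshold stays between the initial value and the largest input seen) is stable and reproducible, which is the point of reaching into internal state.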
|
Yury Makedonov
|
|
What's That Smell? Tidying Up Our Test Code
We are often reminded by those experienced in writing test automation that code is code. The sentiment being conveyed is that test code should be written with the same care and rigor as production code. However, many people who write test code may not have experience writing production code, so it’s not always clear to them what that means. And even those who do write production code find that there are design patterns and code smells unique to test code. Join Angie Jones as she presents a smelly test automation code base littered with several bad coding practices and walks through each of the smells. She'll discuss why each is considered a violation, and via live coding she will demonstrate a cleaner approach. While all coding examples will be done in Java, the principles are relevant for all test automation frameworks.
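As a taste of the kind of smell the talk addresses, here is one invented before/after example (this is not Angie Jones's code): duplicated setup and magic numbers in a test, refactored into named constants and a helper so the test states its intent.

```java
// Invented illustration of a common test-code smell and a cleaner rewrite.
public class CartTestSmell {

    // Minimal system under test.
    static class Cart {
        private int totalCents = 0;
        void add(int priceCents, int quantity) { totalCents += priceCents * quantity; }
        int total() { return totalCents; }
    }

    // Tiny stand-in for an assertion framework.
    static void check(boolean condition) {
        if (!condition) throw new AssertionError("test failed");
    }

    // Smell: magic numbers (199? 597?) hide what the test actually verifies.
    static void smellyTest() {
        Cart c = new Cart();
        c.add(199, 3);
        check(c.total() == 597);
    }

    // Cleaner: a named constant and a factory helper make the intent explicit,
    // and the expected value is derived rather than hard-coded.
    private static final int APPLE_PRICE_CENTS = 199;

    static Cart cartWithApples(int quantity) {
        Cart cart = new Cart();
        cart.add(APPLE_PRICE_CENTS, quantity);
        return cart;
    }

    static void cleanTest() {
        check(cartWithApples(3).total() == 3 * APPLE_PRICE_CENTS);
    }

    public static void main(String[] args) {
        smellyTest();
        cleanTest();
        System.out.println("both tests pass");
    }
}
```

Both tests check the same behavior; the refactored one survives a price change with a single edit and reads as a specification rather than arithmetic.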
|
Angie Jones
|
|
Capturing Testing with 3 Magic Words
Testers tend to be innately curious creatures. Being curious and evaluating risks—that is what the testing job is about. Often it is the statement “I don’t know” that drives our curiosity in testing. Find out not only how to push past the fear of not knowing but how to embrace your curiosity.
|
Janna Loeffler
|
|
How Infrastructure as Code Can Help Test Organizations Achieve Automation
For many test organizations, the first hurdle to automating the testing of a product is deploying that product in its test environments. Infrastructure as code can be used to facilitate the basic processes of provisioning servers, from bare metal to virtual to cloud, as well as configuration management of the software that resides on those servers. Off-the-shelf infrastructure-as-code tools such as AWS CloudFormation, Chef, Puppet, and Ansible provide less expensive alternatives to developing proprietary in-house deployment solutions. Join Kat Rocha to learn how infrastructure as code can better align test and production environments and reduce problems that arise from configuration drift, and explore how to use some infrastructure-as-code tools to facilitate automation and improve testing.
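As one small illustration of the approach, a minimal Ansible playbook can provision a test web server with the same configuration template used in production, which is how drift between the two environments is kept down. This sketch is invented for illustration; the host group, package, and file paths are assumptions, not from the talk.

```yaml
# webserver.yml - hypothetical playbook; host group and paths are invented.
- name: Provision and configure a test web server
  hosts: test_web
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Render the same nginx config template used in production
      ansible.builtin.template:
        src: templates/nginx.conf.j2
        dest: /etc/nginx/nginx.conf
      notify: Restart nginx

  handlers:
    - name: Restart nginx
      ansible.builtin.service:
        name: nginx
        state: restarted
```

Because the playbook is code, it lives in version control alongside the application, and the test environment can be rebuilt from it on demand rather than patched by hand.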
|
Kat Rocha
|
|
Automating Accessibility Testing with Axe
Accessibility empowers users, increases diversity, and can drive higher adoption and higher growth of your digital services. The axe family of open source technologies has been designed with speed, ease of integration, and zero false positives in mind.
|
Dylan Barrell
|
|
Continuous Application Security Testing
Because of its specialized nature, many aspects of application security testing are often assigned to testers from another team or another company, and they may be brought in to perform a point-in-time assessment prior to a release.
|
Josh Gibbs
|
|
Using Design Thinking to Create Better Test Cases
Designing good test cases can be described as an art. With test cases being written with a focus on business, testers should be part of the discovery and design phase of the project, and the business drivers should dictate test case design. But how can we ensure we are focusing on the user and bringing the biggest value possible in this phase? Larissa Rosochansky will describe what design thinking is, how it relates to the testing methodology, and how to use it in the design phase of your project. She will also show you how to better frame the business drivers and select the persona and most important exercises from a wide range of possibilities. After this introduction, Larissa will demonstrate how to apply the exercises, create the agenda for a design thinking workshop, select the team to ensure you have a diverse and multidisciplinary group, and document the results.
|
Larissa Rosochansky
|
|
From Zero to AI Hero
AI is here. Will it take over your job? Is it possible to make it beneficial, not detrimental to your career? Kevin Pyles and his team jumped right into the AI universe. Untrained and inexperienced, they realized immediately that they knew nothing.
|
Kevin Pyles
|