Keynote
Jenny Bramble and Jenna Charlton (Keynote)
Title: STAIRS!
Overview: Are you human? Do you need to change your elevation? Have you considered stairs?
Jenna Charlton and Jenny Bramble have spent a lot of time thinking about stairs, steps, ramps, handrails, risers, and how context relates to our experience of software and the world around us. Anchoring and cognitive biases can lead us away from inclusive design principles. Considering affordances leads to a better experience for everyone who comes in contact with your software. And context helps us to understand our software in its broader ecosystem and identify the right scope.
Join us as we explore types of bias, context, affordances, and design thinking in the context of something we all know and use every day.
Bio: Jenna is a software tester and director of product with over a decade of experience. They’ve spoken at a number of dev and test conferences and are passionate about risk-based testing, building community within agile teams, developing the next generation of testers, and a11y. When not testing, Jenna loves going to punk rock shows and live pro wrestling events with their husband Bob, traveling, and cats. Their favorites are the two that share their home, Maka and Excalipurr.
Jenny ended up in a quality assurance career after coming up through support and devops, cutting her teeth on that interesting role that acts as the ‘translator’ between customer requests from support and the development team. Her love of support and the human side of problems lets her find a sweet spot between empathy for the user and empathy for her team. She’s done testing, support, or human interfacing for most of her career and now finds herself leveraging those skills as Director of Engineering at Papa. She finds herself happiest when she’s making an impact on other people–whether it’s helping find issues in applications, leading scrum, speaking at events, or just grabbing coffee and chatting.
Automation Track
Bas Dijkstra
Title: How well do you wield your tools? – The importance of craftsmanship in test automation
Overview: What do people talk about when they talk about test automation? Chances are that the first thing they mention is the tools they use, or how they have adopted the latest and greatest language, tool, framework, library or technique, hopping from trend to trend. What is much less often talked about are the more fundamental skills that engineers and teams need to achieve long-lasting success in test automation: following good engineering practices; understanding object-oriented programming principles and knowing when and when not to apply them; writing tests, using tools and applying principles and patterns because you should, not just because you can. And the list goes on… In this talk, I’ll demonstrate the importance of these fundamental principles, and how understanding and knowing when and where to apply them makes you a better and more versatile automation engineer. I’ll do this by:
– showing examples of tests that I spent weeks writing that shouldn’t have been written in the first place
– showing examples of bad automation code that everybody can write, pull from StackOverflow or ask ChatGPT to write for them, and what the better alternative is
– showing why exactly test automation is development, and how this realization helps you ask developers better questions and have more meaningful conversations with them
The golden thread in this talk is formed by my own experiences, successes and (mostly) failures in the 17 years I’ve spent in testing and automation. My hope? That it doesn’t take the ‘next generation’ another 17 years to learn what I’ve learned…
Bio: Hey, my name is Bas Dijkstra, and I am an independent test automation consultant and trainer. I have been active in the test automation field for some 17 years now, and have worked on software testing and automation solutions across a wide range of programming languages, frameworks and technology stacks. I’ve delivered test automation training to dozens of companies and hundreds of conference attendees in the Netherlands as well as abroad, to excellent reviews. You can find a complete overview of my professional life on my LinkedIn profile. If you want to get in touch, please use the contact form on this site, or send me an email at bas@ontestautomation.com. I’m also the developer of RestAssured.Net, a library that is meant to make writing tests for HTTP APIs in C# a breeze. I live in Amersfoort, The Netherlands, together with my wife and two sons. When I am not at work, I like to go outside for a long bike ride, or to sit down and read a good book.
Josh Grant
Title: What the Fuzz?
Overview: In this talk, I will introduce fuzz testing as a software technique for finding functional and security bugs in application software. Fuzzing is a technique that’s been around since at least the late 1980s (during that dark and stormy night), and has improved greatly since that time. The idea is simple: use automation to generate and execute test cases for application code with semi-valid “fuzzy” inputs to find issues. Even with a simple idea, fuzzing can be incredibly effective at finding issues, such as the Heartbleed security vulnerability in the OpenSSL library and the now infamous Log4Shell/Log4j vulnerability in the Java world. In this presentation, audience members will:
– understand what fuzz testing is and how it works
– see examples of how fuzzing can find functional and security bugs in several programming languages
– learn the benefits of fuzz testing in software development and testing efforts
This presentation will feature demos of fuzzing in practice.
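To make the core loop concrete ahead of the demos, here is a deliberately tiny, coverage-unaware fuzzer in plain Java. It is only an illustrative sketch: the fragile parsePercentage method and its bug are invented for the example, and real fuzzers (AFL, libFuzzer, Jazzer and friends) add coverage feedback, corpus management and far smarter mutation strategies.

```java
import java.util.Random;

// Toy fuzzer: repeatedly mutate a valid seed input and watch the code under test for crashes.
public class TinyFuzzer {

    // The "application code" under test: a deliberately fragile, made-up parser.
    static int parsePercentage(String s) {
        int value = Integer.parseInt(s.replace("%", ""));
        return 1000 / (value % 10); // hidden bug: divides by zero whenever the value ends in 0
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        String seed = "42%"; // a known-good input to mutate from
        for (int i = 0; i < 100_000; i++) {
            String input = mutate(seed, rng); // semi-valid "fuzzy" input
            try {
                parsePercentage(input);
            } catch (NumberFormatException rejected) {
                // gracefully rejected input: not interesting
            } catch (RuntimeException crash) {
                System.out.println("Crash on input \"" + input + "\": " + crash);
                return; // report the first crashing input and stop
            }
        }
        System.out.println("No crashes found");
    }

    // Replace, insert, or delete a single character so inputs stay "almost valid".
    static String mutate(String seed, Random rng) {
        StringBuilder sb = new StringBuilder(seed);
        int pos = rng.nextInt(sb.length());
        switch (rng.nextInt(3)) {
            case 0: sb.setCharAt(pos, (char) ('0' + rng.nextInt(10))); break;
            case 1: sb.insert(pos, (char) ('0' + rng.nextInt(10))); break;
            default: sb.deleteCharAt(pos); break;
        }
        return sb.toString();
    }
}
```

Running it quickly reports a crash on an input such as “40%”, exactly the kind of edge case a human test designer might never think to write down.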
Bio: Josh Grant has been working in test automation for over a decade. He’s worked with end-to-end UI test automation, API test automation and, recently, fuzz testing. He’s currently a Developer Relations Advocate at Code Intelligence and resides in Toronto, Ontario, Canada.
Thomas Haver
Title: The Automation Firehose: Be Strategic and Tactical
Overview: The widespread adoption of test automation has led to numerous challenges that range from development lifecycle integration to simple scripting strategy. Just because a scenario CAN be automated does not mean it SHOULD be automated. Teams that adopt automation often rush to automate everything they can — the automation firehose. For those scenarios that should be automated, every team must adopt an implementation plan to ensure value is derived from reliable automated test execution. In this session, the audience will learn how to automate both strategically and tactically to maximize the benefits of automation. Entry criteria will be demonstrated for automation in the development lifecycle along with a set of checks to determine automation feasibility & ROI.
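A feasibility and ROI check can be as simple as comparing the manual effort a scenario saves against the cost of building and maintaining its automation. The sketch below is only a back-of-the-envelope illustration of that kind of check; the method names and sample numbers are invented, not taken from the session.

```java
// Rough ROI check for a candidate automated scenario (illustrative numbers only).
public class AutomationRoi {

    // Hours of manual effort saved per year divided by hours spent building and maintaining the check.
    static double roi(double manualRunMinutes, int runsPerYear,
                      double buildHours, double yearlyMaintenanceHours) {
        double savedHours = (manualRunMinutes / 60.0) * runsPerYear;
        double costHours = buildHours + yearlyMaintenanceHours;
        return savedHours / costHours; // above 1.0, the scenario pays for itself within a year
    }

    public static void main(String[] args) {
        // A quick check that runs on every build easily earns its keep...
        System.out.printf("Smoke check ROI: %.1f%n", roi(15, 250, 8, 4));   // ~5.2
        // ...while a rarely executed edge case may never repay the automation effort.
        System.out.printf("Rare edge case ROI: %.1f%n", roi(30, 4, 16, 8)); // ~0.1
    }
}
```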
Bio: Thomas is presently serving as a Test Automation Architect. He leads a team of testers, ops engineers, and production support analysts in the adoption of DevOps practices. Previously, he led the enterprise automation support of 73 applications at Huntington National Bank that encompassed testing, metrics & reporting, and data management. Thomas has a background in Physics & Biophysics, with over a decade spent in research science studying fluorescence spectroscopy and microscopy before joining IT.
Martin F. Seidel
Title: Creating a reliable, efficient and scalable testing environment with Appium
Overview: Creating a testing environment for apps has many pitfalls. First, there is the complexity of the product: different customers use different combinations of features. Then there is the test data that litters the app. There are processes, like logging into the application, that are carried out again and again and cost a lot of time and money. Finally, there is execution, where everyone expects the quality assurance engineer to run these tests on their machine and write some reports. Does at least one of these points resonate with you? In my presentation, I will provide guidance on how we solved every one of the problems mentioned and made the solutions scalable for future growth. I will draw a clear picture of our testing infrastructure and how it is built. I’ll go over our test API, how we use scenarios, and show some shortcuts. I’ll provide details on how we integrated native tests into our GitHub pipeline and run them on every pull request.
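One of the shortcuts hinted at above is seeding application state through a test API so UI tests do not repeat the login flow every time. The sketch below shows that idea with Appium’s Java client; the /test-api endpoint, the deep-link scheme and the capability values are hypothetical, and the exact capability names and server URL depend on your Appium and driver versions.

```java
import java.net.URI;
import java.net.URL;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import io.appium.java_client.android.AndroidDriver;
import org.openqa.selenium.remote.DesiredCapabilities;

// Seed state via a (hypothetical) test API, then start the app already authenticated.
public class SeededAppiumSession {

    public static void main(String[] args) throws Exception {
        // 1. Ask the test API to create a scenario: a throwaway tenant with a logged-in user.
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest seed = HttpRequest.newBuilder()
                .uri(URI.create("https://staging.example.com/test-api/scenarios/basic-user"))
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();
        String sessionToken = http.send(seed, HttpResponse.BodyHandlers.ofString()).body();

        // 2. Start the Android app under test.
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("platformName", "Android");
        caps.setCapability("appium:automationName", "UiAutomator2");
        caps.setCapability("appium:app", "/builds/app-debug.apk");
        AndroidDriver driver = new AndroidDriver(new URL("http://127.0.0.1:4723"), caps);

        try {
            // 3. Hand the token to the app via a deep link instead of clicking through login.
            driver.get("exampleapp://login?token=" + sessionToken);
            // ... actual test steps against the feature under test ...
        } finally {
            driver.quit();
        }
    }
}
```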
Bio: Hello, I’m Martin and this is my story: 7 years ago I started my journey with Staffbase. Back then we were 12 people in a rented apartment in a small town in East Germany. Today we have grown to 900 employees in over 15 locations on 3 continents and recently became a unicorn. When I started at Staffbase, I was very interested in writing unit tests with JavaScript because nobody was doing that at university. After digging deeper into the topic, my horizons changed a lot. I took my first steps with test automation and built my first prototype with Appium. Our team has now grown to over 30 QAs worldwide and we have a wide range of tools and processes.
Bas Dijkstra
Title: Improving the quality of your tests with mutation testing
Overview: In this talk, I will show participants the principles behind mutation testing, and how it can help them gain better insight into the quality of their automated test suite.
Challenges: A lot of development teams write tests to check that their code behaves as intended. However, the quality of these tests is unfortunately often underappreciated. If any attention is paid to test quality at all, it is typically measured in terms of code coverage, which, as attendees will see from examples I use in the talk, is a flawed metric. Yet teams rely on the results of these tests in their software development lifecycle. Isn’t there a better way to get information about the quality of our tests?
What’s covered in the talk: Participants will get a practical introduction to mutation testing and how this technique can be used to gain better insight into the quality of their tests. I will start with some theory on:
– the flaws of code coverage as a quality metric
– the concepts of false positives and false negatives
– the underlying concepts of mutation testing and how it tries to address the above issues
After this, I will give a live demo of mutation testing, covering:
– How to add mutation testing to an existing set of unit tests. This covers adding PITest to an existing (Maven-based) Java codebase, consisting of application code that implements functions for a hypothetical online banking system, as well as a set of unit tests that document the application code and check the correctness of its behaviour.
– How to run a mutation testing tool and interpret the results. This introduces attendees to running PITest, checking the results (including how these results can make a build pass or fail) and interpreting the suggestions PITest makes in its reports.
– How to improve the existing set of unit tests for better mutation testing scores. This final step shows attendees how to improve their unit tests based on the results produced by mutation testing, and how that improves their mutation testing score. It ties together all the concepts they have seen so far.
Practical information: The live demo part of the talk is done using PITest (http://pitest.org/) and Java, but the concepts of mutation testing and the problem it addresses are universal. All code I use will be made publicly available on GitHub, so attendees can try it out for themselves after the talk.
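The contrast between line coverage and mutation score is easy to show in miniature. The example below is not the banking codebase from the demo, just a hedged sketch using JUnit 5: the first test executes every line of Account yet lets PIT-style mutants survive, while the second pins down the behaviour and kills them.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

// Hypothetical application code, in the spirit of the online-banking example.
class Account {
    private long balanceInCents;

    void deposit(long amountInCents) { balanceInCents += amountInCents; }

    void withdraw(long amountInCents) {
        if (amountInCents > balanceInCents) {   // a boundary mutant may change '>' to '>='
            throw new IllegalArgumentException("insufficient funds");
        }
        balanceInCents -= amountInCents;        // an arithmetic mutant may change '-=' to '+='
    }

    long balance() { return balanceInCents; }
}

class AccountTest {

    @Test
    void weakTest_coversEveryLine_butLetsMutantsSurvive() {
        Account account = new Account();
        account.deposit(10_00);
        account.withdraw(5_00);
        assertThrows(IllegalArgumentException.class, () -> account.withdraw(100_00));
        // Every line of Account is executed (100% coverage), but nothing checks the
        // resulting balance, so the '-=' -> '+=' mutant survives.
    }

    @Test
    void strongTest_killsTheMutants() {
        Account account = new Account();
        account.deposit(10_00);
        account.withdraw(10_00);                 // withdrawing the exact balance must work: kills '>' -> '>='
        assertEquals(0, account.balance());      // kills the arithmetic mutant
        assertThrows(IllegalArgumentException.class,
                () -> account.withdraw(1));      // overdrafts must fail: kills removed-conditional mutants
    }
}
```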
Bio: Hey, my name is Bas Dijkstra, and I am an independent test automation consultant and trainer. I have been active in the test automation field for some 17 years now, and have worked on software testing and automation solutions across a wide range of programming languages, frameworks and technology stacks. I’ve delivered test automation training to dozens of companies and hundreds of conference attendees in the Netherlands as well as abroad, to excellent reviews. You can find a complete overview of my professional life on my LinkedIn profile. If you want to get in touch, please use the contact form on this site, or send me an email at bas@ontestautomation.com. I’m also the developer of RestAssured.Net, a library that is meant to make writing tests for HTTP APIs in C# a breeze. I live in Amersfoort, The Netherlands, together with my wife and two sons. When I am not at work, I like to go outside for a long bike ride, or to sit down and read a good book.
Strategy Track
Kiruthika Ganesan
Title: The Power of Example Mapping
Overview: Who writes the acceptance criteria for your stories: product owners, business analysts, or the entire team? If the Given-When-Then scenarios are already prewritten and presented to the team, their thought process is curtailed. It might also lead to preconceived ideas and notions. This also tips the responsibility towards a single stakeholder, which can prove to be dangerous. What is the alternative then? Discover the power of Example Mapping. Example Mapping is a great way to encourage the team to adopt Behaviour Driven Development. The whole practice is about encouraging communication and collaboration between the various stakeholders. It is of prime importance in an agile setup that Product, Dev, and Tester share the same understanding and have equal partnership in the stories/features delivered. Example Mapping facilitates these conversations and also goes hand in hand with the shift-left approach. In a nutshell, it is a great team activity that results in substantial gains and increased productivity. Takeaways:
1) Understand the importance of Example Mapping and how participants can use it in their teams, with tips on convincing the entire team to try this approach.
2) Practical implementation examples for all kinds of stories, whether user stories or non-user stories.
3) Limitations of Example Mapping and when not to use it.
Bio: Kika is passionate about testing and approaches it with a holistic view. She has over 17 years of experience in the IT industry, working as a tester, developer, and trainer. Kika believes in the power of people and of collaborating to build a safe environment where teams can thrive and work on creating great software. She enjoys teaching and getting involved in community activities like speaking at conferences and delivering workshops. She is also one of the tutors at the Coders Guild, a core member of Synapse QA, a keen advocate of Women in Tech initiatives, and a global ambassador. In her spare time, she loves spending time with her family and writing short stories.
Antonello D’Ippolito
Title: How can I trust my test suite?
Overview: You recently started working on a new project, or you have been working on it for a while, and it has undergone many changes due to the contributions of multiple developers. This means that you may not be familiar with how the automated tests were written, or how effective they are. So, how can you be confident that you won’t break anything when you deploy new features to production, or when you refactor that old piece of software that’s holding you back? Do you have enough trust in your test suite to rely on the fact that a green CI build means that everything is okay? There are many tools that provide metrics about your code and tests, such as code coverage and CRAP metrics, but they have their limitations. In this talk, we will explore ways to evaluate the effectiveness of your test suite, how to improve it, and the benefits of having a robust and comprehensive set of automated tests.
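For reference, the CRAP (Change Risk Anti-Patterns) score mentioned above combines a method’s cyclomatic complexity with its test coverage. The formula in the comment is the published one from Savoia and Evans; the small sketch around it (with invented numbers, not code from the talk) also hints at the limitation: full coverage flattens the score even when the tests assert nothing.

```java
// CRAP(m) = comp(m)^2 * (1 - cov(m)/100)^3 + comp(m)
// comp = cyclomatic complexity, cov = test coverage of the method in percent.
public final class CrapScore {

    static double crap(int cyclomaticComplexity, double coveragePercent) {
        double uncovered = 1.0 - coveragePercent / 100.0;
        return Math.pow(cyclomaticComplexity, 2) * Math.pow(uncovered, 3) + cyclomaticComplexity;
    }

    public static void main(String[] args) {
        System.out.println(crap(15, 0));   // 240.0  -> complex and untested: high change risk
        System.out.println(crap(15, 90));  // 15.225 -> same complexity, well covered
        System.out.println(crap(15, 100)); // 15.0   -> looks "safe" even if the tests assert nothing
    }
}
```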
Bio: I’m an Italian software engineer now living in the Netherlands. I’ve been working as a PHP backend engineer for a long time, on many projects in many industries, in Italy and in the Netherlands. I’m interested in Domain-Driven Design, Continuous Delivery, and Test-Driven Development. While I’m primarily a backend engineer, I’m also a Scrum Master and I have a strong interest in Agile methodologies in general. I’m a proud member (and speaker when possible) of the Roma PHP User Group. I’m also a musician with a self-produced original music project.
Ben Oconis
Title: Breaking Down Silos – Bridging Collaborative Gaps In Your Org
Overview: Do you often feel you are the last team in your organization to know about features before they reach customers, or that when features are finally ready for testing, it’s too late to produce a high-quality product? Have you felt a divide between your testing team and other engineering or product teams within your organization? In this talk, Ben will review how, in his organization, he has built a bridge between product, engineering, stakeholders, and the testing team, forming a productive and cooperative environment. He will walk through how his testing team has built that relationship over time, why testing teams may find themselves isolated in silos, and the impact this has on the organization. Finally, he will look into strategies such as building communication, empathy, and early team-building exercises across teams, and treating testing as a whole-team activity, to help dig out of the silo and create open communication channels shared across your org. So join him and learn how to create a more collaborative and productive environment in your organization.
Bio: Ben is currently the lead QA at Storyblocks, the first stock subscription service. He comes from a customer support background, where he spent 15 years managing customers. He attempts to take that background and apply it to find innovative ways to help improve quality across the organization. He has had some great successes and failures and hopes to share those with the testing community. When not testing, Ben enjoys hiking, reading comic books, watching movies, and hanging with his family, which includes two boys (a 7-year-old and a 3-year-old), his wife, and his old dog Dexter, who is a dachshund mix.
Valentin Ranshakov
Title: Kill regression!
Overview: In the fast-paced world of agile software development, efficiency and speed are key factors to success. Regression testing, while a powerful tool, may not always be the most practical solution in such environments. While this form of testing has traditionally been used to identify defects and issues with software, the process can be slow and expensive, and it can keep development teams from experimenting with new code in production. For these reasons, many organizations have shifted their focus towards developing tools and environments that support testing in production. By leveraging tools like canary releases, shadowing traffic, and beta testing, developers can collect sufficient data points to ensure that a release is stable without the need for regression testing. These tools allow for more streamlined and efficient feedback on code changes, enabling engineers to experiment with new code and quickly identify and address any issues that may arise.
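As a small illustration of one of those building blocks, canary releasing usually starts with deterministic bucketing of traffic. The sketch below is invented for the example (class name, 5% threshold and bucketing scheme are all assumptions, not taken from the talk): it routes a stable fraction of users to the new code path so its error rates and metrics can be compared with the old one before rolling out further.

```java
import java.util.UUID;

// Deterministic canary bucketing: the same user always lands in the same bucket.
public final class CanaryRouter {

    private final int canaryPercent;

    CanaryRouter(int canaryPercent) {
        this.canaryPercent = canaryPercent;
    }

    boolean routeToCanary(UUID userId) {
        int bucket = Math.floorMod(userId.hashCode(), 100); // stable bucket in [0, 100)
        return bucket < canaryPercent;
    }

    public static void main(String[] args) {
        CanaryRouter router = new CanaryRouter(5); // expose roughly 5% of users to the new release
        UUID user = UUID.randomUUID();
        System.out.println(router.routeToCanary(user) ? "new code path" : "stable code path");
    }
}
```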
Bio: Valentin is a quality assurance wizard, with over a decade of experience in the fast-paced world of extreme scrum and agile environments. He’s a master of effective software quality assurance and testing, and he’s always pushing the boundaries of what’s possible. Today, Valentin is working on the cutting edge of consumer-focused software development as a Staff AQA Engineer, building high-quality mobile apps and backends that delight millions of users. But he’s not content to rest on his laurels – his biggest challenge is helping his teams build, break, and recover ecosystems at lightning speed, all while keeping their focus firmly on the end user. It’s a thrilling ride, but Valentin wouldn’t have it any other way.
Career Growth Track
Ken De Souza
Title: On-call doesn’t need to suck; ways to help make your on-call better
Overview: Being on-call can be really hard at times. Incidents happen, and when they do, they can cause alert fatigue, long hours, lots of stress, and not getting your day job done. In this talk, Ken will discuss his experience working for a company that grew rapidly over several years and where on-call needed to evolve to make it more humane. As part of this talk, Ken will discuss strategies such as:
– being an advocate for better quality software, so that it prevents outages and unnecessary alarms
– dealing with the emotional component of firefighting-related stress when incidents happen
– helping change your organization so that on-call becomes part of the culture
Attendees will take away:
– ways of making your on-call rotation more humane
– ways of evolving how your on-call rotation works so that it helps prevent incidents from happening
– ideas for making your alerting and monitoring better, to prevent burnout and to increase your visibility
Bio: Ken De Souza has been in software development for over 20 years. He is a software developer currently specializing in building tools and culture that help developers securely deploy and monitor the code they create, with a passion for delivering high-quality software at a rapid pace.
He has spoken at software development conferences over the last 10 years. He currently resides in Waterloo, Ontario, Canada.
Tina Fletcher
Title: Career Growth for Testers
Overview: Congrats, you’re a Tester! (…or a Test Developer, SDET, QA Specialist, or whatever your company happens to call people who focus primarily on software quality!) This is a really awesome role to be in because you need to maintain and exercise a diverse skillset and mindset every single day. Your unique position (it’s sort of like a connection point between many other roles in a software organization) helps you to think of problems, risks, and challenges that might not be identifiable UNLESS there is someone on the team specifically dedicated to thinking about software quality. So now that you’re here… how do you grow and progress within your role, and along your career journey? I’ll share my perspective on different career paths and roles that Testers can grow into, what you can do to help you get there, and things that help you improve professionally no matter what your career aspirations are.
Bio: Tina is an Engineering Director at D2L who brings a software quality-focused mindset to the teams and projects she leads. She’s also the president of the KWSQA, an occasional conference speaker, and a bit obsessed with her vegetable garden. Find her online at @fletchertinam or tinafletcher.ca.
Lisa Crispin and Janet Gregory
Title: Lean Coffee
Overview: Join in a Lean Coffee style meeting! Lean Coffee is a structured discussion technique where participants propose and then vote on topics to cover. There is no agenda defined in advance. This is a great way to connect and interact with your fellow software quality enthusiasts!
Bio: Lisa Crispin is the co-author, with Janet Gregory, of Holistic Testing: Weave Quality Into Your Product; Agile Testing Condensed: A Brief Introduction; More Agile Testing: Learning Journeys for the Whole Team; Agile Testing: A Practical Guide for Testers and Agile Teams; and the LiveLessons “Agile Testing Essentials” video course. She and Janet co-founded the Agile Testing Fellowship, Inc., which offers the “Holistic Testing: Strategies for agile teams” and “Holistic Testing for Continuous Delivery” live training courses both remotely and in person. Lisa was voted by her peers as the Most Influential Agile Testing Professional Person at Agile Testing Days in 2012, and she is happily available for training and consulting. Please visit www.lisacrispin.com, www.agiletestingfellow.com, and www.agiletester.ca for more. Contact Lisa on Twitter as @lisacrispin or on LinkedIn at https://www.linkedin.com/in/lisa-crispin-88420a/.
Janet Gregory is a testing and process consultant with DragonFire Inc. She specializes in showing agile teams how testing activities are necessary to develop good quality products. She works with teams to transition to agile development and teaches agile testing courses worldwide. She contributes articles to publications and enjoys sharing her experiences at conferences and user group meetings around the world. For more about Janet’s work and her blog, visit https://janetgregory.ca or https://agiletester.ca. You can also follow her on Twitter @janetgregoryca or on LinkedIn.
She is the co-author, with Selena Delesie, of Assessing Agile Quality Practices with QPAM, and, with Lisa Crispin, of Agile Testing Condensed: A Brief Introduction (LeanPub, 2019), More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley, 2014), and Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009); the LiveLessons Agile Testing Essentials video course; and the courses Holistic Testing: Strategies for agile teams and Holistic Testing for Continuous Delivery.
Together with Lisa Crispin, she has founded the Agile Testing Fellowship to grow a community of practitioners who care about quality. Check out https://agiletestingfellow.com to find out more about courses and membership.
Jerry Penner
Title: ASD-1: Disability or Testing Superpower?
Overview: ASD-1 (Asperger’s Syndrome) affects the way a person thinks, feels, and interacts with people and things. In this talk, Jerry Penner describes the behaviours and traits of ASD-1, outlining his own journey of self-discovery with the condition. He then shows how ASD-1 can help or hinder a software tester’s abilities. Do you have it? Is it making you a better or worse tester? Do you manage testers? Can you see which of your people may have ASD-1 characteristics? At the end of this talk you will be able to identify neurodivergent ASD-1 behaviour and have an understanding of how to interact with such individuals in the workplace. Managers, this talk will help you manage your people in a way that makes sense to them and gets the most out of them. If you think you have ASD-1, you will take away an understanding of how to interact with your neurotypical workmates.
Bio: Jerry Penner is a passionate tester who has been helping ridiculously smart people build better software for over 17 years. One of his favourite testing tasks is helping developers and testers find important bugs faster with less effort. He has delivered talks at Desire2Learn, the KWSQA Quality Conference, and STARCanada. He has assembled two test teams from the ground up and implemented good practices at two more companies. Over the course of his career he has gone beyond his station to identify areas of improvement in methods and processes and championed their implementation. This has collectively saved his employers tens of thousands of hours of wasted labour and tens of millions of dollars in lost productivity, lost sales opportunity, lost efficiency of resources, and lost reputation hit points.