Chris Redekop: Infrastructure Testing in Bash, Linux, and AWS
Like any other code, infrastructure code deserves automated tests. But while many of us have taken the first few steps toward infrastructure as code, we are not all testing our infrastructure:
1. Express infrastructure in machine-readable code? Check!
2. Store the infrastructure code in Git? Check!
3. (Automatically) apply the infrastructure code during deployment? Check!
4. Test that #3 behaved as expected? Uh???
This presentation discusses infrastructure testing in the context of Bash, Linux, and AWS. Come to learn the tools available and how to write infrastructure tests for maximum effect.
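The talk targets Bash, Linux, and AWS; as a flavour of what an infrastructure test looks like, here is a minimal sketch in Python (the specific checks are hypothetical stand-ins for real post-deployment assertions such as "is nginx installed" or "does the S3 bucket exist"):

```python
# A minimal infrastructure-test sketch: after a deployment step runs,
# assert that the machine actually looks the way the code said it should.
# The checks below are illustrative, not from the talk.
import os
import shutil

def test_expected_directory_exists():
    # A provisioning step was supposed to ensure this path exists.
    assert os.path.isdir("/tmp"), "/tmp should exist after provisioning"

def test_required_tool_on_path():
    # A provisioning step was supposed to install this binary.
    assert shutil.which("sh") is not None, "sh should be on the PATH"

if __name__ == "__main__":
    test_expected_directory_exists()
    test_required_tool_on_path()
    print("all infrastructure checks passed")
```

The same shape works with real AWS checks swapped in; the point is that step #4 becomes ordinary assertions run right after the deploy.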
I am an experienced software developer who works with my team to deliver valuable software quickly. I look for every opportunity to improve how we get software done.
Christine McGarry: The Fellowship of the Test: Building a Community Across Agile Teams
Your company has grown and your development team is ready to divide into smaller, project-focused groups. Or your development team is already divided and the project teams have started to drift apart. How do you encourage the spark of collaboration and communication between multiple project teams within your organization? Join Christine McGarry as she shares the story of how she began The Fellowship of the Test: a gathering of testers to spark collaboration and communication across multiple project teams. Christine will share her experiences of how she began the journey, what went well, what was a struggle, and what tips she can provide attendees should they wish to begin their own journey. Whether you are a leader of testing within your organization who wishes to ignite the fire of collaboration or you are an individual tester looking to build a grassroots campaign of community, you will leave with a series of actions you can take to begin your own journey (Lord of the Rings references are optional).
Christine McGarry loves working with clever and talented people to build better software. One of her favourite testing tasks is tracing the origin of a bug; problem solving and creativity at its best! She regularly reads testing blogs and books and loves to try new testing ideas to become a more effective tester. Christine has delivered talks at KWSQA Quality Conference and STARCanada. She has implemented several quality monitoring initiatives for both the technological side and people side of development teams, and is currently helping to foster community at a fast-growing startup.
Dave Westerveld: Automation anti-patterns and what to do about them
If you listen to the kinds of things testers complain about, a recurring theme is problems with test automation. Sometimes this comes out as testers complaining to each other about how hard it is to add, run, or change tests. Sometimes it comes out as complaints about developers who keep ‘breaking tests.’ And sometimes it surfaces as frustration with the number of bugs the automation misses. Whatever the situation, we can probably all agree that our automation could improve.
In this talk we will dig into some of the reasons your automation might be frustrating. We will look at real life examples of automation anti-patterns as well as solutions that were used to improve the automation in those situations.
Automation anti-patterns we will talk about:
- Big test: Test bloat and some of the reasons for it
- Little test: Not using automation in places where it would be helpful
- Green test: Tests that never fail
- Red test: Tests that cry wolf
- My test: Tests you love because you wrote them
- Your test: Tests you are scared to change because someone else wrote them
As we talk about how to recognize and correct these issues, you will gain the ability to recognize automation anti-patterns and you will also learn strategies for bringing back tests from the dead and making them useful again.
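As a concrete (hypothetical) illustration of the "green test" anti-pattern above: a test that exercises code but asserts nothing can never fail, so it gives no signal at all. The example below is illustrative, not from the talk:

```python
# Hypothetical illustration of the "green test" anti-pattern.
def discount(price, percent):
    """Apply a percentage discount to a price."""
    return price * (1 - percent / 100)

def test_discount_green():
    # Anti-pattern: the code runs, but nothing is checked,
    # so this test stays green even if discount() is wrong.
    discount(100, 10)

def test_discount_fixed():
    # The same test with a real assertion: it can actually fail.
    assert discount(100, 10) == 90.0

if __name__ == "__main__":
    test_discount_green()
    test_discount_fixed()
    print("both tests ran")
```

A quick audit for tests with no (or trivially true) assertions is one cheap way to find green tests in an existing suite.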
Dave Westerveld is a senior test analyst at Ansys Inc., where he has worked on testing many different projects, ranging from well-established products to the early stages of major new initiatives. He has also been involved in many different aspects of the testing role, including being a bug hunter, helping build out an automation framework, and mentoring other testers. He has been involved in traditional automation initiatives at various stages and has also helped to drive up product quality through the creative use of automation tools. Dave has a desire to see teams efficiently producing high-value software and is enthusiastic about understanding the ways automation tools can be used to help with this goal.
Graeme Harvey: When Automating Everything Seems Impossible – A Lesson from The Martian in Solving Complex Automation Problems
Maybe you have a large, complex system that takes weeks of testing every time you want to release. Or perhaps your test team is small, and can’t afford the time to continually perform a full suite of regression testing. Regardless of your problem, you’re looking at automation as the solution. But where do you start? And which fancy new, state-of-the-art, solve-all-your-problems solution are you going to use?
This talk draws parallels between implementing an “automation suite” from scratch and fictional astronaut Mark Watney’s attempt to get home from another planet in the 2015 cinematic masterpiece, The Martian. We’ll discuss things like tool selection, deciding what problems to solve (and where to start first), handling time and resource constraints, and more. Using personal, real world examples, I will highlight the lessons from Watney’s problem solving exercises on Mars – all oriented around one very simple piece of advice: “Just begin.”
Watney’s challenges were literally astronomical – he had to survive alone on (and get home from) a different planet! All you have to do is automate your testing. That doesn’t sound so bad anymore, does it?
Graeme Harvey is currently a Software Test Strategist with Waterloo-based company Magnet Forensics. After spending the early years of his career learning the basics of software testing, he has more recently developed a passion for creative test strategies and powerful automation suites. Graeme believes good test organizations aren’t afraid to push the boundaries of creativity and move beyond traditional methods.
Hilary Weaver-Robb: Testing RESTful Web Services
A lot of folks doing testing (QAs, BAs, and devs alike) have experience testing applications on the front end – a graphical user interface on a website, or a mobile app. One of the often missed parts of these applications is the web services or REST APIs that power those interfaces. In this session we’ll focus on RESTful web services – what they are, how (and why) to do functional and exploratory testing, how we can automate some tests using C#, and tools that we can use to help us test them. Attendees will walk away with the understanding, resources, and techniques they need to effectively test and write automation for REST services.
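The session's examples use C#, but the shape of a functional REST test is language-agnostic. Here is a self-contained Python sketch that stands in a tiny local server for a real service (the endpoint and payload are invented for illustration):

```python
# Sketch of a functional test against a REST endpoint: call it, then
# assert on the status code and the JSON payload. A tiny local server
# stands in for a real service so the example runs anywhere.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakeApi(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"id": 1, "status": "active"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), FakeApi)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The actual test: status code, content type, and payload fields.
url = f"http://127.0.0.1:{server.server_port}/users/1"
with urllib.request.urlopen(url) as resp:
    assert resp.status == 200
    payload = json.load(resp)
assert payload["status"] == "active"
print("REST test passed")
server.shutdown()
```

Tools like Postman automate the same status/header/body checks interactively; the code version is what you put in CI.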
Hilary Weaver-Robb is a Software Quality Architect at Quicken Loans, where she recently won an award for “Getting Sh!t Done”, makes friends with developers & mentors testers, and helps teams to improve their quality processes. She runs the Motor City Software Testers user group, scarcely maintains a blog at g33klady.com, tweets a lot, and spends a ton of time building stuff on Minecraft.
James Fogarty & Jeff MacBane: Mobbing for Test Design: Connecting with Your Colleagues’ Test Ideas
Do you have trouble generating test ideas? Are bugs getting through because you missed certain tests? This workshop will teach you mob test design techniques that help you generate test ideas for your features and products. You will learn the benefits of mob test design and understand the value your colleagues bring to test design.
In this session, James and Jeff will prepare you to practically apply mob test design back on your team at work.
James Fogarty is a passionate software testing practitioner at TechSmith, the makers of Snagit, Camtasia, and other visual communication software applications. He works with software testers, developers, and business professionals to level up software quality by designing tools, teaching automation, and involving the entire development team in testing and improving software quality. He’s been testing digital cameras, imaging software, and commercial software for over sixteen years. James co-founded the Lansing Area Software Testers meetup, a group dedicated to improving software quality and the software community in the greater Lansing area.
Jeff MacBane is a software tester at TechSmith, the makers of Snagit, Camtasia, and other visual communication software applications. He has over thirteen years of software testing experience in insurance, legal, medical and commercial software organizations. Jeff has a passion for testing software and continuing to learn about Agile and Scrum practices. He has presented at Mid-Michigan Agile Group and Lansing Area Software Testers meetups.
Jared Small: An Introduction to Test Data Management
The collection and use of data in organizations is growing rapidly (90% of the world’s data has been produced in the last two years alone), and test and development groups increasingly face the challenge of working with larger and larger amounts of complex data. Of course, time-to-test and time-to-market haven’t changed, and software professionals are expected to deliver more, faster.
Do you wish you knew more about the concepts and ideas around the creation, storage, and re-use of test data? Do you wish you could rapidly create large amounts of data, or safely get your hands on production data while preserving your customers’ privacy and private financial information?
This interactive session will give a general introduction to Test Data Management (TDM), and the speaker will share experiences from a recent real-world project implementing TDM. Attendees will gain insights into data practices that can enhance and streamline the software development and delivery pipeline while protecting the privacy and security of our customers’ information.
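One common TDM technique the abstract alludes to is masking production data before testers touch it. The sketch below shows a deterministic masking scheme in Python; the field names and hashing approach are illustrative assumptions, not the speaker's implementation:

```python
# Sketch of deterministic data masking: production values are replaced
# with realistic but non-identifying ones, and the same input always
# yields the same output so joins across tables still line up.
import hashlib

def mask_email(email: str) -> str:
    # Hash the real address and build a fake one from the digest.
    digest = hashlib.sha256(email.lower().encode()).hexdigest()[:10]
    return f"user_{digest}@example.com"

record = {"name": "Jane Doe",
          "email": "jane.doe@example.com",
          "balance": 1042.17}

masked = {**record,
          "name": "MASKED",
          "email": mask_email(record["email"])}

assert masked["email"] != record["email"]               # PII removed
assert mask_email(record["email"]) == masked["email"]   # deterministic
print("masked:", masked["email"])
```

Real TDM tools add format-preserving masking, referential integrity across whole schemas, and synthetic data generation on top of this basic idea.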
Jared has over 15 years of experience in software testing, including over 10 years in leadership. He has a particular passion for testing and has been a volunteer on the board of the KWSQA for over 9 years. Jared has worked in small, medium, and large organizations on web-based, mobile, and data-heavy enterprise environments. He takes pride in a humanistic and empathetic approach to design, testing, and leadership. Jared is currently exploring interests in Test Data Management and Design Thinking.
Josh Roach: Unit testing one-off scripts – it’s important and you should do it!
Here’s the situation: you need to analyze a huge CSV export. Maybe you need to cross-check the data against the result of an API call and filter to only the rows you care about. I typically resort to a simple script (usually written in Python) to do this work. Scripting provides a lot of benefits, including accuracy and repeatability (compared to manual scanning). But here’s the thing – I likely only need to run this script once (or maybe a couple of times). Conventional wisdom (aka developer laziness) would deem unit tests unimportant for this script. I want to make the case that not only are unit tests important, they should be considered mandatory. I’ll discuss a few real-world examples of how unit tests on one-off scripts have saved me, and go through some tools and techniques to teach you how to do this for your own scripts.
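The core move is simply pulling the script's filtering logic into a function and testing it before trusting its output. A minimal sketch (the CSV columns and filter rule are invented for illustration):

```python
# Sketch: extract the one-off script's logic into a testable function,
# then unit-test it before trusting the script's output.
import csv
import io

def active_rows(csv_text, valid_ids):
    """Keep only rows whose id is in valid_ids and whose status is 'active'."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader
            if row["id"] in valid_ids and row["status"] == "active"]

def test_active_rows():
    data = "id,status\n1,active\n2,inactive\n3,active\n"
    rows = active_rows(data, valid_ids={"1", "2"})
    # id 2 is dropped (inactive); id 3 is dropped (not a valid id).
    assert [row["id"] for row in rows] == ["1"]

if __name__ == "__main__":
    test_active_rows()
    print("ok")
```

The one-off script then just reads the real file and calls the tested function, so the part most likely to be wrong has already been exercised.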
After I graduated from the University of Waterloo with a degree in Software Engineering, I got hired on at D2L and it’s been an awesome first 6 years of my career. I’ve worked as both a Software Developer and Test Developer, with testing experience largely on the developer side of things (unit, integration, scripting, etc.). Outside of work, I’m passionate about personal finance and simple living. And being the best husband/father I can be – my 1 year old keeps me busy 🙂
Kevin Malley: Let it Go, The “Frozen” Perception of Testing in Insurance
Technological advances have fueled a massive shift in the insurance industry. Mobile Technology, IoT and Big Data have turned typical development and testing practices upside down. Gone are the days of long release cycles, testing at the end and scripted tests your grandma could run. These “Old School” approaches don’t cut it anymore. Companies need access to information faster than ever to make quick investment and product pricing decisions to remain competitive. Not to mention customers are demanding a digital experience.
This talk will focus on how testing practices are adapting to change in the Insurance industry. We will explore the key technological advances with tips and tricks specific to each. We will explore modern test practices from exploratory to automation and CI / CD pipelines and how they can be applied and adapted for all types of technology, even the ones that were here before many of us.
With over 15 years of software experience across small and large companies in several industries, Kevin believes that the agility of a software tester can be their biggest asset. Quality cannot be checked into a product at the end. Kevin has spent much of his career transforming teams and organizations to put quality first and deliver incremental value to the customer.
Maciek Konkolowicz: Automation Delivery Channels: Start Early, Achieve Scale
Maciek will introduce the idea of an automated delivery channel and its value, and will show three demos of three different implementations: a delivery channel for legacy desktop applications (via Microsoft Test Lab), a web automation delivery channel via TFS and Xamarin Test Cloud, and an automation delivery channel via Jenkins and Sauce Labs. Maciek’s talk will drive home why it is important to design a delivery channel early in the life cycle of an automated test solution, and then show how to do it, with examples available as takeaways.
Maciek has been a quality champion his entire professional life. After graduating from university he found himself working in IT process auditing, which introduced him to the need for quality processes. After a brief stint in the process area, he decided to become more technically focused and found his QA passion. As many of his QA compatriots have done, he started with manual testing, but quickly became interested in automation. For the past seven years, he’s been focusing on learning, implementing, showing and spreading the automation butter to whoever he can corner, be it Dev, QA, BA, or even Project Managers. He’s a passionate technologist who loves to externalize his thoughts to gain perspectives of others. He tries to document his lessons learned at http://macie-j.me and his bad jokes on twitter (@mkonkolowicz). He has spoken at local meetups and conferences and loves to share his passion for the quality crusade.
Mark Weiss: Hiring your first tester – do you need a Unicorn?
Hiring your first tester is an important decision that startups and small companies usually leave until the last minute. It is very common to search for a “Unicorn” tester: a senior-level testing employee whom the company believes will solve all its quality problems. This talk will discuss the pitfalls of trying to find a Unicorn, why companies can’t attract one, and some practical approaches to actually filling your first testing position with a good candidate. All this from the vantage point of someone whom others have dubbed a “Unicorn” and who has been on both sides of the hiring process.
Mark Weiss is a Senior Test Developer working at D2L, the creators of the Brightspace LMS. With a degree in CS from the University of Waterloo and over 8 years of software development experience, Mark has tested from all facets of the software life cycle. He’s created software as a Software Developer, supported software as a Tier 3 Technical Support Analyst, and has now fully engrossed himself in software testing in a Test Developer role. Mark’s ability to learn a product through and through and understand how people use it has allowed him to excel in his career. He’s passionate about automating the simple things and diving head first into the complex things. With an intuitive and creative mind, he’s always looking for new challenges, inventing new tools, and finding better ways to test.
Phil Kirkham: Buckets of Testing – working in a multi-project environment
As the sole exploratory tester in a company with multiple projects going on, I had to work out how to work efficiently and make teams aware of how I could help them. The current Agile literature didn’t seem to fit my particular needs, so after some trial and error we came up with a plan where teams assess the size of the testing ‘bucket’ they think they will need. My talk will explain the challenges we had and the plan we came up with to make the most of my expertise and use it across the company.
For a sole tester working in a company, this talk should provide some ideas on how to survive and flourish. For other testers, it might suggest how they could be working with developers in a high-quality environment.
After working as a developer, I moved to being a tester and then moved from England to Michigan. I currently work as the sole exploratory tester at Atomic Object on a range of projects from mobile to web to embedded, and moderate the Software Testing Club website.