Wednesday, January 30, 2019 – In Praise of the Generalist

Speaker: Matthew Middleton

RSVP Today!

Topic: Michael Bolton calls software testing “Applied Epistemology.” To me, this suggests that those who wish to practice the art and science of testing need to study broadly, to better understand the contexts in which they work. In this presentation, I’ll discuss the need for generalists versus specialists, why I think cross-pollination is critical to the testing community as well as the other communities we interact with, and some observations I’ve made with regard to overlapping challenges faced by the cyber security and software testing communities.

Bio: Matt is a QA/QC Analyst for Radient360, and has been a black-box software tester for a decade, helping developers catch their bugs before they get out into the wild. He’s primarily been influenced by James Bach, Michael Bolton, and Cem Kaner, and subscribes to the Context-Driven School of Testing. He has taught software testing online through the Association for Software Testing, as well as the inaugural Miramichi class for PLATO Testing.

Wednesday, November 28, 2018 – Big Data and Analytics for QA

Speaker: Sarah McKenna

RSVP Today!

Topic: At a Web scraping company covering 3,000 (and growing) websites, staying on top of changes to scripts and target third-party sites was a constant struggle until we started logging each HTTP call and response to an easily queried NoSQL database and built real-time error analytics and error handling. It was not expensive or particularly hard, yet it made a huge impact on overall quality.
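The approach described above can be sketched in a few lines. This is a minimal illustration, not the speaker’s actual implementation: an in-memory list stands in for the NoSQL collection, and the `CallLogger` name and its fields are hypothetical.

```python
import time
from collections import defaultdict

class CallLogger:
    """Records one document per HTTP call so error patterns can be
    queried later. A plain list stands in for the NoSQL collection."""

    def __init__(self):
        self.calls = []

    def log(self, url, status, elapsed_ms):
        # One document per call: enough to query by site, status, or latency.
        self.calls.append({"url": url, "status": status,
                           "elapsed_ms": elapsed_ms, "logged_at": time.time()})

    def error_rate_by_site(self):
        """Group logged calls by hostname and compute the share of non-2xx responses."""
        totals, errors = defaultdict(int), defaultdict(int)
        for call in self.calls:
            site = call["url"].split("/")[2]  # hostname portion of the URL
            totals[site] += 1
            if not 200 <= call["status"] < 300:
                errors[site] += 1
        return {site: errors[site] / totals[site] for site in totals}
```

With every call logged, “which target sites started failing overnight?” becomes a one-line query instead of a manual investigation.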

Bio: Passionate quality evangelist with 20 years’ experience leading QA in numerous NYC startups, corporations, and government settings, mostly building test operations from scratch to cover functional, security, and performance testing. Now focused on ways to introduce AI into automated functional testing.

Wednesday, October 24, 2018 – “Well-stated, half-solved:” Effectively Reporting, Triaging, and Managing Defects

Speaker: Lauren Weber

RSVP Today!

Topic: The quotation “A problem well stated is half-solved” is from American inventor and businessman Charles Kettering [1]. This axiom can be applied to defect reporting, especially when defects are complex, hard to reproduce, or cross several layers of software. A defect report that is complete and accurate can help development correct a problem faster, with fewer log-gathering and test-fix cycles, resulting in better quality software delivered faster. In this presentation we discuss the construction of effective defect reports, examine techniques to move defects through their lifecycles efficiently, and consider ways QA professionals can go beyond reporting defects to debugging them in depth, and even fixing them.
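A “well-stated” report can be captured as a checklist of required fields. The sketch below is an illustration of that idea, not a template from the talk; the `DefectReport` class and its field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DefectReport:
    """The essential fields of a complete, accurate defect report."""
    title: str
    steps_to_reproduce: list
    expected_result: str
    actual_result: str
    environment: str            # OS, build number, configuration
    attachments: list = field(default_factory=list)  # logs, screenshots, traces

    def is_well_stated(self):
        # A report is "half-solved" only if every essential field is filled in.
        return all([self.title, self.steps_to_reproduce,
                    self.expected_result, self.actual_result, self.environment])
```

Gating a bug tracker on a check like this is one way to cut down on back-and-forth log-gathering cycles before development even starts.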

Bio: Lauren Weber has worked in the telecommunications industry since 2003, testing software and leading test teams. He has worked with embedded and real-time systems such as wireless modems, protocol stacks, and video processing solutions, often specializing in verifying standards compliance. He enjoys troubleshooting complex problems, learning from his mistakes, and reading Dilbert comics.

Wednesday, May 30, 2018 – Everything I Know About Test Automation I Learned From A Developer

Speaker: Lee Manchur

RSVP Today!

Topic: Effective test automation requires using the same principles and practices as “real” software development, but often QA teams are expected to write and maintain their own test automation frameworks without a development background. With this in mind, we asked our company’s developers to take a hard look at our automated test architecture, and were amazed at the efficiencies they found! This presentation will explore developer-initiated ideas for functional Selenium and API test suites which helped our team increase our coding efficiency by 50% and, overall, become more effective testers (and developers). We will explore, in the context of automated tests, effective use of Object-Oriented Design (but no, this is not “yet another” Page Object tutorial!), using abstract classes to maximize code reuse, quickly comparing data sets with reflection, and using delegate patterns and methods to generate data. These ideas might be obvious to developers, but for most testers they are new design patterns to add to the toolbox.
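Two of the patterns named above, abstract base classes for reuse and reflection-based data comparison, can be sketched briefly. This is an illustrative example rather than the team’s actual framework; `ApiTestBase`, `diff_records`, and the sample records are all hypothetical.

```python
from abc import ABC, abstractmethod
from types import SimpleNamespace

class ApiTestBase(ABC):
    """Shared logic lives in the abstract base; each concrete suite
    overrides only what differs, instead of copy-pasting setup code."""

    @abstractmethod
    def endpoint(self):
        """Each concrete test suite names the endpoint it exercises."""

    def diff_records(self, expected, actual):
        # Reflection (vars()) walks the attributes of the expected record,
        # so no per-record-type comparison code needs to be written.
        return {name: (value, getattr(actual, name, None))
                for name, value in vars(expected).items()
                if getattr(actual, name, None) != value}

class UserApiTest(ApiTestBase):
    def endpoint(self):
        return "/users"
```

A mismatch then reports exactly which fields diverged, e.g. comparing `SimpleNamespace(name="Ada", role="admin")` against a response with `role="viewer"` yields `{"role": ("admin", "viewer")}`.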

Bio: Lee Manchur is the QA Manager at OCAS, the application service for Ontario’s public colleges. Since 2014, he has been responsible for testing and guiding software quality practices throughout all of OCAS’ products, after spending nine years testing mobile applications at BlackBerry. A career-long software tester, Lee strives to establish comprehensive test suites within his teams by balancing exploratory and heuristic test practices with scalable automated test solutions.

Wednesday, April 25, 2018 – Testing Through Time And Space: NASA’s Twenty-Year Mission to Saturn

Speaker: Andrea Connell

RSVP Today!

Topic: NASA’s Cassini mission to Saturn launched in 1997, and orbited the ringed planet continuously for thirteen years until the mission ended in 2017. Throughout this time, the Mission Sequencing Subsystem team at the Jet Propulsion Laboratory developed software used to design and validate the spacecraft’s science activities. As we learned more about the Saturn system and as the spacecraft aged, software changes were needed. Automating tests for software that was initially developed before modern architecture and testing methodologies existed posed many challenges. The limited-funding and risk-averse environment of a flagship planetary mission heightened these challenges. This talk will discuss the strategies taken and lessons learned from nearly two decades of flight.

Bio: Andrea Connell has held many roles in her ten-year technical career, including Software Developer, Database Administrator, Certified ScrumMaster, and Test Engineer. Andrea earned her Bachelor’s Degree in Computer Science from the University of Wisconsin – La Crosse and Master’s Degree in Computer Science from the University of Hawaii at Mānoa. She previously worked for, and is now a Software Engineer at, NASA’s Jet Propulsion Laboratory.

Wednesday, March 28, 2018: Buckets of Testing – working in a multi-project environment

Speaker: Phil Kirkham

RSVP Today!

Topic: As the sole exploratory tester in a company with multiple projects going on, I had to work out how to work efficiently and to make teams aware of how I could help them. The current Agile literature didn’t seem to fit my particular needs, so after some trial and error we came up with a plan where teams assess the size of the testing ‘bucket’ they think they will need. My talk will explain the challenges we had and the plan we came up with to make the most of my expertise and use it across the company. For a sole tester working in a company, this talk should offer some ideas on how to survive and flourish. For other testers, it might give an idea of how they could be working with developers in a high-quality environment.

Bio: After working as a developer I moved to being a tester and then moved from England to Michigan. Currently working as the sole exploratory tester at Atomic Object, working on a range of projects from mobile to web to embedded.

Wednesday, February 28, 2018 – Senses Working Overtime: Improving Software Quality Through Accessibility and Inclusive Design

Speaker: Michael Larsen

RSVP Today!

Topic: Accessibility makes it possible for those with various disabilities to access information and services. Inclusive Design focuses on making choices so that software and services are usable by as many people as possible. They are distinct but complementary facets of software development and delivery, and they are difficult to add to software after the fact. Making software Accessible using Inclusive Design principles at the start, or as early as possible, makes it easier to develop software that can be used by more people, and allows the development team to deliver better quality, better user experience, and happier users all the way around. In this talk, I will demonstrate principles and processes that you can use to help make Accessibility and Inclusive Design a natural part of your development and testing activities.
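One way to make accessibility a natural part of testing, as the talk advocates, is to automate small checks early. The sketch below is a minimal, illustrative example (not from the talk): it uses Python’s standard-library `html.parser` to flag images that lack the `alt` text screen readers depend on.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute --
    one small, automatable accessibility check for a test pipeline."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing_alt.append(attrs.get("src", "<unknown>"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
```

A check like this catches one narrow class of problems; it complements, rather than replaces, testing with real assistive technology and real users.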

Wednesday, January 31, 2018: Shift-Left & Shift-Right Performance Testing for Superior End-User Satisfaction

Speaker: Arun Kumar Dutta

RSVP Today!

Topic: These days, end-user satisfaction is the most important factor for both traditional and digital business. End users can be satisfied by delivering enhanced features quickly and with improved performance; this will not only satisfy them but also convince them to remain loyal and to influence others. Though an application’s speed, scalability, stability, and availability are not the only parameters for superior end-user satisfaction, they are the most important factors.

In a shift-left performance testing approach, testing is moved to the left in the software development life cycle. Instead of doing performance testing at the pre-production stage just before release, performance testing starts in the early stages of the SDLC. This helps the project team avoid big losses and reduces overall cost.
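In its simplest form, shifting left can mean putting a performance budget into an ordinary unit-test run. The sketch below is an illustration of that idea, not from the talk; `time_call`, `build_index`, and the 500 ms budget are all hypothetical.

```python
import time

def time_call(fn, *args, repeats=5):
    """Return the best-of-N wall-clock time for fn(*args), in milliseconds.
    Taking the best of several runs damps scheduler noise on a shared CI box."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, (time.perf_counter() - start) * 1000)
    return best

def build_index(items):
    # Stand-in for the code under test.
    return {item: i for i, item in enumerate(items)}

# A performance budget asserted on every commit, not just before release.
elapsed_ms = time_call(build_index, list(range(10_000)))
assert elapsed_ms < 500, f"build_index regressed: {elapsed_ms:.1f} ms"
```

Budgets this coarse will not replace a proper load test, but they catch gross regressions months earlier than a pre-production performance pass would.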

In a shift-right performance testing approach, testing is moved to the right: it is done in production, evaluating the end-to-end real-user experience in terms of performance (speed, scalability, stability, availability, and fail-over) through controlled experiments and continuous monitoring. This is late testing, but it is very powerful for end-user satisfaction.

Proactive shift-left and shift-right performance testing together ensure superior end-user satisfaction by delivering enhanced features faster and with excellent performance. This talk will cover the value that shift-left and shift-right performance testing can bring, why both are required for superior end-user satisfaction, and what to keep in mind when making them an ongoing process for enduring in the market.


TQ2017 and KWSQA Updates

The KWSQA ran Targeting Quality 2017 on September 25th and 26th at the Crowne Plaza Hotel and Conference Centre in Downtown Kitchener.

Thank you to all our Speakers, Sponsors, and Attendees for making TQ2017 so great! The KWSQA Board is already planning Targeting Quality 2018; stay tuned for more details.

Our regular monthly lecture series (KWality Talks) returns October 25th, 2017.

Don’t forget about our ongoing KWality Talks CFP if you are interested in submitting a talk for our lecture series.

Want to re-live Angie Jones’s Opening Keynote (Owning Our Narrative) or Andrew Annett’s Closing Keynote (If your team is an object, what’s its API?)? Check them out here.

Wednesday, May 31, 2017: The Drive-Thru Is Not Always Faster: Re-Thinking Your Testing Practice

SPEAKER: Mike Lyles


How many times have you sat in line at the drive-thru window, waiting and waiting? You watch as some people park their car, walk inside, buy their food, and then leave, all while you are still in line.

The drive-thru window was created to speed up the process and make things faster. Many times, however, it is not actually the fastest way.

How many times have we done the same thing with testing? We focus on what we believe is the best and fastest process and we don’t allow ourselves to consider alternatives that might work just as well, or better.

You are not alone. And it may surprise you to learn that many others in the testing community share the same struggles and the same need for re-thinking.

In this session, Mike Lyles will share his findings in talking to various leaders and practitioners in testing, their struggles, their strategies to overcome them, and their creative approaches to providing alternative solutions to make their testing organizations successful.

Key Takeaways:
• Survey responses from testers from all over the world
• A discussion on the common problems every test team will face today
• Suggestions from the community on how to approach and resolve the daily challenges
• Strategies that can be implemented immediately in your team


Mike Lyles is a QA Director with over 24 years of IT experience in multiple organizations, including Fortune 50 companies. He has held various IT leadership roles: software development, program management office, and software testing. He has led various teams within testing organizations: functional testing, test environments, software configuration management, test data management, performance testing, test automation, and service virtualization.

Mike has been successful in career development, team building, coaching, and mentoring of IT & QA professionals. He has managed multiple high impact programs simultaneously, on time, and under budget.

Mike has been an international keynote speaker at multiple conferences and events, and is regularly published in testing publications and magazines. He is the President of his corporate Toastmasters Club. Mike’s passion to help others improve and grow, in the field of testing, leadership, and management, is his key motivation. His first published book on leadership will be released this year.

See for a listing of all conference speaking, webinars, podcasts, and articles.
