9 QA roles that you did not know about
Everybody knows that QA engineers check software and look for bugs. Many people think even a regular user could do this, so they try to get by without QA. In fact, the role goes much deeper than is commonly thought. Let's talk about the QA roles on one of the projects where I led the QA engineering team.
We set out to build a solution for distributing and tracking deliveries from online stores. It consisted of a data collection terminal for warehouse employees and a web interface for managers. The solution automates everything from receiving and packing an order to tracking the parcel by its order number.
Initially the product was built by an in-house developer. He implemented functionality based on the customer's words, without a specification, and tested it himself. By the time we took over, the product was losing users because of errors and unstable operation.
The quality system was built from scratch, so this is a perfect example to show all our roles in action.
Role 1: The Quality master
Our goal is to make a quality product, so the QA specialist takes part in every stage of the product life cycle:
- We check the functionality for compliance with technical and business requirements, so that the product solves certain business problems;
- We identify architectural inconsistencies: what can be realized, and what is unrealizable. We immediately discuss with developers how we will implement ideas from the spec;
- We create test documentation, which the new member of the team will understand;
- We inform all interested parties about the state of the product and the time of the release;
- We organize demonstrations of the product.
When we started, we realized that the project contained nothing but fragments of code.
I started by analyzing and updating the outdated documentation that no longer matched the current requirements. When our team grew, that documentation helped new employees get up to speed on the project faster.
After that, I moved on to the test documentation and covered all the functionality with test cases. The test cases saved testing time: checks that used to take us a week now took a couple of days.
Role 2: Analyst
The analyst turns project ideas into requirements, and we help with this:
- If there is no analyst, we prepare requirements by collaborating with the customer;
- If there is an analyst, we check the specification: we find inconsistencies, suggest improvements and monitor the updates of the requirements.
We didn't have an analyst at first.
I studied the product features myself. To understand them better, I went to the warehouse, talked to employees and customers and even scanned parcels myself. As a result, I updated the specification, adding the wishes of all users and the new requirements.
Role 3: Facilitator
Ideally, the whole team works in one place, but that is not always the case. When participants are scattered, you need to organize communication and the team's workflow.
What QA does:
- Organizes interaction among all team members, no matter where they are, whether working from home or on another continent;
- Builds unified work processes, so everyone on the team is aware of the current tasks and knows whom to contact with questions;
- Sets up the testing process: drawing up test documentation, determining the scope of testing, organizing demos;
- Keeps the release of a quality product on schedule.
In our case, the project had no development processes at all. I needed to organize the interaction between the developers, the customer and myself. Together we agreed on:
1. A workflow for handling tasks;
2. The schedule according to which we release functionality;
3. Areas of responsibility.
As a result, tasks were no longer lost, and work on them became transparent to all participants: everyone knew whom to approach with which questions. This saved us time and simplified onboarding new team members.
Role 4: Technology specialist
We analyze the needs of the product and choose tools for functional testing, test automation and documentation. The expert must know a wide range of tools well enough to find the right one.
At the start, we realized that all the tools had to be chosen and configured from scratch. When the developer worked alone, he decided what to work with. When I joined, we began choosing tools for testing and maintaining test documentation together, and in the end settled on ones convenient for both of us.
Role 5: Performance enhancer
We collect metrics for the project, analyze them and develop a plan for further improvements to our products.
There are many metrics, and not all of them are needed on a project at the same time. Usually we decide what to measure after the documentation has been compiled.
Let's return to the beginning of the work. We did not collect metrics for the first couple of months while we were adapting and tuning the work processes.
When the team grew, we began to run smoke or regression testing before each release. Each test case is marked as passed or failed, according to the test case metric. If more than half of the cases fail and there are critical errors, we postpone the release until corrections are made. Then we look for the reason the tests failed and trace at what stage or in which functionality the failures appear. The easiest way is to talk to a developer or another QA. Sometimes that alone is enough to make fewer mistakes.
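The release gate described above can be sketched in a few lines of code. This is an illustrative sketch, not our actual tooling: the `release_decision` function, the 50% threshold and the way the two conditions are combined are assumptions based on the policy described in this section.

```python
def release_decision(results, has_critical, max_fail_ratio=0.5):
    """Decide whether a build may be released.

    results        -- list of booleans, True for each passed test case
    has_critical   -- True if at least one critical error was found
    max_fail_ratio -- postpone threshold (0.5 mirrors "more than half")
    """
    failed = sum(1 for passed in results if not passed)
    fail_ratio = failed / len(results) if results else 0.0
    # Mirrors the policy above: too many failures combined with
    # critical errors blocks the release (many teams would use "or").
    if fail_ratio > max_fail_ratio and has_critical:
        return "postpone"
    return "release"

# Example: 6 of 10 cases failed and a critical bug was found.
print(release_decision([False] * 6 + [True] * 4, has_critical=True))  # → postpone
```

Keeping the threshold as a parameter lets each project tune how strict its release gate is.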
Role 6: Support engineer
We collect feedback from users and the customer: we analyze what is inconvenient, where difficulties occur and what is missing. Then we discuss these proposals and improvements with the customer. After the release, we make sure users no longer face those problems.
Back during my trip to the warehouse, I had found the weak spots users complained about. For example, the system slowed down badly under load, which had never been checked. Users waited several minutes to see an order status or to add a delivery address. As a result, they got frustrated and never completed their orders. The company was losing clients.
We conducted load testing, identified the weak points where the system stopped working, and then drew up a plan for the developers. To see how we load tested the system, read the article “Things to remember when launching a product”.
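A minimal load test can be sketched as follows. Everything here is hypothetical: `fetch_order_status` is a stub standing in for the real status endpoint (in practice it would be an HTTP call against the web part), and the request counts are illustrative.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_order_status(order_number):
    """Stub for the real order-status endpoint (hypothetical)."""
    time.sleep(0.01)  # simulate network and server latency
    return "in transit"

def load_test(n_requests=50, concurrency=10):
    """Fire n_requests concurrently; report average and worst latency."""
    latencies = []
    def one_call(i):
        start = time.perf_counter()
        fetch_order_status("A-%d" % i)
        latencies.append(time.perf_counter() - start)
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(one_call, range(n_requests)))
    return sum(latencies) / len(latencies), max(latencies)

avg, worst = load_test()
print("avg %.3fs, worst %.3fs" % (avg, worst))
```

Comparing the worst latency against a target (for example, "no request over two seconds") is what turns the raw numbers into the kind of plan for developers mentioned above.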
A couple of weeks later we talked to the operators and got a positive response: the failure rate had dropped.
Role 7: Usability expert
We analyze the usability of the product. Sometimes we check it ourselves using guidelines, and sometimes we involve focus groups. That is necessary when the audience is atypical. For example, for an application for hunters or truckers: QA engineers cannot imagine how a hunter behaves in the field or which area of the screen a trucker presses while driving. Based on the collected data, we refine the product.
After the product started working stably, we tackled something less obvious but no less important: convenience. Thanks to that, we identified a problem with placing an order. When filling in the delivery date and time via the calendar, a customer could click outside the form. The form would then close, erasing all the entered order data, and everything had to be filled in again.
I suggested closing the order window only on the close, cancel or confirm buttons. The form stopped closing prematurely, and users stopped getting frustrated.
Role 8: Auditor
To perform an audit, we go to the customer's office: on the spot we analyze the state of the product, identify weak and vulnerable points, and find the gaps in the work processes that prevent full-fledged work. The result is a report with the problems and their solutions, plus a strategy for corrections and improvements.
In our example, a formal audit was not needed: we had visited the site and studied all the nuances even before the analyst work began. We will share how we conduct an external audit in one of our next articles.
Role 9: QA Automator
We analyze which actions can be automated to save time on routine work. Then we determine which functionality can be covered with autotests, draw up a plan and automate it.
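To make the idea concrete, here is a toy autotest in the style of the pre-release checks. The `get_order_status` lookup, the `ORDERS` table and the order numbers are made up for illustration; in the real project such tests would drive the actual terminal and web parts.

```python
# Hypothetical in-memory stand-in for the order-tracking service.
ORDERS = {"A-1001": "packed", "A-1002": "in transit"}

def get_order_status(order_number):
    """Return the parcel status for an order number, or None if unknown."""
    return ORDERS.get(order_number)

# Autotests replacing the routine manual checks before each release.
def test_known_order_has_status():
    assert get_order_status("A-1001") == "packed"

def test_unknown_order_returns_none():
    assert get_order_status("NO-SUCH") is None

if __name__ == "__main__":
    test_known_order_has_status()
    test_unknown_order_returns_none()
    print("all autotests passed")
```

Functions named `test_*` like these can also be collected and run automatically by a test runner such as pytest, which is what makes them cheap to run before every release.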
In the first six months we did not need automation. If you are trying to figure out whether you need it, I recommend the article “How to understand that it's time to automate testing”.
Over the year the team grew from one person to three. We updated the requirements, covered them with test cases and built the work processes, so only verified versions went to release. We noticed that user complaints were cut in half, a clear sign that users are happy.
All this was possible thanks to the well-coordinated work of the team we built. Our specialists can combine roles depending on the needs and goals of the business, so that your products always meet your needs.