Use Case: Quality Assurance

Our clients typically perform two types of quality assurance: internal QA (since they have a team of more than one person and need to verify the quality of work before it goes to clients), and client QA/approval (client verification before the work goes live).

For example, a cold email campaign is created by a copywriter and a lead list builder, then sent on behalf of the client, so it needs both internal and external QA. The same process applies to copywriting for ad/email campaigns, landing pages, video editing, and so on.

This use case is focused on Internal Quality Assurance, but we also have a use case on how we automate client approval.

Challenges with Internal Quality Assurance (QA)

We first encountered this problem with a content agency that was producing up to 20 projects per day: management was spending all of its time reviewing project quality. This was despite using:

  • Editing tools like Grammarly (some writers forgot to use it, and Grammarly could not detect some parts of their style guide)
  • Templates (some writers used the wrong template)
  • SOPs and training (the team was growing so fast that most members were relatively new)

The editor on the team was spending most of their day recording feedback Looms for each project, so they were unable to scale beyond 5-10 projects per day.

We discovered these problems by accident: we had been working with this client for years, and had the impression (after about a year of working together) that “all” of the business was automated.

How We Automate Internal QA

First, we talked to the founder of the business and pointed out that a great deal of the errors being found were similar.

This meant that rather than recording a Loom for every project, they could instead create one Loom video per type of issue found.

We then created a table of common issues, where the management team could add issues, each with a link to a training video where appropriate, a description of how to fix the issue, and a “severity” score.

When a project was submitted, the editor could “tag” the project with the quality errors detected.

Then, once the QA was complete, an automation generated an error report for the writer (as a draft email that could be modified).
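To make this concrete, here is a minimal sketch of what that automation could look like in Python against Airtable’s standard REST API. The base ID and every table/field name used here (“Projects”, “Issues”, “Detected Issues”, “Severity”, “Fix”, “Loom”, “Writer Name”) are hypothetical placeholders, not our client’s actual schema:

```python
import os
import requests

API_ROOT = "https://api.airtable.com/v0"
BASE_ID = "appXXXXXXXXXXXXXX"  # hypothetical Airtable base ID
HEADERS = {"Authorization": f"Bearer {os.environ['AIRTABLE_TOKEN']}"}


def fetch_fields(table: str, record_id: str) -> dict:
    """Fetch a single Airtable record and return its fields."""
    resp = requests.get(f"{API_ROOT}/{BASE_ID}/{table}/{record_id}",
                        headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()["fields"]


def draft_feedback_email(project_record_id: str) -> str:
    """Assemble a draft feedback email from the issues tagged on a project."""
    project = fetch_fields("Projects", project_record_id)
    # "Detected Issues" is assumed to be a linked-record field holding the
    # IDs of the issue rows the editor tagged during review.
    issues = [fetch_fields("Issues", rid)
              for rid in project.get("Detected Issues", [])]
    issues.sort(key=lambda i: i.get("Severity", 0), reverse=True)

    lines = [f"Hi {project.get('Writer Name', 'there')},", "",
             "Here is the QA feedback for your latest project:", ""]
    for issue in issues:
        lines.append(f"- {issue['Name']} (severity {issue.get('Severity', '?')})")
        if issue.get("Fix"):
            lines.append(f"  How to fix: {issue['Fix']}")
        if issue.get("Loom"):
            lines.append(f"  Training video: {issue['Loom']}")
    return "\n".join(lines)
```

The resulting draft can be handed off to whatever email tool the client uses; the important part is that the editor only tags issues, and the feedback prose is assembled automatically.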

Some clients prefer something similar as a Slack message, depending on the size of their team and whether they use freelancers or employees.

This enabled a few things:

  • Faster quality checks (an editor could “tag” projects with the issues detected in a few clicks, rather than recording an entire Loom video)
  • An editor could still provide custom feedback for each project (and gradually standardize more and more errors), but the vast majority of the feedback was automatic

And, since each project was linked to a writer, we could compute a rolling “quality score” for each writer based on the number and severity of the issues found in their projects. This was later used to automatically distribute projects to the best writers.
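As a rough illustration, a rolling score can be as simple as an exponentially weighted average of severity per project, so recent work counts more than older work. The decay factor and the 0-100 scaling below are our own assumptions for the sketch, not a fixed formula:

```python
def rolling_quality_score(projects: list[list[int]], decay: float = 0.8) -> float:
    """projects: one list of issue severities per project, oldest first.
    Returns a 0-100 score where 100 means no issues were found."""
    weighted_severity, weight_sum = 0.0, 0.0
    weight = 1.0
    for severities in reversed(projects):  # walk from most recent project back
        weighted_severity += weight * sum(severities)
        weight_sum += weight
        weight *= decay  # older projects count for less
    avg = weighted_severity / weight_sum if weight_sum else 0.0
    return max(0.0, 100.0 - 10.0 * avg)  # arbitrary 0-100 scaling


# Example: two older projects with issues, then one recent clean project.
print(rolling_quality_score([[3, 2], [1], []]))  # ~83.6
```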

This system also allows our clients to identify the most common errors over time across the entire team, so they can focus training where it’s needed.
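In code terms, that is just a frequency count over the tagged errors. A toy version, with made-up data standing in for the Airtable records, might look like:

```python
from collections import Counter

# Each inner list holds the errors tagged on one project; in practice this
# would be pulled from the Airtable base. The literal data here is invented.
tagged_errors = [
    ["Wrong template", "Missed CTA"],
    ["Missed CTA"],
    ["Tone off-brand", "Missed CTA"],
]

frequency = Counter(err for project in tagged_errors for err in project)
for error, count in frequency.most_common():
    print(f"{error}: {count}")  # most common errors first
```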

Over time, we also added automated detection of some errors, which further increased quality and saved time.

We have since found that a similar strategy can be used for any creative agency:

  1. Create an Airtable infrastructure that tracks projects (like videos, landing pages, email copy, and so on), the team member(s) assigned to each stage of the project, and a list of quality errors (error name, Loom link, description, etc.)
  2. The editor/QA “tags” projects with the errors they find
  3. Over time, we can see the team members with the best quality, and the most common quality errors
  4. For text-based content (copywriting via email, landing pages, etc.), we can define per-client rules or general style guides to auto-detect quality mistakes that generic tools like Grammarly can’t catch (see the sketch after this list)
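As promised above, here is a minimal sketch of that kind of rule-based detection. The specific rules are invented examples of style-guide violations a client might care about, each paired with the error name it maps to in the issues table:

```python
import re

# Hypothetical per-client style rules that generic grammar tools won't catch.
STYLE_RULES = [
    (re.compile(r"\bclick here\b", re.IGNORECASE), "Vague CTA ('click here')"),
    (re.compile(r"!{2,}"), "Multiple exclamation marks"),
    (re.compile(r"\bASAP\b"), "Informal abbreviation (client style guide)"),
]


def detect_style_issues(text: str) -> list[str]:
    """Return the names of any style rules the copy violates."""
    return [name for pattern, name in STYLE_RULES if pattern.search(text)]


copy = "Click here to get started ASAP!!"
print(detect_style_issues(copy))  # prints all three rule names for this sample
```

Detected issues can then be written back as tags on the project, so automated checks feed the same reporting and quality-score pipeline as the editor’s manual tags.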


Want Something Similar In Your Business?

Book a free operations consultation below