
Reviewing Notes

After each test, the AI produces a set of draft notes based on what it observed while you navigated the site. Your job is to read through those notes, make any changes, and confirm that they are ready to go.

The AI's draft is a starting point, but you are the expert. You decide what goes in the report.

Why your review matters

The AI watches what happens on screen and flags things that look like accessibility barriers. It is good at spotting patterns. But it cannot feel the friction of a confusing interaction or know whether something that looked fine actually caused you difficulty.

Your lived experience as a tester is the part the AI cannot replicate. When you review the notes, you are not just proofreading. You are adding the human layer that makes the findings meaningful.

Nothing gets sent to a report until you say it is ready.

Understanding the severity labels

Each note comes with a label that describes how much the issue affects real users. Here is what each one means:

| Label | What it means |
| --- | --- |
| Big deal | Critical - blocks people from using the site entirely |
| Worth fixing | Major - makes the site significantly harder to use |
| Small thing | Minor - annoying, but users can work around it |
| Just a thought | Info - an observation that may or may not be a problem |

Editing notes inline

Every note is editable. Click on any note to open it and make changes.

You might want to:

  • Reword the description so it matches what you actually experienced
  • Change the severity label if it feels too high or too low
  • Add detail about where exactly on the page the issue appeared
  • Remove a note if you checked and it turned out not to be an issue

Save your changes and move on to the next note. There is no time pressure. Take as long as you need.

Adding your own observations

The AI will not catch everything. If you noticed something during the test that does not appear in the draft notes, you can add it yourself.

Look for the “Add a note” button at the bottom of the notes list. You will be asked to:

  1. Describe what you observed
  2. Choose a severity label
  3. Note where on the site it happened (optional but helpful)

Your added notes appear alongside the AI-generated ones. There is no difference in how they are treated in the final report.

Confirming your notes

When you are satisfied with the notes, click the Looks good! button at the bottom of the review screen.

Here is what happens next:

  1. Agent 3 takes your confirmed notes and generates the accessibility report for this test
  2. Agent 4 combines findings across all testers who worked on the same study

This means the report cannot be generated until you confirm. Your confirmation is the signal that the findings are accurate and ready to use.

You always have the final say

The AI is a starting point, not the final word. It helps you capture findings quickly, but every note in the report reflects a decision you made.

If something looks wrong, change it. If the AI missed something, add it. If a note does not match your experience, remove it.

The report is yours. The confirmation is yours. That is by design.