
The Lean AI Accountability Framework

Guidance for teams using AI in their day-to-day work

Why this exists

This guidance helps teams be clear about when AI was used, what it helped with, and who is responsible for the final work.

It keeps responsibility with people.

It is not here to:

Who should use this

This applies to anyone involved in digital delivery, including:

It applies to work outputs, not individuals.

Ground rules

Someone always owns the work

AI can help, but a person is always responsible for the final output and what it’s used for.

Match the effort to the risk

If the work could affect users, policy, or operations, be clearer about how AI was used.

Don’t focus on the tool

Which AI tool you used is less important than what you used it for and what decisions were made by people.

Treat AI like any other tool

If AI meaningfully shaped the work, say so.

AI accountability note (required)

If AI played a meaningful role in creating something, add a short accountability note close to the work.

Use whatever format fits the space, but follow this structure.

What to include

AI:
  Assisted: true
  Tools: "<tool name>"
  Role: "<what the tool helped with>"
  Human accountable: "<name or role>"
  Confidence Level: "<high | medium | low>"
  Date Created: "<DD/MM/YYYY>"
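For teams that want to check notes automatically, here is a minimal Python sketch of what that could look like. The `validate_note` helper, the sample note values, and the specific checks are illustrative assumptions, not part of this guidance; only the field names come from the structure above.

```python
from datetime import datetime

# Field names taken from the required note structure above.
REQUIRED_FIELDS = ["Assisted", "Tools", "Role", "Human accountable",
                   "Confidence Level", "Date Created"]
CONFIDENCE_LEVELS = {"high", "medium", "low"}

def validate_note(note: dict) -> list:
    """Return a list of problems found in an accountability note."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in note]
    if note.get("Confidence Level") not in CONFIDENCE_LEVELS:
        problems.append("Confidence Level must be high, medium or low")
    try:
        datetime.strptime(note.get("Date Created", ""), "%d/%m/%Y")
    except ValueError:
        problems.append("Date Created must be DD/MM/YYYY")
    return problems

# Hypothetical example note for illustration.
note = {
    "Assisted": True,
    "Tools": "example-tool",
    "Role": "drafted the first outline",
    "Human accountable": "Service designer",
    "Confidence Level": "medium",
    "Date Created": "03/02/2025",
}
print(validate_note(note))  # → []
```

A check like this could run in a pre-publish step, but a human reviewer should still read the note itself.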

Add more detail when it matters

Use these additional fields when the work is higher risk or more visible.

AI:
  Tools: "<tool name>"
  Provider: "<organisation>"
  Risks_considered:
    - "<risk>"
  Limitations:
    - "<known limitation>"
  Review Status: "<reviewed | reviewed and amended | exploratory>"
  Last Updated: "<DD/MM/YYYY>"

The indentation shows which values belong together.
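The extended fields can be checked the same way. This Python sketch is illustrative: the `validate_extended` helper and sample values are assumptions; the field names and the allowed Review Status values come from the structure above.

```python
# Allowed review statuses, taken from the extended note structure above.
REVIEW_STATUSES = {"reviewed", "reviewed and amended", "exploratory"}

def validate_extended(note: dict) -> list:
    """Return a list of problems found in the extended fields of a note."""
    problems = []
    if note.get("Review Status") not in REVIEW_STATUSES:
        problems.append("Review Status must be reviewed, "
                        "reviewed and amended, or exploratory")
    # Risks and limitations should each be a non-empty list of entries.
    for field in ("Risks_considered", "Limitations"):
        if not isinstance(note.get(field), list) or not note.get(field):
            problems.append(f"{field} should be a non-empty list")
    return problems

# Hypothetical example of an extended note for illustration.
extended = {
    "Tools": "example-tool",
    "Provider": "Example Org",
    "Risks_considered": ["may reproduce bias in source material"],
    "Limitations": ["not checked against the latest policy"],
    "Review Status": "reviewed and amended",
    "Last Updated": "03/02/2025",
}
print(validate_extended(extended))  # → []
```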