
How to Build a Steering Committee Report

A practical guide to writing steering committee reports that give decision-makers accurate information in a format they can act on. Covers structure, RAG ratings, risk registers and escalations.

A steering committee report has one purpose: to give the committee the information it needs to govern the program.

Most steering committee reports do not do this. They are too long, structured in the wrong order, use RAG ratings that have no consistent definition, bury the important information in narrative text, and end with a vague “the program remains on track” conclusion that tells the committee nothing useful.

The consequence is that steering committees cannot provide effective governance. They cannot identify problems early because the reporting does not surface them. They cannot make timely decisions because the information they need is not in the report. And they cannot maintain accountability because they cannot tell whether the program is actually delivering what it committed to.

What a committee needs to know

Before writing a steering committee report, be clear about what the committee is for. A steering committee is not a progress review meeting - it is a governance body. Its job is to make decisions, provide accountability, and remove obstacles that the delivery team cannot resolve.

To do that job, a committee member needs to know:

  • Is the program on track to deliver the agreed scope, on time and within budget?
  • What are the most significant risks and issues, and what is being done about them?
  • Are there any decisions the committee needs to make?
  • Are there any matters the committee needs to be aware of that may require future decisions?

Everything else in the report is supporting detail. The structure should lead with these questions, not bury them on page four.

The right structure for a steering committee report

A well-structured steering committee report follows this order:

Overall status. A single RAG rating and one paragraph summary. What is the overall health of the program? What is driving the current status? This should be readable in 30 seconds.

Decisions required. An explicit list of decisions the committee is being asked to make at this meeting. If there are no decisions required, say so. Committees that do not know what they are being asked to decide spend time on the wrong things.

Progress summary. What was planned for this period, what was delivered, and what was not. Expressed in milestones and deliverables, not in activity narrative.

Financial position. Actual expenditure to date versus approved budget. Forecast to completion versus approved budget. Any variances and the explanation for them.

Risks and issues. The top five to eight risks and issues, each with a RAG status, an owner, and a treatment or mitigation. Not a list of thirty items with equal weight - a prioritised view of what matters.

Upcoming milestones. What is scheduled for the next reporting period, with any known risks to those milestones flagged.

Escalations. Any matters that require committee awareness or action beyond the standard decisions listed at the top.

This structure allows a committee member to read the first page of the report and have an accurate picture of the program state. The following pages provide the detail for items that require it.

RAG ratings that mean something

RAG ratings are only useful if they have consistent definitions. Without defined criteria, a Red on one program can describe the same situation as an Amber on another, so the labels cannot be compared across programs or across reporting periods. When definitions are absent or inconsistently applied, the ratings stop conveying information and start conveying politics.

Define RAG status criteria clearly:

Green: The program is tracking to plan. Identified risks are being managed within normal delivery processes. No escalation to the committee is required.

Amber: The program is experiencing issues that may affect delivery if not addressed. Specific risks or issues require committee awareness. The delivery team has a plan to address them and is tracking to that plan.

Red: The program is not tracking to plan and the current trajectory will result in a missed milestone, budget overrun, or scope change unless the committee takes action. Specific decisions or support from the committee are required.

Amber is not a softened version of Red. Amber means there is a problem being managed. Red means the problem is not being managed adequately and the committee needs to act.
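To make the calibration concrete, the three definitions above can be written down as an explicit rule rather than a judgment call. This is an illustrative sketch only; the field names are assumptions, not a standard, and real status inputs will be messier than four booleans.

```python
from dataclasses import dataclass

@dataclass
class ProgramStatus:
    # Hypothetical inputs; adapt to whatever your reporting actually captures.
    tracking_to_plan: bool           # delivering to schedule, budget and scope
    issues_affecting_delivery: bool  # live issues that may affect delivery
    mitigation_on_track: bool        # the delivery team's plan is working
    committee_action_needed: bool    # a decision or support is required

def rag_rating(s: ProgramStatus) -> str:
    """Apply the Green/Amber/Red definitions in order of severity."""
    # Red: the problem is not being managed adequately, or the
    # committee must act.
    if s.committee_action_needed or (
        s.issues_affecting_delivery and not s.mitigation_on_track
    ):
        return "Red"
    # Amber: a problem exists and is being managed to a plan.
    if s.issues_affecting_delivery:
        return "Amber"
    # Green: managed within normal delivery processes.
    if s.tracking_to_plan:
        return "Green"
    # Not clearly to plan and no identified issue: default to Amber,
    # never to Green.
    return "Amber"
```

Note that the Red test comes first: a managed issue that still requires a committee decision is Red, not Amber, which is exactly the calibration failure the text warns about.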

The common calibration failure is rating programs Amber when they are Red. This happens because of the cultural discomfort of reporting Red status and because of pressure from program sponsors to present the program positively. The consequence is that the committee cannot respond appropriately because the actual status has been obscured.

The risk register in a steering committee report

Not every risk on the program risk register belongs in a steering committee report. The operational risk register might have thirty items; the committee should see the five to eight that are at the level of severity and likelihood that require committee awareness or action.

The risks that belong in the committee report are:

  • Risks that are currently Red or Amber and are not responding to current mitigations
  • Risks that require a decision or support from the committee to mitigate
  • Risks that are about to move to a higher severity if a specific trigger event occurs
  • Risks that have financial or scope implications if they materialise

For each risk in the committee report, include: the risk description (specific, not generic), the likelihood and impact rating, the current mitigation, the residual risk level, the owner, and what the committee is being asked to do.

“Vendor delivery risk - Medium - Owner: PM” is not useful. “Vendor X has not met three consecutive milestone commitments. The contractual remedy of a remediation plan has been applied. If the next milestone (15 April) is not met, the contract includes an exit provision that we may need to exercise. Committee is asked to note this position and confirm the organisation’s appetite for contract exit if the milestone is not met” - that is useful.
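The selection rules above amount to a simple filter over the operational register. A minimal sketch, assuming a hypothetical `Risk` record (the fields mirror the four bullets; they are not a standard schema):

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    rag: str                        # "Green" / "Amber" / "Red"
    responding_to_mitigation: bool  # current treatment is working
    needs_committee_decision: bool  # committee decision or support required
    has_escalation_trigger: bool    # a known event would raise severity
    financial_or_scope_impact: bool # material cost or scope effect if realised

def committee_view(register: list[Risk], limit: int = 8) -> list[Risk]:
    """Reduce the full register to the few risks that warrant committee attention."""
    def qualifies(r: Risk) -> bool:
        return (
            (r.rag in ("Red", "Amber") and not r.responding_to_mitigation)
            or r.needs_committee_decision
            or r.has_escalation_trigger
            or r.financial_or_scope_impact
        )
    selected = [r for r in register if qualifies(r)]
    # Red before Amber before Green, so the worst risks lead the report.
    order = {"Red": 0, "Amber": 1, "Green": 2}
    selected.sort(key=lambda r: order.get(r.rag, 3))
    return selected[:limit]
```

The point of the sketch is the shape of the decision, not the code: every risk in the committee report should be traceable to one of the four qualifying conditions, and the list should be capped rather than exhaustive.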

Writing escalations clearly

An escalation is a matter brought to the committee’s attention because it requires their decision or awareness outside the normal program reporting. Escalations should be clearly labelled and clearly structured.

A well-written escalation includes:

The issue. What is happening? Described specifically, without jargon or euphemism.

The impact. What happens if this is not resolved? What does the program miss? What does it cost?

The options. What are the realistic options for resolving or mitigating the issue? At least two options, each with its implications.

The recommendation. What does the delivery team recommend, and why?

The decision required. What specifically is the committee being asked to approve, note, or direct?

An escalation that presents the issue without the decision required is an information item, not an escalation. An escalation that presents the decision required without the context is not enough for the committee to make an informed choice.
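The five-part structure can double as a completeness check before an escalation goes into the pack. A sketch, using hypothetical field names that follow the headings above:

```python
from dataclasses import dataclass

@dataclass
class Escalation:
    issue: str              # what is happening, specifically
    impact: str             # what it costs or misses if unresolved
    options: list[str]      # realistic options, each with implications
    recommendation: str     # what the delivery team recommends, and why
    decision_required: str  # what the committee is asked to approve, note or direct

    def validate(self) -> list[str]:
        """Flag escalations that are really information items or context-free asks."""
        problems = []
        if not self.decision_required.strip():
            problems.append("No decision required: this is an information item.")
        if not (self.issue.strip() and self.impact.strip()):
            problems.append("Decision without context: the committee cannot choose informed.")
        if len(self.options) < 2:
            problems.append("Fewer than two options presented.")
        return problems
```

An empty `validate()` result does not make an escalation good, but a non-empty one reliably catches the two failure modes the text describes: the information item dressed as an escalation, and the decision request with no context.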

What makes committee reporting sustainable

Steering committee reports that are onerous to produce will not be produced accurately. If the reporting process requires the delivery team to spend two days per fortnight compiling reports, the reporting will be rushed and the quality will suffer.

Design the reporting to be producible in a reasonable time from information the delivery team already has. This means:

Keeping templates consistent. A committee that receives a different report format every second meeting cannot build familiarity with the structure or compare status across periods.

Linking the report to the program plan. The progress section should come directly from the program schedule. If the schedule is maintained, the report should take an hour to produce, not a day.

Establishing a clear data collection process. If the PM needs input from multiple workstream leads or vendors to produce the report, that process needs to be established and run to a consistent schedule.

Reviewing the reporting framework regularly. What the committee needs to see will change as the program progresses. A reporting framework that was right at program initiation may not be right during a complex integration phase. Review the format at the end of each major phase.


The Steering Committee Reporting Pack includes a status report template, executive dashboard format, risk register structure, RAG rating criteria guide and escalation pack template. See the templates page for details.
