What Makes a Good Design Review

A good design review isn't about critique. It's about alignment — getting every person in the room to the same understanding of the work before anyone goes off and builds something different from what was intended.

[Image: Two designers side by side reviewing a printed design at a whiteboard]

Design reviews have a reputation problem. In many teams, they're treated as either performance events — where the designer presents and defends, and the audience critiques — or rubber-stamp ceremonies where work gets approved without real engagement. Neither version produces good outcomes. The first creates adversarial dynamics. The second lets problems through that will surface later, at higher cost.

The framing that works better: a design review is a structured alignment session. Its job is to ensure that the people who will act on the design — whether building, approving, or extending it — all have the same understanding of what the design is doing, why, and what decisions it contains. Critique is a part of that. But it's not the point.

Before the Review: Setting Up for Success

The quality of a design review is largely determined by what happens before it starts. Three things that consistently improve review outcomes:

  1. Share the work in advance. Reviewing cold — seeing a design for the first time in the meeting — produces shallow feedback. Give reviewers 24-48 hours with the material before a live discussion. The meeting becomes analysis instead of first impressions.
  2. State the review objective explicitly. "We're reviewing this for directional alignment on the navigation concept" is different from "we're reviewing this for production readiness." Make sure the room knows which one it is before anyone opens a file.
  3. Specify what feedback you need. "Does the information hierarchy make sense to someone who hasn't seen this before?" generates more useful feedback than "what do you think?" The designer should frame the questions they actually need answered.

The Feedback That's Actually Useful

Not all design feedback is created equal. In our experience working with design teams, the feedback that most often leads to better outcomes is feedback that references the design's goals, not the reviewer's preferences. "The CTA placement doesn't match where users' eyes naturally go after reading the headline" is actionable. "I personally would make the CTA bigger" is an opinion that may or may not be relevant to anything.

The framework we recommend:

| Feedback type | Example | Usefulness |
| --- | --- | --- |
| Goal-referenced | "This layout buries the signup CTA for first-time visitors" | High — directly actionable |
| User-referenced | "A user scanning on mobile might miss this entirely" | High — adds perspective |
| Precedent-referenced | "This conflicts with how we handle empty states on the dashboard" | High — identifies inconsistency |
| Preference-based | "I'd use a different shade of blue here" | Low — unless blue has brand significance |
| Vague impression | "Something feels off about this section" | Low — unless the reviewer can articulate what |

Running the Review Itself

A good design review has a facilitator — not necessarily the most senior person in the room, but someone who can keep discussion on track, surface competing opinions without letting them derail the session, and ensure that every open question has a defined next action before the review ends.

The single change with the largest impact on review quality: end every review with a written list of open decisions and who is responsible for resolving each one. If you leave the room without this, the review didn't actually accomplish alignment — it only created the impression of it.

The goal at the end of any review is for every participant to have the same answer to three questions: What did we decide? What's still open? What happens next? If those three questions can't be answered clearly, the review isn't done yet — even if the time slot is up.

Design reviews don't have to be painful. The reviews that damage team morale are the ones without clear purpose, without pre-shared material, and without a structured feedback format. Fix those three things and the review becomes a tool instead of a trial. That's worth the effort to get right.
