Case Study
Reducing project review time by 75%.
Duration
6 Weeks

Context
UX Anudeep runs a cohort-based UX Design program where every student submits a Design Thinking project for review.
After a hackathon, submissions spiked from 15 to 40+ per month.
Anudeep personally reviewed each project for 30–40 minutes, which at 40+ submissions meant 26+ hours of review work every month, an approach that could not scale.
The friction:
Repetitive 1:1 feedback loops
Delayed reviews
Cognitive overload on the mentor
Challenge:
How might we retain feedback quality while reducing total review time?
Approach 1 – Assisted Reviews
I started by reviewing student case studies directly on Medium, leaving detailed comments.
Anudeep would then use these notes during the live feedback calls — the intent was to reduce his cognitive load.
Why it failed:
Each case still took 25–30 minutes on the call.
Reviewing and commenting took me 2 hours per project.
Finding capable reviewers became another bottleneck.
We realized we weren’t scaling — we were just splitting the same workload.
Approach 2 – Feedback Automation Tool
After reviewing 10–15 projects, I observed recurring patterns of mistakes.
So, I designed a templated feedback system — a tool where reviewers could select pre-written comments for each common error.
Why it made sense:
The reviewer could focus on identifying mistakes, not writing explanations.
In theory — faster reviews, consistent feedback.
Why it failed:
Reviewers still needed time to read and interpret each project.
The work felt repetitive and monotonous.
Students valued Anudeep’s 1:1 explanations, not just the text feedback.
Automation solved for efficiency, but not the human need for clarity.


Screenshots from the automation tool
Approach 3 – Common Mistakes Workbook (Breakthrough)
After stepping back, I reframed the problem:
“We’re not just reviewing — we’re teaching. Can we scale learning instead of reviews?”
Using insights from 30+ projects, I categorized ~20 recurring mistakes and created a Common Mistakes Workbook — each mistake explained with examples and the right design approach.
We then hosted two 3-hour live calls, where Anudeep walked through each mistake in depth, using examples from student projects.
Why it worked:
✅ Students learned collectively and internalized feedback patterns.
✅ Review time dropped from 26 hours to 6 hours per month (a 75% reduction).
✅ The process scaled effortlessly to 100+ students.
✅ Feedback depth improved instead of dropping.

Cover picture of the workbook

Presentation slides used by Anudeep
My Contribution
Empathized deeply with both mentor and student pain points.
Redefined the problem — from “speeding up reviews” to “scaling feedback learning.”
Created clarity through a structured workbook and session design.
Removed friction in repetitive reviews.
Enabled better decisions for the mentor — where to go deep and where to standardize.
Outcome & Learnings
75% reduction in total review time.
Higher student satisfaction through shared learning.
A repeatable, scalable model — shifting from 1:1 to 1:many feedback.
Key learning:
Not all scale problems need automation — sometimes, structure and empathy do the job better.
Contact Me
+91 8444 86 85 95
rahul.ag399@gmail.com
