AI Lesson Planner MVP

Industry

AI, Lesson Planning

Role

Senior UX Designer

Team

PM, Delivery Manager, Engineering, SME

Year

2025 - Present

Challenge

Research validated what teachers need from AI, but the new challenge was bridging abstract insights into a scoped, feasible design. Teachers need curriculum alignment, a collaborative AI, and seamless customisation. But what does that actually look like as a product experience, and what fits within MVP constraints?

Solution Discovery

I led the discovery-to-definition process that moved us from research insights to a scoped product plan. I ran workshops that turned abstract teacher needs into testable concepts, then built a prioritisation system using user value, frequency, feasibility, and pedagogical impact to make scope decisions.

Working with content SMEs, we defined what "quality" means in practice: the qualities that make teachers trust AI outputs. Through wireframing and moderated testing, I validated that our interaction patterns aligned with how teachers actually think about lesson planning.

Working closely with engineering, I'm ensuring our AI approach is scalable and that interface interactions are technically feasible, translating our "quality" expectations into buildable requirements and making decisions about what's realistic for the MVP scope.

Progress & Impact

We established direction with testable assumptions and used these to align on MVP objectives, each paired with measurable success metrics tracking usage and customer value.

I established design hypotheses that we will validate through iterative testing, refinement, and real usage upon release.

The MVP beta release is our validation milestone, where real user behaviour will confirm or challenge each hypothesis, generating evidence to shape future development based on actual usage patterns, not assumptions.


From Insights to Opportunities

I ran a strategic workshop to establish goals, objectives and success criteria. We then generated research-based assumptions around key thematic areas to fuel ideation for potential features we could build. We plotted these ideas on an impact vs. difficulty matrix to visualise trade-offs, which became the foundation for structured prioritisation against five criteria: user value, interaction frequency, pedagogical impact, feasibility, and strategic alignment.

The result: core MVP features, near-term stretch goals, and a clear now-next-later roadmap based on validated priorities.

Establishing testable design principles

To ensure design decisions remained grounded and consistent during ideation, I established core UX principles that translated research insights into actionable design guidance:


  1. Teachers need to understand how AI generates content and what curriculum it's grounded in.

  2. Teachers want AI as a thinking partner, not a replacement. Interactions should feel like co-creation where teachers maintain control and agency throughout.

  3. Customisation is essential, so design should enable real-time adaptation within the generation flow, not as a separate task.

Validating via research & testing

As of writing this case study, I'm developing wireframes and user flows and running prototype sessions to validate interaction patterns before engineering investment. In parallel, I'm conducting contextual interviews with teachers to further understand the nuanced details of lesson planning workflows.

Testing is currently exploring whether the proposed user flows align with teachers' planning mental models, e.g. does this hierarchy match how teachers think?

Through contextual inquiries, I'm aiming to learn what teachers mean when they request capabilities like "differentiate this lesson" or "make it more engaging", translating abstract requests into concrete design affordances.

This dual approach ensures iterations are grounded in observed behaviour, and each design decision is validated against real teaching practice before we commit to building.

Next steps

Finalise wireframes from current testing, work with UI to develop delivery files, collaborate with engineering on technical specs, and prepare for alpha testing.

Our goal with the beta launch is to validate our hypotheses through real data. We won't build the perfect lesson planning tool, but we will deliver something that provides immediate value to users while generating the learning needed to evolve the product.
