Redesigning StudyClues: A Conversational AI Assistant for Course-Grounded Learning
As Lead Product Designer at LearningClues, I led the redesign of an AI-powered study companion alongside an evolving roadmap of new AI features.
The Problem
StudyClues' AI technology is growing rapidly to support a more powerful, adaptive study experience, but the product saw low usage: students found it hard to navigate and to discover what the system could do.
Missing a conversation history panel to surface past conversations
No home for AI modes to surface new AI capabilities in chat

Popular questions hidden behind an overlay, failing to capture students' attention and help them get started
The Goal
Redesign StudyClues into a welcoming and usable interface that reflects the LearningClues brand, surfaces new AI features clearly, and helps students navigate the system with ease.

The Impact
The redesign now supports 10,000+ students across 100+ courses at R1 institutions including UC Berkeley and UW–Madison. It was demoed at EDTECH Week 2025 (NYC) and received strong positive feedback on the clarity of the interface.
Pictured alongside Perry Samson, CEO & Co-Founder @ LearningClues, presenting my work at EDTECH Week 2025 in NYC.
What is StudyClues?
StudyClues is the student-facing AI chatbot at LearningClues, an AI-powered learning platform for higher education. Unlike generic AI tools like ChatGPT, StudyClues only responds using a course's own materials: lecture videos, course documents, and announcements from the LMS. Each response includes in-text citations to the exact lecture moment or PDF page that generated it.
StudyClues Demo
Why this Redesign Mattered
When I stepped into the role of Lead Product Designer at LearningClues in June 2025, StudyClues had limited functionality, and its interface failed to reflect the new AI capabilities the team was building.
3 Issues Converged
The interface was outdated
There was no chat history and popular questions were hidden behind an overlay that PostHog data showed students rarely opened, signaling a possible discoverability issue.
New AI Features had no UX Home
The engineering team was actively building three new AI capabilities that needed clear entry points.
Discoverability Issues
A sister product for generating course-based practice tests was getting low engagement despite a need for it expressed by client institutions and end-users.
Constraints and How I Worked Around Them
Designing for Moving Targets
No Formal User Research Budget
Accessibility was Non-Negotiable: A Throughline
3 Decisions that Shaped the Redesign
Treating AI capabilities as visible modes, not hidden features
What I Did
I surfaced the three new AI modes as persistent action chips directly below the chat input field. Each chip has a "selected" state that signals the active mode. New modes are launched with a "New" badge to draw attention.

Why I Did it that Way
While studying ChatGPT and Claude in incognito mode, I noticed something interesting. For long-time users, AI modes like deep research were tucked behind a dropdown. For new users, the same modes appeared more prominently in the chat input. The pattern was clear:
Products in their growth stage require visibility; mature products optimize for power users.
StudyClues is in its growth stage. Students don't yet know what it can do. Hiding capabilities behind menus would have been the wrong call. Visible chips meant students could see the system's range without being told.
How Each Mode was Treated
Each mode received a different visual treatment based on the cognitive load it asked of the student:
Multi-course Mode
This opens a sub-menu so students can pick which courses to search across. By default, all courses they've taken are selected.


Coach Me Mode uses Socratic questioning to help students think through problems instead of giving immediate answers, a feature requested directly in student interviews. I deliberately kept the UI quiet here because the cognitive work is happening in the conversation.
Quiz Me Mode triggers an in-line practice test (more on this in Decision 3).
Fixing first-touch friction with onboarding and contextual prompts
What I Did
I redesigned the very first thing a student sees when they open StudyClues, in two parts:
A 4-step onboarding flow
This runs the first time a student lands on the platform. Step 1 explicitly addresses anonymity. The remaining steps walk through how to use the system. Students can skip, but a visible progress bar communicates that the tutorial is short.
Why I Did it that Way
The onboarding step was driven by customer success feedback. Some client institutions reported that students were hesitant to ask questions because they thought instructors could see who asked what. The welcome message had always said this, but most students didn't read it. Putting anonymity as headline text in step one of the tutorial flow was a more reliable way to make it land.
Surfacing Popular Questions Upfront
Why I Did it that Way
This decision came from PostHog data. The original design hid the four most common student questions behind an overlay, and almost no one opened it. I first tried making the overlay trigger more prominent with a stronger border and a brighter color. The needle barely moved :/
The fix wasn't a better button. It was removing the overlay entirely. Students wanted to see suggestions immediately, not click to find them. So I redesigned them as inline bubbles that appear at the start of a fresh conversation and disappear after the first message is sent.
What was the Result
The cleaner landing state was called out positively in client institution feedback and PostHog data started showing students actually engaging with popular questions. Customer success reported that students seemed more confident in using StudyClues' capabilities after onboarding launched.
What I Learnt
No one reads the fine print. If something is critical for users to know — like anonymity, in this case — it has to be made unmissable. A welcome message is not enough.
Embedding a sister product to drive cross-product adoption
What I Did
I introduced "Quiz Me" as a mode within StudyClues that ports PracticeClues' core experience directly into the chat. PracticeClues is a sister product of LearningClues that uses AI to generate practice tests from course content.
Practice Test In-Chat Embed
When a student selects the "Quiz Me" mode, the AI follows up to ask what topic they want to be quizzed on and how many questions they want. Once confirmed, it generates an interactive practice test that appears as an in-chat embed.
Why I Did it that Way
PracticeClues had low standalone usage even after two earlier interventions: I listed it in the left sidebar above conversation history (modeled on ChatGPT's model list), then renamed it from "PracticeClues" to "Practice Tests" in the top navbar to reduce platform-name friction. Both helped, but usage stayed below target.
PostHog data showed that students were already asking StudyClues for practice questions in chat. The need was there; the functionality was missing. It made sense to embed it inside the workflow students were already using.

Resizable Side Panel View
The practice test embed can be expanded into a side panel for a more focused, full-screen view.
Why I Did it that Way
When the sidepanel opens, the conversation history and left sidebar collapse, which gives students a focused, less cluttered space to work through the test. This was particularly important for neurodivergent students who'd shared in feedback that visual clutter made it harder to concentrate. Students who wanted to keep the chat visible could stay in the inline embed.
What was the Result
Qualitative feedback from students has been positive. They appreciated being able to take a quiz in the same conversation where they'd been asking questions, rather than context-switching to a separate platform. PracticeClues has seen an increase in usage, but we're still evaluating PostHog data to confirm a direct causal relationship.
What I Learnt
Sometimes the best way to grow a product isn't to fight for its standalone visibility. It's to embed it inside a workflow that's already working.
Balancing User Needs with Business Constraints
Not everything in a redesign makes it to production, and learning when to let go is part of the work.
Students consistently asked for a web search mode that would let them search beyond their course content. I designed the full flow, but it was cut for cost reasons: expanded retrieval would have been unsustainable on our existing OpenAI API spend. With more time, I would have pushed for an A/B test to model whether usage justified the cost. Sometimes business constraints trump user requests, and part of the work is recognizing when.
An overlay to allow students to create web URL collections within StudyClues for targeted web search
Responsiveness and Accessibility
Throughout the redesign, every screen was validated against WCAG 2.2 AA using the Stark plugin in Figma, covering color contrast, focus states, touch targets, and keyboard navigation, with WAVE and Axe used on deployed builds.
The interface was designed responsively across three breakpoints (1440px, 1024px, and 768px), with all components built using auto layout so they adapt cleanly across screen sizes without breaking.

Outcomes
Increased Student Reach
10,000+ students across 100+ courses at R1 institutions including UC Berkeley and UW–Madison.
Increased Student Satisfaction
Students appreciated being able to attempt quizzes inline and noted that the new interface felt clean and easy to navigate.
Cross-Product Lift
Early signals suggest the Quiz Me embed has increased PracticeClues engagement. Full PostHog analysis is in progress.
Endorsement from Leadership
"She's proactive about identifying problems rather than waiting for direction. One example is the StudyClues redesign, where she took the product from a basic chat interface […] to a full-featured UI with conversation history, a complex and space-efficient layout for advanced features like quizzing, and accessibility standard considerations."
— Achintya Kattemalavadi, Lead Software Engineer, LearningClues
"Memuna is a gifted UX/UI designer whose work has had a direct and lasting impact on our company. She has a rare ability to create interfaces that are not only visually compelling, but also intuitive, elegant, and grounded in user needs."
— Perry Samson, CEO, LearningClues
Reflections
What I Learnt
Designing for AI is designing for trust and agency
Designing for AI is mostly about designing for trust. As models get more capable, the harder problem isn't the model itself. It's helping users build an accurate mental model of what the system can do, and making sure they always feel in control of the interaction.
I also learnt that good AI products should be agentic in a specific sense: the system should offer its own capabilities rather than wait to be asked.
When research isn't possible, competitors are
When user research isn't possible, competitor products become a research substitute. ChatGPT, Claude, and Gemini represent millions of hours of design decisions that students are already accustomed to.
Proactively asking for feedback also matters. Working solo doesn't have to mean alone.
What I would do Differently
More Evaluative Research
We often moved on to building the next feature without first evaluating whether the previous one was actually working. With more time, I would have pushed harder for usability testing rounds between launches.
Push earlier for data access
Several decisions in this redesign were initially based on intuition and competitor patterns, then later confirmed by PostHog data. Having that data earlier would have let me move faster on the right things and avoid investing in the wrong ones.
Thanks for sticking till the end! :)