UX Case Study · Founder Project

CookFromHere

An AI-powered recipe app that turns your fridge contents into personalised meals. Designed, built, and shipped solo — from first sketch to production in 9 languages.

Role
Founder · UX · Dev
Duration
2 Months (Ongoing)
Tools
Figma · React · Supabase
Platform
Responsive Web App & Mobile
CookFromHere App
The Problem

People throw away food because they don't know what to cook.

Every household faces the same friction: you open the fridge, see random ingredients, and can't think of what to make. Existing recipe apps expect you to search by dish name — but most people don't start with a dish in mind. They start with what they have.

Add dietary restrictions, allergies, and cooking time constraints, and the problem multiplies. The result: wasted food, wasted time, and defaulting to takeaway.

30%
Household Food Wasted Globally
~40m
Daily "What Should I Cook" Searches
12+
Common Allergens to Manage
Research & Discovery

What I found by talking to real home cooks.

I interviewed 12 people across different cooking skill levels, dietary needs, and household sizes. I also analysed competitor apps (Supercook, Whisk, Yummly) to map feature gaps.

Insight 01

Ingredient entry is the dropout point

Every competitor requires manual typing. Users gave up after entering 3–4 items. The input method IS the product's bottleneck.

Insight 02

Allergies create anxiety, not confidence

Users with dietary restrictions didn't trust generic recipe filters. They needed explicit confirmation that a recipe was safe for them, every single time.

Insight 03

Cooking guidance matters more than the recipe

Beginner and intermediate cooks didn't want a list of steps. They wanted someone to walk them through it, answer questions, and tell them when something looks right.

User Journey

Mapping the "what should I cook?" moment.

I mapped the existing journey vs. the target experience. The goal: collapse a 15-minute frustration loop into a 60-second flow from fridge to recipe.

Before · Without CookFromHere
The Frustration Loop
High friction
1 Open fridge, stare at ingredients
2 Google "recipes with chicken and peppers"
3 Scroll through 20+ results with ads
4 Recipe needs 6 ingredients you don't have
5 Repeat search 3–4 times
6 Give up → order takeaway
Avg. time to decision
15+ min
😤
After · With CookFromHere
Scan → Cook
Minimal friction
1 Open app, tap Scan
2 Photograph your fridge → AI detects ingredients
3 Set preferences (time, allergies, diet)
4 Get 3 recipes using only YOUR ingredients
5 Start guided cooking with AI chef
Avg. time to cooking
< 60 sec
🍳
Design Process

From sketches to shipped product in 3 phases.

I followed a compressed design sprint approach — validating each phase with real users before moving to the next.

1
Low-Fi Wireframes
Paper sketches and Figma wireframes to map the core flows. Tested with 5 users using clickable prototypes.
Key screens mapped
Scan → Ingredient review flow
Pantry management grid
Recipe generation & selection
Step-by-step cooking mode
2
Hi-Fi Prototype
Visual design in Figma with component library. Focused on the AI interaction patterns and trust signals.
Design decisions validated
Orange accent for AI actions
Allergy badges (visible & explicit)
Preferences modal placement
Cooking progress indicators
3
Production Build
React implementation with real AI integration. Iterated based on live user feedback and analytics.
Shipped features
Gemini vision scanning
AI cooking chat assistant
Stripe subscription tiers
9-language localisation
Usability Testing

3 rounds of testing. Each one reshaped the product.

I tested with real users at each phase. The biggest changes came not from what users said, but from watching where they hesitated.

Round 01 · Wireframes

Navigation & Flow

5 participants · Figma prototype
  • Users didn't understand "Pantry" as a concept
  • Scan button wasn't discoverable enough
  • Expected to generate recipes directly from scan results
Change: Added "Generate Recipes" button to scan results + made Scan a primary nav item
Round 02 · Hi-Fi

AI Trust & Preferences

8 participants · Interactive prototype
  • Users with allergies wanted explicit "safe" labels
  • Cooking time was the #1 filter request
  • AI-generated recipes needed ingredient count confirmation
Change: Added allergy badges, cooking time selector, and "uses only your ingredients" guarantee
Round 03 · Live Product

Cooking Experience

6 participants · Real cooking sessions
  • Users forgot the AI chef button existed during cooking
  • Step transitions felt abrupt without animation
  • No sense of completion after finishing a recipe
Change: Added proactive AI prompts, step animations, and celebration screen with photo review
Key Design Decisions

Three decisions that defined the product.

Decision 01

Camera-first ingredient input

If typing ingredients kills engagement, remove typing. I designed a scan flow where users photograph their fridge or pantry, and Google Gemini's vision API identifies ingredients automatically.

The UX challenge was managing AI uncertainty. I added an editable ingredient list after scanning so users can correct mistakes before generating recipes. This built trust without adding friction.

→ Scan-to-recipe takes under 15 seconds vs. 2+ minutes of manual entry in competitor apps.
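
For illustration, here is a minimal version of that scan step, assuming the official `@google/generative-ai` SDK. The helper name, prompt wording, and parsing are a sketch, not the production code:

```ts
import { GoogleGenerativeAI } from "@google/generative-ai";

// Illustrative shape: scanned items stay editable until the user confirms them.
interface DetectedIngredient {
  name: string;
  confirmed: boolean; // flipped to true on the review screen
}

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
const model = genAI.getGenerativeModel({ model: "gemini-2.0-flash" });

// Hypothetical helper: sends one fridge photo and parses the model's
// answer into an editable list for the review screen.
async function scanFridgePhoto(base64Jpeg: string): Promise<DetectedIngredient[]> {
  const result = await model.generateContent([
    { inlineData: { mimeType: "image/jpeg", data: base64Jpeg } },
    "List every food ingredient visible in this photo as a JSON array of lowercase names. Return only JSON.",
  ]);
  // Model output is untrusted: extract the JSON array defensively, then
  // hand everything to the user for correction before generating recipes.
  const text = result.response.text();
  const json = text.match(/\[[\s\S]*\]/)?.[0] ?? "[]";
  const names: string[] = JSON.parse(json);
  return names.map((name) => ({ name, confirmed: false }));
}
```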

Decision 02

Preferences modal before every generation

Most recipe apps bury dietary preferences in settings. I surface them at the moment of decision: a quick modal before generating recipes lets users set cooking time, allergies, and dietary needs per session.

Allergies are pre-filled from the user's profile but can be adjusted. This respects the "cooking for guests" scenario where today's constraints differ from your default ones.

→ Users reported feeling "safe" using the app for guests with allergies — a key trust signal competitors lacked.
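
A rough sketch of how that per-session override can be modelled (the field names and the 30-minute default are assumptions, not the app's actual schema):

```ts
// Illustrative types only; the real schema lives in Supabase.
interface ProfilePreferences {
  allergies: string[]; // saved once in the user's profile
  diet: "none" | "vegetarian" | "vegan" | "pescatarian";
}

interface SessionPreferences extends ProfilePreferences {
  maxCookingMinutes: number; // asked fresh before every generation
}

// The modal opens pre-filled from the profile, but edits apply to this
// session only, which covers the "cooking for guests" case without
// touching the user's saved defaults.
function initSessionPreferences(profile: ProfilePreferences): SessionPreferences {
  return {
    ...profile,
    allergies: [...profile.allergies], // copy so session edits never mutate the profile
    maxCookingMinutes: 30,             // assumed default
  };
}
```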

Decision 03

AI cooking assistant, not just a recipe card

The guided cooking mode includes a floating AI chef button. It proactively offers help every 2 steps and answers contextual questions ("Is this too brown?", "Can I substitute butter?").

The first message is a free welcome — no API cost. This gives every user a taste of the AI assistant before they decide to engage, reducing the barrier to interaction.

→ AI chef engagement rate: 68% of active cooking sessions. Users described it as "having a patient friend in the kitchen."
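
Both behaviours reduce to a little state logic. A sketch, with names that are mine rather than the codebase's:

```ts
// Nudge the cook every second step until they engage with the chat.
function shouldOfferHelp(stepIndex: number, chatOpened: boolean): boolean {
  return !chatOpened && stepIndex > 0 && stepIndex % 2 === 0;
}

// The chat opens with a canned welcome rather than a model call, so the
// first message costs nothing against the monthly AI budget.
const WELCOME = "Hi, I'm your AI chef. Ask me anything while you cook!";

function openChefChat(): { history: string[] } {
  return { history: [WELCOME] }; // free first message, no API call
}
```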

Technical Execution

Designed it. Then built it.

As the sole designer and developer, I made every architectural decision with UX in mind. The tech stack was chosen for speed, reliability, and cost efficiency within a $5/month AI budget.

React + TypeScript

Component-based UI with full type safety. Tailwind CSS for rapid iteration. Framer Motion for cooking animations.
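
As one concrete example, a sketch of the kind of step transition Framer Motion enables here (the component and props are illustrative, not the app's actual code):

```tsx
import { AnimatePresence, motion } from "framer-motion";

// Keying on the step index re-mounts the element, so each instruction
// slides in: the fix for the "abrupt transitions" finding in Round 03.
function CookingStep({ index, text }: { index: number; text: string }) {
  return (
    <AnimatePresence mode="wait">
      <motion.p
        key={index}
        initial={{ opacity: 0, x: 24 }}
        animate={{ opacity: 1, x: 0 }}
        exit={{ opacity: 0, x: -24 }}
        transition={{ duration: 0.25 }}
      >
        {text}
      </motion.p>
    </AnimatePresence>
  );
}
```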

🧠

Google Gemini 2.0 Flash

Vision API for ingredient scanning. Text API for recipe generation and cooking chat. Rate-limited per subscription tier.
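
The quota check could look like the sketch below; the 2/7/unlimited numbers mirror the subscription tiers described further down, while the weekly period and the function names are assumptions:

```ts
type Tier = "free" | "standard" | "premium";

// Assumed quota table; the enforcement period is my guess, not documented.
const MEALS_PER_WEEK: Record<Tier, number> = {
  free: 2,
  standard: 7,
  premium: Number.POSITIVE_INFINITY,
};

// Checked server-side before any metered Gemini call.
function canGenerateMeal(tier: Tier, usedThisWeek: number): boolean {
  return usedThisWeek < MEALS_PER_WEEK[tier];
}
```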

🔐

Supabase Backend

13 edge functions handling auth, AI calls, subscriptions, and email. Row-level security on all tables.
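
A minimal sketch of what one of those functions can look like; the table name and payload are illustrative:

```ts
// Deno-style Supabase edge function (illustrative, not one of the actual 13).
import { createClient } from "jsr:@supabase/supabase-js@2";

Deno.serve(async (req) => {
  // Forward the caller's JWT so row-level security scopes every query
  // to that user's rows; no service-role key needed for this path.
  const supabase = createClient(
    Deno.env.get("SUPABASE_URL")!,
    Deno.env.get("SUPABASE_ANON_KEY")!,
    { global: { headers: { Authorization: req.headers.get("Authorization")! } } },
  );

  const { data, error } = await supabase.from("pantry_items").select("*");
  if (error) return new Response(error.message, { status: 400 });
  return new Response(JSON.stringify(data), {
    headers: { "Content-Type": "application/json" },
  });
});
```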

💳

Stripe Subscriptions

3-tier model: Free (2 meals), Standard (7 meals), Premium (unlimited). Webhook-driven lifecycle management.
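
A sketch of the webhook side, using Stripe's real event names; the tier mapping via price lookup keys and the `updateUserTier` helper are assumptions:

```ts
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

// Assumed helper that persists the tier on the user record.
declare function updateUserTier(
  customerId: string,
  tier: "free" | "standard" | "premium",
): Promise<void>;

export async function handleStripeWebhook(rawBody: string, signature: string) {
  // Verify the signature before trusting anything in the payload.
  const event = stripe.webhooks.constructEvent(
    rawBody,
    signature,
    process.env.STRIPE_WEBHOOK_SECRET!,
  );

  switch (event.type) {
    case "customer.subscription.updated": {
      const sub = event.data.object as Stripe.Subscription;
      const tier =
        sub.items.data[0].price.lookup_key === "premium" ? "premium" : "standard";
      await updateUserTier(sub.customer as string, tier);
      break;
    }
    case "customer.subscription.deleted": {
      const sub = event.data.object as Stripe.Subscription;
      await updateUserTier(sub.customer as string, "free"); // downgrade on cancel
      break;
    }
  }
}
```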

🌍

9-Language i18n

EN, TR, ES, PT, RU, ZH, IT, FR, DE. Browser-detected with manual override. AI responds in the user's language.
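
The detection logic is small enough to sketch in full (the function and storage key are illustrative):

```ts
const SUPPORTED = ["en", "tr", "es", "pt", "ru", "zh", "it", "fr", "de"] as const;
type Locale = (typeof SUPPORTED)[number];

// Browser language by default; an explicit user choice wins. The resolved
// locale is also injected into every AI prompt so Gemini answers in kind.
function resolveLocale(): Locale {
  const saved = localStorage.getItem("locale");
  if (saved && (SUPPORTED as readonly string[]).includes(saved)) {
    return saved as Locale;
  }
  const browser = navigator.language.slice(0, 2);
  return (SUPPORTED as readonly string[]).includes(browser)
    ? (browser as Locale)
    : "en";
}
```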

📸

Post-Cooking Sharing

Photo upload, star rating, and shareable dish pages with dynamic OG meta tags for social previews.
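
A sketch of the tag-building step; the dish fields and domain are placeholders rather than the real schema:

```ts
const SITE_URL = "https://example.com"; // stand-in for the real domain

interface SharedDish {
  title: string;
  photoUrl: string;
  rating: number; // 1-5 stars from the post-cooking review
  slug: string;
}

// Each shared dish page renders these per-dish, so the social preview
// shows the user's own photo instead of a generic app card.
function ogTags(dish: SharedDish): Record<string, string> {
  return {
    "og:title": `${dish.title} · cooked with CookFromHere`,
    "og:description": `Rated ${dish.rating}/5. See the recipe and cook it from your own fridge.`,
    "og:image": dish.photoUrl,
    "og:url": `${SITE_URL}/dish/${dish.slug}`,
  };
}
```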

Results

Shipped solo. End to end.

CookFromHere went from idea to production as a one-person operation: research, UX design, visual design, frontend development, backend architecture, payments, email, and internationalisation.

0→1
Idea to Production
9
Languages Shipped
13
Edge Functions
$5
Monthly AI Budget
Reflections

What building a product taught me about design.

Designing under real constraints changes everything

When you're also the developer paying the API bills, you design differently. Every feature gets weighed against cost, complexity, and maintenance. This made me a better designer — not a more cautious one, but a more honest one.

AI trust is earned through transparency

Users don't blindly trust AI-generated recipes. The editable ingredient list after scanning, the visible allergy badges, and the "safe for you" confirmations all exist because early testers asked: "How do I know this is right?"

Internationalisation is a design decision, not a dev task

Supporting 9 languages affected button widths, text hierarchy, and even the cooking flow. German labels are 40% longer than English. This forced a more resilient, flexible layout from day one.

Ship, then listen

The post-cooking review flow, the guided cooking animations, and the proactive AI prompts were all added after watching real users interact with the first version. The best features came from observation, not assumption.

Try it yourself

CookFromHere is live. Scan your ingredients and get cooking.

Visit CookFromHere