Mobile-First Productivity App

Another Notes App

Voice-first note capture that integrates with existing tools, designed for capturing fleeting thoughts on the go

Hero image: Person using voice command to capture note while walking
Role
Product Designer (Solo)
Platform
Mobile (Android), Google Assistant API
Timeline
8 weeks (2023)
Skills
Mobile UX/UI, Voice Interface Design, API Integration, Prototyping

The Challenge

User Challenge

How might we help people capture fleeting thoughts and ideas in the moment—without interrupting flow, pulling out their phone, or switching between apps?

Product Challenge

How might we design a notes app that adds value without asking users to abandon their existing note-taking systems and workflows?

The Context

The world doesn't need another notes app—yet here we are with hundreds of them. Most compete by adding more features, more organization systems, more complexity. But this creates a paradox: the more capable the app, the more friction in capturing a quick thought.

This was a 0→1 mobile-first app designed around a single insight: the best note-taking app is the one you're already using. Rather than compete with Notion, Apple Notes, or Google Keep, this app accepts that users have existing systems—and focuses on solving the specific problem of on-the-go capture.

Image: Common scenarios - walking, biking, cooking, where traditional note-taking is interrupted

Design Process

1. Understanding the Note-Capture Problem

I interviewed people about their note-taking habits, focusing on when and why they failed to capture thoughts:

  • Context-switching friction: "By the time I open my notes app and find the right folder, I've forgotten what I wanted to write"
  • Situational barriers: Walking, cooking, driving—many moments when thoughts arise but hands aren't free
  • App fragmentation: Work notes in one app, personal in another, todos elsewhere—capture often goes to the wrong place
  • Floating notes problem: Quick captures go into a void and are never processed or reviewed

Image: Research synthesis showing note-capture pain points and contexts

2. Defining the Core Experience

Rather than build another full-featured notes app, I focused on two jobs-to-be-done:

Instant capture

Press and hold the home button → speak → done. Zero navigation.

Smart routing

Send notes to where they belong in existing systems.

This meant accepting that this app wouldn't be the long-term home for notes—it's a capture tool that hands off to existing workflows.

3. Voice-First Interaction Design

I designed the primary flow around Google Assistant integration, enabling truly hands-free capture:

  • Trigger: "Hey Google, note" or press-and-hold home button
  • Capture: Speak naturally—no specific phrases or commands required
  • Categorization: AI automatically detects if it's a note, task, or reminder
  • Routing: Sends to configured destination (Keep, Notion, Todoist, etc.)
  • Confirmation: Brief audio/haptic feedback, no visual interruption needed

Image: Voice interaction flow diagram with decision tree
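The five-step flow above can be sketched as a small pipeline. This is an illustrative sketch only: the function names, the placeholder transcription and categorization logic, and the destination names are assumptions, not the app's actual implementation, which hooks into the Google Assistant API and plays audio/haptic confirmation.

```python
# Illustrative sketch of the hands-free capture flow:
# trigger -> capture -> categorize -> route -> confirm.
# All names and logic here are hypothetical stand-ins.

def transcribe(audio: bytes) -> str:
    """Stand-in for speech-to-text; the app relies on Assistant's recognition."""
    return audio.decode("utf-8")  # placeholder: pretend the audio is text

def categorize(text: str) -> str:
    """Stand-in for the AI note/task/reminder detection step."""
    return "note"  # placeholder: always a note

def route(category: str) -> str:
    """Look up the user's configured destination for this category."""
    destinations = {"note": "Keep", "task": "Todoist", "reminder": "Keep"}
    return destinations[category]

def on_trigger(audio: bytes) -> str:
    """Runs when the user says "Hey Google, note" or holds the home button."""
    text = transcribe(audio)
    category = categorize(text)
    destination = route(category)
    # In the real app this is a brief audio/haptic cue, not a string.
    return f"Saved {category} to {destination}"
```

For example, `on_trigger(b"Buy milk")` walks the whole chain and produces a confirmation without ever showing a screen.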

4. Designing the Backup: Visual App

While voice is primary, the app needed a visual interface for:

  • Initial setup and integration configuration
  • Reviewing recent captures
  • Editing misheard transcriptions
  • Managing routing rules

I designed this as a lightweight utility—not a destination app. Users should spend 30 seconds here, not 30 minutes.

The Solution

Two-Second Capture

Press and hold the home button, speak, done. No app launching, no navigation, no typing. The entire interaction takes 2-3 seconds, making it possible to capture thoughts without breaking stride.

Image: Sequence showing home button press → voice capture → confirmation

Smart Categorization

AI detects intent from natural speech: "Buy milk" becomes a task, "Meeting thoughts" becomes a note, "Call Sarah tomorrow" becomes a reminder. No need to specify format or destination—the system infers from content.

Image: Examples of voice inputs and their automatic categorization
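The categorization described above can be approximated with simple rules. The actual app uses an AI model; the keyword lists and function below are hypothetical, tuned only to reproduce the three examples in the text.

```python
# Rule-based approximation of the intent detection. Keyword lists are
# illustrative assumptions; the real system infers intent with AI.

TASK_VERBS = ("buy", "get", "pick up", "send", "finish")
REMINDER_CUES = ("call", "remind", "tomorrow")

def detect_intent(utterance: str) -> str:
    """Return 'task', 'reminder', or 'note' for a spoken capture."""
    text = utterance.lower()
    # Imperative shopping/doing verbs with no time cue -> task
    if any(text.startswith(v) for v in TASK_VERBS) and not any(
        c in text for c in REMINDER_CUES
    ):
        return "task"
    # Time or contact cues -> reminder
    if any(c in text for c in REMINDER_CUES):
        return "reminder"
    # Everything else defaults to a plain note
    return "note"
```

With these rules, "Buy milk" classifies as a task, "Meeting thoughts" as a note, and "Call Sarah tomorrow" as a reminder, matching the examples above.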

Integration-First Architecture

Connect to existing tools (Google Keep, Notion, Todoist, Apple Notes) and configure routing rules. Work notes to Notion, personal to Keep, tasks to Todoist. Notes land where you'll actually see and use them.

Image: Integration settings showing connected apps and routing rules
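A routing configuration like the one described could take the shape of an ordered rule list. The field names, the keyword-based work/personal split, and the fallback behavior are assumptions for illustration, not the app's actual schema.

```python
# Hypothetical routing rules mirroring the example in the text:
# work notes -> Notion, personal notes -> Keep, tasks -> Todoist.
# "meeting" stands in for whatever signal marks a note as work-related.

RULES = [
    {"category": "task", "destination": "Todoist"},
    {"category": "note", "keyword": "meeting", "destination": "Notion"},
    {"category": "note", "destination": "Google Keep"},  # default for notes
    {"category": "reminder", "destination": "Google Keep"},
]

def pick_destination(category: str, text: str) -> str:
    """First matching rule wins, so specific rules precede defaults."""
    for rule in RULES:
        if rule["category"] != category:
            continue
        if "keyword" in rule and rule["keyword"] not in text.lower():
            continue
        return rule["destination"]
    return "inbox"  # fallback: hold unrouted captures in the app's feed
```

Ordering the rules most-specific-first keeps the configuration predictable: a work-flagged note hits the Notion rule before the Keep default.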

Visual Review & Edit

Recent captures appear in a simple feed within the app. Quickly scan for transcription errors, re-route misclassified items, or add context before they sync to destination apps.

Image: Feed view showing recent captures with edit and routing options

Separate Notes & Tasks Views

Within the app, captured items are organized into two tabs: Notes and Tasks. This provides quick visual separation when reviewing recent captures, making it easy to scan what you've recorded and ensure items are categorized correctly.

Image: Tabbed interface showing Notes and Tasks views

Outcome & Validation

This was a concept project; the metrics below come from user testing with a functional prototype.

2.4s

average capture time from trigger to completion

89%

transcription accuracy in varied acoustic environments

94%

of test participants said they'd use this daily

Key Findings from Testing

  • Context matters more than features: Users loved the app not when sitting at desks, but when walking, cooking, and in transit
  • Integration was essential: Participants only valued the app when it connected to their existing systems—standalone storage wasn't compelling
  • Speed beats accuracy: Users preferred occasional transcription errors over slower but perfect capture—speed preserved the fleeting thought
  • Review habit emerged: Most users developed a daily "inbox zero" habit of reviewing captures, even though it wasn't prompted by the design
"I've tried voice notes before, but they just sit in a list and I never look at them. This actually gets my thoughts to where I'll use them."
— Beta tester, product manager

Reflections & Learnings

Integration Over Innovation

The most successful product strategy wasn't building the best notes app—it was building the best bridge between capture and existing systems. Users don't want to change their workflows; they want their workflows to work better.

Voice UX Is Different

Designing for voice meant rethinking feedback, error states, and confirmation patterns. Visual designers default to screens—voice design requires thinking in time, sound, and minimal interruption. I learned to use haptics and brief audio cues instead of pulling users into the app.

The Paradox of Simplicity

The simpler the interface, the more complex the underlying system needed to be. One-button capture required sophisticated AI, routing logic, and integration management. Simple user experience meant complex product architecture.

What I'd Do Differently

If I could revisit this project, I'd spend more time on the "what happens when it's wrong?" scenarios. The design assumes AI categorization works well, but transcription errors and mis-routed notes could be frustrating. I'd build more forgiving correction flows and better confidence indicators for AI decisions. I'd also explore collaborative capture—what if you could quickly share a voice note with someone else without switching apps?