TherapyBeagle

Reimagining AI as a slow, reflective companion — not a hyper-efficient assistant.

TherapyBeagle challenges the productivity-first model of AI by introducing emotional slowness and care into digital interactions. Built to run entirely locally using LLaMA 3.2 (via Ollama), the chatbot invites users to breathe, reflect, and feel, not just respond. Note: this is not a replacement for therapy, but a speculative design tool for emotional expression and companionship.

Client

Self

Services

Product Design, UI/UX, Human-Computer Interaction

Industries

Human-Computer Interaction (HCI)

Date

April 2025

01 — Problem

Many students and creatives feel emotionally fatigued, isolated, or anxious, yet find digital tools too clinical or transactional to use meaningfully. Most journaling apps and mental-health chatbots chase efficiency, overlooking the warmth and slowness that genuine reflection requires.

02 — Target Users

- Emotional support seekers — people who want a non-judgmental listener
- Privacy-conscious users — uncomfortable sharing data with commercial AI
- Students & creatives — navigating burnout, stress, or loneliness

03 — Approach & Tech Stack

I treated TherapyBeagle as both a technical build and an emotional design challenge, exploring how local LLMs, conversational UI, and lightweight memory systems could foster trust.

Tech Stack:

1. Backend: LLaMA 3.2 via Ollama (fully local, offline-first)
2. Framework: Flask (Python)
3. Frontend: HTML, CSS, JavaScript (includes theme toggle and accessibility features)
4. Memory System: custom SessionMemory class using JSON persistence

04 — Key Features & UX Decisions

1. Contextual Memory — remembers the last six exchanges to build conversational continuity and emotional resonance.
2. Emotion Tagging — softly detects themes like grief, anxiety, or loneliness to subtly adjust the bot's tone.
3. Typing Animation — introduces "thinking" pauses to encourage mindful pacing over instant response.
4. Local-Only Storage — every interaction stays on-device; no cloud, no data exposure.
5. Theme Toggle — light/dark and high-contrast modes for visual comfort during longer sessions.

05 — Architecture Overview

1. `main.py` — routes input, builds prompts, queries LLaMA, logs memory
2. `SessionMemory` — tracks user state, emotional tags, persistent JSON logs
3. `index.html` — renders chat UI, feedback bubbles, reset & theme controls
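As a rough illustration of the memory system, here is a minimal sketch of how a `SessionMemory` class with JSON persistence, a six-exchange window, and keyword-based emotion tagging could be put together. The internals, keyword lists, and method names are my assumptions for this sketch, not the project's actual implementation:

```python
import json
import os

# Hypothetical keyword map for soft emotion tagging (illustrative only).
EMOTION_KEYWORDS = {
    "grief": ["loss", "miss", "died", "grieving"],
    "anxiety": ["anxious", "worried", "panic", "overwhelmed"],
    "loneliness": ["alone", "lonely", "isolated"],
}

class SessionMemory:
    """Keeps the last few exchanges on disk as JSON, entirely on-device."""

    def __init__(self, path="session_memory.json", max_turns=6):
        self.path = path
        self.max_turns = max_turns
        self.turns = []  # each turn: {"user": ..., "bot": ..., "tags": [...]}
        self._load()

    def _load(self):
        if os.path.exists(self.path):
            with open(self.path, "r", encoding="utf-8") as f:
                self.turns = json.load(f)

    def _save(self):
        with open(self.path, "w", encoding="utf-8") as f:
            json.dump(self.turns, f, ensure_ascii=False, indent=2)

    def tag_emotions(self, text):
        # Very soft detection: substring matching against theme keywords.
        lowered = text.lower()
        return [theme for theme, words in EMOTION_KEYWORDS.items()
                if any(w in lowered for w in words)]

    def add_exchange(self, user_msg, bot_msg):
        self.turns.append({
            "user": user_msg,
            "bot": bot_msg,
            "tags": self.tag_emotions(user_msg),
        })
        self.turns = self.turns[-self.max_turns:]  # keep only the last six
        self._save()

    def context_prompt(self):
        # Flatten recent turns into a prefix the model can read as context.
        lines = []
        for t in self.turns:
            lines.append(f"User: {t['user']}")
            lines.append(f"Beagle: {t['bot']}")
        return "\n".join(lines)
```

Trimming to the last six turns keeps prompts short enough for a local model while still letting replies acknowledge what the user said earlier.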

06 — User Testing & Iteration

I tested TherapyBeagle with five peers (students and creatives) over a two-week period.

What users said:

- "Feels gentler than Replika."
- "I like that it doesn't judge or try to fix me."
- "Knowing it's not cloud-based helped me open up."
- "I wish it remembered what I said earlier."

What changed based on feedback:

- Added session-based memory for contextual continuity
- Introduced emotional tagging for themes like grief, stress, and reflection
- Rewrote prompt structures to acknowledge past responses ("Last time, you mentioned…")

07 — Outcomes & Impact

Quantitative outcomes:

- Average conversation length: 7+ turns
- 70% of testers said they'd use it again "when feeling overwhelmed or stuck."

Qualitative insights:

- Users responded better to a chatbot that listens, not one that tries to fix
- Local-only hosting created a sense of privacy and agency
- Emotional design isn't a layer; it lives in pacing, tone, and memory

08 — Final Reflection

TherapyBeagle taught me that compassion can be designed, not through logic alone, but through interface tone, timing, and trust. This project reinforced my belief that emotionally intelligent tools require more than clean visuals; they demand systems that care. It also deepened my interest in building offline-first, privacy-centered, and emotionally aware digital experiences.