Why RAG Is Broken for Mental Health - And What We Built Instead

  • Writer: Udayaditya Barua
  • Feb 3
  • 3 min read
Standard RAG retrieves keywords — but human emotion doesn't work that way. Learn how Flammingo's hybrid Graph + Vector engine goes beyond search to actually derive meaning from your mental health data.

If you're an engineer in 2026, you can't escape the acronym RAG: Retrieval-Augmented Generation.


It's the industry standard for making AI feel "smart." The premise is elegant: when a user asks a question, the AI searches a database for relevant documents, pastes them into the prompt, and generates an answer.


Brilliant technology. But for mental health, it has a fatal flaw.


RAG Is Just "Ctrl+F" on Steroids


At its core, RAG is a search engine. It matches keywords. If you search for "anxiety," it fetches every note containing the word "anxiety."


But human emotion is almost never that literal.


You might be anxious without ever typing the word. You might write about a "tight chest." Bad sleep. Hating Tuesdays. Each of these is a signal - but none of them share a keyword.


Standard RAG finds the text. It misses the meaning.


The Problem: Dots Without Lines


Standard RAG treats your memories like files in a cabinet. It can pull out File A and File B, but it has no idea they're connected — unless they share the exact same words.


The result is what I call the "dementia effect." The AI can recite your own words back to you. It just can't make sense of them.


Here's what that looks like in practice:
You: "Why am I so tired?"
RAG AI: "Here are 5 journal entries where you mentioned 'tired.'"

What the AI completely misses: you only say "tired" on days after you talk about calling your mom. That's not a coincidence. That's a pattern — and it's the insight that actually matters.
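To make the gap concrete, here's a minimal sketch of naive keyword retrieval over some hypothetical journal entries (the data and function names are illustrative, not Flammingo's actual pipeline). The search finds "tired" but has no way to surface the "drained" entry that carries the same signal:

```python
# Hypothetical journal entries (illustrative data, not a real schema).
entries = [
    {"day": "Mon", "text": "Called mom tonight. Long talk."},
    {"day": "Tue", "text": "So tired today, skipped the gym."},
    {"day": "Thu", "text": "Phone call with mom ran late again."},
    {"day": "Fri", "text": "Completely drained, slept 10 hours."},
]

def keyword_search(entries, term):
    """Naive keyword retrieval: returns only entries containing the literal term."""
    return [e for e in entries if term in e["text"].lower()]

hits = keyword_search(entries, "tired")
print([e["day"] for e in hits])  # only Tue; Friday's "drained" is invisible
```

The pattern (mom's calls preceding the exhaustion) lives across entries that share zero keywords, so no amount of better string matching will find it.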


The Solution: Graph + Vector


At Flammingo, we moved beyond standard RAG. We built a hybrid engine that layers two technologies together - and the combination is where the real power lives.


Vector Database - The Vibe

We don't just search for words. We calculate mathematical distance between them.


Every emotion you express gets converted into a vector — a set of numbers that captures its meaning. This lets the AI understand that "heart racing" and "nervous" are semantically close, even though they share zero words in common.


Vectors give us similarity without requiring exact matches.
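As a rough sketch of what "mathematical distance" means here: each phrase becomes a vector, and cosine similarity measures how aligned two vectors are. The 3-dimensional vectors below are hand-picked for illustration; real systems use learned embedding models with hundreds of dimensions.

```python
import math

# Toy "embeddings", hand-picked for illustration. Real embeddings come
# from a trained model and have far more dimensions.
vecs = {
    "heart racing": [0.9, 0.8, 0.1],
    "nervous":      [0.8, 0.9, 0.2],
    "good sleep":   [0.1, 0.0, 0.9],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction, 0.0 unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(vecs["heart racing"], vecs["nervous"]))     # close to 1.0
print(cosine(vecs["heart racing"], vecs["good sleep"]))  # much lower
```

"Heart racing" and "nervous" score near 1.0 despite sharing no words, while "good sleep" sits far away, which is exactly the similarity-without-exact-match property described above.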


Knowledge Graph - The Logic

This is the real game-changer.


Instead of storing your entries as isolated documents, we map your life as a web of relationships. Every person, event, and emotion becomes a node. The connections between them become edges — and those edges carry meaning.


Node A (Person): Mom
Node B (Event): Phone Call
Node C (Emotion): Drained
Edge: Phone Call → causes → Drained

The graph doesn't just store what happened. It understands why it mattered.
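A minimal way to picture that node-and-edge structure in code (the adjacency-dict layout and the extra "triggers" edge from Mom are assumptions for illustration; a production system would use a proper graph database):

```python
# A tiny labeled graph as adjacency lists. Each edge carries a relation
# label, so the connection itself has meaning.
graph = {
    "Mom":        [("triggers", "Phone Call")],   # hypothetical edge
    "Phone Call": [("causes", "Drained")],        # edge from the example above
}

def trace(graph, start):
    """Follow the first outgoing edge from each node, collecting hops."""
    path, node = [], start
    while node in graph:
        relation, node = graph[node][0]
        path.append((relation, node))
    return path

print(trace(graph, "Mom"))
# [('triggers', 'Phone Call'), ('causes', 'Drained')]
```

Because the edges are labeled, the traversal yields a causal chain rather than a bag of co-occurring documents.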


From Search to Cognition

Layering vectors and graphs is where everything changes.


When you ask Mingo "Why am I tired?", it doesn't just keyword-match against the word "tired." It traverses the graph. It sees that Node A (Mom) connects frequently to Node C (Drained) through Node B (Phone Call). The vector layer confirms that "tired," "drained," and "exhausted" are all pointing at the same emotional space.

The result isn't a list of old journal entries. It's an actual insight.
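Putting the two layers together, a hybrid lookup might look roughly like this sketch: the vector layer maps the query onto the nearest emotion node, and the graph layer supplies the causal chain behind it. Everything here (the toy embeddings, the `causes` table, the function names) is a hypothetical simplification, not Flammingo's actual engine.

```python
import math

# Toy 2-d embeddings for emotion nodes, hand-picked for illustration.
emotion_vecs = {
    "Drained": [0.9, 0.8],
    "Calm":    [0.1, 0.9],
}
# Graph layer, flattened to a lookup: emotion -> (event, person).
causes = {"Drained": ("Phone Call", "Mom")}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def explain(query_vec):
    # 1. Vector layer: find the emotion node nearest the query.
    emotion = max(emotion_vecs, key=lambda e: cosine(query_vec, emotion_vecs[e]))
    # 2. Graph layer: traverse the causal edges behind that emotion.
    event, person = causes.get(emotion, (None, None))
    return emotion, event, person

print(explain([0.85, 0.75]))  # a query embedding standing in for "tired"
# ('Drained', 'Phone Call', 'Mom')
```

The query never contains the word "drained", yet the answer comes back as a chain (person, event, emotion) rather than a list of matching documents.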


RAG: Retrieves data.


Flammingo: Derives meaning.


We're not building a search engine for your journal. We're building a cognition engine for your life.

