Building my Stack by solving more problems. (I think that’s what it’s called)
I find yet another major problem. I’m so used to finding problems at the moment that it’s just like any other Tuesday.
Head down. Keep eating the elephant. Byte by byte (pun intended)…
My chatbot is literally behaving like he’s had a few too many. Making up facts and extrapolating across domains that don’t make sense. And then for no reason at all he forgets what we’ve been chatting about…
Apparently (after a quick google) this is a common issue called “hallucination”.
This will not do. It will kill my entire app if I can’t get this right.
I can handle bugs. I cannot handle bullshit.
When it comes to chronic pain, trust is at the centre of everything.
A person's trust in their body (often sorely missing after years of pain), trust in the practitioner they're seeing, trust in their ability to make progress despite life's circumstances.
I know that for people with chronic pain, who are often so incredibly vulnerable due to the circumstances they find themselves in, I have to get this right.
For this whole support system inside an app to work, I cannot lose their trust. Ever.
Because of the work I do, I feel like I’ve built my brain to think in systems.
Chronic pain is so multi-factorial that over the last 15+ years, I've had to deep dive into multiple domains, and it's made me think in systems to find pathways towards solutions.
It’s just how I’m wired.
My mind starts whirring about this chatbot issue. The nights of 3.5 hours sleep are over.
4-4.5 hours (aka the new norm) feels like 3 nights sleep compared to what we were doing.
I figure out that I need to bundle the user's context into every chat message. And that I need to create conversation summaries and store them in my database.
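Roughly the shape of that idea, sketched in Python. The helper names and data shapes here are mine, not the app's, and a real version would ask the model to write the summary and persist it in the database:

```python
def summarise(messages, max_turns=4):
    """Keep the last few turns verbatim; collapse earlier turns into a
    one-line summary (stand-in for a model-written, stored summary)."""
    if len(messages) <= max_turns:
        return "", messages
    older, recent = messages[:-max_turns], messages[-max_turns:]
    summary = "Earlier we discussed: " + "; ".join(m["text"] for m in older)
    return summary, recent

def build_prompt(user_context, summary, recent, new_message):
    """Bundle the user's context and the summary into every prompt."""
    parts = [f"User context: {user_context}"]
    if summary:
        parts.append(summary)
    for m in recent:
        parts.append(f"{m['role']}: {m['text']}")
    parts.append(f"user: {new_message}")
    return "\n".join(parts)

# Illustrative data only
user_context = "Chronic lower-back pain, 6 years; flares after long sitting"
history = [
    {"role": "user", "text": "pacing strategies"},
    {"role": "assistant", "text": "start with time-based pacing"},
]
summary, recent = summarise(history)
prompt = build_prompt(user_context, summary, recent, "What about flare-ups?")
```

Because the context and summary ride along with every message, the model never has to "remember" anything on its own.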
We build it. It works.
Chatbot stays on topic and can handle topic changes too.
I’m proud and then I realise I have a slight problem….
I’ve not built a way to capture the user’s context properly.
FFS.
So I build a proper onboarding flow over a two-week period and now we are getting somewhere. It’s exactly how I do it with a new client in clinic. I really am codifying how I think and approach supporting a person with chronic pain.
I cannot find body charts that are suitable for my app so I make my own SVG charts. Weirdly proud of these.
Now I have rich user data flowing in every conversation.
I then realise that lots of data will be flowing through my app once I get users. I should use that.
I find self-optimising algorithms and build these into the backend of the recommendation system. Now my code will get smarter the more people use it. NICE.
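A minimal sketch of that "gets smarter with use" loop, assuming the simplest possible approach: recommendation scores nudged up or down by user feedback. The item names and learning rate are made up for illustration:

```python
class SelfTuningRecommender:
    """Toy feedback loop: each thumbs-up/down shifts an item's score."""

    def __init__(self, items, lr=0.1):
        self.scores = {item: 1.0 for item in items}
        self.lr = lr  # how strongly one piece of feedback moves a score

    def recommend(self):
        # Highest-scoring item first
        return max(self.scores, key=self.scores.get)

    def feedback(self, item, helpful):
        # Positive feedback raises the score, negative lowers it
        self.scores[item] += self.lr if helpful else -self.lr

rec = SelfTuningRecommender(["breathing drill", "graded walking", "sleep hygiene"])
for _ in range(3):
    rec.feedback("graded walking", helpful=True)
rec.feedback("breathing drill", helpful=False)
```

Real systems use more principled updates (bandits, collaborative filtering), but the core idea is the same: usage data feeds back into what gets recommended next.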
I’m chatting with an AI system about the memory issue because it’s better but still not the way I want it.
I ask it how to 10X my chatbot memory system.
The AI spits out something about Memory Augmented Retrieval.
I ask it to explain.
BAM.
This is the thing I’ve been looking for.
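The gist of memory-augmented retrieval, as I understand it, sketched below: store past conversation snippets, and for each new message pull back the most relevant ones to include in the prompt. Production systems use vector embeddings; here plain word-overlap cosine similarity stands in for them, and all names and data are illustrative:

```python
import math
from collections import Counter

def vectorise(text):
    # Crude stand-in for an embedding: bag-of-words counts
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

class MemoryStore:
    """Keeps snippets of past conversations; retrieves the most
    relevant ones for a new message."""

    def __init__(self):
        self.snippets = []

    def add(self, text):
        self.snippets.append((text, vectorise(text)))

    def retrieve(self, query, k=2):
        qv = vectorise(query)
        ranked = sorted(self.snippets,
                        key=lambda s: cosine(qv, s[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

memory = MemoryStore()
memory.add("User reported lower-back flare after long car journeys")
memory.add("User prefers morning exercise sessions")
memory.add("User goal: return to five-a-side football")

relevant = memory.retrieve("my back flared up after the drive")
```

The retrieved snippets get prepended to the prompt, so the chatbot recalls only what matters for this message instead of dragging the whole history along.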