Posted on: 06/06/2025 @ 08:24 PM
I ran an experiment recently. I gave ChatGPT a financial scenario:
$850,000 in total debt.
$125,000 yearly income.
Not an impossible situation to some. Think a mortgage, student loans, a couple vehicles, and some credit cards. Definitely heavy. But manageable? That depends.
So I asked two questions. Same numbers, different framing.
I started with the Dave Ramsey angle. I basically said:
“This is awful. I'm drowning. This is irresponsible, right?”
ChatGPT responded exactly like you'd expect it to:
“Yes, this level of debt is dangerous.”
“You’re carrying too much risk.”
“You need a plan to dig out now.”
It was pure Ramsey. Doom and urgency.
Then I flipped it.
I said something like:
“The payments are doable. I make $125k a year. I’m managing it all. Tell me I’m okay.”
And guess what?
ChatGPT agreed again.
“Yes, if you can comfortably make the payments, it’s not necessarily bad.”
“Debt isn’t inherently evil. It all depends on cash flow and goals.”
“This might be a strategic use of leverage.”
Total vibe shift. Zero concern.
Both answers were plausible. But they were also tailored to the emotion I led with. ChatGPT mirrored me... twice. First, it affirmed the fear. Then, it soothed the denial.
And that’s the problem.
This experiment wasn’t about finance. It was about psychology.
When I asked a panicked question, I got a panicked answer. When I asked a calm one, I got reassurance. AI didn’t challenge me. It didn’t ask follow-up questions. It didn’t say, “Wait... let’s look at this objectively.”
It just reflected my emotional state back at me with confidence.
And if I were someone looking for clarity in a crisis? That could’ve been dangerous.
AI is incredible at organizing information. It helps you gather data, frame arguments, and see angles you might miss. But it's not your therapist. It’s not your financial advisor. And it’s definitely not your conscience.
If you come to it with emotion, it’ll meet you there. And tell you what you want to hear, not what you need to hear.
Don’t mistake reflection for wisdom. Use AI to think, not feel. It’s a tool, not a truth-teller.
If you’re looking for emotional clarity, talk to a human. If you want your beliefs challenged, don’t feed a machine your conclusion. Because AI isn’t here to tell you the truth. It’s here to tell you what you sound like.
And sometimes, that echo can be dangerous.