Prompt Dependency Syndrome in Action
What happened:
In the 2023 case Mata v. Avianca, two lawyers used AI to help write a legal brief.
It gave them case citations.
They looked real.
They sounded real.
So they submitted them to the court.
The problem:
None of the cases existed.
The court later sanctioned the attorneys for submitting citations generated by AI that they had not verified.
What I see:
This wasn’t just an AI error.
It was a judgment failure.
They didn’t just use AI…
They trusted it without verifying it.
They let the output replace their responsibility.
Why it matters:
This is how dependency forms.
Not from bad tools, but from outputs that appear credible and go unverified.
When AI is used in place of judgment:
verification drops
risk increases
Where do you still double-check AI, and where have you stopped?
Urania Smith
Life Powered By AI
skool.com/powered-by-ai-9217