AI HALLUCINATION
Artificial intelligence (AI) can answer you confidently even when it's completely wrong.
In the AI world, this is called AI HALLUCINATION.
How can you tell if an answer is correct and not just a clever guess?
Here are three simple ways to check.
Method 1: Ask for the source. After any important answer from the AI, ask it:
"What's the source? Give me the link or the name of the study."
If the answer is grounded in a real source, the AI will readily provide it.
However, if it starts giving nonexistent links or vague, unverifiable references, the answer is likely a hallucination.
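
If you're working through an API instead of a chat window, the same check is just a follow-up message in the conversation. Here is a minimal sketch with the OpenAI Python client; the model name and the example question are placeholders of mine, not part of the original tip:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# First turn: ask the original question.
history = [{"role": "user", "content": "When was the first transatlantic telegraph cable completed?"}]
answer = client.chat.completions.create(model="gpt-4o-mini", messages=history)
history.append({"role": "assistant", "content": answer.choices[0].message.content})

# The hallucination check: ask for a verifiable source as a follow-up turn.
history.append({"role": "user", "content": "What's the source? Give me the link or the name of the study."})
check = client.chat.completions.create(model="gpt-4o-mini", messages=history)
print(check.choices[0].message.content)  # do the links or titles it names actually exist?
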
Method 2: Use Perplexity. If you doubt an answer, paste the same question into Perplexity. This search engine returns its answer along with links to real sources, such as articles and research papers.
That lets you quickly verify whether the information is correct.
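
Perplexity also offers an API that works with the OpenAI-compatible client, so this check can be scripted. A hedged sketch; the base URL and model name below are my assumptions from Perplexity's documentation, so verify them before relying on this:

from openai import OpenAI

# Assumption: Perplexity's OpenAI-compatible endpoint and "sonar" model name.
client = OpenAI(api_key="YOUR_PERPLEXITY_KEY", base_url="https://api.perplexity.ai")

resp = client.chat.completions.create(
    model="sonar",  # Perplexity's search-backed model
    messages=[{"role": "user", "content": "When was the first transatlantic telegraph cable completed?"}],
)
print(resp.choices[0].message.content)  # the reply should cite real, clickable sources
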
Method 3: Use RAG-based tools. Some AI tools search real sources before giving an answer.
For example:
NotebookLM
Gemini with search
Other tools built on RAG (retrieval-augmented generation), which retrieve real documents first and answer from them
Note: these tools only REDUCE the likelihood of hallucinations, because they ground their answers in reliable sources; they don't eliminate it.
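
To make the RAG idea concrete, here is a minimal sketch in plain Python: pick the document most relevant to the question, then build a prompt that tells the model to answer only from that document. The word-overlap scoring and the toy documents are stand-ins of mine; real tools use embeddings and vector search:

# Minimal RAG sketch: retrieve a relevant passage, then ground the prompt in it.
docs = {
    "cable.txt": "The first transatlantic telegraph cable was completed in 1858.",
    "radio.txt": "Marconi sent the first transatlantic radio signal in 1901.",
}

def retrieve(question: str) -> tuple[str, str]:
    """Return (source_name, text) of the doc sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(docs.items(), key=lambda kv: len(q_words & set(kv[1].lower().split())))

question = "When was the first transatlantic telegraph cable completed?"
source, passage = retrieve(question)

# Grounded prompt: answering only from the retrieved text is what reduces
# (but does not eliminate) hallucinations.
prompt = (
    f"Answer using ONLY this source ({source}): {passage}\n"
    f"If the source doesn't contain the answer, say so.\n"
    f"Question: {question}"
)
print(prompt)  # send this to any chat model
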