A humorous yet practical guide to AI-assisted development. DON'T PANIC.
View the Project on GitHub HermeticOrmus/hitchhikers-guide-to-vibe-engineering
Risk Level: 🔴 DANGER
HALLUCINATION (n.): When an AI generates something that looks correct, sounds confident, and is completely made up. Named for its similarity to seeing things that aren’t there, except in this case, the things are npm packages, API methods, and library functions.
Hallucinations are dangerous because they’re indistinguishable from real information in the AI’s output.
The AI doesn’t signal uncertainty:
```python
# AI might generate:
from fastsort import QuickSort  # This library doesn't exist
result = QuickSort.parallel_sort(data)  # This API is invented
```
It looks legitimate. It compiles (sometimes). It just doesn’t work.
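You can watch this happen: the hallucinated snippet above passes a syntax check and only falls over when Python actually tries to import the fake package. A minimal sketch (`fastsort` is the invented package from the example):

```python
def dies_at_runtime(src: str) -> str:
    """Syntax-check AI output with compile(), then execute it and report what happens."""
    compile(src, "<ai_output>", "exec")  # parses fine: no syntax error here
    try:
        exec(src, {})  # now Python actually looks for the package
        return "ran fine"
    except ModuleNotFoundError as e:
        return f"runtime failure: {e}"

snippet = "from fastsort import QuickSort\nresult = QuickSort.parallel_sort(data)\n"
print(dies_at_runtime(snippet))  # the import is what fails, not the syntax
```

The syntax checker is happy; only the import machinery notices the package was never real.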
```python
# Not real
from datahelper import CSVParser
from quickauth import JWTManager
from fastvalidate import EmailValidator
```
```python
# Real library, fake method
import pandas as pd

df = pd.DataFrame({"id": [1, 2]})
other_df = pd.DataFrame({"id": [2, 3]})
df.smart_merge(other_df)  # AttributeError: .smart_merge() doesn't exist
```
```javascript
// Real function, fake parameter
fetch(url, {
  retryCount: 3,      // Not a real option
  cacheMode: 'smart'  // Also invented
});
```
```shell
# Real service, fake endpoint
curl https://api.github.com/repos/owner/repo/smart_analytics
```
"React 18's useAutoMemo hook automatically memoizes..."
(useAutoMemo doesn’t exist)
Be suspicious when the AI suggests a library or method you've never heard of. Verify it exists:
```shell
# For npm packages
npm search <package-name>
npm info <package-name>

# For Python packages
pip search <package-name>  # disabled on PyPI; search pypi.org instead
pip show <package-name>

# For any library
# Just Google it. If it exists, you'll find it.
```
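The same check can be scripted with the standard library's `importlib`. A minimal sketch; note this only inspects your local environment, not the package registry (`fastvalidate` is the hallucinated example from above):

```python
import importlib.util

def is_installed(name: str) -> bool:
    """True if the module can be found in the current environment."""
    return importlib.util.find_spec(name) is not None

print(is_installed("json"))          # stdlib module: True
print(is_installed("fastvalidate"))  # hallucinated example from above: False
```

Handy in a pre-commit hook or a quick sanity script before you chase down a mystery import error.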
```python
# In a Python REPL
import library
help(library.method_name)
dir(library)
# If the method exists, help() works
# If not, you get an AttributeError
```
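For the real-library-fake-method flavor of hallucination, `hasattr()` gives a yes/no answer without triggering the error. A sketch using the stdlib `json` module as a stand-in (`smart_dumps` is an invented name):

```python
import json

real = hasattr(json, "dumps")        # True: the method really exists
fake = hasattr(json, "smart_dumps")  # False: a plausible-sounding invention
print(real, fake)
```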
```python
# Test in isolation before integrating
from mysterious_library import mystery_function

# If this fails, the library is hallucinated
print(mystery_function("test"))
```
The AI predicts what tokens should come next based on patterns. When you describe a problem, it generates the completion that best matches what it has seen before.
The AI isn’t lying. It’s generating statistically plausible sequences. Sometimes plausible ≠ real.
- Assume AI-suggested libraries and methods might not exist until verified.
- Check before using: search, documentation, `pip show`.
- Test new dependencies in isolation before integrating.
- Code review catches hallucinations that got past you.
- Automated builds fail on missing dependencies.
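Those last two defenses can be encoded as a one-file smoke test that CI runs on every build. A sketch, assuming you maintain a list of your third-party imports (the names below are stdlib stand-ins):

```python
import importlib

# Stand-ins for your project's real third-party dependencies
DEPENDENCIES = ["json", "csv", "re"]

failures = []
for name in DEPENDENCIES:
    try:
        importlib.import_module(name)  # raises ModuleNotFoundError if fake
    except ModuleNotFoundError:
        failures.append(name)

if failures:
    raise SystemExit(f"Hallucinated or missing dependencies: {failures}")
print("all dependencies import cleanly")
```

A hallucinated package never makes it past the first pipeline run.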
> "If the AI suggests a library that perfectly solves your problem and you've never heard of it, verify before you `npm install`."
This builds the verification habit.