Build an Offline AI Survival Kit: No Internet Required
Published on April 23, 2026 • 17 min read
The 2024 CrowdStrike outage knocked airlines, hospitals, and 911 systems offline for most of a day. Three weeks later a fiber cut took my entire town off the internet for 38 hours. I had a fully-charged laptop, no signal, and a sudden appreciation for every assumption I had made about being "always online."
That weekend I started building a self-contained AI kit that works with zero connectivity. Not as a doomsday project — just a quiet backup that stays useful when the internet is unavailable, throttled, or filtered. Six months in, it has been useful during four real outages, two flights, and one extended hospital visit where the WiFi cost $15/day.
This is the exact build. One laptop. About 200 GB of curated content. Models, knowledge bases, maps, medical references, and repair docs that work with the WiFi off. The kit fits on most modern machines and costs nothing if you already own the hardware.
Quick Start: The 30-Minute Minimum Kit {#quick-start}
If you want to be useful in an outage by tonight, do these four things in order. Total time: about 30 minutes plus download.
# 1. Install Ollama (works fully offline after install)
brew install ollama # macOS
curl -fsSL https://ollama.com/install.sh | sh # Linux
# 2. Pull a capable but compact model
ollama pull llama3.1:8b # 4.7 GB
# 3. Download Wikipedia in a single file (Kiwix)
# ~50 GB no-pictures English version (full version with images is ~100 GB; a 4.3 GB "best of" subset also exists)
mkdir -p ~/offline-kit/zim
cd ~/offline-kit/zim
curl -L -o wikipedia_en_all_nopic.zim \
"https://download.kiwix.org/zim/wikipedia/wikipedia_en_all_nopic_2024-01.zim"
# 4. Install Kiwix reader
brew install --cask kiwix # macOS
sudo apt install kiwix-tools # Linux
That gets you a working offline AI assistant plus all of Wikipedia, fully searchable, no connection. Everything below adds depth: medical, maps, code docs, repair manuals, and resilience tooling.
What "Offline AI" Actually Means {#what-offline-means}
Three categories matter. Mixing them up is how people end up with kits that fall apart at exactly the wrong moment.
- Generative AI offline. A locally-run LLM via Ollama or llama.cpp. Excellent for reasoning, drafting, translation, and brainstorming. Bad for facts it was not trained on (recent events, hyper-local detail).
- Knowledge corpora offline. Static, indexable archives — Wikipedia, Stack Exchange, medical libraries, OpenStreetMap, repair guides. Authoritative for the facts they contain. Searchable instantly.
- Tooling that works offline. Map viewers, dictionary apps, calculators, code editors with offline docs, password managers with local vaults. Boring but critical when the internet is down.
A serious survival kit covers all three. Most "offline AI" guides cover only the first.
Hardware Tier Options {#hardware-tiers}
You do not need a Mac Studio. You need a machine that runs a 7B-8B model and stores ~200 GB of compressed knowledge.
| Tier | Hardware | Storage | Models You Can Run | Cost |
|---|---|---|---|---|
| Minimum | Used ThinkPad T480 / MacBook Air M1 | 256 GB SSD | 3B-7B (q4) | $200-$400 used |
| Comfortable | Modern laptop, 16GB RAM | 512 GB SSD | 7B-13B (q4) | $700-$1,200 |
| Bunker-grade | Mac Mini M4 + UPS + external 2TB | 2 TB external | Up to 70B (q4) | $1,500 |
| Edge / portable | Raspberry Pi 5 (8GB) + 1TB SSD | 1 TB SSD | 1B-3B models | $160 |
| Backup-only | Old smartphone with PocketPal | Internal storage | 1B models | $0-$150 |
The Raspberry Pi 5 tier deserves attention: a Pi 5 with an NVMe HAT and a 1TB SSD runs llama.cpp at usable speeds and consumes 5-7 W, which means a 20,000 mAh USB-C power bank gives you about 12 hours of operation. That is the closest thing to a "portable AI library" available.
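The 12-hour figure is plain energy arithmetic, and it is worth writing down once so you can size any battery against any device. A minimal sketch; the 3.7 V cell voltage and 85% conversion efficiency are typical assumptions, not measurements:

```python
def runtime_hours(capacity_mah: float, load_watts: float,
                  cell_voltage: float = 3.7, efficiency: float = 0.85) -> float:
    """Estimate runtime of a USB power bank under a constant load.

    Power banks are rated at cell voltage (~3.7 V), and USB-C conversion
    loses some energy; both constants here are typical assumptions.
    """
    usable_wh = (capacity_mah / 1000) * cell_voltage * efficiency
    return usable_wh / load_watts

# 20,000 mAh bank driving a Pi 5 at ~5.5 W
print(f"{runtime_hours(20_000, 5.5):.1f} h")  # ≈ 11.4 h
```

Swap in your bank's rated capacity and your device's measured draw; a laptop idling at 15 W gets roughly a third of the Pi's runtime from the same bank.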
For the deeper Pi build, see our LLMs on Raspberry Pi 5 guide.
Step 1: Pick and Pre-Stage Your Models {#models}
For an offline-only environment, choose models that punch above their weight at small sizes. Three I keep on my kit:
# Reasoning + general assistant (the workhorse)
ollama pull llama3.1:8b # 4.7 GB
# Fast, low-resource fallback
ollama pull phi3.5 # 2.2 GB (Phi-3.5 Mini)
# Code-focused (for fixing scripts, building tools offline)
ollama pull qwen2.5-coder:7b # 4.4 GB
# Tiny model for very low-power devices (Pi, phone)
ollama pull gemma:2b # 1.7 GB
Verify each model can answer fully offline:
# Disable network and confirm the model still responds
sudo ifconfig en0 down # macOS Wi-Fi
ollama run llama3.1:8b "Explain how to purify drinking water in 5 short steps."
sudo ifconfig en0 up
If it answered cleanly, your generative half is ready.
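The sizes in the pull comments are not arbitrary: a q4-quantized model weighs roughly its parameter count times ~4.5 bits per weight, plus overhead for embeddings and metadata. A back-of-envelope estimator (both constants are ballpark assumptions, not Ollama's exact packing):

```python
def quantized_size_gb(params_billion: float, bits_per_weight: float = 4.5,
                      overhead: float = 1.1) -> float:
    """Rough on-disk size of a quantized model.

    q4-style quantization averages ~4.5 bits/weight once scales and
    higher-precision layers are counted; `overhead` covers tokenizer,
    embeddings, and metadata. Both are assumptions for estimation only.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# An 8B model at q4 lands in the same ballpark as the 4.7 GB Ollama reports
print(f"8B at q4: ~{quantized_size_gb(8):.1f} GB")
```

This is also how to decide whether a model fits your RAM tier before a long download: the loaded size is close to the on-disk size, plus context-window overhead.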
Pro tip: Make a manifest. After all your downloads, run:
ollama list > ~/offline-kit/MANIFEST_models.txt
Keep this file printed and tucked into your kit. When you stand up a backup machine in an emergency, you will know exactly what to re-pull.
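To make the manifest actionable, a short script can diff it against whatever is currently installed. A sketch that assumes `ollama list` output has a header row with the model name in the first column (true of current releases, but verify on yours):

```python
import subprocess
from pathlib import Path


def manifest_models(path: str) -> set[str]:
    """Model names from a saved `ollama list` dump (first column, header skipped)."""
    lines = Path(path).expanduser().read_text().splitlines()
    return {line.split()[0] for line in lines[1:] if line.strip()}


def installed_models() -> set[str]:
    """Model names Ollama currently has on disk."""
    out = subprocess.run(["ollama", "list"], capture_output=True, text=True).stdout
    return {line.split()[0] for line in out.splitlines()[1:] if line.strip()}


if __name__ == "__main__":
    missing = manifest_models("~/offline-kit/MANIFEST_models.txt") - installed_models()
    for name in sorted(missing):
        print(f"missing — re-pull with: ollama pull {name}")
```

Run it after standing up a replacement machine: anything it prints is a model you still need.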
Step 2: Build the Knowledge Base with Kiwix {#kiwix}
Kiwix packages Wikipedia, Stack Exchange, medical references, project documentation, and more into single .zim files. The full English Wikipedia without images is roughly 50 GB. With images it is 100 GB.
Download these from library.kiwix.org in priority order:
| File | Size | Why It Matters |
|---|---|---|
| wikipedia_en_all_maxi | 100 GB | Full Wikipedia w/ images |
| wikipedia_en_all_nopic_maxi | 50 GB | Faster, no images, same content |
| wikipedia_en_medicine | 1.0 GB | Focused medical articles |
| wikipedia_en_top | 4.3 GB | Curated "best of" Wikipedia |
| wikem | 0.4 GB | Emergency medicine reference |
| wikivoyage | 1.6 GB | Travel info, useful when stranded |
| stackoverflow | 92 GB | Programming Q&A |
| stack-mathematics | 4 GB | Math reference |
| ifixit | 4 GB | Repair manuals (laptops, phones, cars) |
| openstreetmap-en | 65 GB | Worldwide map data, searchable |
| project-gutenberg | 16 GB | 70,000+ public-domain books |
A practical 200 GB kit:
- Wikipedia nopic (50 GB)
- iFixit (4 GB)
- WikEm medical (0.4 GB)
- Stack Overflow (92 GB)
- Project Gutenberg (16 GB)
- OpenStreetMap for your continent (~25 GB)
- Wikivoyage (1.6 GB)
- Wikipedia medicine (1 GB)
Total: ~190 GB. Comfortable on any modern 512 GB or 1 TB drive.
# Download with resume support — these files are huge
cd ~/offline-kit/zim
aria2c -c --max-connection-per-server=4 --split=4 \
https://download.kiwix.org/zim/wikipedia/wikipedia_en_all_nopic_2024-01.zim
Once you have the .zim files, serve them locally:
# Start a local Kiwix server on port 8080
kiwix-serve --port=8080 ~/offline-kit/zim/*.zim
Open http://127.0.0.1:8080 in any browser. You now have a fully searchable knowledge library.
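During a drill it helps to confirm the server is actually answering before you depend on it. A minimal HTTP probe using only the standard library (the port is the one chosen above; adjust if you changed it):

```python
import urllib.error
import urllib.request


def is_server_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url` (any status code counts)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True  # server responded, just with an error status
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    ok = is_server_up("http://127.0.0.1:8080")
    print("kiwix-serve is", "up" if ok else "DOWN — start it before the outage, not during")
```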
Step 3: Wire Knowledge Into the LLM (RAG, Offline) {#rag-offline}
The combination of Ollama + Kiwix becomes much more useful when the model can cite facts from your archive. Set up a tiny offline RAG pipeline:
# Install minimal stack
pip install llama-index llama-index-llms-ollama \
llama-index-embeddings-huggingface chromadb sentence-transformers
# Optional: an Ollama-native embedding model as a fallback
# (the script below uses a local HuggingFace model instead)
ollama pull nomic-embed-text
# offline_rag.py
import os

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

Settings.llm = Ollama(model="llama3.1:8b", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(
    model_name="sentence-transformers/all-MiniLM-L6-v2"
)

# Point at your local survival docs (PDFs, markdown, txt);
# expand ~ explicitly — the reader will not do it for you
docs = SimpleDirectoryReader(os.path.expanduser("~/offline-kit/docs")).load_data()
index = VectorStoreIndex.from_documents(docs)
index.storage_context.persist(persist_dir=os.path.expanduser("~/offline-kit/index"))

q = index.as_query_engine(similarity_top_k=4)
print(q.query("What is the immediate treatment for a deep laceration on the forearm?"))
This works without any network connection. The model reasons over your indexed documents.
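If the pipeline feels like magic, the core retrieval step is just similarity search: embed the query, embed each document, rank by cosine similarity. A toy version, with word counts standing in for the neural embeddings the real index uses:

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words count (real kits use a neural model)."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0


def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]


docs = [
    "apply direct pressure to stop bleeding from a laceration",
    "boil water for one minute to make it safe to drink",
    "reset the router by holding the button for ten seconds",
]
print(top_k("treatment for a bleeding cut", docs, k=1))  # the laceration doc ranks first
```

The real pipeline adds chunking, a neural embedding model, and a persistent vector store, but the ranking step is this same idea.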
For a deeper offline-RAG walkthrough (production patterns, chunking, eval), see our Ollama + ChromaDB RAG pipeline.
Step 4: Offline Maps {#offline-maps}
OpenStreetMap data through Kiwix is searchable but limited. For real navigation, install Organic Maps or OsmAnd on every device in the kit.
# Linux: download regional OSM data
mkdir -p ~/offline-kit/maps
cd ~/offline-kit/maps
curl -L -o north-america.osm.pbf \
https://download.geofabrik.de/north-america-latest.osm.pbf
# Render tiles offline with a tile server (advanced)
docker run -d --name osm-tile-server \
-v ~/offline-kit/maps:/data \
-p 8081:80 overv/openstreetmap-tile-server
For phones, Organic Maps lets you download entire countries. Do this for your home country, neighboring regions, and any place you travel often.
Step 5: Medical and First-Aid References {#medical}
Wikipedia covers a lot. Wikipedia is also not authoritative. Add these:
| Resource | Format | Notes |
|---|---|---|
| WikEm Emergency Medicine | Kiwix .zim | Concise, evidence-based |
| WHO essential medicines list | PDF | Public domain |
| Where There Is No Doctor (Hesperian) | PDF | Free, plain-language |
| MSF Clinical Guidelines | PDF / web archive | Field medicine reference |
| Red Cross First Aid app | Mobile app | Works offline once installed |
# Where There Is No Doctor (English, free)
mkdir -p ~/offline-kit/docs/medical
cd ~/offline-kit/docs/medical
# NOTE: the Hesperian link below is the store page (HTML), not a direct
# PDF link — open it in a browser, follow the free-download link, and
# save the PDF here as where-there-is-no-doctor.pdf
#   https://store.hesperian.org/prod/Where_There_Is_No_Doctor.html
These also become RAG inputs in step 3 — the LLM can answer practical questions citing the specific documents.
Step 6: Repair, Survival, and Practical Manuals {#manuals}
# Download iFixit zim
curl -L -o ifixit_en_all_2024-01.zim \
https://download.kiwix.org/zim/ifixit/ifixit_en_all_2024-01.zim
# US Army Survival Manual (FM 21-76, public domain)
curl -L -o us-army-survival-fm-21-76.pdf \
https://www.bits.de/NRANEU/others/amd-us-archive/FM21-76(70).pdf
Add: vehicle repair (Haynes manuals if you own them), HAM radio reference, your country's emergency-services radio frequencies, electrical wiring color codes for your region, knot-tying reference, and any equipment manuals for tools you actually own.
Step 7: Tools That Work Offline {#tools}
Skip cloud-only apps. Install:
- Obsidian — local-first markdown notes, no account needed
- VS Code — offline-capable; download extensions ahead of time
- DevDocs — offline programming-language references
- KeePassXC — local password vault
- OnlyOffice or LibreOffice — document editing
- Anki — flashcards (download decks beforehand)
- Calibre — ebook library + reader
- VLC — plays anything you throw at it
- PocketPal AI / MLC Chat — phone-based LLM as a backup
# Bulk install via Homebrew on macOS
brew install --cask obsidian visual-studio-code devdocs keepassxc \
libreoffice anki calibre vlc
For Linux:
# Debian/Ubuntu repos carry some of these; Obsidian, VS Code, and a
# current Anki ship as downloads from their own sites — grab those
# installers while you still have internet
sudo apt install keepassxc libreoffice calibre vlc
Step 8: Power and Resilience {#power}
The smartest model in the world is useless if your laptop dies in 90 minutes. Add real power resilience.
| Item | Use | Approx Cost |
|---|---|---|
| 20,000 mAh USB-C PD power bank | 1-2 laptop charges | $60-$90 |
| 100W solar panel + charge controller | Slow daytime charging | $120 |
| Bluetti or Anker portable power station | Multi-day power | $300-$700 |
| UPS with 30-min runtime | Brownout protection | $80-$150 |
| Spare laptop battery (if removable) | Genuine backup | $40-$120 |
A Mac Mini M4 + UPS combo runs about $1,500 total and gives you a desktop-class AI workstation that can ride out a multi-hour outage. For full off-grid resilience, the Bluetti AC180 (1,152 Wh) charges from solar and runs a Mac Mini for 25+ hours.
Step 9: Mobile and Wearable Backup {#mobile-backup}
Your laptop will not always be with you. Set up a phone backup:
- iOS: Install PocketPal AI. Pre-download Phi-3.5 Mini and Llama 3.2 1B.
- Android: Install MLC Chat or Layla Lite. Same models work.
- Pre-load Organic Maps with regional tiles.
- Install the Kiwix mobile app and copy the medical + survival .zim files to phone storage.
- Cache key Wikipedia articles with offline reading apps.
This gives you a fallback brain in your pocket even if you lose the laptop.
Step 10: Disaster-Test the Kit {#disaster-test}
A kit you have not stress-tested is a kit you cannot trust. Run a 4-hour offline simulation once a month:
- Disconnect from WiFi and disable cellular.
- Pretend the public internet is unreachable.
- Try to: look up first aid for a broken arm, find a route to a hospital 50 miles away, debug a Python script, draft an email to a relative, identify a wild plant, and translate a phrase to Spanish.
- Note where the kit failed. Add resources to fill gaps.
After three monthly tests, the kit becomes genuinely reliable. Skip the test and you will discover gaps when stakes are high.
Pitfalls and Gotchas {#pitfalls}
Gotcha 1: Models hallucinate facts. Local LLMs sound authoritative even when wrong, especially about local geography or recent events. Always cross-reference critical info with Kiwix or your downloaded references. The LLM is the reasoning engine, not the source of truth.
Gotcha 2: Refresh annually. Wikipedia dumps update. Medical guidelines change. iFixit gets new repair guides. Set a calendar reminder for January 1 to re-download everything. An out-of-date kit gives wrong answers with high confidence.
Gotcha 3: Storage redundancy matters. Keep the entire kit on at least two physical drives. SSDs fail. External drives die. `rsync -av ~/offline-kit/ /Volumes/Backup/offline-kit/` is your friend.
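rsync copies the kit, but a periodic checksum pass proves the backup actually matches the original. A sketch that hashes both trees and reports divergence (the paths are whatever your drives mount as):

```python
import hashlib
from pathlib import Path


def checksum(path: Path) -> str:
    """SHA-256 of a file, read in 1 MB chunks so huge .zim files fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def diverged(src: Path, dst: Path) -> list[str]:
    """Relative paths under `src` whose backup copy is missing or differs."""
    bad = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(src)
        twin = dst / rel
        if not twin.exists() or checksum(f) != checksum(twin):
            bad.append(str(rel))
    return bad


if __name__ == "__main__":
    for rel in diverged(Path.home() / "offline-kit",
                        Path("/Volumes/Backup/offline-kit")):
        print("out of sync:", rel)
```

Hashing 200 GB takes a while; run it quarterly rather than after every sync.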
Gotcha 4: Encrypt the drive. A survival kit by definition contains personal notes, passwords, and medical references. Use FileVault (macOS), BitLocker (Windows), or LUKS (Linux). A lost laptop should not become a privacy disaster.
Gotcha 5: Documentation > tools. A printed one-page index of "what is on this drive and how to start each tool" matters more than a fancier model. Print the manifest.
Storage Budget {#storage-budget}
| Component | Size |
|---|---|
| Ollama models (3-4 of them) | 12-15 GB |
| nomic-embed embeddings model | 0.5 GB |
| Wikipedia (nopic) | 50 GB |
| Stack Overflow | 92 GB |
| iFixit | 4 GB |
| OpenStreetMap (regional) | 25 GB |
| Project Gutenberg | 16 GB |
| Medical PDFs | 0.5 GB |
| Repair / survival PDFs | 0.5 GB |
| Maps (OsmAnd / Organic) | 8 GB |
| Tools and apps | 4 GB |
| Personal docs / notes | 2-5 GB |
| Total | ~215 GB |
A 512 GB drive holds the kit comfortably with room to grow. A 1 TB drive lets you add full-image Wikipedia and large book collections.
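The budget is easy to sanity-check in code before you buy a drive. The sizes below are the table's figures, using midpoints for the ranges:

```python
# Component sizes in GB, from the storage budget table
KIT_GB = {
    "ollama models": 13.5,        # midpoint of 12-15 GB
    "embedding model": 0.5,
    "wikipedia (nopic)": 50,
    "stack overflow": 92,
    "ifixit": 4,
    "openstreetmap (regional)": 25,
    "project gutenberg": 16,
    "medical pdfs": 0.5,
    "repair/survival pdfs": 0.5,
    "mobile maps": 8,
    "tools and apps": 4,
    "personal docs": 3.5,         # midpoint of 2-5 GB
}

total = sum(KIT_GB.values())
print(f"total: {total:.0f} GB")                    # → total: 218 GB
print(f"fits 512 GB drive: {total < 512 * 0.93}")  # leave ~7% slack for the filesystem
```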
FAQs From Readers Building This {#faqs-from-readers}
I built an early version of this kit in 2024 and shipped a cleaner version in 2026. Hundreds of readers have built their own and sent feedback. The most common mistake I see is treating the LLM as a search engine. It is not. The LLM is a reasoning layer. The Kiwix archives are the search engine. Use both.
For the model side specifically, our best local AI models for 8GB RAM breaks down which compact models stay coherent for survival-style queries. For Pi-based portable kits, the LLMs on Raspberry Pi 5 guide goes deep on power and thermal tradeoffs.
For Wikipedia archives, the canonical reference is the Kiwix content catalog. For survival reference downloads, the Hesperian Health Guides library is the gold standard for plain-language medical information.
Why This Matters Even If Civilization Is Fine {#why-it-matters}
Internet outages happen. Travel happens. Hospitals charge $15/day for WiFi. Some countries filter the open web. Long flights still exist. Your home router will eventually die at 11pm on a Sunday.
A working offline kit is not a doomsday product. It is a quiet utility that makes you more capable in dozens of ordinary situations. The week I built mine, I helped a neighbor troubleshoot their well pump using iFixit content, drafted a complaint letter on a flight, and looked up a dosing question for a child with a fever — all without a connection.
Build it once. Refresh it yearly. Forget about it the rest of the time.
Final Configuration File {#final-config}
Save this as ~/offline-kit/README.md so future-you remembers the kit's contents:
# Offline AI Survival Kit
Last refreshed: 2026-04-23
## How to start
1. Open Terminal: `ollama serve &`
2. Open Terminal: `kiwix-serve --port=8080 ~/offline-kit/zim/*.zim`
3. Visit http://127.0.0.1:8080 for knowledge
4. Run `ollama run llama3.1:8b` for AI
## Backup drive
- Encrypted USB at /Volumes/Backup/offline-kit
- Last sync: <date>
## Update reminders
- Models: every 6 months
- Wikipedia: every 12 months
- iFixit: every 6 months
- Medical PDFs: every 12 months
That single file has saved me three times when I had to stand the kit up on a borrowed machine.