llamdrop 🦙

Run AI on any device. No PC. No subscription. No struggle.


What is llamdrop?

llamdrop is a free, open-source tool that lets anyone run a local AI model on whatever device they own: an Android phone, an old laptop, a Raspberry Pi, a budget PC, even a gaming console running Linux.

It reads your hardware automatically, finds AI models that will actually work on your specs, downloads the right one, and runs it. You don't need to know what quantization means. You don't need to read any documentation. You just run it.
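To make "reads your hardware automatically" concrete, here is a minimal sketch of what detection can look like on Linux or Termux. The function name and the returned keys are illustrative assumptions, not llamdrop's actual API:

```python
import os
import platform

def detect_device():
    """Read total RAM, CPU count, OS, and architecture without user input."""
    ram_kb = 0
    with open("/proc/meminfo") as f:           # present on Linux and Termux
        for line in f:
            if line.startswith("MemTotal:"):
                ram_kb = int(line.split()[1])  # value is reported in kB
                break
    return {
        "ram_gb": round(ram_kb / 1024 / 1024, 1),
        "cpu_cores": os.cpu_count() or 1,
        "os": platform.system(),               # e.g. "Linux"
        "arch": platform.machine(),            # e.g. "aarch64" or "x86_64"
    }
```

Everything here is standard library, so it runs on any device that can run Python 3 in a Linux terminal.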

llamdrop will always be completely free. It cannot be sold. Ever. That's not a promise; it's written into the license (GPL v3).


Who is this for?

This project was born from a real experience: spending hours trying to run local AI on a phone with no PC, no budget, and no guidance. Dozens of crashes, incompatible models, RAM errors that made no sense.

llamdrop is for anyone on low-end or budget hardware who keeps getting left out:

  • 📱 Phone users – Android via Termux, no PC needed
  • 💻 Old laptop owners – that 2012 laptop collecting dust can run AI
  • 🍓 Raspberry Pi / SBC users – Pi 4, Pi 5, Orange Pi, etc.
  • 🎮 Console / embedded Linux users – if it runs Linux, llamdrop runs on it
  • 💸 People who can't afford ChatGPT, Claude, or Gemini subscriptions
  • 🌍 Users in regions where $20/month is not a small amount
  • 🧑‍🎓 Students and self-learners wanting to experiment with AI for free
  • 🔧 Developers and tinkerers who want to test local AI on constrained hardware

If you've ever given up trying to run local AI because it was too complicated, crashed too many times, or cost too much, this is for you.


Features (v0.1 – In Development)

  • ๐Ÿ” Auto device detection โ€” reads your RAM, CPU, OS without asking you anything
  • ๐Ÿ“‹ Smart model browser โ€” two modes:
    • โœ… Verified catalog โ€” curated models confirmed working on low-end devices
    • ๐Ÿ”Ž Live HuggingFace search โ€” search any model, with live RAM estimates
  • โฌ‡๏ธ Smart downloader โ€” auto-picks the right quantization for your RAM, shows progress, resumes if interrupted
  • ๐Ÿš€ Auto-tuned launcher โ€” sets threads, context size, batch size for your exact device
  • ๐Ÿ’ฌ Stable chat โ€” automatic context trimming prevents out-of-memory crashes
  • ๐Ÿ’พ Session save/load โ€” resume conversations where you left off
  • โš ๏ธ RAM monitor โ€” live warning if memory gets dangerous during chat

Model System

llamdrop uses a two-layer model system:

Layer 1 – Verified Catalog (models.json)

A community-maintained list of models confirmed to work on low-RAM devices. Every entry has been tested, has known RAM requirements, and is safe to download. This is what most users should use.

Layer 2 – Live HuggingFace Search

Search any model on HuggingFace directly from llamdrop. The tool estimates RAM requirements from file size and quantization type. Results are clearly marked as unverified – for experienced users who want to explore beyond the catalog.
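A RAM estimate from file size works because a GGUF model is loaded roughly whole into memory, plus KV-cache/context overhead. This sketch uses an assumed 20% loading overhead and a flat context allowance; the real numbers vary by quantization and context size:

```python
def estimate_ram_gb(file_size_gb: float, context_overhead_gb: float = 0.5) -> float:
    """Rough RAM estimate: model weights plus load/context overhead."""
    return round(file_size_gb * 1.2 + context_overhead_gb, 1)

print(estimate_ram_gb(1.0))  # 1.7
```

This is exactly why such estimates must be labeled unverified: a model whose file fits in storage can still fail to fit in RAM once loaded.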

Current verified model tiers:

| Tier                | Available RAM | Example Models                                        |
|---------------------|---------------|-------------------------------------------------------|
| 1 – Ultra low       | 1.5–3 GB      | Qwen2.5-0.5B, SmolLM2-360M, Qwen2.5-1.5B Q2           |
| 2 – Standard        | 3–5 GB        | Qwen2.5-1.5B Q4, Phi-3-mini, Gemma-2-2B, Llama-3.2-1B |
| 3 – Better hardware | 5–7 GB        | Llama-3.2-3B, Qwen2.5-3B, Phi-3.5-mini                |

All verified models are free, open-source, and downloadable without login or account.


Supported Platforms

llamdrop runs on any device that can run Python 3 in a Linux terminal.

| Platform                | Status                   | Notes                           |
|-------------------------|--------------------------|---------------------------------|
| Android via Termux      | 🎯 Primary test platform | Built and tested here first     |
| Linux laptop / desktop  | ✅ Fully supported       | Any distro, x86_64 or ARM64     |
| Raspberry Pi 4 / 5      | ✅ Fully supported       | ARM64                           |
| Old Windows PC (WSL)    | ✅ Should work           | Via Windows Subsystem for Linux |
| Chromebook (Linux mode) | 🔄 Should work           | ARM64 or x86_64                 |
| Orange Pi / SBC         | 🔄 Should work           | ARM64 Linux                     |
| iOS                     | ❌ Not supported         | No proper terminal environment  |

Quick Install

curl -sL https://raw.githubusercontent.com/ypatole035-ai/llamdrop/main/install.sh | bash

โš ๏ธ llamdrop v0.1 is under active development. The installer is not functional yet. Star and Watch this repo to get notified the moment it's ready.


Project Structure

llamdrop/
├── llamdrop.py          # Main entry point
├── install.sh           # One-line installer
├── models.json          # Verified model catalog
├── modules/
│   ├── device.py        # Hardware detection (RAM, CPU, OS)
│   ├── browser.py       # Model browser – verified + HF live search
│   ├── downloader.py    # Smart download with resume + quantization picker
│   ├── launcher.py      # llama.cpp wrapper with auto-tuned flags
│   └── chat.py          # Chat loop with context trimming + RAM monitor
└── docs/
    ├── CONTRIBUTING.md  # How to contribute
    └── DEVICES.md       # Community device compatibility list

Roadmap

v0.1 – Core (In Development)

  • Device detection (RAM, CPU, OS, storage)
  • Verified model browser with tier system
  • Smart downloader with quantization auto-selection
  • llama.cpp auto-installer
  • Auto-tuned launcher
  • Basic chat with context trimming
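The "context trimming" item above is the piece that keeps chat stable on low-RAM devices: drop the oldest messages until the history fits the context window. This sketch uses a crude 4-characters-per-token estimate in place of a real tokenizer; names and numbers are illustrative:

```python
def trim_history(messages: list[str], max_tokens: int = 1024) -> list[str]:
    """Drop oldest messages until the estimated token count fits the window."""
    def tokens(msg: str) -> int:
        return max(1, len(msg) // 4)       # rough 4-chars-per-token estimate
    trimmed = list(messages)
    while len(trimmed) > 1 and sum(tokens(m) for m in trimmed) > max_tokens:
        trimmed.pop(0)                     # discard the oldest message first
    return trimmed
```

Trimming before each generation bounds the KV-cache size, which is what prevents the out-of-memory crashes described earlier.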

v0.2 – Search & Polish

  • Live HuggingFace model search with RAM estimates
  • Session save and resume
  • RAM live monitor during chat
  • Vulkan GPU acceleration for supported devices
  • Multi-language UI (Hindi, Spanish, Arabic, Portuguese)
  • Better error messages in plain language

v0.3 – Community

  • Web-based model catalog (GitHub Pages)
  • Community device profile submissions
  • Automated model testing before catalog addition

Contributing

You don't need to be a developer to contribute:

  • 📲 Test a model on your device → open a PR to update models.json
  • 🌍 Translate the UI into your language
  • 📝 Write a setup guide for your specific device
  • 🐛 Report a crash via GitHub Issues
  • ⭐ Star this repo – it helps others find it when they need it most

See CONTRIBUTING.md for full details.


License

GNU General Public License v3.0 – see LICENSE

In plain language:

  • ✅ Free to use forever
  • ✅ Free to modify and share
  • ❌ Cannot be sold
  • ❌ Cannot be made closed-source
  • ❌ Cannot be put behind a paywall

llamdrop will always be free. That is non-negotiable.


The Story

This project started because one developer spent hours trying to run local AI on an Oppo F19 Pro+ with no PC and no budget. Dozens of crashes. Models that were incompatible. RAM errors with no explanation. When it finally worked, with a tiny 1.5B model running in Termux, the thought was: nobody should have to go through all of that just to get started.

llamdrop is the tool that should have existed already.

Built by @ypatole035-ai and contributors. If llamdrop helped you, star the repo and share it with someone who needs it.
