
ASIMOV Ollama Module


ASIMOV module for local inference using the Ollama model runtime.

✨ Features

  • To be determined!

πŸ› οΈ Prerequisites

  • Rust 1.85+ (2024 edition), if building from source
  • An available Ollama endpoint
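If you are unsure whether an Ollama endpoint is available, a quick reachability check might look like the sketch below. It resolves the endpoint the same way this module's configuration does (the `ASIMOV_OLLAMA_API_ENDPOINT` variable, falling back to Ollama's default port) and queries Ollama's `/api/version` route:

```shell
# Resolve the endpoint, falling back to Ollama's default port
ENDPOINT="${ASIMOV_OLLAMA_API_ENDPOINT:-http://localhost:11434}"

# Query Ollama's version route; print a hint if no server is reachable
curl -fsS "$ENDPOINT/api/version" 2>/dev/null \
  || echo "No Ollama server reachable at $ENDPOINT"
```

A running server answers with a small JSON object such as `{"version":"..."}`.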

⬇️ Installation

Installation with ASIMOV CLI

asimov module install ollama -v

Installation from Source Code

cargo install asimov-ollama-module

βš™ Setup

This module uses an Ollama endpoint to generate responses. On macOS, a simple way to run Ollama locally is via Homebrew:

$ brew install ollama
$ brew services start ollama  # start now and relaunch at login (or boot)

You should then be able to use asimov-ollama-prompter. Alternatively, downloading and running the Ollama desktop application also works.

Downloading Models

You can download a model either through the CLI:

$ ollama pull gemma3:1b

Or in the application:

(Screenshot: downloading a model in the Ollama application.)

πŸ‘‰ Examples

$ echo "In two sentences, why is the sky blue?" | asimov-ollama-prompter -m gemma3:1b
The sky appears blue because of a phenomenon called Rayleigh scattering, where sunlight is split into different colors of light. Blue light is scattered more effectively by the tiny particles in the atmosphere than other colors, making it visible to our eyes.

βš™ Configuration

Provide a model name either via the module configuration:

asimov module config ollama

Or through an environment variable:

export ASIMOV_OLLAMA_MODEL="..."
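Putting the environment-variable route together, a minimal setup might look like the following. The model name here is `gemma3:1b`, the example model pulled earlier in this README; the endpoint variable is optional and is shown with its default value:

```shell
# Required: which Ollama model the prompter should use
export ASIMOV_OLLAMA_MODEL="gemma3:1b"

# Optional: where to reach the Ollama server (default shown)
export ASIMOV_OLLAMA_API_ENDPOINT="http://localhost:11434"
```

With these set, `echo "..." | asimov-ollama-prompter` needs no further flags.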

Optional configuration

Name      Environment Variable          Default
endpoint  ASIMOV_OLLAMA_API_ENDPOINT    http://localhost:11434
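For troubleshooting, it can help to talk to the configured endpoint directly. The sketch below calls Ollama's `/api/generate` route (part of Ollama's HTTP API; whether this module uses that exact route internally is an assumption) with the example model from earlier in this README:

```shell
# Sketch: non-streaming generation request against the Ollama HTTP API.
# Falls back to an error object if the endpoint is unreachable.
RESPONSE=$(curl -s http://localhost:11434/api/generate \
  -d '{"model": "gemma3:1b", "prompt": "Why is the sky blue?", "stream": false}' \
  2>/dev/null || echo '{"error": "endpoint unreachable"}')
echo "$RESPONSE"
```

A working server returns a JSON object whose `response` field contains the generated text.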

πŸ“š Reference

Prompt

echo "Why is the sky blue?" | asimov-ollama-prompter

πŸ‘¨β€πŸ’» Development

git clone https://github.com/asimov-modules/asimov-ollama-module.git

