Here is the result of my command. Is this error happening inside the container or outside it? The part that puzzles me is:

```
genai-stack-pull-model-1 | pulling ollama model llama2 using http://llm-gpu:11434
```

The docs told me to add that URL to the .env file, but I certainly don't have a server running at that address.
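For reference, this is the relevant line from my .env, following the GPU-profile instructions. (The variable name and hostname are taken from the docs; `llm-gpu` is a compose service name, so it should only resolve inside the compose network rather than on the host.)

```
# From .env, per the linux-gpu profile docs (unverified assumption that
# this is the variable the stack reads for the Ollama endpoint):
OLLAMA_BASE_URL=http://llm-gpu:11434
```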
```
$ docker compose --profile linux-gpu up
WARN[0000] The "LANGCHAIN_PROJECT" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_ACCESS_KEY_ID" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_SECRET_ACCESS_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_DEFAULT_REGION" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_PROJECT" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_ACCESS_KEY_ID" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_SECRET_ACCESS_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_DEFAULT_REGION" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_PROJECT" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_ACCESS_KEY_ID" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_SECRET_ACCESS_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_DEFAULT_REGION" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_PROJECT" variable is not set. Defaulting to a blank string.
WARN[0000] The "LANGCHAIN_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_ACCESS_KEY_ID" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_SECRET_ACCESS_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "AWS_DEFAULT_REGION" variable is not set. Defaulting to a blank string.
[+] Running 4/4
 ✔ llm-gpu 3 layers [⣿⣿⣿] 0B/0B Pulled 1.3s
   ✔ aece8493d397 Already exists 0.0s
   ✔ 3b9196308e0f Already exists 0.0s
   ✔ e75cbce7870b Already exists 0.0s
[+] Building 0.0s (0/0) docker:desktop-linux
[+] Running 8/8
 ✔ Container genai-stack-llm-gpu-1 Created 0.0s
 ✔ Container genai-stack-database-1 Running 0.0s
 ✔ Container genai-stack-pull-model-1 Recreated 0.1s
 ✔ Container genai-stack-api-1 Recreated 0.1s
 ✔ Container genai-stack-bot-1 Recreated 0.1s
 ✔ Container genai-stack-pdf_bot-1 Recreated 0.1s
 ✔ Container genai-stack-loader-1 Recreated 0.1s
 ✔ Container genai-stack-front-end-1 Recreated 0.1s
Attaching to genai-stack-api-1, genai-stack-bot-1, genai-stack-database-1, genai-stack-front-end-1, genai-stack-llm-gpu-1, genai-stack-loader-1, genai-stack-pdf_bot-1, genai-stack-pull-model-1
genai-stack-pull-model-1 | pulling ollama model llama2 using http://llm-gpu:11434
genai-stack-pull-model-1 | Error: Head "http://llm-gpu:11434/": dial tcp 172.18.0.4:11434: connect: no route to host
genai-stack-pull-model-1 exited with code 1
Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running hook #0: error running hook: exit status 1, stdout: , stderr: Auto-detected mode as 'legacy'
nvidia-container-cli: initialization error: load library failed: libnvidia-ml.so.1: cannot open shared object file: no such file or directory: unknown
```
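The last error (`libnvidia-ml.so.1` not found) comes from the host-side NVIDIA container runtime rather than from inside any container, which is presumably why `llm-gpu` never started and `pull-model` then had no route to it. These are the host-side checks I would try next; a sketch only, assuming the NVIDIA driver and NVIDIA Container Toolkit are what the GPU profile expects (the CUDA image tag below is just an example, not from this repo):

```
# Is the driver library visible on the host at all?
$ ldconfig -p | grep libnvidia-ml.so.1

# Is the driver itself working on the host?
$ nvidia-smi

# Can Docker's NVIDIA runtime reach the GPU independently of this stack?
$ docker run --rm --gpus all nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi
```

If the first two commands fail, the NVIDIA driver is missing on the host; if only the last fails, the NVIDIA Container Toolkit is likely not installed or not configured as a Docker runtime.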