@@ -4,6 +4,12 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.20.0] - 2024-06-13
+
+### Changed
+- [Build] Simplify the Python dependency installation
+- [Build] Downgrade the "torch" package to 2.1.2+cu121
+
 ## [1.19.0] - 2024-06-13
 
 ### Added
@@ -189,17 +189,18 @@ cmake `
 
 Copy-Item -Path "../../OpenBLAS/bin/libopenblas.dll" -Destination "./bin/Release/libopenblas.dll"
 
-Set-Location -Path "../"
+Set-Location -Path "../../../"
 
-conda activate llama.cpp
-
-# We are installing the latest available version of the dependencies.
-pip install --upgrade --upgrade-strategy "eager" -r ./requirements.txt
+Write-Host "[Python] Installing dependencies..." -ForegroundColor "Yellow"
 
-Set-Location -Path "../../"
+conda activate llama.cpp
 
-# We are enforcing specific versions on some packages.
-pip install -r ./requirements_override.txt
+# We are installing the latest available version of all llama.cpp
+# project dependencies and also overriding some package versions.
+pip install `
+    --upgrade `
+    --upgrade-strategy "eager" `
+    --requirement ./requirements_override.txt
 
 conda list
 
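The consolidated step above works because pip requirement files can chain to one another: a `--requirement` line inside the override file pulls in the base requirements, so a single `pip install` call now covers both files. A minimal sketch of that flattening behavior (the file contents and the `expand` helper below are illustrative stand-ins, not the project's real files or pip's actual implementation):

```python
# Sketch: flattening chained pip requirement files. A line of the
# form "--requirement <file>" is expanded recursively, so installing
# from the override file also installs the base dependencies.
# FILES is a fake in-memory filesystem for illustration only.

FILES = {
    "requirements_override.txt": [
        "--requirement requirements.txt",
        "torch==2.1.2+cu121",
    ],
    "requirements.txt": [
        "numpy~=1.26.4",
        "sentencepiece~=0.2.0",
    ],
}

def expand(name, files):
    """Flatten a requirements file, following --requirement lines."""
    out = []
    for line in files[name]:
        if line.startswith("--requirement "):
            # Recurse into the referenced file.
            out.extend(expand(line.split(maxsplit=1)[1], files))
        else:
            out.append(line)
    return out

print(expand("requirements_override.txt", FILES))
# → ['numpy~=1.26.4', 'sentencepiece~=0.2.0', 'torch==2.1.2+cu121']
```

Because pip processes the chained base file first, the override's pinned `torch` version is applied on top of the shared dependency list.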
-# We are using a specific version of the "torch"
-# package which supports a specific CUDA version.
---extra-index-url https://download.pytorch.org/whl/nightly/cu121
-torch==2.4.0.dev20240516+cu121
+# We are importing the llama.cpp project dependencies.
+--requirement ./vendor/llama.cpp/requirements.txt
+
+# We are overriding the "torch" package version with a
+# specific compatible version that also supports CUDA.
+--extra-index-url https://download.pytorch.org/whl/cu121
+torch==2.1.2+cu121
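The pin above uses a PEP 440 local version identifier (the `+cu121` suffix) to select a CUDA-specific build, which is why the extra PyTorch package index is listed alongside the default PyPI index. A rough sketch of how such a specifier breaks apart (the `split_pin` helper is hypothetical, not pip's real parser):

```python
# Sketch: splitting a pinned "name==version+local" specifier into its
# parts. Per PEP 440, everything after "+" is a local version
# identifier, used here by PyTorch to tag CUDA-specific wheels.

def split_pin(spec):
    """Return (package name, base version, local version tag)."""
    name, version = spec.split("==")
    base, _, local = version.partition("+")
    return name, base, local

print(split_pin("torch==2.1.2+cu121"))  # → ('torch', '2.1.2', 'cu121')
print(split_pin("numpy==1.26.4"))       # → ('numpy', '1.26.4', '')
```

Wheels carrying such a local tag are generally published only on the vendor's own index, so the `--extra-index-url` line is what makes the pinned build resolvable.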