- Support for together.ai inference server
- Support for ollama local inference server
- GPT-4 Vision support
- Added `:json` and `:md_json` modes to support more models and inference servers
- Default http settings and where they are stored:

  before:

  ```elixir
  config :openai, http_options: [...]
  ```

  after:

  ```elixir
  config :instructor, :openai, http_options: [...]
  ```

- OpenAI client to allow for better control of default settings and reduce dependencies
## v0.0.4 - 2024-01-15
- `Instructor.Adapters.Llamacpp` for running instructor against local llms
- `use Instructor.EctoType` for supporting custom ecto types
- More documentation
- Bug fixes in ecto --> json_schema --> gbnf grammar pipeline, added better tests
## v0.0.3 - 2024-01-10
- Schemaless Ecto support
- `response_model: {:partial, Model}` partial streaming mode
- `response_model: {:array, Model}` record streaming mode
- Bug fix handling nested module names
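The two streaming modes above can be sketched as follows. This is a minimal sketch, not taken from the changelog itself: the `Instructor.chat_completion/1` call, the `stream: true` option, the result-tuple shapes, and the `Model` schema are all assumptions for illustration.

```elixir
# Hypothetical Ecto schema used only for this example
defmodule Model do
  use Ecto.Schema

  @primary_key false
  embedded_schema do
    field(:name, :string)
  end
end

# {:array, Model} — record streaming mode: each element of the stream
# is a complete, validated record as the LLM finishes emitting it
Instructor.chat_completion(
  model: "gpt-3.5-turbo",
  stream: true,
  response_model: {:array, Model},
  messages: [%{role: "user", content: "List three famous scientists"}]
)
|> Enum.each(fn record -> IO.inspect(record) end)

# {:partial, Model} — partial streaming mode: the stream emits
# progressively more complete versions of a single record
Instructor.chat_completion(
  model: "gpt-3.5-turbo",
  stream: true,
  response_model: {:partial, Model},
  messages: [%{role: "user", content: "Name one famous scientist"}]
)
|> Enum.each(fn partial -> IO.inspect(partial) end)
```

The design distinction is that `{:array, Model}` suits lists where each element is useful on its own, while `{:partial, Model}` suits UIs that want to render a single result as it fills in.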
## v0.0.2 - 2023-12-30
- `use Instructor.Validator` for validation callbacks on your Ecto Schemas
- `max_retries:` option to reask the LLM to fix any validation errors
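A sketch of how these two features combine. The `validate_changeset/1` callback name, the `Instructor.chat_completion/1` options, and the `QuestionAnswer` schema are assumptions for illustration, not verbatim from this changelog.

```elixir
defmodule QuestionAnswer do
  use Ecto.Schema
  # Hypothetical usage of the validation-callback behaviour
  use Instructor.Validator

  @primary_key false
  embedded_schema do
    field(:answer, :string)
  end

  # Runs after the LLM response is cast into the schema;
  # an invalid changeset is treated as a failed completion
  @impl true
  def validate_changeset(changeset) do
    Ecto.Changeset.validate_length(changeset, :answer, max: 200)
  end
end

# With max_retries, validation errors are fed back to the LLM,
# which is reasked to produce output that passes the callback
Instructor.chat_completion(
  model: "gpt-3.5-turbo",
  response_model: QuestionAnswer,
  max_retries: 2,
  messages: [%{role: "user", content: "Briefly, what is OTP in Elixir?"}]
)
```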
## v0.0.1 - 2023-12-19
- Structured prompting with LLMs using Ecto