COD-254 Add Explain as Python Service
This tracks separating the chatbot component of the site into its own service. NB: code pushed here should be considered provisional. There's plenty to do to make this fast and reliable, and we should use the TypeScript OpenAI API for as long as we can get away with it.
Summary
This PR adds rough tooling to separate calls to predictive models into an independent service. This will give us greater flexibility in how our AI tutor is designed on the backend if we want to change models or develop more custom model architectures. For the time being, though, we should continue to rely on the TypeScript OpenAI API, since a lot more optimization is required before the Python service is fast and robust enough to take over.
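As a rough sketch of the kind of unification `interfaces.py` is after (class and method names below are hypothetical, not the ones in this PR), a thin abstract base class lets the service swap LLM toolkits without callers changing:

```python
from abc import ABC, abstractmethod


class ChatModel(ABC):
    """Common interface so different LLM toolkits are interchangeable."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for a single prompt."""


class EchoModel(ChatModel):
    """Stand-in backend so this sketch runs without API keys."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


def explain(model: ChatModel, question: str) -> str:
    # The service depends only on the interface, not on any one toolkit,
    # so swapping OpenAI for a custom model means adding a new subclass.
    return model.complete(question)
```

A real OpenAI-backed subclass would implement `complete` by calling the OpenAI client, but the dispatch pattern is the same.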
Files Added
- interfaces.py: classes to unify function signatures from different LLM toolkits
- app.py: a toy Flask app to work as a backend server
- config.yml: parameters for model classes
- requirements.txt: Python dependencies
- setup.sh: script to prepare the virtual env
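For illustration, a `config.yml` along these lines could hold the model-class parameters (the keys shown are assumptions, not necessarily the ones in the PR):

```yaml
model:
  provider: openai       # which toolkit the interface classes should wrap
  name: gpt-3.5-turbo    # model identifier passed through to the provider
  temperature: 0.7
  max_tokens: 512
server:
  host: 0.0.0.0          # bind address for the toy Flask app
  port: 5000
```

Keeping these values in config rather than code means the Flask service can be repointed at a different model without a redeploy of the site.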