Replies: 3 comments 1 reply
- Hello! Thanks for flagging this project; I wasn't aware of it. It sounds interesting and I'll try it soon. I think it makes sense to keep an eye on alternatives to FastAPI, but:
  Cheers!
- In fact, I mentioned this project mainly to find out whether it could be an advantage for SimpleAI or not. From what you say, FastAPI seems to remain the best solution in the absence of significant performance gains. Maybe once the library has been tested by the community, its use will become clearer and perhaps indispensable.
- You can convert this issue into a discussion.
- I just came across the LitServe repository, which promises 2x performance over FastAPI.
  LitServe is built by the creators of pytorch-lightning and is designed specifically for deploying AI models.
  It might be interesting to test it to compare performance for SimpleAI.
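A comparison like the one suggested above could start from a small latency harness that hits each deployment with the same payload and reports percentile latencies. Below is a minimal stdlib-only sketch; the endpoint URLs, ports, and payload shape are placeholder assumptions, not actual SimpleAI, FastAPI, or LitServe routes.

```python
# Minimal latency benchmark sketch for comparing two HTTP model servers
# (e.g., a FastAPI deployment vs. a LitServe one). URLs/payloads below
# are hypothetical placeholders -- adapt them to the real endpoints.
import json
import math
import time
import urllib.request


def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples (in seconds)."""
    s = sorted(samples)
    k = max(0, min(len(s) - 1, math.ceil(p / 100 * len(s)) - 1))
    return s[k]


def bench(url, payload, n=100):
    """POST the same JSON payload n times; return per-request latencies."""
    body = json.dumps(payload).encode()
    latencies = []
    for _ in range(n):
        req = urllib.request.Request(
            url, data=body, headers={"Content-Type": "application/json"}
        )
        t0 = time.perf_counter()
        with urllib.request.urlopen(req) as resp:
            resp.read()
        latencies.append(time.perf_counter() - t0)
    return latencies


# Example usage (requires two running servers; URLs are placeholders):
#   lat = bench("http://localhost:8000/predict", {"input": "hello"}, n=50)
#   print(f"p50={percentile(lat, 50) * 1e3:.1f} ms "
#         f"p95={percentile(lat, 95) * 1e3:.1f} ms")
```

Reporting p50/p95 rather than a mean matters here: claimed speedups like "2x over FastAPI" usually show up in tail latency under concurrent load, so a single-threaded mean can understate or overstate the difference.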