Conversation
}: {
  query: string;
  relevant: string;
  // model: string;
It is correct to define this here for askQuestion(); you can then pass in the model ID where you have model: selectedModel. You will just have to use only the cloud models here (for now), since we are not handling the download and usage of any local model.
Using the local models requires the modelProgressDownloadController file, which you can see an example of here, to monitor their download status so they can actually be passed into your question logic.
For now I would just stick to the cloud models:
- OpenAI models (4 models)
- PaLM 2 models (2 models)
- Gemini models (1 model)
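A minimal sketch of the suggestion above, assuming askQuestion() takes the selected model's ID. The model IDs, the ModelInfo shape, and the guard function are illustrative placeholders, not the project's actual API:

```typescript
// Hypothetical list of cloud model IDs; the real project would fetch these.
const CLOUD_MODELS = ["openai/gpt-4", "palm2/text-bison", "google/gemini-pro"];

// Guard: only cloud models are usable until local-model download handling
// (modelProgressDownloadController) is wired in.
function isCloudModel(model: string): boolean {
  return CLOUD_MODELS.includes(model);
}

async function askQuestion({
  query,
  relevant,
  model, // the selected model's ID, e.g. from `selectedModel`
}: {
  query: string;
  relevant: string;
  model: string;
}): Promise<string> {
  if (!isCloudModel(model)) {
    throw new Error(`Model "${model}" is not a supported cloud model yet`);
  }
  // ...call the cloud provider with the query and relevant context...
  return `answered "${query}" using ${model}`;
}
```

The guard keeps local model IDs out of the question logic until their download status can actually be monitored.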
Hey @Arindam200, could I get an update on this PR as well? Don't want them sitting here for too long.
Sorry for the delay! Will update the changes shortly.
Thank you! Just want to ensure that this project stays active 👍 |
Description
In this PR I have fetched the available LLM models and added a dropdown to switch between them.
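A sketch of how the dropdown options might be derived from the fetched model list. The ModelInfo shape and field names are assumptions for illustration, not the project's actual types:

```typescript
// Hypothetical shape of a fetched model entry.
interface ModelInfo {
  id: string;
  name: string;
  isLocal: boolean;
}

// Map fetched models to dropdown options, keeping cloud models only
// since local-model downloads are not handled yet.
function toDropdownOptions(
  models: ModelInfo[]
): { value: string; label: string }[] {
  return models
    .filter((m) => !m.isLocal)
    .map((m) => ({ value: m.id, label: m.name }));
}
```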
To Do:
Screenshot:
This PR fixes #89