💡 [Feature]: Add support for picking LLM model for GitHub Copilot chat participant #334
Comments
It is described here, for example: https://github.com/microsoft/vscode-extension-samples/blob/main/chat-sample/src/simple.ts. We should prototype it (a minimal sketch follows below).
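A minimal sketch of the pattern from that sample, assuming the chat and language model APIs available in recent VS Code versions; the participant id and prompt handling are illustrative only, not the SPFx Toolkit's actual implementation. The handler reads the model the user picked in the chat model dropdown from `request.model` and streams its reply:

```typescript
import * as vscode from 'vscode';

// Hypothetical participant id for illustration; the real extension uses its own id.
const PARTICIPANT_ID = 'spfx-toolkit.chat';

const handler: vscode.ChatRequestHandler = async (
  request: vscode.ChatRequest,
  context: vscode.ChatContext,
  stream: vscode.ChatResponseStream,
  token: vscode.CancellationToken
): Promise<vscode.ChatResult> => {
  // request.model is the model the user selected in the chat model picker.
  const messages = [vscode.LanguageModelChatMessage.User(request.prompt)];
  const response = await request.model.sendRequest(messages, {}, token);

  // Stream the model output back into the chat view as markdown.
  for await (const fragment of response.text) {
    stream.markdown(fragment);
  }
  return {};
};

export function activate(context: vscode.ExtensionContext) {
  const participant = vscode.chat.createChatParticipant(PARTICIPANT_ID, handler);
  context.subscriptions.push(participant);
}
```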
Since I cannot seem to make it work with the latest VS Code update, let's wait a few weeks until it moves out of Insiders and into the standard VS Code version before we open it up.
…elcome view. Closes #334 (#392)

## 🎯 Aim
The aim is to update the vscode types and overall approach to the latest version, then recheck and clean up what changed. With the upgraded vscode we could use the latest improvements for model picking and for loading history in the chat command. I checked that, and unfortunately models other than gpt-4o return very different responses that the current logic cannot parse. Loading history also seems to be missing some parts we get with the current implementation, so those parts are not changed. Lastly, I added an action to activate SPFx Copilot from the welcome view.

## 📷 Result
![image](https://github.com/user-attachments/assets/8f676a95-62a2-4475-a946-b38ce2e2c811)

## ✅ What was done
- [X] Updated vscode engine and typings to the latest version
- [X] Refactored the auth provider
- [X] Rechecked the current chat logic implementation
- [X] Added the Copilot action to the welcome view

## 🔗 Related Issue
Closes: #334
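For reference, a hedged sketch of how a participant could keep pinning gpt-4o (since, per the PR notes, other model families produce responses the current parsing logic cannot handle), assuming the `vscode.lm.selectChatModels` API; the helper name is made up for illustration:

```typescript
import * as vscode from 'vscode';

// Pin the Copilot model family to gpt-4o so responses stay parseable,
// instead of using whichever model the user picked in the chat dropdown.
// getPinnedModel is a hypothetical helper name, not the extension's real API.
async function getPinnedModel(): Promise<vscode.LanguageModelChat | undefined> {
  const [model] = await vscode.lm.selectChatModels({
    vendor: 'copilot',
    family: 'gpt-4o',
  });
  return model; // undefined when no matching model is available to the user
}
```

A handler could fall back to `request.model` when gpt-4o is unavailable, but as the PR notes, other models' responses would not parse with the current logic.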
🎯 Aim of the feature
GitHub Copilot recently added (currently in preview) the ability to select which LLM model should be used for a response.
We should check whether we can adapt the SPFx Toolkit chat participant to use it as well. Currently we use GPT-4o.
📷 Images (if possible) with expected result
No response
🤔 Additional remarks or comments
No response