API: Using `sendRequest` and getting cut off response (including in chat-sample) #1358
More requests that finish but are seemingly cut off.

I have recreated this issue with the vscode-extension-samples chat-sample: https://github.com/microsoft/vscode-extension-samples/tree/main/chat-sample
Copilot Diagnostics (collapsed output; sections: Environment, Feature Flags, Node setup, Network Configuration, Reachability, VS Code Configuration, Extensions, Authentication)
So you're seeing the issue with the chat-sample, with no modifications to it? A couple of days later, is this still happening? It could have been a transient service issue. It looks like you found the "Copilot: Collect Diagnostics" command, thanks. Could you also run "Developer: GitHub Copilot Chat Diagnostics"? I'm not sure what else to look at; any ideas @jrieken?
What version of VS Code and Copilot Chat are you using? Around a week ago there was a bug in this area which has since been fixed.
My Copilot responses are being cut off when using the API to implement my chat. It doesn't seem to happen every time.
Code
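For context, the chat-sample drives Copilot roughly like this: pick a model via `vscode.lm.selectChatModels`, call `model.sendRequest(messages, {}, token)`, then stream `response.text` into the chat. Below is a minimal sketch of the streaming-consumption step; since the `vscode` module only exists inside the extension host, the actual API calls appear as comments and a generic `AsyncIterable<string>` stands in for `response.text`:

```typescript
// Accumulate a streamed chat response. In an extension handler you would do:
//   const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot', family: 'gpt-4o' });
//   const response = await model.sendRequest(messages, {}, token);
//   for await (const fragment of response.text) { stream.markdown(fragment); }
// Here `chunks` stands in for `response.text` so the pattern runs anywhere.
async function collectResponse(chunks: AsyncIterable<string>): Promise<string> {
  let full = '';
  for await (const fragment of chunks) {
    full += fragment; // each fragment is a partial piece of the reply
  }
  return full; // a truncated reply simply stops yielding fragments here
}

// Fake stream for illustration.
async function* fakeStream(): AsyncIterable<string> {
  yield 'Hello, ';
  yield 'world';
}

collectResponse(fakeStream()).then((r) => console.log(r)); // prints "Hello, world"
```

Note that a cut-off response looks no different from a complete one at this layer: the iterable just ends early, which is why the truncation is hard to detect client-side.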
More details with screenshots
When using standard Copilot, it does not cut off. Could it be that my input/context is too big?
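One way to check the context-size theory is through the API itself: a `LanguageModelChat` exposes `maxInputTokens` and `countTokens()`, so the prompt can be measured against the limit before sending. A sketch, with the `vscode` calls shown as comments and a rough chars-per-token heuristic (an assumption, not the real tokenizer) standing in so the snippet runs anywhere:

```typescript
// In an extension you would do:
//   const used = await model.countTokens(promptText);
//   if (used > model.maxInputTokens) { /* trim messages/context */ }
// Rough stand-in heuristic (~4 characters per token) for illustration:
function roughTokenCount(text: string): number {
  return Math.ceil(text.length / 4);
}

// Returns true if the prompt plausibly fits within the model's input budget.
function fitsInContext(text: string, maxInputTokens: number): boolean {
  return roughTokenCount(text) <= maxInputTokens;
}

console.log(fitsInContext('x'.repeat(8000), 4096)); // ~2000 tokens -> prints "true"
```

If the prompt already fits well under the limit, oversized input is unlikely to be the cause of the truncation.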
Further progress: tried a different model, but hit the same issue.
Took out all my User messages / context and still getting the same issue.
With logs: