

[Bug] llmdeploy: backend error when sending OpenAI-style prompt requests #1939

Closed
2 tasks done
hitzhu opened this issue Jul 6, 2024 · 3 comments
Comments

@hitzhu

hitzhu commented Jul 6, 2024

Checklist

  • 1. I have searched related issues but cannot get the expected help.
  • 2. The bug has not been fixed in the latest version.

Describe the bug

I deployed InternVL with llmdeploy and sent a request using the official OpenAI-style prompt demo; the backend raised an error.
(three screenshots of the backend error were attached)
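For context, an OpenAI-style multimodal request like the one the official demo sends can be sketched as below. This is a minimal illustration, assuming lmdeploy's OpenAI-compatible `/v1/chat/completions` endpoint; the model name and image URL are placeholders, not values from this issue.

```python
import json

def build_openai_vision_messages(prompt: str, image_url: str) -> list:
    """Build OpenAI-style chat messages mixing text and an image URL,
    as used by multimodal (vision) chat-completion requests."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }
    ]

# Hypothetical payload; query the server's /v1/models for the real model name.
payload = {
    "model": "internvl-chat",  # placeholder
    "messages": build_openai_vision_messages(
        "Describe this image.",
        "https://example.com/demo.png",  # placeholder image URL
    ),
}
print(json.dumps(payload, indent=2))
```

POSTing this JSON body to the server's `/v1/chat/completions` route is what triggered the backend error reported here.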

Reproduction

As titled (see the bug description above).

Environment

As titled.

Error traceback

No response

@lvhan028
Collaborator

lvhan028 commented Jul 7, 2024

The InternVL2 feature has been implemented on the latest main branch but has not been released yet. It will ship in v0.5.1, planned for next week.

@zhyncs
Collaborator

zhyncs commented Jul 7, 2024

@hitzhu You can install the latest whl from https://github.com/zhyncs/lmdeploy-build/releases/tag/ab5b7ce

@hitzhu hitzhu closed this as completed Jul 9, 2024
@hitzhu
Author

hitzhu commented Jul 9, 2024

Resolved. Modifying the source code according to the commit history fixed it.
