
how to implement batch_chat for videochat2 benchmark? #187

Open
dragen1860 opened this issue Jun 4, 2024 · 1 comment

Comments

@dragen1860

Dear author:
Thanks for the really solid work on VideoChat2.
Recently I fine-tuned a model on my custom dataset and want to evaluate it on my benchmark, which has up to 30K samples. It is extremely time-consuming to chat with one sample at a time and then compute the accuracy. Does anyone know how to implement a batch_chat-style parallel inference function, so that I can run inference on several samples at once? Thanks.
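Until an official batch_chat lands, one common workaround is to group the benchmark prompts into fixed-size batches and run one batched forward pass per group. A minimal sketch of that batching loop (the `model_chat_batch` call below is a hypothetical stand-in for a real batched inference call, e.g. vLLM's `LLM.generate`; VideoChat2's actual API may differ):

```python
from typing import Iterator, List


def batched(items: List[str], batch_size: int) -> Iterator[List[str]]:
    """Yield successive fixed-size batches from a list of prompts."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]


def run_inference(prompts: List[str], batch_size: int = 8) -> List[str]:
    """Run a (stubbed) batched chat over all prompts and collect answers."""

    def model_chat_batch(batch: List[str]) -> List[str]:
        # Hypothetical placeholder: a real implementation would tokenize the
        # batch (left-padded for decoder-only models) and call the model once.
        return [f"answer to: {p}" for p in batch]

    answers: List[str] = []
    for batch in batched(prompts, batch_size):
        answers.extend(model_chat_batch(batch))
    return answers
```

With 30K samples and a batch size of 8, this reduces ~30K sequential chat calls to ~3,750 batched ones; the actual speedup depends on how well the backend parallelizes within a batch.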

@Andy1621 (Collaborator) commented Jun 4, 2024

Hi! We will release a vLLM version for batch_chat soon.
