kwargs for torch.compile in BaseHandler #2796

Merged: 5 commits into pytorch:master, Nov 16, 2023

Conversation

eballesteros (Contributor)

Description

Enables passing an arbitrary kwargs dict to torch.compile in BaseHandler, as discussed in issue #2783.
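
For context, the shape of the change is roughly the following (a minimal sketch, not the exact diff; compile_with_config is an illustrative helper name, and the pt2 key follows TorchServe's model-config convention):

from typing import Any, Dict, Union

import torch


def compile_with_config(model: torch.nn.Module,
                        pt2_config: Union[str, Dict[str, Any]]) -> torch.nn.Module:
    # A plain str keeps the old behavior: it just names the backend.
    if isinstance(pt2_config, str):
        kwargs = {"backend": pt2_config}
    # A dict is forwarded as-is, so any torch.compile kwarg
    # (backend, mode, fullgraph, ...) can be set from model-config.yaml.
    elif isinstance(pt2_config, dict):
        kwargs = dict(pt2_config)
    else:
        raise ValueError(f"Unsupported pt2 config: {pt2_config!r}")
    return torch.compile(model, **kwargs)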

Type of change

  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Feature/Issue validation/testing

Modified test_torch_compile.py to register two models:

  • one configured with the backend as a str
  • one configured with backend and mode in a dict (example configs sketched below)
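
For illustration, the two configurations could look like this (assuming the pt2 key in each model's model-config.yaml; the actual test configs are not reproduced here):

# first model: backend as a plain str (previous behavior, still supported)
pt2: inductor
---
# second model: backend and mode in a dict, forwarded as kwargs to torch.compile
pt2:
  backend: inductor
  mode: reduce-overhead
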
>> pytest serve/test/pytest/test_torch_compile.py

=========================================================================================== test session starts ============================================================================================
platform linux -- Python 3.9.18, pytest-7.4.3, pluggy-1.3.0
rootdir: /home/model-server/serve
collected 5 items

serve/test/pytest/test_torch_compile.py .....                                                                                                                                                        [100%]

============================================================================================= warnings summary =============================================================================================
serve/test/pytest/test_torch_compile.py:11
  /home/model-server/serve/test/pytest/test_torch_compile.py:11: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import packaging

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================================================================================= 5 passed, 1 warning in 85.48s (0:01:25) ==================================================================================

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

@eballesteros (Contributor, Author)

A couple of notes on the testing:

  • I locally commented out the skip mark in test_serve_inference to check that it passes
  • if test_serve_inference is skipped, I'm fairly sure the other tests in test_torch_compile.py will pass even if handler initialization fails, since:
    • the status will still be healthy in test_server_status
    • the model(s) will still appear as registered in test_registered_model. Their worker status would not be READY, but this is not checked at the moment (a possible check is sketched below)
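
A worker-status assertion along these lines could close that gap (a sketch, assuming TorchServe's management/DescribeModel API on localhost:8081; assert_workers_ready and the use of the requests library are illustrative):

import requests


def assert_workers_ready(model_name: str,
                         management_url: str = "http://localhost:8081") -> None:
    # DescribeModel returns a list of model versions, each with its workers.
    resp = requests.get(f"{management_url}/models/{model_name}")
    resp.raise_for_status()
    for version in resp.json():
        for worker in version["workers"]:
            # A worker whose handler failed to initialize never reaches READY.
            assert worker["status"] == "READY", \
                f"worker {worker['id']} is {worker['status']}"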

msaroufim self-requested a review on November 16, 2023, 04:11
@msaroufim (Member)

Very cool, thank you for this. If regression tests pass we should be good to go. If you're planning on contributing more, let me know and share a good email; I can invite you to the PyTorch Slack where we can chat with higher bandwidth.

@msaroufim (Member)

The errors look legit; check out the details on pre-commit in our contributing.md.

@eballesteros (Contributor, Author)

The errors look legit; check out the details on pre-commit in our contributing.md.

My bad; it looks like the type annotations I added in the test helper are not supported in Python 3.8, so I got rid of them. I also installed the pre-commit hooks and fixed the linting.
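
For context, the incompatibility is most likely PEP 585 built-in generics, which only work on Python 3.9+ (the exact annotations from the PR are not shown in the thread, so the functions below are assumed illustrations):

from typing import Dict, List


# Python 3.9+ accepts built-in generics directly:
def load_kwargs_39(raw: dict[str, str]) -> list[str]:
    return list(raw)


# On Python 3.8 the annotation above raises
# TypeError: 'type' object is not subscriptable at definition time;
# the typing aliases work on both versions:
def load_kwargs_38(raw: Dict[str, str]) -> List[str]:
    return list(raw)

(The pre-commit side is the standard workflow: pre-commit install once, then pre-commit run --all-files.)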

If you're planning on contributing more, let me know and share a good email; I can invite you to the PyTorch Slack where we can chat with higher bandwidth.

Sure, sounds good! I think I just sent an email to one of your addresses; let me know if you can't find it.

msaroufim added this pull request to the merge queue on Nov 16, 2023
Merged via the queue into pytorch:master with commit a8ca657 on Nov 16, 2023
13 checks passed
chauhang added this to the v0.10.0 milestone on Feb 27, 2024