add omp in env for torchrun and update doc #2320

Merged: 18 commits into master on May 10, 2023

Conversation

@lxning (Collaborator) commented May 4, 2023

Description

Please read our CONTRIBUTING.md prior to creating your first pull request.

Please include a summary of the feature or issue being fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.

Fixes #(issue)

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Feature/Issue validation/testing

Please describe the Unit or Integration tests that you ran to verify your changes and relevant result summary. Provide instructions so it can be reproduced.
Please also list any relevant details for your test configuration.

  • Test A
    Logs for Test A

  • Test B
    Logs for Test B

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

codecov bot commented May 4, 2023

Codecov Report

Merging #2320 (4377f9e) into master (2f1f52f) will increase coverage by 0.02%.
The diff coverage is 0.00%.

❗ Current head 4377f9e differs from the pull request's most recent head c42bafe. Consider uploading reports for commit c42bafe to get more accurate results.

@@            Coverage Diff             @@
##           master    #2320      +/-   ##
==========================================
+ Coverage   69.37%   69.39%   +0.02%     
==========================================
  Files          77       77              
  Lines        3438     3441       +3     
  Branches       57       57              
==========================================
+ Hits         2385     2388       +3     
  Misses       1050     1050              
  Partials        3        3              
Impacted Files Coverage Δ
ts/model_service_worker.py 64.33% <0.00%> (-1.38%) ⬇️

... and 1 file with indirect coverage changes


@chauhang (Contributor) left a comment

Thanks Li for the updates. A few comments:

  1. It would be good to move Download_models.py one level up as a common file after Pippy deferred init (#2310) is merged.

  2. For the Large model best practices, we should expand and provide more details on model packaging, threads, and other configs, as well as troubleshooting tips. This can be done as a separate PR.

@HamidShojanazeri (Collaborator) left a comment

@lxning added some minor suggestions.

docs/large_model_inference.md: 4 resolved review comments (outdated)
@@ -1,2 +1,2 @@
-transformers
 deepspeed
+transformers==4.28.1
Contributor left a comment

Is there a reason for pinning the version numbers? Can we do >= instead?

@lxning (Collaborator, Author) replied May 10, 2023

The version combinations of DeepSpeed and transformers are not stable; some combinations do not work. So far this pinned combination works for all the models we have tested.
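
For illustration only, a minimal sketch of a runtime guard that fails fast when the installed transformers version drifts from the tested pin. The 4.28.1 value comes from the requirements diff above; the guard itself is hypothetical and is not part of this PR.

# Hypothetical guard, not code from this PR: fail fast if the installed
# transformers package differs from the combination validated with DeepSpeed.
import importlib.metadata

TESTED_TRANSFORMERS = "4.28.1"  # pinned in the requirements.txt change above

installed = importlib.metadata.version("transformers")
if installed != TESTED_TRANSFORMERS:
    raise RuntimeError(
        f"transformers {installed} is installed, but only "
        f"{TESTED_TRANSFORMERS} has been validated with this example"
    )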

@chauhang (Contributor) left a comment

Left a few comments, please check.

@chauhang (Contributor) left a comment

For OMP_NUM_THREADS, please add documentation for how users can set this value.
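
As background for that documentation request, a minimal sketch of one way a user can control OMP_NUM_THREADS when workers are launched through torchrun. The subprocess wrapper and the script name model_entrypoint.py are hypothetical and are not taken from this PR.

# Hypothetical example: pass OMP_NUM_THREADS through the environment so that
# OpenMP-backed ops in each worker pick up the intended thread count.
import os
import subprocess

env = os.environ.copy()
# Respect a user-provided value; otherwise default to 1 thread per worker
# (torchrun itself also defaults OMP_NUM_THREADS to 1 when it is unset).
env.setdefault("OMP_NUM_THREADS", "1")

subprocess.run(
    ["torchrun", "--nproc_per_node", "2", "model_entrypoint.py"],
    env=env,
    check=True,
)

Equivalently, a user can export OMP_NUM_THREADS in the shell before starting the server, since child processes inherit the environment.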

@lxning merged commit 4fe5273 into master on May 10, 2023
8 of 9 checks passed

3 participants