
Multi-GPU Inference for the SAT Version #371

Open
ZhengxuYan opened this issue Sep 27, 2024 · 0 comments
Feature request

Can inference be run on multiple GPUs for the SAT version? The repository includes scripts for fine-tuning the SAT version on multiple GPUs, but it is unclear whether that multi-GPU support extends to inference as well.

Motivation

This request stems from out-of-memory (OOM) errors encountered during inference. Distributing inference across multiple GPUs would relieve the memory pressure that causes these errors and could also reduce processing times, improving both usability and performance.

Your contribution

N/A

@zRzRzRzRzRzRzR zRzRzRzRzRzRzR self-assigned this Sep 27, 2024