
Transfer skorch models from CUDA to CPU for inference #1096

Open
lukethomrichardson opened this issue Feb 20, 2025 · 1 comment

Comments


lukethomrichardson commented Feb 20, 2025

Hi all, I am training skorch models locally in a CUDA-enabled torch environment and, if possible, I would like to transfer the entire model to CPU so that it can be registered and used for inference in a CPU-only environment. Is there a best method for accomplishing this?

I'm pretty new to skorch and deep learning, so I'm not sure this is even possible. If it is, a skorch helper method for converting a model to CPU would be a nice-to-have feature.

Edit: Just noticed a very similar (old) issue that was still open at the time of posting (#553). The conversation there didn't seem to be fully resolved. Let me know if I should post there or if reviving the topic here would be preferable.

BenjaminBossan (Collaborator) commented

If you train a model on GPU, save it, and then load it on a machine without a GPU, it should already work: the model is automatically transferred to CPU. Please give this a try and tell us if you encounter problems.

The thread you cited is a bit different, as it is about changing the device within the same process.
