cross_val_score and GPU (not memory leakage case) #11298
Comments
The error comes from […]. Longer explanation: […]

cc @dantegd
@trivialfis thank you, I've got the underlying reason for the error and was looking for a natural way to use the sklearn wrappers to combine an XGBoost model on GPU with useful CPU-only sklearn methods like cross_validate and grid search. My idea was to make a pipeline with custom transformers (…).
But I didn't find a way to make a custom transformer for targets, and I'm not sure I can use a transformer in a pipeline after the estimator. So I decided to code cross-validation myself, and even found a ready-to-copy example in the documentation. I still don't know how to do XGBClassifier on GPU + grid search, though.
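The manual cross-validation loop described above can be sketched roughly as follows. This is a minimal CPU-only sketch: a `DecisionTreeClassifier` stands in for the real estimator so it runs without a GPU; in the actual setup `XGBClassifier(device="cuda")` from XGBoost's sklearn wrapper would take its place.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.tree import DecisionTreeClassifier  # stand-in for XGBClassifier(device="cuda")
from sklearn.metrics import accuracy_score

# Synthetic data for illustration only
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Hand-rolled cross-validation: fit a fresh model per fold, score on the held-out part
scores = []
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(X, y):
    model = DecisionTreeClassifier(random_state=0)
    model.fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], model.predict(X[test_idx])))

print(len(scores))  # one score per fold: 5
```

Because each fold is fit explicitly, nothing in this loop requires sklearn itself to understand GPU arrays; the estimator alone decides where the computation happens.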
For now, you can put the data on the CPU and let XGBoost make the necessary copy. Yes, there are warnings from XGBoost during calls to […]
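The workaround suggested above (keep the input arrays on the CPU and let XGBoost copy them to the GPU internally) looks roughly like this. A CPU `DecisionTreeClassifier` stands in so the snippet runs without a GPU; the commented-out lines show where XGBoost's `XGBClassifier(device="cuda")` would plug in.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# CPU (NumPy) arrays — not CuPy — so sklearn's validation is happy
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(int)

# Real setup:
#   from xgboost import XGBClassifier
#   model = XGBClassifier(device="cuda")  # trains on GPU, copies CPU inputs itself
model = DecisionTreeClassifier(random_state=0)

cv_scores = cross_val_score(model, X, y, cv=5)
print(cv_scores.shape)  # (5,)
```

sklearn only ever sees NumPy arrays here, so `cross_val_score` works unchanged; the CPU-to-GPU copy happens inside the estimator's `fit`.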
I'm closing this as it relates to sklearn's ability to handle GPU data. Feel free to reopen if you have further questions.
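For the grid-search question raised earlier in the thread, the same CPU-data workaround applies: `GridSearchCV` can drive a GPU-trained model as long as the arrays passed in live on the CPU. A sketch, with an illustrative parameter grid and a CPU stand-in estimator so it runs without a GPU:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(int)

# Real setup would be:
#   from xgboost import XGBClassifier
#   estimator = XGBClassifier(device="cuda")
#   param_grid = {"max_depth": [3, 6], "n_estimators": [100, 300]}
estimator = DecisionTreeClassifier(random_state=0)
param_grid = {"max_depth": [2, 4]}

# GridSearchCV cross-validates every parameter combination on the CPU arrays
search = GridSearchCV(estimator, param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

The search loop itself stays on the CPU; only each inner `fit`/`predict` call would run on the GPU when an XGBoost estimator with `device="cuda"` is used.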
I'm trying to use sklearn's cross_val_score with XGBClassifier while both the model and the data are on the GPU, and Python asks me to move something to the CPU explicitly. What exactly do I have to move to the CPU (model, x_all, y_all, weights, ...), and will my cross-validation procedure still run on the GPU after I do it?
Selected lines of the code describing the model-selection part:

The error occurs on:

```python
cv_metrics_list = cross_val_score(...
```