Commit 626416c

update the main branch for 2506 release (#550)

update the main branch for 2506 release. Please create a merge commit, not squash.

2 parents e538734 + 9a74df1 commit 626416c

33 files changed: +1418 additions, -1345 deletions


docs/get-started/xgboost-examples/csp/databricks/databricks.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -21,7 +21,7 @@ Navigate to your home directory in the UI and select **Create** > **File** from
 create an `init.sh` scripts with contents:
 ```bash
 #!/bin/bash
-sudo wget -O /databricks/jars/rapids-4-spark_2.12-25.04.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.04.0/rapids-4-spark_2.12-25.04.0.jar
+sudo wget -O /databricks/jars/rapids-4-spark_2.12-25.06.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.06.0/rapids-4-spark_2.12-25.06.0.jar
 ```
 1. Select the Databricks Runtime Version from one of the supported runtimes specified in the
 Prerequisites section.
@@ -68,7 +68,7 @@ create an `init.sh` scripts with contents:
 ```bash
 spark.rapids.sql.python.gpu.enabled true
 spark.python.daemon.module rapids.daemon_databricks
-spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-25.04.0.jar:/databricks/spark/python
+spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-25.06.0.jar:/databricks/spark/python
 ```
 Note that since python memory pool require installing the cudf library, so you need to install cudf library in
 each worker nodes `pip install cudf-cu11 --extra-index-url=https://pypi.nvidia.com` or disable python memory pool
````
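The same version string appears in both hunks of this file. As a hedged sketch (variable names are illustrative, not from the repo), deriving the jar name and download URL from one version variable would keep future bumps like this to a single-line change:

```shell
#!/bin/bash
# Illustrative sketch only: build the plugin jar name and its Maven Central
# URL from one version variable, so a release bump edits a single line.
SPARK_RAPIDS_VERSION=25.06.0
SCALA_BINARY=2.12
RAPIDS_JAR="rapids-4-spark_${SCALA_BINARY}-${SPARK_RAPIDS_VERSION}.jar"
RAPIDS_URL="https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_${SCALA_BINARY}/${SPARK_RAPIDS_VERSION}/${RAPIDS_JAR}"
echo "${RAPIDS_URL}"
```

The init script would then fetch with `sudo wget -O /databricks/jars/${RAPIDS_JAR} ${RAPIDS_URL}`, matching the command in the diff.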

docs/get-started/xgboost-examples/csp/databricks/init.sh

Lines changed: 4 additions & 2 deletions

```diff
@@ -1,4 +1,5 @@
-# Copyright (c) 2024, NVIDIA CORPORATION.
+#
+# Copyright (c) 2025, NVIDIA CORPORATION.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
@@ -11,11 +12,12 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+#
 
 sudo rm -f /databricks/jars/spark--maven-trees--ml--10.x--xgboost-gpu--ml.dmlc--xgboost4j-gpu_2.12--ml.dmlc__xgboost4j-gpu_2.12__1.5.2.jar
 sudo rm -f /databricks/jars/spark--maven-trees--ml--10.x--xgboost-gpu--ml.dmlc--xgboost4j-spark-gpu_2.12--ml.dmlc__xgboost4j-spark-gpu_2.12__1.5.2.jar
 
-sudo wget -O /databricks/jars/rapids-4-spark_2.12-25.04.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.04.0/rapids-4-spark_2.12-25.04.0.jar
+sudo wget -O /databricks/jars/rapids-4-spark_2.12-25.06.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.06.0/rapids-4-spark_2.12-25.06.0.jar
 sudo wget -O /databricks/jars/xgboost4j-gpu_2.12-1.7.1.jar https://repo1.maven.org/maven2/ml/dmlc/xgboost4j-gpu_2.12/1.7.1/xgboost4j-gpu_2.12-1.7.1.jar
 sudo wget -O /databricks/jars/xgboost4j-spark-gpu_2.12-1.7.1.jar https://repo1.maven.org/maven2/ml/dmlc/xgboost4j-spark-gpu_2.12/1.7.1/xgboost4j-spark-gpu_2.12-1.7.1.jar
 ls -ltr
```
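The script above removes the Databricks-bundled XGBoost jars before downloading replacements, but the wget lines only overwrite the exact new filename. A minimal sketch (a temp directory stands in for /databricks/jars; this is not part of the repo) of clearing any stale plugin release first:

```shell
#!/bin/bash
# Hedged sketch: wildcard-delete any previous rapids-4-spark release before
# placing the new jar, so two plugin versions never share the classpath.
JARS_DIR="$(mktemp -d)"                                # stand-in for /databricks/jars
touch "${JARS_DIR}/rapids-4-spark_2.12-25.04.0.jar"    # simulate a stale jar
rm -f "${JARS_DIR}"/rapids-4-spark_2.12-*.jar          # clears every old release
touch "${JARS_DIR}/rapids-4-spark_2.12-25.06.0.jar"    # where wget would write the new jar
ls "${JARS_DIR}"
```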

docs/get-started/xgboost-examples/csp/dataproc/gcp.md

Lines changed: 2 additions & 1 deletion

````diff
@@ -187,4 +187,5 @@ gcloud dataproc clusters create $CLUSTER_NAME \
 --subnet=default
 ```
 
-The new cluster should be up and running within 3-4 minutes!
+The new cluster should be up and running within 3-4 minutes!
+
````

(The visible text is unchanged; this hunk adds a trailing newline at the end of the file.)
docs/get-started/xgboost-examples/on-prem-cluster/kubernetes-scala.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -40,7 +40,7 @@ export SPARK_DOCKER_IMAGE=<gpu spark docker image repo and name>
 export SPARK_DOCKER_TAG=<spark docker image tag>
 
 pushd ${SPARK_HOME}
-wget https://github.com/NVIDIA/spark-rapids-examples/raw/branch-25.04/dockerfile/Dockerfile
+wget https://github.com/NVIDIA/spark-rapids-examples/raw/branch-25.06/dockerfile/Dockerfile
 
 # Optionally install additional jars into ${SPARK_HOME}/jars/
 
```
docs/get-started/xgboost-examples/prepare-package-data/preparation-python.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ For simplicity export the location to these jars. All examples assume the packag
 ### Download the jars
 
 Download the RAPIDS Accelerator for Apache Spark plugin jar
-* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.04.0/rapids-4-spark_2.12-25.04.0.jar)
+* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.06.0/rapids-4-spark_2.12-25.06.0.jar)
 
 ### Build XGBoost Python Examples
 
```
docs/get-started/xgboost-examples/prepare-package-data/preparation-scala.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ For simplicity export the location to these jars. All examples assume the packag
 ### Download the jars
 
 1. Download the RAPIDS Accelerator for Apache Spark plugin jar
-* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.04.0/rapids-4-spark_2.12-25.04.0.jar)
+* [RAPIDS Spark Package](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.06.0/rapids-4-spark_2.12-25.06.0.jar)
 
 ### Build XGBoost Scala Examples
 
```
docs/img/guides/tpcds.png

Binary file changed: -32 Bytes
examples/ML+DL-Examples/Optuna-Spark/README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -147,8 +147,8 @@ We use [RAPIDS](https://docs.rapids.ai/install/#get-rapids) for GPU-accelerated
 ``` shell
 sudo apt install libmysqlclient-dev
 
-conda create -n rapids-25.04 -c rapidsai -c conda-forge -c nvidia \
-cudf=25.04 cuml=25.04 python=3.10 'cuda-version>=12.0,<=12.5'
+conda create -n rapids-25.06 -c rapidsai -c conda-forge -c nvidia \
+cudf=25.06 cuml=25.06 python=3.10 'cuda-version>=12.0,<=12.5'
 conda activate optuna-spark
 pip install mysqlclient
 pip install optuna joblib joblibspark ipywidgets
````
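Two version formats are in play in this commit: the conda packages pin MAJOR.MINOR (25.06) while the plugin jar uses MAJOR.MINOR.PATCH (25.06.0). A hedged sketch (variable names are hypothetical) of deriving both from one release string so they cannot drift apart during a bump:

```shell
#!/bin/bash
# Illustrative only: one release string feeds both the conda pins and the
# jar version, keeping a bump like 25.04 -> 25.06 consistent everywhere.
RAPIDS_RELEASE=25.06
SPARK_RAPIDS_VERSION="${RAPIDS_RELEASE}.0"   # jar artifacts carry a patch digit
echo "cudf=${RAPIDS_RELEASE} cuml=${RAPIDS_RELEASE} jar=${SPARK_RAPIDS_VERSION}"
```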

examples/ML+DL-Examples/Optuna-Spark/optuna-examples/databricks/init_optuna.sh

Lines changed: 16 additions & 2 deletions

```diff
@@ -1,5 +1,19 @@
 #!/bin/bash
-# Copyright (c) 2024, NVIDIA CORPORATION.
+#
+# Copyright (c) 2025, NVIDIA CORPORATION.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
 
 set -x
 
@@ -41,7 +55,7 @@ fi
 
 
 # rapids import
-SPARK_RAPIDS_VERSION=25.04.0
+SPARK_RAPIDS_VERSION=25.06.0
 curl -L https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/${SPARK_RAPIDS_VERSION}/rapids-4-spark_2.12-${SPARK_RAPIDS_VERSION}.jar -o \
 /databricks/jars/rapids-4-spark_2.12-${SPARK_RAPIDS_VERSION}.jar
 
```
examples/ML+DL-Examples/Optuna-Spark/optuna-examples/databricks/start_cluster.sh

Lines changed: 16 additions & 2 deletions

```diff
@@ -1,5 +1,19 @@
 #!/bin/bash
-# Copyright (c) 2024, NVIDIA CORPORATION.
+#
+# Copyright (c) 2025, NVIDIA CORPORATION.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
 
 if [[ -z ${INIT_PATH} ]]; then
 echo "Please export INIT_PATH per README.md"
@@ -12,7 +26,7 @@ json_config=$(cat <<EOF
 "spark_version": "13.3.x-gpu-ml-scala2.12",
 "spark_conf": {
 "spark.task.resource.gpu.amount": "1",
-"spark.executorEnv.PYTHONPATH": "/databricks/jars/rapids-4-spark_2.12-25.04.0.jar:/databricks/spark/python:/databricks/python3",
+"spark.executorEnv.PYTHONPATH": "/databricks/jars/rapids-4-spark_2.12-25.06.0.jar:/databricks/spark/python:/databricks/python3",
 "spark.executor.cores": "8",
 "spark.rapids.memory.gpu.minAllocFraction": "0.0001",
 "spark.plugins": "com.nvidia.spark.SQLPlugin",
```
