Commit 43dd3ae

Merge branch-25.08 into main (NVIDIA#13294)
Note: merge this PR with **Create a merge commit to merge**.
2 parents 4da10ee + f871a86 commit 43dd3ae

File tree

2 files changed: +6 / -4 lines

- CHANGELOG.md
- integration_tests/src/main/python/timezones.py

CHANGELOG.md

Lines changed: 3 additions & 2 deletions

@@ -1,5 +1,5 @@
 # Change log
-Generated on 2025-08-09
+Generated on 2025-08-11

 ## Release 25.08

@@ -74,7 +74,6 @@ Generated on 2025-08-09
 |[#12969](https://github.com/NVIDIA/spark-rapids/issues/12969)|[BUG] Multiple test failures in `src.main.python.hash_aggregate_test` across various CI jobs|
 |[#11047](https://github.com/NVIDIA/spark-rapids/issues/11047)|Revisit the ANSI tests which is enabled by default in Spark 4.0.0|
 |[#13096](https://github.com/NVIDIA/spark-rapids/issues/13096)|[BUG] [Spark-4.0] NDS query_94 and query_95 are failing with IllegalArgumentException|
-|[#12883](https://github.com/NVIDIA/spark-rapids/issues/12883)|[BUG] HostAllocSuite failed split should not happen immediately after fallback on memory contention failed|
 |[#13018](https://github.com/NVIDIA/spark-rapids/issues/13018)|[BUG] AppendDataExecV1 falls back when running CTAS/RTAS on Delta 3.3.x|
 |[#13022](https://github.com/NVIDIA/spark-rapids/issues/13022)|[BUG] GpuRowToColumnarExec with RequireSingleBatch allocates a large amount of memory|
 |[#12857](https://github.com/NVIDIA/spark-rapids/issues/12857)|[BUG] GPU generate different output as CPU|
@@ -129,6 +128,8 @@ Generated on 2025-08-09
 ### PRs
 |||
 |:---|:---|
+|[#13286](https://github.com/NVIDIA/spark-rapids/pull/13286)|Temporarily disable timezone America/Coyhaique to unblock branch-25.08 release|
+|[#13282](https://github.com/NVIDIA/spark-rapids/pull/13282)|Update changelog for the v25.08 release [skip ci]|
 |[#13280](https://github.com/NVIDIA/spark-rapids/pull/13280)|Fix fallback test params for Delta MergeCommand, UpdateCommand, and DeleteCommand|
 |[#13258](https://github.com/NVIDIA/spark-rapids/pull/13258)|Update changelog for the v25.08 release [skip ci]|
 |[#13257](https://github.com/NVIDIA/spark-rapids/pull/13257)|Update dependency version JNI, private, hybrid to 25.08.0|

integration_tests/src/main/python/timezones.py

Lines changed: 3 additions & 2 deletions

@@ -1,4 +1,4 @@
-# Copyright (c) 2025-2025, NVIDIA CORPORATION.
+# Copyright (c) 2025, NVIDIA CORPORATION.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
@@ -21,4 +21,5 @@

 # Dynamically get supported timezones from JVM.
 # Different JVMs can have different timezones, should not use a constant list here.
-all_timezones = spark_jvm().java.time.ZoneId.getAvailableZoneIds()
+# Note: excludes `America/Coyhaique`, refer to bug: https://github.com/NVIDIA/spark-rapids/issues/13285
+all_timezones = [tz for tz in spark_jvm().java.time.ZoneId.getAvailableZoneIds() if tz != 'America/Coyhaique']
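
For reference, the filtered timezone list can be reproduced outside the integration-test harness with a plain PySpark session. This is a minimal sketch, not part of the commit: `spark._jvm` is assumed here as a stand-in for the test helper `spark_jvm()`, and `America/Coyhaique` is the zone the commit temporarily excludes (see issue #13285).

```python
# Minimal sketch (assumes a local PySpark installation); mirrors the change in
# timezones.py using spark._jvm instead of the test helper spark_jvm().
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").getOrCreate()

# java.time.ZoneId.getAvailableZoneIds() returns a java.util.Set; py4j exposes
# it as an iterable collection of Python strings.
zone_ids = spark._jvm.java.time.ZoneId.getAvailableZoneIds()

# Exclude America/Coyhaique, the zone temporarily disabled by this commit
# (https://github.com/NVIDIA/spark-rapids/issues/13285).
all_timezones = [tz for tz in zone_ids if tz != 'America/Coyhaique']

print(len(all_timezones), "timezones available to the tests")
spark.stop()
```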
