No Possible Issues Found via Git Search

renaissance-dec-tree_0

[2026-01-10T18:20:12.056Z] Running test renaissance-dec-tree_0 ...
[2026-01-10T18:20:12.056Z] ===============================================
[2026-01-10T18:20:12.394Z] renaissance-dec-tree_0 Start Time: Sat Jan 10 18:20:12 2026 Epoch Time (ms): 1768069212068
[2026-01-10T18:20:12.394Z] variation: NoOptions
[2026-01-10T18:20:12.394Z] JVM_OPTIONS:
[2026-01-10T18:20:12.394Z] { \
[2026-01-10T18:20:12.394Z] echo ""; echo "TEST SETUP:"; \
[2026-01-10T18:20:12.394Z] echo "Nothing to be done for setup."; \
[2026-01-10T18:20:12.394Z] mkdir -p "/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/../TKG/output_17680672638172/renaissance-dec-tree_0"; \
[2026-01-10T18:20:12.394Z] cd "/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/../TKG/output_17680672638172/renaissance-dec-tree_0"; \
[2026-01-10T18:20:12.394Z] echo ""; echo "TESTING:"; \
[2026-01-10T18:20:12.394Z] "/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/../TKG/output_17680672638172/renaissance-dec-tree_0"/dec-tree.json" dec-tree; \
[2026-01-10T18:20:12.394Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-dec-tree_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/../TKG/output_17680672638172/renaissance-dec-tree_0"; else echo "-----------------------------------"; echo "renaissance-dec-tree_0""_FAILED"; echo "-----------------------------------"; fi; \
[2026-01-10T18:20:12.394Z] echo ""; echo "TEST TEARDOWN:"; \
[2026-01-10T18:20:12.394Z] echo "Nothing to be done for teardown."; \
[2026-01-10T18:20:12.394Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/../TKG/output_17680672638172/TestTargetResult";
[2026-01-10T18:20:12.394Z]
[2026-01-10T18:20:12.394Z] TEST SETUP:
[2026-01-10T18:20:12.394Z] Nothing to be done for setup.
[2026-01-10T18:20:12.394Z]
[2026-01-10T18:20:12.394Z] TESTING:
[2026-01-10T18:20:14.675Z] WARNING: A terminally deprecated method in sun.misc.Unsafe has been called
[2026-01-10T18:20:14.675Z] WARNING: sun.misc.Unsafe::objectFieldOffset has been called by scala.runtime.LazyVals$ (file:/home/jenkins/workspace/Test_openjdk25_hs_extended.perf_riscv64_linux/aqa-tests/TKG/output_17680672638172/renaissance-dec-tree_0/launcher-182012-7894764543906335836/renaissance-harness_3/lib/scala3-library_3-3.3.4.jar)
[2026-01-10T18:20:14.675Z] WARNING: Please consider reporting this to the maintainers of class scala.runtime.LazyVals$
[2026-01-10T18:20:14.675Z] WARNING: sun.misc.Unsafe::objectFieldOffset will be removed in a future release
[2026-01-10T18:20:37.916Z] NOTE: 'dec-tree' benchmark uses Spark local executor with 4 (out of 4) threads.
[2026-01-10T18:20:45.252Z] ====== dec-tree (apache-spark) [default], iteration 0 started ======
[2026-01-10T18:20:45.602Z] GC before operation: completed in 217.826 ms, heap usage 46.716 MB -> 35.310 MB.
[2026-01-10T18:21:25.819Z] 18:21:19.623 ERROR [Executor task launch worker for task 2.0 in stage 7.0 (TID 16)] org.apache.spark.executor.Executor - Exception in task 2.0 in stage 7.0 (TID 16)
java.lang.NullPointerException: Cannot enter synchronized block because "lock" is null
	at org.apache.spark.util.KeyLock.releaseLock(KeyLock.scala:48) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:66) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1444) ~[spark-catalyst_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:205) ~[spark-catalyst_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:39) ~[spark-catalyst_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1369) ~[spark-catalyst_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1366) ~[spark-catalyst_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1(objects.scala:94) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1$adapted(objects.scala:93) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2(RDD.scala:880) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2$adapted(RDD.scala:880) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.execution.SQLExecutionRDD.$anonfun$compute$1(SQLExecutionRDD.scala:52) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.internal.SQLConf$.withExistingConf(SQLConf.scala:158) ~[spark-catalyst_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.sql.execution.SQLExecutionRDD.compute(SQLExecutionRDD.scala:52) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.scheduler.Task.run(Task.scala:141) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [spark-core_2.13-3.5.3.jar:3.5.3]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1090) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:614) [?:?]
	at java.lang.Thread.run(Thread.java:1474) [?:?]

18:21:19.796 WARN [task-result-getter-2] org.apache.spark.scheduler.TaskSetManager - Lost task 2.0 in stage 7.0 (TID 16) (test-rise-ubuntu2404-riscv64-4.adoptopenjdk.net executor driver): java.lang.NullPointerException: Cannot enter synchronized block because "lock" is null
	[same stack trace as above, repeated verbatim in the raw log]

18:21:19.803 ERROR [task-result-getter-2] org.apache.spark.scheduler.TaskSetManager - Task 2 in stage 7.0 failed 1 times; aborting job

18:21:19.906 ERROR [main] org.apache.spark.ml.util.Instrumentation - org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 7.0 failed 1 times, most recent failure: Lost task 2.0 in stage 7.0 (TID 16) (test-rise-ubuntu2404-riscv64-4.adoptopenjdk.net executor driver): java.lang.NullPointerException: Cannot enter synchronized block because "lock" is null
	[same stack trace as above, repeated verbatim in the raw log]

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791)
	at scala.collection.immutable.List.foreach(List.scala:334)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247)
	at scala.Option.foreach(Option.scala:437)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1247)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3060)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2994)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2983)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:989)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2393)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2414)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2433)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2458)
	at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1049)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:410)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:1048)
	at org.apache.spark.rdd.PairRDDFunctions.$anonfun$collectAsMap$1(PairRDDFunctions.scala:738)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:410)
	at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:737)
	at org.apache.spark.ml.tree.impl.RandomForest$.findSplitsBySorting(RandomForest.scala:1054)
	at org.apache.spark.ml.tree.impl.RandomForest$.findSplits(RandomForest.scala:1025)
	at org.apache.spark.ml.tree.impl.RandomForest$.run(RandomForest.scala:282)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.$anonfun$train$1(DecisionTreeClassifier.scala:143)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:191)
	at scala.util.Try$.apply(Try.scala:217)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:191)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.train(DecisionTreeClassifier.scala:116)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.train(DecisionTreeClassifier.scala:48)
	at org.apache.spark.ml.Predictor.fit(Predictor.scala:114)
	at org.apache.spark.ml.Predictor.fit(Predictor.scala:78)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$5(Pipeline.scala:151)
	at org.apache.spark.ml.MLEvents.withFitEvent(events.scala:130)
	at org.apache.spark.ml.MLEvents.withFitEvent$(events.scala:123)
	at org.apache.spark.ml.util.Instrumentation.withFitEvent(Instrumentation.scala:42)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$4(Pipeline.scala:151)
	at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
	at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1303)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$2(Pipeline.scala:147)
	at org.apache.spark.ml.MLEvents.withFitEvent(events.scala:130)
	at org.apache.spark.ml.MLEvents.withFitEvent$(events.scala:123)
	at org.apache.spark.ml.util.Instrumentation.withFitEvent(Instrumentation.scala:42)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$1(Pipeline.scala:133)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:191)
	at scala.util.Try$.apply(Try.scala:217)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:191)
	at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:133)
	at org.renaissance.apache.spark.DecTree.run(DecTree.scala:102)
	at org.renaissance.harness.ExecutionDriver.executeOperation(ExecutionDriver.java:137)
	at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:93)
	at org.renaissance.harness.RenaissanceSuite$.runBenchmarks$$anonfun$1(RenaissanceSuite.scala:172)
	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
	at scala.collection.immutable.List.foreach(List.scala:334)
	at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:161)
	at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:130)
	at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala)
	at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
	at java.base/java.lang.reflect.Method.invoke(Method.java:565)
	at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:129)
	at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:78)
	at org.renaissance.core.Launcher.main(Launcher.java:43)
Caused by: java.lang.NullPointerException
	[same executor-side stack trace as above, repeated verbatim in the raw log]

18:21:19.910 ERROR [main] org.apache.spark.ml.util.Instrumentation - org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 7.0 failed 1 times, most recent failure: Lost task 2.0 in stage 7.0 (TID 16) (test-rise-ubuntu2404-riscv64-4.adoptopenjdk.net executor driver): java.lang.NullPointerException: Cannot enter synchronized block because "lock" is null
	[same stack trace repeated; the captured log ends within this repetition]
org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) [2026-01-10T18:21:25.822Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) [2026-01-10T18:21:25.822Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [2026-01-10T18:21:25.822Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1090) [2026-01-10T18:21:25.822Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:614) [2026-01-10T18:21:25.822Z] at java.base/java.lang.Thread.run(Thread.java:1474) [2026-01-10T18:21:25.822Z] [2026-01-10T18:21:25.822Z] Driver stacktrace: [2026-01-10T18:21:25.822Z] at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856) [2026-01-10T18:21:25.822Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792) [2026-01-10T18:21:25.822Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791) [2026-01-10T18:21:25.822Z] at scala.collection.immutable.List.foreach(List.scala:334) [2026-01-10T18:21:25.822Z] at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791) [2026-01-10T18:21:25.822Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247) [2026-01-10T18:21:25.822Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247) [2026-01-10T18:21:25.823Z] at scala.Option.foreach(Option.scala:437) [2026-01-10T18:21:25.823Z] at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1247) [2026-01-10T18:21:25.823Z] at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3060) [2026-01-10T18:21:25.823Z] at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2994) [2026-01-10T18:21:25.823Z] at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2983) [2026-01-10T18:21:25.823Z] at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49) [2026-01-10T18:21:25.823Z] at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:989) [2026-01-10T18:21:25.823Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2393) [2026-01-10T18:21:25.823Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2414) [2026-01-10T18:21:25.823Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2433) [2026-01-10T18:21:25.823Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2458) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1049) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDD.withScope(RDD.scala:410) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDD.collect(RDD.scala:1048) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.PairRDDFunctions.$anonfun$collectAsMap$1(PairRDDFunctions.scala:738) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.RDD.withScope(RDD.scala:410) [2026-01-10T18:21:25.823Z] at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:737) [2026-01-10T18:21:25.823Z] at org.apache.spark.ml.tree.impl.RandomForest$.findSplitsBySorting(RandomForest.scala:1054) [2026-01-10T18:21:25.823Z] at org.apache.spark.ml.tree.impl.RandomForest$.findSplits(RandomForest.scala:1025) [2026-01-10T18:21:25.823Z] at org.apache.spark.ml.tree.impl.RandomForest$.run(RandomForest.scala:282) 
	at org.apache.spark.ml.classification.DecisionTreeClassifier.$anonfun$train$1(DecisionTreeClassifier.scala:143)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:191)
	at scala.util.Try$.apply(Try.scala:217)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:191)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.train(DecisionTreeClassifier.scala:116)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.train(DecisionTreeClassifier.scala:48)
	at org.apache.spark.ml.Predictor.fit(Predictor.scala:114)
	at org.apache.spark.ml.Predictor.fit(Predictor.scala:78)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$5(Pipeline.scala:151)
	at org.apache.spark.ml.MLEvents.withFitEvent(events.scala:130)
	at org.apache.spark.ml.MLEvents.withFitEvent$(events.scala:123)
	at org.apache.spark.ml.util.Instrumentation.withFitEvent(Instrumentation.scala:42)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$4(Pipeline.scala:151)
	at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
	at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1303)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$2(Pipeline.scala:147)
	at org.apache.spark.ml.MLEvents.withFitEvent(events.scala:130)
	at org.apache.spark.ml.MLEvents.withFitEvent$(events.scala:123)
	at org.apache.spark.ml.util.Instrumentation.withFitEvent(Instrumentation.scala:42)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$1(Pipeline.scala:133)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:191)
	at scala.util.Try$.apply(Try.scala:217)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:191)
	at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:133)
	at org.renaissance.apache.spark.DecTree.run(DecTree.scala:102)
	at org.renaissance.harness.ExecutionDriver.executeOperation(ExecutionDriver.java:137)
	at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:93)
	at org.renaissance.harness.RenaissanceSuite$.runBenchmarks$$anonfun$1(RenaissanceSuite.scala:172)
	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
	at scala.collection.immutable.List.foreach(List.scala:334)
	at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:161)
	at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:130)
	at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala)
	at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
	at java.base/java.lang.reflect.Method.invoke(Method.java:565)
	at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:129)
	at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:78)
	at org.renaissance.core.Launcher.main(Launcher.java:43)
[2026-01-10T18:21:25.823Z] Caused by: java.lang.NullPointerException
	at org.apache.spark.util.KeyLock.releaseLock(KeyLock.scala:48)
	at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:66)
	at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1444)
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:205)
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:39)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1369)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1366)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1(objects.scala:94)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1$adapted(objects.scala:93)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2(RDD.scala:880)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2$adapted(RDD.scala:880)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.sql.execution.SQLExecutionRDD.$anonfun$compute$1(SQLExecutionRDD.scala:52)
	at org.apache.spark.sql.internal.SQLConf$.withExistingConf(SQLConf.scala:158)
	at org.apache.spark.sql.execution.SQLExecutionRDD.compute(SQLExecutionRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
	at org.apache.spark.scheduler.Task.run(Task.scala:141)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1090)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:614)
	at java.base/java.lang.Thread.run(Thread.java:1474)
[2026-01-10T18:21:25.823Z]
[2026-01-10T18:21:25.823Z] ====== dec-tree (apache-spark) [default], iteration 0 failed (SparkException) ======
[2026-01-10T18:21:25.823Z] Benchmark 'dec-tree' failed with exception:
[2026-01-10T18:21:25.823Z] org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 7.0 failed 1 times, most recent failure: Lost task 2.0 in stage 7.0 (TID 16) (test-rise-ubuntu2404-riscv64-4.adoptopenjdk.net executor driver): java.lang.NullPointerException: Cannot enter synchronized block because "lock" is null
	at org.apache.spark.util.KeyLock.releaseLock(KeyLock.scala:48)
	at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:66)
	at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1444)
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:205)
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:39)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1369)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1366)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1(objects.scala:94)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1$adapted(objects.scala:93)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2(RDD.scala:880)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2$adapted(RDD.scala:880)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.sql.execution.SQLExecutionRDD.$anonfun$compute$1(SQLExecutionRDD.scala:52)
	at org.apache.spark.sql.internal.SQLConf$.withExistingConf(SQLConf.scala:158)
	at org.apache.spark.sql.execution.SQLExecutionRDD.compute(SQLExecutionRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
	at org.apache.spark.scheduler.Task.run(Task.scala:141)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1090)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:614)
	at java.base/java.lang.Thread.run(Thread.java:1474)
[2026-01-10T18:21:25.824Z]
[2026-01-10T18:21:25.824Z] Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791)
	at scala.collection.immutable.List.foreach(List.scala:334)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247)
	at scala.Option.foreach(Option.scala:437)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1247)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3060)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2994)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2983)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:989)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2393)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2414)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2433)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2458)
	at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1049)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:410)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:1048)
	at org.apache.spark.rdd.PairRDDFunctions.$anonfun$collectAsMap$1(PairRDDFunctions.scala:738)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:410)
	at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:737)
	at org.apache.spark.ml.tree.impl.RandomForest$.findSplitsBySorting(RandomForest.scala:1054)
	at org.apache.spark.ml.tree.impl.RandomForest$.findSplits(RandomForest.scala:1025)
	at org.apache.spark.ml.tree.impl.RandomForest$.run(RandomForest.scala:282)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.$anonfun$train$1(DecisionTreeClassifier.scala:143)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:191)
	at scala.util.Try$.apply(Try.scala:217)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:191)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.train(DecisionTreeClassifier.scala:116)
	at org.apache.spark.ml.classification.DecisionTreeClassifier.train(DecisionTreeClassifier.scala:48)
	at org.apache.spark.ml.Predictor.fit(Predictor.scala:114)
	at org.apache.spark.ml.Predictor.fit(Predictor.scala:78)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$5(Pipeline.scala:151)
	at org.apache.spark.ml.MLEvents.withFitEvent(events.scala:130)
	at org.apache.spark.ml.MLEvents.withFitEvent$(events.scala:123)
	at org.apache.spark.ml.util.Instrumentation.withFitEvent(Instrumentation.scala:42)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$4(Pipeline.scala:151)
	at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
	at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1303)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$2(Pipeline.scala:147)
	at org.apache.spark.ml.MLEvents.withFitEvent(events.scala:130)
	at org.apache.spark.ml.MLEvents.withFitEvent$(events.scala:123)
	at org.apache.spark.ml.util.Instrumentation.withFitEvent(Instrumentation.scala:42)
	at org.apache.spark.ml.Pipeline.$anonfun$fit$1(Pipeline.scala:133)
	at org.apache.spark.ml.util.Instrumentation$.$anonfun$instrumented$1(Instrumentation.scala:191)
	at scala.util.Try$.apply(Try.scala:217)
	at org.apache.spark.ml.util.Instrumentation$.instrumented(Instrumentation.scala:191)
	at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:133)
	at org.renaissance.apache.spark.DecTree.run(DecTree.scala:102)
	at org.renaissance.harness.ExecutionDriver.executeOperation(ExecutionDriver.java:137)
	at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:93)
	at org.renaissance.harness.RenaissanceSuite$.runBenchmarks$$anonfun$1(RenaissanceSuite.scala:172)
	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
	at scala.collection.immutable.List.foreach(List.scala:334)
	at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:161)
	at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:130)
	at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala)
	at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
	at java.base/java.lang.reflect.Method.invoke(Method.java:565)
	at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:129)
	at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:78)
	at org.renaissance.core.Launcher.main(Launcher.java:43)
[2026-01-10T18:21:25.824Z] Caused by: java.lang.NullPointerException
	at org.apache.spark.util.KeyLock.releaseLock(KeyLock.scala:48)
	at org.apache.spark.util.KeyLock.withLock(KeyLock.scala:66)
	at org.apache.spark.util.NonFateSharingLoadingCache.get(NonFateSharingCache.scala:94)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1444)
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:205)
	at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:39)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1369)
	at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1366)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1(objects.scala:94)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.$anonfun$doExecute$1$adapted(objects.scala:93)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2(RDD.scala:880)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndexInternal$2$adapted(RDD.scala:880)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.sql.execution.SQLExecutionRDD.$anonfun$compute$1(SQLExecutionRDD.scala:52)
	at org.apache.spark.sql.internal.SQLConf$.withExistingConf(SQLConf.scala:158)
	at org.apache.spark.sql.execution.SQLExecutionRDD.compute(SQLExecutionRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
	at org.apache.spark.scheduler.Task.run(Task.scala:141)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
[2026-01-10T18:21:25.825Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) [2026-01-10T18:21:25.825Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) [2026-01-10T18:21:25.825Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) [2026-01-10T18:21:25.825Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [2026-01-10T18:21:25.825Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1090) [2026-01-10T18:21:25.825Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:614) [2026-01-10T18:21:25.825Z] at java.base/java.lang.Thread.run(Thread.java:1474) [2026-01-10T18:21:25.825Z] The following benchmarks failed: dec-tree [2026-01-10T18:21:25.825Z] ----------------------------------- [2026-01-10T18:21:25.825Z] renaissance-dec-tree_0_FAILED [2026-01-10T18:21:25.825Z] ----------------------------------- [2026-01-10T18:21:25.825Z] [2026-01-10T18:21:25.825Z] TEST TEARDOWN: [2026-01-10T18:21:25.825Z] Nothing to be done for teardown. [2026-01-10T18:21:25.825Z] renaissance-dec-tree_0 Finish Time: Sat Jan 10 18:21:21 2026 Epoch Time (ms): 1768069281941
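The root cause in the log, `java.lang.NullPointerException: Cannot enter synchronized block because "lock" is null` thrown from `org.apache.spark.util.KeyLock.releaseLock`, is the JDK's "helpful NPE" message (JEP 358, on by default since JDK 15) for executing a `synchronized` block whose monitor reference is null. The minimal standalone sketch below (an illustration only, not Spark's `KeyLock` code) shows how that message shape arises:

```java
// Hypothetical demo class: entering a synchronized block with a null
// monitor reference throws NullPointerException at monitorenter,
// which is the failure mode reported in the dec-tree log.
public class NullMonitorDemo {

    // Returns "entered" for a valid monitor, or the NPE message otherwise.
    static String enter(Object lock) {
        try {
            synchronized (lock) {   // throws NPE when lock == null
                return "entered";
            }
        } catch (NullPointerException e) {
            return "NPE: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(enter(new Object()));
        System.out.println(enter(null));
    }
}
```

Note that the variable name quoted in the helpful-NPE text depends on the debug information in the class file, so triage should key off the throwing frame (`KeyLock.releaseLock(KeyLock.scala:48)`) rather than the exact message string. Why the per-key lock object obtained via `KeyLock.withLock` was null on this riscv64 JDK 25 build (JIT/platform issue vs. a Spark-side race) is not determined by this log alone.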