No Possible Issues Found via Git Search

renaissance-naive-bayes_0

[2026-02-12T04:11:17.136Z] Running test renaissance-naive-bayes_0 ...
[2026-02-12T04:11:17.136Z] ===============================================
[2026-02-12T04:11:17.136Z] renaissance-naive-bayes_0 Start Time: Thu Feb 12 04:11:17 2026 Epoch Time (ms): 1770869477012
[2026-02-12T04:11:17.136Z] variation: NoOptions
[2026-02-12T04:11:17.136Z] JVM_OPTIONS:
[2026-02-12T04:11:17.136Z] { \
[2026-02-12T04:11:17.136Z] echo ""; echo "TEST SETUP:"; \
[2026-02-12T04:11:17.136Z] echo "Nothing to be done for setup."; \
[2026-02-12T04:11:17.136Z] mkdir -p "/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/../TKG/output_17708641846250/renaissance-naive-bayes_0"; \
[2026-02-12T04:11:17.136Z] cd "/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/../TKG/output_17708641846250/renaissance-naive-bayes_0"; \
[2026-02-12T04:11:17.136Z] echo ""; echo "TESTING:"; \
[2026-02-12T04:11:17.136Z] "/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/jdkbinary/j2sdk-image/bin/java" -jar "/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/../TKG/output_17708641846250/renaissance-naive-bayes_0"/naive-bayes.json" naive-bayes; \
[2026-02-12T04:11:17.136Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-naive-bayes_0""_PASSED"; echo "-----------------------------------"; cd /ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/..; rm -f -r "/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/../TKG/output_17708641846250/renaissance-naive-bayes_0"; else echo "-----------------------------------"; echo "renaissance-naive-bayes_0""_FAILED"; echo "-----------------------------------"; fi; \
[2026-02-12T04:11:17.136Z] echo ""; echo "TEST TEARDOWN:"; \
[2026-02-12T04:11:17.136Z] echo "Nothing to be done for teardown."; \
[2026-02-12T04:11:17.136Z] } 2>&1 | tee -a "/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/../TKG/output_17708641846250/TestTargetResult";
[2026-02-12T04:11:17.136Z]
[2026-02-12T04:11:17.136Z] TEST SETUP:
[2026-02-12T04:11:17.136Z] Nothing to be done for setup.
[2026-02-12T04:11:17.136Z]
[2026-02-12T04:11:17.136Z] TESTING:
[2026-02-12T04:11:41.549Z] NOTE: 'naive-bayes' benchmark uses Spark local executor with 8 (out of 8) threads.
[2026-02-12T04:11:55.730Z] 04:11:54.573 WARN [main] org.apache.spark.util.SizeEstimator - Failed to check whether UseCompressedOops is set; assuming yes
[2026-02-12T04:11:55.730Z] WARNING: An illegal reflective access operation has occurred
[2026-02-12T04:11:55.730Z] WARNING: Illegal reflective access by org.apache.spark.util.SizeEstimator$ (file:/ssd/jenkins/workspace/Test_openjdk11_hs_sanity.perf_arm_linux/aqa-tests/TKG/output_17708641846250/renaissance-naive-bayes_0/harness-041119-13807907413680217253/apache-spark/lib/spark-core_2.13-3.5.3.jar) to field java.net.URI.scheme
[2026-02-12T04:11:55.730Z] WARNING: Please consider reporting this to the maintainers of org.apache.spark.util.SizeEstimator$
[2026-02-12T04:11:55.730Z] WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
[2026-02-12T04:11:55.730Z] WARNING: All illegal access operations will be denied in a future release
[2026-02-12T04:12:01.108Z] ====== naive-bayes (apache-spark) [default], iteration 0 started ======
[2026-02-12T04:12:01.519Z] GC before operation: completed in 255.308 ms, heap usage 54.255 MB -> 36.538 MB.
[2026-02-12T04:13:12.638Z] 04:13:01.797 WARN [Executor task launch worker for task 0.0 in stage 0.0 (TID 0)] org.apache.spark.storage.BlockManager - Block rdd_3_0 could not be removed as it was not found on disk or in memory
[2026-02-12T04:13:12.638Z] 04:13:02.619 WARN [Executor task launch worker for task 6.0 in stage 0.0 (TID 6)] org.apache.spark.storage.BlockManager - Block rdd_3_6 could not be removed as it was not found on disk or in memory
[2026-02-12T04:13:12.638Z] 04:13:02.629 ERROR [Executor task launch worker for task 6.0 in stage 0.0 (TID 6)] org.apache.spark.executor.Executor - Exception in task 6.0 in stage 0.0 (TID 6)
[2026-02-12T04:13:12.638Z] java.lang.OutOfMemoryError: Java heap space
[2026-02-12T04:13:12.638Z] at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:61) ~[?:?]
[2026-02-12T04:13:12.638Z] at java.nio.ByteBuffer.allocate(ByteBuffer.java:348) ~[?:?]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.ColumnBuilder$.ensureFreeSpace(ColumnBuilder.scala:167) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.BasicColumnBuilder.appendFrom(ColumnBuilder.scala:73) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.ComplexColumnBuilder.org$apache$spark$sql$execution$columnar$NullableColumnBuilder$$super$appendFrom(ColumnBuilder.scala:93) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.NullableColumnBuilder.appendFrom(NullableColumnBuilder.scala:61) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.NullableColumnBuilder.appendFrom$(NullableColumnBuilder.scala:54) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.ComplexColumnBuilder.appendFrom(ColumnBuilder.scala:93) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.DefaultCachedBatchSerializer$$anon$1.next(InMemoryRelation.scala:105) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.DefaultCachedBatchSerializer$$anon$1.next(InMemoryRelation.scala:80) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.CachedRDDBuilder$$anon$2.next(InMemoryRelation.scala:290) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.sql.execution.columnar.CachedRDDBuilder$$anon$2.next(InMemoryRelation.scala:287) ~[spark-sql_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:224) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:302) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1597) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.BlockManager$$Lambda$2072/0x89a0b828.apply(Unknown Source) ~[?:?]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1524) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1588) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1389) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.storage.BlockManager.getOrElseUpdateRDDBlock(BlockManager.scala:1343) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:379) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:329) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.scheduler.Task.run(Task.scala:141) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.executor.Executor$TaskRunner$$Lambda$2437/0x897f9028.apply(Unknown Source) ~[?:?]
[2026-02-12T04:13:12.638Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-12T04:13:12.638Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
[2026-02-12T04:13:12.638Z] 04:13:02.290 ERROR [Executor task launch worker for task 0.0 in stage 0.0 (TID 0)] org.apache.spark.executor.Executor - Exception in task 0.0 in stage 0.0 (TID 0)
[2026-02-12T04:13:12.638Z] java.lang.OutOfMemoryError: Java heap space
        (stack trace identical to the TID 6 failure above; omitted)
[2026-02-12T04:13:12.639Z] 04:13:03.333 ERROR [Executor task launch worker for task 6.0 in stage 0.0 (TID 6)] org.apache.spark.util.SparkUncaughtExceptionHandler - Uncaught exception in thread Thread[Executor task launch worker for task 6.0 in stage 0.0 (TID 6),5,main]
[2026-02-12T04:13:12.639Z] java.lang.OutOfMemoryError: Java heap space
        (stack trace identical to the TID 6 failure above; omitted)
[2026-02-12T04:13:12.639Z] 04:13:03.334 ERROR [Executor task launch worker for task 0.0 in stage 0.0 (TID 0)] org.apache.spark.util.SparkUncaughtExceptionHandler - Uncaught exception in thread Thread[Executor task launch worker for task 0.0 in stage 0.0 (TID 0),5,main]
[2026-02-12T04:13:12.639Z] java.lang.OutOfMemoryError: Java heap space
        (stack trace identical to the TID 6 failure above; omitted)
[2026-02-12T04:13:12.640Z] ====== naive-bayes (apache-spark) [default], iteration 0 failed (SparkException) ======
[2026-02-12T04:13:12.640Z] -----------------------------------
[2026-02-12T04:13:12.640Z] renaissance-naive-bayes_0_FAILED
[2026-02-12T04:13:12.640Z] -----------------------------------
[2026-02-12T04:13:12.640Z]
[2026-02-12T04:13:12.640Z] TEST TEARDOWN:
[2026-02-12T04:13:12.640Z] Nothing to be done for teardown.
[2026-02-12T04:13:12.640Z] renaissance-naive-bayes_0 Finish Time: Thu Feb 12 04:13:04 2026 Epoch Time (ms): 1770869584399
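Summary of the failure: both executor worker threads died with `java.lang.OutOfMemoryError: Java heap space` while Spark's in-memory columnar cache (`MemoryStore.putIteratorAsValues` via `InMemoryRelation`) was materializing the training data, and the run was launched with an empty `JVM_OPTIONS`, i.e. the default heap for this ARM machine. A minimal sketch of a local reproduction with an explicit maximum heap is below; the `-Xmx4g` value and the abbreviated paths are assumptions for illustration, not values from this run, and the real invocation should use the full workspace paths shown in the log.

```shell
#!/bin/sh
# Hypothetical rerun of the failing target with an explicit max heap.
# Substitute the actual jdkbinary and renaissance.jar paths from the log.
JAVA=java                      # e.g. .../jdkbinary/j2sdk-image/bin/java
JAR=renaissance.jar            # e.g. .../jvmtest/perf/renaissance/renaissance.jar

# Larger heap than the platform default; tune per machine (assumption).
HEAP_OPTS="-Xmx4g"

"$JAVA" $HEAP_OPTS -jar "$JAR" --json naive-bayes.json naive-bayes
echo "exit status: $?"
```

If the benchmark passes with a larger `-Xmx`, the fix on the harness side would be to pass the same flag through `JVM_OPTIONS` (printed empty in the log above) rather than editing the generated script.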