No Possible Issues Found via Git Search
renaissance-log-regression_0
[2026-02-20T11:28:14.600Z] Running test renaissance-log-regression_0 ...
[2026-02-20T11:28:14.600Z] ===============================================
[2026-02-20T11:28:14.600Z] renaissance-log-regression_0 Start Time: Fri Feb 20 11:28:14 2026 Epoch Time (ms): 1771586894558
[2026-02-20T11:28:14.600Z] variation: NoOptions
[2026-02-20T11:28:14.600Z] JVM_OPTIONS:
[2026-02-20T11:28:14.600Z] { \
[2026-02-20T11:28:14.600Z] echo ""; echo "TEST SETUP:"; \
[2026-02-20T11:28:14.600Z] echo "Nothing to be done for setup."; \
[2026-02-20T11:28:14.600Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/output_17715844989650/renaissance-log-regression_0"; \
[2026-02-20T11:28:14.600Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/output_17715844989650/renaissance-log-regression_0"; \
[2026-02-20T11:28:14.600Z] echo ""; echo "TESTING:"; \
[2026-02-20T11:28:14.600Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/output_17715844989650/renaissance-log-regression_0"/log-regression.json" log-regression; \
[2026-02-20T11:28:14.600Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-log-regression_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/output_17715844989650/renaissance-log-regression_0"; else echo "-----------------------------------"; echo "renaissance-log-regression_0""_FAILED"; echo "-----------------------------------"; fi; \
[2026-02-20T11:28:14.600Z] echo ""; echo "TEST TEARDOWN:"; \
[2026-02-20T11:28:14.600Z] echo "Nothing to be done for teardown."; \
[2026-02-20T11:28:14.600Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/output_17715844989650/TestTargetResult";
[2026-02-20T11:28:14.600Z]
[2026-02-20T11:28:14.600Z] TEST SETUP:
[2026-02-20T11:28:14.600Z] Nothing to be done for setup.
[2026-02-20T11:28:14.600Z]
[2026-02-20T11:28:14.600Z] TESTING:
[2026-02-20T11:28:37.621Z] NOTE: 'log-regression' benchmark uses Spark local executor with 4 (out of 4) threads.
[2026-02-20T11:28:48.414Z] ====== log-regression (apache-spark) [default], iteration 0 started ======
[2026-02-20T11:28:48.414Z] GC before operation: completed in 206.345 ms, heap usage 53.899 MB -> 36.162 MB.
[2026-02-20T11:29:46.173Z] ====== log-regression (apache-spark) [default], iteration 0 completed (53724.301 ms) ======
[2026-02-20T11:29:46.173Z] ====== log-regression (apache-spark) [default], iteration 1 started ======
[2026-02-20T11:29:46.173Z] GC before operation: completed in 489.855 ms, heap usage 331.937 MB -> 123.968 MB.
[2026-02-20T11:29:55.077Z] ====== log-regression (apache-spark) [default], iteration 1 completed (11391.534 ms) ======
[2026-02-20T11:29:55.077Z] ====== log-regression (apache-spark) [default], iteration 2 started ======
[2026-02-20T11:29:55.077Z] GC before operation: completed in 509.469 ms, heap usage 299.031 MB -> 124.517 MB.
[2026-02-20T11:30:03.997Z] ====== log-regression (apache-spark) [default], iteration 2 completed (9225.876 ms) ======
[2026-02-20T11:30:03.997Z] ====== log-regression (apache-spark) [default], iteration 3 started ======
[2026-02-20T11:30:04.329Z] GC before operation: completed in 556.557 ms, heap usage 267.271 MB -> 124.837 MB.
[2026-02-20T11:30:13.206Z] ====== log-regression (apache-spark) [default], iteration 3 completed (7825.592 ms) ======
[2026-02-20T11:30:13.206Z] ====== log-regression (apache-spark) [default], iteration 4 started ======
[2026-02-20T11:30:13.206Z] GC before operation: completed in 594.002 ms, heap usage 520.206 MB -> 125.742 MB.
[2026-02-20T11:30:20.569Z] ====== log-regression (apache-spark) [default], iteration 4 completed (7437.187 ms) ======
[2026-02-20T11:30:20.569Z] ====== log-regression (apache-spark) [default], iteration 5 started ======
[2026-02-20T11:30:20.900Z] GC before operation: completed in 648.706 ms, heap usage 511.407 MB -> 125.972 MB.
[2026-02-20T11:30:28.197Z] ====== log-regression (apache-spark) [default], iteration 5 completed (7146.845 ms) ======
[2026-02-20T11:30:28.197Z] ====== log-regression (apache-spark) [default], iteration 6 started ======
[2026-02-20T11:30:28.922Z] GC before operation: completed in 673.914 ms, heap usage 250.629 MB -> 125.452 MB.
[2026-02-20T11:30:36.163Z] ====== log-regression (apache-spark) [default], iteration 6 completed (6805.547 ms) ======
[2026-02-20T11:30:36.163Z] ====== log-regression (apache-spark) [default], iteration 7 started ======
[2026-02-20T11:30:36.163Z] GC before operation: completed in 712.670 ms, heap usage 559.944 MB -> 126.551 MB.
[2026-02-20T11:30:43.423Z] ====== log-regression (apache-spark) [default], iteration 7 completed (6806.339 ms) ======
[2026-02-20T11:30:43.423Z] ====== log-regression (apache-spark) [default], iteration 8 started ======
[2026-02-20T11:30:43.754Z] GC before operation: completed in 725.366 ms, heap usage 514.206 MB -> 126.583 MB.
[2026-02-20T11:30:51.008Z] ====== log-regression (apache-spark) [default], iteration 8 completed (6662.538 ms) ======
[2026-02-20T11:30:51.008Z] ====== log-regression (apache-spark) [default], iteration 9 started ======
[2026-02-20T11:30:51.336Z] GC before operation: completed in 743.911 ms, heap usage 385.137 MB -> 126.453 MB.
[2026-02-20T11:30:58.594Z] ====== log-regression (apache-spark) [default], iteration 9 completed (6853.511 ms) ======
[2026-02-20T11:30:58.594Z] ====== log-regression (apache-spark) [default], iteration 10 started ======
[2026-02-20T11:30:58.925Z] GC before operation: completed in 745.792 ms, heap usage 338.772 MB -> 126.571 MB.
[2026-02-20T11:31:06.349Z] ====== log-regression (apache-spark) [default], iteration 10 completed (6734.500 ms) ======
[2026-02-20T11:31:06.349Z] ====== log-regression (apache-spark) [default], iteration 11 started ======
[2026-02-20T11:31:06.350Z] GC before operation: completed in 754.280 ms, heap usage 295.887 MB -> 126.597 MB.
[2026-02-20T11:31:12.303Z] 11:31:11.360 ERROR [Executor task launch worker for task 0.0 in stage 270.0 (TID 1080)] org.apache.spark.executor.Executor - Exception in task 0.0 in stage 270.0 (TID 1080)
[2026-02-20T11:31:12.303Z] java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.303Z] at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
[2026-02-20T11:31:12.303Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.run(Task.scala:109) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2026-02-20T11:31:12.304Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2026-02-20T11:31:12.304Z] 11:31:11.360 ERROR [Executor task launch worker for task 2.0 in stage 270.0 (TID 1082)] org.apache.spark.executor.Executor - Exception in task 2.0 in stage 270.0 (TID 1082)
[2026-02-20T11:31:12.304Z] java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.run(Task.scala:109) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2026-02-20T11:31:12.304Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2026-02-20T11:31:12.304Z] 11:31:11.360 ERROR [Executor task launch worker for task 3.0 in stage 270.0 (TID 1083)] org.apache.spark.executor.Executor - Exception in task 3.0 in stage 270.0 (TID 1083)
[2026-02-20T11:31:12.304Z] java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.run(Task.scala:109) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2026-02-20T11:31:12.304Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2026-02-20T11:31:12.304Z] 11:31:11.360 ERROR [Executor task launch worker for task 1.0 in stage 270.0 (TID 1081)] org.apache.spark.executor.Executor - Exception in task 1.0 in stage 270.0 (TID 1081)
[2026-02-20T11:31:12.304Z] java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
[2026-02-20T11:31:12.304Z] at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.scheduler.Task.run(Task.scala:109) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [spark-core_2.13-3.5.3.jar:3.5.3]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2026-02-20T11:31:12.304Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2026-02-20T11:31:12.304Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2026-02-20T11:31:12.304Z] Exception in thread "Executor task launch worker for task 0.0 in stage 270.0 (TID 1080)" java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.304Z] at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:509)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:467)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1(Executor.scala:523)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1$adapted(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at scala.Option.foreach(Option.scala:437)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.collectAccumulatorsAndResetStatusOnFailure(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:800)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2026-02-20T11:31:12.305Z] at java.base/java.lang.Thread.run(Thread.java:840)
[2026-02-20T11:31:12.305Z] Exception in thread "Executor task launch worker for task 3.0 in stage 270.0 (TID 1083)" java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:509)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:467)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1(Executor.scala:523)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1$adapted(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at scala.Option.foreach(Option.scala:437)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.collectAccumulatorsAndResetStatusOnFailure(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:800)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2026-02-20T11:31:12.305Z] at java.base/java.lang.Thread.run(Thread.java:840)
[2026-02-20T11:31:12.305Z] Exception in thread "Executor task launch worker for task 1.0 in stage 270.0 (TID 1081)" java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:509)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:467)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1(Executor.scala:523)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1$adapted(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at scala.Option.foreach(Option.scala:437)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.collectAccumulatorsAndResetStatusOnFailure(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:800)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2026-02-20T11:31:12.305Z] at java.base/java.lang.Thread.run(Thread.java:840)
[2026-02-20T11:31:12.305Z] Exception in thread "Executor task launch worker for task 2.0 in stage 270.0 (TID 1082)" java.lang.ClassCastException: cannot assign instance of org.apache.spark.executor.ShuffleWriteMetrics to field org.apache.spark.executor.TaskMetrics.shuffleWriteMetrics of type org.apache.spark.executor.ShuffleWriteMetrics in instance of org.apache.spark.executor.TaskMetrics
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2096)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2060)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1347)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2679)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2486)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:509)
[2026-02-20T11:31:12.305Z] at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:467)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
[2026-02-20T11:31:12.305Z] at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics$lzycompute(Task.scala:76)
[2026-02-20T11:31:12.305Z] at org.apache.spark.scheduler.Task.metrics(Task.scala:75)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1(Executor.scala:523)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$collectAccumulatorsAndResetStatusOnFailure$1$adapted(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at scala.Option.foreach(Option.scala:437)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.collectAccumulatorsAndResetStatusOnFailure(Executor.scala:522)
[2026-02-20T11:31:12.305Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:800)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2026-02-20T11:31:12.305Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2026-02-20T11:31:12.305Z] at java.base/java.lang.Thread.run(Thread.java:840)
[2026-02-20T18:44:37.406Z] test-rise-ubuntu2404-riscv64-4 seems to be removed or offline (java.lang.InterruptedException); will wait for 5 min 0 sec for it to come back online
[2026-02-20T18:44:52.427Z] test-rise-ubuntu2404-riscv64-4 is back online
[2026-02-21T10:49:43.430Z] test-rise-ubuntu2404-riscv64-4 seems to be removed or offline (java.lang.InterruptedException); will wait for 5 min 0 sec for it to come back online
[2026-02-21T10:49:59.302Z] test-rise-ubuntu2404-riscv64-4 is back online
[2026-02-21T11:42:32.425Z] Cancelling nested steps due to timeout
[2026-02-21T11:42:32.455Z] Sending interrupt signal to process
[2026-02-21T11:42:41.577Z] Terminated
[2026-02-21T11:42:41.577Z] make[5]: *** [autoGen.mk:130: renaissance-log-regression_0] Error 143
[2026-02-21T11:42:41.577Z] make[5]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/perf/renaissance'
[2026-02-21T11:42:41.577Z] make[4]: *** [/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/settings.mk:362: testList-renaissance] Error 2
[2026-02-21T11:42:41.577Z] make[4]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/perf'
[2026-02-21T11:42:41.577Z] make[3]: *** [/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG/../TKG/settings.mk:362: testList-perf] Error 2
[2026-02-21T11:42:41.577Z] make[3]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests'
[2026-02-21T11:42:41.577Z] make[2]: *** [settings.mk:362: testList-..] Error 2
[2026-02-21T11:42:41.577Z] make[2]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG'
[2026-02-21T11:42:41.577Z] make[1]: *** [makefile:62: _testList] Error 2
[2026-02-21T11:42:41.577Z] make[1]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_2_rerun/aqa-tests/TKG'
[2026-02-21T11:42:41.577Z] make: *** [parallelList.mk:14: testList_2] Error 2
[2026-02-21T11:42:41.712Z] script returned exit code 2
[Pipeline] sh
[2026-02-21T11:42:42.404Z] + uname
[2026-02-21T11:42:42.404Z] + [ Linux = AIX ]
[2026-02-21T11:42:42.404Z] + uname
[2026-02-21T11:42:42.404Z] + [ Linux = SunOS ]
[2026-02-21T11:42:42.404Z] + uname
[2026-02-21T11:42:42.404Z] + [ Linux = *BSD ]
[2026-02-21T11:42:42.404Z] + MAKE=make
[2026-02-21T11:42:42.404Z] + make -f ./aqa-tests/TKG/testEnv.mk testEnvTeardown
[2026-02-21T11:42:42.404Z] make: Nothing to be done for 'testEnvTeardown'.
[Pipeline] }
[2026-02-21T11:42:42.475Z] $ ssh-agent -k
[2026-02-21T11:42:42.501Z] unset SSH_AUTH_SOCK;
[2026-02-21T11:42:42.501Z] unset SSH_AGENT_PID;
[2026-02-21T11:42:42.501Z] echo Agent pid 3909994 killed;
[2026-02-21T11:42:42.599Z] [ssh-agent] Stopped.
[Pipeline] // sshagent
[Pipeline] }
[2026-02-21T11:42:42.630Z] Xvfb stopping
[Pipeline] // wrap
[Pipeline] echo
[2026-02-21T11:42:42.939Z] no DaCapo-h2 metric found
[Pipeline] echo
[2026-02-21T11:42:42.956Z] Could not find test result, set build result to FAILURE.
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Post)
[Pipeline] echo
[2026-02-21T11:42:43.022Z] Saving aqa-tests/testenv/testenv.properties file on jenkins.
[Pipeline] archiveArtifacts
[2026-02-21T11:42:43.047Z] Archiving artifacts
[2026-02-21T11:42:43.132Z] Recording fingerprints
[Pipeline] echo
[2026-02-21T11:42:43.188Z] Saving aqa-tests/TKG/**/*.tap file on jenkins.
[Pipeline] archiveArtifacts
[2026-02-21T11:42:43.215Z] Archiving artifacts
[2026-02-21T11:42:43.801Z] No artifacts found that match the file pattern "aqa-tests/TKG/**/*.tap". Configuration error?
[Pipeline] sh
[2026-02-21T11:42:44.321Z] + tar -cf benchmark_test_output.tar.gz ./aqa-tests/TKG/output_17715844989650
[Pipeline] echo
[2026-02-21T11:42:44.394Z] ARTIFACTORY_SERVER is not set. Saving artifacts on jenkins.
[Pipeline] archiveArtifacts
[2026-02-21T11:42:44.431Z] Archiving artifacts
[2026-02-21T11:42:44.521Z] Recording fingerprints
[Pipeline] findFiles
[Pipeline] junit
[2026-02-21T11:42:45.954Z] Recording test results
[2026-02-21T11:42:47.314Z] None of the test reports contained any result
[2026-02-21T11:42:47.314Z] [Checks API] No suitable checks publisher found.
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
[2026-02-21T11:42:47.341Z] PROCESSCATCH: Terminating any hung/left over test processes:
[Pipeline] sh
[2026-02-21T11:42:47.963Z] + aqa-tests/terminateTestProcesses.sh jenkins
[2026-02-21T11:42:47.964Z] Unix type machine..
[2026-02-21T11:42:47.964Z] Running on a Linux host
[2026-02-21T11:42:47.964Z] Woohoo - no rogue processes detected!
[Pipeline] retry
[Pipeline] {
[2026-02-21T11:42:47.264Z] No test report files were found. Configuration error?
[Pipeline] cleanWs
[2026-02-21T11:42:48.116Z] [WS-CLEANUP] Deleting project workspace...
[2026-02-21T11:42:48.116Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2026-02-21T11:42:59.122Z] [WS-CLEANUP] done
[Pipeline] }
[Pipeline] // retry
[Pipeline] sh
[2026-02-21T11:42:59.666Z] + find /tmp -name *core* -print -exec rm -f {} ;
[2026-02-21T11:43:00.008Z] + true
[Pipeline] }
[Pipeline] // timeout
[Pipeline] timeout
[2026-02-21T11:43:00.109Z] Timeout set to expire in 5 min 0 sec
[Pipeline] {
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timestamps
[Pipeline] End of Pipeline
Finished: FAILURE