
respond to code review comments #320

Triggered via push on October 18, 2023 at 21:50
Status Failure
Total duration 3h 15m 40s
Artifacts 24

build_main.yml

on: push
Run / Check changes - 47s
Run / Base image build - 1m 1s
Run / Breaking change detection with Buf (branch-3.5) - 58s
Run / Run TPC-DS queries with SF=1 - 1h 37m
Run / Run Docker integration tests - 52m 5s
Run / Run Spark on Kubernetes Integration test - 1h 22m
Matrix: Run / build
Matrix: Run / java-other-versions
Run / Build modules: sparkr - 36m 16s
Run / Linters, licenses, dependencies and documentation generation - 1h 53m
Matrix: Run / pyspark

Annotations

14 errors and 1 warning
Run / Build modules: pyspark-connect
Process completed with exit code 19.
Run / Build modules: catalyst, hive-thriftserver
Process completed with exit code 18.
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-2be9d98b44e9f24a-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-de3d1e8b44eb25f2-exec-1".
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007fc8bc5c3d98@4c3b1e1e rejected from java.util.concurrent.ThreadPoolExecutor@2e8b24e0[Shutting down, pool size = 4, active threads = 2, queued tasks = 0, completed tasks = 301]
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007fc8bc5c3d98@35568da7 rejected from java.util.concurrent.ThreadPoolExecutor@2e8b24e0[Shutting down, pool size = 2, active threads = 1, queued tasks = 0, completed tasks = 302]
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-5e9f848b4503248c-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-132c798b450459fb-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-808cf28b45089a0c-exec-1".
Run / Run Spark on Kubernetes Integration test
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-56bec0ed0f0945c285bb0bfc9d556689-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-56bec0ed0f0945c285bb0bfc9d556689-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
CliSuite.SPARK-37555: spark-sql should pass last unclosed comment to backend: CliSuite#L198
org.scalatest.exceptions.TestFailedException:
======================= CliSuite failure output =======================
Spark SQL CLI command line: ../../bin/spark-sql --master local --driver-java-options -Dderby.system.durability=test --conf spark.ui.enabled=false --conf spark.sql.legacy.emptyCurrentDBInCli=true --hiveconf javax.jdo.option.ConnectionURL=jdbc:derby:;databaseName=/home/runner/work/spark/spark/target/tmp/spark-6854871f-60a1-4cba-bbe6-d2a579b2a370;create=true --hiveconf hive.exec.scratchdir=/home/runner/work/spark/spark/target/tmp/spark-94fc49cb-5f53-4304-9464-13a686e7d623 --hiveconf conf1=conftest --hiveconf conf2=1 --hiveconf hive.metastore.warehouse.dir=/home/runner/work/spark/spark/target/tmp/spark-d73446fa-d692-4fa4-b7b5-b943f463fe89
Exception: java.util.concurrent.TimeoutException: Future timed out after [5 minutes]
Failed to capture next expected output "Found an unclosed bracketed comment. Please, append */ at the end of the comment." within 5 minutes.
2023-10-18 15:37:40.743 - stderr> Setting default log level to "WARN".
2023-10-18 15:37:40.743 - stderr> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2023-10-18 15:37:48.165 - stderr> Spark master: local, Application Id: local-1697668662631
2023-10-18 15:37:49.559 - stdout> spark-sql> /* SELECT /*+ HINT() 4; */;
2023-10-18 15:37:50.066 - stderr>
2023-10-18 15:37:50.066 - stderr> [PARSE_SYNTAX_ERROR] Syntax error at or near ';'.(line 1, pos 26)
2023-10-18 15:37:50.066 - stderr>
2023-10-18 15:37:50.066 - stderr> == SQL ==
2023-10-18 15:37:50.066 - stderr> /* SELECT /*+ HINT() 4; */;
2023-10-18 15:37:50.066 - stderr> --------------------------^^^
2023-10-18 15:37:50.066 - stderr>
2023-10-18 15:37:50.089 - stdout> spark-sql> /* SELECT /*+ HINT() 4; */ SELECT 1;
2023-10-18 15:37:52.085 - stdout> 1
2023-10-18 15:37:52.085 - stderr> Time taken: 1.994 seconds, Fetched 1 row(s)
2023-10-18 15:37:52.104 - stderr>
2023-10-18 15:37:52.105 - stderr> [UNCLOSED_BRACKETED_COMMENT] Found an unclosed bracketed comment. Please, append */ at the end of the comment.
2023-10-18 15:37:52.105 - stderr> == SQL ==
2023-10-18 15:37:52.105 - stderr> /* Here is a unclosed bracketed comment SELECT 1;
2023-10-18 15:37:52.105 - stdout> spark-sql> /* Here is a unclosed bracketed comment SELECT 1;
2023-10-18 15:37:52.112 - stdout> spark-sql> /* SELECT /*+ HINT() */ 4; */;
2023-10-18 15:37:52.226 - stdout> spark-sql>
=========================== End CliSuite failure output ===========================
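For context, the bracketed-comment cases this suite drives through the CLI can also be exercised directly against the SQL parser. The sketch below is a minimal illustration, not part of the suite: the local session setup and app name are assumptions, while the two SQL strings are copied from the captured output above.

```python
# Minimal PySpark sketch of the bracketed-comment behaviour exercised above.
# Assumes a local PySpark >= 3.4 environment; not taken from the CI workflow.
from pyspark.sql import SparkSession
from pyspark.errors import ParseException

spark = (
    SparkSession.builder
    .master("local[1]")
    .appName("bracketed-comment-check")  # hypothetical app name
    .getOrCreate()
)

# Closed bracketed comment followed by a statement: parses and returns 1,
# matching the "SELECT 1" case in the CLI output.
spark.sql("/* SELECT /*+ HINT() 4; */ SELECT 1").show()

# Unclosed bracketed comment: should be rejected with the same
# UNCLOSED_BRACKETED_COMMENT message the suite waits for on stderr.
try:
    spark.sql("/* Here is a unclosed bracketed comment SELECT 1")
except ParseException as err:
    print(err)

spark.stop()
```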
Run / Build modules: pyspark-errors
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.

Artifacts

Produced during runtime
Name | Size
site (Expired) | 59.3 MB
test-results-catalyst, hive-thriftserver--17-hadoop3-hive2.3 (Expired) | 210 KB
test-results-core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--17-hadoop3-hive2.3 (Expired) | 132 KB
test-results-docker-integration--17-hadoop3-hive2.3 (Expired) | 119 KB
test-results-hive-- other tests-17-hadoop3-hive2.3 (Expired) | 910 KB
test-results-hive-- slow tests-17-hadoop3-hive2.3 (Expired) | 852 KB
test-results-pyspark-connect--17-hadoop3-hive2.3 (Expired) | 17.9 KB
test-results-pyspark-core, pyspark-streaming--17-hadoop3-hive2.3 (Expired) | 80.1 KB
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3 (Expired) | 1.15 MB
test-results-pyspark-pandas--17-hadoop3-hive2.3 (Expired) | 1.14 MB
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3 (Expired) | 1.06 MB
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3 (Expired) | 971 KB
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3 (Expired) | 637 KB
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3 (Expired) | 326 KB
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3 (Expired) | 1.85 MB
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 (Expired) | 394 KB
test-results-sparkr--17-hadoop3-hive2.3 (Expired) | 280 KB
test-results-sql-- extended tests-17-hadoop3-hive2.3 (Expired) | 2.97 MB
test-results-sql-- other tests-17-hadoop3-hive2.3 (Expired) | 4.25 MB
test-results-sql-- slow tests-17-hadoop3-hive2.3 (Expired) | 2.77 MB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 (Expired) | 2.12 MB
test-results-tpcds--17-hadoop3-hive2.3 (Expired) | 21.9 KB
unit-tests-log-catalyst, hive-thriftserver--17-hadoop3-hive2.3 (Expired) | 152 MB
unit-tests-log-pyspark-connect--17-hadoop3-hive2.3 (Expired) | 12.1 MB