
respond to code review comments #321

GitHub Actions / Report test results failed Oct 20, 2023 in 0s

32293 tests run, 805 skipped, 3 failed.

Annotations

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_arrow_map.py


github-actions / Report test results

python/pyspark/sql/tests/connect/test_parity_arrow_map.py.test_other_than_recordbatch_iter

<_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused {created_time:"2023-10-19T23:03:58.049506671+00:00", grpc_status:14}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_parity_arrow_map.py", line 26, in test_other_than_recordbatch_iter
    self.check_other_than_recordbatch_iter()
  File "/__w/spark/spark/python/pyspark/sql/tests/test_arrow_map.py", line 109, in check_other_than_recordbatch_iter
    (self.spark.range(10, numPartitions=3).mapInArrow(not_iter, "a int").count())
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 255, in count
    table, _ = self.agg(_invoke_function("count", lit(1)))._to_table()
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1725, in _to_table
    table, schema = self._session.client.to_table(query)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 823, in to_table
    table, schema, _, _, _ = self._execute_and_fetch(req)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1281, in _execute_and_fetch
    for response in self._execute_and_fetch_as_iterator(req):
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1262, in _execute_and_fetch_as_iterator
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1501, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1561, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused {created_time:"2023-10-19T23:03:58.049506671+00:00", grpc_status:14}"
>
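All three failures share the same root symptom: the Spark Connect server at 127.0.0.1:38125 was no longer accepting connections, so every RPC failed with StatusCode.UNAVAILABLE (grpc_status 14). gRPC is surfacing the OS-level "connection refused" error (errno 111). A minimal sketch of that underlying symptom using only the standard library (no Spark or gRPC involved; the loopback port probed here is illustrative):

```python
import errno
import socket

def probe(host, port):
    """Try a TCP connect; return None on success or the OS errno on failure."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        try:
            s.connect((host, port))
            return None
        except OSError as e:
            return e.errno

# Pick a loopback port that is almost certainly closed: bind an ephemeral
# port to learn its number, then release it before probing.
with socket.socket() as tmp:
    tmp.bind(("127.0.0.1", 0))
    closed_port = tmp.getsockname()[1]

print(probe("127.0.0.1", closed_port) == errno.ECONNREFUSED)
```

This is the same errno 111 that shows up directly in the tearDownClass failure further down; gRPC maps it to UNAVAILABLE, which the Spark Connect client then wraps in SparkConnectGrpcException.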

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_arrow_map.py



python/pyspark/sql/tests/connect/test_parity_arrow_map.py.test_self_join

<_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-19T23:14:09.975610532+00:00"}"
>
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/test_arrow_map.py", line 144, in test_self_join
    df2 = df1.mapInArrow(lambda iter: iter, "id long")
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 2022, in mapInArrow
    return self._map_partitions(func, schema, PythonEvalType.SQL_MAP_ARROW_ITER_UDF, barrier)
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 2001, in _map_partitions
    child=self._plan, function=udf_obj, cols=self.columns, is_barrier=barrier
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 244, in columns
    return self.schema.names
  File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1738, in schema
    return self._session.client.schema(query)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 930, in schema
    schema = self._analyze(method="schema", plan=plan).schema
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1120, in _analyze
    self._handle_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1501, in _handle_error
    self._handle_rpc_error(error)
  File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1561, in _handle_rpc_error
    raise SparkConnectGrpcException(str(rpc_error)) from None
pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused"
	debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:38125: Failed to connect to remote host: Connection refused {grpc_status:14, created_time:"2023-10-19T23:14:09.975610532+00:00"}"
>

Check failure on line 1 in python/pyspark/sql/tests/connect/test_parity_arrow_map.py



python/pyspark/sql/tests/connect/test_parity_arrow_map.py.tearDownClass (pyspark.sql.tests.connect.test_parity_arrow_map.ArrowMapParityTests)

[Errno 111] Connection refused
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/testing/connectutils.py", line 195, in tearDownClass
    cls.spark.stop()
  File "/__w/spark/spark/python/pyspark/sql/connect/session.py", line 669, in stop
    PySparkSession._activeSession.stop()
  File "/__w/spark/spark/python/pyspark/sql/session.py", line 1804, in stop
    self._jvm.SparkSession.clearDefaultSession()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1712, in __getattr__
    answer = self._gateway_client.send_command(
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1036, in send_command
    connection = self._get_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 284, in _get_connection
    connection = self._create_new_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 291, in _create_new_connection
    connection.connect_to_java_server()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 438, in connect_to_java_server
    self.socket.connect((self.java_address, self.java_port))
ConnectionRefusedError: [Errno 111] Connection refused