SNOW-823151: First result batch has int64 arrow type for NUMBER column with real values #1568
Comments
@tekumara did this happen with previous versions?
I'm not entirely sure.
Hey @tekumara, you are just one step away from getting the result you want. Please let me know if you have any other questions; I'm more than happy to answer.
I still have the same problem using those methods. The MRE above uses `_create_empty_table()` to make it easier to reproduce.
Can you try this?

```python
import pyarrow
import snowflake.connector

conn = snowflake.connector.connect(**CONNECTION_PARAMETERS)
cur = conn.cursor()
cur.execute("SELECT cast(1.5 as NUMBER(32,4)) as AMOUNT;")
batches = cur.get_result_batches()
batch = batches[0]
table = batch.to_arrow()
assert table.schema.types[0] == pyarrow.float64()
```
That example works, but not in the general case, i.e. when I fetch and use
@tekumara Did you ever find out why the first batch always returns empty?
No, I didn't. Are you seeing this too?
Yes, I am seeing the same thing when the connector returns a list of result batches. To work around this, when distributing the batches I skip the first one if it is empty.
@sfc-gh-aling could you reopen this issue please, as it's still happening? 🙏
Thank you all for your feedback on this one! I reopened this issue and we'll take a look.
Hey @sfc-gh-dszmolka, I am experiencing this issue as well when calling `get_result_batches()`. As a workaround, I simply do not process the first batch if it contains 0 rows.
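The skip-empty-batches workaround described in the comments above could be sketched as a small helper. This is a hypothetical function, not part of the connector's API; it assumes each batch exposes a `rowcount` attribute, as `ResultBatch` does in snowflake-connector-python:

```python
from types import SimpleNamespace

def non_empty_batches(batches):
    """Drop batches that report zero rows (e.g. the spurious first batch)
    before distributing them to workers."""
    return [batch for batch in batches if batch.rowcount > 0]

# Stand-in objects for illustration; real code would pass the output of
# cursor.get_result_batches() here instead.
batches = [SimpleNamespace(rowcount=0), SimpleNamespace(rowcount=100)]
filtered = non_empty_batches(batches)
assert [b.rowcount for b in filtered] == [100]
```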
When fetching multiple batches via `get_result_batches()`, the first batch is empty and has the incorrect Arrow data type for NUMBER columns containing real values, e.g. the MRE will fail with an assertion error and output:
snowflake-connector-python==3.0.3