[SPARK-49311][SQL] Make it possible for large 'interval second' values to be cast to decimal

### What changes were proposed in this pull request?

Prior to this PR, `interval second` values whose microsecond representation requires 19 digits could not be cast to decimal. This PR closes that gap.

```
scala> sql("select 1000000000000.000000::interval second").show(false)
+---------------------------------------------+
|CAST(1000000000000.000000 AS INTERVAL SECOND)|
+---------------------------------------------+
|INTERVAL '1000000000000' SECOND              |
+---------------------------------------------+

scala> sql("select 1000000000000.000000::interval second::decimal(38, 10)").show(false)
org.apache.spark.SparkArithmeticException: [NUMERIC_VALUE_OUT_OF_RANGE.WITH_SUGGESTION] 0 cannot be represented as Decimal(18, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error, and return NULL instead. SQLSTATE: 22003
```

### Why are the changes needed?

Without this change, large `interval second` values cannot be cast to decimal even when the target decimal type is wide enough to hold them; the change extends cast coverage to those values.

### Does this PR introduce _any_ user-facing change?

Yes. Previously, users could not cast large second intervals to decimals; now they can (see the illustrative sketch after the commit message).

### How was this patch tested?

Unit test in `IntervalExpressionsSuite`.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#47808 from harshmotw-db/harshmotw-db/interval_decimal_fix.

Authored-by: Harsh Motwani <harsh.motwani@databricks.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
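For reference, a hedged sketch of the expected post-fix behavior. The query and the pre-fix error above come from the commit message; the expected result noted below is an assumption based on the cast semantics (the interval's seconds rendered as a decimal value), not output captured from the patch:

```
scala> // Illustrative only: after this change the cast is expected to succeed and
scala> // yield the interval's seconds as a decimal (roughly 1000000000000.0000000000)
scala> // rather than raising NUMERIC_VALUE_OUT_OF_RANGE.
scala> sql("select 1000000000000.000000::interval second::decimal(38, 10)").show(false)
```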