feat: native support for intervals <-> Neo4j duration #796
base: 5.0
Conversation
class SparkToNeo4jDataConverter extends DataConverter[Value] {

  override def convert(value: Any, dataType: DataType): Value = {
    dataType match {
Looking into how to rewrite this to still match on value (and incorporate the suggestion below).
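As a hedged sketch of the reviewer's suggestion (not the connector's actual code, and with assumed method names), matching on the runtime value rather than on the `DataType` could look like this:

```scala
// Hypothetical sketch: dispatch on the value's runtime type instead of the
// declared DataType. Day-time intervals are assumed to surface as
// java.time.Duration and year-month intervals as java.time.Period.
def convertByValue(value: Any): String = value match {
  case d: java.time.Duration => d.toString // e.g. "PT1M30S"
  case p: java.time.Period   => p.toString // e.g. "P1Y2M"
  case other                 => String.valueOf(other)
}
```

The real `convert` would produce a driver `Value` rather than a `String`; this only illustrates the match-on-value shape.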
spark-3/src/test/scala/org/neo4j/spark/DataSourceWriterTSE.scala
Force-pushed from c564f4e to b8a5787
This commit introduces a new way to write Spark SQL interval types to the Neo4j duration type. The commit is additive, i.e. the previous method of writing the Neo4j duration type via a custom struct is still possible, so it should be backwards compatible. Fixes CONN-341
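To make the mapping concrete, here is a minimal, self-contained sketch, under the assumption that Spark encodes `DayTimeIntervalType` values as a long count of microseconds and `YearMonthIntervalType` values as an int count of months, while a Neo4j duration is composed of months, days, seconds, and nanoseconds. The case class and function names are illustrative, not the connector's API:

```scala
// Hypothetical model of Neo4j's duration components (months, days, seconds, nanos).
final case class Neo4jDurationParts(months: Long, days: Long, seconds: Long, nanos: Int)

// Day-time interval: assumed to arrive as microseconds; split into seconds + nanos.
def fromDayTimeMicros(micros: Long): Neo4jDurationParts = {
  val seconds = micros / 1000000L
  val nanos   = ((micros % 1000000L) * 1000L).toInt
  Neo4jDurationParts(months = 0L, days = 0L, seconds = seconds, nanos = nanos)
}

// Year-month interval: assumed to arrive as a total month count.
def fromYearMonthMonths(months: Int): Neo4jDurationParts =
  Neo4jDurationParts(months = months.toLong, days = 0L, seconds = 0L, nanos = 0)
```

For example, an interval of 1.5 seconds (1,500,000 microseconds) would map to one second plus 500,000,000 nanoseconds.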
Force-pushed from b8a5787 to 078fcf0
class DataSourceWriterTSE extends SparkConnectorScalaBaseTSE {
-  val sparkSession = SparkSession.builder().getOrCreate()
+  val sparkSession: SparkSession = SparkSession.builder().getOrCreate()
nit: not sure adding the explicit type really helps readability here
Agree
    assertEquals(expected, records)
  }

  private def writeAndGetInterval(expectedDt: Class[_ <: DataType], sql: String): Value = {
nit: I'd put the caller before the callee (I find it easier to read from top to bottom than to scroll back and forth, but that's just a personal preference).
Can do
  }

  @Test
  def `should write nodes with native Neo4j durations when passed DayTimeIntervalType`(): Unit = {
question: would it be possible to turn all of these tests into a single parameterized test? It would make it easier to check that we have exhaustive coverage of all the interval variants.
question: I can see INTERVAL DAY TO SECOND and INTERVAL YEAR TO MONTH covered, but what about all the others? (They would be easier to add to a parameterized test, by the way.)
INTERVAL YEAR, INTERVAL MONTH, INTERVAL DAY, INTERVAL DAY TO HOUR, INTERVAL DAY TO MINUTE, INTERVAL HOUR, INTERVAL HOUR TO MINUTE, INTERVAL HOUR TO SECOND, INTERVAL MINUTE, INTERVAL MINUTE TO SECOND, INTERVAL SECOND
I looked into parameterized tests in the Scala/JUnit 4 environment, but at that time there were just a few cases. I can look into your suggested intervals and into re-adding a parameterized test.
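A minimal table-driven sketch in plain Scala of the parameterized approach being discussed (the real suite would presumably use JUnit 4's `Parameterized` runner, and `runCase` here is a hypothetical placeholder for the actual write-and-read-back check against Neo4j):

```scala
// All Spark SQL interval variants the reviewer lists, plus the two already covered.
val intervalVariants: Seq[String] = Seq(
  "INTERVAL YEAR", "INTERVAL MONTH", "INTERVAL DAY",
  "INTERVAL DAY TO HOUR", "INTERVAL DAY TO MINUTE", "INTERVAL DAY TO SECOND",
  "INTERVAL HOUR", "INTERVAL HOUR TO MINUTE", "INTERVAL HOUR TO SECOND",
  "INTERVAL MINUTE", "INTERVAL MINUTE TO SECOND", "INTERVAL SECOND",
  "INTERVAL YEAR TO MONTH"
)

// Placeholder: in the real test this would call something like
// writeAndGetInterval(...) and assert on the returned Neo4j duration.
def runCase(variant: String): Boolean = variant.startsWith("INTERVAL")

// Running every variant through one body makes missing coverage obvious.
val failures = intervalVariants.filterNot(runCase)
```

The benefit over one test method per variant is that adding a new interval type is a one-line change to the table.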