Data import from other sources #484
-
Hey there, I was just wondering whether there is a way to import already existing historic data from another database/backend into QuantumLeap (or e.g. CrateDB). I'm open to suggestions :)
Replies: 1 comment
-
Hi @SBlechmann !
Maybe, but it'll do :-) Well, unless you have lots of data... We don't have an import Web API at the moment, but if you need/want/prefer to use the notify endpoint for importing lots of data, one thing you could try is to split your entity data into batches and then call the notify endpoint once per batch. E.g.
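A minimal sketch of the batching idea, assuming QuantumLeap listens at `http://localhost:8668` and that `fetch_entities()` is a hypothetical generator streaming already-converted NGSI-v2 entity dicts out of the origin DB:

```python
import itertools
import json
import urllib.request

QL_NOTIFY_URL = "http://localhost:8668/v2/notify"  # assumed deployment URL
BATCH_SIZE = 100  # tune to your payload sizes


def batched(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch


def import_entities(entities):
    """POST each batch to the notify endpoint as a separate notification."""
    for batch in batched(entities, BATCH_SIZE):
        payload = json.dumps({"data": batch}).encode("utf-8")
        req = urllib.request.Request(
            QL_NOTIFY_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)


# import_entities(fetch_entities())  # fetch_entities is your origin-DB reader
```

Because `batched` consumes a generator lazily, only one batch of entities sits in memory at a time.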
This is a streaming solution which avoids one notify call per origin DB row and should keep memory usage under control, but I don't think you'll need it if you don't have too much data to import. Also keep in mind QL already splits large payloads into batches (see #450), so a simpler option, if you have say less than 1GB worth of JSON to import, would be to just pack all the entities into a single payload for the notify endpoint.

Oh, I should mention we've got a script to export data from Crate to Timescale that you might be able to tweak for your specific scenario. Another option would be to leverage DB import/export tools---e.g. export to CSV from the origin DB, then import into the target DB---but you'll still have to convert between the origin DB schema and the QuantumLeap schema the target DB expects.

But I think you've already thought about all this, so not very helpful I guess. What are the origin and target DBs in your case? How much data do you have to migrate? How is the data stored in the origin DB? If you can share a bit more about your specific scenario, maybe we'll be able to concoct a plan together.
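For the CSV route, the schema conversion boils down to mapping each exported row to an NGSI-v2 entity. A sketch, assuming a hypothetical export with columns `id,timestamp,temperature` (the column names and the `Sensor` entity type are made up for illustration):

```python
import csv


def row_to_entity(row):
    """Map one origin-DB CSV row to an NGSI-v2 entity dict
    suitable for a notify payload."""
    return {
        "id": row["id"],
        "type": "Sensor",  # assumed entity type
        "temperature": {
            "type": "Number",
            "value": float(row["temperature"]),
        },
        "TimeInstant": {  # attribute QL can use to index time
            "type": "DateTime",
            "value": row["timestamp"],
        },
    }


def read_entities(csv_path):
    """Stream entities out of a CSV export, one row at a time."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            yield row_to_entity(row)
```

You'd then feed `read_entities(...)` into whatever batching/posting logic you use, so the whole pipeline stays streaming end to end.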