Replies: 4 comments 3 replies
-
Interesting issue! I can't think of an elegant solution, but what you could do is find all duplicates, as you are doing now, and then add +1 second to the timestamp of the first duplicate and save, +2 seconds to the second duplicate and save, and so on. I think it would make sense to just fix the existing duplicates as a one-time activity; I'm guessing it would be hard to create new duplicates.
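A minimal sketch of that one-time fix in Emacs Lisp. It works on a plain alist of (NOTE-ID . CREATION-TIME) pairs rather than the real ekg accessors, so `my/dedupe-creation-times` and the pair format are assumptions; saving the adjusted times back into the ekg database is left to the export code.

```elisp
;; Hypothetical helper: given an alist of (NOTE-ID . CREATION-TIME) pairs
;; (times as Unix seconds), bump each duplicate by +1, +2, ... seconds so
;; every note ends up with a unique creation time. Returns a new alist;
;; writing the adjusted times back into the ekg db is up to the caller.
(defun my/dedupe-creation-times (notes)
  (let ((seen (make-hash-table :test #'eql))
        result)
    (dolist (note notes (nreverse result))
      (let ((time (cdr note)))
        ;; Keep incrementing until we land on an unused second.
        (while (gethash time seen)
          (setq time (1+ time)))
        (puthash time t seen)
        (push (cons (car note) time) result)))))

;; Example: two notes created in the same second.
;; (my/dedupe-creation-times '((a . 1700000000) (b . 1700000000) (c . 1700000001)))
;;   => ((a . 1700000000) (b . 1700000001) (c . 1700000002))
```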
-
@ahyatt, could you help debug why the denote/last-export row is getting erased from the ekg db when I try to re-run the export?
Here is a test run in the scratch pad showing the behavior.
-
The problem seems to be something that should affect the logseq export too: when we add the schema, it overwrites anything already stored in the subject. I can reproduce this via an ert test in the triples library. I think this requires a fix in that library, so I'll fix it there. Thanks for bringing this to my attention!
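A hypothetical sketch of that kind of regression test, assuming the documented triples API (`triples-connect`, `triples-add-schema`, `triples-set-type`, `triples-get-type`); the subject, type, and property names are made up, and this may not be the exact reproduction used in the library's test suite.

```elisp
(require 'ert)
(require 'triples)

(ert-deftest my/triples-add-schema-keeps-subject-data ()
  "Adding a schema should not erase data already stored on that subject.
Hypothetical repro sketch; all names here are illustrative."
  (let ((db (triples-connect (make-temp-file "triples-test" nil ".db"))))
    ;; A simple type used to tag arbitrary data.
    (triples-add-schema db 'named '(name :base/unique t :base/type string))
    ;; Store data under the subject `export', which also serves as a type
    ;; name below (schemas are stored under their type's subject).
    (triples-set-type db 'export 'named :name "2023-01-01")
    ;; Adding a schema for the type `export' writes to that same subject;
    ;; it should leave the data stored above intact.
    (triples-add-schema db 'export '(last :base/unique t))
    (should (triples-get-type db 'export 'named))))
```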
-
I've checked in a fix for this in the latest version of triples.
-
I have started working on exporting ekg notes to denote, following the logseq export code. One challenge I am facing is that denote uses the timestamp of note creation to create a unique ID for each note, whereas the ekg database can have two notes with the same creation time if the notes were originally exported from org-roam. I can see duplicate creation times in my notes and am looking for a solution.
Corresponding code
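For context, denote builds its identifier from the creation time at one-second resolution (the `YYYYMMDDTHHMMSS` format), so two notes created within the same second map to the same ID. A minimal sketch of that collision, assuming only the ID format; `my/creation-time-to-denote-id` is a hypothetical helper, not part of ekg or denote:

```elisp
;; Hypothetical helper: derive a denote-style identifier from a note's
;; creation time (an Emacs time value or Unix timestamp).
(defun my/creation-time-to-denote-id (creation-time)
  (format-time-string "%Y%m%dT%H%M%S" creation-time))

;; Two notes created within the same second collide on one identifier:
;; (equal (my/creation-time-to-denote-id 1700000000.2)
;;        (my/creation-time-to-denote-id 1700000000.9))
;;   => t
```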