Optimizing validate_element in datamodel.py for faster loading of large dmx files #16
I had a scenario where I was trying to load a relatively large DMX file using datamodel.py, and it was taking a very long time. I noticed that validate_element performs an O(n) lookup for every element added during load, so I added some sets to make the check O(1), except when validation fails.
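For reference, a minimal sketch of the approach (class and attribute names here are illustrative, not the actual datamodel.py API): keep a set of element IDs alongside the ordered element list, so the duplicate check is a hash lookup instead of a linear scan.

```python
import uuid

class Element:
    """Stand-in for a DMX element; only the id matters for this sketch."""
    def __init__(self):
        self.id = uuid.uuid4()

class DataModel:
    def __init__(self):
        self.elements = []          # ordered list, unchanged
        self._element_ids = set()   # new: O(1) membership index

    def validate_element(self, elem):
        # Fast path: set membership instead of scanning self.elements.
        if elem.id in self._element_ids:
            raise ValueError("duplicate element id: {}".format(elem.id))

    def add_element(self, elem):
        self.validate_element(elem)
        self.elements.append(elem)
        self._element_ids.add(elem.id)  # keep the index in sync
```

The set must be updated everywhere elements are added or removed, otherwise the fast path and the list drift apart.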
Time comparison
In the case of my DMX file (an SFM session, ~57MB), these were the load times before and after the change:
Let me know if you'd like me to make any adjustments.