
Optimizing validate_element in datamodel.py for faster loading of large dmx files #16

Open
wants to merge 1 commit into base: master

Conversation


@Almie commented Dec 20, 2024

I had a scenario where I was trying to load a relatively large DMX file using datamodel.py, and it was taking a very long time. I noticed that validate_element performs an O(n) lookup for every element added during load, so I added sets to make the check O(1) on average; the slower path is only taken when validation fails.
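The change described above can be sketched as follows. This is an illustrative reconstruction, not the actual datamodel.py internals: the class and attribute names (`DataModel`, `elements`, `_seen`) are hypothetical, and the real `validate_element` checks more than simple membership.

```python
class DataModel:
    """Minimal sketch of the list-scan vs. set-lookup trade-off."""

    def __init__(self):
        self.elements = []   # ordered storage, as before the change
        self._seen = set()   # added: O(1) average-case membership cache

    def validate_element(self, elem):
        # Before: `elem in self.elements` scanned the whole list, which
        # is O(n) per element and O(n^2) over a full load.
        # After: set membership is O(1) on average.
        if elem in self._seen:
            raise ValueError("element already present in datamodel")

    def add_element(self, elem):
        self.validate_element(elem)
        self.elements.append(elem)
        self._seen.add(elem)  # keep the cache in sync with the list
```

One caveat of this approach: the cached elements must be hashable, and the set must be updated on every mutation of the underlying list, which is the issue the maintainer raises below.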

Time comparison

In the case of my DMX file (an SFM session, ~57 MB), these were the load times before and after the change:

  • before: 1 hr+ (I left it running for an hour and it still hadn't finished)
  • after: 10 sec

Let me know if you'd like me to make any adjustments.

@Artfunkel
Owner

This is certainly a problem, thank you for identifying it. However, these changes cause problems when modifying a datamodel, because we never remove items from either set. I'll have to think more about a solution.
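The concern above can be illustrated with a small sketch: if an element is removed from the datamodel but its entry stays in the cache set, a later re-add is wrongly rejected as a duplicate. The fix is to update the set on removal as well. Again, all names here (`DataModel`, `elements`, `_seen`) are hypothetical stand-ins for the real datamodel.py structures.

```python
class DataModel:
    """Sketch of keeping the membership cache consistent on removal."""

    def __init__(self):
        self.elements = []
        self._seen = set()

    def add_element(self, elem):
        if elem in self._seen:       # O(1) fast path from the PR
            raise ValueError("element already present in datamodel")
        self.elements.append(elem)
        self._seen.add(elem)

    def remove_element(self, elem):
        self.elements.remove(elem)
        # Without this line the cache retains the removed element and
        # any later re-add of the same element would raise ValueError.
        self._seen.discard(elem)
```

This only addresses the simple add/remove case; a full solution would need to cover every code path that mutates the element collections.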
