Labels
enhancement (New feature or request)
Feature Request
It would be valuable to have the option to slightly modify an existing tensor network and reuse a previously optimized contraction path for the modified network. This would significantly reduce computational cost when experimenting with minor changes to the network structure, rather than re-optimizing the contraction from scratch each time.
Use cases:
- Users working with tensor networks often need to make small changes (such as adjusting a few bonds or reshaping tensors) and would benefit from incremental optimization strategies.
- Reusing contraction paths or cached optimization results for similar networks can speed up research and prototyping.
Impact:
- Improves performance for iterative tensor network development.
- Reduces computational and memory costs.
- Enhances usability for research workflows involving tensor network variants.
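As a rough sketch of the idea, NumPy's `einsum_path` / `einsum` already allow a contraction path to be computed once and passed back in explicitly. The snippet below is only an illustration of the requested workflow, not this library's API: the path is optimized for one network and then reused on a structurally identical network whose bond dimensions have changed, skipping re-optimization. The shapes and the `"greedy"` optimizer choice are arbitrary assumptions for the example.

```python
import numpy as np

# Contraction structure shared by both networks (a simple matrix chain).
eq = "ij,jk,kl->il"

# Optimize the contraction path once, on the original network.
a, b, c = (np.random.rand(8, 8) for _ in range(3))
path, info = np.einsum_path(eq, a, b, c, optimize="greedy")

# After a minor change (resized bond dimensions), reuse the cached path
# directly instead of re-optimizing from scratch.
a2 = np.random.rand(8, 10)
b2 = np.random.rand(10, 12)
c2 = np.random.rand(12, 8)
reused = np.einsum(eq, a2, b2, c2, optimize=path)

# Sanity check against a direct computation.
assert np.allclose(reused, a2 @ b2 @ c2)
```

The feature requested here would generalize this pattern to networks whose *structure* is lightly modified (bonds added or removed, tensors reshaped), where the old path is a good starting point for an incremental re-optimization rather than being reusable verbatim.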