[ENH] Implementation of TS2Vec #2825
base: main
Conversation
Thank you for contributing to …
Testing with soft dependencies also needs to be isolated, e.g. https://github.com/aeon-toolkit/aeon/blob/main/aeon/classification/feature_based/tests/test_tsfresh.py
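As a rough illustration of what "isolated" means here, a minimal sketch of guarding a test on an optional dependency (this is not the aeon test utility itself; the `soft_dep_available` helper and the `torch` dependency name are illustrative assumptions):

```python
import importlib.util

def soft_dep_available(name: str) -> bool:
    """Return True if the optional (soft) dependency can be imported."""
    return importlib.util.find_spec(name) is not None

# Guard the test body so environments without the soft dependency
# skip it instead of failing at import time.
if soft_dep_available("torch"):
    # ... import torch here and run the TS2Vec-specific tests ...
    pass
```

In practice a pytest skip marker keyed on the same check achieves this, so the rest of the test suite runs unchanged when the dependency is absent.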
Thanks for pointing me in the correct direction.
Hi, thanks very much for this. I think we need to have a read of the paper; I'm not familiar with it myself, so apologies if we are a little slow, but if we introduce a new category of collection transformer we need to make sure we understand it. Are there other algorithms in this class that could follow on?
From my initial look, it seems that this transformer outputs a multivariate time series of embeddings: input a collection of shape (n_cases, n_channels, n_timepoints), output (n_cases, n_embeddings, n_timepoints). Is that correct? If so, the output type is not tabular; it is a collection of transformed time series. Have I understood correctly? Also, our estimators generally don't work with tensors. Could you format the output to be a 3D numpy array (n_cases, n_embeddings, n_timepoints)? I'd like to try this for classification; it could be interesting. Ah, having talked to Hadi, the collection output is an option for TS2Vec; the usual output is tabular.
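To make the two output types being discussed concrete, a small NumPy sketch of the shapes involved (the arrays are placeholders, and the pooling step is only one illustrative way to reduce the collection output to a tabular one; it is not necessarily what TS2Vec does internally):

```python
import numpy as np

n_cases, n_channels, n_timepoints, n_embeddings = 4, 3, 100, 320

# Input: a collection of multivariate series.
X = np.zeros((n_cases, n_channels, n_timepoints))

# Collection output: one embedding vector per timepoint, per case,
# i.e. a collection of transformed time series.
Z_series = np.zeros((n_cases, n_embeddings, n_timepoints))

# Tabular output: one embedding vector per case, e.g. obtained by
# average pooling over the time axis (illustrative reduction only).
Z_tabular = Z_series.mean(axis=2)
print(Z_tabular.shape)  # (4, 320)
```

The tabular form is what a downstream classifier over fixed-length feature vectors would consume.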
Waiting on the self-supervised PR to get in, then I will review this. @TonyBagnall, check the self-supervised PR (#2385) if you want to know more about how the structure of the module is going to look for now.
Hi. Just to clarify a few things about dimensions. I specifically fixed the TS2Vec implementation parameter to … Example:

```python
from aeon.transformations.collection.contrastive_based import TS2Vec
from aeon.datasets import load_classification

for dataset in ['BasicMotions', 'Car']:
    print(dataset)
    X, y = load_classification(name=dataset)
    transformer = TS2Vec(device="cuda", output_dim=320)
    print('\tinput:', X.shape, type(X))
    rtrn = transformer.fit_transform(X)
    print('\toutput:', rtrn.shape, type(rtrn))
```

Output:
When #2385 is merged, I can move the model into …
Hi @gasperpetelin, #2385 is merged; it would be great if you could get this in.
Hi @TonyBagnall. I’ve now moved my code to the self_supervised module. However, I’m not sure why some tests are failing. It looks like they are trying to connect to …
Going to make some general comments before getting into the details:
- The setup we're following in self_supervised is that the model should take a network as an input parameter (see what TRILITE does) from the networks module of aeon (aeon/networks).
- The only thing the self-supervised model should do is the training part; nothing related to the neural networks should be mentioned here.
- The same fit mechanism as TRILITE should be followed, meaning callbacks etc. (see the fit function of TRILITE).
- Saving and loading of the best model should be done the same way as in TRILITE.
- The implementation for now supports the tensorflow-keras backend; once the network-related code is removed from the model implementation, the fit function should look like the one in TRILITE. If the model uses a custom loss, it can be implemented as a private function of the class.
- Most important: keep consistency with TRILITE.
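A hypothetical skeleton of what those conventions imply, as a sketch only (all names and parameters here are illustrative assumptions; the actual structure is whatever TRILITE defines in aeon's self_supervised module):

```python
class SelfSupervisedSketch:
    """Illustrative outline: the estimator receives a network object
    from aeon/networks and owns nothing but the training logic."""

    def __init__(self, network=None, n_epochs=100, batch_size=16,
                 callbacks=None, file_path="./", save_best_model=False):
        self.network = network          # built elsewhere, in aeon/networks
        self.n_epochs = n_epochs
        self.batch_size = batch_size
        self.callbacks = callbacks      # keras-style callbacks, as in TRILITE
        self.file_path = file_path
        self.save_best_model = save_best_model

    def _loss(self, y_true, y_pred):
        # A custom loss lives here as a private method of the class.
        raise NotImplementedError

    def fit(self, X):
        # Sketch of the TRILITE-style fit flow:
        # 1. ask self.network to build the keras model for X's shape
        # 2. compile it with self._loss and attach the callbacks
        # 3. train, checkpoint the best model to file_path, reload it
        return self
```

The point of the split is that a PyTorch-to-Keras port only has to reproduce the network definition in aeon/networks and this training loop; nothing else from the original codebase leaks into the estimator.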
@hadifawaz1999 I just have 3 questions regarding the implementation: …
The model should be re-implemented in tensorflow-keras for the network and training strategy.
Regarding the TensorFlow reimplementation: I’ll give it a try, but to be honest, I have significantly more experience with PyTorch than with TensorFlow/Keras. Because of that, implementing TS2Vec in TensorFlow is not something I can do easily or quickly. TS2Vec has a non-trivial architecture and training procedure. Additionally, the two frameworks differ in several important aspects (for example, PyTorch's …). Just to confirm, would TS2Vec only be accepted as a TensorFlow/Keras implementation and not as a PyTorch version (similar to how HydraTransformer is handled)?
Yes, the implementation should be in tensorflow-keras. The network itself should be easily adapted. I don't really believe a model will be impacted by the initialization of the weights if you had to use another method, because if a model is that sensitive, it is really a bad sign about the model itself. Padding exists in keras as well, so there is no issue there. Anything can be adapted; there is no impossible case. aeon supports the use of torch in other modules, but the deep learning side follows a tensorflow backend, with keras for the networks module.
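On the padding point: the common padding schemes differ only in where the zeros go, so the concept ports directly between frameworks. A hand-rolled, framework-agnostic sketch of "same" vs "causal" padding for a 1-D convolution (illustrative only, not the Keras implementation):

```python
import numpy as np

def conv1d(x, w, padding):
    """Minimal 1-D correlation with two padding schemes."""
    k = len(w)
    if padding == "same":
        # Zeros split across both sides so output length == input length.
        pad = ((k - 1) // 2, k // 2)
    elif padding == "causal":
        # All zeros on the left: output[t] depends only on x[: t + 1].
        pad = (k - 1, 0)
    xp = np.pad(x, pad)
    return np.array([xp[i:i + k] @ w for i in range(len(x))])

x = np.arange(5.0)
w = np.array([1.0, 1.0, 1.0])
print(conv1d(x, w, "causal"))  # [0. 1. 3. 6. 9.]
print(conv1d(x, w, "same"))    # [1. 3. 6. 9. 7.]
```

Keras `Conv1D` exposes both schemes via its `padding` argument, so a PyTorch model that relies on explicit left-padding has a direct equivalent.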
The first approach for implementing TS2Vec is still a work in progress.

Usage: …
Reference Issues/PRs
Implements TS2Vec as discussed in #2753
What does this implement/fix? Explain your changes.
Implements TS2Vec
Does your contribution introduce a new dependency? If yes, which one?
No new dependencies. The code is mostly based on the original author's PyTorch implementation.
Any other comments?
The original code is under the MIT license.
PR checklist
For all contributions
For new estimators and functions
- `__maintainer__` is added at the top of relevant files by contributors who want to be contacted regarding their maintenance. Unmaintained files may be removed. This applies to the full file; do not add yourself if you are just making minor changes or do not want to help maintain its contents.

For developers with write access