Title
Bayes and Bach: An RNN-powered affair
Leaders
Alejandro Tabas (mattermost: @atabas)
Collaborators
No response
Brainhack Global 2024 Event
Brainhack Donostia
Project Description
We want to do a proof-of-concept: test whether artificial recurrent neural networks (RNNs) learn to predict the next tone when trying to perceive music in a noisy environment.
This is an accessible project that does not require extensive coding expertise. Participants who want to contribute code are encouraged to do so in Python, although data retrieval can be done in any other language. Participants who do not wish to contribute code can instead offer a) their musical expertise during data preprocessing and the interpretation of the testing results, or b) their patience and web-browsing skills for manual data retrieval.
We will start by collecting a large set of Bach's compositions. We will transform each composition into a sequence of tokens: each token will be one note or several notes (i.e., a chord) that were meant to be played together in the original composition. For this proof-of-concept we will ignore information about the duration and loudness of each note/chord. For each sequence, we will generate two items: a ground truth (the sequence itself) and a set of observations (the sequence plus some noise that corrupts part of the information).
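As a rough sketch of what this tokenization and corruption step could look like (the chord-token representation, the time binning, and the pitch-replacement noise model below are illustrative assumptions, not decisions the project has made):

```python
import random
from collections import defaultdict

def tokenize(notes, time_resolution=0.05):
    """Group (onset_time, midi_pitch) pairs into chord tokens.

    Notes whose onsets fall into the same time bin are assumed to be
    played together and become a single token (a frozenset of pitches).
    """
    bins = defaultdict(set)
    for onset, pitch in notes:
        bins[round(onset / time_resolution)].add(pitch)
    return [frozenset(bins[k]) for k in sorted(bins)]

def corrupt(tokens, p_flip=0.1, pitch_range=(36, 96)):
    """Return a noisy copy of a token sequence (the 'observations').

    With probability p_flip each pitch in a token is replaced by a
    random pitch; this is just one possible noise model.
    """
    noisy = []
    for token in tokens:
        pitches = set()
        for pitch in token:
            if random.random() < p_flip:
                pitches.add(random.randint(*pitch_range))
            else:
                pitches.add(pitch)
        noisy.append(frozenset(pitches))
    return noisy

# Ground truth is the clean token sequence; observations are the noisy copy.
ground_truth = tokenize([(0.0, 60), (0.0, 64), (0.5, 67), (1.0, 72)])
observations = corrupt(ground_truth)
```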
We will then present the observations to the RNN and train its weights so that, for each of the observation sequences, the RNN is able to recover the ground-truth sequence. In other words, we will train the RNN to perceive the original musical sequence using only the noisy observations.
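A minimal PyTorch training sketch, assuming tokens are encoded as multi-hot vectors over the 128 MIDI pitches and that a single-layer GRU with a linear readout serves as the denoiser (architecture, loss, and hyperparameters are placeholders):

```python
import torch
import torch.nn as nn

N_PITCHES = 128  # tokens encoded as multi-hot vectors over MIDI pitches

class Denoiser(nn.Module):
    """GRU that maps a noisy token sequence to the clean sequence."""
    def __init__(self, hidden_size=256):
        super().__init__()
        self.rnn = nn.GRU(N_PITCHES, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, N_PITCHES)

    def forward(self, noisy):                 # noisy: (batch, time, N_PITCHES)
        hidden, _ = self.rnn(noisy)           # hidden: (batch, time, hidden_size)
        return self.readout(hidden), hidden   # logits for the clean tokens

model = Denoiser()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Dummy batch standing in for the real (observations, ground_truth) pairs.
observations = torch.rand(8, 100, N_PITCHES).round()
ground_truth = torch.rand(8, 100, N_PITCHES).round()

for epoch in range(10):
    optimizer.zero_grad()
    logits, _ = model(observations)
    loss = loss_fn(logits, ground_truth)     # reconstruct the clean sequence
    loss.backward()
    optimizer.step()
```

The multi-hot encoding with a per-pitch binary cross-entropy loss is one simple way to handle tokens that contain a variable number of notes (chords); other encodings are equally possible.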
After training, we will test whether the RNN encodes, in its internal states, a set of predictions about the next token. If that is the case, we will conclude that the RNN actively predicts the next item as part of how it implements perception.
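One possible way to run this test is a linear probe on the hidden states: train a readout to decode the clean token at time t+1 from the hidden state at time t, and compare its held-out accuracy against chance. The sketch below is only an illustration of the idea and reuses the `model`, `observations`, `ground_truth`, `loss_fn`, and `N_PITCHES` names from the training sketch above:

```python
# Linear probe: can the hidden state at time t decode the *next* clean token?
probe = nn.Linear(256, N_PITCHES)
probe_optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)

with torch.no_grad():
    _, hidden = model(observations)          # (batch, time, hidden_size)

# Align the hidden state at time t with the ground truth at time t + 1.
inputs = hidden[:, :-1, :].reshape(-1, 256)
targets = ground_truth[:, 1:, :].reshape(-1, N_PITCHES)

for epoch in range(10):
    probe_optimizer.zero_grad()
    loss = loss_fn(probe(inputs), targets)
    loss.backward()
    probe_optimizer.step()

# If the probe predicts the next token well above chance on held-out data,
# the hidden states carry information about upcoming tokens.
```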
The results will help us generate hypotheses on how the brain decodes music. The neuroscientific community will be able to use these hypotheses to inform the design of experiments on real brains.
More details and resources are available in the GitHub repository.
Link to project repository/sources
https://github.com/qtabs/BayesPlusBach/
https://mattermost.brainhack.org/brainhack/channels/bh-donostia-2024--bayes-3-bach
Goals for Brainhack Global
Milestones:
Good first issues
Communication channels
https://mattermost.brainhack.org/brainhack/channels/bh-donostia-2024--bayes-3-bach
Skills
Python: all levels
Onboarding documentation
https://github.com/qtabs/BayesPlusBach/blob/main/CONTRIBUTING.md
What will participants learn?
Web scraping with Python
Training RNNs with PyTorch
Data to use
Resources on MIDI files: This website could be a good place to start. This other website has a lot of links with potentially better-quality MIDI files. The better the quality of the MIDI, the easier the work of the preprocessor(s)!
Resources on data scrapers: I used this tutorial when I needed to implement a scraper in the past. This other tutorial also looks informative and easy to follow.
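As an illustration of the scraping step, here is a minimal sketch that downloads every MIDI link from an index page using requests and BeautifulSoup; the `INDEX_URL` below is a placeholder and should be replaced with one of the websites listed above:

```python
import os
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

INDEX_URL = "https://example.com/bach-midi/"   # placeholder: replace with a real index page
OUT_DIR = "midi"
os.makedirs(OUT_DIR, exist_ok=True)

page = requests.get(INDEX_URL, timeout=30)
soup = BeautifulSoup(page.text, "html.parser")

# Collect every link on the index page that points to a MIDI file.
midi_links = [urljoin(INDEX_URL, a["href"])
              for a in soup.find_all("a", href=True)
              if a["href"].lower().endswith((".mid", ".midi"))]

for link in midi_links:
    filename = os.path.join(OUT_DIR, link.rsplit("/", 1)[-1])
    with open(filename, "wb") as f:
        f.write(requests.get(link, timeout=30).content)
```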
Number of collaborators
3
Credit to collaborators
Contributors will be listed in the contributors text file.
Image
Type
method_development
Development status
0_concept_no_content
Topic
machine_learning
Tools
other
Programming language
Python
Modalities
behavioral
Git skills
0_no_git_skills
Anything else?
No response
Things to do after the project is submitted and ready to review.
Hi @brainhackorg/project-monitors my project is ready!