Contact information
email: elenazoppellari@gmail.com
LinkedIn: Elena Zoppellari
Hello world! 🐦
I am a Physics graduate who has developed a fascination with artificial intelligence, particularly computer vision and natural language processing.
On this page you will find some of the projects and works I developed during my MSc in Physics of Data @ University of Padova:
🧠 Reproducing Neuron Dynamics with Highly Structured and Trained Chaotic Random RNN Models
This project explores how persistent neural activity (firing that continues after a stimulus is removed) can be reproduced both with three-population structured models (L excitatory, R excitatory and inhibitory neurons) and with chaotic recurrent neural networks (RNNs) trained via FORCE learning.
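The FORCE side can be sketched as recursive least squares (RLS) on the readout of a chaotic rate network, with the readout fed back into the dynamics. Everything below (network size, gain, target signal) is an illustrative placeholder, not the project's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200            # number of recurrent units
g = 1.5            # gain > 1 puts the untrained network in the chaotic regime
dt, tau = 0.1, 1.0
alpha = 1.0        # RLS regularization

J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # recurrent weights
w_fb = rng.uniform(-1, 1, N)                      # feedback from the readout
w_out = np.zeros(N)                               # trained readout weights
P = np.eye(N) / alpha                             # inverse correlation estimate

x = 0.5 * rng.standard_normal(N)                  # network state
T = 2000
t_grid = np.arange(T) * dt
target = np.sin(2 * np.pi * t_grid / 30.0)        # signal to reproduce

errors = []
for t in range(T):
    r = np.tanh(x)
    z = w_out @ r                                 # readout
    # leaky rate dynamics with the readout fed back into the network
    x += dt / tau * (-x + J @ r + w_fb * z)
    # recursive least squares (FORCE) update of the readout weights
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    err = z - target[t]
    w_out -= err * k
    errors.append(err ** 2)

early = float(np.mean(errors[:200]))   # squared error during early training
late = float(np.mean(errors[-200:]))   # squared error after training
```

The RLS update drives the readout error down within a few hundred steps, which is the hallmark of FORCE learning.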
🌊 Reports for Numerical Methods in Soft Matter
Topics: Monte Carlo and Molecular Dynamics simulation methods, including random sampling techniques, Markov Chains, Ising model simulations, advanced sampling algorithms and Langevin dynamics.
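As an illustration of the Monte Carlo side of these topics, a minimal Metropolis sketch for the 2D Ising model; lattice size, temperature and sweep count are arbitrary placeholders:

```python
import numpy as np

rng = np.random.default_rng(42)
L = 16
beta = 0.6                      # inverse temperature (below T_c for the 2D Ising model)
spins = rng.choice([-1, 1], size=(L, L))

def local_field(s, i, j):
    # sum of the four nearest neighbours with periodic boundaries
    return (s[(i + 1) % L, j] + s[(i - 1) % L, j]
            + s[i, (j + 1) % L] + s[i, (j - 1) % L])

def sweep(s):
    # one Monte Carlo sweep = L*L single-spin-flip attempts
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        dE = 2.0 * s[i, j] * local_field(s, i, j)   # energy cost of flipping
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] = -s[i, j]                       # Metropolis acceptance

for _ in range(200):
    sweep(spins)

m = abs(spins.mean())   # magnetization per spin
```

Below the critical temperature the magnetization per spin settles well away from zero, which is the qualitative check these simulations rely on.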
🧠 Estimation and Analysis of Mutual Information in a Recurrent Neural Network
We analyzed the mutual information between the layers of a recurrent neural network trained on a cognitive task, implementing various estimators and, most effectively, adapting the estimator proposed by Kolchinsky et al. (2017). We also studied the role of individual neurons in the process, observing differentiated behaviors in terms of intensity and activation times.
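For context, mutual information can be estimated in many ways; the sketch below is a basic plug-in (histogram) estimator, a much simpler baseline than the Kolchinsky et al. estimator adapted in the project, on invented Gaussian data:

```python
import numpy as np

def mutual_information_binned(x, y, bins=16):
    """Plug-in MI estimate (in nats) from a 2D histogram of samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                          # joint distribution estimate
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(1)
z = rng.standard_normal(50_000)
noise = rng.standard_normal(50_000)
dependent = mutual_information_binned(z, z + 0.1 * noise)   # strongly coupled pair
independent = mutual_information_binned(z, noise)           # independent pair
```

The estimator should report high MI for the coupled pair and near-zero MI for the independent one; the known weakness (binning bias) is one reason the project explored better estimators.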
🎞️ Simulating Real-World Challenges: Blind Face Restoration and Upscaling
We adapted the progressive GAN of Kim et al. (2019) to the reconstruction of blurred faces under non-deterministic conditions. Specifically, we designed an encoder and introduced a dynamic training strategy with an attention loss based on a geometric prior. The results show a significant improvement in the realism of the images, reducing FID, LPIPS and NIQE.
🌴 Exercises for Physical Models for Living Systems
Homework 1-3: Focus on ecology modeling, covering topics such as linear stability analysis, quasi-stationary approximation, and matrix eigenvalue calculations for different ecological structures.
Homework 4-7: Address theoretical neuroscience, involving tasks such as generating spike trains for neurons, stability analysis of the Wilson-Cowan model, and simulations related to feedback loops in genetic networks.
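The linear stability analyses mentioned above boil down to evaluating the Jacobian of the dynamics at a fixed point and checking the sign of its eigenvalues' real parts. A minimal sketch for a two-species Lotka-Volterra competition model (the coefficients are illustrative, not the homework's):

```python
import numpy as np

# dx/dt = x(1 - x - a*y),  dy/dt = y(1 - y - b*x)
a, b = 0.5, 0.5

# coexistence fixed point: solves 1 - x - a*y = 0 and 1 - y - b*x = 0
x_star = (1 - a) / (1 - a * b)
y_star = (1 - b) / (1 - a * b)

# Jacobian of the vector field evaluated at (x*, y*)
J = np.array([
    [1 - 2 * x_star - a * y_star, -a * x_star],
    [-b * y_star, 1 - 2 * y_star - b * x_star],
])
eigvals = np.linalg.eigvals(J)
stable = bool(np.all(eigvals.real < 0))   # all eigenvalues in the left half-plane
```

With a = b = 0.5 the fixed point is (2/3, 2/3) and the eigenvalues come out to -1 and -1/3, so the coexistence equilibrium is linearly stable.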
🏊‍♂️ Mini-Batch K-Means on the RCV1 dataset using Dask
Together with my team, I implemented Mini-Batch K-Means on the RCV1 dataset, over 250 GB with more than 800k articles, each with 50k features, using Dask in Python to parallelize the process across three Virtual Machines on CloudVeneto. The results, evaluated through the Dask dashboard and execution times, guided the tuning of the configuration for optimal performance.
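The core update rule (Sculley, 2010) can be sketched serially in NumPy; the Dask parallelization and the RCV1 data are omitted here, and the two toy blobs and the hand-picked initial centers below are synthetic placeholders:

```python
import numpy as np

def minibatch_kmeans(X, k, batch_size=64, n_iter=100, init=None, seed=0):
    rng = np.random.default_rng(seed)
    if init is None:
        # default: pick k random data points as initial centers
        centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    else:
        centers = np.array(init, dtype=float)
    counts = np.zeros(k)
    for _ in range(n_iter):
        batch = X[rng.choice(len(X), size=batch_size, replace=False)]
        # assign each batch point to its nearest center
        d = ((batch[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for xi, c in zip(batch, labels):
            counts[c] += 1
            eta = 1.0 / counts[c]          # per-center learning rate
            centers[c] = (1 - eta) * centers[c] + eta * xi
    return centers

rng = np.random.default_rng(3)
blob1 = rng.standard_normal((500, 2)) + [5, 5]
blob2 = rng.standard_normal((500, 2)) - [5, 5]
X = np.vstack([blob1, blob2])
centers = minibatch_kmeans(X, k=2, init=[[4.0, 4.0], [-4.0, -4.0]])
```

Each center is a running average of the batch points assigned to it, which is why the algorithm scales to datasets that do not fit in memory: only a batch is touched per step.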
✍️ BiLSTM vs BERT in feature extraction for Neural Dependency Parsing
In this project, my team implemented two feature extractors to improve an ArcEager parser: a BiLSTM, achieving a UAS of 82%, and a fine-tuned BERT, reaching a UAS of 85%, against a SoTA value of 93%. For the BERT model, an analysis of the predicted moves is provided, showing better performance on the shift and reduce moves.
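For context, the arc-eager transition system whose moves the extractors score can be sketched as follows; the move sequence is a hand-written oracle for a toy sentence, not output of the trained models:

```python
def apply_move(move, stack, buffer, arcs):
    if move == "SHIFT":            # push the next buffer word onto the stack
        stack.append(buffer.pop(0))
    elif move == "LEFT-ARC":       # head is the buffer front, dependent on top of the stack
        arcs.append((buffer[0], stack.pop()))
    elif move == "RIGHT-ARC":      # head on top of the stack, dependent is the buffer front
        arcs.append((stack[-1], buffer[0]))
        stack.append(buffer.pop(0))
    elif move == "REDUCE":         # pop a stack word that already has its head
        stack.pop()

# "They eat fish": 'eat' heads both 'They' (left arc) and 'fish' (right arc)
stack, buffer, arcs = ["ROOT"], ["They", "eat", "fish"], []
for move in ["SHIFT", "LEFT-ARC", "RIGHT-ARC", "RIGHT-ARC"]:
    apply_move(move, stack, buffer, arcs)
# arcs now: eat -> They, ROOT -> eat, eat -> fish
```

At each state, the parser must choose one of these four moves; that choice is what the BiLSTM and BERT features feed, and UAS counts how many words end up with the correct head.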
💰 Financial Mathematics Reports
Topics: pricing a Call Option using the Binomial Model; recovering Implicit Dividends from Option Prices using Call-Put parity and the Box Spread Strategy; computing Option Prices using binomial, Leisen-Reimer and Black-Scholes models; the dependence of the Greek letters on stock value, maturity time, volatility and dividend yield; implied volatility smile, skew and Greek letters; Monte Carlo simulations on Vanilla options and exotic, path-dependent options such as Asian, Lookback and Barrier options.
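A minimal Cox-Ross-Rubinstein binomial pricer for a European call, with illustrative parameters rather than the reports' actual figures:

```python
import numpy as np

def crr_call(S0, K, r, sigma, T, n):
    """European call price on an n-step Cox-Ross-Rubinstein tree."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))       # up factor
    d = 1.0 / u                           # down factor
    p = (np.exp(r * dt) - d) / (u - d)    # risk-neutral up probability
    # terminal stock prices and call payoffs at maturity
    j = np.arange(n + 1)
    ST = S0 * u ** j * d ** (n - j)
    payoff = np.maximum(ST - K, 0.0)
    # step backwards through the tree, discounting expected values
    for _ in range(n):
        payoff = np.exp(-r * dt) * (p * payoff[1:] + (1 - p) * payoff[:-1])
    return float(payoff[0])

price = crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500)
```

As n grows, the binomial price converges to the Black-Scholes value (about 10.45 for these at-the-money parameters), which is the consistency check the reports exploit when comparing the models.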
🎨 Standard GAN vs WGAN for Image Colorization
In this project, my team implemented the conditional GAN pix2pix (Isola et al., 2016) in PyTorch to colorize black-and-white images derived from Tiny-ImageNet-200. The results were compared with those obtained by training with the Wasserstein loss.
🤖 Basic Machine Learning Exercises
Topics: manual implementation of the perceptron; logistic regression, linear regression and Support Vector Machines using scikit-learn; image classification using neural networks.
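A hand-rolled perceptron of the kind covered in the first exercise, run on synthetic linearly separable data (the toy blobs are invented, not the course's dataset):

```python
import numpy as np

def perceptron(X, y, epochs=20):
    """Classic perceptron: update weights only on misclassified points."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):          # labels yi in {-1, +1}
            if yi * (w @ xi + b) <= 0:    # misclassified (or on the boundary)
                w += yi * xi
                b += yi
    return w, b

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(2, 0.5, (50, 2)), rng.normal(-2, 0.5, (50, 2))])
y = np.array([1] * 50 + [-1] * 50)
w, b = perceptron(X, y)
preds = np.sign(X @ w + b)
accuracy = float((preds == y).mean())
```

On linearly separable data the perceptron convergence theorem guarantees a perfect separating hyperplane after finitely many updates.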
💬 Multinomial Naive Bayes for Fake News Classification
We developed a Multinomial Naive Bayes classifier in R for fake news classification with both 2 classes (true/false) and 5 classes, achieving an accuracy of 92.51% in the first case and 23.77% in the second. A reliability test of the predictions was also conducted using Bayes' theorem and the rates of true and false positives and negatives.
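The project itself was written in R; a NumPy sketch of Multinomial Naive Bayes with Laplace smoothing, on an invented toy word-count matrix, illustrates the same mechanics:

```python
import numpy as np

def fit_mnb(X, y, alpha=1.0):
    """Fit class priors and smoothed per-class word log-likelihoods."""
    classes = np.unique(y)
    log_prior = np.log(np.array([(y == c).mean() for c in classes]))
    counts = np.array([X[y == c].sum(axis=0) for c in classes]) + alpha
    log_lik = np.log(counts / counts.sum(axis=1, keepdims=True))
    return classes, log_prior, log_lik

def predict_mnb(model, X):
    classes, log_prior, log_lik = model
    # score(c) = log P(c) + sum_w n_w * log P(w | c)
    scores = X @ log_lik.T + log_prior
    return classes[scores.argmax(axis=1)]

# toy corpus: rows are documents, columns are counts of 3 vocabulary words
X = np.array([[3, 0, 1], [2, 1, 0], [0, 4, 1], [1, 3, 0]])
y = np.array([0, 0, 1, 1])     # 0 = "true", 1 = "fake"
model = fit_mnb(X, y)
preds = predict_mnb(model, X)
```

Laplace smoothing (the `alpha` term) is what keeps unseen words from zeroing out a class's probability, the standard failure mode of unsmoothed Naive Bayes.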
💧 Rainfall-runoff modeling using Deep Learning
Together with the team, I developed an LSTM in Keras (Python) to make predictions on the hydrological basins of the CAMELS dataset. We added an encoder to the model, which outperformed the LSTM alone by leveraging statistical information from the input. The project was supervised by Professor Carlo Albert of the Swiss Federal Institute of Aquatic Science and Technology.
🧮 Exercises in R for the Advanced Statistics course
The exercises cover both basic and advanced statistics topics. Developed in R.
🌌 Hierarchical mergers of binary black holes
Using both plots and a simple machine learning algorithm, a random forest, we determined the features with the greatest impact on the fate of binary black holes in nuclear star clusters, globular clusters and young star clusters: black hole masses, spin magnitudes, escape velocities, total masses of the star clusters and the number of generations involved.