I'm an ML Engineer with 6 years of experience in R&D: Computer Vision and NLP.
Currently focusing on model optimization and building scalable ML / MLOps pipelines.
Learn more about my approach to AI and work experience on my cute little website.
- Simplify the complex: aim to explain challenging concepts in plain language, avoiding the dense jargon often found in academic papers.
- Avoid unnecessary work: before starting a task, assess whether it's truly worth doing and how to approach it as efficiently as possible.
- Organize for success: time spent organizing and maintaining clean code and workflows pays off not just in the long term but immediately.
- Respect technological boundaries: strive to recognize the limits of technology and avoid playing the role of omnipotent creator.
- Stay humble: a little less ego goes a long way toward better results.
- Somatic Marker Hypothesis by Antonio Damasio
- LLM Inference Optimization
- PEFT Method Overview [implementing Adapters in PyTorch]
- Physical Symbol Systems and the Language of Thought
- Building a Transformer (Cross-Attention and MHA Explained)
Python, C++, Wolfram, LaTeX, Git
NumPy, Pandas, Matplotlib, Plotly
PyTorch, Lightning, Hugging Face libraries, OpenCV
MLflow, DVC, Weights & Biases, Hydra, Optuna, Prometheus, Grafana
Flask, FastAPI, Docker, CI/CD, AWS SageMaker, Gradio, Streamlit, vLLM
- Transformer Architectures Course
A deep exploration of transformer-based architectures, including BERT, GPT, T5, and others.





