Home

This release provides a comprehensive guide showcasing the power of the Julia language for solving machine learning tasks, from data processing to deep learning.
The guide is split into two main parts, presented as interactive Jupyter notebooks, each focusing on a key library within the Julia ecosystem.
Part 1: DataFrames.jl
This section consists of six interactive notebooks designed to introduce you to the DataFrames.jl package. It focuses on solving specific problems to demonstrate how the package's functionality can be applied to common data processing challenges.
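To give a taste of what these notebooks cover, here is a minimal, illustrative sketch (not taken from the notebooks themselves; the data and column names are hypothetical) of typical DataFrames.jl operations such as filtering and grouped aggregation:

```julia
using DataFrames
using Statistics: mean

# A small example table (hypothetical data).
df = DataFrame(city = ["Kyiv", "Lviv", "Kyiv", "Odesa"],
               temp = [21.5, 18.0, 23.1, 25.4])

# Keep only the rows with a temperature above 20 degrees.
warm = filter(:temp => t -> t > 20, df)

# Group by city and compute the mean temperature per group.
by_city = combine(groupby(df, :city), :temp => mean => :mean_temp)
```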
Part 2: Flux.jl
This section is dedicated to Flux.jl, the primary deep learning library in Julia. You will learn:
- Core components: `Dense` and `Conv` layers, activation functions (`relu`, `σ`), optimizers (`ADAM`, `Descent`), and loss functions (`crossentropy`) — see the sketch after this list.
- Practical example: Building, training, and evaluating a neural network for MNIST image classification.
- GPU acceleration: Instructions for speeding up computations using `CUDA.jl`.
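As a quick orientation, the snippet below is a minimal sketch (not code from the notebooks; the layer sizes and data are made up) that wires these pieces together: a `Chain` of `Dense` layers with `relu`, `crossentropy` loss, and one optimization step with `Adam` (spelled `ADAM` in older Flux releases):

```julia
using Flux
using Flux.Losses: crossentropy

# Tiny classifier: 784 flattened pixels (e.g. MNIST) -> 32 hidden units -> 10 classes.
model = Chain(
    Dense(784 => 32, relu),
    Dense(32 => 10),
    softmax,
)

# Dummy batch standing in for real MNIST data (hypothetical shapes).
x = rand(Float32, 784, 16)                  # 16 flattened "images"
y = Flux.onehotbatch(rand(0:9, 16), 0:9)    # one-hot encoded labels

loss(m, x, y) = crossentropy(m(x), y)

# One training step with the Adam optimiser.
opt_state = Flux.setup(Adam(1e-3), model)
grads = Flux.gradient(m -> loss(m, x, y), model)
Flux.update!(opt_state, model, grads[1])

# With CUDA.jl installed, `model |> gpu` and `x |> gpu` move computation to the GPU.
```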
To ensure that the notebooks work correctly, please follow these steps:
1. Clone the entire repository:
   `git clone https://github.com/Cartesian-School/Julia-for-Machine-Learning.git`
2. Navigate to the directory where you downloaded the files.
3. Launch Jupyter Notebook from this directory (an optional Julia snippet for this step follows below).
4. Open the notebooks from the section you're interested in and start your journey into machine learning with Julia!
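As an optional alternative for step 3, you can launch Jupyter directly from the Julia REPL via the IJulia package (a convenience sketch, assuming IJulia is installed; it is not a required step):

```julia
# Optional: launch Jupyter Notebook from the Julia REPL (assumes IJulia is installed).
using IJulia
notebook(dir = pwd())   # opens Jupyter in the current directory
```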
🔧 Versions and Compatibility
The DataFrames.jl section has been tested with Julia 1.11.3 and DataFrames.jl 1.7.0.
The Flux.jl section is compatible with the latest versions of Flux.jl and related packages.
You can check older commits and tags in this repository to find versions of the guide compatible with earlier package versions.
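If you want to reproduce the tested environment exactly, one option (a sketch; adjust the version as needed) is to pin DataFrames.jl via the Julia package manager:

```julia
# Pin the DataFrames.jl release the notebooks were tested with.
using Pkg
Pkg.add(name = "DataFrames", version = "1.7.0")
```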
We hope this guide will be useful for the community. Thank you for your interest!