What if machine learning felt effortless?
etsi-etna is a minimalistic, dependency-light neural network library
designed to make training and evaluating models on structured data fast,
interpretable, and beginner-friendly. It focuses on auto-preprocessing,
simple linear networks, and core metrics, making it ideal for research
prototyping, learning, and quick deployments.
Features • Installation • MVP Demo • Quickstart • Experiment Tracking
Machine learning libraries often force a trade-off: simplicity or speed. Etna removes that barrier.

- Blazing Fast: The heavy lifting (linear layers, ReLU, softmax, backpropagation) is handled by a highly optimized Rust core (`etna_core`).
- Pythonic API: Users interact with a familiar, scikit-learn-like Python interface.
- Arbitrary Depth: Etna now supports sequential multi-layer architectures.
- Stateful Optimization: Advanced optimizers like Adam maintain momentum and moment estimates across training sessions.
- Sequential Multi-Layer Architecture: Define any number of hidden layers using a simple list (e.g., `[64, 32, 16]`).
- Hybrid Architecture: `pyo3` bindings bridge Python ease with Rust performance.
- Auto-Preprocessing: Automatic scaling and categorical encoding based on column types.
- Persistent Training: Save and load models while preserving weight values and optimizer states.
- Flexible MLflow Tracking: Single-line experiment tracking that is safely optional for local-only use.
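To make the Auto-Preprocessing bullet concrete: for tabular data this usually means standard-scaling numeric columns and one-hot encoding categorical ones. The sketch below is purely illustrative (hypothetical function and column names, not Etna's internal code):

```python
# Hypothetical sketch of tabular auto-preprocessing: standard-scale numeric
# columns, one-hot encode categorical ones. NOT Etna's internal code.

def auto_preprocess(rows, columns):
    """rows: list of dicts; columns: {name: "numeric" | "categorical"}."""
    out = [dict() for _ in rows]
    for col, kind in columns.items():
        values = [r[col] for r in rows]
        if kind == "numeric":
            mean = sum(values) / len(values)
            var = sum((v - mean) ** 2 for v in values) / len(values)
            std = var ** 0.5 or 1.0  # guard against zero variance
            for o, v in zip(out, values):
                o[col] = (v - mean) / std
        else:  # categorical -> one-hot columns like "species=setosa"
            cats = sorted(set(values))
            for o, v in zip(out, values):
                for c in cats:
                    o[f"{col}={c}"] = 1.0 if v == c else 0.0
    return out

rows = [
    {"sepal_len": 5.1, "species": "setosa"},
    {"sepal_len": 7.0, "species": "versicolor"},
]
processed = auto_preprocess(
    rows, {"sepal_len": "numeric", "species": "categorical"}
)
```

In Etna itself this happens automatically based on the CSV's column types; the sketch just shows the kind of transformation involved.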
- Python (3.8 or later)
- Rust (1.70 or later)
Etna uses maturin to build the Rust extensions.
- Clone the repository

  ```bash
  git clone https://github.com/etsi-ai/etna.git
  cd etna
  ```

- Set up a Virtual Environment (Recommended)

  ```bash
  python -m venv .venv
  # Activate the environment
  source .venv/bin/activate   # Linux/macOS
  # .venv\Scripts\activate    # Windows
  ```

- Install dependencies & build

  ```bash
  # Install build tools
  pip install -r requirements.txt
  # Build and install locally
  maturin develop --release
  ```
The best way to see Etna in action is to run our interactive MVP notebook. This notebook verifies your installation by performing an end-to-end test of the entire system.
It will automatically:
- Generate Dummy Data: Creates synthetic datasets for both classification and regression.
- Train Models: Trains the Rust backend on both tasks.
- Track Experiments: Logs loss curves and artifacts to a local MLflow server.
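The dummy-data step can be pictured with a toy generator like the one below (purely illustrative; the function names and distributions are assumptions, not the notebook's actual code):

```python
import random

# Hypothetical sketch of synthetic data like the MVP notebook's: a toy
# classification set (two Gaussian blobs) and a toy regression set
# (a noisy linear relation). Not the notebook's actual generator.

random.seed(0)

def make_classification(n=100):
    rows = []
    for _ in range(n):
        label = random.choice([0, 1])
        center = 0.0 if label == 0 else 3.0      # blob centers per class
        x = [random.gauss(center, 1.0), random.gauss(center, 1.0)]
        rows.append((x, label))
    return rows

def make_regression(n=100, slope=2.0, intercept=1.0):
    rows = []
    for _ in range(n):
        x = random.uniform(-5, 5)
        y = slope * x + intercept + random.gauss(0, 0.1)  # small noise
        rows.append((x, y))
    return rows

clf_data = make_classification()
reg_data = make_regression()
```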
To run it:
```bash
jupyter notebook mvp_testing.ipynb
```

If you prefer to start coding immediately, here are the basics:
- Classification with a Deep Architecture

Etna now supports any network depth. The example below initializes a four-layer network (three hidden layers plus the output layer).
```python
from etna import Model

# Initialize model with 3 hidden layers: 64 -> 32 -> 16
model = Model(
    file_path="iris.csv",
    target="species",
    hidden_layers=[64, 32, 16],
    activation="leaky_relu"
)

# Train with Adam optimizer
model.train(epochs=100, lr=0.01, optimizer="adam")

# Predict original class labels
predictions = model.predict()
```

- Regression & Incremental Training

Because optimizer states are persistent, you can resume training smoothly.
```python
model = Model("housing.csv", target="price", task_type="regression")

# First training phase
model.train(epochs=50)

# Resume training: Adam momentum is preserved!
model.train(epochs=50)
```

Etna includes native MLflow integration. To use it, simply provide your tracking URI when saving.
```python
# Save locally AND log to MLflow in one step
model.save_model(
    path="my_model_v1.json",
    run_name="Deep_Run_01",
    mlflow_tracking_uri="http://localhost:5000"
)
```

What happens automatically:
- Model artifact saved to `my_model_v1.json`
- Parameters (`task_type`, `target`) logged to MLflow
- Training loss history logged as metrics
- Artifacts uploaded to the MLflow run
View your dashboard by running `mlflow ui` in your terminal and visiting http://localhost:5000.
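To make the "optimizer states are preserved" behavior concrete, here is a toy one-parameter Adam in plain Python (a hypothetical sketch, not Etna's Rust implementation) showing how the moment estimates `m`, `v`, and step counter `t` carry across training phases:

```python
# Toy 1-D Adam illustrating stateful optimization (illustrative sketch,
# not Etna's Rust core): the moment estimates survive between "sessions",
# so resumed training continues smoothly instead of restarting cold.

class ToyAdam:
    def __init__(self, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = 0.0   # first moment (running mean of gradients)
        self.v = 0.0   # second moment (running mean of squared gradients)
        self.t = 0     # step counter, also part of the persisted state

    def step(self, param, grad):
        self.t += 1
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad ** 2
        m_hat = self.m / (1 - self.b1 ** self.t)   # bias correction
        v_hat = self.v / (1 - self.b2 ** self.t)
        return param - self.lr * m_hat / (v_hat ** 0.5 + self.eps)

# Minimize f(x) = x^2 (gradient 2x) in two "phases" sharing one optimizer.
opt = ToyAdam()
x = 5.0
for _ in range(50):          # first training phase
    x = opt.step(x, 2 * x)
for _ in range(50):          # resumed phase: m, v, and t are preserved
    x = opt.step(x, 2 * x)
```

Because `opt` keeps its state between the two loops, the second phase continues the established momentum rather than restarting from zero, which is the behavior Etna's persistent training preserves across save/load.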
Pull requests are welcome!
Please refer to CONTRIBUTING.md and CODE_OF_CONDUCT.md before submitting a pull request.
Connect with the etsi.ai team and other contributors on our Discord.
This project is distributed under the BSD-2-Clause License. See the LICENSE for details.
Built with ❤️ by etsi.ai