```
███████╗██╗████████╗██████╗  █████╗ ████████╗
██╔════╝██║╚══██╔══╝██╔══██╗██╔══██╗╚══██╔══╝
█████╗  ██║   ██║   ██████╔╝███████║   ██║
██╔══╝  ██║   ██║   ██╔══██╗██╔══██║   ██║
██║     ██║   ██║   ██║  ██║██║  ██║   ██║
╚═╝     ╚═╝   ╚═╝   ╚═╝  ╚═╝╚═╝  ╚═╝   ╚═╝
```
A brain that fires, learns, dreams, and evolves.
Pure C99. No frameworks. No matrices. No GPU required. Just biology, distilled into 4399 lines of code.
Fıtrat (Turkish: innate nature) is a biologically accurate spiking neural network that models the brain at the individual neuron level. Every neuron has DNA. Every synapse has a weight learned through spike timing. The network sleeps, dreams, and evolves through natural selection. It learns language through pure neural dynamics: no backpropagation, no loss functions, no gradient descent.
Most "neural networks" are matrix multipliers wearing a neuroscience costume. FΔ±trat is not.
| Feature | Traditional NN | Fıtrat |
|---|---|---|
| Processing unit | Float vector | Individual spiking neuron (8 bytes) |
| Learning rule | Backpropagation | Spike-Timing-Dependent Plasticity (STDP) |
| Connectivity | Dense layers | 3D spatial hash: neurons connect to neighbors |
| Position encoding | Positional embeddings | Morton Z-order curves: position IS the ID |
| Memory | Weight matrices | Bloom-filtered synapse table (1GB filter, ~99% I/O reduction) |
| Sleep | ✗ | ✓ REM replay + Tafakkur (contemplation) phases |
| Evolution | ✗ | ✓ Neurons reproduce, mutate, and die |
| Neuromodulation | ✗ | ✓ Valence · Arousal · Curiosity signals |
| Dependencies | PyTorch, CUDA, 50GB+ | gcc and libc. That's it. |
```
┌─────────────────────────────────────────────────────────────┐
│                   F I T R A T   B R A I N                   │
│                                                             │
│   STIMULUS      ┌─────────┐  ┌──────────┐  ┌─────────┐      │   RESPONSE
│   "selam" ────► │  INPUT  │─►│  HIDDEN  │─►│ OUTPUT  │ ─────┼─► "aleykum selam"
│                 │  Z=0    │  │  Z=1-254 │  │  Z=255  │      │
│                 │  layer  │  │  layers  │  │  layer  │      │
│                 └────┬────┘  └────┬─────┘  └────┬────┘      │
│                      ▼            ▼             ▼           │
│   ┌─────────────────────────────────────────────────────┐   │
│   │                  SPATIAL HASH GRID                  │   │
│   │    O(1) neighbor lookup via Morton Z-order          │   │
│   │                  coordinates                        │   │
│   └────────┬────────────────┬───────────────────┬───────┘   │
│            ▼                ▼                   ▼           │
│   ┌────────────┐     ┌────────────┐     ┌─────────────┐     │
│   │    STDP    │     │   BLOOM    │     │   SYNAPSE   │     │
│   │  learning  │     │   FILTER   │     │    TABLE    │     │
│   │            │     │   (1 GB)   │     │  (hashmap)  │     │
│   └────────────┘     └────────────┘     └─────────────┘     │
│                                                             │
│   ┌────────────┐     ┌────────────┐     ┌─────────────┐     │
│   │   SLEEP    │     │   NEURO-   │     │  EVOLUTION  │     │
│   │   CYCLE    │     │  MODULAT.  │     │  birth/die  │     │
│   │  REM+CONT  │     │   V·A·C    │     │  reproduce  │     │
│   └────────────┘     └────────────┘     └─────────────┘     │
└─────────────────────────────────────────────────────────────┘
```
Every simulation tick executes 9 stages in strict order:
```
┌──────┐   ┌───────────┐   ┌──────┐   ┌───────┐   ┌──────────┐
│ FIRE │──►│ PROPAGATE │──►│ STDP │──►│ DECAY │──►│ NEUROMOD │
└──────┘   └───────────┘   └──────┘   └───────┘   └─────┬────┘
                                                        │
┌────────────┐   ┌────────┐   ┌───────────┐   ┌─────────▼────┐
│ CHECKPOINT │◄──│ OUTPUT │◄──│ EVOLUTION │◄──│    SLEEP     │
└────────────┘   └────────┘   └───────────┘   └──────────────┘
```
Each stage is a pure function over the neuron array. No hidden state. No magic.
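The strict ordering of the 9 stages can be sketched as a table of stage functions applied in sequence. This is a minimal sketch with hypothetical function names (the real orchestrator lives in tick.c); the stage bodies here are placeholders that only record the order they ran in.

```c
#include <stddef.h>

typedef struct { unsigned char activation; } Cell;   /* stand-in for Neuron */
typedef void (*Stage)(Cell *net, size_t n);

enum { FIRE, PROPAGATE, STDP, DECAY, NEUROMOD,
       SLEEP, EVOLUTION, OUTPUT, CHECKPOINT, NSTAGES };

static int trace[NSTAGES];   /* records which stage ran when (demo only) */
static int traced = 0;

/* Placeholder stage bodies: each real stage would transform the neuron
 * array in place; here they only log that they ran. */
#define STUB(fn, id) static void fn(Cell *c, size_t n){ (void)c; (void)n; trace[traced++] = (id); }
STUB(s_fire, FIRE)     STUB(s_prop, PROPAGATE) STUB(s_stdp, STDP)
STUB(s_decay, DECAY)   STUB(s_mod, NEUROMOD)   STUB(s_sleep, SLEEP)
STUB(s_evo, EVOLUTION) STUB(s_out, OUTPUT)     STUB(s_ckpt, CHECKPOINT)

static const Stage pipeline[NSTAGES] = {
    s_fire, s_prop, s_stdp, s_decay, s_mod, s_sleep, s_evo, s_out, s_ckpt
};

/* One tick: run all 9 stages over the whole array, in strict order. */
void tick(Cell *net, size_t n)
{
    for (int i = 0; i < NSTAGES; i++)
        pipeline[i](net, n);
}
```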
Every neuron is a packed 8-byte struct, smaller than a pointer on most systems:

```c
typedef struct {
    uint8_t  flags;        // type (excitatory/inhibitory) + alive + fired + plasticity + layer
    uint8_t  activation;   // current membrane potential [0-255] → [0.0-1.0]
    uint8_t  threshold;    // firing threshold [0-255] → [0.0-2.0]
    uint8_t  ema;          // exponential moving average (homeostasis)
    uint8_t  modulator;    // per-neuron plasticity multiplier
    uint8_t  dna_extra;    // mutation_rate(4) + max_inactivity_class(2) + reserved(2)
    uint16_t last_fire_dt; // ticks since last fire (STDP timing window)
} __attribute__((packed)) Neuron;

_Static_assert(sizeof(Neuron) == 8, "Neuron struct must be exactly 8 bytes");
```

Every neuron has DNA. The `dna_extra` field encodes mutation rate and survival traits. When neurons reproduce, offspring inherit traits with mutations. Natural selection operates on the network itself.
Neuron positions aren't stored; they're computed from the ID using Z-order curve encoding. A neuron's ID is its 3D coordinate: X[12 bits] | Y[12 bits] | Z[8 bits]. Decoding costs 3 shifts + 3 masks (~5 ALU cycles). Zero memory overhead.
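That decode can be sketched as below, assuming X occupies the top 12 bits, Y the middle 12, and Z the low 8 (the exact bit order isn't specified here, so treat the layout as an assumption):

```c
#include <stdint.h>

/* Decode a 32-bit neuron ID into 3-D grid coordinates.
 * Assumed layout (from "X[12] | Y[12] | Z[8]"): X high, Y middle, Z low. */
static inline void neuron_pos(uint32_t id, uint32_t *x, uint32_t *y, uint32_t *z)
{
    *x = (id >> 20) & 0xFFFu;   /* 12 bits */
    *y = (id >>  8) & 0xFFFu;   /* 12 bits */
    *z =  id        & 0xFFu;    /*  8 bits */
}

/* Inverse: pack coordinates back into an ID. */
static inline uint32_t neuron_id(uint32_t x, uint32_t y, uint32_t z)
{
    return (x << 20) | (y << 8) | z;
}
```

Because position is derived from the ID, identity and location are the same fact: a neuron can never silently "move".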
Real Hebbian learning: if neuron A fires before neuron B, strengthen A→B. If A fires after B, weaken it. The timing window is ±40 ticks with exponential decay. Learned weights persist in a hash-map synapse table.
99% of synapses never change. A 1GB Bloom filter with 4 hash functions screens lookups before hitting the synapse table. False positive rate: ~0.05%. Result: 99% of propagation uses the default weight with zero table I/O.
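The screening path can be sketched like this. Sizes and the hash mixer are scaled-down illustrations; the project's filter is 1GB and uses the FNV-1a/Murmur3 functions from hash.c.

```c
#include <stdint.h>

#define BLOOM_BITS   (1u << 20)   /* demo size; the real filter is 1 GB */
#define BLOOM_HASHES 4            /* per the README */

static uint8_t bloom[BLOOM_BITS / 8];

/* Derive the i-th bit index for a (pre,post) synapse key.
 * Simple 64-bit mixer, for illustration only. */
static uint32_t bloom_bit(uint64_t key, uint32_t i)
{
    uint64_t h = key ^ (0x9E3779B97F4A7C15ull * (uint64_t)(i + 1));
    h ^= h >> 33; h *= 0xFF51AFD7ED558CCDull; h ^= h >> 33;
    return (uint32_t)(h & (BLOOM_BITS - 1));
}

void bloom_add(uint64_t key)   /* mark a synapse as "has a learned weight" */
{
    for (uint32_t i = 0; i < BLOOM_HASHES; i++) {
        uint32_t b = bloom_bit(key, i);
        bloom[b >> 3] |= (uint8_t)(1u << (b & 7));
    }
}

/* 0 = definitely no learned weight (skip the table, use the default);
 * 1 = maybe present (only now pay for the synapse-table lookup). */
int bloom_maybe(uint64_t key)
{
    for (uint32_t i = 0; i < BLOOM_HASHES; i++) {
        uint32_t b = bloom_bit(key, i);
        if (!(bloom[b >> 3] & (1u << (b & 7))))
            return 0;
    }
    return 1;
}
```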
The network has a circadian rhythm:
- REM Phase: random replay of stored spike patterns. Memory consolidation through re-activation.
- Tafakkur Phase: contemplation. Low-noise introspective processing. The network "thinks" without external input.
Every 10,000 ticks, the grim reaper visits:
- Inactive neurons die (their slot is freed).
- Active neurons with the `can_reproduce` flag spawn offspring with mutated DNA.
- The network literally evolves its own topology.
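Inheritance with mutation can be sketched against the `dna_extra` layout given earlier. Which end of the byte each field occupies isn't specified, so this sketch assumes the mutation rate sits in the low 4 bits; the mutation operator itself is an illustrative guess, not the project's actual one.

```c
#include <stdint.h>
#include <stdlib.h>

/* Offspring DNA: copy the parent's dna_extra byte, then flip one of the
 * six non-reserved bits with probability mutation_rate / 16.
 * Assumed layout: mutation_rate = bits 0-3, max_inactivity_class = bits 4-5,
 * reserved = bits 6-7. */
uint8_t inherit_dna(uint8_t parent_dna)
{
    uint8_t rate  = parent_dna & 0x0Fu;   /* mutation_rate: 0..15 */
    uint8_t child = parent_dna;
    if ((rand() & 0x0F) < rate)
        child ^= (uint8_t)(1u << (rand() % 6));   /* leave reserved bits alone */
    return child;
}
```

A rate of 0 makes a lineage immutable; a rate of 15 mutates almost every generation, so selection can tune its own mutation pressure.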
Global brain-state signals modulate all processing:
- Valence: positive/negative emotional charge
- Arousal: alertness level (affects firing thresholds)
- Curiosity: exploration drive (affects plasticity)
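One way these global signals could gate processing, as a sketch only: the scaling formulas and gains below are illustrative assumptions, not the actual equations in neuromod.c.

```c
#include <stdint.h>

/* Higher arousal lowers the effective firing threshold (more excitable).
 * arousal is a global signal in [0, 1]; the 0.3 gain is made up. */
uint8_t effective_threshold(uint8_t base, float arousal)
{
    float t = (float)base * (1.0f - 0.3f * arousal);
    if (t < 1.0f) t = 1.0f;   /* never let a neuron fire for free */
    return (uint8_t)t;
}

/* Higher curiosity scales up plasticity, so STDP updates get larger.
 * curiosity is a global signal in [0, 1]; the linear gain is made up. */
float effective_plasticity(float dw, float curiosity)
{
    return dw * (1.0f + curiosity);
}
```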
Bidirectional protocol between macrocolumn (float) and spiking (binary) representations. Follows Human Brain Project standards. Enables future multi-scale simulation.
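The float-to-spike direction can be sketched as Bernoulli sampling per tick (a discrete-time Poisson process), with spike counts averaged back into a rate on the return path. This is a sketch of the idea only, not the HBP-conformant code in poisson.c.

```c
#include <stdlib.h>

/* One tick of the bridge: a macrocolumn activity level in [0, 1]
 * becomes a binary spike with that probability. */
int poisson_spike(float rate)
{
    return ((float)rand() / (float)RAND_MAX) < rate;
}

/* Return path: decode a spike train back into a float rate
 * by averaging over a window of n ticks. */
float spikes_to_rate(const int *train, int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += train[i];
    return (float)sum / (float)n;
}
```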
Full brain state serialization. Save 50K+ neurons to disk and restore them exactly. Training can be interrupted and resumed without loss.
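The save/restore round trip can be sketched as below. The header fields, magic value, and file layout here are invented for illustration; the real format lives in checkpoint.c.

```c
#include <stdint.h>
#include <stdio.h>

#define CKPT_MAGIC 0x46495452u   /* "FITR" -- hypothetical magic value */

/* Write [magic, count] then the raw 8-byte-per-neuron array. */
int ckpt_save(const char *path, const uint8_t *neurons, uint32_t count)
{
    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    uint32_t hdr[2] = { CKPT_MAGIC, count };
    int ok = fwrite(hdr, sizeof hdr, 1, f) == 1
          && fwrite(neurons, 8, count, f) == count;
    fclose(f);
    return ok ? 0 : -1;
}

/* Validate the header, then read the array back. Returns the neuron
 * count on success, -1 on any mismatch or I/O error. */
int ckpt_load(const char *path, uint8_t *neurons, uint32_t max_count)
{
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    uint32_t hdr[2];
    if (fread(hdr, sizeof hdr, 1, f) != 1 || hdr[0] != CKPT_MAGIC || hdr[1] > max_count) {
        fclose(f);
        return -1;
    }
    int ok = fread(neurons, 8, hdr[1], f) == hdr[1];
    fclose(f);
    return ok ? (int)hdr[1] : -1;
}
```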
```sh
git clone https://github.com/SovranAMR/fitrat.git
cd fitrat
make
```

Requirements: gcc with C99 support and a POSIX system. No other dependencies.
```sh
./fitrat2
```

Trains through a progressive curriculum, from 2-letter echoes to full Turkish conversations. Watch the neural network learn in real time:
```
[EPOCH 1] selam → selam ✓ (dist=0)
[EPOCH 3] nasilsin → iyiyim ✓ (dist=0)
[EPOCH 5] allah var mi → evet ✓ (dist=0)
```
```sh
./fitrat2 --chat
```

Interactive mode. Type a message, get a neural response. No lookup tables, no pattern matching: pure spike dynamics.
```
YOU:   selam
REPLY: aleykum selam
YOU:   nasilsin
REPLY: iyiyim
YOU:   kimsin
REPLY: ben ali
```
| Metric | Value |
|---|---|
| Total source | ~4399 lines of C99 |
| Source files | 35 (.c + .h) |
| Neuron memory | 8 bytes each |
| External dependencies | 0 |
| Frameworks used | 0 |
| GPU required | No |
| Matrices multiplied | 0 |
| Backpropagation passes | 0 |
| Build time | < 2 seconds |
| Binary size | ~100 KB |
```
src/
├── main.c              # Training loop + interactive chat mode
├── neuron.h            # 8-byte packed neuron struct + DNA flags
├── tick.c / tick.h     # 9-stage tick pipeline orchestrator
├── learning.c / .h     # STDP + spike recording + competitive learning
├── synapse_table.c/.h  # Hash-map persistent synapse weights
├── io.c / io.h         # Text ↔ spike encoding/decoding + populations
├── grid.c / grid.h     # Spatial hash grid for O(1) neighbor lookup
├── morton.c / .h       # Z-order curve position encoding
├── bloom.h             # 1GB Bloom filter for synapse screening
├── evolution.c / .h    # Neuronal birth, death, and reproduction
├── sleep.c / .h        # REM replay + tafakkur (contemplation)
├── neuromod.c / .h     # Valence · Arousal · Curiosity modulation
├── poisson.c / .h      # Macrocolumn ↔ spike bridge (HBP standard)
├── homeostasis.c / .h  # Activity-dependent threshold regulation
├── checkpoint.c / .h   # Full brain state serialization
├── hash.c / .h         # FNV-1a + Murmur3 hash functions
└── Makefile            # Single `make` builds everything
```
"Fıtrat" means innate nature in Turkish: the original disposition that every being is born with.
This project rejects the dominant paradigm of artificial neural networks. Instead of simulating the math that describes intelligence, it simulates the biology that produces it.
There is no loss function to minimize. There is no gradient to descend. There are only neurons β each one unique, each one alive, each one following the same simple rules that real neurons follow:
- If your input exceeds your threshold, fire.
- If you fire together, wire together.
- If you're inactive too long, die.
- If you're active enough, reproduce.
From these four rules, intelligence is not programmed β it emerges.
- Vulkan GPU compute for parallel propagation
- Multi-language curriculum (Turkish → English → Arabic)
- Persistent long-term memory across sessions
- Emotional response modulation
- Telegram bot interface
- Visual cortex → image spike encoding
- Multi-brain communication protocol
Built with bare metal and biological truth.
No frameworks were harmed in the making of this brain.
Created by SovranAMR