# Chaos Classifier: Logistic Map Regime Detection via 1D CNN

This model classifies time series sequences generated by the **logistic map** into one of three dynamical regimes:

- `0` → Stable (converges to a fixed point)
- `1` → Periodic (oscillates between repeating values)
- `2` → Chaotic (irregular, non-repeating behavior)

The goal is to simulate **financial market regimes** using a controlled chaotic system and train a model to learn phase transitions directly from raw sequences.

---

## Motivation

Financial systems often exhibit regime shifts: stable growth, cyclical trends, and chaotic crashes.
This model uses the **logistic map** as a proxy to simulate such transitions and demonstrates how a neural network can classify them.

---

## Data Generation

Sequences are generated from the logistic map equation:

\[
x_{n+1} = r \cdot x_n \cdot (1 - x_n)
\]

Where:
- `x₀ ∈ (0.1, 0.9)` is the initial condition
- `r ∈ [2.5, 4.0]` controls the behavior

Label assignment:
- `r < 3.0` → Stable (label = 0)
- `3.0 ≤ r < 3.57` → Periodic (label = 1)
- `r ≥ 3.57` → Chaotic (label = 2)
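
For reference, the generation loop fits in a few lines of NumPy. The snippet below is a minimal sketch, not the repository's exact code: the function name `generate_dataset`, the seeding, and the uniform sampling of `r` and `x₀` are assumptions; the defaults mirror the 500 sequences of length 100 used for training.

```python
import numpy as np

def generate_dataset(n_sequences=500, seq_length=100, seed=0):
    """Simulate logistic-map sequences and label them 0=Stable, 1=Periodic, 2=Chaotic."""
    rng = np.random.default_rng(seed)
    X = np.empty((n_sequences, seq_length), dtype=np.float32)
    y = np.empty(n_sequences, dtype=np.int64)
    for i in range(n_sequences):
        r = rng.uniform(2.5, 4.0)      # control parameter
        x = rng.uniform(0.1, 0.9)      # initial condition x_0
        for t in range(seq_length):
            X[i, t] = x
            x = r * x * (1.0 - x)      # logistic map update
        y[i] = 0 if r < 3.0 else (1 if r < 3.57 else 2)  # thresholds from the list above
    return X, y
```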

---

## Model Architecture

A **1D Convolutional Neural Network (CNN)** was used:

- `Conv1D → BatchNorm → ReLU` × 2
- `GlobalAvgPool1D`
- `Linear → Softmax (via CrossEntropyLoss)`
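
In PyTorch terms, the stack corresponds roughly to the sketch below. The channel width, kernel size, and padding are assumptions (they are not specified here); only the layer ordering follows the list above.

```python
import torch.nn as nn

class RegimeCNN(nn.Module):
    def __init__(self, n_classes=3, channels=32, kernel_size=5):
        super().__init__()
        self.features = nn.Sequential(
            # Conv1D -> BatchNorm -> ReLU, block 1
            nn.Conv1d(1, channels, kernel_size, padding=kernel_size // 2),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            # Conv1D -> BatchNorm -> ReLU, block 2
            nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(channels, n_classes)

    def forward(self, x):          # x: (batch, 1, seq_length)
        h = self.features(x)       # (batch, channels, seq_length)
        h = h.mean(dim=-1)         # GlobalAvgPool1D over the time axis
        return self.classifier(h)  # logits; softmax is applied inside CrossEntropyLoss
```

Global average pooling keeps the classifier head independent of sequence length, so the same network accepts sequences shorter or longer than 100 steps.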

Advantages of a 1D CNN:

- Captures **local temporal patterns**
- Learns **wave shapes and jitters**
- Parameter-efficient vs. an MLP

---
## Performance

## Performance

Trained on 500 synthetic sequences (length = 100), the model achieved:

- **98–99% test accuracy**
- Smooth convergence
- Robust generalization
- A confusion matrix showing near-perfect stability detection and strong chaos/periodic separation

---
## Inference Example

You can generate a prediction by passing an `r` value:

```python
predict_regime(3.95, model, scaler, device)
# Output: Predicted Regime: Chaotic (Class 2)
```
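
`predict_regime` itself is not listed in this README; a minimal sketch consistent with the call above might look like the following. The fixed initial condition, the sequence length, and the sklearn-style `scaler.transform` call are assumptions.

```python
import numpy as np
import torch

LABELS = {0: "Stable", 1: "Periodic", 2: "Chaotic"}

def predict_regime(r, model, scaler, device, seq_length=100, x0=0.5):
    """Simulate one logistic-map sequence for `r` and classify its regime."""
    # Generate the sequence the same way as the training data
    seq = np.empty(seq_length, dtype=np.float32)
    x = x0
    for t in range(seq_length):
        seq[t] = x
        x = r * x * (1.0 - x)

    # Apply the same scaling used at training time (assumed sklearn-style API)
    seq = scaler.transform(seq.reshape(1, -1)).astype(np.float32)

    # Shape (batch=1, channels=1, seq_length) for the 1D CNN
    inp = torch.from_numpy(seq).unsqueeze(1).to(device)
    model.eval()
    with torch.no_grad():
        pred = model(inp).argmax(dim=1).item()
    print(f"Predicted Regime: {LABELS[pred]} (Class {pred})")
    return pred
```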