1 point by cagdas_ucar 4 hours ago | 1 comment
  • cagdas_ucar 4 hours ago
    Most neural networks have a fixed architecture you design upfront, then train with backprop. This one doesn't. You feed it raw sequential data — stock prices, text, sensor streams — and it creates neurons and patterns as needed, driven entirely by prediction errors.
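    A minimal sketch of what error-driven growth could look like, assuming neurons are context patterns that predict the next symbol (all names and structure here are illustrative, not the project's actual API): when a prediction misses, a new neuron forms one level deeper than the one that failed, capturing the extra context.

```javascript
// Hypothetical sketch: a net that grows pattern-neurons on prediction errors.
// Each neuron maps a context pattern to the symbol it predicts next.
class GrowingNet {
  constructor(maxDepth = 8) {
    this.maxDepth = maxDepth;
    this.neurons = new Map(); // "a|b|c" -> predicted next symbol
  }
  // Length of the longest stored pattern matching the end of the context.
  matchLength(context) {
    for (let len = Math.min(context.length, this.maxDepth); len > 0; len--) {
      if (this.neurons.has(context.slice(-len).join("|"))) return len;
    }
    return 0;
  }
  predict(context) {
    const len = this.matchLength(context);
    return len ? this.neurons.get(context.slice(-len).join("|")) : null;
  }
  // On a miss, form a pattern one level deeper than the one that failed,
  // so the context the failing neuron ignored gets captured.
  observe(context, actual) {
    if (this.predict(context) === actual) return;
    const len = Math.min(context.length, this.matchLength(context) + 1, this.maxDepth);
    if (len > 0) this.neurons.set(context.slice(-len).join("|"), actual);
  }
}

// "ab" is ambiguous in "abcabd", so 3-symbol neurons form to resolve it.
const net = new GrowingNet();
const seq = "abcabdabcabd".split("");
for (let pass = 0; pass < 5; pass++) {
  for (let i = 1; i < seq.length; i++) net.observe(seq.slice(0, i), seq[i]);
}
```

    After a few passes the net predicts `d` after `c,a,b` but `c` after `d,a,b`, which a fixed-length lookup table of pairs could not do.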

    The core idea: every neuron is a prediction machine. When predictions fail, new patterns form at higher levels to capture the context that was missed. Voting across all active neurons produces the final prediction, weighted by abstraction level. No gradient descent, no loss functions.
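    The voting step could be sketched like this, assuming each active neuron casts a vote for its predicted symbol and higher abstraction levels carry more weight (the exponential weighting and the field names are my assumptions, not the project's):

```javascript
// Hedged sketch of level-weighted voting across active neurons.
// activeNeurons: [{ prediction: "up", level: 1 }, ...]
function vote(activeNeurons) {
  const tally = new Map();
  for (const n of activeNeurons) {
    // Assumption: deeper-context neurons get exponentially more say.
    const weight = 2 ** n.level;
    tally.set(n.prediction, (tally.get(n.prediction) || 0) + weight);
  }
  let best = null, bestScore = -Infinity;
  for (const [prediction, score] of tally) {
    if (score > bestScore) { best = prediction; bestScore = score; }
  }
  return best;
}

const winner = vote([
  { prediction: "up", level: 1 },
  { prediction: "up", level: 1 },
  { prediction: "down", level: 3 },
]);
// "down" wins: one level-3 vote (weight 8) beats two level-1 votes (4).
```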

    I've been building this for over a year, testing it on 5 years of stock data across 100 stocks and on text sequences. The stock channel trades profitably through reward-weighted action selection; the text channel memorizes sequences to 100% accuracy in about 5 episodes.
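    One plausible reading of reward-weighted action selection, sketched under my own assumptions (the function names and the proportional-sampling rule are illustrative, not taken from the repo): actions accumulate realized reward, and future choices are sampled in proportion to it.

```javascript
// Hedged sketch: sample actions in proportion to their accumulated reward.
function makeSelector(actions) {
  // Optimistic prior of 1 so every action starts with some probability.
  const reward = Object.fromEntries(actions.map(a => [a, 1]));
  return {
    // rand in [0, 1); injectable for deterministic testing.
    select(rand = Math.random()) {
      const total = actions.reduce((s, a) => s + reward[a], 0);
      let r = rand * total;
      for (const a of actions) {
        r -= reward[a];
        if (r <= 0) return a;
      }
      return actions[actions.length - 1];
    },
    // Feed back realized reward (e.g. trade P&L) for the action taken.
    update(action, r) {
      reward[action] = Math.max(1e-6, reward[action] + r);
    },
  };
}

const sel = makeSelector(["buy", "hold", "sell"]);
sel.update("buy", 10); // "buy" now has weight 11 vs 1 and 1
```

    With that weighting, "buy" is picked roughly 11 times out of 13, while "hold" and "sell" keep a small exploration share.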

    Node.js, Apache 2.0.