A simple implementation of a feedforward neural network using Python and NumPy.
This project demonstrates the network learning the XOR problem from scratch, implementing the forward pass, backpropagation, and gradient descent by hand.
- Input Layer: 2 neurons
- Hidden Layer: 3 neurons, Sigmoid activation
- Output Layer: 1 neuron, Sigmoid activation
- Loss Function: Mean Squared Error
- Optimization: Gradient Descent
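The architecture above can be sketched in a few dozen lines of NumPy. This is a minimal illustration, not the project's exact code: the 2-3-1 layer sizes, sigmoid activations, MSE loss, and plain gradient descent match the description, while the learning rate, epoch count, and weight initialization are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen for reproducibility (assumption)

# XOR dataset: 4 samples, 2 features each
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weight initialization: input(2) -> hidden(3) -> output(1)
W1 = rng.normal(0, 1, (2, 3))
b1 = np.zeros((1, 3))
W2 = rng.normal(0, 1, (3, 1))
b2 = np.zeros((1, 1))

lr = 1.0  # learning rate (assumed, not from the project)
for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)          # hidden activations, shape (4, 3)
    y_hat = sigmoid(h @ W2 + b2)      # predictions, shape (4, 1)
    loss = np.mean((y_hat - y) ** 2)  # Mean Squared Error

    # Backpropagation: chain rule through MSE and the sigmoids
    d_out = 2 * (y_hat - y) / len(X) * y_hat * (1 - y_hat)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0, keepdims=True)
    d_hid = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_hid
    db1 = d_hid.sum(axis=0, keepdims=True)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print("final loss:", loss)
print("predictions:", preds.ravel())
```

Full-batch gradient descent works here because the XOR dataset has only four samples; larger datasets would typically use mini-batches.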
Run the demo with `python neural_network.py`.