Training Neural Networks #4
We're going to talk about how a neural network learns: backpropagation adjusts each neuron's weights based on how much it contributed to the error, and optimization searches for the combination of weights that minimizes that error.
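To make the idea concrete, here is a minimal sketch (not the episode's own code): a single "neuron" with one weight learns to map x to 2x by repeatedly nudging its weight against the gradient of the squared error. The function name `train` and the learning rate are illustrative choices, not anything from the source.

```python
def train(xs, ys, lr=0.1, epochs=100):
    """Fit a single weight w so that w * x approximates y, via gradient descent."""
    w = 0.0  # start with an arbitrary weight
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            pred = w * x                # forward pass: the neuron's prediction
            grad = 2 * (pred - y) * x   # gradient of squared error (pred - y)^2 w.r.t. w
            w -= lr * grad              # adjust the weight to reduce the error
    return w

# The data follows y = 2x, so the learned weight should approach 2.0.
w = train([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

A full network does the same thing for every weight at once, with the chain rule (backpropagation) telling each neuron its share of the error.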