Hand-written if-else rules? Usually very hard.
cats
Machine Learning.
I trained a machine learning model. What is training?
def train(images, labels):
    # machine learning!
    return model

def predict(model, test_images):
    # use model to predict labels
    return test_labels
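To make the `train`/`predict` API concrete, here is a minimal runnable sketch. The choice of a 1-nearest-neighbor "model" is my assumption for illustration (the deck leaves the learning algorithm abstract); training simply memorizes the data.

```python
import numpy as np

def train(images, labels):
    # A 1-nearest-neighbor "model" just memorizes the training set.
    # (Nearest-neighbor is an illustrative choice, not the deck's method.)
    return {"images": np.asarray(images, dtype=float),
            "labels": np.asarray(labels)}

def predict(model, test_images):
    test_labels = []
    for x in np.asarray(test_images, dtype=float):
        # Label each test image with the label of its closest
        # training image under L2 distance.
        dists = np.linalg.norm(model["images"] - x, axis=1)
        test_labels.append(model["labels"][np.argmin(dists)])
    return np.array(test_labels)
```

With two memorized "images", a nearby test point picks up the matching label, e.g. `predict(train([[0, 0], [10, 10]], ["cat", "dog"]), [[1, 1]])` yields `["cat"]`.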
# Vanilla Gradient Descent
weights = random_init()  # e.g., sample a vector from a Gaussian distribution
while True:
    weights_grad = evaluate_gradient(loss_fun, data, weights)
    weights += -step_size * weights_grad  # perform parameter update
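Filling in the pseudocode with a concrete (assumed) loss makes the loop runnable. The sketch below uses mean squared error for a 1-D linear model `y ≈ w * x`, a fixed iteration budget instead of `while True`, and a zero initialization standing in for `random_init()`; all of those choices are mine, not the deck's.

```python
import numpy as np

def evaluate_gradient(data, weights):
    # Gradient of the assumed loss: mean squared error of y ≈ w * x.
    x, y = data
    return np.mean(2 * (x * weights - y) * x)

x = np.array([1.0, 2.0, 3.0])
y = 2.5 * x                  # data generated with true weight 2.5
weights = 0.0                # stand-in for random_init()
step_size = 0.05
for _ in range(200):         # fixed budget instead of `while True`
    weights_grad = evaluate_gradient((x, y), weights)
    weights += -step_size * weights_grad  # perform parameter update
```

Because the loss is a simple quadratic in `w`, each update shrinks the error `w - 2.5` by a constant factor, so `weights` converges to 2.5.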
# Vanilla Stochastic Gradient Descent
weights = random_init()  # e.g., sample a vector from a Gaussian distribution
while True:
    data_batch = sample_training_data(data, 256)  # sample 256 examples
    weights_grad = evaluate_gradient(loss_fun, data_batch, weights)
    weights += -step_size * weights_grad  # perform parameter update
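The only change from vanilla gradient descent is that each step estimates the gradient from a random minibatch. A runnable sketch under the same assumed 1-D least-squares setup as before (dataset, loss, and sampling scheme are my choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_training_data(data, batch_size):
    # Sample `batch_size` examples uniformly at random (with replacement).
    x, y = data
    idx = rng.integers(0, len(x), size=batch_size)
    return x[idx], y[idx]

def evaluate_gradient(data_batch, weights):
    # Gradient of the assumed minibatch loss: MSE of y ≈ w * x.
    x, y = data_batch
    return np.mean(2 * (x * weights - y) * x)

x = rng.normal(size=1000)
y = 2.5 * x                  # data generated with true weight 2.5
weights = 0.0                # stand-in for random_init()
step_size = 0.05
for _ in range(500):         # fixed budget instead of `while True`
    data_batch = sample_training_data((x, y), 256)  # sample 256 examples
    weights_grad = evaluate_gradient(data_batch, weights)
    weights += -step_size * weights_grad  # perform parameter update
```

Each step is cheaper than a full-dataset gradient (256 examples instead of 1000 here, and a far bigger saving on real datasets), at the cost of a noisier gradient estimate.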
Here’s a popular story about momentum [1, 2, 3]: gradient descent is a man walking down a hill. He follows the steepest path downwards; his progress is slow, but steady. Momentum is a heavy ball rolling down the same hill. The added inertia acts both as a smoother and an accelerator, dampening oscillations and causing us to barrel through narrow valleys, small humps and local minima.
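The heavy-ball picture corresponds to a small change in the update rule: keep a running velocity that accumulates past gradients, and step along the velocity rather than the raw gradient. A minimal sketch on an assumed 1-D quadratic loss (the loss, step size, and momentum coefficient are illustrative choices):

```python
def grad(w):
    # Gradient of the assumed quadratic loss f(w) = (w - 3)**2 / 2.
    return w - 3.0

w = 0.0
velocity = 0.0
step_size = 0.1
beta = 0.9  # momentum coefficient: how much "inertia" the ball keeps
for _ in range(300):
    # Velocity blends the previous direction with the new gradient step...
    velocity = beta * velocity - step_size * grad(w)
    # ...and the parameters move along the velocity, not the raw gradient.
    w += velocity
```

With `beta = 0` this reduces exactly to vanilla gradient descent; larger `beta` gives the smoothing and acceleration described above.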