Exercises
Test what we’ve learned about optimization algorithms, SciPy, vectors, and matrices.
Exercise 1: Extending binary search
We learned how to generalize gradient descent and Newton’s method to handle more variables, but what about binary search? Let’s work that out ourselves. Adapt the binary search method we saw in the first lesson of this section so it can handle two variables, then use it on the following problem:
def f(x, y):
    return x + y

def constraint(x, y):
    return x + y > 1

def binary_search(a1, b1, a2, b2, f, cons, tol):
    '''Now we need two intervals, one for each variable.'''
    # Remove the following line and complete the code.
    pass
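One possible way to fill in the skeleton, offered as a sketch rather than the definitive answer: assuming, as in the single-variable version, that f is increasing in each variable and that the feasible region lies on the "upper" side of the constraint boundary, we can shrink both intervals toward that boundary at the same time.

def binary_search(a1, b1, a2, b2, f, cons, tol):
    '''Sketch: shrink both intervals simultaneously, assuming f is
    increasing in each variable and cons holds on the upper side.'''
    while (b1 - a1) > tol or (b2 - a2) > tol:
        m1 = (a1 + b1) / 2
        m2 = (a2 + b2) / 2
        if cons(m1, m2):
            # The midpoint is feasible; smaller values may still satisfy
            # the constraint, so keep the lower halves of both intervals.
            b1, b2 = m1, m2
        else:
            # The midpoint is infeasible; discard the lower halves.
            a1, a2 = m1, m2
    return b1, b2

With an assumed search box of [0, 1] x [0, 1], calling binary_search(0, 1, 0, 1, f, constraint, 1e-6) converges to roughly (0.5, 0.5), where x + y sits right at the constraint boundary.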
Exercise 2: Maximizing
We stated that gradient descent and Newton’s method can easily be adapted to solve maximization problems. But what restrictions must a problem satisfy for these algorithms to maximize it? Change the methods accordingly and solve the following problem with both of them:
import numpy as np

# This would be a gradient ascent instead of a descent
def gradient_ascent(start, gradient, max_iter, learning_rate, tol=0.01):
    # Remove the following line and complete the code
    pass
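A minimal sketch of one way to complete the skeleton: the only change from gradient descent is that we step along the gradient instead of against it. The stopping test on the step size and the example objective below are assumptions for illustration, not part of the original exercise.

import numpy as np

def gradient_ascent(start, gradient, max_iter, learning_rate, tol=0.01):
    # Same loop as gradient descent, but the update adds the scaled
    # gradient instead of subtracting it.
    x = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        step = learning_rate * gradient(x)
        if np.linalg.norm(step) < tol:
            break  # The gradient is nearly zero: a stationary point.
        x = x + step
    return x

# Hypothetical example: maximize f(x) = -(x - 2)**2, whose gradient
# is -2 * (x - 2); the maximum is at x = 2.
print(gradient_ascent([0.0], lambda x: -2 * (x - 2), 1000, 0.1))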
Exercise 3: Implementing linear regression
Linear regression is one of the most popular machine learning algorithms. Given a function and a set of inputs and outputs of that function, we want to approximate it with a linear function as accurately as we can.
We call f the original function and f̂ the approximation we’re looking for. The input is a vector ...
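The statement is cut off above, but as a sketch of where the exercise is headed, here is a minimal least-squares linear regression for vector inputs using NumPy’s lstsq. The function name, the intercept handling, and the example data are illustrative assumptions, not part of the original problem.

import numpy as np

def linear_regression(X, y):
    # X: (n_samples, n_features) matrix of inputs, y: (n_samples,) outputs.
    # Append a column of ones so the model also learns an intercept b,
    # then solve the least-squares problem min ||A @ coef - y||.
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[:-1], coef[-1]  # weights w, intercept b

# Illustrative data: noisy samples of y = 3 * x1 - 2 * x2 + 1.
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(100, 2))
y = X @ np.array([3.0, -2.0]) + 1 + rng.normal(scale=0.1, size=100)
print(linear_regression(X, y))  # approximately ([3, -2], 1)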