
Solutions

Learn how to solve various optimization problems by extending binary search to multiple variables, applying gradient ascent and Newton's method for maximization, implementing linear regression for function approximation, and using ternary search for unimodal functions. This lesson provides hands-on coding examples to deepen your understanding of key optimization algorithms.

Let’s see how to solve the exercises from the previous lessons.

Exercise 1: Extending binary search

In this case, we need to do two binary searches—one for each variable. That’s why it’s difficult to scale this algorithm to solve problems with many variables.

Let’s see the solution code:

Python 3.10.4
def f(x, y):
    return x + y


def constraint(x, y):
    return x + y > 1


def binary_search(a1, b1, a2, b2, f, cons, tol):
    '''
    Now we need two intervals, one for each variable
    '''
    # Outer search narrows the interval [a1, b1] for x
    while a1 < b1 and abs(b1 - a1) > tol:
        m1 = (a1 + b1) / 2
        # Inner search narrows the interval [a2, b2] for y
        while a2 < b2 and abs(b2 - a2) > tol:
            m2 = (a2 + b2) / 2
            if cons(m1, m2):
                # Constraint holds at m2: the boundary is at m2 or below
                b2 = m2
            else:
                # Constraint violated: move the lower end up
                a2 = m2 + tol
        # Use the feasible y found by the inner search to update x
        if cons(m1, b2):
            b1 = m1
        else:
            a1 = m1 + tol
    return b1, b2


x, y = binary_search(-10, 10, -10, 10, f, constraint, 0.1)
print(f"x = {x}, y = {y}")

Feel free to experiment with other functions and constraints.
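For instance, here is one possible experiment, reusing the f and binary_search defined above. The constraint 2x + 3y > 6 is just an illustrative choice, not part of the original exercise; like x + y > 1, it is monotonic in each variable, so the same search applies unchanged.

Python 3.10.4
# Hypothetical experiment: a different monotonic constraint.
# 2*x + 3*y > 6 still becomes "more satisfied" as either
# variable grows, so the nested binary search still works.
def constraint2(x, y):
    return 2 * x + 3 * y > 6

x, y = binary_search(-10, 10, -10, 10, f, constraint2, 0.1)
print(f"x = {x}, y = {y}")  # a point near the boundary 2x + 3y = 6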

In this case, the function should be monotonic with respect to each of its variables. That is, if we fix one of them, the resulting single-variable function should be monotonic.

The constraint should divide the entire (x, y) plane into two regions: one where it holds and one where it doesn't.
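As a quick sanity check, here is a minimal sketch (an assumed helper, not part of the original lesson) that samples a constraint along a fixed x and confirms that feasibility flips at most once as y grows. This single monotone transition is exactly what binary search relies on.

Python 3.10.4
# Minimal sketch (illustrative): for a fixed x, check that the
# constraint changes truth value at most once as y increases.
def flips_once(cons, x, ys):
    values = [cons(x, y) for y in ys]
    # Count transitions between consecutive samples
    changes = sum(values[i] != values[i + 1] for i in range(len(values) - 1))
    return changes <= 1

ys = [y / 10 for y in range(-100, 101)]  # grid over [-10, 10]
print(flips_once(lambda x, y: x + y > 1, 0.5, ys))          # True: monotone
print(flips_once(lambda x, y: x * x + y * y < 4, 0.0, ys))  # False: flips twice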