Solutions
Learn how to solve various optimization problems by extending binary search to multiple variables, applying gradient ascent and Newton's method for maximization, implementing linear regression for function approximation, and using ternary search for unimodal functions. This lesson provides hands-on coding examples to deepen your understanding of key optimization algorithms.
Let’s see how to solve the exercises from the previous lessons.
Exercise 1: Extending binary search
In this case, we need to do two binary searches, one nested inside the other: one for each variable. Since each additional variable requires another level of nesting, the cost multiplies with every dimension, which is why it's difficult to scale this algorithm to problems with many variables.
Let’s see the solution code:
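The original interactive code widget is not reproduced here. As a minimal sketch, assume the task is to maximize a function that is monotonic in each variable, subject to a single feasibility constraint; the function `f(x, y) = x + y` and constraint `x² + y² ≤ 1` below are illustrative choices, not taken from the lesson:

```python
def max_feasible_y(x, g, lo=0.0, hi=1.0, iters=60):
    # Inner binary search: largest y with g(x, y) <= 0.
    # Valid because g is assumed monotonic in y for fixed x.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(x, mid) <= 0:
            lo = mid
        else:
            hi = mid
    return lo

def maximize(f, g, lo=0.0, hi=1.0, iters=60, eps=1e-9):
    # Outer binary search on the sign of a finite-difference slope of
    # h(x) = f(x, y_max(x)); assumes h is unimodal on [lo, hi].
    h = lambda x: f(x, max_feasible_y(x, g))
    for _ in range(iters):
        mid = (lo + hi) / 2
        if h(mid + eps) > h(mid):  # h still increasing at mid
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    return x, max_feasible_y(x, g)

# Illustrative example (an assumption, not the lesson's exercise):
# maximize f(x, y) = x + y subject to x^2 + y^2 <= 1, x, y in [0, 1].
f = lambda x, y: x + y
g = lambda x, y: x * x + y * y - 1
x, y = maximize(f, g)
print(round(x, 4), round(y, 4), round(f(x, y), 4))
```

The inner search exploits monotonicity of the constraint in `y`; the outer search exploits the monotone sign change of the slope, which is itself a binary search.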
Feel free to experiment with other functions and constraints.
In this case, the function should be monotonic with respect to each of the variables. That is, if we fix one of them, then the resulting function should be monotonic.
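The monotonicity requirement can be sanity-checked numerically: fix one variable and verify that the resulting slice is non-decreasing on a sample grid. The functions and the interval `[0, 3]` below are illustrative assumptions, not from the lesson:

```python
import math

def monotone_in_each_var(f, lo=0.0, hi=3.0, steps=50):
    # Sample a grid and check every axis-aligned slice of f.
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    for fixed in xs:
        slice_x = [f(x, fixed) for x in xs]  # vary x, y fixed
        slice_y = [f(fixed, y) for y in xs]  # vary y, x fixed
        if any(a > b for a, b in zip(slice_x, slice_x[1:])):
            return False
        if any(a > b for a, b in zip(slice_y, slice_y[1:])):
            return False
    return True

print(monotone_in_each_var(lambda x, y: x + y))            # True
print(monotone_in_each_var(lambda x, y: math.sin(x) + y))  # False on [0, 3]
```

The second function fails because `sin(x)` rises and then falls on `[0, 3]`, so fixing `y` does not give a monotonic slice.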
The constraint should divide the entire ...