## Q4 2018

Hi,

I don't understand why the first answer is correct (I would have chosen the last one). Where can I find this in the notes/slides?

Thanks

Hey, this question is based on the exercise on fixed-point iteration (https://github.com/epfml/OptML_course/blob/master/labs/ex07/solution/Lab%207%20-%20Fixed%20Point%20with%20Newton.ipynb).

The phrasing is maybe slightly confusing: the question is not about Newton-Raphson, but about fixed-point iteration. In that exercise, we recast the fixed-point problem as an optimization problem and solve it with Newton-Raphson. This question asks you to solve the same optimization problem using gradient descent on that objective instead.
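As a rough sketch of the idea (the map g and the step size here are hypothetical, not necessarily those used in the lab): a fixed point x* with g(x*) = x* is also the minimizer of f(x) = ½(g(x) − x)², so you can run plain gradient descent on f:

```python
import math

# Hypothetical example map: g(x) = cos(x), whose fixed point is ~0.739085.
def g(x):
    return math.cos(x)

def g_prime(x):
    return -math.sin(x)

# Objective f(x) = 0.5 * (g(x) - x)^2; its minimum (value 0) is attained
# exactly at the fixed point. Gradient by the chain rule:
def f_grad(x):
    return (g(x) - x) * (g_prime(x) - 1.0)

x = 1.0    # starting point (arbitrary choice)
lr = 0.5   # step size (assumed; tune as needed)
for _ in range(200):
    x -= lr * f_grad(x)

print(x)  # converges to the fixed point of cos, approx 0.739085
```

Newton-Raphson on the same objective would instead divide the gradient by the second derivative at each step, which is what the lab notebook does.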
