I have this homework assignment, but I don't know how to do it. In class we did it with a for loop:
f = lambda x: (x**2) - 27   # function whose root we want
fd = lambda x: 2 * x        # its derivative

a = 50
for i in range(1, 100):
    a = a - (f(a) / fd(a))
    print(i, ". root in step - ", a)
I think we are asked to turn this into a while loop: the loop should keep running until the delta between successive steps is smaller than the precision of a floating-point number. How do I write that stopping condition?
Demo: https://ideone.com/nAmHYw
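One possible sketch of the while-loop version, assuming "floating-point precision" means machine epsilon scaled to the current value (`sys.float_info.epsilon`); the iteration cap is an extra safety guard I added, not part of the original assignment:

```python
import sys

f = lambda x: (x**2) - 27   # root of x^2 - 27, i.e. sqrt(27)
fd = lambda x: 2 * x        # derivative of f

a = 50.0                    # starting guess, as in the original
prev = None
step = 0
# Iterate until the change between steps drops below the
# floating-point precision at the current magnitude of a.
while prev is None or abs(a - prev) > abs(a) * sys.float_info.epsilon:
    prev = a
    a = a - f(a) / fd(a)    # Newton step
    step += 1
    print(step, ". root in step - ", a)
    if step > 100:          # safety guard against non-convergence
        break
```

Because Newton's method converges quadratically here, the delta shrinks to at most one ulp within a handful of steps, so the loop terminates on its own well before the guard triggers.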