Let f be a twice-differentiable function on [a, b] and let x belong to (a, b). Show that if f'(x) = 0 and f''(x) > 0, then f must have a local minimum at x.
To prove: if "f'(x)=0" and "f''(x)>0", then f has a local minimum at x.
Proof:
Lemma: if a function g is continuous on [a, b], "x\in (a,b)", and "g(x)>k" for some constant real number k, then "g(c)>k" for all c in some open interval containing x. (The function and the constant are renamed g and k here so they do not collide with f and the endpoint b.)
This follows from the ε-δ definition of continuity rather than from the intermediate value theorem; the one-line derivation is displayed below.
The symmetric statement holds for the case "g(x)<k": then "g(c)<k" on some open interval containing x.
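To make the lemma precise (a standard continuity argument; "\varepsilon" and "\delta" are the quantities from the definition of continuity at x): set "\varepsilon = g(x)-k>0" and choose "\delta>0" such that "|g(c)-g(x)|<\varepsilon" whenever "|c-x|<\delta". Then for every c in "(x-\delta,\,x+\delta)",

"g(c) > g(x)-\varepsilon = g(x)-(g(x)-k) = k."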
Now apply the lemma with "g = f''": let "x\in (a,b)" with "f''(x)>0", and assume f'' is continuous at x (the lemma requires this hypothesis). Then "f''(c)>0" for all c in some open interval I containing x.
Since "f''(c)>0" for all c in I and "f'(x)=0", it follows that for all c in I,
"c<x\\Rightarrow f'(c)<0\\\\c>x\\Rightarrow f'(c)>0"
because f' is strictly increasing on I (its derivative f'' is positive there) and "f'(x)=0"; the mean value theorem makes this monotonicity step precise, as shown below.
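Explicitly (a standard mean value theorem computation; "\xi" denotes the intermediate point the theorem supplies): for c in I with "c<x", applying the mean value theorem to f' on [c, x] gives "\xi\in(c,x)" with

"f'(x)-f'(c) = f''(\xi)(x-c) \;\Longrightarrow\; f'(c) = -f''(\xi)(x-c) < 0,"

and the same computation on [x, c] for "c>x" gives "f'(c) = f''(\xi)(c-x) > 0".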
Since "f'<0" on the part of I to the left of x and "f'>0" on the part to the right, f is decreasing before x and increasing after x, so "f(c)\ge f(x)" for every c in I. Hence x is a local minimum of f, by the definition of local minimum. The mirror-image argument, with "f''(x)<0", is the variant showing that x is a local maximum.
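As a quick sanity check (an illustrative example, not part of the problem statement), take "f(x)=x^2" and "x=0": then

"f'(0)=0 \qquad\text{and}\qquad f''(0)=2>0,"

and indeed "f(c)=c^2\ge 0=f(0)" for every c, so 0 is a local (in fact global) minimum. Taking "f(x)=-x^2" instead flips both signs and illustrates the local-maximum variant.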