Let f be a twice differentiable function on [a,b] and let x be a point of (a,b). Show that if f'(x) = 0 and f''(x) > 0, then f must have a local minimum at x.
To prove: if f'(x) = 0 and f''(x) > 0, then x is a local minimum of f.
Proof:
Lemma (sign preservation): if a function f is continuous at x and f(x) > k, where k is some constant real number, then f(c) > k for all c sufficiently near x. This follows from continuity: taking ε = f(x) − k > 0, the values of f within some δ of x stay within ε of f(x), and hence above k. The case f(x) < k is symmetric.
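The sign-preservation step can be made fully precise with the ε–δ definition of continuity; a worked sketch, writing k for the constant lower bound:

```latex
% Sign preservation: if f is continuous at x and f(x) > k,
% then f > k on a neighborhood of x.
Let $\varepsilon = f(x) - k > 0$. By continuity of $f$ at $x$,
there exists $\delta > 0$ such that
\[
  |c - x| < \delta \implies |f(c) - f(x)| < \varepsilon .
\]
For every such $c$,
\[
  f(c) > f(x) - \varepsilon = f(x) - \bigl(f(x) - k\bigr) = k ,
\]
so $f(c) > k$ on $(x - \delta,\, x + \delta)$. The case
$f(x) < k$ follows by applying the same argument to $-f$.
```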
Now let f be twice differentiable on [a, b], with f'' continuous at x, and suppose f''(x) > 0. By the statement above applied to f'' (with k = 0), there is an open interval I containing x on which f''(c) > 0 for every c in I.
Since f'' > 0 on I, the derivative f' is strictly increasing on I. Combined with f'(x) = 0, this gives f'(c) < 0 for c in I with c < x, and f'(c) > 0 for c in I with c > x.
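One standard way to pass from the sign of f' on either side of x to the inequality f(c) > f(x) is the mean value theorem; a sketch of that final step:

```latex
% From the sign of f' on I to a local minimum at x, via the MVT.
Let $c \in I$ with $c \neq x$. By the mean value theorem there is
a $\xi$ strictly between $x$ and $c$ with
\[
  f(c) - f(x) = f'(\xi)\,(c - x).
\]
If $c > x$, then $\xi > x$, so $f'(\xi) > 0$ and $c - x > 0$;
if $c < x$, then $\xi < x$, so $f'(\xi) < 0$ and $c - x < 0$.
In both cases $f'(\xi)(c - x) > 0$, hence $f(c) > f(x)$, and
$x$ is a (strict) local minimum of $f$.
```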
Hence f is decreasing on I to the left of x and increasing on I to the right of x, so f(c) ≥ f(x) for all c in I, and x is a local minimum by definition. The mirrored argument, starting from f''(x) < 0, shows that x is a local maximum.