Let $f$ be a differentiable function on $(\alpha, \beta)$ and let $x_0 \in (\alpha, \beta)$. Show that if $f$ has a local maximum at $x_0$, then $f'(x_0) = 0$.
Suppose "f" has a local maximum at "x_0 \u2208 (\\alpha, \\beta)" . For small (enough) h, "f(x_0 + h) \u2264 f(x0)."
If "h > 0" then
"\\dfrac{f(x_0 + h) \u2212 f(x_0)}{ h} \u2264 0."
Similarly, if "h < 0" , then
"\\dfrac{f(x_0 + h) \u2212 f(x_0)}{ h} \u2265 0."
By elementary properties of the limit, it follows that "f'(x_0) = 0."
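To spell out that last step (a brief elaboration of the limit argument, not part of the original wording): since $f$ is differentiable at $x_0$, the two-sided limit defining $f'(x_0)$ agrees with each one-sided limit, and the inequalities above are preserved in the limit:
$$f'(x_0) = \lim_{h \to 0^+} \dfrac{f(x_0 + h) - f(x_0)}{h} \le 0
\qquad\text{and}\qquad
f'(x_0) = \lim_{h \to 0^-} \dfrac{f(x_0 + h) - f(x_0)}{h} \ge 0,$$
and the only number that is both $\le 0$ and $\ge 0$ is $0$, so $f'(x_0) = 0$.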