Let f be a differentiable function on [α, β] and let x ∈ [α, β]. Show that if f′(x) = 0 and f″(x) > 0, then f must have a local minimum at x.
"f''\\left( x \\right) >0\\Rightarrow f''\\left( t \\right) >0,t\\in \\left( x-\\varepsilon ,x+\\varepsilon \\right) ,\\varepsilon >0\\\\Taylor\u0091s\\,\\,formula\\,\\,for\\,\\,\\varDelta \\in \\left( -\\varepsilon ,\\varepsilon \\right) :\\\\f\\left( x+\\varDelta \\right) =f\\left( x \\right) +f'\\left( x \\right) \\varDelta +\\frac{1}{2}f''\\left( \\xi \\right) \\varDelta ^2,\\xi \\in \\left( x-\\varDelta ,x+\\varDelta \\right) \\subset \\left( x-\\varepsilon ,x+\\varepsilon \\right) \\Rightarrow \\\\\\Rightarrow f\\left( x+\\varDelta \\right) =f\\left( x \\right) +\\frac{1}{2}f''\\left( \\xi \\right) \\varDelta ^2\\geqslant f\\left( x \\right)"
Thus f has a local minimum at x.
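For a concrete illustration (x₀ here is just an example point, not part of the problem), take f(t) = (t − x₀)². Then

$$f'(t) = 2(t - x_0), \qquad f''(t) = 2,$$

so f′(x₀) = 0 and f″(x₀) = 2 > 0, and indeed

$$f(x_0 + \Delta) = f(x_0) + \Delta^2 \ge f(x_0) \quad \text{for every } \Delta,$$

so f has a local (in fact global) minimum at x₀, matching the conclusion above.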