In the temperature range between 0°C and 700°C the resistance R [in ohms] of a certain platinum resistance thermometer is given by
R = 10 + 0.04124T − 1.779 × 10⁻⁵T²
where T is the temperature in degrees Celsius. Where in the interval from 0°C to 700°C is the resistance of the thermometer most sensitive and least sensitive to temperature changes? [Hint: Consider the size of dR/dT in the interval 0 ≤ T ≤ 700.]
Sensitivity measures how strongly the resistance responds to a change in temperature: the larger |dR/dT| is, the greater the change in resistance produced by a given temperature change. So the thermometer is most sensitive where |dR/dT| is largest and least sensitive where it is smallest.

Differentiating,

\frac{dR}{dT} = 0.04124 - 3.558\times10^{-5}\,T

Setting \frac{dR}{dT} = 0 gives T = 0.04124/(3.558\times10^{-5}) \approx 1159\ \degree C, which lies outside the interval 0 ≤ T ≤ 700. Hence dR/dT is positive and strictly decreasing throughout the interval, so its extreme values occur at the endpoints:

\frac{dR}{dT}(0) = 0.04124\ \Omega/\degree C

\frac{dR}{dT}(700) = 0.04124 - 3.558\times10^{-5}\cdot 700 = 0.016334\ \Omega/\degree C

So the resistance of the thermometer is most sensitive at T = 0°C, where dR/dT is largest, and least sensitive at T = 700°C, where dR/dT is smallest.
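As a quick numerical check of the argument above, here is a minimal Python sketch (the function name dR_dT is illustrative, not part of the original problem) that locates the critical point and evaluates the derivative at the endpoints:

```python
# Numerical check: dR/dT for R(T) = 10 + 0.04124*T - 1.779e-5*T**2
# on the interval [0, 700] degrees Celsius.

def dR_dT(T: float) -> float:
    """Derivative of R(T), in ohms per degree Celsius."""
    return 0.04124 - 3.558e-5 * T

# Critical point: where dR/dT = 0.
T_crit = 0.04124 / 3.558e-5
print(f"dR/dT = 0 at T = {T_crit:.1f} C")      # ~1159.1, outside [0, 700]

# Endpoint values: dR/dT is positive and decreasing on [0, 700].
print(f"dR/dT(0)   = {dR_dT(0):.6f} ohm/C")    # 0.041240 -> most sensitive
print(f"dR/dT(700) = {dR_dT(700):.6f} ohm/C")  # 0.016334 -> least sensitive
```

Since the critical point falls outside the interval, comparing the endpoint values alone settles the question.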