In the temperature range between 0°C and 700°C the resistance R [in ohms] of a certain platinum resistance thermometer is given by
R = 10 + 0.04124T − 1.779 × 10^(−5)T^2
where T is the temperature in degrees Celsius. Where in the interval from 0°C to 700°C is the resistance of the thermometer most sensitive and least sensitive to temperature changes? [Hint: Consider the size of dR/dT in the interval 0 ≤ T ≤ 700.]
The sensitivity of the resistance to temperature changes is measured by the derivative $\dfrac{dR}{dT}$.

Differentiating:

$$\dfrac{dR}{dT} = \dfrac{d}{dT}\left(10 + 0.04124T - 1.779 \times 10^{-5}T^2\right) = 0.04124 - 3.558 \times 10^{-5}\,T$$
Since the coefficient of $T$ is negative, $\dfrac{dR}{dT}$ is a decreasing function of $T$.
Note that $\dfrac{dR}{dT} = 0$ only at $T \approx 1159\,^{\circ}\mathrm{C}$, which lies outside the given range. So on $0 \le T \le 700$ the derivative is positive and strictly decreasing, and its extreme values occur at the endpoints.

The resistance is therefore most sensitive at

$$T = 0\,^{\circ}\mathrm{C}, \quad \text{where } \dfrac{dR}{dT} = 0.04124\ \Omega/^{\circ}\mathrm{C},$$

and least sensitive at

$$T = 700\,^{\circ}\mathrm{C}, \quad \text{where } \dfrac{dR}{dT} \approx 0.01633\ \Omega/^{\circ}\mathrm{C}.$$