Let a metallic conductor having a resistance of Ro at 0°C be heated to t°C, and let its resistance at this temperature be Rt. Then, considering normal ranges of temperature, it is found that the increase in resistance (Rt − Ro) depends:
(1) directly on its initial resistance
(2) directly on the rise in temperature
(3) on the nature of the material of the conductor.
Hence, Rt − Ro ∝ Ro × t
or, Rt − Ro = α × Ro × t        … (1)
where α (alpha) is a constant and is known as the temperature coefficient of resistance of the conductor.
Rearranging Eq. (1), we get
α = (Rt − Ro) / (Ro × t)
If Ro = 1 Ω and t = 1°C, then α = Rt − Ro, i.e. the increase in resistance itself.
Hence, the temperature coefficient of a material may be defined as: the increase in resistance per ohm original resistance per °C rise in temperature.
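As a quick worked example of this definition, here is a minimal Python sketch; the function name and the sample resistance values are illustrative assumptions, not taken from the text:

    def temp_coefficient(r0, rt, t):
        # Temperature coefficient from Eq. (1): alpha = (Rt - Ro) / (Ro * t)
        return (rt - r0) / (r0 * t)

    # Illustrative figures: a conductor measuring 10 ohm at 0 deg C
    # and 10.85 ohm at 20 deg C.
    alpha = temp_coefficient(10.0, 10.85, 20.0)
    print(alpha)  # 0.00425 per deg C

The result, 0.00425 per °C, happens to be close to the value commonly quoted for copper referred to 0°C.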
From Eq. (1), we find that
Rt = Ro (1 + α × t)        … (2)
It should be remembered that the above equation holds good for both a rise and a fall in temperature.
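A short sketch of Eq. (2), assuming a copper-like α of 0.00426 per °C and an illustrative Ro, shows that the same formula handles both cases:

    def resistance_at(r0, alpha, t):
        # Eq. (2): Rt = Ro * (1 + alpha * t)
        return r0 * (1 + alpha * t)

    R0 = 10.0        # ohm at 0 deg C (assumed value)
    ALPHA = 0.00426  # per deg C, a copper-like figure

    print(resistance_at(R0, ALPHA, 50))   # rise to  50 deg C -> 12.13 ohm
    print(resistance_at(R0, ALPHA, -50))  # fall to -50 deg C ->  7.87 ohm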
As the temperature of a conductor is decreased, its resistance also decreases. Fig. 1 shows the temperature/resistance graph for copper, which is practically a straight line. If this line is extended backwards, it cuts the temperature axis at a point where the temperature is −234.5°C (a number quite easy to remember). It means that, theoretically, the resistance of a copper conductor would become zero at this point, though in practice, as shown by the solid line, the curve departs from a straight line at very low temperatures. From the two similar triangles of Fig. 1 it is seen that:
Rt / Ro = (234.5 + t) / 234.5 = 1 + t/234.5
i.e., Rt = Ro (1 + α × t), where α = 1/234.5 per °C for copper.
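As a numerical check of this extrapolation, the sketch below (with an assumed Ro of 10 Ω) walks the straight line down to −234.5°C, where the computed resistance is exactly zero:

    ALPHA_CU = 1 / 234.5  # per deg C, from the intercept of the copper line

    def copper_resistance(r0, t):
        # Straight-line estimate: Rt = Ro * (1 + t / 234.5)
        return r0 * (1 + ALPHA_CU * t)

    for t in (20, 0, -100, -200, -234.5):
        print(t, copper_resistance(10.0, t))
    # At t = -234.5 deg C the straight line gives exactly 0 ohm;
    # real copper departs from the line well before this point.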