A resistor's Temperature Coefficient of Resistance (TCR) tells how much its value changes as its temperature changes. It is usually expressed in $ppm/℃$ (parts per million per degree Celsius). What does that really mean?
Let's use an example: Riedon's $50 Ω$ resistor has a (standard) TCR of $20 ppm/℃$. That means its resistance will not change by more than $0.000020$ ohms ($20/1{,}000{,}000$) per ohm per degree Celsius of temperature change (within the rated temperature range of $-55$ to $+145℃$, measured from a $25℃$ room temperature).
Assume our resistor is in a product that heats up from room temperature to $50℃$. To find our $50 Ω$ resistor's (maximum) change caused by that $25℃$ rise, multiply $20 ppm$ ($0.000020$) times $50 Ω$ times $25$ (the temperature change):
$$0.000020 \times 50 \times 25 = 0.025\ Ω.$$
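The same arithmetic in a couple of lines of Python, in case you want to plug in your own numbers (the variable name is just for illustration):

```python
# 50-ohm resistor, 20 ppm/degC TCR, 25 degC rise above room temperature
delta_r = 50 * (20 / 1_000_000) * 25
print(delta_r)  # 0.025 (ohms)
```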
For resistors, a $5000 PPM$ level is considered high, whereas $20 PPM$ is quite low. A $1000 PPM/℃$ characteristic reveals that a $1℃$ change in temperature results in a change in resistance equal to $1000 PPM$, or
$${1000\over 1{,}000{,}000}= {1 \over 1000}$$
of its nameplate value, which is not a significant change for most applications. However, a $10℃$ change results in a change equal to $1/100$ ($1\%$) of its nameplate value, which is becoming significant.
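Working the same ppm arithmetic for the $10℃$ case makes the $1\%$ figure explicit:
$$1000\ PPM/℃ \times 10℃ = 10{,}000\ PPM = {10{,}000 \over 1{,}000{,}000} = {1 \over 100}.$$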
In equation form, the change in resistance is given by
$$\Delta R = {R_{nominal} \over 10^6} (PPM)(\Delta T) $$
where $R_{nominal}$ is the nameplate value of the resistor at room temperature and $\Delta T$ is the change in temperature from that room-temperature reference ($25℃$ in the example above).
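Here is a minimal Python sketch of that equation; the function name and parameter names are illustrative, not from the text or any library:

```python
def delta_resistance(r_nominal_ohms, tcr_ppm_per_degc, delta_t_degc):
    """Change in resistance: (R_nominal / 10^6) * PPM * deltaT."""
    return (r_nominal_ohms / 1_000_000) * tcr_ppm_per_degc * delta_t_degc

# Riedon example: 50 ohms, 20 ppm/degC, 25 degC rise -> 0.025 ohms
print(delta_resistance(50, 20, 25))

# 1000 ppm/degC part, 10 degC rise -> 1% of its nameplate value
r_nominal = 1_000
print(delta_resistance(r_nominal, 1000, 10) / r_nominal)  # 0.01
```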