The luminosity \(L\) (total power output in watts, W) of a star is given by the formula \[ L=L( R,T) =4\pi R^{2}\sigma T^{4} \]
where \(R\) is the radius of the star (in meters), \(T\) is its effective surface temperature (in kelvins, K), and \(\sigma\) is the Stefan–Boltzmann constant. For the Sun, \(L_{s}=3.90\times 10^{26}\;\rm{W}\), \(R_{s}=6.94\times 10^{8}\;\rm{m}\), and \(T_{s}=4800\;\rm{K}\). Suppose that over the next billion years the Sun changes by \(\Delta R_{s}=0.08\times 10^{8}\;\rm{m}\) and \(\Delta T_{s}=100\;\rm{K}\). What will be the resulting percent increase in luminosity?
Taking the total differential of \(L\) gives \[ dL = 8\pi R\sigma T^{4}\,dR + 16\pi R^{2}\sigma T^{3}\,dT = 8\pi\sigma R T^{3}\left(T\,dR + 2R\,dT\right), \] so the relative change in luminosity is \[ \begin{eqnarray*} \frac{\Delta L}{L} &\approx& \frac{dL}{L} = \frac{8\pi\sigma R T^{3}\left(T\,dR + 2R\,dT\right)}{4\pi R^{2}\sigma T^{4}} = 2\,\frac{dR}{R} + 4\,\frac{dT}{T} \approx 2\,\frac{\Delta R}{R} + 4\,\frac{\Delta T}{T}\\[4pt] &=& \frac{2(0.08\times 10^{8})}{6.94\times 10^{8}} + \frac{4(100)}{4800} \approx 0.023 + 0.083 \approx 0.106 \end{eqnarray*} \]
The percent increase in luminosity will be approximately \(10.6\%\).
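As a sanity check, the linear estimate can be compared against the exact change computed directly from the luminosity formula; a minimal Python sketch (the constant value and function names are illustrative, not from the original problem):

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4


def luminosity(R, T):
    """L = 4 * pi * R^2 * sigma * T^4, in watts."""
    return 4 * math.pi * R**2 * SIGMA * T**4


# Values from the problem statement.
R, T = 6.94e8, 4800.0
dR, dT = 0.08e8, 100.0

# Linear (differential) estimate: dL/L = 2 dR/R + 4 dT/T.
linear = 2 * dR / R + 4 * dT / T

# Exact relative change, for comparison with the estimate.
exact = luminosity(R + dR, T + dT) / luminosity(R, T) - 1

print(f"linear estimate: {linear:.4f}")  # ~0.106
print(f"exact change:    {exact:.4f}")   # ~0.111
```

The linear approximation (10.6%) slightly undershoots the exact value (about 11.1%), as expected for a first-order estimate of an increasing convex function.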