Assume that an ordinary mercury-in-glass thermometer follows first-order dynamics with a time constant of $10\:s$. It is initially at a steady-state temperature of $0^{\circ}C$. At time $t=0$, the thermometer is suddenly immersed in a constant-temperature bath at $100^{\circ}C$. The approximate time required (in $s$) for the thermometer to read $95^{\circ}C$ is

  1. $60$
  2. $40$
  3. $30$
  4. $20$
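A quick numerical check, assuming the standard first-order step response $T(t) = 100\,(1 - e^{-t/\tau})$ with $\tau = 10\:s$ (symbol names in the code are illustrative):

```python
import math

tau = 10.0      # time constant, s
T_bath = 100.0  # bath temperature, deg C
T_read = 95.0   # target thermometer reading, deg C

# First-order step response: T(t) = T_bath * (1 - exp(-t/tau))
# Solving for t: exp(-t/tau) = 1 - T_read/T_bath
t = -tau * math.log(1.0 - T_read / T_bath)
print(round(t, 2))  # -> 29.96
```

Since $t = 10\ln 20 \approx 29.96\:s$, the closest option is $30$.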