Level:
Project ID:
1003148602
Accepted:
1
Clonable:
0
Easy:
0
Consider an object thrown at an angle of \( 30^{\circ} \) above the horizontal with an initial velocity of \( 40\,\frac{\mathrm{m}}{\mathrm{s}} \). How long does it take for the object to reach its maximum height?
Note: The height \( y \) of a thrown object is described by the formula \( y=v_0t\sin\alpha-\frac12gt^2 \), where \( v_0 \) is the initial velocity, \( g \) is the gravitational acceleration (use the rounded value \( 10\,\frac{\mathrm{m}}{\mathrm{s}^2} \)), \( t \) is the elapsed time of the motion in seconds, and \( \alpha \) is the angle above the horizontal at which the object is thrown.
\( 2\,\mathrm{s} \)
\( 4\,\mathrm{s} \)
\( 8\,\mathrm{s} \)
\( 1\,\mathrm{s} \)
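Solution sketch, derived from the formula in the note: differentiating \( y \) with respect to \( t \) gives the vertical velocity, which is zero at the maximum height.

\[
\frac{dy}{dt} = v_0\sin\alpha - gt = 0
\quad\Longrightarrow\quad
t = \frac{v_0\sin\alpha}{g} = \frac{40\cdot\sin 30^{\circ}}{10} = \frac{40\cdot\frac12}{10} = 2\,\mathrm{s}.
\]

So the correct option is \( 2\,\mathrm{s} \).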