+113 votes
in Physics of Everyday Life
+97 votes

To determine how long the object released from the airplane takes to hit the ground, we only need to consider its vertical motion. Because the object is released with zero initial vertical velocity, the airplane's horizontal speed does not affect the fall time.

Given:
Height of the airplane: h = 1000 meters
Speed of the airplane: v = 120 kilometers per hour

First, convert the speed from kilometers per hour to meters per second, since the standard unit of time is the second:
v = 120 km/h = (120 × 1000) meters / (60 × 60) seconds ≈ 33.33 m/s (rounded to two decimal places)
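As a quick sketch, the unit conversion above can be written as a small helper function (the name `kmh_to_ms` is my own, not from the original answer):

```python
def kmh_to_ms(speed_kmh):
    """Convert a speed from km/h to m/s.

    1 km = 1000 m and 1 h = 3600 s, so dividing by 3.6 is equivalent.
    """
    return speed_kmh * 1000 / 3600

# The airplane's speed of 120 km/h:
v = kmh_to_ms(120)
print(round(v, 2))  # 33.33
```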

We can use the equation for free-fall motion: h = (1/2) * g * t^2

Where:
h is the height (1000 meters)
g is the acceleration due to gravity (approximately 9.8 m/s^2)
t is the time it takes for the object to hit the ground (unknown)

Rearranging the equation to solve for time (t): t = sqrt((2 * h) / g)

Substituting the values: t = sqrt((2 × 1000) / 9.8) = sqrt(204.08) ≈ 14.29 seconds (rounded to two decimal places)

Therefore, the object will hit the ground approximately 14.29 seconds after it is released from the airplane.
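The whole free-fall calculation can be sketched in a few lines (using g = 9.8 m/s^2, as in the answer above):

```python
import math

g = 9.8       # acceleration due to gravity, m/s^2
h = 1000.0    # release height, m

# From h = (1/2) * g * t^2 with zero initial vertical velocity,
# solving for t gives t = sqrt(2h / g).
t = math.sqrt(2 * h / g)
print(round(t, 2))  # 14.29
```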
