To calculate the time it takes for the shell to reach the ground, we only need the vertical component of its motion; the horizontal motion does not affect the flight time.
Given:
Initial velocity (v0) = 800 m/s
Launch angle (θ) = 50°
Vertical displacement (Δy) = -150 m (negative because the ground is below the cliff)
First, let's calculate the vertical component of the initial velocity, since the vertical motion alone determines when the shell reaches the ground:
v0y = v0 * sin(θ)
v0y = 800 m/s * sin(50°)
v0y ≈ 612.84 m/s
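As a quick numerical check of this step, here is a minimal Python sketch (the variable names are our own, chosen for illustration):

```python
import math

v0 = 800.0                     # initial speed, m/s
theta = math.radians(50)      # launch angle, converted to radians

v0y = v0 * math.sin(theta)     # vertical component of the launch velocity
print(f"v0y = {v0y:.2f} m/s")  # prints v0y = 612.84 m/s
```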
Using the equation for vertical displacement:
Δy = v0y * t + (1/2) * g * t^2
Where:
t = time of flight
g = acceleration due to gravity (-9.8 m/s^2)
Plugging in the values:
-150 m = 612.84 m/s * t + (1/2) * (-9.8 m/s^2) * t^2
This is a quadratic equation. Rearranging and simplifying:
-4.9 t^2 + 612.84 t + 150 = 0
Solving this quadratic equation with the quadratic formula, we find two roots: t ≈ -0.24 s and t ≈ 125.31 s. Only the positive root is physically meaningful, since the shell is fired at t = 0:
t ≈ 125.31 s
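For reference, here is a short Python sketch that solves this quadratic with the standard quadratic formula (a minimal illustration, not the only way to do it):

```python
import math

v0y = 800.0 * math.sin(math.radians(50))  # vertical launch velocity, ≈ 612.84 m/s
g = -9.8                                  # acceleration due to gravity, m/s^2
dy = -150.0                               # displacement down to the ground, m

# dy = v0y*t + 0.5*g*t^2, rearranged into a*t^2 + b*t + c = 0
a, b, c = 0.5 * g, v0y, -dy
disc = math.sqrt(b**2 - 4 * a * c)
roots = ((-b + disc) / (2 * a), (-b - disc) / (2 * a))
print(roots)  # ≈ (-0.24, 125.31); only the positive root is physical
```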
Note that this positive root is already the total time of flight: the displacement equation describes the full trip from launch until the shell is 150 m below its starting height, so nothing needs to be doubled. As a sanity check, the ascent alone takes v0y / g ≈ 612.84 / 9.8 ≈ 62.5 s, and the descent takes slightly longer because the shell falls past its launch height, which is consistent with a total just over 125 s.
Therefore, it will take approximately 125.3 seconds for the shell to reach the ground, 150 meters below the cliff.
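As a final sanity check, plugging the flight time back into the displacement equation should recover the 150 m drop (a minimal sketch; extra decimals on t are kept to limit rounding error):

```python
t = 125.3128                        # flight time from the quadratic, s
v0y = 612.8356                      # vertical launch velocity, m/s
dy = v0y * t + 0.5 * (-9.8) * t**2  # vertical displacement at time t
print(f"dy = {dy:.1f} m")           # prints dy = -150.0 m, as expected
```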