To determine how long the ball takes to reach the ground when it is thrown horizontally from an 80 m cliff with an initial speed of 10 m/s, we can use the kinematic equation for vertical motion:

$d = v_0 t + \frac{1}{2} g t^2$

where:

  • d is the vertical distance traveled (80 m in this case),
  • v₀ is the initial vertical velocity,
  • g is the acceleration due to gravity (approximately 9.8 m/s^2),
  • t is the time.

In this scenario, the initial vertical velocity v₀ is 0 because the ball is thrown horizontally, so the equation simplifies to:

$d = \frac{1}{2} g t^2$

Rearranging the equation to solve for time (t), we get:

$t = \sqrt{\frac{2d}{g}}$

Substituting the given values, we have:

$t = \sqrt{\frac{2 \cdot 80}{9.8}}$

Solving this equation, we find:

$t \approx 4.04 \text{ seconds}$
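
As a quick numerical check, here is a minimal Python sketch of the same calculation (the variable names d, g, and t are just illustrative choices, not from any particular library):

```python
import math

d = 80.0   # vertical drop in meters (height of the cliff)
g = 9.8    # acceleration due to gravity in m/s^2

# Time to fall: t = sqrt(2d / g), from d = (1/2) g t^2
# with zero initial vertical velocity.
t = math.sqrt(2 * d / g)

print(f"Time to reach the ground: {t:.2f} s")  # ~4.04 s
```

Note that the 10 m/s horizontal speed never enters the calculation: the horizontal and vertical components of the motion are independent, so the fall time depends only on the drop height and g.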
