+27 votes
in Physics

+10 votes
by

To answer your question, let's neglect air resistance. At everyday throwing speeds we can also neglect the Earth's curvature and treat the ground as flat, so the distance an object thrown horizontally covers before it hits the ground is set by its release height and its horizontal speed.

For context, the Earth has a radius of approximately 6,371 kilometers. A ball thrown horizontally follows a curved trajectory because gravity pulls it downward, and as it moves forward the Earth's surface also curves away beneath it. Only if the ball moved fast enough for the surface to drop away exactly as quickly as the ball falls (roughly 7.9 kilometers per second, the circular-orbit condition) would it never come down; at ordinary throwing speeds the curvature effect is completely negligible.
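To see why curvature can be ignored, here is a minimal Python sketch (the variable names are just illustrative) that computes the surface orbital speed sqrt(g*R) and compares it with a 30 m/s throw:

import math

g = 9.8          # gravitational acceleration, m/s^2
R = 6.371e6      # Earth's radius, m

# Speed at which the ground would curve away as fast as the ball falls
# (circular-orbit condition at the surface, ignoring air resistance)
v_orbit = math.sqrt(g * R)

print(f"orbital speed ~ {v_orbit:.0f} m/s")                      # about 7900 m/s
print(f"a 30 m/s throw is {30 / v_orbit:.4f} of orbital speed")  # about 0.0038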

The distance the ball travels horizontally before hitting the ground depends on its initial velocity, the height it is released from, and the gravitational acceleration of Earth (9.8 meters per second squared). For a horizontal throw, the fall time is set by the release height alone, which gives a simple formula from projectile motion:

d = v * sqrt(2h / g)

where:
d = horizontal distance traveled
v = initial horizontal velocity of the ball
h = height of the release point above the ground
g = gravitational acceleration

Let's assume the initial velocity is around 30 meters per second (which is approximately 108 kilometers per hour or 67 miles per hour) and that the ball is released from about 2 meters above the ground. Plugging these values into the formula:

d = 30 * sqrt(2 * 2 / 9.8) ≈ 30 * 0.64 ≈ 19 meters

Therefore, in this simplified scenario, the ball would land roughly 19 meters away. For it to stay in the air long enough to travel about 92 meters horizontally, it would have to be released from roughly 46 meters above the ground, since it would need about 3 seconds of fall time.
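For completeness, here is a short Python sketch of the same flat-ground, no-air-resistance calculation (the function names horizontal_range and height_needed are my own choice); it reproduces the ~19 m range for a 2 m release height and the ~46 m height needed to carry the ball about 92 m:

import math

g = 9.8  # gravitational acceleration, m/s^2

def horizontal_range(v, h):
    # Distance a ball thrown horizontally at speed v from height h travels
    # before hitting flat ground: d = v * sqrt(2h / g)
    return v * math.sqrt(2 * h / g)

def height_needed(v, d):
    # Release height required for a horizontal throw at speed v to cover
    # distance d before landing: h = g * d^2 / (2 * v^2)
    return g * d**2 / (2 * v**2)

v = 30.0  # m/s, about 108 km/h
print(f"{horizontal_range(v, 2.0):.1f} m")   # ~19.2 m from a 2 m release
print(f"{height_needed(v, 92.0):.1f} m")     # ~46.1 m needed to travel 92 m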
