On the moon, the acceleration due to gravity is only 1.6 m/s². If someone on Earth can throw a baseball 24.2 m high, how high could she throw it on the moon?

On the Earth:

h = v²/(2g)

On the Moon:

H = v²/(2g₁)

Dividing the two expressions:

H/h = (v²/2g₁) / (v²/2g) = g/g₁

H = h·g/g₁ = 24.2 × 9.8/1.6 ≈ 148.2 m
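A quick numerical check of the ratio method above (assuming the thrower imparts the same launch speed on both bodies; the variable names are illustrative):

```python
# Ratio method: for the same launch speed v, maximum height scales as 1/g,
# so H_moon / h_earth = g_earth / g_moon.
g_earth = 9.8    # m/s^2, gravity on Earth
g_moon = 1.6     # m/s^2, gravity on the Moon
h_earth = 24.2   # m, maximum height of the throw on Earth

h_moon = h_earth * g_earth / g_moon
print(round(h_moon, 3))  # 148.225
```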

To find out how high someone could throw a baseball on the moon, we can use the kinematics of motion under constant gravity. The key assumptions are that the ball is thrown straight up and that the thrower imparts the same initial speed on the moon as on Earth.

First, let's determine the initial speed from the Earth throw. At maximum height the velocity is zero, so:

v² = 2gh

Where:
h = maximum height (24.2 m)
v = initial speed (unknown)
g = acceleration due to gravity on Earth (9.8 m/s²)

Substituting the given values, we can solve for v:

v² = 2 × 9.8 × 24.2
v² = 474.32 m²/s²
v ≈ √474.32
v ≈ 21.78 m/s

Now that we know the initial speed, we can find the maximum height on the moon. With the acceleration due to gravity on the moon g₁ = 1.6 m/s², the same relation gives:

H = v²/(2g₁)

Substituting the known values:

H = 474.32 / (2 × 1.6)
H ≈ 148.2 m

Therefore, someone on the moon could throw the baseball approximately 148 m high, assuming the same initial speed as on Earth. This is g/g₁ = 9.8/1.6 ≈ 6.1 times higher than the Earth throw, in agreement with the ratio method above.
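The two-step calculation can be sketched in Python (a straight-up throw with the same initial speed is assumed; variable names are illustrative):

```python
import math

g_earth = 9.8    # m/s^2, gravity on Earth
g_moon = 1.6     # m/s^2, gravity on the Moon
h_earth = 24.2   # m, maximum height of the throw on Earth

# Step 1: initial speed from the Earth throw, v^2 = 2 g h
v = math.sqrt(2 * g_earth * h_earth)  # ≈ 21.78 m/s

# Step 2: maximum height on the moon with the same speed, H = v^2 / (2 g_moon)
h_moon = v**2 / (2 * g_moon)          # ≈ 148.2 m
print(f"v ≈ {v:.2f} m/s, H ≈ {h_moon:.1f} m")
```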