1) The acceleration due to gravity on the moon is 1.67 m/s^2. If a person can throw a stone 12.0 m straight upward on the earth, how high should the person be able to throw a stone on the moon? Assume that the throwing speeds are the same in the two cases.

The height H attained when thrown upwards at velocity V is given by
(1/2) m V^2 = m g H
H = V^2/(2 g)

If g on the moon is smaller than on Earth by the factor
1.67/9.8 ≈ 0.17, and the launch speed V stays the same, the height H achieved increases by the inverse factor 9.8/1.67 ≈ 5.87, so H_moon ≈ 5.87 × 12.0 m ≈ 70.4 m.
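As a quick check of the energy argument above, here is a minimal Python sketch (variable names are my own) that recovers the launch speed from the Earth throw and reuses it on the moon:

```python
import math

# Values from the problem statement
g_earth = 9.8    # m/s^2
g_moon = 1.67    # m/s^2
h_earth = 12.0   # m, height reached on Earth

# Launch speed implied by the Earth throw: V = sqrt(2 g H)
v = math.sqrt(2 * g_earth * h_earth)

# Same launch speed on the moon: H = V^2 / (2 g)
h_moon = v**2 / (2 * g_moon)

print(round(v, 2), round(h_moon, 1))  # ~15.34 m/s, ~70.4 m
```

Keeping full precision until the end gives 70.4 m; rounding the ratio to 5.9 or 0.17 first is what produces the 70.6–70.8 m answers below.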

Nilam Nilam answered
6 years ago

g1 h1 = g2 h2 (same launch speed, so g × H is constant)

1.67 × h1 = 9.8 × 12.0
h1 ≈ 70.4 m
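The inverse proportionality g1 h1 = g2 h2 can be checked in one line (a sketch with my own variable names):

```python
# g1 h1 = g2 h2  =>  h1 = g2 * h2 / g1
g_moon, g_earth = 1.67, 9.8   # m/s^2
h_earth = 12.0                # m
h_moon = g_earth * h_earth / g_moon
print(round(h_moon, 1))  # ~70.4 m
```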

Clown Bot Clown Bot answered
9 months ago

Well, on the moon you might find some stone-throwing records flying sky-high! With the acceleration due to gravity being just 1.67 m/s^2, the height a person can throw a stone increases by a factor of about 5.87 compared to Earth. So, if the person can toss a stone 12.0 m straight upward on Earth, they should be able to launch it to a whopping height of about 70.4 m on the moon. Those moon rocks better watch out!

Step-by-Step Bot Step-by-Step Bot answered
9 months ago

To find the height H that a person should be able to throw a stone on the moon, we can use the formula:

H = V^2 / (2g)

where V is the velocity of the stone and g is the acceleration due to gravity.

Given that the throwing speeds are the same in both cases, we can assume that V is the same on the moon as it is on Earth.

However, the acceleration due to gravity (g) on the moon is 1.67 m/s^2, compared to 9.8 m/s^2 on Earth.

To calculate the height a person should be able to throw a stone on the moon, we need to find the ratio of the acceleration due to gravity on the moon to that on Earth:

g_moon / g_earth = 1.67 / 9.8 ≈ 0.17

The height achieved on the moon (H_moon) is larger than the height achieved on Earth (H_earth) by the inverse ratio g_earth / g_moon:

H_moon = (g_earth / g_moon) * H_earth

H_earth is given as 12.0 m. Plugging in the values:

H_moon = (9.8 / 1.67) * 12.0 ≈ 70.4 m

Therefore, a person should be able to throw a stone approximately 70.4 meters high on the moon if they can throw it 12.0 meters high on Earth.

Explain Bot Explain Bot answered
8 months ago

To find the height that a person should be able to throw a stone on the moon, we need to use the formula:

H = V^2 / (2g)

where H is the height, V is the velocity of the stone, and g is the acceleration due to gravity.

In this case, the throwing speeds on Earth and the moon are assumed to be the same. So we can calculate the height on the moon by comparing the acceleration due to gravity on Earth (9.8 m/s^2) with the acceleration due to gravity on the moon (1.67 m/s^2).

The acceleration due to gravity on the moon is approximately 1.67 m/s^2, which is 0.17 times the acceleration due to gravity on Earth (9.8 m/s^2).

To calculate the new height on the moon, we divide the original height on Earth by this factor (equivalently, multiply by g_earth / g_moon):

H_moon = H_earth * (g_earth / g_moon) = 12.0 m * (9.8 / 1.67) ≈ 70.4 m

Therefore, a person should be able to throw a stone approximately 70.4 meters high on the moon, assuming the throwing speeds are the same in both cases.