If the distance from Earth to the moon is 3.84 × 10^5 km, how long would it take a radio wave with wavelength 413.8 m to travel from Earth to the moon?

c = λν and E = hν

You don't need either of these equations.

The speed of the radio wave is 3E8 m/s. You have a distance of 3.84E8 meters (note I converted from km to m).
distance = rate x time.
You have distance, you have rate (3E8 m/s), solve for time in seconds.
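
If you want to check the arithmetic, here is a minimal Python sketch. The variable names are mine; the only given values are the distance and the speed of light:

# Speed of light (and radio waves) in a vacuum, in m/s
speed = 3.0e8

# Earth-to-moon distance: 3.84e5 km, converted to meters
distance = 3.84e5 * 1000

# distance = rate x time, so time = distance / rate
time = distance / speed

print(f"{time:.2f} s")  # prints 1.28 s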

To find the time it takes for a radio wave to travel from Earth to the moon, we can use the formula:

time = distance / speed

where distance is the distance between Earth and the moon, and speed is the speed of the radio wave.

Given:
Distance = 3.84 × 10^5 km (convert to meters: 3.84 × 10^8 m)
Wavelength (λ) = 413.8 m

First, we need the speed of the radio wave. Radio waves, like visible light, are electromagnetic waves, so they travel at the speed of light in a vacuum, approximately 3 × 10^8 m/s. Note that the wavelength given in the problem is not needed here; it would only come into play if the question asked for the frequency (via c = λν).

Next, we find the time it takes for the radio wave to travel from Earth to the moon:

time = distance / speed
time = (3.84 × 10^8 m) / (3 × 10^8 m/s)
time ≈ 1.28 seconds

Therefore, it would take approximately 1.28 seconds for a radio wave with a wavelength of 413.8 m to travel from Earth to the moon.