Radio signals travel at a rate of 3*10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 4.2*10^7 meters? (Hint: Time is distance divided by speed.)

a. 1.26*10^16 seconds
b. 1.26*10^15 seconds***
c. 1.4*10^1 seconds
d. 1.4*10^-1 seconds

The exponent would be -1, because you are subtracting 8 from 7, so the answer is D.

To find the time it takes for a radio signal to travel from a satellite to the surface of Earth, we need to use the formula: time = distance / speed.

Given:
Speed of radio signals = 3 * 10^8 meters per second
Height of satellite = 4.2 * 10^7 meters

To find the distance, we need to consider the total distance traveled by the radio signal, which is the distance from the satellite to the surface of the Earth.

Distance = Height of satellite = 4.2 * 10^7 meters

Now, substitute the values into the formula:

Time = Distance / Speed
Time = (4.2 * 10^7 meters) / (3 * 10^8 meters per second)

Next, simplify the equation by canceling out units:

Time = (4.2 * 10^7) / (3 * 10^8) seconds

To divide these numbers, divide the coefficients and subtract the exponents:

Time = (4.2 / 3) * 10^(7 - 8) seconds
Time = 1.4 * 10^(-1) seconds

Therefore, the correct answer is d. 1.4 * 10^(-1) seconds.
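
As a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names are just illustrative):

```python
# Check: time = distance / speed
speed = 3e8       # speed of the radio signal, in meters per second
distance = 4.2e7  # height of the satellite above the surface, in meters

time = distance / speed
print(time)  # 0.14, i.e. 1.4 * 10^(-1) seconds
```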

a and b are clearly wrong. You have multiplied instead of dividing. Read the hint.

When you divide, you subtract exponents, not add.
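
If you want to see the exponent rule on its own, here is a small illustrative example (the numbers are just the ones from this problem):

```python
# Dividing powers of ten subtracts the exponents.
print(10**7 / 10**8)   # 0.1, i.e. 10^(7 - 8) = 10^(-1)

# Multiplying powers of ten adds the exponents,
# which is the mistake behind answers a and b.
print(10**7 * 10**8)   # 1000000000000000, i.e. 10^(7 + 8) = 10^15
```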