A long-distance runner runs 2 miles south and then 4 miles east. How far is the runner from the starting point? Round to the nearest tenth.

a^2 + b^2 = c^2

2^2 + 4^2 = c^2

4 + 16 = c^2

20 = c^2

√20 = c, so c ≈ 4.5 miles

thank you very much Ms. Sue

You're welcome, Leo.

To find the distance of the runner from the starting point, we can use the Pythagorean Theorem, which states that in a right triangle, the square of the length of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the lengths of the other two sides.

In this case, we can consider the runner's path as a right triangle, with 2 miles being the length of one side (south direction) and 4 miles being the length of the other side (east direction). We can determine the length of the hypotenuse, which represents the distance of the runner from the starting point.

Using the Pythagorean Theorem, we have:

Hypotenuse^2 = 2^2 + 4^2
Hypotenuse^2 = 4 + 16
Hypotenuse^2 = 20

To find the value of the hypotenuse, we need to take the square root of both sides of the equation:

Hypotenuse = √20 ≈ 4.472
Hypotenuse ≈ 4.5 miles (rounded to the nearest tenth)

Therefore, the runner is approximately 4.5 miles from the starting point.
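
If you want to check the arithmetic yourself, here is a minimal Python sketch of the same calculation. The function name distance_from_start is just an illustrative choice; the math.hypot call is the standard library's built-in way of computing sqrt(a^2 + b^2).

import math

def distance_from_start(south_miles: float, east_miles: float) -> float:
    """Straight-line distance from the start, via the Pythagorean Theorem."""
    # math.hypot(a, b) returns sqrt(a**2 + b**2)
    return math.hypot(south_miles, east_miles)

distance = distance_from_start(2, 4)
print(round(distance, 1))  # prints 4.5, since sqrt(20) ≈ 4.472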