If you drive 9 miles south, then make a left turn and drive 12 miles east, how far are you, in a straight line, from your starting point? Use the Pythagorean Theorem to solve the problem. Use a calculator to find square roots, rounding to the nearest tenth as needed.

Driving south and then east traces out the two legs of a right triangle, and the straight-line distance back to the starting point is the hypotenuse. The Pythagorean Theorem states that the square of the hypotenuse equals the sum of the squares of the legs:

c² = a² + b²

where a = 9 miles (the distance traveled south) and b = 12 miles (the distance traveled east).

c² = 9² + 12²
c² = 81 + 144
c² = 225
c = √225
c = 15

So the straight-line distance from the starting point is exactly 15 miles; since 225 is a perfect square, no rounding to the nearest tenth is needed.
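As a quick check, here is a minimal Python sketch of the same computation (the variable names south and east are just illustrative labels for the two legs):

    import math

    south = 9   # miles driven south (one leg of the right triangle)
    east = 12   # miles driven east (the other leg)

    # math.hypot(a, b) returns sqrt(a**2 + b**2), the length of the hypotenuse
    distance = math.hypot(south, east)

    print(round(distance, 1))  # prints 15.0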
