A surfer drives his dune buggy out into the sand dunes. He follows his compass 10 miles due north and then turns due west. If he ends up approximately 35 miles from where he started, how far west did he travel?

10 miles
25 miles
33.5 miles
35 miles

33.5 miles

To find out how far west the surfer traveled, we can use the Pythagorean theorem, since his northward and westward legs meet at a right angle.

According to the question, the surfer traveled 10 miles due north and then turned due west. Let x represent the distance he traveled west; the 35-mile straight-line distance from his starting point is the hypotenuse.

Thus, using the Pythagorean theorem:
10^2 + x^2 = 35^2

Simplifying the equation:
100 + x^2 = 1225

Subtracting 100 from both sides:
x^2 = 1125

Taking the square root of both sides:
x = sqrt(1125) ≈ 33.54

Therefore, the surfer traveled approximately 33.54 miles west.
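As a quick sanity check, here is a minimal Python sketch of the same calculation. The variable names (north_leg, straight_line_distance, west_leg) are just illustrative:

```python
import math

north_leg = 10.0               # miles traveled due north
straight_line_distance = 35.0  # approximate distance from the starting point (hypotenuse)

# Pythagorean theorem: west_leg^2 = hypotenuse^2 - north_leg^2
west_leg = math.sqrt(straight_line_distance**2 - north_leg**2)

print(round(west_leg, 2))  # 33.54
```

Running this prints 33.54, matching the hand calculation above.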

To find out how far west the surfer traveled, we need to use the Pythagorean theorem. The theorem states that in a right triangle, the square of the length of the hypotenuse (the longest side) is equal to the sum of the squares of the other two sides.

In this case, the distance the surfer travels due north can be considered as the vertical side of the right triangle, while the distance he travels due west is the horizontal side. The total distance from the starting point to the ending point is the hypotenuse.

Using the Pythagorean theorem, we can calculate the distance traveled due west as follows:

Distance due west = √(Hypotenuse^2 - (Distance due north)^2)

In this case, the hypotenuse is approximately 35 miles, and the distance due north is 10 miles. Plugging in these values, we have:

Distance due west = √(35^2 - 10^2)
= √(1225 - 100)
= √1125
≈ 33.54 miles (rounded to two decimal places)

Therefore, the surfer traveled approximately 33.54 miles due west, so the correct answer is 33.5 miles.