A surfer drives his dune buggy out into the sand dunes. He follows his compass 10 miles due north and then turns due west. If he ends up approximately 35 miles from where he started, how far west did he travel?

To solve this problem, we can use the Pythagorean theorem. The surfer travels 10 miles due north, forming the vertical leg of a right triangle. Let's denote the distance he traveled west as x, which will be the horizontal leg of the right triangle. The straight-line distance from his starting point to his ending point is the hypotenuse of the right triangle, approximately 35 miles.

According to the Pythagorean theorem, the sum of the squares of the lengths of the legs of a right triangle is equal to the square of the length of the hypotenuse. Using this information, we can set up the following equation:

10^2 + x^2 = 35^2

Simplifying this equation gives us:

100 + x^2 = 1225

Subtracting 100 from both sides of the equation gives:

x^2 = 1125

Now we can solve for x by taking the square root of both sides:

x = √1125 = 15√5 ≈ 33.54

Therefore, the surfer traveled approximately 33.54 miles west.
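As a quick sanity check, here is a short Python sketch of the same calculation (the variable names north, total, and west are just illustrative):

```python
import math

# Known quantities from the problem
north = 10   # miles traveled due north (one leg)
total = 35   # straight-line distance from the start (hypotenuse)

# Pythagorean theorem: north^2 + west^2 = total^2,
# so west = sqrt(total^2 - north^2)
west = math.sqrt(total**2 - north**2)

print(f"Distance traveled west: {west:.2f} miles")  # prints 33.54
```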