A pilot flies a plane 310 miles west and then 420 miles north, where he lands the plane. How far is the pilot from his starting point? Round your answer to the nearest mile. (1 point) Responses: 283 miles, 365 miles, 522 miles, 619 miles

The pilot has traveled 310 miles west and 420 miles north. This forms a right triangle, with the distance from the starting point being the hypotenuse.

Using the Pythagorean theorem, we can calculate the distance:

Distance = √(310^2 + 420^2) = √(96100 + 176400) = √272500 ≈ 522 miles
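
As a quick check, here is a minimal Python sketch of the same calculation (the variable names are illustrative):

```python
import math

# Legs of the right triangle: the flight's west and north distances (miles)
west = 310
north = 420

# The straight-line distance back to the start is the hypotenuse
distance = math.hypot(west, north)  # equivalent to sqrt(west**2 + north**2)

print(round(distance))  # prints 522
```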

Therefore, the pilot is approximately 522 miles from his starting point, so the answer is 522 miles.