A plane leaves airport A and travels 560 miles to airport B at a bearing of N32E. The plane leaves airport B and travels to airport C 320 miles away at a bearing of S72E. Find the distance from airport A to airport C.

Once you make your diagram, it is easy to see that we can use the cosine law: the interior angle at B, between BA (bearing S32W) and BC (bearing S72E), is 32° + 72° = 104°.

AC² = 560² + 320² - 2(560)(320)cos 104°
≈ 502,704.8
AC ≈ 709 miles
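As a quick numeric check of the law-of-cosines computation above, here is a minimal Python sketch:

```python
import math

# Law of cosines: the interior angle at B, between BA (bearing S32W)
# and BC (bearing S72E), is 32 + 72 = 104 degrees.
ab, bc, angle_b = 560.0, 320.0, math.radians(104)

ac_squared = ab**2 + bc**2 - 2 * ab * bc * math.cos(angle_b)
ac = math.sqrt(ac_squared)

print(round(ac_squared, 4))  # ≈ 502704.8
print(round(ac))             # 709
```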


Now, to find the distance from airport A to airport C, we can also break the journey into two components: the north-south component and the east-west component.

From airport A to airport B (bearing N32E), the north-south component is 560 miles times the cosine of 32 degrees, approximately 474.91 miles north, and the east-west component is 560 miles times the sine of 32 degrees, approximately 296.75 miles east.

From airport B to airport C (bearing S72E), the plane heads south of east, so its north-south component is 320 miles times the cosine of 72 degrees, approximately 98.89 miles south, and its east-west component is 320 miles times the sine of 72 degrees, approximately 304.34 miles east.

Because the second leg heads south, the net north-south component is the difference: 474.91 miles - 98.89 miles = 376.02 miles north.

The net east-west component is the sum of the two eastward components: 296.75 miles + 304.34 miles = 601.09 miles east.

Finally, we can use the Pythagorean theorem to find the distance from airport A to airport C. The square of the distance is equal to the sum of the squares of the two components:

(distance from A to C)² = (376.02 miles)² + (601.09 miles)²

(distance from A to C)² ≈ 141,391.0 square miles + 361,309.2 square miles

(distance from A to C)² ≈ 502,700.2 square miles

(distance from A to C) ≈ √502,700.2

(distance from A to C) ≈ 709 miles

So, the distance from airport A to airport C is approximately 709 miles, in agreement with the law-of-cosines answer.
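The component arithmetic can be verified with a short Python sketch:

```python
import math

# North-south and east-west components of each leg, with bearings
# measured from due north (N32E and S72E).
north_ab = 560 * math.cos(math.radians(32))   # ≈ 474.91 mi north
east_ab = 560 * math.sin(math.radians(32))    # ≈ 296.75 mi east
north_bc = -320 * math.cos(math.radians(72))  # ≈ -98.89 mi (this leg heads south)
east_bc = 320 * math.sin(math.radians(72))    # ≈ 304.34 mi east

total_north = north_ab + north_bc  # ≈ 376.02 mi
total_east = east_ab + east_bc     # ≈ 601.09 mi

# Pythagorean theorem on the net components
ac = math.hypot(total_north, total_east)
print(round(ac))  # 709
```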

To find the distance from airport A to airport C, we need to break down the problem into components.

1. Find the x-component of the distance covered from airport A to airport B:
- Since the bearing is N32E, we need to find the x-component of the distance traveled.
- The angle between North and the bearing is 32 degrees, toward the east; take the positive x-axis to point due north and the positive y-axis due east.
- Therefore, the angle between the bearing and the positive x-axis is 32 degrees (since the bearing is N32E).
- The x-component of the distance is given by: distance_AB * cos(angle)
- distance_AB = 560 miles
- x-component_AB = 560 * cos(32)

2. Find the y-component of the distance covered from airport A to airport B:
- With the positive y-axis pointing due east, the y-component uses the sine of the same 32-degree angle measured from north (since the bearing is N32E).
- The y-component of the distance is given by: distance_AB * sin(angle)
- distance_AB = 560 miles
- y-component_AB = 560 * sin(32)

3. Find the x-component of the distance covered from airport B to airport C:
- Since the bearing is S72E, we need to find the x-component of the distance traveled.
- The angle between South and the bearing is 72 degrees, toward the east; due south itself is 180 degrees from due north.
- Therefore, the angle between the bearing and the positive x-axis is 180 - 72 = 108 degrees (since the bearing is S72E).
- The x-component of the distance is given by: distance_BC * cos(angle)
- distance_BC = 320 miles
- x-component_BC = 320 * cos(108)

4. Find the y-component of the distance covered from airport B to airport C:
- With the positive y-axis pointing due east, the y-component uses the sine of the same 108-degree angle measured from north (since the bearing is S72E).
- The y-component of the distance is given by: distance_BC * sin(angle)
- distance_BC = 320 miles
- y-component_BC = 320 * sin(108)

5. Find the total x-component of the distance from airport A to airport C:
- Total x-component = x-component_AB + x-component_BC

6. Find the total y-component of the distance from airport A to airport C:
- Total y-component = y-component_AB + y-component_BC

7. Find the distance from airport A to airport C using the distance formula, which is the square root of (total x-component^2 + total y-component^2). Evaluating gives approximately 709 miles, in agreement with the cosine-law result.
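The seven steps above can be sketched in Python (the helper name `leg_components` is my own, not part of any standard library):

```python
import math

def leg_components(distance, angle_from_north_deg):
    """Return (x, y) = (northward, eastward) components of a leg,
    with the angle measured clockwise from due north
    (N32E -> 32 degrees, S72E -> 180 - 72 = 108 degrees)."""
    theta = math.radians(angle_from_north_deg)
    return distance * math.cos(theta), distance * math.sin(theta)

# Steps 1-4: components of each leg
x_ab, y_ab = leg_components(560, 32)   # bearing N32E
x_bc, y_bc = leg_components(320, 108)  # bearing S72E, i.e. 108 deg from north

# Steps 5-6: total components
total_x = x_ab + x_bc  # net northward displacement
total_y = y_ab + y_bc  # net eastward displacement

# Step 7: distance formula
ac = math.hypot(total_x, total_y)
print(round(ac))  # 709
```

Note that `x_bc` comes out negative (cos 108° < 0), which automatically accounts for the southward travel on the second leg.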

To find the distance from airport A to airport C, we can break each leg of the journey into a horizontal (east-west) component and a vertical (north-south) component, combine them, and then apply the Pythagorean theorem to the totals.

First, consider the leg from airport A to airport B: the plane travels 560 miles at a bearing of N32E, meaning the line AB makes a 32-degree angle with the north direction. Its components are:

x₁ = 560 * sin(32°) ≈ 296.75 miles (east)
y₁ = 560 * cos(32°) ≈ 474.91 miles (north)

Next, consider the leg from airport B to airport C: the plane travels 320 miles at a bearing of S72E, meaning the line BC makes a 72-degree angle with the south direction. This leg carries the plane east and south, so its north-south component is negative:

x₂ = 320 * sin(72°) ≈ 304.34 miles (east)
y₂ = -320 * cos(72°) ≈ -98.89 miles (i.e., 98.89 miles south)

Finally, note that the straight-line distance from airport A to airport C is not the sum of the two leg lengths; 560 + 320 = 880 miles is the length of the flight path, not the displacement. The distance is the length of the resultant vector:

Distance_AC = √((x₁ + x₂)² + (y₁ + y₂)²) = √((601.09)² + (376.02)²) ≈ 709 miles

This gives us the final answer: the distance from airport A to airport C is approximately 709 miles, which agrees with the law-of-cosines result.
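To make the distinction between path length and straight-line distance concrete, here is a minimal Python sketch:

```python
import math

deg = math.radians
# Eastward (x) and northward (y) displacements of each leg
x1, y1 = 560 * math.sin(deg(32)), 560 * math.cos(deg(32))   # N32E
x2, y2 = 320 * math.sin(deg(72)), -320 * math.cos(deg(72))  # S72E (southward, so y2 < 0)

path_length = 560 + 320                        # distance flown along both legs
straight_line = math.hypot(x1 + x2, y1 + y2)   # displacement from A to C

print(path_length)          # 880
print(round(straight_line))  # 709
```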