An aeroplane leaves an airport and flies due north for 2 hours at 500 km/h. It then flies on a bearing of 053° at 300 km/h for 90 minutes. How far is the plane from the airport, correct to 1 decimal place?

Use the law of cosines. The plane turns from due north to a bearing of 053°, so the interior angle between the two legs is 180° - 53° = 127°. With legs of 500*2 = 1000 km and 300*1.5 = 450 km, the distance z satisfies

z^2 = 1000^2 + 450^2 - 2*1000*450*cos127°

z ≈ 1320.7 km
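As a quick numerical check of the law-of-cosines result (a minimal Python sketch, not part of the original working):

```python
import math

d1 = 500 * 2    # first leg: 500 km/h for 2 h -> 1000 km
d2 = 300 * 1.5  # second leg: 300 km/h for 1.5 h -> 450 km
angle = math.radians(180 - 53)  # interior angle between the legs

# law of cosines: z^2 = d1^2 + d2^2 - 2*d1*d2*cos(127°)
z = math.sqrt(d1**2 + d2**2 - 2 * d1 * d2 * math.cos(angle))
print(round(z, 1))  # -> 1320.7
```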

All angles are measured CW from the +y-axis (i.e. as bearings), with east as the real part and north as the imaginary part.

d1 = 500*2 = 1000 km [0°].
d2 = 300*1.5 = 450 km [53°].

D = d1 + d2
D = (1000*sin0° + 450*sin53°) + (1000*cos0° + 450*cos53°)i
D = 359.4 + 1270.8i = 1320.7 km [16°].
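The component sum above can be verified numerically (a small Python sketch; with bearings measured clockwise from north, east = r*sin θ and north = r*cos θ):

```python
import math

# each leg as (distance_km, bearing in degrees clockwise from north)
legs = [(1000, 0), (450, 53)]

east = sum(r * math.sin(math.radians(b)) for r, b in legs)
north = sum(r * math.cos(math.radians(b)) for r, b in legs)

dist = math.hypot(east, north)
bearing = math.degrees(math.atan2(east, north))
print(round(east, 1), round(north, 1))  # -> 359.4 1270.8
print(round(dist, 1), round(bearing))   # -> 1320.7 16
```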

To find the distance of the plane from the airport, we first need to calculate the distance traveled in each leg of the journey.

Leg 1:
Speed = 500 km/h
Time = 2 hours

Distance = Speed × Time
Distance = 500 km/h × 2 hours
Distance = 1000 km

Leg 2:
Speed = 300 km/h
Time = 90 minutes = 1.5 hours

Distance = Speed × Time
Distance = 300 km/h × 1.5 hours
Distance = 450 km

Now, we can use vectors and the Pythagorean theorem to find the resultant distance of the plane from the airport.

Let's define the north direction as the y-axis and the east direction as the x-axis.

Leg 1 traveled directly north, so it contributes 1000 km in the y-direction and nothing in the x-direction.

For Leg 2, the bearing of 053° is measured clockwise from north, so the sine gives the eastward (x) component and the cosine gives the northward (y) component:

Distance in the x-direction = 450 km × sin(53°) ≈ 359.39 km
Distance in the y-direction = 450 km × cos(53°) ≈ 270.82 km

Adding the two legs:

Total x = 359.39 km
Total y = 1000 km + 270.82 km = 1270.82 km

Now, we can use the Pythagorean theorem to find the resultant distance:

Resultant distance = √((Total x)^2 + (Total y)^2)
Resultant distance = √(359.39^2 + 1270.82^2)
Resultant distance ≈ √1744144.6 ≈ 1320.7 km

Therefore, the plane is approximately 1320.7 km away from the airport, correct to 1 decimal place.

To find the distance of the plane from the airport, we can break down the question into two components:

1. The First Leg (due north):
The plane flies due north for 2 hours at a speed of 500 km/h.
Distance = Speed * Time
Distance = 500 km/h * 2 hours
Distance = 1000 km

2. The Second Leg (bearing 053°):
The plane changes its direction to a bearing of 053 degrees and flies for 90 minutes at a speed of 300 km/h.
First, we need to convert the time from minutes to hours:
90 minutes = 90/60 = 1.5 hours

Distance = Speed * Time
Distance = 300 km/h * 1.5 hours
Distance = 450 km

A bearing of 053° is measured clockwise from north, so this leg has both an eastward and a northward component:
Eastward = 450 km * sin(53°) ≈ 450 * 0.7986 ≈ 359.39 km
Northward = 450 km * cos(53°) ≈ 450 * 0.6018 ≈ 270.82 km

Adding the northward components of the two legs:
Total Northward = 1000 km + 270.82 km = 1270.82 km
Total Eastward = 359.39 km

Finally, we can use the Pythagorean theorem to find the total distance between the plane and the airport:
Total Distance = √(Total Northward^2 + Total Eastward^2)
Total Distance = √(1270.82^2 + 359.39^2)
Total Distance = √(1614983.5 + 129161.2)
Total Distance ≈ √1744144.7
Total Distance ≈ 1320.7 km

Therefore, the plane is approximately 1320.7 kilometres away from the airport, correct to 1 decimal place.