If two planes leave the same airport at 1:00 PM, how many miles apart will they be at 3:00 PM if one travels directly north at 150 mph and the other travels directly west at 200 mph? (Please help me with the working, thanks!)

When you add a west-pointing vector of length 200t and a north-pointing vector of length 150t, the resultant displacement, which is the hypotenuse, has length 250t. This is because the sides of this right triangle are always in a 3:4:5 ratio.

(250t)^2 = (150t)^2 + (200t)^2

After t = 2 hours, they will be d = 500 miles apart, since

d = √[(300)^2 + (400)^2] = √250000 = 500
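A quick numeric check of the 250t claim (this helper function is just for illustration, using the speeds from the problem):

```python
import math

def separation(t_hours, north_mph=150, west_mph=200):
    """Distance between the planes after t hours (hypotenuse of the right triangle)."""
    north = north_mph * t_hours  # leg pointing north
    west = west_mph * t_hours    # leg pointing west
    return math.sqrt(north**2 + west**2)

# The 3:4:5 ratio means the separation is always 250t:
print(separation(1))  # 250.0
print(separation(2))  # 500.0
```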

To find out how many miles apart the two planes will be at 3:00 PM, we need to find the distance traveled by each plane during the two-hour time period.

For the plane traveling north at 150 mph, we can calculate the distance as follows:
Distance = Speed × Time
Distance = 150 mph × 2 hours = 300 miles

For the plane traveling west at 200 mph, we can calculate the distance as follows:
Distance = Speed × Time
Distance = 200 mph × 2 hours = 400 miles

Now that we know the distances traveled by each plane, we can use the Pythagorean theorem to find the distance between the two planes. The theorem states that in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the distances traveled by each plane represent the two legs of the right triangle, and the distance between the two planes represents the hypotenuse.

Using the Pythagorean theorem, we have:
c² = a² + b²,
where a and b are the distances traveled by each plane, and c is the distance between the two planes.

So we have:
c² = (300 miles)² + (400 miles)²
c² = 90000 square miles + 160000 square miles
c² = 250000 square miles

To find the actual distance c, we take the square root of 250000:
c = √250000
c = 500 miles

Therefore, the two planes will be exactly 500 miles apart at 3:00 PM.
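The whole calculation can be sketched in a few lines of Python (speeds and the 2-hour window taken from the problem); math.hypot computes √(a² + b²) directly:

```python
import math

speed_north = 150  # mph
speed_west = 200   # mph
hours = 2          # 1:00 PM to 3:00 PM

leg_north = speed_north * hours  # 300 miles traveled north
leg_west = speed_west * hours    # 400 miles traveled west

# Hypotenuse of the right triangle formed by the two flight paths
distance = math.hypot(leg_north, leg_west)
print(distance)  # 500.0
```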