A 10-foot ladder rests against the top of an 8-foot wall. The wall meets the ground at a 90-degree angle.

How many feet away from the wall is the bottom of the ladder?

We can use the Pythagorean theorem to solve this problem. According to the theorem, in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the ladder, the wall, and the ground form a right triangle. The ladder is the hypotenuse because it lies opposite the right angle between the wall and the ground; its length is 10 feet. One leg of the triangle is the height of the wall, which is 8 feet, and the other leg is the unknown distance from the wall.

Using the Pythagorean theorem, we have:

(Length of ladder)^2 = (Height of wall)^2 + (Distance from wall)^2

(10)^2 = (8)^2 + (Distance from wall)^2

100 = 64 + (Distance from wall)^2

Subtracting 64 from both sides, we get:

36 = (Distance from wall)^2

Taking the square root of both sides (keeping the positive root, since a distance cannot be negative), we have:

Distance from wall = 6

So, the bottom of the ladder is 6 feet away from the wall.
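The calculation above can be checked with a short Python sketch that solves the Pythagorean relation for the unknown leg (the variable names here are just illustrative):

```python
import math

# Pythagorean theorem: ladder^2 = wall^2 + distance^2,
# so distance = sqrt(ladder^2 - wall^2).
ladder = 10.0  # length of the ladder (hypotenuse), in feet
wall = 8.0     # height of the wall (one leg), in feet

distance = math.sqrt(ladder**2 - wall**2)
print(distance)  # 6.0
```

Note that (6, 8, 10) is twice the classic (3, 4, 5) right triangle, which is a quick way to sanity-check the answer.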