If a rock is dropped from a height of 66 ft, its position t seconds after it is dropped until it hits the ground is given by the function s(t) = −16t^2 + 66.

Round values below to 3 decimal places.

How long does it take the rock to hit the ground?

_____seconds

Find the average velocity of the rock from when it is released until when it hits the ground.

______feet per second

What time after the rock is thrown will its instantaneous velocity be equal to its average velocity? (Apply the Mean Value Theorem)

___seconds after it is thrown

Really? They told you that its height after t seconds is

h = −16t^2+66
so, how long does it take to hit the ground? (height = 0)
16t^2 = 66
t^2 = 66/16
t = 1/4 √66 ≈ 2.031 s

you also know that speed = distance/time, so its average velocity is
-66 ft / ((1/4)√66 s) = -264/√66 ≈ -32.496 ft/s

you know that since h = −16t^2+66, v = -32t
so v = avg velocity when
-32t = -264/√66
t = 264/(32√66) ≈ 1.016 seconds
Note that 1.016 < 2.031 as guaranteed by the MVT.
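The three numbers above are easy to sanity-check numerically. Here is a short Python sketch (the names `s`, `t_ground`, `v_avg`, and `t_mvt` are just illustrative):

```python
import math

# Position of the dropped rock: s(t) = -16 t^2 + 66 (feet)
def s(t):
    return -16 * t**2 + 66

# Time to hit the ground: solve 16 t^2 = 66
t_ground = math.sqrt(66 / 16)
print(round(t_ground, 3))      # ~2.031 s

# Average velocity over [0, t_ground]: change in position / elapsed time
v_avg = (s(t_ground) - s(0)) / t_ground
print(round(v_avg, 3))         # ~-32.496 ft/s

# MVT point: instantaneous velocity s'(t) = -32 t equals v_avg
t_mvt = v_avg / -32
print(round(t_mvt, 3))         # ~1.016 s
```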

Explain more?

If I understood what this meant, I clearly would have the answer already.

solve −16t^2+66 = 0

use that to get avg velocity: -66/t
find where -32t = avg velocity

How long does it take the rock to hit the ground?

To find the time it takes for the rock to hit the ground, we can set the function for the position equal to zero:

-16t^2 + 66 = 0

Simplifying the equation, we get:

16t^2 = 66

Dividing both sides by 16:

t^2 = 4.125

Taking the square root of both sides:

t = ±2.031

Since time cannot be negative in this context, we take the positive value. Therefore, it takes the rock approximately 2.031 seconds to hit the ground.

Average velocity of the rock from when it is released until when it hits the ground:

The average velocity can be found by dividing the change in position by the change in time. In this case, the initial position is 66 feet and the final position is 0 feet (since the rock hits the ground). The time it takes for the rock to hit the ground is approximately 2.031 seconds. Using these values, we can calculate the average velocity as follows:

Average velocity = (final position - initial position) / (final time - initial time)
= (0 - 66) / (2.031 - 0)
= -66 / 2.031
≈ -32.496

Therefore, the average velocity of the rock from when it is released until it hits the ground is approximately -32.496 feet per second.

Time after the rock is thrown when its instantaneous velocity is equal to its average velocity:

According to the Mean Value Theorem, there exists a point c in the interval (0, 2.031) where the instantaneous velocity equals the average velocity. The instantaneous velocity is the derivative of the position function, s'(t) = -32t, so we set it equal to the average velocity and solve:

-32t = -32.496
t = -32.496 / -32 ≈ 1.016

Therefore, the time after the rock is thrown when its instantaneous velocity is equal to its average velocity is approximately 1.016 seconds after it is thrown.
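Incidentally, the MVT time coming out to half of 2.031 is not a coincidence here: s(t) is quadratic, so its derivative is linear, and the average rate over any interval is attained exactly at the interval's midpoint. A quick numeric check in plain Python:

```python
s = lambda t: -16 * t**2 + 66    # position (feet)

a, b = 0.0, (66 / 16) ** 0.5     # release time and impact time
v_avg = (s(b) - s(a)) / (b - a)  # average velocity over [a, b]

c = v_avg / -32                  # solve s'(c) = -32 c = v_avg
print(round(c, 3))               # ~1.016
print(round((a + b) / 2, 3))     # ~1.016 -- the midpoint of [a, b]
```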

To find the time it takes for the rock to hit the ground, we need to solve the equation s(t) = 0. In this case, s(t) is given by the function s(t) = -16t^2 + 66, so we have:

-16t^2 + 66 = 0

To solve this quadratic equation, we can use the quadratic formula:

t = (-b +/- sqrt(b^2 - 4ac)) / (2a)

In our case, a = -16, b = 0, and c = 66. Plugging these values into the quadratic formula, we get:

t = (0 +/- sqrt(0^2 - 4(-16)(66))) / (2(-16))

Simplifying further gives:

t = ±sqrt(4224) / -32

Since sqrt(4224) ≈ 64.992 and time cannot be negative in this context, we keep the positive root:

t = 64.992 / 32 ≈ 2.031

Therefore, the time it takes for the rock to hit the ground is approximately 2.031 seconds (rounded to 3 decimal places), matching the result from solving 16t^2 = 66 directly.

To find the average velocity, we divide the change in position by the elapsed time. (Differentiating s(t) gives the instantaneous velocity v(t) = s'(t) = -32t, not the average velocity.) The rock falls from s(0) = 66 feet to s(2.031) = 0 feet, so:

Average velocity = (0 - 66) / (2.031 - 0) = -66 / 2.031 ≈ -32.496

So the average velocity of the rock from when it is released until it hits the ground is approximately -32.496 feet per second (rounded to 3 decimal places).

To find the time when the rock's instantaneous velocity is equal to its average velocity, we need to find the time t between release and hitting the ground when the derivative of s(t) equals -32.496. Since s'(t) = -32t, this means:

-32t = -32.496

Simplifying, we have:

t = -32.496 / -32 ≈ 1.016

So the time after the rock is thrown when its instantaneous velocity is equal to its average velocity is approximately 1.016 seconds (rounded to 3 decimal places), which is half the time it takes the rock to hit the ground.
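The quadratic-formula arithmetic above (discriminant 0^2 - 4(-16)(66) = 4224) can be cross-checked with a small Python sketch:

```python
import math

# Coefficients of -16 t^2 + 0 t + 66 = 0
a, b, c = -16.0, 0.0, 66.0

disc = b * b - 4 * a * c          # discriminant: 4224
roots = [(-b + sign * math.sqrt(disc)) / (2 * a) for sign in (1.0, -1.0)]

t_ground = max(roots)             # discard the negative root
print(round(math.sqrt(disc), 3))  # ~64.992
print(round(t_ground, 3))         # ~2.031
```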