Monochromatic light passes through two parallel narrow slits and forms an interference pattern on a screen. As the distance between the two slits is increased, the distance between light bands in the pattern on the screen will

1) decrease
2) increase
3) remain the same

decrease

He is wrong, but I don't know myself. I have two friends who took physics, but all they did was mess around every day. They say increase.

The distance between the light bands in the pattern on the screen will decrease as the distance between the two slits is increased. The pattern itself is produced by interference, which occurs when waves from multiple sources overlap and combine.

To understand why the distance between the light bands decreases, let's consider the concept of interference. When monochromatic light passes through two parallel narrow slits, it produces a pattern of alternating dark and bright fringes on a screen. This pattern is known as an interference pattern.

Interference occurs because the waves from the two slits overlap on the screen. At any given point, they either reinforce each other (constructive interference) or cancel each other out (destructive interference), depending on the phase relationship between them, which in turn depends on the difference in path length from each slit. The standard fringe conditions are summarized below.
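For reference, these are the textbook double-slit conditions, where d is the slit separation, λ the wavelength, θ the angle from the central axis, and m an integer fringe order:

```latex
% Bright fringes: the path difference is a whole number of wavelengths
% (constructive interference)
d \sin\theta = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots

% Dark fringes: the path difference is an odd number of half-wavelengths
% (destructive interference)
d \sin\theta = \left(m + \tfrac{1}{2}\right)\lambda
```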

When the distance between the two slits is increased, the path difference between the two waves changes more rapidly as you move across the screen, so the angles at which the waves arrive in phase become smaller. The bright fringes therefore crowd closer together around the center of the pattern: the fringe spacing is inversely proportional to the slit separation, as the short derivation below shows.
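Here is a sketch of the fringe-spacing derivation under the usual small-angle approximation (screen distance L much larger than slit separation d, so sin θ ≈ tan θ = y/L, with y the position on the screen):

```latex
% Position of the m-th bright fringe on the screen (small angles):
y_m = \frac{m \lambda L}{d}

% Spacing between adjacent bright fringes:
\Delta y = y_{m+1} - y_m = \frac{\lambda L}{d}
```

Because d appears in the denominator, increasing the slit separation makes Δy smaller: the light bands move closer together.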

In conclusion, as the distance between the two slits is increased, the distance between the light bands in the interference pattern on the screen will decrease.
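As a quick numerical check of Δy = λL/d, here is a minimal Python sketch; the wavelength, screen distance, and slit separations are illustrative values chosen for this example, not given in the question:

```python
# Fringe spacing in a double-slit pattern: dy = wavelength * L / d
# (small-angle approximation, valid when L >> d).

def fringe_spacing(wavelength_m, screen_distance_m, slit_separation_m):
    """Distance between adjacent bright fringes, in meters."""
    return wavelength_m * screen_distance_m / slit_separation_m

wavelength = 550e-9   # 550 nm green light (assumed for illustration)
L = 1.0               # slit-to-screen distance, 1 m (assumed)

for d in (0.1e-3, 0.2e-3):  # slit separation: 0.1 mm, then doubled
    dy = fringe_spacing(wavelength, L, d)
    print(f"d = {d * 1e3:.1f} mm -> fringe spacing = {dy * 1e3:.2f} mm")

# Doubling d halves the fringe spacing, so the bands get closer together:
# the answer is "decrease".
```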
