1. The statement is false.
To see why, recall the definition: a series converges if its sequence of partial sums approaches a finite limit as the number of terms increases toward infinity. The condition lim as n->infinity of a_n = 0 is necessary for convergence: if the terms did not shrink to zero, the partial sums could never settle on a limit. However, it is not sufficient. The classic counterexample is the harmonic series, the sum from n=1 to infinity of 1/n: its terms 1/n tend to zero, yet its partial sums grow without bound, so the series diverges. Knowing only that lim a_n = 0 therefore decides nothing; the series may converge or diverge.
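The harmonic series makes a useful caution here: its terms tend to zero, yet its partial sums keep growing. A quick numerical sketch (plain Python, no external libraries) of both the terms and the partial sums:

```python
# Sketch: the harmonic series 1 + 1/2 + 1/3 + ... has terms that tend
# to zero, yet its partial sums grow without bound (roughly like ln n).

def partial_sum(n):
    """Sum of 1/k for k = 1..n (the nth harmonic number)."""
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 100, 10_000, 1_000_000):
    # term 1/n shrinks toward 0, but the running total keeps climbing
    print(f"n = {n:>9}   term = {1.0 / n:.2e}   partial sum = {partial_sum(n):.4f}")
```

The partial sums never level off, even though each individual term becomes tiny, which is exactly why terms tending to zero cannot by itself establish convergence.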
2. The statement is true (assuming each term a_n is nonzero, so that 1/a_n is defined).
If the sum of the series from n=1 to infinity of a_n converges, then its terms must satisfy lim as n->infinity of a_n = 0; this is the necessary condition discussed in part 1. But if a_n -> 0 with a_n != 0, then |1/a_n| -> infinity, so the terms 1/a_n certainly do not approach zero. By the divergence (nth-term) test, a series whose terms do not tend to zero cannot converge, and therefore the sum from n=1 to infinity of 1/a_n must diverge.
For a concrete case: the sum of 1/n^2 converges, while the sum of the reciprocal terms, n^2, obviously diverges.
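To make the behavior of the reciprocal series concrete, here is a small numerical sketch using a_n = 1/n^2 (a hypothetical choice picked purely for illustration): the partial sums of a_n settle near a finite limit (pi^2/6), while the reciprocal terms 1/a_n = n^2 blow up instead of approaching zero.

```python
import math

# Illustration with a_n = 1/n**2: sum a_n converges (to pi**2/6),
# while the reciprocal terms 1/a_n = n**2 grow without bound, so
# sum 1/a_n cannot converge (its terms do not tend to zero).

def a(n):
    return 1.0 / n**2

N = 100_000
s = sum(a(n) for n in range(1, N + 1))
print(f"partial sum of a_n up to {N}: {s:.6f}")
print(f"pi**2 / 6                  : {math.pi**2 / 6:.6f}")

# Reciprocal terms head to infinity rather than to zero:
print(1.0 / a(10), 1.0 / a(1000))   # 100.0 1000000.0
```

Since the terms 1/a_n grow rather than shrink, the divergence test settles the question immediately; no finer analysis of the reciprocal series is needed.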