I am trying to learn how to work correctly with infinities in limits. I don't think I have understood all the rules yet, because I am getting a contradiction. On one hand:
$$ \lim_{n \to \infty} (n \cdot 0) = \lim_{n \to \infty} 0 = 0. $$
But I also got:
$$ \lim_{n \to \infty} (n \cdot 0) = (\lim_{n \to \infty} n) \cdot (\lim_{n \to \infty} 0) = \infty \cdot 0 = \text{undefined}.$$
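In the second computation I was trying to apply what I believe is the product rule for limits (I may be misremembering its hypotheses):
$$ \lim_{n \to \infty} (a_n \cdot b_n) = \left(\lim_{n \to \infty} a_n\right) \cdot \left(\lim_{n \to \infty} b_n\right). $$
If this rule only holds under extra hypotheses, that might be exactly what I am missing.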
They can't both be right. Which one is correct, and why? What rule did I violate in the incorrect one?