Suppose that $f(n) = o(1)$, i.e., $f(n) \in o(1)$. This can be interpreted using two equivalent definitions:
$\forall$ constant $k > 0$, $\exists\, n_0 \in \mathbb{N}$ such that $\forall n \geq n_0$, $|f(n)| \leq k$. Of course, this implies that $f(n)$ can be as small as you want it to be. My background is in computer science, and there is no such algorithm: the running time of an algorithm is at least $1$ step, which is larger than some constants $k$ (e.g., $k = 1/2$).
$\forall$ constant $k > 0$, $\lim_{n \rightarrow \infty} \frac{f(n)}{k} = 0$, which implies that $\lim_{n \rightarrow \infty} f(n) = 0$.
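As a concrete sanity check (my own example, not part of the definitions above), take $f(n) = \frac{1}{n}$, which belongs to $o(1)$ under both formulations:

$$\forall k > 0:\quad n \geq n_0 = \left\lceil \tfrac{1}{k} \right\rceil \;\implies\; |f(n)| = \tfrac{1}{n} \leq \tfrac{1}{n_0} \leq k,$$

$$\forall k > 0:\quad \lim_{n \rightarrow \infty} \frac{f(n)}{k} = \frac{1}{k}\lim_{n \rightarrow \infty} \frac{1}{n} = 0.$$

By contrast, a running time that is always at least $1$ fails both checks, since it can never drop below, say, $k = 1/2$.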