It's probably a very silly question, but I'm confused. Does $o(1)$ simply mean $\lim_{n \to \infty} \frac{f(n)}{\epsilon}=0$ for some $n>N$?
-
$\begingroup$ You should clarify your question. Even though people may be able to guess what you want to ask, there's no need for anybody to guess. I don't think it will be too hard for you to spend some more time improving your question. $\endgroup$ – Adrián Barquero, Mar 26, 2011 at 6:24
-
15 $\begingroup$ I thought the question is clear enough. How exactly do you want me to improve it? $\endgroup$ – sigma.z.1980, Mar 26, 2011 at 14:32
3 Answers
This should be a comment to chazisop's answer; I don't have enough rep to make it.
Chazisop, your quantifiers in 1 are the wrong way round; in fact there are two problems. Firstly, saying $\forall k>0 : f(n) \leqslant k$ is simply equivalent to saying $f(n) \leqslant 0$. The right definition of $o(1)$ is that
$$
\forall k > 0\ \exists N\ \forall n \geqslant N :\; |f(n)| \leqslant k.
$$
Note: the $k$-quantifier appears at the start; this is non-negotiable! Secondly, notice the absolute value signs around $f(n)$. If you are only thinking of nonnegative functions (e.g. the running time of an algorithm) you can omit them, but not for arbitrary functions.
To the OP, no, that's not what $o(1)$ means. There are two problems with what you've written: firstly, what is $\epsilon$? Secondly, what are the $n$ and $N$ supposed to be (I don't mean the $n$ in your limit)? You need to think about this statement more carefully.
The definition of $f(n)$ being $o(1)$ is that $$\lim _{n \to \infty} f(n) = 0.$$ That means that for all $\epsilon>0$ there exists $N_\epsilon$, depending on $\epsilon$, such that for all $n \geqslant N_\epsilon$ we have $|f(n)| \leqslant \epsilon$. I guess this definition is probably where your $n>N$ comes from.
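To see the $\epsilon$–$N_\epsilon$ definition in action, here is a small numerical sketch. The example function $f(n) = 1/n$ and the threshold $N_\epsilon = \lceil 1/\epsilon \rceil$ are my own illustrative choices, not anything from the thread:

```python
import math

def f(n):
    # An example function that is o(1): f(n) = 1/n tends to 0.
    return 1.0 / n

def n_epsilon(eps):
    # For f(n) = 1/n, the threshold N_eps = ceil(1/eps) works:
    # n >= N_eps implies |f(n)| = 1/n <= eps.
    return math.ceil(1.0 / eps)

for eps in (0.5, 0.1, 0.01):
    N = n_epsilon(eps)
    # Check the definition on a range of n beyond the threshold.
    assert all(abs(f(n)) <= eps for n in range(N, N + 1000))
    print(f"eps={eps}: N_eps={N} works")
```

Note that $N_\epsilon$ grows as $\epsilon$ shrinks, which is exactly the "depending on $\epsilon$" in the definition above.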
-
$\begingroup$ Corrected my response. Although I would have found the quantifier error eventually, thanks for the mod signs, which is something I have never encountered before. $\endgroup$ – chazisop, Mar 26, 2011 at 14:07
-
1 $\begingroup$ @chazisop By "mod sign" Matthew means the absolute value signs, and (I mean this neutrally) it's surprising you haven't seen them before, so I suspect you might have misconstrued his intended meaning. Absolute value is sometimes called "modulus", especially when referring to the absolute value of complex numbers, but I find this usage unfortunate because "absolute value" means only one thing, while "modulus" is more commonly used to indicate the base of a modular congruence relation. I had never encountered "modulus" being used to mean absolute value prior to seeing this response. $\endgroup$ Commented Apr 17, 2021 at 1:29
-
1 $\begingroup$ @TrixieWolf you're right that absolute value is more common - I updated the answer. $\endgroup$ Commented Feb 8, 2023 at 15:25
It probably means $$\lim_{n \to \infty} \frac{f(n)}{1} = 0$$
-
$\begingroup$ Doesn't $f(n)=o(g(n))$ mean $\lim f(n)=|\epsilon \cdot g(n)|$? $\endgroup$ Commented Mar 26, 2011 at 6:24
-
$\begingroup$ see this: math.osu.edu/~fowler/teaching/handouts/ibl-analysis/… $\endgroup$ Commented Mar 26, 2011 at 6:35
-
7 $\begingroup$ Whether there is $\epsilon$ or $1$ in the denominator does not matter at all... $\endgroup$ – Fabian, Mar 26, 2011 at 6:49
-
2 $\begingroup$ $f(n) = o(1)$ as $n \to \infty$ means $\lim_{n\to\infty} f(n) = 0$. No need for $\epsilon$ or $N$. But, of course, the definition of the limit can involve $\epsilon$ and $N$. $\endgroup$ – GEdgar, Jul 3, 2011 at 12:40
Suppose that $f(n)\in o(1)$. This can be interpreted using two equivalent definitions:
$\forall$ constant $k>0$ $\exists$ $n_{0}$ such that $\forall n \geq n_{0}$, $n \in \mathbb{N}$, $|f(n)| \leq k$.
$\forall$ constant $k > 0$, $\lim_{n \rightarrow \infty} \frac{f(n)}{k} = 0$, which implies that $\lim_{n \rightarrow \infty} f(n) = 0$.
-
1 $\begingroup$ $n_{0}$ must be the same for all $n$ and $k$. So the order is $\exists n_{0} \forall n \geq n_{0} \forall k > 0$. I just mention my background because there may be slight variations on how asymptotic notation is used in analysis that I am not aware of. $\endgroup$ – chazisop, Mar 26, 2011 at 8:38
-
$\begingroup$ Isn't it possible to come up with some pseudo-random algorithm that has constant running time? For example, if I want a quadruple of numbers between 0 and 1 which make up a nonzero determinant, one randomly generated quadruple will work almost certainly. $\endgroup$ Commented Mar 26, 2011 at 9:19
-
$\begingroup$ @Joseph In certain models it is possible to have constant-time algorithms. They would, however, be $O(1)$ and not $o(1)$. As I mentioned in my answer, an algorithm cannot have a running time of less than 1 step, whereas $\forall k$ includes constants less than 1. $o(1)$ of course makes sense for analysis. $\endgroup$ – chazisop, Mar 26, 2011 at 9:28
-
$\begingroup$ $o(1)$ can come up in expected-time analyses of algorithms, e.g. (fictional) $T(n) = 2n + 3 + \frac{1}{n}$. Also, my standard comment: note that $f(n) = o(1)$ is abuse of notation. A function cannot equal a class of functions. $\endgroup$ – Raphael, Mar 26, 2011 at 10:21
-
$\begingroup$ No, $n_0$ is allowed to depend on $k$ in the standard meaning of $o(1)$. $f(n)=1/n=o(1)$, but there is no value of $n_0$ which works for all $k$. For any particular $k$ you can choose $n_0(k)$. See 5316's answer. $\endgroup$ Commented Mar 26, 2011 at 13:23
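This comment's point can be checked numerically. The sketch below uses the same example $f(n)=1/n$ with the $k$-dependent threshold $n_0(k) = \lceil 1/k \rceil$ (both illustrative choices of mine): each $k$ gets its own working threshold, but no fixed $n_0$ works for every $k$, because $f(n_0) > 0$:

```python
import math

def f(n):
    return 1.0 / n  # f(n) = 1/n is o(1)

def n0(k):
    # Threshold depending on k: n >= ceil(1/k) implies 1/n <= k.
    return math.ceil(1.0 / k)

# For each k, its own threshold works...
for k in (1.0, 0.25, 0.001):
    assert all(f(n) <= k for n in range(n0(k), n0(k) + 100))

# ...but any fixed n_0 fails once k drops below f(n_0):
fixed_n0 = 1000
k_too_small = f(fixed_n0) / 2
assert not f(fixed_n0) <= k_too_small
print("n0 must depend on k")
```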