Sorting functions by asymptotic growth


Assume I have a list of functions, for example

n^{log log n}, 2^n, n!, n^3, n ln n,

How do I sort them asymptotically, i.e. by the relation defined by

f ≤_O g ⟺ f ∈ O(g),

assuming they are indeed pairwise comparable (see also here)? Using the definition of O seems awkward, and it is often hard to prove the existence of suitable constants c and n_0.

This is about measures of complexity, so we are interested in the asymptotic behaviour as n → +∞, and we assume that all the functions take only non-negative values (∀n, f(n) ≥ 0).

JAN

Since the OP never came back, I am removing the localised elements and turning this into a reference question.
Raphaël

Answers:

If you want a rigorous proof, the following lemma is often more useful resp. handier than the definitions.

If c = lim_{n→∞} f(n)/g(n) exists, then

  • c = 0      ⟹ f ∈ o(g),
  • c ∈ (0, ∞) ⟹ f ∈ Θ(g), and
  • c = ∞      ⟹ f ∈ ω(g).

With this, you should be able to order most of the functions that come up in algorithm analysis¹. As an exercise, prove it!
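As a purely numerical (hence non-rigorous) illustration of the lemma, one can estimate the quotient f(n)/g(n) at a single large n; the helper below is a sketch of this idea, not part of the original answer.

```python
import math

def ratio(f, g, n=10**6):
    """Estimate f(n)/g(n) at a single large n (a heuristic hint, not a proof)."""
    return f(n) / g(n)

# f(n) = n ln n versus g(n) = n^3: the quotient is tiny at large n,
# which suggests c = 0 and hence f in o(g) by the lemma.
r = ratio(lambda n: n * math.log(n), lambda n: n**3)
print(r < 1e-6)  # True
```

Of course a single sample point proves nothing (oscillating functions, in particular, will fool it), which is exactly why the lemma asks for the limit.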

Of course, you have to be able to calculate the limits accordingly. Here are some useful tricks for breaking complicated functions down into basic ones:

  • Express both functions as powers of e and compare the exponents; if their ratio tends to 0 or ∞, the original quotient does the same.
  • More generally: if you have a convex, continuously differentiable and strictly increasing function h, you can rewrite your quotient as

    f(n)/g(n) = h(f*(n)) / h(g*(n))

    with g* ∈ Ω(1) and

    lim_{n→∞} f*(n) − g*(n) = ∞,

    then

    lim_{n→∞} f(n)/g(n) = ∞.

    See here for a rigorous proof of this rule (in German).

  • Consider the continuations of your functions over the reals. You can now use L'Hôpital's rule; be mindful of its conditions²!

  • Have a look at the discrete equivalent, Stolz–Cesàro.
  • When factorials occur, use Stirling's formula:

    n! ∼ √(2πn) · (n/e)^n
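A quick numerical sanity check of Stirling's formula (a sketch; the function name is mine):

```python
import math

def stirling(n):
    """Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio n! / stirling(n) tends to 1 as n grows (roughly 1 + 1/(12n)).
for n in (10, 50, 100):
    print(n, math.factorial(n) / stirling(n))
```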

It is also useful to keep a pool of basic relations that you prove once and use often, such as:

  • logarithms grow more slowly than polynomials, i.e.

    (log n)^α ∈ o(n^β) for all α, β > 0.

  • order of polynomials:

    n^α ∈ o(n^β) for all α < β.

  • polynomials grow more slowly than exponentials:

    n^α ∈ o(c^n) for all α and all c > 1.
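These basic relations can also be illustrated numerically (large but finite n, so only a hint, not a proof; the exponents chosen below are arbitrary examples):

```python
import math

# (log n)^a in o(n^b): here a = 2, b = 1/2 -- the quotient is already small.
n = 10**15
r1 = math.log(n) ** 2 / n ** 0.5

# n^a in o(c^n) even for c barely above 1: here a = 5, c = 1.01.
m = 10_000
r2 = m ** 5 / 1.01 ** m

print(r1, r2)  # both much smaller than 1
```

Note how slowly the second relation kicks in: at m = 1000 the quotient is still astronomically large, which is a good reminder that asymptotic dominance says nothing about small inputs.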


It can happen that the above lemma is not applicable because the limit does not exist (for example, when the functions oscillate). In that case, consider the following characterisation of the Landau classes via limit superior / limit inferior:

With c_s := limsup_{n→∞} f(n)/g(n) we have

  • 0 ≤ c_s < ∞ ⟹ f ∈ O(g) and
  • c_s = 0 ⟹ f ∈ o(g).

With c_i := liminf_{n→∞} f(n)/g(n) we have

  • 0 < c_i ≤ ∞ ⟹ f ∈ Ω(g) and
  • c_i = ∞ ⟹ f ∈ ω(g).

Furthermore,

  • 0 < c_i, c_s < ∞ ⟹ f ∈ Θ(g) ⟺ g ∈ Θ(f) and
  • c_i = c_s = 1 ⟹ f ∼ g.
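For a concrete (and again purely numerical) example of the oscillating case: with f(n) = (2 + sin n)·n and g(n) = n the quotient oscillates forever, so the plain limit does not exist, but limsup = 3 and liminf = 1, giving f ∈ Θ(g). The sampling window below is an arbitrary choice:

```python
import math

# f(n)/g(n) = 2 + sin n oscillates between 1 and 3 forever, so the
# plain limit does not exist; limsup = 3 and liminf = 1 give f in Theta(g).
ratios = [(2 + math.sin(n)) * n / n for n in range(10**5, 10**5 + 1000)]
print(max(ratios), min(ratios))  # close to 3 and close to 1
```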

Check here and here if you are confused by my notation.


¹ Nota bene: My colleague wrote a Mathematica function that does this successfully for many functions, so the lemma really reduces the task to mechanical computation.

² See also here.

Raphael
@Juho Not publicly, afaik, but it's elementary to write yourself; compute Limit[f[n]/g[n], n -> Infinity] and perform a case distinction.
Raphael

Another tip: sometimes applying a monotone function (like log or exp) to the functions makes things clearer.

Suresh
This should be done carefully: 2n ∈ O(n), but 2^{2n} ∉ O(2^n).
Shaull
Seconded. The "apply monotone function" thing seems to be some kind of folklore which does not work in general. We have been working on sufficient criteria and have come up with what I posted in the latest revision of my answer.
Raphael

Skiena provides a sorted list of the dominance relations between the most common functions in his book, The Algorithm Design Manual:

n! ≫ c^n ≫ n^3 ≫ n^2 ≫ n^{1+ϵ} ≫ n lg n ≫ n ≫ n^{1/2}
≫ lg^2 n ≫ lg n ≫ lg n / lg lg n ≫ lg lg n ≫ α(n) ≫ 1

Here α(n) denotes the inverse Ackermann function.

Robert S. Barnes
That's an oddly specific list. Many of the relations (whatever ≫ means exactly) can be summarised to a handful of more general lemmata.
Raphael
It's his notation for a dominance relation.
Robert S. Barnes

Tip: draw graphs of these functions using something like Wolfram Alpha to get a feeling for how they grow. Note that this is not very precise, but if you try it for sufficiently large numbers, you should see the comparative patterns of growth. This of course is no substitute for a proof.

E.g., try: plot log(log(n)) from 1 to 10000 to see an individual graph or plot log(log(n)) and plot log(n) from 1 to 10000 to see a comparison.
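If no plotting tool is at hand, a small table of values gives a similar (equally non-rigorous) feel for the growth; the functions tabulated below are arbitrary examples:

```python
import math

# Tabulate a few growth rates at increasing n to eyeball their ordering.
print(f"{'n':>12} {'loglog n':>10} {'log n':>10} {'n log n':>16}")
for n in (10, 1000, 10**6, 10**9):
    print(f"{n:>12} {math.log(math.log(n)):>10.3f} "
          f"{math.log(n):>10.3f} {n * math.log(n):>16.1f}")
```

As with the plots, this is only a hint of what to prove: some orderings (e.g. polynomial vs. exponential) reverse far beyond any range you would reasonably tabulate.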

Dave Clarke
Should we really recommend voodoo?
Raphael
+1 for suggesting to draw graphs of the functions, although the linked graphs are rather confusing unless you know what they mean.
Tsuyoshi Ito
Take a graph as a hint what you might want to prove. That hint may be wrong of course.
gnasher729

I suggest proceeding according to the definitions of various notations. Start with some arbitrary pair of expressions, and determine the order of those, as outlined below. Then, for each additional element, find its position in the sorted list using binary search and comparing as below. So, for example, let's sort n^{log log n} and 2^n, the first two functions of n, to get the list started.

We use the property that n = 2^{log n} to rewrite the first expression as n^{log log n} = (2^{log n})^{log log n} = 2^{log n · log log n}. We could then proceed to use the definition to show that n^{log log n} = 2^{log n · log log n} ∈ o(2^n), since for any constant c > 0, there is an n_0 such that for n ≥ n_0, c · n^{log log n} = c · 2^{log n · log log n} < 2^n.

Next, we try 3^n. We compare it to 2^n, the largest element we have placed so far. Since 3^n = (2^{log 3})^n = 2^{n log 3}, we similarly show that 2^n ∈ o(3^n) = o(2^{n log 3}).

Etc.
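The exponent comparisons above can be checked numerically after rewriting each function as 2^(…) (a sketch; log here is base 2, written `lg`):

```python
import math

def lg(x):
    return math.log2(x)

# Rewrite each function as 2^(exponent) and compare just the exponents:
# n^(lg lg n) = 2^(lg n * lg lg n),  2^n = 2^n,  3^n = 2^(n * lg 3).
for n in (2**10, 2**20):
    e1 = lg(n) * lg(lg(n))  # exponent of n^(lg lg n)
    e2 = n                  # exponent of 2^n
    e3 = n * lg(3)          # exponent of 3^n
    assert e1 < e2 < e3
    print(n, e1, e2, e3)
```

This mirrors the first trick in the accepted answer: once both functions are powers of the same base, comparing exponents is enough.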

Patrick87

Here is a list from Wikipedia; the lower in the table, the bigger the complexity class:

Name                                        Running time
Constant time                               O(1)
Inverse Ackermann time                      O(α(n))
Iterated logarithmic time                   O(log* n)
Log-logarithmic                             O(log log n)
Logarithmic time                            O(log n)
Polylogarithmic time                        poly(log n)
Fractional power                            O(n^c), where 0 < c < 1
Linear time                                 O(n)
"n log star n" time                         O(n log* n)
Quasilinear time                            O(n log n)
Quadratic time                              O(n^2)
Cubic time                                  O(n^3)
Polynomial time                             poly(n) = 2^{O(log n)}
Quasi-polynomial time                       2^{O(poly(log n))}
Sub-exponential time (first definition)     O(2^{n^ϵ}) for all ϵ > 0
Sub-exponential time (second definition)    2^{o(n)}
Exponential time (with linear exponent)     2^{O(n)}
Exponential time                            2^{poly(n)}
Factorial time                              O(n!)

Note: poly(x) = x^{O(1)}.

kelalaka
Also, interesting how the table suggests that 2^{n log n} ∈ o(n!). While the table you link to is somewhat accurate, the one linked there (and which you copied) is about complexity classes, which is not a helpful thing to mix in here. Landau notation is not about "time".
Raphael
I put this here so the names of the complexity classes can be referred to directly. Yes, Landau is more about a specific type of algorithm in Cryptography.
kelalaka
I object to some of @Raphael's views. I have been a mathematician and an instructor for many years. I believe, apart from proving those things, a big table like this increases people's intuition easily and greatly. And the names of the asymptotic classes help people remember and communicate a lot.
Apass.Jack
@Apass.Jack In my teaching experience, when given a table many students will learn it by heart and fail to order wrt any function not in the table. Note how that effect seems to account for many of the questions regarding asymptotic growth that land on this site. That said, of course we'll use lemmata implied by the table if it makes proofs easier, but that comes after learning how to prove the table. (To emphasize that point, people who come here don't need help reading stuff off a table. They need help proving relations.)
Raphael
@kelalaka "Yes, Landau is more about a specific type of algorithm in Cryptography." -- that doesn't even make sense. Landau notation is a shorthand to describe properties of mathematical functions. It has nothing to do with algorithms let alone cryptography, per se.
Raphael