on precomputed decision trees. We will omit some technical details, referring
the reader to Section 5.1 of Pettie's paper \cite{pettie:optimal}.
-\lemma
+\lemma\id{dtbasic}%
The decision tree complexity $D(m,n)$ of the MSF satisfies:
\numlist\ndotted
\:$D(m,n) \ge m/2$ for $m,n > 2$.
a~single~$C_i$ can be made isomorphic.
\qed
-\cor
+\cor\id{dtpart}%
If $C_1,\ldots,C_k$ are the subgraphs generated by the Partition procedure (Algorithm \ref{partition}),
then $D(\bigcup_i C_i) = \sum_i D(C_i)$.
$\bigcup C_i$, so we can apply Lemma \ref{compartsum} on them.
\qed
+\cor\id{dttwice}%
+$2D(m,n) \le D(2m,2n)$ for every $m,n$.
+
+\proof
+For an~arbitrary graph~$G$ with $m$~edges and $n$~vertices, we create a~graph~$G_2$
+consisting of two copies of~$G$ sharing a~single vertex. The copies of~$G$ are obviously
+compartments of~$G_2$, so by Lemma \ref{compartsum} it holds that $D(G_2) = 2D(G)$.
+Taking the maximum over all choices of~$G$ yields $D(2m,2n) \ge \max_G D(G_2) = 2D(m,n)$.
+\qed
+
%--------------------------------------------------------------------------------
\section{An optimal algorithm}
+Once we have developed the soft heaps, partitioning and MST decision trees,
+it is now simple to state Pettie's MST algorithm \cite{pettie:optimal}
+and prove that it is asymptotically optimal among all MST algorithms in
+comparison-based models. Several standard MST algorithms from the previous
+chapters will play their roles.
+
+We will describe the algorithm as a~recursive procedure. When the procedure is
+called on a~graph~$G$, it sets the parameter~$t$ to roughly $\log^{(3)} n$ and
+calls the \<Partition> procedure to split the graph into a~collection of
+subgraphs of size~$t$ and a~set of corrupted edges. Then it uses the precomputed decision
+trees to find the MSF of each small subgraph. The graph obtained by contracting
+the subgraphs is, on the other hand, dense enough for the Iterated Jarn\'\i{}k's
+algorithm to run on it in linear time. Afterwards we combine the MSF's of the subgraphs
+and of the contracted graph, mix in the corrupted edges and run two iterations
+of the Contractive Bor\o{u}vka's algorithm. The resulting graph has both the number of
+vertices and the number of edges reduced by a~constant factor and we recurse on it.
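To get a~feeling for the parameter~$t$: the thrice-iterated logarithm grows so
slowly that the subgraphs stay tiny even for absurdly large inputs. The following
sketch (a~hypothetical illustration in Python, approximating each logarithm by
$\lfloor\log_2\rfloor$ computed from the integer bit length, so it works for
arbitrarily large~$n$) evaluates it:

```python
def iterated_log2(n, iterations=3):
    """Apply floor(log2) repeatedly; roughly the t set in step 2 below."""
    for _ in range(iterations):
        n = n.bit_length() - 1
    return n

print(iterated_log2(10**9))       # a billion-vertex graph -> 2
print(iterated_log2(2**65536))    # an astronomically large graph -> 4
```

Since $t$ stays in single digits for any conceivable input, precomputing the
decision trees for all graphs of size at most~$t$ remains affordable.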
+
+\algn{Optimal MST algorithm, Pettie \cite{pettie:optimal}}\id{optimal}%
+\algo
+\algin A~connected graph~$G$ with an~edge comparison oracle.
+\:If $G$ has no edges, return an~empty tree.
+\:$t\=\lceil\log^{(3)} n\rceil$. \cmt{the size of subgraphs}
+\:Call \<Partition> (\ref{partition}) on $G$ and $t$ with $\varepsilon=1/8$. It returns
+ a~collection~$\C=\{C_1,\ldots,C_k\}$ of subgraphs and a~set~$R^\C$ of corrupted edges.
+\:$F_i \= \mst(C_i)$ for all~$i$ obtained using optimal decision trees.
+\:$G_A \= (G / \bigcup_i C_i) \setminus R^\C$. \cmt{the contracted graph}
+\:$F_A \= \msf(G_A)$ calculated by the Iterated Jarn\'\i{}k's algorithm (\ref{itjar}).
+\:$G_B \= \bigcup_i F_i \cup F_A \cup R^\C$. \cmt{combine subtrees with corrupted edges}
+\:Run two iterations of the Contractive Bor\o{u}vka's algorithm (\ref{contbor}) on~$G_B$,
+ getting a~contracted graph~$G_C$ and a~set~$F_B$ of MST edges.
+\:$F_C \= \mst(G_C)$ obtained by a~recursive call to this algorithm.
+\:Return $F_B \cup F_C$.
+\algout The minimum spanning tree of~$G$.
+\endalgo
+
+Correctness of this algorithm immediately follows from the Partitioning theorem (\ref{partthm})
+and from the proofs of the respective algorithms used as subroutines. Let us take a~look at
+the time complexity. We will be careful to use only the operations offered by the Pointer machine.
+
+\lemma\id{optlemma}%
+The time complexity $T(m,n)$ of the Optimal algorithm satisfies the following recurrence:
+$$
+T(m,n) \le \sum_i c_1 D(C_i) + T(m/2, n/4) + c_2 m,
+$$
+where~$c_1$ and~$c_2$ are some constants and $D$~is the decision tree complexity
+from the previous section.
+
+\proof
+The first two steps of the algorithm are trivial as we have linear time at our
+disposal.
+
+By the Partitioning theorem (\ref{partthm}), the call to \<Partition> with~$\varepsilon$
+set to a~constant takes $\O(m)$ time and it produces a~collection of subgraphs of size
+at most~$t$ and at most $m/4$ corrupted edges. It also guarantees that the
+connected components of the union of the $C_i$'s have at least~$t$ vertices
+(unless there is just a~single component).
+
+\FIXME{Decision trees and sorting}
+
+The contracted graph~$G_A$ has at most $n/t = \O(n / \log^{(3)}n)$ vertices and asymptotically
+the same number of edges as~$G$, so according to Corollary \ref{ijdens} the Iterated Jarn\'\i{}k's
+algorithm runs on it in linear time.
+
+The combined graph~$G_B$ has~$n$ vertices, but fewer than~$n$ edges coming from the
+individual spanning trees and at most~$m/4$ additional edges which were
+corrupted. The two iterations of Bor\o{u}vka's algorithm on~$G_B$ take $\O(m)$
+time by Lemma \ref{boruvkaiter} and they produce a~graph~$G_C$ with at most~$n/4$
+vertices and at most $n/4 + m/4 \le m/2$ edges. (The $n$~tree edges in~$G_B$ are guaranteed
+to be reduced by Bor\o{u}vka's algorithm.)
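The reduction promised in the last paragraph can be replayed on a~toy example.
Below is a~minimal sketch of one Bor\o{u}vka iteration (our own illustrative
code, not the pointer-machine implementation; it assumes distinct edge weights):
every component selects its lightest incident edge and the selected edges are
contracted, so the number of components at least halves, and two iterations
leave at most a~quarter of the vertices.

```python
def boruvka_step(comp, edges):
    """One Borůvka iteration.  comp: component label per vertex;
    edges: list of (w, u, v) with distinct weights w.
    Returns the component labelling after contracting the selected edges."""
    best = {}  # lightest edge leaving each component
    for w, u, v in edges:
        if comp[u] != comp[v]:
            for c in (comp[u], comp[v]):
                if c not in best or (w, u, v) < best[c]:
                    best[c] = (w, u, v)
    # contract the selected edges with a tiny union-find on labels
    parent = {c: c for c in set(comp)}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for w, u, v in set(best.values()):
        parent[find(comp[u])] = find(comp[v])
    return [find(c) for c in comp]

# A 16-vertex cycle with distinct weights: two iterations leave
# at most 16/4 = 4 components.
n = 16
edges = [(7 * i % n, i, (i + 1) % n) for i in range(n)]
comp = boruvka_step(list(range(n)), edges)
comp = boruvka_step(comp, edges)
print(len(set(comp)))
```

With distinct weights the selected edges all belong to the MST, which is why
the algorithm may keep them as the set~$F_B$.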
+
+The remaining steps of the algorithm can be easily performed in linear time, either directly
+or, in the case of the contractions, by the bucket-sorting techniques of Section \ref{bucketsort}.
+\qed
+
+\paran{Optimality}
+The properties of decision tree complexity, which we have proven in the previous
+section, will help us show that the time complexity recurrence is satisfied by the
+decision tree complexity $D(m,n)$ itself. This way, we prove the following theorem:
+
+\thmn{Optimality of Pettie's algorithm}
+The time complexity of the Optimal MST algorithm \ref{optimal} is $\Theta(D(m,n))$.
+
+\proof
+We will prove by induction that $T(m,n) \le cD(m,n)$ for some $c>0$. The base
+case is trivial; for the induction step, we will expand on the previous lemma:
+$$\vcenter{\openup\jot\halign{\strut\hfil $\displaystyle{#}$&$\displaystyle{{}#}$\hfil&\quad#\hfil\cr
+T(m,n)
+ &\le \sum_i c_1 D(C_i) + T(m/2, n/4) + c_2 m &(Lemma \ref{optlemma})\cr
+ &\le c_1 D(m,n) + T(m/2, n/4) + c_2m &(Corollary \ref{dtpart})\cr
+ &\le c_1 D(m,n) + cD(m/2, n/4) + c_2m &(induction hypothesis)\cr
+ &\le c_1 D(m,n) + c/2\cdot D(m,n/2) + c_2m &(Corollary \ref{dttwice})\cr
+ &\le c_1 D(m,n) + c/2\cdot D(m,n) + 2c_2 D(m,n) &(Lemma \ref{dtbasic})\cr
+ &\le (c_1 + c/2 + 2c_2) \cdot D(m,n)&\cr
+ &\le cD(m,n). &(by setting $c=2c_1+4c_2$)\cr
+}}$$
+The other inequality is obvious as $D(m,n)$ is an~asymptotic lower bound for
+the time complexity of every comparison-based algorithm.
+\qed
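The geometric decay that drives this induction can also be replayed numerically.
The following sketch unrolls the recurrence of Lemma \ref{optlemma} with
a~hypothetical stand-in $D(m,n)=m$ (which satisfies the bound $D\ge m/2$ of
Lemma \ref{dtbasic} and turns the doubling property of Corollary \ref{dttwice}
into an~equality) and checks the result against the constant $c=2c_1+4c_2$
chosen in the proof:

```python
def T(m, n, c1, c2):
    """Unroll T(m,n) <= c1*D(m,n) + T(m/2, n/4) + c2*m with the
    stand-in D(m,n) = m.  (n plays no role for this stand-in.)"""
    total = 0.0
    while m >= 1:
        total += c1 * m + c2 * m  # per-level cost: c1*D(m,n) + c2*m
        m, n = m / 2, n / 4
    return total

c1, c2 = 3.0, 5.0
c = 2 * c1 + 4 * c2
for m in (10.0, 1e6):
    assert T(m, 0, c1, c2) <= c * m  # c * D(m,n) with D(m,n) = m
```

The level costs form a~geometric series, so the total stays within a~constant
multiple of the top-level term, exactly as the induction step asserts.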
+
+\FIXME{Ackermann}
+
+\FIXME{Consequences for models}
+\FIXME{Random graphs}
\endpart