pages={241--273},
year={2004}
}
+
+@inproceedings{ takaoka:twothree,
+ title={{Theory of 2-3 Heaps}},
+ author={Takaoka, Tadao},
+ booktitle={Computing and Combinatorics: 5th Annual International Conference, COCOON'99},
+ location={Tokyo, Japan},
+ year={1999},
+ pages={41--50},
+ publisher={Springer},
+ series={{Lecture Notes in Computer Science}},
+ volume={1627}
+}
+
+@inproceedings{ takaoka:trinomial,
+ title={{Theory of Trinomial Heaps}},
+ author={Takaoka, Tadao},
+ booktitle={Computing and Combinatorics: 6th Annual International Conference, COCOON 2000},
+ location={Sydney, Australia},
+ year={2000},
+ pages={362--372},
+ publisher={Springer},
+ series={{Lecture Notes in Computer Science}},
+ volume={1858}
+}
+
+@book{ clrs,
+ title={{Introduction to Algorithms}},
+ author={Cormen, T.H. and Leiserson, C.E. and Rivest, R.L. and Stein, C.},
+ year={2001},
+ publisher={McGraw-Hill}
+}
This chapter attempts to survey the important algorithms for finding the MST and
also presents several new ones.
+%--------------------------------------------------------------------------------
+
\section{Basic properties}
In this section, we will examine the basic properties of spanning trees and prove
is the MST of~$G_1$ if and only if $\pi[T]$ is the MST of~$G_2$.
\qed
+%--------------------------------------------------------------------------------
+
\section{The Red-Blue meta-algorithm}
Most MST algorithms can be described as special cases of the following procedure
lemma all other (red) edges are outside~$T_{min}$, so the blue edges are exactly~$T_{min}$.
\qed
+%--------------------------------------------------------------------------------
+
\section{Classical algorithms}
The three classical MST algorithms can be easily stated in terms of the Red-Blue meta-algorithm.
Follows from the previous lemmata.
\qed
-\algn{Jarn\'\i{}k \cite{jarnik:ojistem}, Prim \cite{prim:mst}, Dijkstra \cite{dijkstra:mst}}
+\algn{Jarn\'\i{}k \cite{jarnik:ojistem}, Prim \cite{prim:mst}, Dijkstra \cite{dijkstra:mst}}\id{jarnik}%
\algo
\algin A~graph~$G$ with an edge comparison oracle.
\:$T\=$ a single-vertex tree containing an~arbitrary vertex of~$G$.
Follows from the above analysis.
\qed
+%--------------------------------------------------------------------------------
+
\section{Contractive algorithms}
While the classical algorithms are based on growing suitable trees, they
edges as every $H_{a,k}$ contains a complete graph on~$a$ vertices.
\qed
+%--------------------------------------------------------------------------------
+
\section{Minor-closed graph classes}
The contracting algorithm given in the previous section has been found to perform
\figure{hexangle.eps}{\epsfxsize}{The construction from Remark~\ref{hexa}}
+%--------------------------------------------------------------------------------
\section{Using Fibonacci heaps}
\id{fibonacci}
+We have seen that Jarn\'\i{}k's Algorithm \ref{jarnik} runs in $\O(m\log n)$ time
+(and this bound can be easily shown to be tight). Fredman and Tarjan have shown a~faster
+implementation in~\cite{ft:fibonacci} using their Fibonacci heaps. In this section,
+we present their results and show several interesting consequences.
+
+The previous implementation of the algorithm used a~binary heap to store all edges
+of the cut~$\delta(T)$. Instead of that, we will remember the vertices adjacent
+to~$T$ and for each such vertex~$v$ we will keep the lightest edge~$uv$ such that $u$~lies
+in~$T$. We will call these edges \df{active edges} and keep them in a~heap, ordered by weight.
+
+When we want to extend~$T$ by the lightest edge of~$\delta(T)$, it is sufficient to
+find the lightest active edge~$uv$ and add this edge to~$T$ together with a new vertex~$v$.
+Then we have to update the active edges as follows. The edge~$uv$ has just ceased to
+be active. We scan all neighbors~$w$ of the vertex~$v$. When $w$~is in~$T$, no action
+is needed. If $w$~is outside~$T$ and it was not adjacent to~$T$ (there is no active edge
+remembered for it so far), we set the edge~$vw$ as active. Otherwise we check the existing
+active edge for~$w$ and replace it by~$vw$ if the new edge is lighter.
+
+The following algorithm shows how these operations translate to insertions, decreases
+and deletions on the heap.
+
+\algn{Jarn\'\i{}k with active edges, Fredman and Tarjan \cite{ft:fibonacci}}\id{jarniktwo}%
+\algo
+\algin A~graph~$G$ with an edge comparison oracle.
+\:$v_0\=$ an~arbitrary vertex of~$G$.
+\:$T\=$ a tree containing just the vertex~$v_0$.
+\:$H\=$ a~heap of active edges stored as pairs $(u,v)$ where $u\in T,v\not\in T$, ordered by the weights $w(uv)$, initially empty.
+\:$A\=$ an~auxiliary array mapping vertices outside~$T$ to their active edges in the heap; initially all elements undefined.
+\:\<Insert> all edges incident with~$v_0$ to~$H$ and update~$A$ accordingly.
+\:While $H$ is not empty:
+\::$(u,v)\=\<DeleteMin>(H)$.
+\::$T\=T+uv$.
+\::For all edges $vw$ such that $w\not\in T$:
+\:::If there exists an~active edge~$A(w)$:
+\::::If $vw$ is lighter than~$A(w)$, \<Decrease> $A(w)$ to~$(v,w)$ in~$H$.
+\:::If there is no such edge, then \<Insert> $(v,w)$ to~$H$ and set $A(w)$ to this new edge.
+\algout Minimum spanning tree~$T$.
+\endalgo
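+
+\rem
+For concreteness, the following short sketch shows one possible way of turning the
+bookkeeping of active edges into executable code. It is not taken from~\cite{ft:fibonacci}:
+it is written in Python and it relies on the standard heapq module, which offers no
+\<Decrease>, so a~new heap entry is inserted whenever a~lighter edge is found and the
+outdated entries are skipped upon deletion (the usual lazy-deletion trick). This
+simplification achieves only the ordinary binary-heap bound $\O(m\log n)$, but the array
+``active'' mirrors the array~$A$ of Algorithm~\ref{jarniktwo}.
+
+import heapq
+
+def jarnik_mst(n, adj):
+    # adj[v] lists the pairs (w, weight) of edges incident with v; the vertices
+    # are numbered 0..n-1 and the graph is assumed to be connected.
+    in_tree = [False] * n
+    active = [None] * n   # active[w] = (weight, u): lightest known edge uw with u in T
+    heap = []             # entries (weight, u, w); some of them may be outdated
+    tree = []
+
+    def scan(v):          # add v to T and update the active edges of its neighbors
+        in_tree[v] = True
+        for w, wt in adj[v]:
+            if not in_tree[w] and (active[w] is None or wt < active[w][0]):
+                active[w] = (wt, v)
+                heapq.heappush(heap, (wt, v, w))   # no Decrease available: push anew
+
+    scan(0)
+    while heap:
+        wt, u, w = heapq.heappop(heap)             # DeleteMin
+        if in_tree[w] or active[w] != (wt, u):
+            continue                               # outdated entry, skip it
+        tree.append((u, w, wt))                    # uw becomes a tree edge
+        scan(w)
+    return tree
+
+An implementation matching the $\O(m+n\log n)$ bound would replace heapq by a~heap
+supporting constant-time \<Insert> and \<Decrease>, such as the Fibonacci heap.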
+
+\thmn{Fibonacci heaps} The~Fibonacci heap performs the following operations
+with the indicated amortized time complexity:
+\itemize\ibull
+\:\<Insert> (insertion of a~new element) in $\O(1)$,
+\:\<Decrease> (decreasing value of an~existing element) in $\O(1)$,
+\:\<Merge> (merging of two heaps into one) in $\O(1)$,
+\:\<DeleteMin> (deletion of the minimal element) in $\O(\log n)$,
+\:\<Delete> (deletion of an~arbitrary element) in $\O(\log n)$,
+\endlist
+\>where $n$ is the maximum number of elements present in the heap at the time of
+the operation.
+
+\proof
+See Fredman and Tarjan \cite{ft:fibonacci} for both the description of the Fibonacci
+heap and the proof of this theorem.
+\qed
+
+\thm
+Algorithm~\ref{jarniktwo} with a~Fibonacci heap finds the MST of the input graph in time~$\O(m+n\log n)$.
+
+\proof
+The algorithm always stops, because every edge is inserted into the heap~$H$ at most once
+and every pass of the main loop removes one element of~$H$. The lightest active edge is always
+the lightest edge of the cut~$\delta(T)$, so the algorithm selects exactly the same edges
+as the original Jarn\'\i{}k's algorithm and it gives the correct answer.
+
+The time complexity is $\O(m)$ plus the cost of the heap operations. The algorithm
+performs at most one \<Insert> or \<Decrease> per edge and at most one \<DeleteMin>
+per vertex, and the heap never contains more than~$n$ elements at any moment.
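+By the previous theorem, the heap operations therefore cost at most
+$$
+m\cdot\O(1) + n\cdot\O(\log n) = \O(m+n\log n),
+$$
+which together with the remaining $\O(m)$ work spent on scanning the neighbors gives the claimed bound.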
+\qed
+
+\cor
+For graphs with edge density at least $\log n$, this algorithm runs in linear time.
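+
+\proof
+If the edge density $m/n$ is at least $\log n$, then $n\log n\le m$, so the bound
+$\O(m+n\log n)$ of the previous theorem is $\O(m)$.
+\qed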
+
+\rem
+We can consider using other kinds of heaps which have the property that inserts
+and decreases are faster than deletes. Of course, the Fibonacci heaps are asymptotically
+optimal (by the standard $\Omega(n\log n)$ lower bound on sorting by comparisons, see
+for example \cite{clrs}), so the other data structures can improve only
+multiplicative constants or offer an~easier implementation.
+
+A~nice example is a~\df{$d$-regular heap} --- a~variant of the usual binary heap
+in the form of a~complete $d$-regular tree. \<Insert>, \<Decrease> and other operations
+involving bubbling the values up spend $\O(1)$ time at a~single level, so they run
+in~$\O(\log_d n)$ time. \<Delete> and \<DeleteMin> require bubbling down, which incurs
+comparison with all~$d$ sons at every level, so they run in~$\O(d\log_d n)$.
+With this structure, the time complexity of the whole algorithm
+is $\O(nd\log_d n + m\log_d n)$, which suggests setting $d=m/n$, giving $\O(m\log_{m/n}n)$.
+This is still linear for graphs with density at~least~$n^{\varepsilon}$.
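+To see the latter claim, note that
+$$
+\O(m\log_{m/n}n) = \O\biggl(m\cdot{\log n\over\log(m/n)}\biggr),
+$$
+so whenever the density $m/n$ is at least $n^\varepsilon$ for a~fixed $\varepsilon>0$, the
+fraction is bounded by $1/\varepsilon=\O(1)$ and the whole bound becomes $\O(m)$.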
+
+Another possibility is to use 2-3 heaps \cite{takaoka:twothree} or trinomial
+heaps \cite{takaoka:trinomial}. Both have the same asymptotic complexity as Fibonacci
+heaps (the latter even in worst case, but it does not matter here) and their
+authors claim implementation advantages.
+
+\FIXME{Mention Thorup's Fibonacci-like heaps for integers?}
+
+
+
% G has to be connected, so m=O(n)
% mention Steiner trees
% mention matroids
% impedance mismatch in terminology: contraction of G along e vs. contraction of e.
% use \delta(X) notation
% mention disconnected graphs
\endpart