From 3d560320bcc1aac9ccd3a36ec3157e837976b773 Mon Sep 17 00:00:00 2001 From: Martin Mares Date: Sat, 3 May 2008 13:24:57 +0200 Subject: [PATCH] Corrections: Chapter 1. --- cover.tex | 2 +- mst.tex | 198 +++++++++++++++++++++++++++------------------------ notation.tex | 5 +- 3 files changed, 110 insertions(+), 95 deletions(-) diff --git a/cover.tex b/cover.tex index 4fe0d29..962ec7e 100644 --- a/cover.tex +++ b/cover.tex @@ -76,7 +76,7 @@ and individuals for academic or research purposes. \bigskip \leftline{Martin Mare\v{s}} -\leftline{Prague, 30 April, 2008} +\leftline{Prague, April 30th, 2008} \eject diff --git a/mst.tex b/mst.tex index 1cf428e..9aa9bdf 100644 --- a/mst.tex +++ b/mst.tex @@ -29,13 +29,13 @@ For a given graph~$G$ with weights $w:E(G)\rightarrow {\bb R}$: When comparing two weights, we will use the terms \df{lighter} and \df{heavier} in the obvious sense. \:A~\df{minimum spanning tree (MST)} of~$G$ is a spanning tree~$T$ such that its weight $w(T)$ - is the smallest possible of all the spanning trees of~$G$. + is the smallest possible among all the spanning trees of~$G$. \:For a disconnected graph, a \df{(minimum) spanning forest (MSF)} is defined as a union of (minimum) spanning trees of its connected components. \endlist Bor\o{u}vka's work was further extended by Jarn\'\i{}k \cite{jarnik:ojistem}, again in -mostly geometric setting, giving another efficient algorithm. However, when +mostly geometric setting. He has discovered another efficient algorithm. However, when computer science and graph theory started forming in the 1950's and the spanning tree problem was one of the central topics of the flourishing new disciplines, the previous work was not well known and the algorithms had to be @@ -44,7 +44,7 @@ rediscovered several times. In the next 50 years, several significantly faster algorithms were discovered, ranging from the $\O(m\timesbeta(m,n))$ time algorithm by Fredman and Tarjan \cite{ft:fibonacci}, over algorithms with inverse-Ackermann type complexity by Chazelle \cite{chazelle:ackermann} -and Pettie \cite{pettie:ackermann}, to another algorithm by Pettie \cite{pettie:optimal} +and Pettie \cite{pettie:ackermann}, to an~algorithm by Pettie \cite{pettie:optimal} whose time complexity is provably optimal. In the upcoming chapters, we will explore this colorful universe of MST algorithms. @@ -67,16 +67,16 @@ for the subgraphs as for the corresponding sets of edges. First of all, let us show that the weights on edges are not necessary for the definition of the MST. We can formulate an equivalent characterization using -an ordering of edges instead. +an~ordering of edges instead. \defnn{Heavy and light edges}\id{heavy}% Let~$T$ be a~spanning tree. Then: \itemize\ibull -\:For vertices $x$ and $y$, let $T[x,y]$ denote the (unique) path in~$T$ joining $x$ and~$y$. +\:For vertices $x$ and $y$, let $T[x,y]$ denote the (unique) path in~$T$ joining $x$ with~$y$. \:For an edge $e=xy$ we will call $T[e]:=T[x,y]$ the \df{path covered by~$e$} and the edges of this path \df{edges covered by~$e$}. -\:An edge~$e$ is called \df{light with respect to~$T$} (or just \df{$T$-light}) if it covers a heavier edge, i.e., if there - is an edge $f\in T[e]$ such that $w(f) > w(e)$. +\:An edge~$e$ is called \df{light with respect to~$T$} (or just \df{$T$-light}) if it covers a~heavier edge, i.e., if there + is an~edge $f\in T[e]$ such that $w(f) > w(e)$. \:An edge~$e$ is called \df{$T$-heavy} if it covers a~lighter edge. \endlist @@ -91,7 +91,7 @@ is not minimum. 
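As a~concrete illustration of the definitions of $T$-light and $T$-heavy edges above, the following minimal Python sketch (not part of the thesis; the adjacency-dict representation and all names are assumptions of the sketch) tests whether an edge is $T$-light by walking the path $T[x,y]$ that it covers:

def tree_path(tree_adj, x, y):
    """Edges of the unique path T[x,y] in the spanning tree, found by DFS from x."""
    parent = {x: None}
    stack = [x]
    while stack:
        u = stack.pop()
        for v in tree_adj[u]:
            if v not in parent:
                parent[v] = u
                stack.append(v)
    path = []
    while parent[y] is not None:                 # walk back from y towards x
        path.append(frozenset((y, parent[y])))
        y = parent[y]
    return path

def is_T_light(x, y, w, tree_adj):
    """True iff the edge xy covers some strictly heavier edge of the tree T."""
    e = frozenset((x, y))
    return any(w[f] > w[e] for f in tree_path(tree_adj, x, y))

# Tiny example: a path a-b-c with weights 1 and 5 taken as T, plus the edge ac of weight 3.
T = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
w = {frozenset(('a', 'b')): 1, frozenset(('b', 'c')): 5, frozenset(('a', 'c')): 3}
print(is_T_light('a', 'c', w, T))   # True: ac covers the heavier edge bc

Exchanging such a~covered heavier edge for the light edge is exactly the step used in the proof below.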
\proof If there is a $T$-light edge~$e$, then there exists an edge $e'\in T[e]$ such -that $w(e')>w(e)$. Now $T-e'$ is a forest of two trees with endpoints of~$e$ +that $w(e')>w(e)$. Now $T-e'$ ($T$~with the edge~$e'$ removed) is a forest of two trees with endpoints of~$e$ located in different components, so adding $e$ to this forest must restore connectivity and $T':=T-e'+e$ is another spanning tree with weight $w(T') = w(T)-w(e')+w(e) < w(T)$. Hence $T$ could not have been minimum. @@ -100,10 +100,10 @@ connectivity and $T':=T-e'+e$ is another spanning tree with weight $w(T') \figure{mst2.eps}{278pt}{An edge exchange as in the proof of Lemma~\ref{lightlemma}} The converse of this lemma is also true and to prove it, we will once again use -technique of transforming trees by \df{exchanges} of edges. In the proof of the +the technique of transforming trees by \df{exchanges of edges.} In the proof of the lemma, we have made use of the fact that whenever we exchange an edge~$e$ of -a spanning tree for another edge~$f$ covered by~$e$, the result is again -a spanning tree. In fact, it is possible to transform any spanning tree +a~spanning tree for another edge~$f$ covered by~$e$, the result is again +a~spanning tree. In fact, it is possible to transform any spanning tree to any other spanning tree by a sequence of exchanges. \lemman{Exchange property for trees}\id{xchglemma}% @@ -115,19 +115,21 @@ $T_{i+1}=T_i - e_i + e_i^\prime$ where $e_i\in T_i$ and $e_i^\prime\in T'$. \proof By induction on $d(T,T'):=\vert T\symdiff T'\vert$. When $d(T,T')=0$, both trees are identical and no exchanges are needed. Otherwise, the trees are different, -but as they are of the same size, there must exist an edge $e'\in T'\setminus T$. +but as they have the same number of edges, there must exist an edge $e'\in T'\setminus T$. The cycle $T[e']+e'$ cannot be wholly contained in~$T'$, so there also must exist an edge $e\in T[e']\setminus T'$. Exchanging $e$ for~$e'$ yields a spanning -tree $T^*:=T-e+e'$ such that $d(T^*,T')=d(T,T')-2$ and we can apply the induction +tree $T^*:=T-e+e'$ such that $d(T^*,T')=d(T,T')-2$. Now we can apply the induction hypothesis to $T^*$ and $T'$ to get the rest of the exchange sequence. \qed \figure{mst1.eps}{295pt}{One step of the proof of Lemma~\ref{xchglemma}} +\>In some cases, a~much stronger statement is true: + \lemman{Monotone exchanges}\id{monoxchg}% Let $T$ be a spanning tree such that there are no $T$-light edges and $T'$ be an arbitrary spanning tree. Then there exists a sequence of edge exchanges -transforming $T$ to~$T'$ such that the weight does not decrease in any step. +transforming $T$ to~$T'$ such that the weight of the tree does not decrease in any step. \proof We improve the argument from the previous proof, refining the induction step. @@ -141,14 +143,16 @@ $T'\setminus T^*$, since these are the only edges considered by the induction steps. To accomplish that, we replace the so far arbitrary choice of $e'\in T'\setminus T$ by picking the lightest such edge. -Now consider an edge $f\in T'\setminus T^*$. We want to show that $f$ is not +Let us consider an edge $f\in T'\setminus T^*$. We want to show that $f$ is not $T^*$-light, i.e., that it is heavier than all edges on $T^*[f]$. The path $T^*[f]$ is -either equal to the original path $T[f]$ (if $e\not\in T[f]$) or to $T[f] \symdiff C$, -where $C$ is the cycle $T[e']+e'$. 
The former case is trivial, in the latter one +either identical to the original path $T[f]$ (if $e\not\in T[f]$) or to $T[f] \symdiff C$, +where $C$ is the cycle $T[e']+e'$. The former case is trivial, in the latter we have $w(f)\ge w(e')$ due to the choice of $e'$ and all other edges on~$C$ are lighter than~$e'$ as $e'$ was not $T$-light. \qed +This lemma immediately implies that Lemma \ref{lightlemma} works in both directions: + \thmn{Minimality of spanning trees}\id{mstthm}% A~spanning tree~$T$ is minimum iff there is no $T$-light edge. @@ -163,7 +167,7 @@ and thus $T$~is also minimum. \qed In general, a single graph can have many minimum spanning trees (for example -a complete graph on~$n$ vertices and unit edge weights has $n^{n-2}$ +a complete graph on~$n$ vertices with unit edge weights has $n^{n-2}$ minimum spanning trees according to the Cayley's formula \cite{cayley:trees}). However, as the following theorem shows, this is possible only if the weight function is not injective. @@ -185,11 +189,11 @@ $T_1$ and $T_2$ must be identical. When $G$ is a graph with distinct edge weights, we will use $\mst(G)$ to denote its unique minimum spanning tree. -The following trivial lemma will be often invaluable: +Also the following trivial lemma will be often invaluable: \lemman{Edge removal} -Let~$G$ be a~graph with distinct edge weights and $e$ any its edge -which does not lie in~$\mst(G)$. Then $\mst(G-e) = \mst(G)$. +Let~$G$ be a~graph with distinct edge weights and $e \in G\setminus\mst(G)$. +Then $\mst(G-e) = \mst(G)$. \proof The tree $T=\mst(G)$ is also a~MST of~$G-e$, because every $T$-light @@ -213,11 +217,11 @@ problem, we will postpone it until Section \ref{kbestsect}. For the time being, we will always assume distinct weights. \obs -If all edge weights are distinct and $T$~is an~arbitrary tree, then for every tree~$T$ all edges are -either $T$-heavy, or $T$-light, or contained in~$T$. +If all edge weights are distinct and $T$~is an~arbitrary spanning tree, then every edge of~$G$ +is either $T$-heavy, or $T$-light, or contained in~$T$. \paran{Monotone isomorphism}% -Another useful consequence is that whenever two graphs are isomorphic and the +Another useful consequence of the Minimality theorem is that whenever two graphs are isomorphic and the isomorphism preserves the relative order of weights, the isomorphism applies to their MST's as well: \defn @@ -231,7 +235,7 @@ Let~$G_1$ and $G_2$ be two weighted graphs with distinct edge weights and $\pi$ a~monotone isomorphism between them. Then $\mst(G_2) = \pi[\mst(G_1)]$. \proof -The isomorphism~$\pi$ maps spanning trees onto spanning trees and it preserves +The isomorphism~$\pi$ maps spanning trees to spanning trees bijectively and it preserves the relation of covering. Since it is monotone, it preserves the property of being a light edge (an~edge $e\in E(G_1)$ is $T$-light $\Leftrightarrow$ the edge $\pi[e]\in E(G_2)$ is~$f[T]$-light). Therefore by the Minimality Theorem @@ -248,7 +252,7 @@ Most MST algorithms can be described as special cases of the following procedure \algn{Red-Blue Meta-Algorithm}\id{rbma}% \algo \algin A~graph $G$ with an edge comparison oracle (see \ref{edgeoracle}) -\:In the beginning, all edges are colored black. +\:At the beginning, all edges are colored black. 
\:Apply rules as long as possible: \::Either pick a cut~$C$ such that its lightest edge is not blue \hfil\break and color this edge blue, \cmt{Blue rule} \::or pick a cycle~$C$ such that its heaviest edge is not red \hfil\break and color this edge \rack{blue.}{red.\hfil} \cmt{Red rule} @@ -258,7 +262,7 @@ Most MST algorithms can be described as special cases of the following procedure \para This procedure is not a proper algorithm, since it does not specify how to choose the rule to apply. We will however prove that no matter how the rules are applied, -the procedure always stops and gives the correct result. Also, it will turn out +the procedure always stops and it gives the correct result. Also, it will turn out that each of the classical MST algorithms can be described as a specific way of choosing the rules in this procedure, which justifies the name meta-algorithm. @@ -268,10 +272,10 @@ We intend to prove that this is also the output of the procedure. \paran{Correctness}% Let us prove that the meta-algorithm is correct. First we show that the edges colored -blue in any step of the procedure always belong to~$T_{min}$ and that edges colored +blue in any step of the procedure always belong to~$T_{min}$ and that the edges colored red are guaranteed to be outside~$T_{min}$. Then we demonstrate that the procedure -always stops. We will prefer a~slightly more general formulation of the lemmata, which will turn out -to be useful in the future chapters. +always stops. Some parts of the proof will turn out to be useful in the upcoming chapters, +so we will state them in a~slightly more general way. \lemman{Blue lemma, also known as the Cut rule}\id{bluelemma}% The lightest edge of every cut is contained in the MST. @@ -307,19 +311,24 @@ $T_{min}$. As long as there exists a black edge, at least one rule can be applied. \proof -Assume that $e=xy$ be a black edge. Let us denote $M$ the set of vertices +Assume that $e=xy$ is a black edge. Let us define~$M$ as the set of vertices reachable from~$x$ using only blue edges. If $y$~lies in~$M$, then $e$ together -with some blue path between $x$ and $y$ forms a cycle and it must be the heaviest +with some blue path between $x$ and $y$ forms a cycle and $e$~must be the heaviest edge on this cycle. This holds because all blue edges have been already proven -to be in $T_{min}$ and there can be no $T_{min}$-light edges (see Theorem~\ref{mstthm}). -In this case we can apply the Red rule. +to be in $T_{min}$ and there can be no $T_{min}$-light edges. +In this case, we can apply the Red rule. On the other hand, if $y\not\in M$, then the cut formed by all edges between $M$ -and $V(G)\setminus M$ contains no blue edges, therefore we can use the Blue rule. +and $V\setminus M$ contains no blue edges, therefore we can use the Blue rule. \qed \figure{mst-bez.eps}{295pt}{Configurations in the proof of the Black lemma} +\nota\id{deltanota}% +We will use $\delta(M)$ to denote the cut separating~$M$ from its complement. +That is, $\delta(M) = E \cap (M \times (V\setminus M))$. We will also abbreviate +$\delta(\{v\})$ as~$\delta(v)$. + \thmn{Red-Blue correctness}% For any selection of rules, the Red-Blue procedure stops and the blue edges form the minimum spanning tree of the input graph. @@ -333,7 +342,7 @@ due to our Red and Blue lemmata. 
When no further rules can be applied, the Black lemma guarantees that all edges are colored, so by the Blue lemma all blue edges are in~$T_{min}$ and by the Red -lemma all other (red) edges are outside~$T_{min}$, so the blue edges are exactly~$T_{min}$. +lemma all other (red) edges are outside~$T_{min}$. Thus the blue edges are exactly~$T_{min}$. \qed \rem @@ -349,10 +358,10 @@ will however not pursue this direction in our work, referring the reader to the \section{Classical algorithms}\id{classalg}% -The three classical MST algorithms can be easily stated in terms of the Red-Blue meta-algorithm. -For each of them, we first show the general version of the algorithm, then we prove that -it gives the correct result and finally we discuss the time complexity of various -implementations. +The three classical MST algorithms (Bor\o{u}vka's, Jarn\'\i{}k's, and Kruskal's) can be easily +stated in terms of the Red-Blue meta-algorithm. For each of them, we first show the general version +of the algorithm, then we prove that it gives the correct result and finally we discuss the time +complexity of various implementations. \paran{Bor\o{u}vka's algorithm}% The oldest MST algorithm is based on a~simple idea: grow a~forest in a~sequence of @@ -362,7 +371,7 @@ edge of those having exactly one endpoint in the tree (we will call such edges the \df{neighboring edges} of the tree). We add all such edges to the forest and proceed with the next iteration. -\algn{Bor\o{u}vka \cite{boruvka:ojistem}, Choquet \cite{choquet:mst}, Sollin \cite{sollin:mst} and others} +\algn{Bor\o{u}vka \cite{boruvka:ojistem}, Choquet \cite{choquet:mst}, Sollin \cite{sollin:mst}, and others} \algo \algin A~graph~$G$ with an edge comparison oracle. \:$T\=$ a forest consisting of vertices of~$G$ and no edges. @@ -374,7 +383,8 @@ proceed with the next iteration. \endalgo \lemma\id{boruvkadrop}% -In each iteration of the algorithm, the number of trees in~$T$ drops at least twice. +In each iteration of the algorithm, the number of trees in~$T$ decreases by at least +a~factor of two. \proof Each tree gets merged with at least one of its neighbors, so each of the new trees @@ -385,7 +395,7 @@ contains two or more original trees. The algorithm stops in $\O(\log n)$ iterations. \lemma\id{borcorr}% -Bor\o{u}vka's algorithm outputs the MST of the input graph. +The Bor\o{u}vka's algorithm outputs the MST of the input graph. \proof In every iteration of the algorithm, $T$ is a blue subgraph, @@ -396,7 +406,7 @@ we do not need the Red rule to explicitly exclude edges. It remains to show that adding the edges simultaneously does not produce a cycle. Consider the first iteration of the algorithm where $T$ contains a~cycle~$C$. Without loss of generality we can assume that: -$$C=T_1[u_1v_1]\,v_1u_2\,T_2[u_2v_2]\,v_2u_3\,T_3[u_3v_3]\, \ldots \,T_k[u_kv_k]\,v_ku_1.$$ +$$C=T_1[u_1,v_1]\,v_1u_2\,T_2[u_2,v_2]\,v_2u_3\,T_3[u_3,v_3]\, \ldots \,T_k[u_k,v_k]\,v_ku_1.$$ Each component $T_i$ has chosen its lightest incident edge~$e_i$ as either the edge $v_iu_{i+1}$ or $v_{i-1}u_i$ (indexing cyclically). Suppose that $e_1=v_1u_2$ (otherwise we reverse the orientation of the cycle). Then $e_2=v_2u_3$ and $w(e_2) and \. The \ operation tests whether two elements are equivalent and \ joins two different equivalence classes into one. @@ -516,13 +526,13 @@ joins two different equivalence classes into one. \para We can maintain the connected components of our forest~$T$ as equivalence classes. 
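For concreteness, here is a~small Python sketch (mine, not the thesis's) of the textbook disjoint-set structure with union by rank and path compression, one standard way to realise the Find/Union interface described above, followed by the form Kruskal's algorithm then takes; all class and function names are choices of the sketch:

class DisjointSetUnion:
    def __init__(self, elements):
        self.parent = {x: x for x in elements}
        self.rank = {x: 0 for x in elements}

    def find(self, x):
        """Return the representative of x's class, compressing the path."""
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[x] != root:            # path compression
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, x, y):
        """Merge the classes of x and y; return False if they already coincide."""
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        if self.rank[rx] < self.rank[ry]:        # union by rank
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

def kruskal(vertices, weighted_edges):
    """Scan the edges in order of increasing weight; keep an edge iff its
    endpoints still lie in different components."""
    dsu = DisjointSetUnion(vertices)
    mst = []
    for u, v, w in sorted(weighted_edges, key=lambda e: e[2]):
        if dsu.union(u, v):
            mst.append((u, v, w))
    return mst
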
When we want to add an~edge~$uv$, we first call $\(u,v)$ to check if both endpoints of the edge lie in -the same components. If they do not, addition of this edge connects both components into one, +the same component. If they do not, addition of this edge connects both components into one, so we perform $\(u,v)$ to merge the equivalence classes. -Tarjan and van Leeuwen have shown that there is a~data structure for the DSU problem -with surprising efficiency: +Tarjan has shown that there is a~data structure for the DSU problem +of surprising efficiency: -\thmn{Disjoint Set Union, Tarjan and van Leeuwen \cite{tarjan:setunion}}\id{dfu}% +\thmn{Disjoint Set Union, Tarjan \cite{tarjan:setunion}}\id{dfu}% Starting with a~trivial equivalence with single-element classes, a~sequence of operations comprising of $n$~\s intermixed with $m\ge n$~\s can be processed in time $\O(m\timesalpha(m,n))$, where $\alpha(m,n)$ is a~certain inverse of the Ackermann's function @@ -535,12 +545,12 @@ See \cite{tarjan:setunion}. This completes the following theorem: \thm\id{kruskal}% -Kruskal's algorithm finds the MST of a given graph in time $\O(m\log n)$. +The Kruskal's algorithm finds the MST of a given graph in time $\O(m\log n)$. If the edges are already sorted by their weights, the time drops to $\O(m\timesalpha(m,n))$. \proof -We spend $\O(m\log n)$ on sorting, $\O(m\timesalpha(m,n))$ on processing the sequence +We spend $\O(m\log n)$ time on sorting, $\O(m\timesalpha(m,n))$ on processing the sequence of \s and \s, and $\O(m)$ on all other work. \qed @@ -572,15 +582,15 @@ There are two definitions of edge contraction that differ when an edge of a~triangle is contracted. Either we unify the other two edges to a single edge or we keep them as two parallel edges, leaving us with a~multigraph. We will use the multigraph version and we will show that we can easily reduce the multigraph -to a simple graph later. (See \ref{contract} for the exact definitions.) +to a~simple graph later. (See \ref{contract} for the exact definitions.) We only need to be able to map edges of the contracted graph to the original -edges, so each edge will carry a unique label $\ell(e)$ that will be preserved by +edges, so we let each edge carry a unique label $\ell(e)$ that will be preserved by contractions. \lemman{Flattening a multigraph}\id{flattening}% -Let $G$ be a multigraph and $G'$ its subgraph such that all loops have been -removed and each bundle of parallel edges replaced by its lightest edge. +Let $G$ be a multigraph and $G'$ its subgraph obtaining by removing loops +and replacing each bundle of parallel edges by its lightest edge. Then $G'$~has the same MST as~$G$. \proof @@ -598,25 +608,25 @@ lemma applied to a~two-edge cycle, as we will see in \ref{multimst}.) \:$\ell(e)\=e$ for all edges~$e$. \cmt{Initialize the labels.} \:While $n(G)>1$: \::For each vertex $v_k$ of~$G$, let $e_k$ be the lightest edge incident to~$v_k$. -\::$T\=T\cup \{ \ell(e_k) \}$. \cmt{Remember labels of all selected edges.} -\::Contract all edges $e_k$, inheriting labels and weights.\foot{In other words, we ask the comparison oracle for the edge $\ell(e)$ instead of~$e$.} -\::Flatten $G$, removing parallel edges and loops. +\::$T\=T\cup \{ \ell(e_1),\ldots,\ell(e_n) \}$.\hfil\break\cmt{Remember labels of all selected edges.} +\::Contract all edges $e_k$, inheriting labels and weights.\foot{In other words, we will ask the comparison oracle for the edge $\ell(e)$ instead of~$e$.} +\::Flatten $G$ (remove parallel edges and loops). 
\algout Minimum spanning tree~$T$. \endalgo \nota For the analysis of the algorithm, we will denote the graph considered by the algorithm at the beginning of the $i$-th iteration by $G_i$ (starting with $G_0=G$) and the number -of vertices and edges of this graph by $n_i$ and $m_i$ respectively. +of vertices and edges of this graph by $n_i$ and $m_i$ respectively. A~single iteration +of the algorithm will be called a~\df{Bor\o{u}vka step}. \lemma\id{contiter}% -The $i$-th iteration of the algorithm (also called the \df{Bor\o{u}vka step}) can be carried -out in time~$\O(m_i)$. +The $i$-th Bor\o{u}vka step can be carried out in time~$\O(m_i)$. \proof The only non-trivial parts are steps 6 and~7. Contractions can be handled similarly to the unions in the original Bor\o{u}vka's algorithm (see \ref{boruvkaiter}): -We build an auxiliary graph containing only the selected edges~$e_k$, find +We build an~auxiliary graph containing only the selected edges~$e_k$, find connected components of this graph and renumber vertices in each component to the identifier of the component. This takes $\O(m_i)$ time. @@ -643,6 +653,8 @@ edges and loops at the end of the previous iteration). Hence the total time spen in all iterations is $\O(\sum_i n_i^2) = \O(\sum_i n^2/4^i) = \O(n^2)$. \qed +On planar graphs, the algorithm runs much faster: + \thmn{Contractive Bor\o{u}vka on planar graphs, \cite{mm:mst}}\id{planarbor}% When the input graph is planar, the Contractive Bor\o{u}vka's algorithm runs in time $\O(n)$. @@ -651,7 +663,7 @@ time $\O(n)$. Let us refine the previous proof. We already know that $n_i \le n/2^i$. We will prove that when~$G$ is planar, the $m_i$'s are decreasing geometrically. We know that every $G_i$ is planar, because the class of planar graphs is closed under edge deletion and -contraction. Moreover, $G_i$~is also simple, so we can use the standard theorem on +contraction. Moreover, $G_i$~is also simple, so we can use the standard bound on the number of edges of planar simple graphs (see for example \cite{diestel:gt}) to get $m_i\le 3n_i \le 3n/2^i$. The total time complexity of the algorithm is therefore $\O(\sum_i m_i)=\O(\sum_i n/2^i)=\O(n)$. \qed @@ -660,7 +672,7 @@ The total time complexity of the algorithm is therefore $\O(\sum_i m_i)=\O(\sum_ There are several other possibilities how to find the MST of a planar graph in linear time. For example, Matsui \cite{matsui:planar} has described an algorithm based on simultaneously working on the graph and its topological dual. The advantage of our approach is that we do not need -to construct the planar embedding explicitly. We will show one more linear algorithm +to construct the planar embedding explicitly. We will show another simpler linear-time algorithm in section~\ref{minorclosed}. \rem @@ -682,17 +694,17 @@ their counterparts in~$G/e$. Then: $$\mst(G) = \pi^{-1}[\mst(G/e)] + e.$$ %Flattening lemma (\ref{flattening}), the MST stays the same and if we remove a parallel edge %or loop~$f$, then $\pi(f)$ would be removed when flattening~$G/e$, so $f$ never participates %in a MST. -The right-hand side of the equality is a spanning tree of~$G$, let us denote it by~$T$ and +The right-hand side of the equality is a spanning tree of~$G$. Let us denote it by~$T$ and the MST of $G/e$ by~$T'$. If $T$ were not minimum, there would exist a $T$-light edge~$f$ in~$G$ -(by Theorem \ref{mstthm}). If the path $T[f]$ covered by~$f$ does not contain~$e$, +(by the Minimality Theorem, \ref{mstthm}). 
If the path $T[f]$ covered by~$f$ does not contain~$e$, then $\pi[T[f]]$ is a path covered by~$\pi(f)$ in~$T'$. Otherwise $\pi(T[f]-e)$ is such a path. In both cases, $f$ is $T'$-light, which contradicts the minimality of~$T'$. (We do not have -a~multigraph version of the theorem, but the side we need is a~straightforward edge exchange, -which obviously works in multigraphs as well.) +a~multigraph version of the theorem, but the direction we need is a~straightforward edge exchange, +which obviously works in multigraphs as well as in simple graphs.) \qed \rem -In the previous algorithm, the role of the mapping~$\pi^{-1}$ is of course played by the edge labels~$\ell$. +In the Contractive Bor\o{u}vka's algorithm, the role of the mapping~$\pi^{-1}$ is of course played by the edge labels~$\ell$. \paran{A~lower bound}% Finally, we will show a family of graphs for which the $\O(m\log n)$ bound on time complexity @@ -701,29 +713,29 @@ the algorithm never compares two edges with the same weight. Therefore, when two graphs are monotonically isomorphic (see~\ref{mstiso}), the algorithm processes them in the same way. \defn -A~\df{distractor of order~$k$,} denoted by~$D_k$, is a path on $n=2^k$~vertices $v_1,\ldots,v_n$ +A~\df{distractor of order~$k$,} denoted by~$D_k$, is a path on $n=2^k$~vertices $v_1,\ldots,v_n$, where each edge $v_iv_{i+1}$ has its weight equal to the number of trailing zeroes in the binary representation of the number~$i$. The vertex $v_1$ is called a~\df{base} of the distractor. +\figure{distractor.eps}{\epsfxsize}{A~distractor $D_3$ and its evolution (bold edges are contracted)} + \rem Alternatively, we can use a recursive definition: $D_0$ is a single vertex, $D_{k+1}$ consists of two disjoint copies of~$D_k$ joined by an edge of weight~$k$. -\figure{distractor.eps}{\epsfxsize}{A~distractor $D_3$ and its evolution (bold edges are contracted)} - \lemma -A~single iteration of the contractive algorithm reduces~$D_k$ to a graph isomorphic with~$D_{k-1}$. +A~single iteration of the contractive algorithm reduces the distractor~$D_k$ to a~graph isomorphic with~$D_{k-1}$. \proof Each vertex~$v$ of~$D_k$ is incident with a single edge of weight~1. The algorithm therefore -selects all weight~1 edges and contracts them. This produces a graph which is -exactly $D_{k-1}$ with all weights increased by~1, which does not change the relative order of edges. +selects all weight~1 edges and contracts them. This produces a~graph that is +equal to $D_{k-1}$ with all weights increased by~1, which does not change the relative order of edges. \qed \defn A~\df{hedgehog}~$H_{a,k}$ is a graph consisting of $a$~distractors $D_k^1,\ldots,D_k^a$ of order~$k$ -together with edges of a complete graph on the bases of the distractors. These additional edges -have arbitrary weights, but heavier than the edges of all distractors. +together with edges of a complete graph on the bases of these distractors. The additional edges +have arbitrary weights that are heavier than the edges of all the distractors. \figure{hedgehog.eps}{\epsfxsize}{A~hedgehog $H_{5,2}$ (quills bent to fit in the picture)} @@ -733,10 +745,12 @@ A~single iteration of the contractive algorithm reduces~$H_{a,k}$ to a graph iso \proof Each vertex is incident with an edge of some distractor, so the algorithm does not select any edge of the complete graph. Contraction therefore reduces each distractor to a smaller -distractor (modulo an additive factor in weight) and leaves the complete graph intact. -This is monotonely isomorphic to $H_{a,k-1}$. 
+distractor (modulo an additive factor in weight) and it leaves the complete graph intact. +The resulting graph is monotonely isomorphic to $H_{a,k-1}$. \qed +When we set the parameters appropriately, we get the following lower bound: + \thmn{Lower bound for Contractive Bor\o{u}vka}% For each $n$ there exists a graph on $\Theta(n)$ vertices and $\Theta(n)$ edges such that the Contractive Bor\o{u}vka's algorithm spends time $\Omega(n\log n)$ on it. diff --git a/notation.tex b/notation.tex index e1e424f..7c9d525 100644 --- a/notation.tex +++ b/notation.tex @@ -61,13 +61,14 @@ \n{$\alpha(n)$}{diagonal inverse of the Ackermann's function \[ackerinv]} \n{$\alpha(m,n)$}{$\alpha(m,n) := \min\{ x\ge 1 \mid A(x,4\lceil m/n\rceil) > \log n \}$ \[ackerinv]} \n{$\beta(m,n)$}{$\beta(m,n) := \min\{i \mid \log^{(i)}n \le m/n \}$ \[itjarthm]} -\n{$\delta_G(U)$}{all edges connecting $U\subset V(G)$ with $V(G)\setminus U$; we usually omit the~$G$} -\n{$\delta_G(v)$}{edges of a one-vertex cut, i.e., $\delta_G(\{v\})$} +\n{$\delta_G(U)$}{the cut separating $U\subset V(G)$ from $V(G)\setminus U$ \[deltanota]} +\n{$\delta_G(v)$}{edges of a one-vertex cut, i.e., $\delta_G(\{v\})$ \[deltanota]} \n{$\Theta(g)$}{asymptotic~$\Theta$: $f=\Theta(g)$ iff $f=\O(g)$ and $f=\Omega(g)$} \n{$\lambda_i(n)$}{inverse of the $i$-th row of the Ackermann's function \[ackerinv]} \n{$\varrho({\cal C})$}{edge density of a graph class~$\cal C$ \[density]} \n{$\Omega(g)$}{asymptotic~$\Omega$: $f=\Omega(g)$ iff $\exists c>0: f(n)\ge g(n)$ for all~$n\ge n_0$} +%%\n{$x := y$}{$x$ is defined as~$y$} \n{$T[u,v]$}{the path in a tree~$T$ joining vertices $u$ and $v$ \[heavy]} \n{$T[e]$}{the path in a tree~$T$ joining the endpoints of an~edge~$e$ \[heavy]} \n{$A\symdiff B$}{symetric difference of sets: $(A\setminus B) \cup (B\setminus A)$} -- 2.39.5
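As a~companion to the Contractive Bor\o{u}vka's algorithm and the lower bound discussed above, the following rough Python sketch (not part of the thesis; the vertex numbering, all names, and the assumption of a~connected input with distinct edge weights are choices of the sketch) follows the algorithm step by step: every vertex selects its lightest incident edge, the original labels of the selected edges are recorded, and the graph is contracted and flattened.

def contractive_boruvka(n, edges):
    """Sketch of the Contractive Boruvka's algorithm.  Vertices are 0..n-1,
    edges are (u, v, w) triples with pairwise distinct weights, and the graph
    is assumed to be connected.  Returns the list of original MST edges."""
    mst = []
    # every current edge carries its original edge as the label l(e)
    current = [(u, v, w, (u, v, w)) for (u, v, w) in edges]
    while n > 1:
        # each vertex selects the lightest edge incident to it
        lightest = [None] * n
        for e in current:
            u, v, w, _ = e
            for x in (u, v):
                if lightest[x] is None or w < lightest[x][2]:
                    lightest[x] = e
        chosen = {e for e in lightest if e is not None}
        mst.extend(lbl for (_, _, _, lbl) in chosen)
        # contract: components of the selected forest become the new vertices
        comp = list(range(n))
        def find(x):
            while comp[x] != x:
                comp[x] = comp[comp[x]]
                x = comp[x]
            return x
        for u, v, _, _ in chosen:
            comp[find(u)] = find(v)
        renumber = {r: i for i, r in enumerate(sorted({find(x) for x in range(n)}))}
        n = len(renumber)
        # flatten: drop loops, keep only the lightest edge of each parallel bundle
        best = {}
        for u, v, w, lbl in current:
            cu, cv = renumber[find(u)], renumber[find(v)]
            if cu == cv:
                continue                          # the edge became a loop
            key = (min(cu, cv), max(cu, cv))
            if key not in best or w < best[key][2]:
                best[key] = (cu, cv, w, lbl)
        current = list(best.values())
    return mst

In this sketch the labels carried by the contracted edges play the role of the mapping $\ell$ (and thus of $\pi^{-1}$ from the Contraction lemma): the tree is always reported in terms of the edges of the original graph.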