From ae4e56f7ffd761c0b94e198cfd0ef24b8b0edf20 Mon Sep 17 00:00:00 2001
From: Martin Mares
Date: Wed, 4 Jun 2008 14:32:21 +0200
Subject: [PATCH] More shortening.

---
 abstract.tex | 31 ++++++++++++-------------------
 1 file changed, 12 insertions(+), 19 deletions(-)

diff --git a/abstract.tex b/abstract.tex
index 8181a1c..ce4baa0 100644
--- a/abstract.tex
+++ b/abstract.tex
@@ -384,9 +384,9 @@ $\O(2^{r^\delta})$ on precomputing of tables.
 \section{Minor-closed graph classes}\id{minorclosed}%
 
 The contractive algorithm given in Section~\ref{contalg} has been found to perform
-well on planar graphs, but in the general case its time complexity was not linear.
-Can we find any broader class of graphs where this algorithm is still efficient?
-The right context turns out to be the minor-closed graph classes, which are
+well on planar graphs, but in general its time complexity was not linear.
+Can we find any broader class of graphs where the linear bound holds?
+The right context turns out to be the minor-closed classes, which are
 closed under contractions and have bounded density.
 
 \defn\id{minordef}%
@@ -797,17 +797,15 @@ components:
 \problemn{Dynamic connectivity}
 Maintain an~undirected graph under a~sequence of the following operations:
 \itemize\ibull
-\:$\<Init>(n)$ --- Create a~graph with $n$~isolated vertices $\{1,\ldots,n\}$.\foot{%
-The structure could support dynamic addition and removal of vertices, too,
-but this is easy to add and infrequently used, so we will rather keep the set
-of vertices fixed for clarity.}
+\:$\<Init>(n)$ --- Create a~graph with $n$~isolated vertices $\{1,\ldots,n\}$.
+(It is possible to modify the structure to support dynamic addition and removal of vertices, too.)
 \:$\<Insert>(G,u,v)$ --- Insert an~edge $uv$ to~$G$ and return its unique
 identifier. This assumes that the edge did not exist yet.
 \:$\<Delete>(G,e)$ --- Delete an~edge specified by its identifier from~$G$.
 \:$\<Connected>(G,u,v)$ --- Test if vertices $u$ and~$v$ are in the same connected
 component of~$G$.
 \endlist
-In this chapter, we will focus on the dynamic version of the minimum spanning forest.
+\>In this chapter, we will focus on the dynamic version of the minimum spanning forest.
 This problem seems to be intimately related to the dynamic connectivity.
 Indeed, all known algorithms for dynamic connectivity maintain some sort of a~spanning forest.
 This suggests that a~dynamic MSF algorithm could be obtained by modifying the
@@ -964,7 +962,7 @@ updated in time $\O(\log^2 n)$ amortized per operation.
 
 \paran{Fully dynamic MSF}%
 The decremental MSF algorithm can be turned to a~fully dynamic one by a~blackbox
-reduction whose properties are summarized in the following theorem:
+reduction of Holm et al.:
 
 \thmn{MSF dynamization, Holm et al.~\cite{holm:polylog}}
 Suppose that we have a~decremental MSF algorithm with the following properties:
@@ -1015,7 +1013,6 @@ In this section, we will require the edge weights to be numeric, because
 comparisons are certainly not sufficient to determine the second best spanning tree.
 We will assume that our computation model is able to add, subtract and compare
 the edge weights in constant time.
-
 Let us focus on finding the second lightest spanning tree first.
 
 \paran{Second lightest spanning tree}%
@@ -1036,9 +1033,7 @@ efficiently by the methods of Section~\ref{verifysect}. Therefore:
 \lemma
 For every graph~$H$ and a~MST $T$ of~$H$, linear time is sufficient to find
 edges $e\in T$ and $f\in H\setminus T$ such that $w(f)-w(e)$ is minimum.
-
-\nota
-We will call this procedure \df{finding the best exchange in $(H,T)$.}
+(We will call this procedure \df{finding the best exchange in $(H,T)$.})
 
 \cor
 Given~$G$ and~$T_1$, we can find~$T_2$ in time $\O(m)$.
@@ -1047,7 +1042,7 @@ Once we know~$T_1$ and~$T_2$, how to get~$T_3$?
 According to the Difference lemma, $T_3$~can be obtained by a~single exchange from either~$T_1$ or~$T_2$.
 Therefore we need to find the best exchange for~$T_2$ and the second best
 exchange for~$T_1$ and use the better of them.
-The latter is not easy to find directly, but we can get around it:
+The latter is not easy to find directly, so we observe:
 
 \obs\id{tbobs}%
 The tree $T_3$~can be obtained by a~single edge exchange in either $(G_1,T_1/e)$ or $(G_2,T_2)$:
@@ -1078,7 +1073,7 @@ Given~$G$ and~$T_1$, we can find $T_2,\ldots,T_K$ in time $\O(Km + K\log K)$.
 \paran{Invariant edges}%
 Our algorithm can be further improved for small values of~$K$ (which seems to be
 the common case in most applications) by the reduction of Eppstein \cite{eppstein:ksmallest}.
-We will observe that there are many edges of~$T_1$
+He has proven that there are many edges of~$T_1$
 which are guaranteed to be contained in $T_2,\ldots,T_K$ as well, and likewise
 there are many edges of $G\setminus T_1$ which are excluded from all those spanning trees.
 When we combine this with the previous construction, we get the following theorem:
@@ -1142,10 +1137,8 @@ improvements to $O(n\log n/\log \log n)$ by using the RAM data structures of Die
 
 Linear time complexity was reached by Myrvold and Ruskey \cite{myrvold:rank}
 for a~non-lexicographic order, which is defined locally by the history of the
-data structure --- in fact, they introduce a linear-time unranking algorithm
-first and then they derive an inverse algorithm without describing the order
-explicitly. However, they leave the problem of lexicographic ranking open.
-
+data structure.
+However, they leave the problem of lexicographic ranking open.
 We will describe a~general procedure which, when combined with suitable RAM
 data structures, yields a~linear-time algorithm for lexicographic (un)ranking.
-- 
2.39.5
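Editor's note, not part of the patch: the dynamic-connectivity interface defined in the second hunk (create, insert edge returning an identifier, delete by identifier, connectivity query) can be pinned down by a naive reference implementation. The class and method names below are mine; the connectivity query simply recomputes reachability, so it costs O(n + m) per call, nothing like the polylogarithmic structures the chapter discusses.

```python
from collections import defaultdict

class DynConn:
    """Naive sketch of the dynamic connectivity operations.
    Each query does a fresh graph search; this only fixes the semantics,
    it makes no attempt at the efficiency of the real data structures."""

    def __init__(self, n):                       # Init(n)
        self.n = n
        self.edges = {}                          # identifier -> (u, v)
        self.next_id = 0

    def insert(self, u, v):                      # Insert(G, u, v)
        e = self.next_id                         # unique edge identifier
        self.next_id += 1
        self.edges[e] = (u, v)
        return e

    def delete(self, e):                         # Delete(G, e)
        del self.edges[e]

    def connected(self, u, v):                   # Connected(G, u, v)
        adj = defaultdict(list)
        for a, b in self.edges.values():
            adj[a].append(b)
            adj[b].append(a)
        seen, stack = {u}, [u]                   # DFS from u
        while stack:
            x = stack.pop()
            if x == v:
                return True
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return False
```

Deleting by identifier rather than by endpoint pair matches the problem statement: parallel edges may exist, so the pair (u, v) alone would be ambiguous.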
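The "best exchange in (H, T)" step behind the second lightest spanning tree can be prototyped straight from its definition: over all non-tree edges f, pick the heaviest tree edge e on the tree path between f's endpoints, and minimize w(f) - w(e). The sketch below is quadratic time (the patch's lemma achieves linear time via the MST verification machinery of Section \ref{verifysect}); the function and variable names are mine.

```python
def best_exchange(n, edges, tree):
    """Find (e, f) with e in tree, f outside, e on the tree path between
    f's endpoints, minimizing w(f) - w(e).  Edges are (u, v, w) triples;
    tree is a spanning subset of edges.  O(m*n) sketch, not the linear-
    time method of the thesis."""
    adj = {v: [] for v in range(n)}              # adjacency of tree edges
    for (u, v, w) in tree:
        adj[u].append((v, (u, v, w)))
        adj[v].append((u, (u, v, w)))

    def heaviest_on_path(a, b):
        # DFS parent pointers from a, then walk back from b collecting
        # the maximum-weight tree edge on the unique a-b path.
        parent, stack = {a: None}, [a]
        while stack:
            x = stack.pop()
            for y, e in adj[x]:
                if y not in parent:
                    parent[y] = (x, e)
                    stack.append(y)
        best, x = None, b
        while x != a:
            px, e = parent[x]
            if best is None or e[2] > best[2]:
                best = e
            x = px
        return best

    tree_set = set(tree)
    best_pair, best_diff = None, None
    for f in edges:
        if f in tree_set:
            continue
        u, v, wf = f
        e = heaviest_on_path(u, v)
        if e is not None and (best_diff is None or wf - e[2] < best_diff):
            best_diff, best_pair = wf - e[2], (e, f)
    return best_pair
```

Applying the returned exchange (remove e, add f) to the MST yields a second lightest spanning tree, which is exactly how the corollary in the fifth hunk obtains T2 from T1.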
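The last hunk cites the linear-time, non-lexicographic (un)ranking of Myrvold and Ruskey. As a reading aid, here is a compact sketch of that algorithm over permutations of {0, ..., n-1}: unranking performs one swap per step from the mixed-radix digits of the rank, and ranking undoes the swaps, which is where the order being "defined by the history of the data structure" comes from. Variable names are mine.

```python
def unrank(n, r):
    # Myrvold-Ruskey unranking: swap position i-1 with position r mod i,
    # then shrink the problem to size i-1 with rank r // i.
    p = list(range(n))
    for i in range(n, 0, -1):
        p[i - 1], p[r % i] = p[r % i], p[i - 1]
        r //= i
    return p

def rank(p):
    # Inverse: undo the swaps, reading off one mixed-radix "digit" per
    # step; maintaining the inverse permutation q keeps each step O(1),
    # so the whole ranking runs in linear time.
    p = list(p)
    n = len(p)
    q = [0] * n                                  # q = inverse of p
    for i, v in enumerate(p):
        q[v] = i
    digits = []
    for i in range(n, 1, -1):
        s = p[i - 1]                             # digit for radix i
        j = q[i - 1]                             # position of value i-1
        p[i - 1], p[j] = p[j], p[i - 1]
        q[s], q[i - 1] = q[i - 1], q[s]
        digits.append(s)
    r = 0                                        # Horner evaluation:
    for i, s in enumerate(reversed(digits), start=2):
        r = s + i * r                            # rank_i = s_i + i*rank_{i-1}
    return r
```

The order induced by these ranks is not lexicographic, which is precisely the gap the patch's final paragraph addresses with a general lexicographic (un)ranking procedure.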