\chapter{Dynamic Spanning Trees}\id{dynchap}%

\section{Dynamic graph algorithms}

In many applications, we often need to solve a~certain graph problem for a~sequence
of graphs that differ only a~little, so recomputing the solution from scratch for
every graph would be a~waste of time. In such cases, we usually turn our attention
to \df{dynamic graph algorithms.} A~dynamic algorithm is in fact a~data structure
that remembers a~graph and offers operations that modify the structure of the graph
and also operations that query the result of the problem for the current state
of the graph. A~typical example of a~problem of this kind is dynamic
maintenance of connected components:
\problemn{Dynamic connectivity}
Maintain an~undirected graph under a~sequence of the following operations:

\:$\<Init>(n)$ --- Create a~graph with $n$~isolated vertices $\{1,\ldots,n\}$.\foot{%
The structure could support dynamic addition and removal of vertices, too,
but this is easy to add and infrequently used, so we will rather keep the set
of vertices fixed for clarity.}
\:$\<Insert>(G,u,v)$ --- Insert an~edge $uv$ to~$G$ and return its unique
identifier. This assumes that the edge did not exist yet.
\:$\<Delete>(G,e)$ --- Delete an~edge specified by its identifier from~$G$.
\:$\<Connected>(G,u,v)$ --- Test if $u$ and~$v$ are in the same connected component of~$G$.
We have already encountered a~special case of dynamic connectivity when implementing
Kruskal's algorithm in Section \ref{classalg}. At that time, we did not need to delete
any edges from the graph, which makes the problem substantially easier. This special
case is customarily called an~\df{incremental} or \df{semidynamic} graph algorithm.
We mentioned the Disjoint Set Union data structure of Tarjan and van Leeuwen (Theorem \ref{dfu})
which can be used for that: Connected components are represented as equivalence classes.
Queries on connectedness translate to \<Find>, edge insertions to \<Find>
followed by \<Union> if the new edge joins two different components. This way,
a~sequence of $m$~operations starting with an~empty graph on $n$~vertices is
processed in time $\O(n+m\timesalpha(m,n))$ and this holds even for the Pointer
Machine. Fredman and Saks \cite{fredman:cellprobe} have proven a~matching lower
bound in the cell-probe model which is stronger than RAM with $\O(\log n)$-bit
words.
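The Union--Find approach just described can be sketched in a~few lines. The following
is a~minimal illustration (path compression plus union by rank), not the exact
Tarjan--van Leeuwen implementation; all class and method names are ours:

```python
# Semidynamic (insert-only) connectivity via Disjoint Set Union.
# Path compression + union by rank; names are illustrative.

class DSU:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, v):
        # Path halving: point every visited vertex at its grandparent.
        while self.parent[v] != v:
            self.parent[v] = self.parent[self.parent[v]]
            v = self.parent[v]
        return v

    def insert_edge(self, u, v):
        # Union by rank; returns True iff the edge joined two components
        # (i.e., it enters the maintained spanning forest).
        ru, rv = self.find(u), self.find(v)
        if ru == rv:
            return False
        if self.rank[ru] < self.rank[rv]:
            ru, rv = rv, ru
        self.parent[rv] = ru
        if self.rank[ru] == self.rank[rv]:
            self.rank[ru] += 1
        return True

    def connected(self, u, v):
        return self.find(u) == self.find(v)
```

The edges for which \<insert_edge> returns \<True> are exactly the edges that
trigger a~\<Union>, i.e.\ a~spanning forest of the current graph.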
The edges that have triggered the \<Union>s form a~spanning forest of the current graph.
So far, all known algorithms for dynamic connectivity maintain some sort of a~spanning
tree. This suggests that a~dynamic MST algorithm could be obtained by modifying the
dynamic connectivity algorithms. This will indeed turn out to be true. Semidynamic MST
is easy to achieve within the few pages of this section, but making it fully dynamic will require
more effort, so we will review some of the required building blocks before going into that.
We however have to answer one important question first: What should be the output of
our MSF data structure? Adding an~operation that would return the MSF of the current
graph is of course possible, but somewhat impractical as this operation has to
spend $\Omega(n)$ time on the mere writing of its output. A~better way seems to
be making the \<Insert> and \<Delete> operations report the list of modifications
of the MSF implied by the change in the graph.
Let us see what happens when we \<Insert> an~edge~$e$ to a~graph~$G$ with its minimum spanning
forest~$F$, obtaining a~new graph~$G'$ with its MSF~$F'$. If $e$~connects two components of~$G$ (and
therefore also of~$F$), we have to add~$e$ to~$F$. Otherwise, one of the following cases happens:
Either $e$~is $F$-heavy and so the forest~$F$ is also the MSF of the new graph. Or it is $F$-light
and we have to modify~$F$ by exchanging the heaviest edge~$f$ of the path $F[e]$ with~$e$.
Correctness of the former case follows immediately from the Theorem on Minimality by order
(\ref{mstthm}), because all $F'$-light edges would also be $F$-light, which is impossible as $F$~was
minimum. In the latter case, the edge~$f$ is not contained in~$F'$ because it is the heaviest
on the cycle $F[e]+e$ (by the Red lemma, \ref{redlemma}). We can now use the Blue lemma
(\ref{bluelemma}) to prove that it should be replaced with~$e$. Consider the tree~$T$
of~$F$ that contains both endpoints of the edge~$e$. When we remove~$f$ from~$F$, this tree falls
apart to two components $T_1$ and~$T_2$. The edge~$f$ was the lightest edge of the cut~$\delta_G(T_1)$
and $e$~is lighter than~$f$, so $e$~is the lightest in~$\delta_{G'}(T_1)$ and hence $e\in F'$.
A~\<Delete> of an~edge that is not contained in~$F$ does not change~$F$. When we delete
an~MSF edge, we have to reconnect~$F$ by choosing the lightest edge of the cut separating
the new components (again the Blue lemma in action). If there is no such
replacement edge, we have deleted a~bridge, so the MSF has to remain disconnected.
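The two cases of \<Delete> can be checked on a~toy example. The following naive
sketch (our own illustration; a~linear-time scan over all edges, with edges
represented as triples $(u,v,\hbox{weight})$) finds the lightest edge across the
cut, or reports a~bridge:

```python
# Naive illustration of the Delete cases: a non-tree edge leaves the
# MSF untouched; a tree edge is replaced by the lightest edge crossing
# the cut between the two new components (Blue lemma), if one exists.

def component(vertices, forest_edges, start):
    # Vertices reachable from `start` using forest edges only.
    adj = {v: [] for v in vertices}
    for (u, v, _) in forest_edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {start}, [start]
    while stack:
        x = stack.pop()
        for y in adj[x]:
            if y not in seen:
                seen.add(y)
                stack.append(y)
    return seen

def delete_edge(vertices, graph_edges, forest_edges, e):
    # Returns the updated forest and a description of the change.
    graph_edges.remove(e)
    if e not in forest_edges:
        return forest_edges, "no change"
    forest_edges = [f for f in forest_edges if f != e]
    u, v, _ = e
    t1 = component(vertices, forest_edges, u)
    crossing = [f for f in graph_edges if (f[0] in t1) != (f[1] in t1)]
    if not crossing:
        return forest_edges, "bridge deleted"
    repl = min(crossing, key=lambda f: f[2])
    return forest_edges + [repl], "replaced by " + str(repl)
```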
The idea of reporting differences in the MSF indeed works very well. We can summarize
what we have shown in the following lemma and use it to define the dynamic MSF.
\lemma
An~\<Insert> or \<Delete> of an~edge in~$G$ causes at most one edge addition, edge
removal or edge exchange in $\msf(G)$.
\problemn{Dynamic minimum spanning forest}
Maintain an~undirected graph with distinct weights on edges (drawn from a~totally ordered set)
and its minimum spanning forest under a~sequence of the following operations:

\:$\<Init>(n)$ --- Create a~graph with $n$~isolated vertices $\{1,\ldots,n\}$.
\:$\<Insert>(G,u,v)$ --- Insert an~edge $uv$ to~$G$. Return its unique
identifier and the list of additions and deletions of edges in $\msf(G)$.
\:$\<Delete>(G,e)$ --- Delete an~edge specified by its identifier from~$G$.
Return the list of additions and deletions of edges in $\msf(G)$.
\paran{Semidynamic MSF}%
To obtain a~semidynamic MSF algorithm, we need to keep the forest in a~data structure that
supports search for the heaviest edge on the path connecting a~given pair
of vertices. This can be handled efficiently by the Link-Cut trees of Sleator and Tarjan:
\thmn{Link-Cut Trees, Sleator and Tarjan \cite{sleator:trees}}\id{sletar}%
There is a~data structure that represents a~forest of rooted trees on~$n$ vertices.
Each edge of the forest has a~weight drawn from a~totally ordered set. The structure
supports the following operations in time $\O(\log n)$ amortized:\foot{%
The Link-Cut trees offer many other operations, but we do not mention them
as they are not needed in our application.}

\:$\<Parent>(v)$ --- Return the parent of~$v$ in its tree or \<null> if $v$~is a~root.
\:$\<Root>(v)$ --- Return the root of the tree containing~$v$.
\:$\<Weight>(v)$ --- Return the weight of the edge $(\<Parent>(v),v)$.
\:$\<PathMax>(v)$ --- Return the vertex~$w$ closest to $\<Root>(v)$ such that the edge
$(\<Parent>(w),w)$ is the heaviest of those on the path from the root to~$v$.
If multiple edges have the maximum weight, break the tie arbitrarily.
If there is no such edge ($v$~is the root itself), return \<null>.
\:$\<Link>(u,v,x)$ --- Connect the trees containing $u$ and~$v$ by an~edge $(u,v)$ of
weight~$x$. Assumes that $u$~is a~tree root and $v$~lies in a~different tree.
\:$\<Cut>(v)$ --- Split the tree containing the non-root vertex $v$ to two trees by
removing the edge $(\<Parent>(v),v)$. Returns the weight of this edge.
\:$\<Evert>(v)$ --- Modify the orientations of the edges in the tree containing~$v$
to make~$v$ the tree's root.
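The semantics of this interface can be prototyped without any of the splay-tree
machinery. The following stand-in (our own naive code, $\O(n)$ per operation
instead of $\O(\log n)$) only serves to make the contracts of the operations
concrete; it is not the Sleator--Tarjan structure:

```python
# A deliberately naive Link-Cut forest with the same interface:
# parent pointers + per-vertex weight of the edge to the parent.

class NaiveLinkCutForest:
    def __init__(self, n):
        self.parent = {v: None for v in range(1, n + 1)}
        self.weight = {}          # weight[v] = weight of edge (Parent(v), v)

    def Parent(self, v):
        return self.parent[v]

    def Root(self, v):
        while self.parent[v] is not None:
            v = self.parent[v]
        return v

    def Weight(self, v):
        return self.weight[v]

    def PathMax(self, v):
        # Vertex w closest to the root whose edge (Parent(w), w) is
        # heaviest on the root-to-v path; None if v is the root.
        best = None
        while self.parent[v] is not None:
            if best is None or self.weight[v] >= self.weight[best]:
                best = v
            v = self.parent[v]
        return best

    def Evert(self, v):
        # Reverse parent pointers along the path from v to its root.
        prev, prev_w = None, None
        while v is not None:
            nxt, nxt_w = self.parent[v], self.weight.get(v)
            self.parent[v] = prev
            if prev is None:
                self.weight.pop(v, None)
            else:
                self.weight[v] = prev_w
            prev, prev_w, v = v, nxt_w, nxt

    def Link(self, u, v, x):
        # Assumes u is a tree root and v lies in a different tree.
        assert self.parent[u] is None and self.Root(v) != u
        self.parent[u] = v
        self.weight[u] = x

    def Cut(self, v):
        # Detach the non-root vertex v; return the removed edge's weight.
        w = self.weight.pop(v)
        self.parent[v] = None
        return w
```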
%% \>Additionally, all edges on the path from~$v$ to $\<Root>(v)$ can be enumerated in
%% time $\O(\ell + \log n)$, where $\ell$~is the length of that path. This operation
%% (and also the path itself) is called $\<Path>(v)$.

%% \>If the weights are real numbers (or in general an~arbitrary group), the $\O(\log n)$
%% operations also include:

%% \:$\<PathWeight>(v)$ --- Return the sum of the weights on $\<Path>(v)$.
%% \:$\<PathUpdate>(v,x)$ --- Add~$x$ to the weights of all edges on $\<Path>(v)$.
\proof
See \cite{sleator:trees}.
\qed
Once we have this structure, we can turn our ideas on updating of the MSF to
an~incremental algorithm:
\algn{\<Insert> in a~semidynamic MSF}

\algin A~graph~$G$ with its MSF $F$ represented as a~Link-Cut forest, an~edge~$uv$
with weight~$w$ to be inserted.
\:$\<Evert>(u)$. \cmt{$u$~is now the root of its tree.}
\:If $\<Root>(v) \ne u$: \cmt{$u$~and~$v$ lie in different trees.}
\::$\<Link>(u,v,w)$. \cmt{Connect the trees.}
\::Return ``$uv$ added''.
\:Otherwise: \cmt{both are in the same tree}
\::$y\=\<PathMax>(v)$.
\::$x\=\<Parent>(y)$. \cmt{Edge~$xy$ is the heaviest on $F[uv]$.}
\::If $\<Weight>(y) > w$: \cmt{We have to exchange~$xy$ with~$uv$.}
\:::$\<Cut>(y)$, $\<Evert>(v)$, $\<Link>(v,u,w)$.
\:::Return ``$uv$~added, $xy$~removed''.
\::Otherwise return ``no changes''.
\algout The list of changes in~$F$.
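The algorithm can be transcribed almost line by line. The sketch below (our own
illustration) substitutes a~naive parent-pointer forest for the Link-Cut trees, so
it runs in $\O(n)$ per insertion rather than $\O(\log n)$:

```python
# Semidynamic MSF Insert over a naive parent-pointer forest.
# `parent` maps vertex -> parent (None for roots);
# `weight` maps vertex v -> weight of the edge (parent[v], v).

def msf_insert(parent, weight, u, v, w):
    """Insert edge uv of weight w; mutate the forest; report the change."""
    def root(x):
        while parent[x] is not None:
            x = parent[x]
        return x

    def evert(x):
        # Reverse all parent pointers on the path from x to its root.
        prev, prev_w = None, None
        while x is not None:
            nxt, nxt_w = parent[x], weight.get(x)
            parent[x] = prev
            if prev is None:
                weight.pop(x, None)
            else:
                weight[x] = prev_w
            prev, prev_w, x = x, nxt_w, nxt

    evert(u)                               # u is now the root of its tree
    if root(v) != u:                       # different trees: Link(u, v, w)
        parent[u] = v
        weight[u] = w
        return f"{u}{v} added"
    y, best = v, None                      # same tree: compute PathMax(v)
    while parent[y] is not None:
        if best is None or weight[y] >= weight[best]:
            best = y
        y = parent[y]
    x = parent[best]                       # edge (x, best) heaviest on F[uv]
    if weight[best] > w:                   # exchange it for uv
        weight.pop(best)                   # Cut(best)
        parent[best] = None
        evert(v)                           # v becomes the root of its tree
        parent[v] = u                      # Link(v, u, w)
        weight[v] = w
        return f"{u}{v} added, {x}{best} removed"
    return "no changes"
```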
\thmn{Incremental MSF}
When only edge insertions are allowed, the dynamic MSF can be maintained in time $\O(\log n)$
amortized per operation.

\proof
Every \<Insert> performs $\O(1)$ operations on the Link-Cut forest, which take
$\O(\log n)$ each by Theorem \ref{sletar}.
\qed
We can easily extend the semidynamic MSF algorithm to allow an~operation commonly called
\<Backtrack> --- removal of the most recently inserted edge. It is sufficient to keep the
history of all MSF changes in a~stack and reverse the most recent change upon backtrack.
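A possible shape of this history stack (our own minimal sketch; the changes are
recorded abstractly as add/remove pairs, exactly as reported by \<Insert>):

```python
# Backtrack support: each Insert pushes its list of MSF changes onto a
# stack; backtracking pops the list and replays it in reverse, inverted.

class MSFHistory:
    def __init__(self):
        self.msf = set()
        self.history = []

    def record_insert(self, changes):
        # `changes` is a list of ("add", e) / ("remove", e) pairs.
        for op, e in changes:
            (self.msf.add if op == "add" else self.msf.discard)(e)
        self.history.append(changes)

    def backtrack(self):
        # Undo the most recent insertion by inverting its changes.
        for op, e in reversed(self.history.pop()):
            (self.msf.discard if op == "add" else self.msf.add)(e)
```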
What are the obstacles to making the structure fully dynamic?
Deletion of edges that do not belong to the MSF is trivial (we do not
need to change anything) and so is deletion of bridges (we just remove the bridge
from the Link-Cut tree, knowing that there is no edge to replace it). The hard part
is the search for replacement edges after an~edge of the MSF is deleted.

This very problem also has to be solved by algorithms for fully dynamic connectivity,
so we will take a~look at them first.
%--------------------------------------------------------------------------------

\section{Eulerian Tour trees}

An~important stop on the road to fully dynamic algorithms has the name \df{Eulerian Tour trees} or
simply \df{ET-trees}. It is a~representation of forests introduced by Henzinger and King
\cite{henzinger:randdyn} in their randomized dynamic algorithms. It is similar to the one by Sleator
and Tarjan, but it is much simpler and instead of path operations it offers efficient operations on
subtrees. It is also possible to attach auxiliary data to vertices and edges of the original tree.
The \df{Eulerian Tour sequence} $\Eul(T)$ of a~rooted tree~$T$ is the sequence of vertices of~$T$ as visited
by the depth-first traversal of~$T$. More precisely, it is generated by the following algorithm $\<ET>(v)$
when it is invoked on the root of the tree:

\:Record~$v$ in the sequence.
\:For each son~$w$ of~$v$:
\::Call $\<ET>(w)$ recursively.
\::Record~$v$ in the sequence again.

\>One of the occurrences of each vertex is defined as its \df{active occurrence} and it will
be used to store auxiliary data associated with the vertex.
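The algorithm $\<ET>$ above can be transcribed directly (a~hypothetical
dict-of-sons tree representation; names are ours):

```python
# Generate the Eulerian Tour sequence of a rooted tree.
# `tree` maps each vertex to the list of its sons.

def et_sequence(tree, v):
    seq = [v]                       # record v on entry
    for w in tree.get(v, []):
        seq.extend(et_sequence(tree, w))
        seq.append(v)               # record v again after each subtree
    return seq
```

On the tree with root 0, sons 1 and 2, and a~son 3 under 2, this yields the
sequence $0,1,0,2,3,2,0$ of length $2n-1$.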
The ET-sequence contains a~vertex of degree~$d$ exactly $d$~times except for the root, which
occurs $d+1$ times. The whole sequence therefore contains $2n-1$ elements. It indeed describes the
order of vertices on an~Eulerian tour in the tree with all edges doubled. Let us observe what happens
to the ET-sequence when we modify the tree.
When we \em{delete} an~edge $uv$ from the tree~$T$ (let $u$~be the parent of~$v$), the sequence
$\Eul(T) = AuvBvuC$ (with no~$u$ nor~$v$ in~$B$) splits to two ET-sequences $AuC$ and $vBv$.
If there was only a~single occurrence of~$v$, it corresponded to a~leaf and thus the second
sequence consists of $v$~alone.

\em{Changing the root} of the tree~$T$ from~$v$ to~$w$ changes $\Eul(T)$ from $vAwBwCv$ to $wBwCvAw$.
If $w$~was a~leaf, the sequence changes from $vAwCv$ to $wCvAw$. If $vw$ was the only edge of~$T$,
the sequence $vwv$ becomes $wvw$. Note that this works regardless of the presence of~$w$ inside~$B$.

\em{Joining} the roots of two trees by a~new edge makes their ET-sequences $vAv$ and~$wBw$
combine to $vAvwBwv$. Again, we have to handle the cases when $v$ or~$w$ has degree~1 separately:
$v$~and~$wBw$ combine to $vwBwv$, and $v$~with~$w$ makes $vwv$.
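These three rewrites can be checked on plain Python lists. A~real ET-tree performs
the same splits and joins on $(a,b)$-trees in logarithmic time; the code below (our
own sketch) only illustrates the sequence surgery itself:

```python
# ET-sequence surgery on plain lists, mirroring the three cases above.

def et_delete(seq, u, v):
    # Eul(T) = AuvBvuC  ->  (AuC, vBv); u is the parent of v.
    i = next(k for k in range(len(seq) - 1)
             if seq[k] == u and seq[k + 1] == v)
    j = next(k for k in range(i + 1, len(seq) - 1)
             if seq[k] == v and seq[k + 1] == u)
    return seq[:i + 1] + seq[j + 2:], seq[i + 1:j + 1]

def et_reroot(seq, w):
    # vAwBwCv -> wBwCvAw; split at the first occurrence of w.
    i = seq.index(w)
    if i == 0:
        return list(seq)
    return seq[i:] + seq[1:i] + [w]

def et_join(seq_v, seq_w):
    # vAv and wBw combine to vAvwBwv (new edge between the roots).
    return seq_v + seq_w + [seq_v[0]]
```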
\float{\valign{\vfil#\vfil\cr
\hbox{\epsfbox{pic/ettree.eps}}\cr
\noalign{\qquad\quad}
$\Eul(T_1) = 0121034546474308980$,~~$\Eul(T_2) = aba$. \cr
$\Eul(T_1-34) = 01210308980, 4546474$. \cr
$\Eul(T_1\hbox{~rooted at~3}) = 3454647430898012103$. \cr
$\Eul(T_1+0a+T_2)$: $0121034546474308980aba0$. \cr
}}{Trees and their ET-sequences}
If any of the occurrences that we have removed from the sequence was active, there is always
a~new occurrence of the same vertex that can stand in its place and inherit the auxiliary data.

The ET-trees will represent the ET-sequences by $(a,b)$-trees with the parameter~$a$ set upon
initialization of the structure and with $b=2a$. We know from the standard theorems of $(a,b)$-trees
(see for example \cite{clrs}) that the depth of a~tree with $n$~leaves is always $\O(\log_a n)$
and that all basic operations including insertion, deletion, search, splitting and joining the trees
run in time $\O(a\log_a n)$ in the worst case.
We will use the ET-trees to maintain a~spanning forest of the current graph. The auxiliary data of
each vertex will hold a~list of edges incident with the given vertex, which do not lie in the
forest. Such edges are usually called the \df{non-tree edges.}
\df{Eulerian Tour trees} are a~data structure that represents a~forest of trees and a~set of non-tree
edges associated with the vertices of the forest. To avoid confusion, we will distinguish between
\df{original} vertices and edges (of the original trees) and the vertices and edges of the
data structure. The structure consists of:

\:A~collection of $(a,b)$-trees with some fixed parameters $a$ and~$b$.
Each such tree corresponds to one of the original trees~$T$. Its
leaves (in the usual tree order) correspond to the elements
of the ET-sequence $\Eul(T)$. Each two consecutive leaves $u$ and~$v$ are separated
by a~unique key stored in an~internal vertex of the $(a,b)$-tree. This key is used to represent
the original edge~$uv$. Each original edge is therefore kept in both its orientations.
\:A~mapping $\<act>(v)$ that maps each original vertex to the leaf containing its active occurrence.
\:A~mapping $\<edge>(e)$ that maps an~original edge~$e$ to one of the internal keys representing it.
\:A~mapping $\<twin>(k)$ that maps an~internal key~$k$ to the other internal key of the same
edge.
\:A~list of non-tree edges placed in each leaf. The lists are allowed to be non-empty only
in the leaves that represent active occurrences of original vertices.
\:Boolean \df{markers} in the internal vertices that signal presence of a~non-tree
edge anywhere in the subtree rooted at that internal vertex.
\:Counters $\<leaves>(v)$ that contain the number of leaves in the subtree rooted at~$v$.
\>The structure supports the following operations on the original trees:

\:\<Create> --- Create a~single-vertex tree.
\:$\<Link>(u,v)$ --- Join two different trees by an~edge~$uv$ and return a~unique identifier
of this edge.
\:$\<Cut>(e)$ --- Split a~tree by removing the edge~$e$ given by its identifier.
\:$\<Root>(v)$ --- Return the root of the ET-tree containing the vertex~$v$. This can be used
to test whether two vertices lie in the same tree. However, the root is not guaranteed
to stay the same when the tree is modified by a~\<Link> or \<Cut>.
\:$\<Size>(v)$ --- Return the number of vertices in the tree containing the vertex~$v$.
\:$\<InsertNontree>(v,e)$ --- Add a~non-tree edge~$e$ to the list at~$v$ and return a~unique
identifier of this edge.
\:$\<DeleteNontree>(e)$ --- Delete a~non-tree edge~$e$ given by its identifier.
\:$\<ScanNontree>(v)$ --- Return non-tree edges stored in the same tree as~$v$.
We will implement the operations on the ET-trees by translating the intended changes of the
ET-sequences to operations on the $(a,b)$-trees. The role of identifiers of the original vertices
and edges will be of course played by pointers to the respective leaves and internal keys of
the $(a,b)$-trees.

\<Cut> of an~edge splits the $(a,b)$-tree at both internal keys representing the given edge
and joins them back in the different order.

\<Link> of two trees can be accomplished by making both vertices the roots of their trees first
and joining the roots by an~edge afterwards. Re-rooting again involves splits and joins of $(a,b)$-trees.
As we can split at any occurrence of the new root vertex, we will use the active occurrence
which we remember. Joining of the roots translates to joining of the $(a,b)$-trees.

\<Root> just follows parent pointers in the $(a,b)$-tree, walking the path from the leaf
to the root.
\<Size> finds the root and returns its counter.

\<InsertNontree> finds the leaf $\<act>(v)$ containing the list of $v$'s non-tree edges
and inserts the new edge there. The returned identifier will consist of the pointer to
the edge and the vertex in whose list it is stored. Then we have to recalculate the markers
on the path from $\<act>(v)$ to the root. \<DeleteNontree> is analogous.

Whenever any other operation changes a~vertex of the tree, it will also update its marker and
counter and, if necessary, the markers and counters on the path to the root.

\<ScanNontree> traverses the tree recursively from the root, but it does not enter the
subtrees whose roots are not marked.
Analysis of the time complexity is now straightforward:

\thmn{Eulerian Tour trees, Henzinger and Rauch \cite{henzinger:randdyn}}\id{etthm}%
The ET-trees perform the operations \<Link> and \<Cut> in time $\O(a\log_a n)$, \<Create>
in $\O(1)$, \<Root>, \<InsertNontree>, and \<DeleteNontree> in $\O(\log_a n)$, and
\<ScanNontree> in $\O(a\log_a n)$ per edge reported. Here $n$~is the number of vertices
in the original forest and $a\ge 2$ is an~arbitrary constant.

\proof
We set $b=2a$. Our implementation performs $\O(1)$ operations on the $(a,b)$-trees
per operation on the ET-tree, plus $\O(1)$ other operations. We apply the standard theorems
on the complexity of $(a,b)$-trees \cite{clrs}.
\qed
\examplen{Connectivity acceleration}\id{accel}%
In most cases, the ET-trees are used with $a$~constant, but sometimes choosing~$a$ as a~function
of~$n$ can also have its beauty. Suppose that there is a~data structure which maintains an~arbitrary
spanning forest of a~dynamic graph. Suppose also that the structure works in time $\O(\log^k n)$
per operation and that it reports $\O(1)$ changes in the spanning forest for every change
in the graph. If we keep the spanning forest in ET-trees with $a=\log n$, the updates of the
data structure cost an~extra $\O(\log^2 n / \log\log n)$, but queries accelerate to $\O(\log n/\log\log n)$.
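The arithmetic behind this choice of~$a$ is worth spelling out; it uses nothing
beyond the bounds of Theorem \ref{etthm} and the identity $\log_a n = \log n/\log a$:

```latex
% With $a = \log n$, each of the $\O(1)$ forest updates per graph change costs
$$\O(a\log_a n) = \O\biggl(\log n\cdot{\log n\over\log\log n}\biggr)
                = \O\biggl({\log^2 n\over\log\log n}\biggr),$$
% while a connectivity query performs a single \<Root> and costs only
$$\O(\log_a n) = \O\biggl({\log n\over\log a}\biggr)
               = \O\biggl({\log n\over\log\log n}\biggr).$$
```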
%--------------------------------------------------------------------------------

\section{Dynamic connectivity}

The fully dynamic connectivity problem has a~long and rich history. In the 1980's, Frederickson \cite{frederickson:dynamic}
used his topological trees to construct a~dynamic connectivity algorithm of complexity $\O(\sqrt m)$ per update
and $\O(1)$ per query. Eppstein et al.~\cite{eppstein:sparsify} have introduced a~sparsification technique which can bring the
updates down to $\O(\sqrt n)$. Later, several different algorithms with complexity on the order of $n^\varepsilon$
were presented by Henzinger and King \cite{henzinger:mst} and also by Mare\v{s} \cite{mares:dga}.
A~polylogarithmic time bound was first reached by the randomized algorithm of Henzinger and King \cite{henzinger:randdyn}.
The best result known as of now is the $\O(\log^2 n)$ time deterministic algorithm by Holm,
de~Lichtenberg and Thorup \cite{holm:polylog}, which we will describe in this section.
The algorithm will maintain a~spanning forest~$F$ of the current graph~$G$, represented by an~ET-tree
which will be used to answer connectivity queries. The edges of~$G\setminus F$ will be stored as~non-tree
edges in the ET-tree. Hence, an~insertion of an~edge to~$G$ either adds it to~$F$ or inserts it as non-tree.
Deletions of non-tree edges are also easy, but when a~tree edge is deleted, we have to search for its
replacement among the non-tree edges.
To govern the search in an~efficient way, we will associate each edge~$e$ with a~level $\ell(e) \le
L = \lfloor\log_2 n\rfloor$. For each level~$i$, we will use~$F_i$ to denote the subforest
of~$F$ containing edges of level at least~$i$. Therefore $F=F_0 \supseteq F_1 \supseteq \ldots \supseteq F_L$.
We will maintain the following \em{invariants:}

\def\iinv{{\bo I\the\itemcount~}}

\:$F$~is the maximum spanning forest of~$G$ with respect to the levels. In other words,
if $uv$ is a~non-tree edge, then $u$ and~$v$ are connected in~$F_{\ell(uv)}$.
\:For each~$i$, the components of~$F_i$ have at most $\lfloor n/2^i \rfloor$ vertices.
This implies that all~$F_i$ for $i>L$ are empty.
At the beginning, the graph contains no edges, so both invariants are trivially
satisfied. Newly inserted edges can enter level~0, which cannot break I1 nor~I2.
When we delete a~tree edge at level~$\ell$, we split a~tree~$T$ of~$F_\ell$ to two
trees $T_1$ and~$T_2$. Without loss of generality, let us assume that $T_1$ is the
smaller one. We will try to find the replacement edge of the highest possible
level that connects them back. From I1, we know that such an~edge cannot belong to
a~level greater than~$\ell$, so we start looking for it at level~$\ell$. According
to~I2, the tree~$T$ had at most $\lfloor n/2^\ell\rfloor$ vertices, so $T_1$ has
at most $\lfloor n/2^{\ell+1} \rfloor$ of them. Thus we can increase the levels
of all edges of~$T_1$ without violating either invariant.
We now start enumerating the non-tree edges incident with~$T_1$. For each such edge,
we test whether its other endpoint lies in~$T_2$. If it does, we have found the replacement
edge and we insert it to~$F_\ell$. Otherwise we move the edge one level up. (This will
be the gist of our amortization argument: We can charge most of the work on level
increases and we know that the level of each edge can reach at most~$L$.)

If the non-tree edges at level~$\ell$ are exhausted, we try the same in the next
lower level and so on. If there is no replacement edge on level~0, the tree~$T$
remains disconnected.
We will use a~separate ET-tree with~$a$ set to~2 for each level. For the $i$-th level, the ET-tree
${\cal E}_i$ will represent the forest~$F_i$ and the non-tree edges of
level~$i$ will be attached to its vertices.
\algn{Insertion of an~edge}

\algin An~edge $uv$ to insert.

\:Ask the ET-tree ${\cal E}_0$ if $u$ and~$v$ are in the same component. If they are:
\::Add $uv$ to the list of non-tree edges in ${\cal E}_0$ at both $u$ and~$v$.
\:Otherwise:
\::Add $uv$ to~$F_0$.
\algn{Deletion of an~edge}

\algin An~edge $uv$ to delete.
\:$\ell \= \ell(uv)$.
\:If $uv$ is a~non-tree edge:
\::Remove $uv$ from the lists of non-tree edges at both $u$ and~$v$ in~${\cal E}_{\ell}$.
\:Otherwise:
\::Remove $uv$ from~$F_\ell$.
\::Call $\<Replace>(uv,\ell)$.
\algn{$\<Replace>(uv,i)$ --- Search for a~replacement edge for~$uv$ at level~$i$}

\:Let $T_1$ and~$T_2$ be the trees in~$F_i$ containing $u$ and~$v$ respectively.
\:If $n(T_1) > n(T_2)$, swap $T_1$ with~$T_2$.
\:Increase levels of all edges in~$T_1$ to~$i+1$ and insert them to~${\cal E}_{i+1}$.
\:Enumerate non-tree edges incident with vertices of~$T_1$ and stored in ${\cal E}_i$.
For each edge~$xy$, $x\in T_1$, do:
\::If $y\in T_2$, add the edge $xy$ to~$F_i$ and return.
\::Otherwise increase $\ell(xy)$ by one. This includes deleting~$xy$ from~${\cal E}_i$
and inserting it to~${\cal E}_{i+1}$.
\:If $i>0$, call $\<Replace>(uv,i-1)$.
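To see the control flow of the three algorithms together, here is a~deliberately
naive transcription (our own sketch): every ${\cal E}_i$ is replaced by explicit
edge sets and depth-first searches, so each operation costs linear time instead
of polylogarithmic, but the level scheme and the \<Replace> recursion are as above:

```python
# Naive transcription of the Holm--de Lichtenberg--Thorup control flow.
# Edges are stored as (min(u,v), max(u,v)); F_i = tree edges of level >= i.

class NaiveHDT:
    def __init__(self, n):
        self.n = n
        self.level = {}        # edge -> its level
        self.tree = set()      # edges of the spanning forest F
        self.nontree = set()   # all remaining edges

    def _component(self, i, start):
        # Vertices reachable from `start` in F_i.
        adj = {v: [] for v in range(self.n)}
        for (u, v) in self.tree:
            if self.level[(u, v)] >= i:
                adj[u].append(v)
                adj[v].append(u)
        seen, stack = {start}, [start]
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return seen

    def connected(self, u, v):
        return v in self._component(0, u)

    def insert(self, u, v):
        e = (min(u, v), max(u, v))
        self.level[e] = 0                  # new edges enter level 0
        if self.connected(u, v):
            self.nontree.add(e)
        else:
            self.tree.add(e)

    def delete(self, u, v):
        e = (min(u, v), max(u, v))
        i = self.level.pop(e)
        if e in self.nontree:
            self.nontree.discard(e)        # easy case: F unchanged
        else:
            self.tree.discard(e)
            self._replace(u, v, i)

    def _replace(self, u, v, i):
        t1, t2 = self._component(i, u), self._component(i, v)
        if len(t1) > len(t2):
            t1, t2 = t2, t1
        for e in self.tree:                # push T_1's tree edges up
            if self.level[e] == i and e[0] in t1 and e[1] in t1:
                self.level[e] = i + 1
        for e in sorted(self.nontree):     # scan non-tree edges at T_1
            if self.level[e] != i or (e[0] not in t1 and e[1] not in t1):
                continue
            x, y = e if e[0] in t1 else (e[1], e[0])
            if y in t2:                    # replacement found
                self.nontree.discard(e)
                self.tree.add(e)
                return
            self.level[e] = i + 1          # both ends in T_1: level up
        if i > 0:
            self._replace(u, v, i - 1)
```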
\>Analysis of the time complexity is straightforward:

\thmn{Fully dynamic connectivity, Holm et al.~\cite{holm:polylog}}
Dynamic connectivity can be maintained in time $\O(\log^2 n)$ amortized for both
\<Insert> and \<Delete> and in time $\O(\log n/\log\log n)$ per \<Connected> query.

\proof
The direct cost of an~\<Insert> is $\O(\log n)$ for the operations on the ET-trees
(by Theorem \ref{etthm}). We will have it pre-pay all level increases of the new
edge. Since the levels never decrease, each edge can be brought a~level up at most
$L=\lfloor\log n\rfloor$ times. Every increase costs $\O(\log n)$ on the ET-tree
operations, so we pay $\O(\log^2 n)$ for all of them.

A~\<Delete> costs $\O(\log^2 n)$ directly, as we might have to update all~$L$
ET-trees. Additionally, we call \<Replace> up to $L$ times. The initialization of
\<Replace> costs $\O(\log n)$ per call, the rest is paid for by the edge level
increases.
To bring the complexity of \<Connected> from $\O(\log n)$ down to $\O(\log n/\log\log n)$,
we apply the trick from Example \ref{accel} and store~$F_0$ in an~ET-tree with $a=\max(\lfloor\log n\rfloor,2)$.
This does not hurt the complexity of insertions and deletions, but allows for faster queries.
\qed