From 52cc5a65f1b809f53e6a73ec39b57df19f6d3b69 Mon Sep 17 00:00:00 2001
From: Martin Mares
Date: Sat, 29 Mar 2008 20:11:23 +0100
Subject: [PATCH] SH: Optimality.

---
 opt.tex | 18 +++++++++++++-----
 1 file changed, 13 insertions(+), 5 deletions(-)

diff --git a/opt.tex b/opt.tex
index 5b2d0d4..7ffcac3 100644
--- a/opt.tex
+++ b/opt.tex
@@ -493,13 +493,21 @@ We set the parameter~$r$ to~$2+2\lceil\log (1/\varepsilon)\rceil$. The rest foll
 from the analysis above. By Lemma \ref{shcorrlemma}, there are always at most
 $n/2^{r-2} \le \varepsilon n$ corrupted items in the heap. By Lemma \ref{shmeld}--\ref{shyards},
 the time spent on all operations in the sequence can be paid for by charging $\O(r)$ time
-against each \<Insert>, which yields the time bound.
+against each \<Insert>. This yields the time bound.
 \qed
 
-\FIXME{Christen lemmata.}
-
-\FIXME{Remark on optimality.}
-
+\rem
+When we set $\varepsilon = 1/(2n)$, the soft heap is not allowed to corrupt any
+items, so it can be used like any ordinary heap. By the standard lower bound on
+sorting, it therefore requires $\Omega(\log n)$ time per operation, so the
+time complexity is optimal for this choice of~$\varepsilon$. Chazelle \cite{chazelle:softheap}
+proves that it is optimal for every choice of~$\varepsilon$.
+
+The space consumed by the heap need not be linear in the \em{current} number
+of items, but should a~case where this matters ever occur, we can fix it easily
+by rebuilding the whole data structure after $n/2$ deletes. This increases the
+number of potentially corrupted items by at most a factor of two, so it
+suffices to halve~$\varepsilon$.
-- 
2.39.2
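
The last sentence of the new remark leaves the arithmetic implicit. Below is a minimal LaTeX sketch of it, under assumptions not spelled out in the patch: $n$ counts the items inserted since the last rebuild (a rebuild reinserts the surviving items), $n'$ is the current number of items, the rebuild is triggered once $n/2$ deletes have happened, and the heap is run with the halved parameter $\varepsilon' = \varepsilon/2$.

% Sketch of the calculation behind the remark (assumptions as stated above):
% $n$ = items inserted since the last rebuild, $n'$ = current items,
% parameter $\varepsilon' = \varepsilon/2$.
Since we rebuild after at most $n/2$ deletes, we always have
$n' \ge n - n/2 = n/2$, so the space $\O(n)$ is also $\O(n')$.
The corruption bound quoted in the hunk ($\le \varepsilon n$ corrupted items)
then gives at most
$$\varepsilon' n = \frac{\varepsilon}{2}\,n \le \frac{\varepsilon}{2}\cdot 2n'
  = \varepsilon n'$$
corrupted items, i.e., an~$\varepsilon$-fraction of the current items,
which is the guarantee the remark claims to preserve.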