+\paran{Complexity of MST}%
+As we have already noted, the exact decision tree complexity $D(m,n)$ of the MST problem
+is still open, and therefore so is the time complexity of the optimal algorithm. However,
+every time we come up with another comparison-based algorithm, we can use its complexity
+(or more precisely the number of comparisons it performs, which can be even lower)
+as an~upper bound on the complexity of the optimal algorithm.
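+
+This observation can be written as a~simple inequality. (The notation $C_A(m,n)$ is ours,
+introduced just for this remark: the worst-case number of comparisons performed by an~algorithm
+$A$ on graphs with $n$ vertices and $m$ edges.) For every correct comparison-based MST
+algorithm~$A$:
+$$
+D(m,n) \le C_A(m,n),
+$$
+so every new algorithm immediately yields a~new bound on $D(m,n)$.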
+
+The best explicit comparison-based algorithm known to date achieves complexity $\O(m\alpha(m,n))$.\foot{$\alpha(m,n)$
+is a~certain inverse of Ackermann's function and $\lambda_k(n)$ is the row inverse of the same
+function. See \ref{ackerinv} for the exact definitions.}
+It was discovered by Chazelle \cite{chazelle:ackermann} as an~improvement of his previous
+$\O(m\alpha(m,n)\cdot\log\alpha(m,n))$ algorithm \cite{chazelle:almostacker}.
+It is also based on the ideas of this chapter --- in particular on the soft heaps and robust
+contractions. The algorithm is very complex and involves a~lot of elaborate
+technical details, so we only refer to the original paper here. Another algorithm of the same
+complexity, using similar ideas, was discovered independently by Pettie \cite{pettie:ackermann}, who,
+having the optimal algorithm at hand, does not bother with the low-level details and only
+bounds the number of comparisons. Using either of these results, we can prove an~Ackermannian
+upper bound on the optimal algorithm:
+
+\thmn{Upper bound on complexity of the Optimal algorithm}\id{optthm}%
+The time complexity of the Optimal MST algorithm is $\O(m\alpha(m,n))$.
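+
+To get a~feeling for how slowly $\alpha(m,n)$ grows, it helps to compute a~few values of
+an~inverse of the Ackermann function. The following sketch uses the classical two-parameter
+Ackermann--P\'eter function $A(m,n)$ and the diagonal inverse $\alpha(n)=\min\{m : A(m,m)\ge n\}$;
+this variant may differ from the one defined in \ref{ackerinv} by small additive shifts, but the
+growth behaviour is the same. The trick is to decide whether $A(4,4)$, a~tower of exponentials,
+exceeds $n$ without ever materializing it:
+
+```python
+def alpha(n):
+    """Smallest m with A(m, m) >= n, where A is the classical two-parameter
+    Ackermann-Peter function.  Illustration only -- the thesis uses its own
+    variant of A, which differs by small additive shifts."""
+    # Diagonal values: A(0,0) = 1, A(1,1) = 3, A(2,2) = 7, A(3,3) = 61.
+    for m, diag in enumerate([1, 3, 7, 61]):
+        if diag >= n:
+            return m
+    # A(4,4) = 2^2^2^2^2^2^2 - 3 (a tower of seven 2s, minus 3).  We test
+    # whether the tower reaches n + 3 without building the full number.
+    t, target = 1, n + 3
+    for _ in range(7):
+        if t > target.bit_length():
+            return 4          # 2**t alone already exceeds the target
+        t = 2 ** t
+    if t >= target:
+        return 4
+    raise ValueError("alpha(n) >= 5: n exceeds A(4,4)")
+```
+
+Even for $n = 10^{100}$ the result is only~4, which is why the bound $\O(m\alpha(m,n))$ is
+usually described as almost linear.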