Equivalent data structures with same bounds in worst case (vs. amortized)


I could not make my title very descriptive, apologies!

Is it the case that for every data structure supporting some operations with certain amortized running times, there exists another data structure supporting the same operations with the same running times in the worst case? I am interested both in iterative, ephemeral data structures and in functional ones.

I am certain that this question must have been asked before. I cannot seem to find the correct search keywords (in Google, SO, TCS). I am looking for a cited answer in {yes, no, open}.


1 Answer

Answered by swen (best answer):

No, at least in models where element distinctness of n elements requires time Ω(n log n).

Consider the following data structure, which I describe using Python.

class SpecialList:
    def __init__(self):
        self.lst = []

    def append(self, x):
        self.lst.append(x)  # O(1) list append

    def rotationalsimilarity(self, k):
        # Count the positions where the list agrees with its rotation by k,
        # then clear the list. This is Θ(n) work, but amortized O(1), since
        # each of the n appended elements is consumed by the query.
        rotatedlst = self.lst[k:] + self.lst[:k]
        count = sum(1 if x == y else 0 for (x, y) in zip(self.lst, rotatedlst))
        self.lst = []
        return count
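
For instance, rotationalsimilarity(k) counts the positions at which the list agrees with its own rotation by k, and it clears the list as a side effect:

s = SpecialList()
for x in [1, 2, 1, 2]:
    s.append(x)
print(s.rotationalsimilarity(2))  # 4: the list equals its own rotation by 2
print(s.lst)                      # []: the query emptied the structure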

Clearly append is O(1), and rotationalsimilarity is amortized O(1): it takes Θ(n) time, but since it clears the list, its cost can be charged to the n preceding appends. Now suppose rotationalsimilarity were worst-case O(1). A worst-case O(1) operation modifies only O(1) memory cells, so by logging their old contents we could also provide an O(1) undo operation that restores the data structure to its previous state. It follows that we could decide element distinctness in time O(n): the elements are distinct exactly when the list disagrees with every nontrivial rotation of itself at every position.
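
Concretely, the undo step is easy to sketch in this Python model: before the query clears the list, keep a reference to it, a single O(1) pointer save. This is only an illustrative sketch (the name UndoableSpecialList is mine), not the hypothetical worst-case structure itself; rotationalsimilarity here still takes Θ(n), and only the undo is O(1).

class UndoableSpecialList(SpecialList):
    def rotationalsimilarity(self, k):
        self._saved = self.lst  # O(1): keep a reference before the query clears the list
        return super().rotationalsimilarity(k)

    def undo(self):
        self.lst = self._saved  # O(1): restore the state before the last query

Using this undo, the reduction looks as follows: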

def distinct(lst):
    slst = UndoableSpecialList()  # the sketch above supplies the O(1) undo
    for x in lst:
        slst.append(x)
    for k in range(1, len(lst)):  # 1 <= k < len(lst)
        # Any duplicate pair lst[i] == lst[j] is detected at rotation k = j - i.
        if slst.rotationalsimilarity(k) > 0:
            return False
        slst.undo()  # the query cleared the list; restore it for the next k
    return True
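
A quick sanity check of the reduction (using the UndoableSpecialList sketch above):

print(distinct([1, 2, 3]))  # True: every nontrivial rotation disagrees everywhere
print(distinct([1, 2, 1]))  # False: a rotation aligns the two 1s

If each query and undo were worst-case O(1), this loop would run in O(n) total time, contradicting the Ω(n log n) lower bound for element distinctness.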