I could not make my title very descriptive, apologies!
Is it the case that for every data structure supporting some operations with certain amortized running times, there exists another data structure supporting the same operations with the same running times in the worst case? I am interested both in iterative, ephemeral data structures and in functional ones.
I am certain that this question must have been asked before, but I cannot seem to find the right search keywords (on Google, SO, or TCS). I am looking for a cited answer in {yes, no, open}.
No, at least in models where element distinctness of n elements requires time Ω(n log n).
Consider the following data structure, which I describe using Python.
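Here is a minimal sketch of such a structure, assuming that `rotationalsimilarity(other)` reports whether `other` is a cyclic rotation of the appended sequence and then empties the structure; the class name `Rotatable` and the linear-time rotation test (KMP on the doubled list) are illustrative choices, not the only possibility:

```python
class Rotatable:
    def __init__(self):
        self.items = []

    def append(self, x):
        # O(1) per call.
        self.items.append(x)

    def rotationalsimilarity(self, other):
        # Costs O(len(self.items)) via the linear-time rotation test below,
        # then clears the list, so the cost can be charged to the appends
        # made since the last call: O(1) amortized per operation.
        answer = _is_rotation(self.items, other)
        self.items = []  # the clearing step that makes the analysis amortized
        return answer


def _is_rotation(a, b):
    """Return True iff list b is a cyclic rotation of list a (linear time)."""
    if len(a) != len(b):
        return False
    if not a:
        return True
    # b is a rotation of a exactly when b occurs contiguously inside a + a.
    return _contains(a + a, b)


def _contains(haystack, needle):
    """Knuth-Morris-Pratt search for a contiguous occurrence of needle in haystack."""
    # Build the KMP failure table for needle.
    fail = [0] * len(needle)
    k = 0
    for i in range(1, len(needle)):
        while k > 0 and needle[i] != needle[k]:
            k = fail[k - 1]
        if needle[i] == needle[k]:
            k += 1
        fail[i] = k
    # Scan the haystack.
    k = 0
    for x in haystack:
        while k > 0 and x != needle[k]:
            k = fail[k - 1]
        if x == needle[k]:
            k += 1
            if k == len(needle):
                return True
    return False
```

Under this sketch, n appends followed by one `rotationalsimilarity` call cost O(n) in total, which is the amortized O(1) bound claimed below.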
Clearly `append` and `rotationalsimilarity` (since it clears the list) are amortized O(1). If `rotationalsimilarity` were worst-case O(1), then we could provide an O(1) `undo` operation that restores the data structure to its previous state. It follows that we could implement element distinctness in time O(n), contradicting the Ω(n log n) lower bound assumed above.
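To illustrate the undo claim: under the hypothesis that `rotationalsimilarity` ran in worst-case O(1), restoring the previous state is itself O(1), since clearing the list is the only state change the call makes. The wrapper below, reusing the hypothetical `Rotatable` sketch above, is just one way to write that down:

```python
# Hypothetical: assuming a worst-case O(1) rotationalsimilarity, an O(1) undo is
# immediate, because the only state the call destroys is the list it clears.
class UndoableRotatable(Rotatable):
    def rotationalsimilarity(self, other):
        self._saved = self.items               # O(1): keep a reference to the old list
        return super().rotationalsimilarity(other)

    def undo(self):
        self.items = self._saved               # O(1): restore the previous state
```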