The Standard Deque
A deque is a collection that allows items to be added at the front, but removed from either the front or the back, giving it properties of both a stack and a queue. If you remove the ability to remove items from the back of the deque, it becomes a first-in-last-out stack, whereas if you instead remove the ability to remove items from the front, it becomes a first-in-first-out queue. If you think of stacks and queues as restricted deques, you can understand the design and mechanics of all three structures while only studying one.
A Sequential Deque
Sequential memory must (generally) be preallocated to avoid collisions with other in-memory structures, which creates an issue if the items in a deque exceed the allocated size. Until that happens, the most naive way to keep the items of a deque inside its allocated space is to follow a cyclical structure: new items are added along the sequence of memory until reaching the end of the allocated space, at which point they wrap around to the start again. If any items are removed from the “back” of the deque, at the start of the sequence, more room is cleared for the cycle. This ensures that the allocated memory won’t run out until every sector of the memory is actually in use, and it allows items to be added or removed from either the front or the back at any time.
This structure requires two indices to track the front and back of the memory. The structure is out of space when the front and back indices point at the same block, implying that another addition to either the front or back of the deque would overwrite the entry at the other end. Once this occurs, you can either remove items to create space, block new additions, start a new deque for new items, or copy the contents to a larger allocation of space.
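The cyclical structure above can be sketched as a fixed-capacity ring buffer. All names here are invented for illustration; one design note is that a `count` field is kept alongside the two indices, because with the indices alone the front-meets-back condition is ambiguous between a full buffer and an empty one.

```python
# A minimal sketch of a sequential deque over preallocated memory.
# Items are added at the front; removals are allowed at either end.
class RingDeque:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.front = 0   # index where the next front item will be written
        self.back = 0    # index of the current back (oldest) item
        self.count = 0   # disambiguates "full" from "empty" when front == back

    def push_front(self, item):
        if self.count == len(self.buf):
            raise OverflowError("deque is full")  # the indices have met
        self.buf[self.front] = item
        self.front = (self.front + 1) % len(self.buf)  # wrap to the start
        self.count += 1

    def pop_front(self):
        if self.count == 0:
            raise IndexError("deque is empty")
        self.front = (self.front - 1) % len(self.buf)
        self.count -= 1
        return self.buf[self.front]

    def pop_back(self):
        if self.count == 0:
            raise IndexError("deque is empty")
        item = self.buf[self.back]
        self.back = (self.back + 1) % len(self.buf)  # clears room for the cycle
        self.count -= 1
        return item
```

Removing from the back advances the back index through the buffer, which is exactly the “clearing room for the cycle” described above.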
The Restricted Sequential Stack, Queue, and Double Stack
To build a stack, you may restrict a deque to only allow for addition and removal of items at the front of the deque. The back of the deque then becomes static, and the index no longer needs to be tracked.
To build a queue, you may restrict a deque to only allow for addition of items at the front of the deque, and removal of items from the back. The restricted queue behaves exactly like the full deque, just without the mechanism to remove an item from the front.
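Both restrictions can be shown in a few lines. In this sketch (names invented), a Python list stands in for the allocated memory, with the “front” at the end of the list; all that differs between the two structures is which end removal is permitted from.

```python
class Stack:
    """Deque restricted to front-only access: first in, last out."""
    def __init__(self):
        self.items = []

    def push(self, item):
        self.items.append(item)   # add at the front

    def pop(self):
        return self.items.pop()   # remove from the front


class Queue:
    """Deque restricted to front addition and back removal: first in, first out."""
    def __init__(self):
        self.items = []

    def enqueue(self, item):
        self.items.append(item)   # add at the front

    def dequeue(self):
        return self.items.pop(0)  # remove from the back
```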
A double stack allows two stacks to start at either end of an allocated section of memory, such that the stacks grow toward each other and only run out of space once they meet. This gives both stacks optimal access to memory, reducing wasted space and the chance of one stack running out of space prematurely. A double stack may be created from a deque by only adding one type of item to the front, and another type to the back. Of course, while removing items from the front or back you must also detect when the index reaches the turnover from one stack to the next, such that you don’t remove from the wrong stack. This difference is more pronounced than with stacks or queues, but most of the methods for handling overflows, deletions, and vacuuming are similar, and so the relation between a deque and a double stack is still worth considering.
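A sketch of the double stack, with invented names: stack A grows up from index 0, stack B grows down from the top, and the structure is full once the two tops meet. The pop guards are the “turnover detection” mentioned above, preventing one stack from removing the other’s items.

```python
class DoubleStack:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.top_a = 0                # next free slot for stack A
        self.top_b = capacity - 1     # next free slot for stack B

    def push_a(self, item):
        if self.top_a > self.top_b:
            raise OverflowError("the stacks have met")
        self.buf[self.top_a] = item
        self.top_a += 1

    def push_b(self, item):
        if self.top_b < self.top_a:
            raise OverflowError("the stacks have met")
        self.buf[self.top_b] = item
        self.top_b -= 1

    def pop_a(self):
        if self.top_a == 0:           # turnover check: don't cross into B
            raise IndexError("stack A is empty")
        self.top_a -= 1
        return self.buf[self.top_a]

    def pop_b(self):
        if self.top_b == len(self.buf) - 1:  # turnover check: don't cross into A
            raise IndexError("stack B is empty")
        self.top_b += 1
        return self.buf[self.top_b]
```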
Handling overflows
The simplest solution to handling overflows is to copy the entire collection to another, larger allocation of memory. This is the method used by C#’s standard stack and queue. This approach ensures that you can continue to add items to the structure, assuming that the program is still allowed to allocate more space in general. Reading a value still completes in $O(1)$ time. Without a write cache, however, writing a value that overflows the allocation requires an immediate full copy of the deque, which takes $O(n)$ time. It also requires that references to the deque be indirect, such that the location of the deque is allowed to move entirely. The index already creates an indirect reference, however, so if the location of the index is static there may be no change in performance.
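The copy-on-overflow approach can be sketched as follows; for brevity this shows only the stack restriction of the deque, and the doubling growth factor and names are assumptions for illustration, not a claim about any particular library’s internals.

```python
class GrowableStack:
    def __init__(self, capacity=4):
        self.buf = [None] * capacity  # the current allocation
        self.count = 0

    def push(self, item):
        if self.count == len(self.buf):
            # Overflow: O(n) copy of every item into a larger allocation.
            # The allocation's location moves, so access must be indirect.
            new_buf = [None] * (len(self.buf) * 2)
            for i in range(self.count):
                new_buf[i] = self.buf[i]
            self.buf = new_buf
        self.buf[self.count] = item
        self.count += 1

    def pop(self):
        self.count -= 1
        return self.buf[self.count]  # reads remain O(1)
```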
As an alternative to a full copy, you can start a new deque in a new allocation of memory, such that new items are added and removed from that allocation before returning to the existing deque if the overflow is ever cleared. The new deque would then need a reference to the old deque such that the index may jump from one allocation to the next once the newer is cleared. This method still requires that the ends of the deque be able to move in memory, such that references might have to be more indirect than with a static location. Since no copies are ever required, writes, removals, and reads may all complete in constant time, though they might require an expensive allocation or deallocation, which may complete in $O(n)$ time depending on the memory architecture. This is roughly the approach that Postgres uses for its rows.
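A sketch of this linked-allocation approach, again using the stack restriction for brevity (all names invented): when a fixed-size chunk overflows, a new chunk is allocated and linked back to the old one, so no existing items are ever copied, and the index jumps back to the older allocation once the newer one is cleared.

```python
class Chunk:
    def __init__(self, size, prev=None):
        self.items = [None] * size
        self.count = 0
        self.prev = prev              # link back to the older allocation


class ChunkedStack:
    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.head = Chunk(chunk_size)

    def push(self, item):
        if self.head.count == self.chunk_size:
            # Overflow: allocate and link a new chunk; nothing is copied.
            self.head = Chunk(self.chunk_size, prev=self.head)
        self.head.items[self.head.count] = item
        self.head.count += 1

    def pop(self):
        if self.head.count == 0:
            if self.head.prev is None:
                raise IndexError("stack is empty")
            # The newer allocation cleared: jump back to the older one.
            self.head = self.head.prev
        self.head.count -= 1
        return self.head.items[self.head.count]
```

Setting `chunk_size` to one degenerates this into the linked list described below, with one item per allocation.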
A deque which uses independent, linked chunks of memory in this way is similar to a linked list, but with multiple items at each location in memory rather than only one. In fact, just as the standard deque could be restricted into a stack or a queue by removing operations, the standard deque can also be restricted into a linked list by using the linked address solution for overflows and setting the maximum size of each segment to one. This design would rarely be useful in practice, but it helps illustrate the advantages of the different configurations of a standard deque. A linked list does not require vacuuming, as each deleted item only requires that its memory be deallocated and the links of its neighbors corrected. Similarly, a linked deque that uses smaller segments of memory is expected to take less time to vacuum than one with larger segments, as any segments that were cleared will not result in a copy, and items need only be copied until they fill a segment, at which point the copy may stop.
If the storage of elements is optional, you may silently drop items on overflow. These drops may be performed on the oldest items in the deque, either manually or by simply allowing new items to overwrite the old. The drops may also be selectively applied through deletions on items using some kind of filter, though this requires that the memory be vacuumed afterwards. Finally, new items may be silently dropped until space becomes available. Which items are dropped will depend on which items are considered to be the most important.
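The overwrite-the-oldest variant can be sketched as a bounded ring buffer that never refuses a write (names invented); this mirrors the behavior of Python’s `collections.deque` when constructed with a `maxlen`.

```python
class DropOldestBuffer:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.front = 0    # next slot to write
        self.count = 0

    def push(self, item):
        if self.count == len(self.buf):
            # Full: the oldest item is silently dropped by overwriting it.
            self.count -= 1
        self.buf[self.front] = item
        self.front = (self.front + 1) % len(self.buf)
        self.count += 1

    def pop_oldest(self):
        if self.count == 0:
            raise IndexError("buffer is empty")
        oldest = (self.front - self.count) % len(self.buf)
        self.count -= 1
        return self.buf[oldest]
```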