
Queues, Stacks, and Deques data structures coded in Rust

TLDR;

  • If you need a Queue or a Stack, use std::collections::VecDeque
  • If you need only a Stack, a plain Vec is enough (a quick sketch of both follows this list)
  • It is possible to get push and pop for a Queue or a Stack in O(1) time by rotating the pointers and overwriting the already existing items; this is roughly 30% faster than VecDeque
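
A quick sketch of those two recommendations, just for illustration:

use std::collections::VecDeque;

fn main() {
    // Queue: FIFO behaviour with VecDeque.
    let mut queue: VecDeque<i32> = VecDeque::new();
    queue.push_back(1);
    queue.push_back(2);
    assert_eq!(queue.pop_front(), Some(1)); // first in, first out

    // Stack: LIFO behaviour with a plain Vec.
    let mut stack: Vec<i32> = Vec::new();
    stack.push(1);
    stack.push(2);
    assert_eq!(stack.pop(), Some(2)); // last in, first out
}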

Intro

The following article is a transcript of my YouTube video on Queues, Stacks, and Deques data structures, with examples implemented in Rust.

Queue

A queue is a FIFO (First In, First Out) data structure. To define one, we need a generic vector data and a couple of pointers, commonly called head and tail. To add an element (the push method) we write it at the tail and increment the tail pointer; to take an element out (the pop method) we read it at the head and increment the head pointer. Once a pointer reaches the maximum size, we rotate it back to index zero.

The queue is used in graph algorithms mainly for Breadth First Search.
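
For instance, here is a minimal BFS sketch that relies on a queue; it uses the standard VecDeque and assumes the graph is stored as an adjacency list (both choices are for illustration only):

use std::collections::VecDeque;

// Visit all nodes reachable from `start`, level by level.
fn bfs(adj: &[Vec<usize>], start: usize) -> Vec<usize> {
    let mut visited = vec![false; adj.len()];
    let mut order = Vec::new();
    let mut queue = VecDeque::new();

    visited[start] = true;
    queue.push_back(start);

    // Pop from the front, push neighbours at the back: FIFO order.
    while let Some(node) = queue.pop_front() {
        order.push(node);

        for &next in &adj[node] {
            if !visited[next] {
                visited[next] = true;
                queue.push_back(next);
            }
        }
    }

    order
}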

Queue coded in Rust

pub struct Queue<T> {
    head: usize,
    tail: usize,
    data: Vec<Option<T>>,
}
impl<T> Queue<T> {
    pub fn new(size: usize) -> Self {
        Queue {
            head: 0,
            tail: 0,
            data: Vec::with_capacity(size),
        }
    }

    pub fn push(&mut self, element: T) {
        self.push_element(element);

        if self.tail + 1 < self.data.capacity() {
            self.tail += 1;
        } else {
            self.tail = 0;
        }
    }

    pub fn pop(&mut self) -> Option<T> {
        // `get_mut` returns None instead of panicking when the head points
        // past the end of the backing vec (i.e. the queue is empty).
        let value = self.data.get_mut(self.head).and_then(|slot| slot.take());

        // Only advance (and possibly wrap) the head when an element was
        // actually removed, so popping an empty queue does not desync it.
        if value.is_some() {
            if self.head + 1 < self.data.capacity() {
                self.head += 1;
            } else {
                self.head = 0;
            }
        }

        value
    }

    fn push_element(&mut self, element: T) {
        if self.must_grow() {
            // The backing vec has not grown to this position yet,
            // so push a brand new slot.
            self.data.push(Some(element));
        } else {
            // Take the previous value out of the slot (dropping it) and
            // overwrite the slot with the new element.
            self.data[self.tail].take();
            self.data[self.tail] = Some(element);
        }
    }

    // True while the backing vec still has to grow: the tail sits right at
    // the current length and the capacity has not been reached yet.
    fn must_grow(&self) -> bool {
        self.tail == self.data.len() && self.tail < self.data.capacity()
    }
}
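
A quick sketch of how the queue above can be used, just for illustration:

fn main() {
    // FIFO order: elements come out in the order they were pushed.
    let mut queue = Queue::new(4);
    queue.push("a");
    queue.push("b");
    queue.push("c");

    println!("{:?}", queue.pop()); // Some("a")
    println!("{:?}", queue.pop()); // Some("b")
    println!("{:?}", queue.pop()); // Some("c")
    println!("{:?}", queue.pop()); // None
}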

Queue Analysis Time/Space Complexity

The above queue has both push and pop in O(1) constant time. Space complexity is O(n). The implementation is roughly 30% faster than the standard VecDeque because we never shift removed elements back: once the number of elements exceeds the queue's capacity, we simply rotate the pointers and overwrite the existing items, as the sketch below shows. The way to mitigate that data loss is to create the queue with a generous initial size, or to make sure your business constraints never exceed the size you specify.
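
A tiny sketch of that trade-off, assuming the Queue above with a capacity of 2 (and that Vec::with_capacity allocates exactly the requested capacity, which is typical but not guaranteed):

fn main() {
    let mut queue = Queue::new(2);
    queue.push(1);
    queue.push(2);
    queue.push(3); // the tail wraps around and overwrites 1 in slot 0

    println!("{:?}", queue.pop()); // Some(3) -- the oldest slot now holds 3
    println!("{:?}", queue.pop()); // Some(2)
}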

Stack

A stack is a LIFO (Last In, First Out) data structure. To define one, we need a generic vector data and a single pointer, called top or tail, that tracks the position just past the last inserted element. When we push an item, we write it at the tail and move the tail one position to the right. On pop, we move the tail one position back and return the element it now points to.

The stack is used in graph algorithms mainly for Depth First Search.

In almost all languages, arrays or vectors already behave like stacks, so you rarely need to write your own.
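
To illustrate both points, here is a minimal iterative DFS sketch that uses a plain Vec as the stack; the adjacency-list representation is assumed for illustration:

// Visit all nodes reachable from `start`, going as deep as possible first.
fn dfs(adj: &[Vec<usize>], start: usize) -> Vec<usize> {
    let mut visited = vec![false; adj.len()];
    let mut order = Vec::new();
    let mut stack = vec![start]; // Vec::push / Vec::pop give LIFO order

    while let Some(node) = stack.pop() {
        if visited[node] {
            continue;
        }
        visited[node] = true;
        order.push(node);

        for &next in &adj[node] {
            if !visited[next] {
                stack.push(next);
            }
        }
    }

    order
}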

Stack coded in Rust

pub struct Stack<T> {
    tail: usize,
    data: Vec<Option<T>>,
}
impl<T> Stack<T> {
    pub fn new(size: usize) -> Self {
        Stack {
            tail: 0,
            data: Vec::with_capacity(size),
        }
    }

    pub fn push(&mut self, element: T) {
        self.push_element(element);

        if self.tail + 1 < self.data.capacity() {
            self.tail += 1;
        } else {
            self.tail = 0;
        }
    }

    pub fn pop(&mut self) -> Option<T> {
        match self.tail {
            // The tail wrapped to zero in `push`, so the most recently
            // pushed element sits at the end of the buffer.
            0 => self.tail = self.data.len().saturating_sub(1),
            _ => self.tail -= 1,
        };

        // `get_mut` returns None instead of panicking on an empty stack.
        self.data.get_mut(self.tail).and_then(|slot| slot.take())
    }

    fn push_element(&mut self, element: T) {
        if self.must_grow() {
            // Grow the backing vec by pushing a brand new slot.
            self.data.push(Some(element));
        } else {
            // Take the previous value out of the slot (dropping it) and
            // overwrite the slot with the new element.
            self.data[self.tail].take();
            self.data[self.tail] = Some(element);
        }
    }

    // True while the backing vec still has to grow: the tail sits right at
    // the current length and the capacity has not been reached yet.
    fn must_grow(&self) -> bool {
        self.tail == self.data.len() && self.tail < self.data.capacity()
    }
}
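
A quick sketch of how the stack above can be used, just for illustration:

fn main() {
    // LIFO order: the last pushed element comes out first.
    let mut stack = Stack::new(3);
    stack.push("a");
    stack.push("b");
    stack.push("c");

    println!("{:?}", stack.pop()); // Some("c")
    println!("{:?}", stack.pop()); // Some("b")
    println!("{:?}", stack.pop()); // Some("a")
    println!("{:?}", stack.pop()); // None
}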

Stack Analysis Time/Space Complexity

The above stack has both push and pop in O(1) constant time. Space complexity is O(n).

Deques

Deques (double-ended queues) are data structures that behave as a Queue and a Stack at the same time. The Rust standard library implementation is std::collections::VecDeque. For most use cases where you need either a queue or a stack, I highly encourage you to use VecDeque.

trait Deque<T> {
    fn push_front(&mut self, element: T);
    fn pop_front(&mut self) -> Option<T>;

    fn push_back(&mut self, element: T);
    fn pop_back(&mut self) -> Option<T>;
}
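
VecDeque already provides all four of these operations; a quick sketch of using it from both ends:

use std::collections::VecDeque;

fn main() {
    let mut deque: VecDeque<i32> = VecDeque::new();

    // Queue behaviour: push at the back, pop from the front.
    deque.push_back(1);
    deque.push_back(2);
    assert_eq!(deque.pop_front(), Some(1));

    // Stack behaviour: push and pop at the same end.
    deque.push_back(3);
    assert_eq!(deque.pop_back(), Some(3));

    // And it also accepts elements at the front.
    deque.push_front(0);
    assert_eq!(deque.pop_front(), Some(0));
}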

Benchmark Queue vs VecDeque

I did a benchmark using criterion and the following configuration.

use std::collections::VecDeque;

use criterion::{criterion_group, criterion_main, Criterion};
use queue::Queue;

pub fn criterion_benchmark(c: &mut Criterion) {
    let mut group = c.benchmark_group("Queue vs VecDeque");
    let size = 100_000;
    let arr = vec![100; size];

    group.bench_function("Queue", |b| {
        let mut queue = Queue::new(size);
        b.iter(|| {
            arr.clone()
                .into_iter()
                .for_each(|element| queue.push(element));

            for _ in 0..arr.len() {
                queue.pop();
            }
        })
    });

    group.bench_function("VecDeque", |b| {
        let mut queue = VecDeque::new();

        b.iter(|| {
            arr.clone()
                .into_iter()
                .for_each(|element| queue.push_back(element));

            for _ in 0..arr.len() {
                queue.pop_front();
            }
        })
    });

    group.finish();
}

criterion_group!(benches, criterion_benchmark);
criterion_main!(benches);
     Running benches/benchmark.rs (target/release/deps/benchmark-45385d985ef393f9)
Queue vs VecDeque/Queue time:   [69.570 µs 69.662 µs 69.762 µs]
                        change: [-2.7773% -2.3446% -1.8911%] (p = 0.00 < 0.05)
                        Performance has improved.
Found 5 outliers among 100 measurements (5.00%)
  2 (2.00%) high mild
  3 (3.00%) high severe
Queue vs VecDeque/VecDeque
                        time:   [106.55 µs 106.81 µs 107.13 µs]
                        change: [-1.3963% -0.9752% -0.5092%] (p = 0.00 < 0.05)
                        Change within noise threshold.
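
For reference, in this run the custom Queue finishes the workload in roughly 69.7 µs versus roughly 106.8 µs for VecDeque, about 35% less time, which is where the rough ~30% figure in the TLDR comes from.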

I hope this article was helpful. If you liked it, please share it with your friends and leave a comment; I will gladly answer any questions.
