
Course Notes — DSA

Running notes for a Data Structures & Algorithms course — concepts, analysis methods, and worked examples.


Basic Concepts

A Computational Problem asks, for each input instance, for a corresponding solution.

Formally, a computational problem is a relation that maps a set of instances (cases) to a possibly empty set of solutions; an Algorithm is a step-by-step procedure that, given an instance, produces a solution.

The behavior of algorithms is described using models of computation, which specify how an algorithm handles input/output and which operations it may perform. This course uses the RAM Model of Computation.

Formally, given a problem $P \subseteq I \times O$, an algorithm $A$ solves $P$ if, for every instance $i \in I$ that has a solution, $A$ outputs some $o \in O$ with $(i, o) \in P$.

A Data Structure stores and organizes data so that algorithms can access and modify it efficiently.

Analysis of Algorithms

When analyzing an algorithm, we evaluate its correctness and efficiency. The algorithm should:

  1. Solve the problem correctly for all valid inputs
  2. Utilize resources efficiently — primarily time, secondarily space

Primitive Operations

To analyze at an abstract level, we count primitive operations, each defined to run in $O(1)$ time on the RAM model: assigning a value to a variable, performing an arithmetic operation, comparing two values, indexing into an array, and calling or returning from a function.
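As an illustration (the function and the per-line counts are my own sketch, not from the course), here is how primitive operations can be tallied for a simple summation loop under the RAM model:

```python
def linear_sum(a):
    """Sum a list, annotated with RAM-model primitive-operation counts."""
    total = 0              # 1 assignment
    for x in a:            # n element fetches + (n + 1) loop tests
        total = total + x  # n additions + n assignments
    return total           # 1 return
```

Summing the counts gives $T(n) = c_1 n + c_0$ for some constants $c_1, c_0$, i.e. $T(n) \in O(n)$; the exact constants depend on the machine, which is why we keep only the growth rate.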

Growth of Functions

We study running time $T(n)$ as a function of input size $n$. Three asymptotic notations bound growth:

Big-O (upper bound)

$f(n) \in O(g(n))$ if there exist constants $c > 0$ and $n_0$ such that

$$f(n) \leq c \cdot g(n) \quad \text{for all } n \geq n_0.$$

We say $f$ grows no faster than $g$.
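A quick worked example (my own, not from the notes): to show $3n + 5 \in O(n)$, note that $n \geq 5$ implies $5 \leq n$, so

```latex
3n + 5 \;\le\; 3n + n \;=\; 4n \qquad \text{for all } n \ge 5,
```

and the constants $c = 4$, $n_0 = 5$ witness the definition.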

Big-Omega (lower bound)

$f(n) \in \Omega(g(n))$ if there exist constants $c > 0$ and $n_0$ such that

$$f(n) \geq c \cdot g(n) \quad \text{for all } n \geq n_0.$$

We say $f$ grows at least as fast as $g$.

Big-Theta (tight bound)

$f(n) \in \Theta(g(n))$ if $f(n) \in O(g(n))$ and $f(n) \in \Omega(g(n))$.

$f$ and $g$ grow at the same rate up to constant factors.

Common Complexity Classes

| Class | Name | Typical algorithm |
| --- | --- | --- |
| $O(1)$ | Constant | Array access, hash lookup |
| $O(\log n)$ | Logarithmic | Binary search, balanced BST ops |
| $O(n)$ | Linear | Linear scan, BFS |
| $O(n \log n)$ | Linearithmic | Merge sort, heap sort |
| $O(n^2)$ | Quadratic | Insertion sort, bubble sort |
| $O(n^3)$ | Cubic | Matrix multiplication (naive) |
| $O(2^n)$ | Exponential | Subset enumeration |
| $O(n!)$ | Factorial | Permutation enumeration |

Sorting Algorithms

Insertion Sort

Runs in $O(n^2)$ in the worst case, but works well for small $n$ or nearly-sorted data. The inner loop is tight and cache-friendly.
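A minimal sketch of insertion sort (my own, in Python for concreteness), sorting in place by shifting larger elements right until each key finds its slot:

```python
def insertion_sort(a):
    """Sort list a in place; returns a for convenience."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

On nearly-sorted input the inner while loop rarely fires, which is why the running time degrades gracefully toward $O(n)$ there.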

Merge Sort

Divide-and-conquer: split in half, sort each half, merge. The merge step takes $O(n)$; the recurrence is $T(n) = 2T(n/2) + O(n)$, which solves to $O(n \log n)$ by the Master Theorem.
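The three steps above can be sketched as follows (my own Python version, returning a new list rather than sorting in place):

```python
def merge_sort(a):
    """Return a sorted copy of a via divide-and-conquer."""
    if len(a) <= 1:
        return a  # base case: already sorted
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge: repeatedly take the smaller head element -- O(n) total.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Using `<=` in the merge keeps equal elements in their original relative order, making the sort stable.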

Heap Sort

Build a max-heap in $O(n)$, then extract-max $n$ times at $O(\log n)$ each, for $O(n \log n)$ total.
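A sketch of both phases (my own Python, using an array-backed heap where the children of index $i$ sit at $2i+1$ and $2i+2$):

```python
def heap_sort(a):
    """Sort list a in place with heap sort; returns a for convenience."""
    n = len(a)

    def sift_down(root, end):
        # Restore the max-heap property below `root` within a[:end].
        while (child := 2 * root + 1) < end:
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1  # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    # Phase 1: build a max-heap bottom-up -- O(n).
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Phase 2: swap the max to the end, shrink the heap -- n extractions.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```

The bottom-up build is $O(n)$ rather than $O(n \log n)$ because most nodes sit near the leaves and sift down only a few levels.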

Data Structures Covered

| Structure | Access | Search | Insert | Delete |
| --- | --- | --- | --- | --- |
| Array | $O(1)$ | $O(n)$ | $O(n)$ | $O(n)$ |
| Linked List | $O(n)$ | $O(n)$ | $O(1)$ | $O(1)$ |
| Stack / Queue | $O(n)$ | $O(n)$ | $O(1)$ | $O(1)$ |
| Hash Table | n/a | $O(1)$ avg | $O(1)$ avg | $O(1)$ avg |
| AVL Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ |
| Heap | n/a | $O(n)$ | $O(\log n)$ | $O(\log n)$ |
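The hash-table row can be felt directly in Python (a rough demonstration of mine, not a rigorous benchmark): membership in a `set` uses hashing, while membership in a `list` is a linear scan.

```python
import timeit

n = 100_000
data_list = list(range(n))
data_set = set(range(n))   # hash-based: O(1) average membership

target = n - 1             # worst case for the linear scan
t_list = timeit.timeit(lambda: target in data_list, number=100)
t_set = timeit.timeit(lambda: target in data_set, number=100)
# The hash lookup is typically orders of magnitude faster here.
print(f"list: {t_list:.4f}s  set: {t_set:.6f}s")
```

Absolute timings vary by machine, but the gap between the two grows linearly with `n`, matching the table.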