Your first algorithms course is not easy.

The *words* are not the hardest part, but from years of teaching this material we have learned that words *do* sometimes invite enormous misunderstandings. You can be led down a garden path by a fancy word that you learned, only to discover that it means something else *this* week.

Luckily, the terminological issue is relatively *easy* to remedy by just writing it down and raising awareness. Here are some of the false friends I am aware of. (Do contact me with more examples, in particular ones that actually confused you.)

### stack

A stack is a specific abstract data type, see Stack (abstract data type). When we say “stack” *in an algorithms course*, we almost always mean the data type.

However, the word “stack” is *also* a specific data structure in the runtime system of an active computer programme, see Call stack, also called execution stack, program stack, control stack, run-time stack, or machine stack, and is often shortened to just “the stack”.

When we talk about recursion and its implementation in our course, some people may say “the stack,” rather than “the call stack” or some more unambiguous term. For instance, you may overhear “in your graph exploration programme, you wouldn’t have run out of stack space if you had used a stack.” The first stack is the recursion stack, the other isn’t.

The (call) stack *is* a stack (abstract data type), but not vice versa.
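To make the overheard remark concrete, here is a hedged sketch (names and graph invented for illustration) of graph exploration that uses a stack, the abstract data type, so that the call stack stays shallow:

```python
def reachable(graph, start):
    """Return the set of vertices reachable from start.

    graph is an adjacency mapping, e.g. {0: [1, 2], 1: [2], 2: []}.
    Uses an explicit stack (the abstract data type, here a Python list)
    instead of recursion, so we cannot run out of call-stack space.
    """
    seen = {start}
    stack = [start]          # the stack (abstract data type)
    while stack:
        v = stack.pop()
        for w in graph[v]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen
```

The recursive version would push one call-stack frame per vertex on a long path; this version pushes plain list entries instead.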

### heap

A heap is a specific data structure that implements the abstract data type *priority queue*, see Heap (data structure). When we say “heap” *in an algorithms course*, we almost always mean this data structure.

However, the word “heap” is also used to refer to a specific chunk of dynamically allocated memory, see Dynamic memory allocation.

We *almost never* use that term in the present course, but you may come across it in relation to data structures in other material, for instance about allocating linked lists and other dynamic data structures.

The (memory) heap is *not* a heap data structure.
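As a small sketch of the algorithms-course sense, Python’s standard `heapq` module maintains a binary heap (the data structure) to give priority-queue behaviour; the task names here are made up:

```python
import heapq

# A binary heap implementing a priority queue: smallest priority out first.
# This has nothing to do with the memory heap.
pq = []
heapq.heappush(pq, (2, "write report"))
heapq.heappush(pq, (1, "fix bug"))
heapq.heappush(pq, (3, "tidy desk"))

priority, task = heapq.heappop(pq)   # (1, "fix bug")
```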

### map

In mathematics, a map is a “function,” a *something* that connects (“maps”) things from one set to another, see Map (mathematics). The Danish word is *afbildning*.

Some programming languages use the term to refer to what we call “symbol table” in this course: a *something* that maps “keys” to “values”, see Map (computer science).

Other words for this are associative array or dictionary.

An example is Java’s interface `java.util.Map`, which describes an abstract data type for a symbol table and is famously implemented in the data structure `java.util.HashMap`.

However, in other programming languages, `map` is used for *the application of a function to a sequence*, see Map (higher-order function). This is true for Python. (Python instead uses “dictionary”, `dict`, for its symbol tables.)

So in Python, `map` is a higher-order function; in Java, `Map` is an interface. These things have *nothing* to do with each other.

One can try to stick to the term *symbol table* for the abstract data type.
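The two Python meanings side by side, as a minimal sketch:

```python
# map, the higher-order function: apply a function to a sequence.
def square(x):
    return x * x

squares = list(map(square, [1, 2, 3]))   # [1, 4, 9]

# dict, Python's symbol table: keys mapped to values.
symbols = {"n": 100, "k": 5}
symbols["m"] = 20
```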

### running time / runtime

Running time is the time to execute an algorithm, synonymous with Time complexity.

It is given as a growth rate in terms of a size measure, using some asymptotic notation. We can say “the running time of merge sort is O(*n* log *n*)”.
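As a sketch, that bound comes from the standard divide-and-conquer recurrence for merge sort (split into two halves, merge in linear time):

```latex
T(n) = 2\,T(n/2) + \Theta(n)
\quad\Longrightarrow\quad
T(n) = O(n \log n)
```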

We may also use it to refer to the actual execution time (in seconds) of a concrete piece of code.

We can then say “the execution time for `ex8.cpp` was 2.4 seconds on input `002-huge-43.in`”.
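Measuring the second kind of running time might look like this hedged sketch (the workload is invented; `time.perf_counter` is the standard-library clock for this):

```python
import time

# Actual execution time in seconds, as opposed to asymptotic running time.
data = list(range(100_000))

start = time.perf_counter()
sorted(data)
elapsed = time.perf_counter() - start

print(f"execution time: {elapsed:.3f} seconds")
```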

In contrast, the *runtime system* is the environment in which a programme is executed, and *runtime* is a phase of a programme’s life cycle. We talk about *runtime errors* if a programme fails during execution, and may want to distinguish between checking for an array index-out-of-bounds error at *runtime* or at *compile-time*.

Neither of these concepts has anything to do with running time.

### graph

Graph means two things in algorithms and discrete mathematics; they have nothing in common.

Most of the time we mean a combinatorial structure that models a pairwise relation, see Graph (discrete mathematics). In this context, “graph” is a synonym of “network”.

It may also mean graph of a function, which is what you have learned in school. Such a graph is either a set of points or the drawing of these points. This is what we mean when we say “graph of a function,” or “draw the running time as a graph.”
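The two meanings, as a minimal sketch (the example graph and function are invented):

```python
# Meaning 1: a combinatorial graph (a network), here as an adjacency mapping.
network = {"a": ["b", "c"], "b": ["c"], "c": []}

# Meaning 2: the graph of a function, a set of points (x, f(x)).
def f(x):
    return x * x

graph_of_f = {(x, f(x)) for x in range(4)}   # {(0, 0), (1, 1), (2, 4), (3, 9)}
```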

### random

“Random” means two different things: *stochastic* (as in throwing dice or drawing lots) or *unpredictable* (as in arbitrary). These are very different, and your instructor will be very cross with you if you mix them up.

When, for instance, we pick the split element in quicksort randomly, we mean “uniformly at random and independently from previous random choices.” It’s *very* important that you don’t misread this as “arbitrary”: if the split element in quicksort is picked *arbitrarily* (say, by your worst enemy), the analysis breaks down and quicksort becomes slowsort.

Of course, when we talk about the *very* important random access model of computation (RAM), the word “random” means arbitrary/unconstrained/unpredictable. Here, it is important that accessing memory location *M*[*i*] takes constant time, no matter how *i* was chosen. Even your worst enemy wouldn’t be able to make the RAM spend more than constant time. So: the word random does *not* mean that the constant-time bound holds only for random-in-the-stochastic sense access, or (God forbid) that the result of *M*[*i*] is random or even arbitrary.

I believe that “random” only means “decidedly nonrandom” when we’re talking about the random access machine, so you’re probably safe in interpreting it as “taken from a probability distribution, probably uniformly and independently unless something else is said” in all other contexts.
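The quicksort convention can be sketched like this (a toy implementation, not the course’s official one), with the split element drawn uniformly at random and independently of earlier choices:

```python
import random

def quicksort(xs):
    """Sort a list, choosing each split element uniformly at random."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)   # uniform and independent, not "arbitrary"
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

If `random.choice` were replaced by an adversary’s choice, the expected O(*n* log *n*) analysis would no longer apply.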

### node

A node is a localised swelling in a network; it is a synonym of “knot,” both from the Latin *nodus*.

Think of a fisherman’s net.

In an algorithms course, *node* means “basic unit of a data structure”, see Node (computer science).

It often appears explicitly in an implementation, typically as an object of `class Node`, and sometimes only implicitly as a useful abstraction.
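As a hypothetical minimal sketch of such a `class Node`, here used to build a singly linked list:

```python
# The data-structure sense of "node": a small unit holding an item
# and a link to the next node.
class Node:
    def __init__(self, item, next=None):
        self.item = item
        self.next = next

# Build the list 1 -> 2 -> 3 by prepending, then traverse it.
head = None
for item in [3, 2, 1]:
    head = Node(item, head)

items = []
node = head
while node is not None:
    items.append(node.item)
    node = node.next
# items is now [1, 2, 3]
```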

We will also use *node* for the fundamental unit of a graph, see Vertex (graph theory). We will try to stick to “vertex” for such nodes, but the terms are synonyms and you will meet node-as-vertex in many places.

Another good word for node-as-vertex is *point*.

Perhaps confusingly, when you construct the data structure to solve a basic graph problem, typically you don’t use a node (data structure) for a node (combinatorial object).

### linear

Linear means at least two completely unrelated things.

In “linear programming”, the word is used as in calculus (linear function), to highlight the fact that the variable has degree 1 (rather than appearing squared).

In “linear probing,” a collision-resolution strategy employed by hash tables, it just means “one-after-the-other”.
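A hedged sketch of the one-after-the-other idea (no resizing; it assumes fewer keys than slots, so it is illustrative rather than production code):

```python
class LinearProbingTable:
    """Toy hash table with linear probing; capacity is fixed."""

    def __init__(self, capacity=8):
        self.keys = [None] * capacity
        self.values = [None] * capacity

    def _slot(self, key):
        i = hash(key) % len(self.keys)
        # On a collision, try the next slot, wrapping around.
        while self.keys[i] is not None and self.keys[i] != key:
            i = (i + 1) % len(self.keys)
        return i

    def put(self, key, value):
        i = self._slot(key)
        self.keys[i], self.values[i] = key, value

    def get(self, key):
        return self.values[self._slot(key)]
```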

### programming

In “introduction to programming,” or “computer programming” or “Java programming,” the word *programming* means “writing instructions”, typically using a programming language.

In “linear programming,” “quadratic programming,” or “dynamic programming,” the word *programming* just means *planning* or *scheduling*, as in “TV programming” or “theatre programming,” and could be usefully replaced with *scheduling* or *optimisation* almost everywhere it occurs.

As a particular confusion, constructing a linear program (or “programme” if you’re fancy) means *setting up a bunch of constraints* (these constraints are linear functions), but does not involve writing a computer program to *solve* those constraints. You can write a Java program to solve a linear program.
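For illustration, a toy linear program (the numbers are invented) is nothing more than a linear objective and linear constraints; no code in sight:

```latex
\begin{aligned}
\text{maximise}   \quad & 3x + 2y      \\
\text{subject to} \quad & x + y \le 4, \\
                        & x \le 2,     \\
                        & x,\ y \ge 0.
\end{aligned}
```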

### dynamic

*Dynamic* means either “the entries or the instance can change” or, confusingly, “the entries, once computed, are never recomputed”.

In the former sense, it is used in *dynamic array* (an array whose *length* can change – all arrays are mutable, so in that sense they’re all dynamic) or *dynamic graph algorithm* (an algorithm for a graph that undergoes changes). In the latter sense it is used in *dynamic programming*, a specific algorithmic optimisation strategy, where the whole point is that functions without side effects *never change*, so we might as well store their values in a static table and never change them.
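The “static table” sense can be sketched with the classic toy example, Fibonacci numbers:

```python
# Dynamic programming: each table entry is computed once and never
# changes afterwards.
def fib(n):
    table = [0, 1] + [None] * (n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]   # filled in once, never recomputed
    return table[n]
```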

The history of the term “dynamic programming” is documented by Bellman and quite entertaining.

### asymptotic

Asymptotic means “never falling together”, and the term draws attention to the fact that one function is the asymptote of another by virtue of the two never touching. This is typically a topic of high school maths, see asymptote. In the analysis of algorithms, “asymptotic” almost always just means “for large values of the input size” and has nothing to do with the functions not touching or even approaching each other. In particular, a function is asymptotically itself, and 2 is 1 in asymptotic notation; see Big O notation. There is an intermediate concept of asymptotic analysis which somewhat explains how the two terms are related.
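For completeness, a sketch of the standard definition behind “for large values of the input size”:

```latex
f(n) = O(g(n))
\iff
\exists c > 0 \; \exists n_0 \; \forall n \ge n_0 : \; f(n) \le c \cdot g(n)
```

Taking *g* = *f* shows that every function is asymptotically itself, and taking *c* = 2 shows why constant factors disappear, which is the sense in which 2 is 1 in asymptotic notation.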