# Algorithm

Written by Vangie Beal

An algorithm is a finite set of well-defined steps to solve a class of problems or perform a computation. In simpler terms, it is a set of guidelines that describes how to perform a task. To be classified as an algorithm, the set of rules must be unambiguous and have a clear stopping point. An algorithm can be expressed in any language, from natural languages like English or French to programming languages such as R or object-oriented languages like Java.

In computer systems, a developer creates a program essentially by writing a set of algorithms: instructions the computer follows to produce an output from a given input. Algorithms are essential to processing data quickly and reliably. Many computer programs contain algorithms that spell out, step by step and in a specific order, how to carry out a task such as calculating an employee's paycheck.
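The paycheck example can be sketched as a short algorithm: a hypothetical gross-pay calculation (the function name, 40-hour threshold, and time-and-a-half overtime rule are illustrative assumptions, not a real payroll system).

```python
def gross_pay(hours_worked, hourly_rate):
    # Assumed rule for illustration: regular pay up to 40 hours,
    # then overtime at 1.5x the hourly rate beyond that.
    regular = min(hours_worked, 40) * hourly_rate
    overtime = max(hours_worked - 40, 0) * hourly_rate * 1.5
    return regular + overtime

# 40 regular hours at $20 plus 5 overtime hours at $30
print(gross_pay(45, 20.0))  # 950.0
```

The input (hours and rate) goes in, the well-defined steps run in order, and the algorithm stops with a single output, which is exactly the structure the definition above describes.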

## Examples of algorithms

A common and simple example of an algorithm is a recipe: a finite list of instructions used to perform a task. Typically, these steps must be done in a specific sequence to achieve the desired outcome. Other well-known algorithms include:

• Google’s PageRank: The algorithm Google originally used to rank the importance of web pages indexed by its search engine and to decide the order in which search results were displayed. The original PageRank patent expired in September 2019, but it was the first algorithm Google used.
• Facebook timeline algorithm: The set of algorithms that determines which content a user sees and in what order, based on a series of parameters (personal tastes, responses to previous content, etc.). The algorithm is continually updated to improve the user experience.
• High-frequency trading algorithms: Algorithms used by financial institutions worldwide to place orders on the market automatically, based on expected profit and market conditions at any given time.
• Round-robin scheduling: Used by process and network schedulers in computing to assign time slices to each process in equal portions and in circular order. It determines how much CPU time the computer spends on each of the processes in progress.
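Of the examples above, round-robin scheduling is simple enough to sketch in a few lines. The following is a minimal simulation, not a real operating-system scheduler: process names and their remaining CPU time are assumptions chosen for illustration, and the quantum is the fixed time slice each process receives per turn.

```python
from collections import deque

def round_robin(processes, quantum):
    """Simulate round-robin scheduling.

    processes: dict mapping process name -> CPU time it still needs.
    quantum: the fixed time slice granted on each turn.
    Returns the sequence of (process, time_used) slices in run order.
    """
    queue = deque(processes.items())  # circular ready queue
    schedule = []
    while queue:
        name, remaining = queue.popleft()
        used = min(quantum, remaining)   # run for one quantum at most
        schedule.append((name, used))
        if remaining > used:
            # not finished: rejoin the back of the queue
            queue.append((name, remaining - used))
    return schedule

# Three hypothetical processes needing 5, 3, and 1 units of CPU time,
# scheduled with a quantum of 2
print(round_robin({"A": 5, "B": 3, "C": 1}, 2))
# [('A', 2), ('B', 2), ('C', 1), ('A', 2), ('B', 1), ('A', 1)]
```

Each process gets an equal slice per pass around the queue, which is the "equal portions in circular order" property the description above refers to.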