Concurrency

    Concurrency allows a central processing unit (CPU) to make progress on multiple independent tasks over the same period of time. It is what makes rapid switching between applications on a computer feel seamless: different processes appear to run at exactly the same time, but the CPU is actually switching quickly between threads, the smallest units of execution within a process. A single CPU core can technically run only one thread at a time, but concurrency lets it alternate between threads so rapidly that they seem to run simultaneously. Running multiple threads this way is called multithreading.
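As a rough illustration of multithreading, here is a minimal Python sketch (the function and variable names are invented for the example): two threads share one interpreter and take turns making progress on their own loops.

```python
import threading
import time

def task(name, results):
    # Each thread makes progress independently; on a single core the
    # scheduler switches between threads rather than running both at once.
    for i in range(3):
        results.append(f"{name}:{i}")
        time.sleep(0.01)  # pause briefly so the other thread can run

results = []
t1 = threading.Thread(target=task, args=("A", results))
t2 = threading.Thread(target=task, args=("B", results))
t1.start()
t2.start()
t1.join()
t2.join()
print(results)  # entries from A and B are typically interleaved
```

Even though both threads appear to run "at the same time," only one executes at any instant; the interleaved output is the visible effect of the rapid switching described above.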

    Concurrency lets a program execute different independent tasks on one CPU within the same general time frame, but it is not the same as parallelism. Parallelism actually runs tasks fully at the same time, rather than moving rapidly between them. One benefit of multi-core technology is that it allows a single CPU to perform parallel processing, with a separate task running on each core.

    Concurrency gives a computer with only one CPU more flexibility by letting it decide when, and for how long, to work on each task. Concurrency also differs from sequential processing: sequential tasks are completed one at a time, while concurrent tasks do not have to finish before another begins and can be interleaved in whatever way is most effective.
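The sequential-versus-concurrent distinction can be sketched with Python's asyncio, which runs concurrent tasks on a single thread; the step coroutine is an invented example. In the sequential case each task must finish before the next begins, while asyncio.gather starts both tasks and lets them overlap.

```python
import asyncio

async def step(name):
    # A concurrent task can pause here (await), letting another
    # task run in the gap before this one finishes.
    await asyncio.sleep(0.01)
    return name

async def main():
    # Sequential: "a" must complete before "b" even starts.
    seq = [await step("a"), await step("b")]
    # Concurrent: both tasks are in flight at once on one thread.
    conc = await asyncio.gather(step("a"), step("b"))
    return seq, conc

seq, conc = asyncio.run(main())
print(seq, conc)
```

Both approaches produce the same results here; the difference is that the concurrent version overlaps the waiting time instead of paying for it twice.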

    Concurrency can be applied to computer systems and programming languages. Programming languages that use concurrency include Java, Python, Rust, and Go.