What Is Concurrency Control?

Learn to distinguish between concurrency and parallelism and to use concurrency control to ensure data integrity in databases.

Concurrency vs. parallelism

Concurrency is the process of executing more than one task seemingly at the same time. It increases an application’s throughput by maximizing the utilization of a CPU core. Concurrency creates the illusion of parallel execution: the operating system allocates a slice of CPU time to each task, interleaving them rather than processing them simultaneously. The switch from one task to another is called a context switch. Ultimately, concurrency is about dealing with a lot of things at once.
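As a minimal sketch (the lesson does not prescribe a language, so Python is assumed here), the two threads below make progress in an interleaved fashion: the scheduler context-switches between them, so their output is mixed even though only one runs at any instant.

```python
import threading
import time

def task(name):
    # Each iteration sleeps briefly, giving the scheduler a chance
    # to context-switch and interleave the two tasks' output.
    for step in range(3):
        print(f"{name}: step {step}")
        time.sleep(0.1)

t1 = threading.Thread(target=task, args=("task-A",))
t2 = threading.Thread(target=task, args=("task-B",))
t1.start()
t2.start()
t1.join()
t2.join()
```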

In contrast, parallelism is the process of executing more than one task at the same time, with each task running on a dedicated CPU core. Parallelism is about doing a lot of things at once.
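A hedged sketch of parallelism, again in Python: separate processes let each task occupy its own CPU core, so a CPU-bound workload can truly run at the same time rather than being interleaved.

```python
import multiprocessing

def cpu_bound(n):
    # A CPU-bound loop; run in separate processes, each copy can
    # execute on its own core in parallel.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    with multiprocessing.Pool(processes=2) as pool:
        results = pool.map(cpu_bound, [10_000_000, 10_000_000])
    print(results)
```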
