Diving into Data Structures

Learn about data structures and algorithms.

Though a TPM isn't held to the same standard of programming proficiency as a software developer, we should still understand basic programming concepts. We'll cover the programming topics that come up most often in our day-to-day activities.

If we took a traditional route to becoming a TPM, a data structures and algorithms class was likely part of our first or second year of college. As with most programming fundamentals, we won't use these concepts directly in our day-to-day work. However, they form a strong foundation for understanding the language our development team will use in most of the conversations we have with them.

We'll briefly go over a few of the more common data structures we may encounter in design meetings, standups, and general work conversations. Even if we've taken the class and remember the concepts, it's always good to refresh our memory.

Space and time complexities

In a computer, random access memory (RAM) is where data in active use is stored, such as an application's variables. Because RAM is a limited resource, the amount of space data takes up in RAM is an important consideration. The other consideration is the amount of time it takes to perform an action such as searching, inserting, deleting, or accessing data. The time a single action takes is then compounded by the number of times it's repeated in a loop, and it can quickly add up to a considerable time sink if the wrong data structure is used for the task, as the sketch below shows.
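As a quick illustration (a hypothetical Python sketch, not from the lesson; the `count_hits` helper is an assumed name), the snippet below runs the same membership test many times against a list and a set. Each list lookup scans elements one by one, so the cost compounds across the loop, while the set's hash-based lookup stays roughly constant:

```python
import time

# Hypothetical illustration: the same membership test repeated in a loop
# against two data structures with different per-lookup costs.
items_list = list(range(100_000))  # "in" scans elements one by one
items_set = set(items_list)        # "in" hashes the key: near-constant time

def count_hits(haystack, needles):
    """Count how many needles appear in haystack.

    The loop multiplies the per-lookup cost of the chosen data structure.
    """
    return sum(1 for n in needles if n in haystack)

needles = range(0, 100_000, 10)  # 10,000 lookups

start = time.perf_counter()
count_hits(items_list, needles)
print(f"list: {time.perf_counter() - start:.3f}s")  # linear lookups compound quickly

start = time.perf_counter()
count_hits(items_set, needles)
print(f"set:  {time.perf_counter() - start:.3f}s")  # constant-time lookups stay fast
```

On most machines the list version takes orders of magnitude longer, even though the two calls are logically identical; only the data structure changed.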

Both of these measurements are expressed in what is referred to as big O ("big Oh") notation. These measurements are essential categories used as a reference point for understanding the performance of a data structure or method. In this context, big O notation assumes asymptotic growth and uses n to denote the input size that drives that growth. Essentially, these mathematical functions represent the curve, or behavior, that the space or time performance will start to match as n gets large enough.

As an example, if the amount of time it takes to access a specific element correlates linearly with where it is in the data structure (for instance, the fifth element in a collection), the big O is O(n). As n increases, so does the time it takes, which is called linear time. However, if the amount of time it takes to access an element is the same regardless of where in the data structure the element is, then the big O is O(1), or in other words, constant time.

As a TPM, knowing where the complexity categories come from isn't as important as knowing the relative costs associated with each big O. The figure below shows each big O category on a curve of operations vs. elements.
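To make the linear-versus-constant contrast concrete, here is a minimal sketch (the `Node` class and `linked_list_get` helper are assumptions for illustration, not part of the lesson). Reaching the fifth element of a linked list means walking past the four before it, while an array hands back any index in the same amount of time:

```python
# A minimal sketch contrasting linear-time and constant-time access.

class Node:
    """A single node in a singly linked list."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def linked_list_get(head, index):
    """O(n): walk the list node by node until we reach the index."""
    current = head
    for _ in range(index):
        current = current.next
    return current.value

# Build a small linked list: 0 -> 1 -> 2 -> 3 -> 4
head = None
for value in reversed(range(5)):
    head = Node(value, head)

array = [0, 1, 2, 3, 4]

print(linked_list_get(head, 4))  # linear time: cost grows with the index
print(array[4])                  # constant time: same cost at any index
```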
