Requirements of Spark

Let's understand the functional and non-functional requirements for building a framework like Spark.

Functional requirements

The functional requirements of Spark are listed below:

  • Data processing: The system should process a large working dataset efficiently, and it should be able to do so repeatedly to support iterative or interactive queries.

  • Latency and throughput: The system should achieve low latency and high throughput for tasks like iterative data processing (where the same data is used repeatedly) and ad hoc queries over the same dataset. For example, we expect the system to query many terabytes of data in a few seconds. The first run is usually slower than subsequent ones because the system has to load the data from disk, which involves I/O operations; the sketch after this list illustrates the pattern.

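To make the latency point concrete, here is a minimal sketch in Scala using Spark's public Dataset API. The file path, application name, and filter predicates are hypothetical placeholders, not part of the requirements above. The idea is that the first action pays the disk I/O cost to load the data; once the dataset is cached in memory, repeated queries over it avoid that cost.

```scala
import org.apache.spark.sql.SparkSession

object IterativeQuerySketch {
  def main(args: Array[String]): Unit = {
    // Local session for illustration; in a cluster, the master is set by the deployment.
    val spark = SparkSession.builder()
      .appName("iterative-query-sketch")
      .master("local[*]")
      .getOrCreate()

    // Hypothetical input file; replace with a real dataset path.
    val logs = spark.read.textFile("access_logs.txt")

    // cache() marks the dataset for in-memory reuse across queries.
    logs.cache()

    // First query: reads from disk (slow, involves I/O) and populates the cache.
    val errorCount = logs.filter(_.contains("ERROR")).count()

    // Subsequent ad hoc queries on the same data are served from memory, so they run much faster.
    val warnCount = logs.filter(_.contains("WARN")).count()

    println(s"errors=$errorCount, warnings=$warnCount")
    spark.stop()
  }
}
```

This reuse of an in-memory working set is exactly what distinguishes the first (disk-bound) run from the fast repeated runs described in the latency requirement.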