Computer architecture is one of the most crucial aspects of building an efficient computing device. It is a set of rules and methods that describe how software and hardware interact to form a computer system, and it is primarily concerned with balancing efficiency, performance, reliability, and cost. Generally, the architecture components include:
Processors
Memory
Peripherals
In computer architecture, performance is one of the key features when differentiating between systems. Different laws quantify and analyze the change in performance when a specific portion of the system is optimized. The two laws on which we will focus are:
Amdahl’s law
Gustafson’s law
Both of these laws are primarily used in the field of Parallel and Distributed Computing (PDC). PDC focuses on developing algorithms, architectures, and systems that can efficiently leverage parallelism to solve large-scale computational problems. Parallelism in this context means that a task can be divided among multiple processors and computed concurrently.
This law, formulated by computer architect Gene Amdahl, quantifies the maximum potential speedup achieved by parallelizing a computation while accounting for the fixed portion of the task that cannot be parallelized. Let's break this definition into something easier to understand. Speedup is the ratio of the time taken to complete a task on the baseline system to the time taken on the improved system. The parallelized fraction is the portion of the task that can run concurrently on multiple processors, while the serial fraction is the portion that must run sequentially. The law's core insight is that this fixed serial portion limits the overall speedup, no matter how many processors are added.
Amdahl’s law is expressed as a formula that determines the theoretical speedup of a system when a specific portion of it is improved. Typically, the improvement analyzed with this formula is an increase in the number of processors.

The formula is:

$$S(N) = \frac{1}{(1 - p) + \frac{p}{N}}$$

Here, $S(N)$ is the theoretical speedup, $p$ is the fraction of the task that can be parallelized, and $N$ is the number of processors.
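To make the formula concrete, here is a minimal Python sketch (the function name `amdahl_speedup` is our own, chosen for illustration):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup under Amdahl's law.

    p: fraction of the task that can be parallelized (0 <= p <= 1)
    n: number of processors
    """
    return 1.0 / ((1.0 - p) + p / n)

# Even with 1024 processors, a task that is 95% parallelizable
# is capped by its 5% serial portion at roughly a 20x speedup.
print(amdahl_speedup(p=0.95, n=1024))  # ~19.6
```

Notice that as `n` grows without bound, the speedup approaches $1/(1-p)$, which is why the serial fraction dominates at scale.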
Gustafson's Law, formulated by computer scientist John L. Gustafson, provides an alternative perspective on the potential performance improvements achievable through parallel computing. Gustafson's Law considers the workload's scalability and disregards the idea of having a fixed portion of the workload.
It states that as the computing resources (typically the number of processors) increase, the problem size and workload can be scaled up to utilize the available resources effectively. This implies that ever-larger problems can be solved with parallel computing, and that efficiency is maintained by keeping the problem size proportional to the available resources.
The law is expressed mathematically as:

$$S(N) = (1 - p) + pN$$

Here, $S(N)$ is the scaled speedup, $p$ is the fraction of the workload that runs in parallel, and $N$ is the number of processors.
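A matching Python sketch (again, the function name is ours) shows how the scaled speedup behaves:

```python
def gustafson_speedup(p: float, n: int) -> float:
    """Scaled speedup under Gustafson's law.

    p: fraction of the scaled workload that runs in parallel (0 <= p <= 1)
    n: number of processors
    """
    return (1.0 - p) + p * n

# With the same 5% serial portion, the scaled speedup keeps
# growing almost linearly with the processor count.
print(gustafson_speedup(p=0.95, n=1024))  # ~972.85
```

Unlike Amdahl's formula, this expression grows linearly in `n`, reflecting the assumption that the workload grows along with the resources.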
Now we discuss the differences between the two laws. As we can see, Gustafson’s law takes a completely different perspective on speedup in parallel computing: while Amdahl’s law focuses on a fixed problem size, Gustafson’s law considers the scalability of the workload.
We can see the advantages and disadvantages of the two laws side by side:

| | Amdahl’s law | Gustafson’s law |
| --- | --- | --- |
| Advantages | Quantifies the maximum potential speedup achieved by parallelizing a program. | Recognizes the importance of scaling the workload to fully leverage the available resources. |
| | Identifies the portions of a program that are not efficiently parallelizable. | Promotes efficient resource utilization by scaling the workload. |
| | Analyzes the balance between parallelization and alternative optimization approaches. | |
| Disadvantages | Assumes the parallelized portion of the program is fixed. | Assumes that the workload can be scaled proportionally to the available resources. |
| | Assumes that all processors have the same performance characteristics. | Overlooks the impact of the serial portion on overall performance and speedup. |
| | Ignores other factors that can affect the performance of parallel programs. | Focuses only on the potential benefits of workload scaling. |
Both of these laws are applicable in different scenarios. Amdahl’s law can be used when the workload is fixed, as it calculates the potential speedup under that assumption. Moreover, it can be utilized when the non-parallelizable portion of the task is relatively large, highlighting the limit this serial portion places on the achievable speedup.
On the other hand, Gustafson’s law is applicable when the workload or problem size can be scaled proportionally with the available resources. It also suits problems that demand larger workloads, promoting the development of systems capable of handling such realistic computations.
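To make the contrast concrete, the following self-contained Python sketch (using the same two illustrative functions as above) prints both predictions for a growing processor count:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup for a fixed workload (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p: float, n: int) -> float:
    """Scaled speedup for a workload grown with the resources (Gustafson's law)."""
    return (1.0 - p) + p * n

# Assume 90% of the work is parallelizable.
for n in (2, 8, 32, 128, 512):
    print(f"n={n:4d}  Amdahl: {amdahl_speedup(0.9, n):7.2f}  "
          f"Gustafson: {gustafson_speedup(0.9, n):7.2f}")

# Amdahl's prediction flattens toward 1/(1 - p) = 10, while
# Gustafson's prediction keeps growing with the processor count.
```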
Let's take a small quiz to understand this Answer better.
Assessment
Which statement accurately differentiates Amdahl’s Law and Gustafson’s Law?
Amdahl’s Law focuses on fixed problem size, while Gustafson’s Law considers variable problem size.
Amdahl’s Law emphasizes optimizing the serial portion, while Gustafson’s Law emphasizes scaling up the workload.
Amdahl’s Law limits the speedup by the non-parallelizable fraction, while Gustafson’s Law focuses on efficient utilization of parallel resources.
Amdahl’s Law and Gustafson’s Law address different aspects of achieving speedup in parallel computing.
Amdahl's Law and Gustafson's Law are two fundamental principles in parallel computing that address the potential speedup achievable by parallelizing a program. Amdahl's Law focuses on fixed problem size and states that the maximum speedup is limited by the fraction of the program that cannot be parallelized. In contrast, Gustafson's Law considers variable problem size. It emphasizes scaling up the workload to fully utilize the available resources, indicating that the speedup can be nearly linear as the problem size increases. These are the key differences between the two laws for computing potential speedup.