
What is Amdahl’s law?


Amdahl’s law states that the benefit of running an algorithm in parallel is limited by any section of it that can only run serially. The largest speedup that can be achieved by parallelizing a process is one divided by the proportion that cannot be parallelized; subtracting one from that ratio gives the speed increase. The law is mainly used in parallel computing and can be adapted for more complicated situations. It is named after the computer architect Gene Amdahl, and the name echoes that of Moore’s Law.

Amdahl’s law is an observation about running algorithms serially or in parallel. It states that the benefit of running steps in parallel (that is, executing multiple steps simultaneously) is limited by any section of the algorithm that can only run serially (one step at a time). The most common application of Amdahl’s law is in parallel computing, such as on multi-core machines.

At its core, Amdahl’s law is a mathematical formula. Stated in its simplest form, it says that the largest speedup that can be achieved by parallelizing a process is one divided by the proportion of the process that cannot be parallelized; subtracting one from that ratio expresses the result as a speed increase. For example, if 80% of a process can be parallelized, one divided by the remaining 20% gives five: the process can run at most five times as fast, a fourfold (400%) increase. The formula also works when only a minority of the process can be parallelized: if 12% can be parallelized, the calculation is one divided by 88%, which is about 1.136, a speedup of roughly 13.6%.
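To make the arithmetic concrete, here is a minimal Python sketch of that calculation. The function name max_speedup and the example fractions are illustrative choices for this article, not part of any standard library:

```python
def max_speedup(parallel_fraction: float) -> float:
    """Upper bound on speedup with unlimited processors (Amdahl's law).

    parallel_fraction: the share of the runtime that can be parallelized,
    as a number between 0 and 1. The serial remainder caps the speedup.
    """
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / serial_fraction

# 80% parallelizable: at most 5x as fast (a 400% increase).
print(max_speedup(0.80))  # 5.0

# 12% parallelizable: at most ~1.136x as fast (about a 13.6% increase).
print(max_speedup(0.12))  # 1.136...
```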

The formula can be adapted for use in more complicated situations where different steps in the process get different speedups from being parallelized. For each stage, take the fraction of the total time spent on that stage before parallelization and divide it by that stage’s speedup, then add these figures together to produce a total. Dividing one by this total gives the overall speedup, and subtracting one from the result gives the overall speed increase.
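The same adaptation can be sketched in Python. The three-stage breakdown below is a made-up example, assuming the per-stage time fractions sum to one:

```python
def overall_speedup(stages: list[tuple[float, float]]) -> float:
    """General form of Amdahl's law.

    stages: (time_fraction, speedup) pairs, where time_fraction is the
    share of the original runtime spent in that stage and speedup is how
    much faster that stage runs after parallelization (1.0 = unchanged).
    The time fractions should sum to 1.
    """
    new_total = sum(fraction / speedup for fraction, speedup in stages)
    return 1.0 / new_total

# Hypothetical process: 50% of the time gets a 4x speedup, 30% gets 2x,
# and the remaining 20% stays serial (speedup of 1).
stages = [(0.50, 4.0), (0.30, 2.0), (0.20, 1.0)]
print(overall_speedup(stages))  # 1 / (0.125 + 0.15 + 0.20) = ~2.105
```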

The main area where Amdahl’s law is used is parallel computing, in which multiple processors work on a task at the same time. Parallelism addresses a major limitation of computer processors: a single core runs very quickly but can only perform one action at a time. A multi-core processor can perform parallel computation effectively because each of its cores acts as a separate processor.
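As a hypothetical illustration of how this looks in practice, the Python sketch below runs a serial setup step on one core, then spreads independent per-item work across worker processes; the task itself is invented for the example:

```python
from concurrent.futures import ProcessPoolExecutor

def serial_setup(n: int) -> list[int]:
    # Serial portion: runs on a single core, so it caps the speedup.
    return list(range(n))

def work(x: int) -> int:
    # Parallelizable portion: independent per-item work.
    return x * x

if __name__ == "__main__":
    items = serial_setup(1000)
    # Parallel portion: spread the independent work across cores.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(work, items))
    print(sum(results))
```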

Amdahl’s law takes its name from the computer architect Gene Amdahl, who presented the argument in 1967. Some people contend that “law” is a misleading label and that it should really be called “Amdahl’s argument”; in any case, the name echoes that of Moore’s Law, a theory based on a 1965 observation by Gordon Moore, who went on to co-found Intel. Moore predicted that the number of transistors mounted on an integrated circuit would double at a regular interval (initially every year, a figure later revised to every two years), a prediction that proved highly accurate for decades.
