Speedup
Speedup is a measure of performance improvement obtained when a system, algorithm, or process is enhanced relative to a baseline. It is commonly defined as S = T1 / Tp, where T1 is the execution time of the reference (baseline) version and Tp is the execution time of the improved version. A higher speedup indicates greater performance gain. Speedup can apply to hardware upgrades, software optimizations, or changes in problem size or workload.
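As a minimal illustration (a Python sketch; the timing values are hypothetical), the ratio can be computed directly from two measured execution times:

    def speedup(t_baseline: float, t_improved: float) -> float:
        # S = T1 / Tp: ratio of baseline time to improved time
        return t_baseline / t_improved

    # Hypothetical measurements: a run that took 12 s now takes 3 s.
    print(speedup(12.0, 3.0))  # 4.0, i.e. the improved version is 4x faster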
In parallel computing, speedup describes how the execution time changes when a task is distributed across multiple processors or cores, with the single-processor run typically serving as the baseline T1 and the run on p processors as Tp.
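One way to measure parallel speedup in practice is sketched below in Python, assuming a hypothetical CPU-bound task and a machine with at least four cores; the measured speedup will usually fall short of the ideal factor of 4 because of scheduling and process-startup overhead:

    import time
    from multiprocessing import Pool

    def work(n):
        # CPU-bound placeholder task: sum of squares up to n (hypothetical workload)
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        chunks = [200_000] * 8           # eight independent pieces of work

        start = time.perf_counter()
        serial = [work(n) for n in chunks]
        t1 = time.perf_counter() - start

        start = time.perf_counter()
        with Pool(processes=4) as pool:  # assumes at least four cores are available
            parallel = pool.map(work, chunks)
        tp = time.perf_counter() - start

        assert serial == parallel
        print(f"serial {t1:.3f}s, parallel {tp:.3f}s, speedup {t1 / tp:.2f}x")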
Amdahl's law provides a bound on achievable speedup given a non-parallelizable portion of a task. If a fraction s of the work must run serially, the speedup on p processors cannot exceed S(p) = 1 / (s + (1 - s)/p), which approaches 1/s no matter how many processors are added.
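A short Python sketch (taking a serial fraction of 10% as an assumed input) shows how the bound plateaus near 1/s:

    def amdahl_speedup(serial_fraction: float, processors: int) -> float:
        # Amdahl's law: S(p) = 1 / (s + (1 - s) / p)
        s = serial_fraction
        return 1.0 / (s + (1.0 - s) / processors)

    # With 10% serial work, the speedup plateaus near 1 / 0.1 = 10.
    for p in (2, 8, 64, 1024):
        print(p, round(amdahl_speedup(0.10, p), 2))  # 1.82, 4.71, 8.77, 9.91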
Gustafson's law offers an alternative perspective, arguing that with larger problem sizes, the parallel portion can grow while the serial portion stays roughly fixed, so the scaled speedup S(p) = p - s(p - 1) increases nearly linearly with the number of processors.
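For comparison, a sketch of the scaled-speedup formula under the same assumed 10% serial fraction:

    def gustafson_speedup(serial_fraction: float, processors: int) -> float:
        # Gustafson's law (scaled speedup): S(p) = p - s * (p - 1)
        s = serial_fraction
        return processors - s * (processors - 1)

    # With a 10% serial fraction, scaled speedup grows almost linearly with p.
    for p in (2, 8, 64, 1024):
        print(p, round(gustafson_speedup(0.10, p), 1))  # 1.9, 7.3, 57.7, 921.7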
Applications of speedup analysis include performance engineering, benchmarking, and guiding decisions about hardware upgrades or parallelization strategies.