What’s cache memory?

Cache memory is fast memory located in or near the CPU that stores frequently used instructions and data, improving system speed. A CPU with more cache can outperform a faster CPU with less cache. Disk caching applies the same idea to storage, keeping frequently accessed hard drive data in RAM, which is much faster than the drive itself. Hybrid hard drives add a built-in flash cache that further reduces how often the platters must be read.

Cache memory (pronounced “cash”) is extremely fast memory that is either built into a computer’s central processing unit (CPU) or located next to it on a separate chip. The CPU uses cache memory to store the instructions and data it requests repeatedly while running programs, improving overall system speed. The advantage of cache memory is that the CPU does not have to go over the motherboard’s system bus to fetch that data. Whenever data has to travel across the system bus, the transfer rate is limited by the motherboard’s capability; by avoiding that bottleneck, the CPU can process data much faster.
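
To make the effect concrete, here is a minimal C sketch (not part of the original article, and best compiled with optimizations, for example gcc -O2). It sums the same large array twice: once in sequential order, where each cache line fetched from RAM is fully reused, and once with a large stride, which forces the CPU back out to main memory for nearly every access. On typical hardware the strided passes take several times longer, even though both versions touch exactly the same elements.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N ((size_t)64 * 1024 * 1024)   /* 64 Mi ints = 256 MB, far larger than any CPU cache */
#define STRIDE 4096                    /* jump ~16 KB between accesses */

static double seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    if (!a) return 1;
    for (size_t i = 0; i < N; i++)
        a[i] = (int)i;

    long sum = 0;

    double t0 = seconds();
    for (size_t i = 0; i < N; i++)      /* sequential: each cache line is   */
        sum += a[i];                    /* fetched from RAM once and reused */
    double t1 = seconds();

    for (size_t s = 0; s < STRIDE; s++) /* strided: same elements overall,  */
        for (size_t i = s; i < N; i += STRIDE)
            sum += a[i];                /* but almost every access misses   */
    double t2 = seconds();

    printf("sequential pass: %.3f s\n", t1 - t0);
    printf("strided passes:  %.3f s\n", t2 - t1);
    printf("(checksum %ld)\n", sum);    /* keeps the compiler from removing the loops */
    free(a);
    return 0;
}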

It just so happens that once most programs are up and running, they reuse a fairly small set of instructions and data. When those resources are cached, programs run faster and more efficiently. All other things being equal, cache has such a large effect on performance that a computer with a fast CPU but little cache can post lower benchmark scores than a system with a somewhat slower CPU and more cache. The cache built into the CPU itself is called the level 1 (L1) cache. Cache that resides on a separate chip next to the CPU is called level 2 (L2) cache. Some CPUs have both L1 and L2 cache on board and designate the separate cache chip as level 3 (L3) cache.
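
On a Linux system you can see these levels for yourself: the kernel exposes each cache’s level, type, and size under /sys/devices/system/cpu. The short C sketch below is Linux-only and assumes the standard sysfs layout; it simply prints whatever it finds for the first CPU core (for example an L1 data cache, an L1 instruction cache, an L2, and an L3).

#include <stdio.h>

int main(void) {
    for (int idx = 0; idx < 8; idx++) {   /* index0, index1, ... cover the cache levels */
        char path[128], level[16] = "", type[32] = "", size[32] = "";
        FILE *f;

        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu0/cache/index%d/level", idx);
        if (!(f = fopen(path, "r"))) break;            /* no more cache entries */
        fscanf(f, "%15s", level); fclose(f);

        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu0/cache/index%d/type", idx);
        if ((f = fopen(path, "r"))) { fscanf(f, "%31s", type); fclose(f); }

        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu0/cache/index%d/size", idx);
        if ((f = fopen(path, "r"))) { fscanf(f, "%31s", size); fclose(f); }

        printf("L%s %-12s %s\n", level, type, size);   /* e.g. "L1 Data 32K" */
    }
    return 0;
}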

The cache built into the CPU is the fastest, running at the speed of the processor itself. Separate cache is slower than on-chip cache, but still considerably faster than random access memory (RAM). Cache is more expensive per byte than RAM, which is why it comes in much smaller amounts, but choosing a CPU and motherboard with a generous cache is worth it to maximize system performance.

Disk caching applies the same principle to the hard drive that cache memory applies to the CPU. Frequently accessed hard drive data is stored in a segment of RAM so it does not have to be retrieved from the drive again and again; RAM is far faster than the spinning platters used in conventional hard drives. Hybrid hard drives narrow that gap with a built-in flash memory cache, and as drives move to all-flash storage the penalty for reading from the drive shrinks further. Flash memory is much faster than a platter, though still slower than RAM, so operating systems continue to cache disk data in memory as well.
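
The operating system’s disk cache is easy to observe. The C sketch below assumes a POSIX system and takes the path of any reasonably large file that has not been read recently. It reads the file twice and times each pass: the first pass comes from the drive, while the second is usually served from the copy the operating system kept in RAM and finishes far faster. (If the file was already cached, both passes will look fast.)

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Read the whole file in 64 KiB chunks and return the elapsed time in seconds. */
static double read_all(const char *path) {
    struct timespec t0, t1;
    char buf[1 << 16];
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); exit(1); }

    clock_gettime(CLOCK_MONOTONIC, &t0);
    while (fread(buf, 1, sizeof buf, f) == sizeof buf)
        ;                                   /* drain the file; stop at the short final read */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    fclose(f);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s <file>\n", argv[0]); return 1; }
    printf("first read  (from disk):      %.3f s\n", read_all(argv[1]));
    printf("second read (from RAM cache): %.3f s\n", read_all(argv[1]));
    return 0;
}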
