
CPU cache size
The CPU cache is a small, high-speed memory component that sits between the CPU and main memory. It is typically organized as a hierarchy of layers, L1 through L4, with L1 and L2 being smaller and faster than the larger, slower L3 and L4. In an ideal setting, all data needed by the application resides in the caches, so no reads from RAM are required, making the overall operation faster.
However, this is rarely the case for deep learning applications. For example, a typical ImageNet experiment with a batch size of 128 needs more than 85MB of cache to hold all the information for one mini-batch [13]. Since such workloads are far too large to be cache-resident, reads from RAM cannot be avoided. Hence modern CPU cache sizes have little to no impact on the performance of deep learning applications.
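The scale of the problem can be seen with a back-of-the-envelope estimate. The sketch below (an illustration, not taken from [13]) assumes 224×224 RGB inputs stored as 32-bit floats and counts only the input tensors of one mini-batch; labels, activations, and framework bookkeeping push the actual per-batch footprint higher still, consistent with the >85MB figure above.

```python
# Rough estimate of the memory footprint of one ImageNet mini-batch.
# Assumptions (not from the cited source): 224x224 RGB inputs, float32.
batch_size = 128
channels, height, width = 3, 224, 224
bytes_per_float32 = 4

input_bytes = batch_size * channels * height * width * bytes_per_float32
print(f"Input tensors alone: {input_bytes / 1e6:.1f} MB")
# → Input tensors alone: 77.1 MB
# Even this lower bound exceeds typical last-level cache sizes
# (tens of MB), so RAM reads are unavoidable.
```

Even before counting anything beyond the raw inputs, the mini-batch already overflows the last-level cache of most commodity CPUs.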