12
9
Is saying a cache is a special kind of buffer correct? They both perform similar functions, but is there some underlying difference that I am missing?
11
From Wikipedia's article on data buffers:
a buffer is a region of a physical memory storage used to temporarily hold data while it is being moved from one place to another
A buffer ends up cycling through and holding every single piece of data that is transmitted from one storage location to another (like when using a circular buffer in audio processing). A buffer allows just that - a "buffer" of data before and after your current position in the data stream.
Indeed, there are some common aspects of a buffer and a cache. However, a cache in the conventional sense usually does not store all of the data as it moves from place to place (e.g. a CPU cache).
The purpose of a cache is to store data in a transparent way, such that just enough data is cached so that the remaining data can be transferred without any performance penalty. In this context, the cache only "pre-fetches" a small amount of data (depending on the transfer rates, cache sizes, etc.).
The main difference is that a buffer will eventually have held all of the data. Conversely, a cache may have held all, some, or none of the data (depending on the design). However, a cache is accessed as if you were directly accessing the data in the first place - what exactly gets cached is transparent to the "user" of the cache.
The difference is in the interface. When you're using a cache to access a data source, you use it as if the cache is the data source - you can access every part of the data source through the cache, and the cache determines where the data comes from (the cache itself, or the source). The cache itself decides which parts of the data to preload (usually just the beginning, but sometimes all of it), while the cache replacement algorithm in use determines what gets removed from the cache, and when. The best example of such a system, aside from the CPU cache itself, is a prefetcher/readahead mechanism. Both load the parts of the data they think you will use most into memory, and fall back to the hard drive if something isn't cached.
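To make that "transparent access" idea concrete, here is a minimal read-through cache sketch in Python. All names (ReadThroughCache, SlowDisk, read_block) are made up for illustration; the point is only that the caller always asks the cache, and the cache decides whether to serve the data itself or fall back to the slower source.

    # Minimal read-through cache sketch (hypothetical names, not from the answer).
    # The caller always asks the cache; the cache decides whether the data comes
    # from itself or from the slower underlying source - access stays transparent.

    class SlowDisk:
        def read_block(self, n):
            return "block %d from disk" % n      # stand-in for a slow data source

    class ReadThroughCache:
        def __init__(self, source, capacity=4):
            self.source = source                 # anything with a read_block(n) method
            self.capacity = capacity
            self.blocks = {}                     # block number -> cached data

        def read_block(self, n):
            if n not in self.blocks:             # miss: fall back to the source
                if len(self.blocks) >= self.capacity:
                    self.blocks.pop(next(iter(self.blocks)))   # evict the oldest entry
                self.blocks[n] = self.source.read_block(n)
            return self.blocks[n]                # hit, or the freshly fetched block

    cache = ReadThroughCache(SlowDisk())
    print(cache.read_block(0))                   # miss: fetched from the "disk"
    print(cache.read_block(0))                   # hit: served from the cache

The caller never knows (or cares) whether a given read was a hit or a miss, which is exactly the transparency described above.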
Conversely, a buffer can't be used to instantaneously move your location in the data stream, unless the new part has already been moved into the buffer. Doing so would require the buffer to relocate (given that the new location exceeds the buffer length), effectively requiring you to "restart" the buffer from the new location. The best example of this is moving the slider in a YouTube video.
Another good example of a buffer is audio playback in Winamp. Since audio files need to be decoded by the CPU, some time passes from when the song is read in, to when the audio is processed, to when it's sent to your sound card. Winamp will buffer some of the audio data, so that there is enough already-processed audio to avoid any "lock-ups" (i.e. the CPU is always preparing the audio you'll hear a few hundred milliseconds from now, never in real time; what you hear comes from the buffer, which is what the CPU prepared in the past).
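A rough sketch of that buffering pattern, with made-up names: the decoder (producer) works ahead of playback, and the consumer only ever plays chunks that were prepared earlier and parked in the buffer.

    from collections import deque

    # Rough sketch of the audio-buffering idea above (names are illustrative).
    # The decoder (producer) stays ahead of playback; the sound card (consumer)
    # only ever plays chunks that were prepared earlier and parked in the buffer.

    buffer = deque()                       # holds decoded audio chunks, in order

    def decode_next_chunk(i):
        return "decoded chunk %d" % i      # stand-in for CPU-heavy decoding

    def play(chunk):
        print("playing", chunk)            # stand-in for handing data to the sound card

    for i in range(8):                     # the producer runs ahead, filling the buffer
        buffer.append(decode_next_chunk(i))

    while buffer:                          # the consumer drains the buffer in order
        play(buffer.popleft())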
I take it that in your example of YouTube, if the video has completely buffered (i.e. downloaded to your device), it is simply cached and you don't have to seek to a new location each time you move the slider, unless you refresh the page or reload the video. Would that be correct? – PeanutsMonkey – 2012-06-07T17:43:48.123
@PeanutsMonkey correct, the Youtube video is downloaded directly into your browser's cache as it is being buffered. The buffer in this case is simply a high-level term, as you are always viewing what currently sits in the cache. As the video is buffered, it's moved to the cache (they share the same physical location). I updated the answer with another example of a buffer, in the context of audio players. – Breakthrough – 2012-06-07T17:54:17.347
tl;dr version: If you want to get the data out of it as quickly as possible, it's a buffer. If you want to keep the data in it as long as possible, it's a cache. – David Schwartz – 2012-06-07T18:28:38.533
10
It would be more accurate to say that a cache is a particular usage pattern of a buffer, one that implies multiple uses of the same data. Most uses of "buffer" imply that the data will be drained or discarded after a single use (although this isn't necessarily the case), whereas "cache" implies that the data will be reused multiple times. Caching also often implies that the data is stored while it is simultaneously being used, although this isn't necessarily the case (as in pre-fetching and the like), whereas buffering implies that the data is being stored up for later use.
There is certainly a large overlap in both implementation and usage, however.
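As an illustration of that distinction (a hypothetical Python sketch, not from the answer): buffered data is drained exactly once, while cached data is served again on every later request.

    from functools import lru_cache

    # Buffer-style usage: each item is consumed exactly once and then discarded.
    pending_chunks = ["chunk-a", "chunk-b", "chunk-c"]
    while pending_chunks:
        chunk = pending_chunks.pop(0)       # gone from the buffer after this single use
        print("writing", chunk)

    # Cache-style usage: the same result is reused on every later request.
    @lru_cache(maxsize=None)
    def expensive_lookup(key):
        print("computing", key)             # runs only on a cache miss
        return key.upper()

    expensive_lookup("foo")                 # miss: computed once
    expensive_lookup("foo")                 # hit: served from the cache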
3
One important difference between a cache and a buffer is:
A buffer is part of primary memory. Buffers are structures that exist in, and are accessed from, primary memory (RAM).
On the other hand, a cache is a separate physical memory in a computer's memory hierarchy.
A buffer is also sometimes called a "buffer cache". This name stresses the fact that a buffer is used much like a cache, i.e. to store data, while the difference lies in the context of its usage.
Buffers are used for temporarily storing data while it is moved from one object to another. For example, when a video is streamed from the Internet to our PC for display, buffers are used to store the frames of the video that will be displayed next (this improves the QoS, since the video runs smoothly after a successful buffering process). Another example is writing data to our files: newly written data is not copied to secondary memory instantaneously. The changes are stored in a buffer and then, according to the designed policy, flushed back to the file on secondary memory (the hard disk).
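A simple write-back buffer sketch of that second example, under the assumption that the flush policy is "every four writes"; the class and file name are made up for illustration.

    # Write-back buffer sketch: writes collect in memory and only reach the
    # "file" when the (made-up) policy triggers a flush - here, every 4 writes.

    class BufferedFile:
        def __init__(self, path, flush_every=4):
            self.path = path
            self.flush_every = flush_every
            self.pending = []                    # the write buffer

        def write(self, data):
            self.pending.append(data)            # the change lands in the buffer first
            if len(self.pending) >= self.flush_every:
                self.flush()

        def flush(self):
            with open(self.path, "a") as f:      # only now does data reach the disk
                f.writelines(self.pending)
            self.pending.clear()

    f = BufferedFile("example.txt")
    for line in ["a\n", "b\n", "c\n", "d\n"]:
        f.write(line)                            # the fourth write triggers a flush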
Caches, on the other hand, are used between primary memory and the processor, to bridge the gap between the speed of the RAM and that of the processor. The most frequently accessed data is also stored in the cache to reduce accesses to RAM.
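A tiny least-recently-used (LRU) sketch of that "keep the hot data close" idea; real CPU caches are implemented in hardware, so this is only an analogy with made-up names.

    from collections import OrderedDict

    # Keep the most recently used entries close at hand; evict the least recently
    # used one when full - roughly the policy hardware caches approximate.

    class LRUCache:
        def __init__(self, capacity=3):
            self.capacity = capacity
            self.entries = OrderedDict()

        def get(self, key, load):
            if key in self.entries:
                self.entries.move_to_end(key)         # mark as recently used
            else:
                if len(self.entries) >= self.capacity:
                    self.entries.popitem(last=False)  # evict the least recently used
                self.entries[key] = load(key)         # miss: fall back to slower "RAM"
            return self.entries[key]

    ram = {addr: "value at %d" % addr for addr in range(10)}
    cache = LRUCache()
    print(cache.get(1, ram.get))    # miss: loaded from the "RAM" dict
    print(cache.get(1, ram.get))    # hit: served from the cache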
-1 you write "cache is a separate physical memory" <--- No. Not necessarily. IE stores its cache on the HDD and no doubt loads it into RAM http://stackoverflow.com/questions/854412/internet-explorer-cache-location I don't think C code can specify to load it into the physical memory known as cache. What gets put in that cache is more of a low-level thing, maybe only the OS can specify. But it's still called a cache even though it's in RAM. And the web server Squid can set up a cache; no reason to think that's all in physical cache memory or needs to be. – barlop – 2015-12-16T04:15:14.847
Caching is a function, it doesn't have to be in special memory – barlop – 2015-12-16T04:15:38.773
1
Common thing: both are intermediary data storage components (software or hardware) between computation and "main" storage.
To me the difference is the following:
Buffer:
Cache:
http://stackoverflow.com/questions/6345020/what-is-the-difference-between-buffer-vs-cache-memory-in-linux – Ciro Santilli 新疆改造中心法轮功六四事件 – 2017-04-03T06:39:45.037