Preface

Android caching is usually described as having three levels; the two main ones are the memory cache and the disk cache, and both are implemented with the LRU (Least Recently Used) algorithm. In this article we will work through Android's cache mechanism, from basic usage to source code analysis.

1. Introduction to the LruCache concept

1. What is LruCache?

LruCache is a cache class added in Android 3.1 (API level 12), so it can be used directly to implement a memory cache. DiskLruCache is not part of the Android SDK, but the official Android documentation recommends it for implementing a disk cache.

LruCache is a generic class. Its main principle is to keep strong references (the ordinary reference type we normally use) to recently used objects in a LinkedHashMap; when the cache is full, the least recently used objects are removed from memory. It provides get and put methods for reading and adding cache entries.

2. Use of LruCache

Using LruCache is very simple. Let's take an image cache as an example.
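Here is a minimal sketch of such a cache. The wrapper class ImageMemoryCache and its method names are illustrative and not part of the framework; only LruCache itself is from the SDK.

```java
import android.graphics.Bitmap;
import android.util.LruCache;

public class ImageMemoryCache {
    private final LruCache<String, Bitmap> mMemoryCache;

    public ImageMemoryCache() {
        // ① Use 1/8 of the memory available to the current process, measured in KB.
        int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
        int cacheSize = maxMemory / 8;

        mMemoryCache = new LruCache<String, Bitmap>(cacheSize) {
            // ② Override sizeOf so each entry is measured in the same unit (KB)
            // as the total capacity passed to the constructor.
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                return bitmap.getByteCount() / 1024;
            }
        };
    }

    public void addBitmapToMemoryCache(String key, Bitmap bitmap) {
        if (key != null && bitmap != null && mMemoryCache.get(key) == null) {
            mMemoryCache.put(key, bitmap);
        }
    }

    public Bitmap getBitmapFromMemoryCache(String key) {
        return mMemoryCache.get(key);
    }
}
```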
① Set the size of the LruCache, which is generally 1/8 of the memory available to the current process.
② Override the sizeOf method to calculate the size of each image to be cached.
Note: the total cache capacity and the size of each cached object must be expressed in the same unit.

2. Implementation principle of LruCache

The core idea of LruCache is easy to understand: it maintains a list of cached objects ordered by access. Because the underlying LinkedHashMap is configured for access order, an entry moves to the most recently used end of the list whenever it is accessed, while entries that have not been accessed drift toward the least recently used end and are the first to be evicted.

1. Constructor
When creating an LruCache object, it does the following internally:
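The constructor, lightly abridged from the android.util.LruCache source, records maxSize and creates an access-ordered LinkedHashMap:

```java
public LruCache(int maxSize) {
    if (maxSize <= 0) {
        throw new IllegalArgumentException("maxSize <= 0");
    }
    this.maxSize = maxSize;
    // accessOrder = true: the LinkedHashMap reorders entries on every access,
    // so iteration runs from least recently used to most recently used.
    this.map = new LinkedHashMap<K, V>(0, 0.75f, true);
}
```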
If we want to limit the cache by the number of entries, the default sizeOf is enough: it returns 1, meaning each added entry occupies one unit of space, so the cache can hold at most maxSize entries. Once more data is added beyond that, the least recently used values are evicted. If the cache limit is instead an amount of memory, you need to override sizeOf to return the size of the space occupied by the object:
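For reference, the default implementation in the framework source is simply:

```java
protected int sizeOf(K key, V value) {
    // Each entry costs 1, so maxSize acts as a maximum number of entries.
    return 1;
}
```

The sizeOf override in the image-cache sketch above returns the bitmap's size in KB instead, turning maxSize into a memory budget.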
2. Reading content

Reading a value corresponds to the get method; the source code is as follows:
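The body of get, lightly abridged from the framework source:

```java
public final V get(K key) {
    if (key == null) {
        throw new NullPointerException("key == null");
    }

    V mapValue;
    synchronized (this) {
        // LinkedHashMap.get() also moves the entry to the most recently used
        // position, because the map was created with accessOrder = true.
        mapValue = map.get(key);
        if (mapValue != null) {
            hitCount++;
            return mapValue;
        }
        missCount++;
    }

    // Cache miss: optionally create a value. The default create() returns null.
    V createdValue = create(key);
    if (createdValue == null) {
        return null;
    }

    synchronized (this) {
        createCount++;
        mapValue = map.put(key, createdValue);
        if (mapValue != null) {
            // A conflicting value was added while create() was running; keep it.
            map.put(key, mapValue);
        } else {
            size += safeSizeOf(key, createdValue);
        }
    }

    if (mapValue != null) {
        entryRemoved(false, key, createdValue, mapValue);
        return mapValue;
    } else {
        trimToSize(maxSize);
        return createdValue;
    }
}
```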
The get method tries to look up a value by the given key. If nothing is found, LruCache can optionally create one: override the create method to return the object that should be created for a missing key. This is also very simple to use.

3. Storing content

The put method is used to store content:
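The put method, lightly abridged from the framework source:

```java
public final V put(K key, V value) {
    if (key == null || value == null) {
        throw new NullPointerException("key == null || value == null");
    }

    V previous;
    synchronized (this) {
        putCount++;
        // Add the size of the new value first...
        size += safeSizeOf(key, value);
        previous = map.put(key, value);
        // ...and subtract the size of any value it replaced.
        if (previous != null) {
            size -= safeSizeOf(key, previous);
        }
    }

    if (previous != null) {
        entryRemoved(false, key, previous, value);
    }

    // Evict least recently used entries if the cache is now over budget.
    trimToSize(maxSize);
    return previous;
}
```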
Looking at put, we can see how the cache size is tracked: after the value is stored, size += safeSizeOf(key, value); is executed. The default implementation of safeSizeOf delegates to sizeOf, so each stored object increases size by the corresponding amount. If the key already had a value, the size of the replaced value is subtracted again, so the net change is the difference between the new and old sizes. LruCache also provides an entryRemoved method, which is called whenever an entry leaves the cache (it is removed explicitly, overwritten by a new value, or evicted because the cache exceeds its limit); the default implementation is empty. At the end of put, trimToSize is called. This is the method that actually controls the cache size: it runs every time new data is stored, and when the current size exceeds the maximum, it deletes the least recently used entries. Besides normal storage and reading, LruCache also provides a method to read all cached objects at once:
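That method is snapshot(), which in the framework source simply returns a copy of the underlying map:

```java
// Returns a copy of the current cache contents, ordered from least
// recently accessed to most recently accessed.
public synchronized final Map<K, V> snapshot() {
    return new LinkedHashMap<K, V>(map);
}
```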
4. trimToSize() method
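The implementation, lightly abridged from the framework source, keeps evicting the eldest entry until the cache fits within maxSize:

```java
public void trimToSize(int maxSize) {
    while (true) {
        K key;
        V value;
        synchronized (this) {
            if (size < 0 || (map.isEmpty() && size != 0)) {
                throw new IllegalStateException(getClass().getName()
                        + ".sizeOf() is reporting inconsistent results!");
            }

            // Stop once the cache fits within the budget.
            if (size <= maxSize || map.isEmpty()) {
                break;
            }

            // The first entry in an access-ordered LinkedHashMap is the
            // least recently used one; evict it.
            Map.Entry<K, V> toEvict = map.entrySet().iterator().next();
            key = toEvict.getKey();
            value = toEvict.getValue();
            map.remove(key);
            size -= safeSizeOf(key, value);
            evictionCount++;
        }

        entryRemoved(true, key, value, null);
    }
}
```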
Summary

LruCache maintains a LinkedHashMap sorted in access order. When put() is called, the element is added to the map and trimToSize() is invoked to check whether the cache is over its limit; if it is, the LinkedHashMap's iterator is used to remove the eldest entry, that is, the least recently accessed element. When get() is called to read a cached object, the LinkedHashMap's get() method returns the matching entry and, because the map is access-ordered, moves it to the most recently used position, so it will be evicted last.