HashMap implementation principle, expansion mechanism, and summary of common interview questions

Whether you are interviewing for an Android or a Java position, you will be asked how HashMap works and how it is implemented. This article summarizes it.

1. HashMap (array + linked list + red-black tree) principle

At the bottom layer, HashMap uses an array whose elements are of type Node (or TreeNode). Each slot in the table is also called a hash bucket: elements whose keys hash to the same table index are stored in the same bucket (the hash here is the value computed from the key). To handle multiple elements at the same index (a collision), HashMap resolves conflicts with the zipper method (separate chaining) and, since JDK 1.8, red-black trees.
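As a quick illustration of separate chaining, here is a hedged sketch (the class FixedHashKey is invented for the demo): every key reports the same hashCode, so all entries land in one bucket, yet lookups still succeed because equals() distinguishes the keys within the chain.

 import java.util.HashMap;
 import java.util.Map;

public class CollisionDemo {
    // A deliberately bad key: every instance hashes to the same value,
    // so all entries share one bucket and are chained together
    static final class FixedHashKey {
        final String name;
        FixedHashKey(String name) { this.name = name; }
        @Override public int hashCode() { return 42; }      // force collisions
        @Override public boolean equals(Object o) {
            return o instanceof FixedHashKey
                    && ((FixedHashKey) o).name.equals(name); // equals() resolves the chain
        }
    }

    public static void main(String[] args) {
        Map<FixedHashKey, Integer> map = new HashMap<>();
        map.put(new FixedHashKey("a"), 1);
        map.put(new FixedHashKey("b"), 2);
        // Both keys share a bucket, yet each value is still found via equals()
        System.out.println(map.get(new FixedHashKey("a"))); // 1
        System.out.println(map.get(new FixedHashKey("b"))); // 2
    }
}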

1. The meaning of data structure parameters

 //Threshold (capacity * load factor). When the key-value pairs in the HashMap exceed this value, the HashMap will expand.
int threshold;
// The load factor of the hash table describes how full the HashMap is. A value close to 0 indicates that it is very empty, and a value close to 1 indicates that it is fully filled.
final float loadFactor;
// Default initial capacity - must be a power of 2, default value is 16.
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4;
// The maximum number of elements that can be stored. The default value is 1 << 30
static final int MAXIMUM_CAPACITY = 1 << 30;
// The default load factor is 0.75: expansion is triggered once the number of elements exceeds 75% of the capacity
static final float DEFAULT_LOAD_FACTOR = 0.75f;
// Bucket tree threshold: the threshold for converting a linked list into a red-black tree. When storing data, if the linked list length > this value, the linked list will be converted into a red-black tree.
static final int TREEIFY_THRESHOLD = 8;
// Bucket linked list restoration threshold: when, during a resize split, a tree part contains no more than this many nodes (<= 6), the red-black tree is converted back into a linked list.
static final int UNTREEIFY_THRESHOLD = 6;
// Minimum treeify capacity threshold: the table length must be at least this value (64) before a linked list may be converted into a red-black tree; otherwise the map is resized instead
static final int MIN_TREEIFY_CAPACITY = 64;
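One practical consequence of threshold = capacity * loadFactor is the common sizing idiom below; a minimal sketch, assuming the expected entry count is known up front (the class name PresizeDemo is illustrative):

 import java.util.HashMap;
 import java.util.Map;

public class PresizeDemo {
    public static void main(String[] args) {
        int expected = 1000;
        // threshold = capacity * 0.75, so request enough capacity that
        // 'expected' entries stay below the threshold and no resize occurs
        int initialCapacity = (int) (expected / 0.75f) + 1; // 1334, rounded up to 2048 internally
        Map<String, Integer> map = new HashMap<>(initialCapacity);
        for (int i = 0; i < expected; i++) {
            map.put("key" + i, i);
        }
        System.out.println(map.size()); // 1000, inserted without any resize
    }
}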

2. Constructors

 // Construct an empty HashMap with the default initial capacity
public HashMap() {
// Load factor, the default value is 0.75f
this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}
// Construct an empty HashMap using the specified initial value
public HashMap(int initialCapacity) {
// Initial capacity and load factor
this(initialCapacity, DEFAULT_LOAD_FACTOR);
}
// Construct an empty HashMap with the specified initial capacity and load factor
public HashMap(int initialCapacity, float loadFactor) {
if (initialCapacity < 0)
throw new IllegalArgumentException("Illegal initial capacity: " +
initialCapacity);
// When the initial capacity is greater than the maximum initial capacity, the initial capacity is the maximum initial capacity
if (initialCapacity > MAXIMUM_CAPACITY)
initialCapacity = MAXIMUM_CAPACITY;
if (loadFactor <= 0 || Float.isNaN(loadFactor))
throw new IllegalArgumentException("Illegal load factor: " +
loadFactor);
this.loadFactor = loadFactor;
this.threshold = tableSizeFor(initialCapacity);
}
// Returns the smallest power of two that is greater than or equal to the given cap
static final int tableSizeFor(int cap) {
int n = cap - 1;
n |= n >>> 1;
n |= n >>> 2;
n |= n >>> 4;
n |= n >>> 8;
n |= n >>> 16;
return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
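To see the bit-smearing at work, here is a hedged, runnable copy of the method (it is package-private in the JDK, so the demo re-declares it) with a few sample inputs:

public class TableSizeForDemo {
    // Copy of HashMap.tableSizeFor: smears the highest set bit of (cap - 1)
    // into all lower bits, then adds 1 to reach the next power of two
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : (n >= 1 << 30) ? 1 << 30 : n + 1;
    }

    public static void main(String[] args) {
        System.out.println(tableSizeFor(16));   // 16 (already a power of two)
        System.out.println(tableSizeFor(17));   // 32
        System.out.println(tableSizeFor(1000)); // 1024
    }
}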

3. Node and TreeNode

In a HashMap, what is stored is not the raw key-value pair passed to put, but a Node that wraps it. There is also a TreeNode type, a subclass of Node used for red-black tree bins; nodes are converted between the two forms when a bin is treeified or untreeified.

 static class Node<K,V> implements Map.Entry<K,V> {
final int hash;
final K key;
V value;
Node<K,V> next;
Node(int hash, K key, V value, Node<K,V> next) {
this.hash = hash;
this.key = key;
this.value = value;
this.next = next;
}
}

4. Hash calculation and subscript determination

 /**
* Calculate the hash value of the key:
* 1. If the key is null, the hash value is 0
* 2. If the key is not null, perform XOR calculation on the key's hashCode value and the high 16 bits (XOR: 0 for the same and 1 for different)
*/
static final int hash(Object key) {
int h;
return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
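A small sketch of this perturbation, reusing the same expression (the demo class and its assumed capacity of 16 are illustrative):

public class HashDemo {
    // Same mixing as HashMap.hash(): XOR the high 16 bits of hashCode into the low 16
    static int hash(Object key) {
        int h;
        return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
    }

    public static void main(String[] args) {
        String key = "hello";
        int h = key.hashCode();
        int perturbed = hash(key);
        int capacity = 16; // assume the default table size
        System.out.printf("hashCode: %32s%n", Integer.toBinaryString(h));
        System.out.printf("hash():   %32s%n", Integer.toBinaryString(perturbed));
        // The bucket index uses only the low bits: (capacity - 1) & hash
        System.out.println("bucket index: " + ((capacity - 1) & perturbed));
    }
}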

5. Resize

 final Node<K,V>[] resize() {
Node<K,V>[] oldTab = table;
int oldCap = (oldTab == null) ? 0 : oldTab.length;
// threshold is the value that triggers expansion (expansion occurs when size exceeds capacity * loadFactor)
int oldThr = threshold;
int newCap, newThr = 0;
// oldCap > 0 means the table has already been allocated, i.e. this is not the first resize
if (oldCap > 0) {
// If the current capacity is greater than the maximum capacity, the threshold is set to the maximum integer value and no copy operation is performed
if (oldCap >= MAXIMUM_CAPACITY) {
threshold = Integer.MAX_VALUE;
return oldTab;
}
else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
oldCap >= DEFAULT_INITIAL_CAPACITY)
// If 2 times the old capacity does not exceed the maximum allowed capacity, and the old capacity reaches the default initial capacity of 16, the new expansion threshold is set to 2 times the old capacity
newThr = oldThr << 1; // double threshold
}
else if (oldThr > 0) // initial capacity was placed in threshold
// The map was created with HashMap(capacity) or HashMap(capacity, loadFactor)
// This is the first resize: the new capacity is taken from threshold, which holds tableSizeFor(initialCapacity)
newCap = oldThr;
else { // zero initial threshold signifies using defaults
// For the first expansion, create a map using new HashMap(), using the default capacity and load factor
newCap = DEFAULT_INITIAL_CAPACITY;
newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
}
// Set the threshold for the next expansion
if (newThr == 0) {
float ft = (float)newCap * loadFactor;
newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
(int)ft : Integer.MAX_VALUE);
}
threshold = newThr;
@SuppressWarnings({"rawtypes","unchecked"})
// Apply for a new array
Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
table = newTab;
// The following is to copy the elements in the old array to the newly applied array
// Because the index calculation method of the node in the old array is: oldIndex=(oldCapacity - 1) & key.hash,
// When the capacity of the array changes, the node index needs to be re-determined. There are two possibilities for the new node position:
// 1. newIndex = oldIndex, the index is unchanged, when (key.hash & oldCapacity) == 0
// 2. newIndex = oldIndex + oldCapacity, when (key.hash & oldCapacity) != 0
if (oldTab != null) {
// Traverse the old array (oldCap length)
for (int j = 0; j < oldCap; ++j) {
Node<K,V> e;
if ((e = oldTab[j]) != null) {
oldTab[j] = null;
if (e.next == null)
// Determine the new position and save it
newTab[e.hash & (newCap - 1)] = e;
else if (e instanceof TreeNode)
// Split the nodes of the red-black tree, store each node in the tree in a new position, and determine whether the tree needs to be converted to a linked list
((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
else { // preserve order
// Traverse the linked list and divide it into two parts, one part (loHead) has the same index, and the other part (hiHead) has the new index of oldIndex+oldCapacity
// Then put the linked list into the corresponding array
Node<K,V> loHead = null, loTail = null;
Node<K,V> hiHead = null, hiTail = null;
Node<K,V> next;
do {
next = e.next;
if ((e.hash & oldCap) == 0) {
if (loTail == null)
loHead = e;
else
loTail.next = e;
loTail = e;
}
else {
if (hiTail == null)
hiHead = e;
else
hiTail.next = e;
hiTail = e;
}
} while ((e = next) != null);
if (loTail != null) {
loTail.next = null;
newTab[j] = loHead;
}
if (hiTail != null) {
hiTail.next = null;
newTab[j + oldCap] = hiHead;
}
}
}
}
}
return newTab;
}
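The lo/hi split rule can be checked in isolation; a minimal sketch assuming oldCap = 16 doubling to 32:

public class ResizeIndexDemo {
    public static void main(String[] args) {
        int oldCap = 16, newCap = 32;
        // Two hashes that share bucket 5 in the old table (hash & 15 == 5)
        int[] hashes = {5, 21};
        for (int hash : hashes) {
            int oldIndex = hash & (oldCap - 1);
            int newIndex = hash & (newCap - 1);
            // (hash & oldCap) == 0 -> index unchanged
            // (hash & oldCap) != 0 -> index = oldIndex + oldCap
            System.out.printf("hash=%d old=%d new=%d bit=%d%n",
                    hash, oldIndex, newIndex, hash & oldCap);
        }
        // Output: hash=5  old=5 new=5  bit=0
        //         hash=21 old=5 new=21 bit=16
    }
}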

Regarding expansion: in Java 7's HashMap, if multiple threads resize the map at the same time, the linked list can end up with a cycle. Java 7 used only an array plus linked lists and inserted new nodes at the head of the list, so node order was reversed during a resize. Java 8 inserts at the tail, so node order is preserved during a resize, which avoids the cycle problem. This does not make Java 8's HashMap safe for concurrent modification, however: many of its internal operations are not atomic (for example, two threads inserting at the same time race on size++, which has no atomicity guarantee).
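A hedged illustration of that non-atomicity (the thread counts and key ranges are arbitrary): two threads putting disjoint keys into a plain HashMap frequently lose entries, while ConcurrentHashMap handles the same workload correctly.

 import java.util.HashMap;
 import java.util.Map;
 import java.util.concurrent.ConcurrentHashMap;

public class UnsafeConcurrencyDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<Integer, Integer> unsafe = new HashMap<>(); // not thread-safe
        Thread t1 = new Thread(() -> { for (int i = 0; i < 10_000; i++) unsafe.put(i, i); });
        Thread t2 = new Thread(() -> { for (int i = 10_000; i < 20_000; i++) unsafe.put(i, i); });
        t1.start(); t2.start();
        t1.join(); t2.join();
        // Frequently prints less than 20000: racing puts and size++ lose updates;
        // results vary from run to run, and the map may even be left corrupted
        System.out.println("HashMap size: " + unsafe.size());

        // The same workload on ConcurrentHashMap reliably yields 20000
        Map<Integer, Integer> safe = new ConcurrentHashMap<>();
        Thread t3 = new Thread(() -> { for (int i = 0; i < 10_000; i++) safe.put(i, i); });
        Thread t4 = new Thread(() -> { for (int i = 10_000; i < 20_000; i++) safe.put(i, i); });
        t3.start(); t4.start();
        t3.join(); t4.join();
        System.out.println("ConcurrentHashMap size: " + safe.size()); // 20000
    }
}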

6. Split of red-black tree during resize

Like a linked list, the elements in a red-black tree also need to determine the new index position one by one. It is also divided into two parts, one part with the unchanged index and the other part with the new index of oldIndex+oldCapacity.

Note: split is a method of the internal class TreeNode in HashMap, not a method of HashMap.

 /**
* When expanding capacity, the elements in the same hash bucket (red-black tree) are split, possibly into two parts
* part 1: (node.hash & oldCapacity) == 0 -> after the move, the index in the new table is the same as in the old table
* part 2: (node.hash & oldCapacity) != 0 -> after the move, the new index is oldIndex + oldCapacity
* After the split, each part is checked: if its node count does not exceed UNTREEIFY_THRESHOLD (default 6), it is converted back into a linked list.
*
* @param map hashMap instance itself
* @param tab the newly allocated array after expansion
* @param index The subscript index to be split this time (corresponding to the old array)
* @param bit old array capacity
*/
final void split(HashMap<K,V> map, Node<K,V>[] tab, int index, int bit) {
TreeNode<K,V> b = this;
// Relink into lo and hi lists, preserving order
// loHead links the node with unchanged index
TreeNode<K,V> loHead = null, loTail = null;
// hiHead links the node whose index has changed
TreeNode<K,V> hiHead = null, hiTail = null;
int lc = 0, hc = 0;
for (TreeNode<K,V> e = b, next; e != null; e = next) {
next = (TreeNode<K,V>)e.next;
e.next = null;
// If (e.hash & bit) == 0 (bit is the old capacity), the index after expansion is the same as in the old table
if ((e.hash & bit) == 0) {
if ((e.prev = loTail) == null)
loHead = e;
else
loTail.next = e;
loTail = e;
++lc;
} else {
// Otherwise (e.hash & bit) != 0, and the index after expansion is oldIndex + oldCapacity
if ((e.prev = hiTail) == null)
hiHead = e;
else
hiTail.next = e;
hiTail = e;
++hc;
}
}
if (loHead != null) {
if (lc <= UNTREEIFY_THRESHOLD)
tab[index] = loHead.untreeify(map);
else {
tab[index] = loHead;
// If the high list is also non-empty, the bucket really was split; this low part has more than 6 nodes, so it must be rebuilt (re-treeified). If hiHead were null, the whole tree would have moved intact and would already be treeified
if (hiHead != null) // (else is already treeified)
loHead.treeify(tab);
}
}
if (hiHead != null) {
if (hc <= UNTREEIFY_THRESHOLD)
tab[index + bit] = hiHead.untreeify(map);
else {
tab[index + bit] = hiHead;
// Symmetrically: if the low list is also non-empty, the bucket was split, and this high part, with more than 6 nodes, must be re-treeified
if (loHead != null)
hiHead.treeify(tab);
}
}
}

7. Linked list to red-black tree

 final void treeifyBin(Node<K,V>[] tab, int hash) {
int n, index; Node<K,V> e;
// If the table is null or its length is less than MIN_TREEIFY_CAPACITY (default 64), resize instead of treeifying
// The slots of the underlying array are the hash buckets: with fewer than 64 buckets, expansion is preferred over tree conversion
if (tab == null || (n = tab.length) < MIN_TREEIFY_CAPACITY)
resize();
else if ((e = tab[index = (n - 1) & hash]) != null) {
TreeNode<K,V> hd = null, tl = null;
do {
// Convert linked list nodes to red-black tree nodes
TreeNode<K,V> p = replacementTreeNode(e, null);
if (tl == null)
hd = p;
else {
p.prev = tl;
tl.next = p;
}
tl = p;
} while ((e = e.next) != null);
//Convert the red-black tree operation
if ((tab[index] = hd) != null)
hd.treeify(tab);
}
}

8. Put operation

 public V put(K key, V value) {
return putVal(hash(key), key, value, false, true);
}
final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
boolean evict) {
Node<K,V>[] tab; Node<K,V> p; int n, i;
// The table is created lazily: on the first put it is null/empty and resize() allocates it; n is the resulting capacity
if ((tab = table) == null || (n = tab.length) == 0)
n = (tab = resize()).length;
if ((p = tab[i = (n - 1) & hash]) == null)
// If there is no other item at the location to be placed (no conflict), place it directly at that location
tab[i] = newNode(hash, key, value, null);
else {
// After calculation, there are already other items in the position to be inserted, and conflicts need to be resolved (zipper method or red-black tree)
Node<K,V> e; K k;
if (p.hash == hash &&
((k = p.key) == key || (key != null && key.equals(k))))
// p is the first node of the bucket; if both the hash and the key match, the mapping already exists and its value will be overwritten below
e = p;
else if (p instanceof TreeNode)
// If p points to TreeNode, that is, the node stored in the red-black tree, then add the new element to the red-black tree
e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
else {
// If p points to the head node of the linked list, the new node is inserted to the end using the tail insertion method (if the same node is found during the traversal, it will be overwritten)
for (int binCount = 0; ; ++binCount) {
// Go to the end node
if ((e = p.next) == null) {
// Insert the new node at the end
p.next = newNode(hash, key, value, null);
// Determine whether the length of the linked list reaches the threshold for tree conversion. If so, convert the linked list into a red-black tree
if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
// Note that it does not necessarily convert to a red-black tree, but also depends on the length of the tab. When tab.length < MIN_TREEIFY_CAPACITY, it still takes the approach of expansion instead of treeization
treeifyBin(tab, hash);
break;
}
// If it is an existing node, the loop will be interrupted and the value will be overwritten later
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k))))
break;
p = e;
}
}
// If the data already exists, overwrite it
if (e != null) { // existing mapping for key
V oldValue = e.value;
if (!onlyIfAbsent || oldValue == null)
e.value = value;
afterNodeAccess(e);
return oldValue;
}
}
// Increment the modification count (used by iterators for fail-fast behavior)
++modCount;
if (++size > threshold)
resize();
afterNodeInsertion(evict);
return null;
}

9. Get operation

 public V get(Object key) {
Node<K,V> e;
return (e = getNode(hash(key), key)) == null ? null : e.value;
}
/**
* When getting, the most important thing is to first find the bucket location based on the hash value of the key, and then search based on the key
*/
final Node<K,V> getNode(int hash, Object key) {
Node<K,V>[] tab; Node<K,V> first, e; int n; K k;
// If the table is non-empty and the bucket computed from the hash is non-null, search it; otherwise the key is absent and null is returned
if ((tab = table) != null && (n = tab.length) > 0 &&
(first = tab[(n - 1) & hash]) != null) {
// Determine whether the first node is the element to be found based on hash and key. If so, return the first node
if (first.hash == hash && // always check first node
((k = first.key) == key || (key != null && key.equals(k))))
return first;
if ((e = first.next) != null) {
// If the node is a red-black tree node type, traverse the red-black tree to search
if (first instanceof TreeNode)
return ((TreeNode<K,V>)first).getTreeNode(hash, key);
// Traverse the linked list to search
do {
if (e.hash == hash &&
((k = e.key) == key || (key != null && key.equals(k))))
return e;
} while ((e = e.next) != null);
}
}
return null;
}

10. Remove operation

There are two remove interfaces: remove(key) and remove(key,value). Both of them call a removeNode method, as follows:

 public V remove(Object key) {
Node<K,V> e;
return (e = removeNode(hash(key), key, null, false, true)) == null ?
null : e.value;
}
// Implemented the remove method in the Map interface
@Override
public boolean remove(Object key, Object value) {
return removeNode(hash(key), key, value, true, true) != null;
}
final Node<K,V> removeNode(int hash, Object key, Object value,
boolean matchValue, boolean movable) {
Node<K,V>[] tab; Node<K,V> p; int n, index;
// If the map is non-empty and the bucket at the hashed index is non-null, search it; otherwise the key is treated as not found and null is returned
if ((tab = table) != null && (n = tab.length) > 0 &&
(p = tab[index = (n - 1) & hash]) != null) {
Node<K,V> node = null, e; K k; V v;
// Check if the first node matching the hash address matches. If both the hash and the key match, it means the element to be deleted has been found.
if (p.hash == hash &&
((k = p.key) == key || (key != null && key.equals(k))))
node = p;
else if ((e = p.next) != null) {
//Traverse the red-black tree
if (p instanceof TreeNode)
node = ((TreeNode<K,V>)p).getTreeNode(hash, key);
else {
// Traverse the linked list
do {
if (e.hash == hash &&
((k = e.key) == key ||
(key != null && key.equals(k)))) {
node = e;
break;
}
p = e;
} while ((e = e.next) != null);
}
}
// If node is null, it means that the element corresponding to key is not found
// If node is not null, it is determined by calling remove(key) or remove(key,value)
if (node != null && (!matchValue || (v = node.value) == value ||
(value != null && value.equals(v)))) {
// The node to be deleted matches; if it is a tree node type, delete the node from the tree
if (node instanceof TreeNode)
((TreeNode<K,V>)node).removeTreeNode(this, tab, movable);
else if (node == p)
// When the node to be deleted is the first node, directly move the next node of the head node forward one position (the old head node is deleted)
tab[index] = node.next;
else
// Non-head node, modify the pointer and assign the next node to the parent node's next
p.next = node.next;
// The number of modifications increases by one, and the number of elements decreases by one
++modCount;
--size;
afterNodeRemoval(node);
return node;
}
}
return null;
}

2. Common interview questions about HashMap

1. Talk about your understanding of HashMap

  • HashMap stores key-value pairs, and keys are unique. It handles hash conflicts with the chain address method (separate chaining). When an element is added, the hash of its key is computed and reduced modulo the array length to obtain the element's position in the array.
  • The underlying data structure of HashMap has undergone major changes in JDK1.8. Before 1.8, it used an array plus a linked list data structure, while 1.8 uses an array plus a linked list plus a red-black tree data structure.
  • HashMap is not thread-safe. For thread safety, use Hashtable or ConcurrentHashMap.
  • The hash() and resize() methods have also undergone major changes in version 1.8, improving performance.
  • Both keys and values can be null, but only one null key is allowed. A null key hashes to 0 and is stored in table[0] (see the sketch below).
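A minimal sketch of the null-key behavior described above:

 import java.util.HashMap;
 import java.util.Map;

public class NullKeyDemo {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        map.put(null, "first");   // hash(null) == 0, stored in bucket 0
        map.put(null, "second");  // same null key: the value is overwritten
        map.put("k", null);       // null values are allowed for any number of keys
        map.put("k2", null);
        System.out.println(map.get(null)); // second
        System.out.println(map.size());    // 3
    }
}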

2. Some parameters of HashMap

 //The default initial length of HashMap is 16
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4;
//The maximum length of HashMap is 2 to the power of 30
static final int MAXIMUM_CAPACITY = 1 << 30;
//The default load factor of HashMap is 0.75
static final float DEFAULT_LOAD_FACTOR = 0.75f;
//The critical value of upgrading the HashMap linked list to a red-black tree
static final int TREEIFY_THRESHOLD = 8;
//The critical value of HashMap red-black tree degenerating into a linked list
static final int UNTREEIFY_THRESHOLD = 6;
//HashMap linked list upgraded to red-black tree second condition: the length of HashMap array (bucket) is greater than or equal to 64
static final int MIN_TREEIFY_CAPACITY = 64;
//HashMap underlying Node bucket array
transient Node<K,V>[] table;
// Expansion threshold. When the number of elements in your hashmap exceeds this threshold, expansion will occur.
//threshold = capacity * loadFactor
int threshold;

3. Why must the length of a HashMap be a power of 2?

  • When calculating a node's index, the key's hash value must be reduced modulo the table length. Since a hardware remainder operation is comparatively expensive, the remainder is converted into another, cheaper operation.
  • When n is a power of 2, the identity (n - 1) & hash == hash % n holds (for non-negative hash values), so a bitwise AND can replace the remainder operation, which is more efficient (verified in the sketch below).
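A quick verification of that identity; a minimal sketch assuming n = 16:

public class PowerOfTwoDemo {
    public static void main(String[] args) {
        int n = 16; // a power of two, as HashMap capacities always are
        for (int hash = 0; hash < 100; hash++) {
            // For non-negative hash values the two expressions agree;
            // the & form also handles negative hashes, which % would not
            if ((hash % n) != ((n - 1) & hash)) {
                throw new AssertionError("mismatch at " + hash);
            }
        }
        System.out.println("(n - 1) & hash == hash % n for all tested values");
        // With n = 17 (not a power of two), (n - 1) & hash = hash & 16
        // keeps only a single bit and the identity breaks down
    }
}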

4. Why does HashMap perform bitwise operations when obtaining hash values?

To put it another way: can we use the key's hashCode directly to calculate the storage index?

 static final int hash(Object key) {
int h;
return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}

If we used the hashCode to compute the array index directly, only its low bits would participate in the calculation and the high bits would have no effect. The idea is therefore to let the high bits of hashCode participate as well, further reducing the probability of hash collisions and spreading the data more evenly. This operation is called perturbation.

(h >>> 16) is an unsigned right shift of 16 bits, with 0 added to the right to get the high 16 bits of hashCode.

(h = key.hashCode()) ^ (h >>> 16) XORs the hashCode with its own high 16 bits, folding high-order information into the low bits; this minimizes hash conflicts and improves performance.

Viewed from this angle: the array length of a HashMap rarely exceeds 2 to the power of 16, so in most cases the high 16 bits of hashCode never reach the index calculation at all. XOR-ing the key's hashCode with its own high 16 bits lets the high-order bits influence the index, reducing the probability of collisions and making the data distribution more even (see the sketch below).
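A sketch that makes the effect concrete: two hash codes differing only in their high 16 bits collide without perturbation but separate with it (the literal values are arbitrary):

public class PerturbationDemo {
    static int hash(int h) { return h ^ (h >>> 16); } // same mixing as HashMap.hash()

    public static void main(String[] args) {
        int capacity = 16;
        // Two hash codes that differ only in their high 16 bits
        int h1 = 0x0000_0001;
        int h2 = 0x7FFF_0001;
        // Without perturbation both collapse into the same bucket:
        System.out.println((capacity - 1) & h1); // 1
        System.out.println((capacity - 1) & h2); // 1
        // With perturbation the high bits influence the index:
        System.out.println((capacity - 1) & hash(h1)); // 1
        System.out.println((capacity - 1) & hash(h2)); // 14, no longer colliding
    }
}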

5. What are the differences between HashMap in JDK1.7 and JDK1.8? The underlying implementation of HashMap

In Java, two relatively simple data structures for storing data are arrays and linked lists. Arrays are easy to address but hard to insert into and delete from; linked lists are the opposite. Combining the two exploits the advantages of both, and the resulting technique for resolving hash conflicts is called the zipper method.

JDK1.8 mainly solves or optimizes the following issues:

  • Optimized resize() and the hash calculation;
  • Introduced the red-black tree to prevent a single over-long linked list from degrading query efficiency (see red-black tree algorithms);
  • Fixed the multi-threaded infinite-loop problem, although HashMap is still not thread-safe and may lose data when modified concurrently.

6. What is the specific process of HashMap’s put method?

The source code is shown above.

HashMap is lazy loaded and the array is created only when it is put for the first time.

  ① Check whether the table array is empty or null; if so, execute resize() to create it;
  ② Compute the insertion index i from the key's hash. If table[i] == null, create a new node directly and go to ⑥; if table[i] is not empty, go to ③;
  ③ Check whether the first element of table[i] has the same key as the one being inserted (same hashCode and equals); if so, overwrite the value directly, otherwise go to ④;
  ④ Check whether table[i] is a TreeNode, i.e. whether the bucket is a red-black tree; if so, insert the key-value pair into the tree, otherwise go to ⑤;
  ⑤ Traverse the linked list at table[i], tracking its length. If a node with the same key is found, overwrite its value directly; otherwise insert a new node at the tail. After insertion, if the list length reaches the treeify threshold of 8, consider conversion: if the array length is less than 64, just resize; otherwise convert the list into a red-black tree;
  ⑥ After a successful insertion, check whether the number of key-value pairs exceeds the threshold; if so, expand the capacity (a small demo of the overwrite behavior follows).
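A small demo of the overwrite path described in steps ③ and ⑤ (putIfAbsent is shown to illustrate the onlyIfAbsent flag of putVal; the demo values are arbitrary):

 import java.util.HashMap;
 import java.util.Map;

public class PutDemo {
    public static void main(String[] args) {
        Map<String, Integer> map = new HashMap<>();
        System.out.println(map.put("a", 1));  // null: no previous mapping
        System.out.println(map.put("a", 2));  // 1: old value returned, then overwritten
        System.out.println(map.get("a"));     // 2
        // putIfAbsent routes through the same putVal with onlyIfAbsent = true
        System.out.println(map.putIfAbsent("a", 3)); // 2: existing value kept
    }
}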

7. What is the specific process of HashMap’s get method?

The source code is shown above.

  • First, compute the key's hash value via the hash() method;
  • Then obtain the Node array index via hash & (length - 1), where length is the array length;
  • If the table is empty or the bucket at that index is null, return null directly. Otherwise, check whether the first node is the one being sought (matching hash and key) and return it if so;
  • If more nodes follow, the bucket's structure decides the search: a red-black tree is searched with the corresponding getTreeNode() operation;
  • A linked list is searched sequentially, comparing keys with == and equals() at each node; a match returns that node directly, and if the whole list is traversed without a match, null is returned.

8. How is the expansion operation of HashMap implemented?

Regardless of JDK1.7 or JDK1.8, when the put method is executed, if the table is empty, the resize() method is executed to expand the capacity, and the default length is 16;

JDK1.7 expansion

Conditions: For expansion to occur, two conditions must be met at the same time

  • The current storage quantity is greater than or equal to the threshold;
  • A hash collision occurs;

Features: Expand capacity first, then add (head insertion method used for expansion). Head insertion method will reverse the linked list and may cause an infinite loop in a multi-threaded environment.

JDK1.8 expansion

Conditions:

  • The number of stored entries exceeds the threshold after the new element is counted;
  • A linked list has reached length 8 while the array length is still less than 64 (in which case a resize is performed instead of treeification).

Features: insert first, then decide whether expansion is needed (tail insertion is used when expanding).

Disadvantage: under multi-threading, JDK 1.8 can still lose updates through data overwriting.

9. Conditions for upgrading linked lists to red-black trees

The upgrade to a red-black tree is considered only when the linked list length is greater than 8, and additionally the length of the HashMap Node array must be greater than or equal to 64 (if this is not satisfied, a resize is performed instead of the upgrade); see the sketch below.
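A hedged sketch of these conditions from the outside (BadKey is invented; the treeification itself is internal and not directly observable without reflection): a constant hashCode drives every entry into one bucket, so the first few threshold hits trigger resizes, and treeification happens once the table reaches 64 slots.

 import java.util.HashMap;
 import java.util.Map;

public class TreeifyDemo {
    // Constant hashCode forces every key into one bucket; Comparable lets the
    // red-black tree order the keys once the bucket is treeified
    static final class BadKey implements Comparable<BadKey> {
        final int id;
        BadKey(int id) { this.id = id; }
        @Override public int hashCode() { return 1; }
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).id == id;
        }
        @Override public int compareTo(BadKey other) { return Integer.compare(id, other.id); }
    }

    public static void main(String[] args) {
        Map<BadKey, Integer> map = new HashMap<>();
        // Early long-chain inserts only trigger resize() (table length < 64);
        // once the table reaches 64 and the chain exceeds 8, the bucket is treeified
        for (int i = 0; i < 100; i++) map.put(new BadKey(i), i);
        System.out.println(map.get(new BadKey(50))); // 50: lookups still work, now in O(log n)
    }
}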

10. Conditions for red-black trees to degenerate into linked lists

  • When resize() is called, if the number of nodes in the tree split from the red-black tree is less than or equal to the critical value of 6, it will degenerate into a linked list;
  • When deleting an element with remove(), the removeTreeNode() method checks whether the red-black tree meets the degeneration condition, regardless of node count: if the root is null, or root.right is null, or root.left is null, or root.left.left is null, the red-black tree degenerates into a linked list;

11. How does HashMap resolve hash conflicts?

  • The chain address method (separate chaining over an array of buckets) links entries that land on the same index;
  • The perturbation function (the hash() method) reduces the probability of hash collisions and spreads the data more evenly;
  • The introduction of red-black trees reduces the time complexity of traversing a long bucket, making lookups faster.
