Preface

I believe most developers have at least a rough understanding of multithreading. There were things I did not understand thoroughly in the past, and after accumulating some experience I wanted to summarize the topic for myself. I knew multithreading was important from the moment I first touched iOS, but back then my ability was limited and I could not grasp it well. If I merely restated the concepts, you could find plenty of that on the Internet, and such a summary would be meaningless. When I first conceived this article, I hoped to write it from a different, more practical perspective, so I analyzed and summarized multithreading through examples I collected from third parties and from my own work, along with the books on multithreading and memory management I read some time ago.
These concepts are easy to confuse

1. Process: a process is a container for threads. When you open an app, you start a process: QQ has its own process, and WeChat has its own. A process can contain multiple threads. If you compare a process to a highway, threads are the lanes on that highway. It is precisely because of those lanes that traffic as a whole flows more efficiently; likewise, it is because of multithreading that the system as a whole runs more efficiently.

2. Threads: threads are the entities inside a process that do work for us. Put simply, a thread is a path we open up within a process to carry out what we want done. A thread can do only one "thing" at a given point in time, while multiple threads can do multiple "things" at the same point in time. Sticking with the highway example: the highway is the process and each lane is a thread. At a toll booth, if the process has only one thread, that is, the highway has only one lane, then cars can pass only one at a time; two cars can never pass simultaneously. But if the process has multiple threads, that is, the highway has several lanes and several toll windows, then two cars can easily pay and pass at the same time. I believe this captures the relationship between process and thread.

3. Queues: a queue manages the "tasks" described below and follows the first-in-first-out (FIFO) principle. Queues are divided into the serial and concurrent kinds discussed later. There can be multiple queues on a thread.

4. Tasks: in my own plain terms, a task is the "thing" mentioned above. That "thing" must be executed on some thread (whether the current thread or another thread you have started), and you can choose to execute it synchronously or asynchronously, which is where synchronous and asynchronous execution on threads come from.
There is also a special serial queue in iOS, the main queue, which runs on a special thread: the main thread. The main thread is also called the UI thread; as the name suggests, its main job is to handle the UI in a timely manner, and only the main thread may touch the UI. Time-consuming operations should be placed on child threads (that is, newly created threads) instead. Creating a thread also costs a certain amount of memory, so do not spin up too many threads at once. The content above explains several key concepts in multithreading; if you have any questions, feel free to get in touch. The following is a table of queue/task execution combinations, and we will explain them one by one.

NSThread

NSThread actually sees quite a lot of use in daily development. Let's go through the things we need to pay attention to step by step, starting with its initialization methods.
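As a sketch, the common ways to create an NSThread look like this (`run:` is a hypothetical selector on `self`; the block-based variants require iOS 10+):

```objc
// 1. alloc/init with target/selector -- must be started manually
NSThread *thread = [[NSThread alloc] initWithTarget:self
                                           selector:@selector(run:)
                                             object:@"param"];
[thread start];

// 2. alloc/init with a block (iOS 10+)
NSThread *blockThread = [[NSThread alloc] initWithBlock:^{
    NSLog(@"running on %@", [NSThread currentThread]);
}];
[blockThread start];

// 3. Class method -- detaches and starts the thread immediately
[NSThread detachNewThreadSelector:@selector(run:) toTarget:self withObject:@"param"];

// 4. Class method with a block (iOS 10+) -- also starts immediately
[NSThread detachNewThreadWithBlock:^{
    NSLog(@"running on %@", [NSThread currentThread]);
}];
```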
Besides the four creation methods shown above, we should also pay attention to NSObject (NSThreadPerformAdditions), the category that Foundation adds to NSObject. Several of its methods come up very often:
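The most frequently used methods of that category can be sketched as follows (`run:` and `someThread` are hypothetical):

```objc
// Perform a selector on the main thread (waitUntilDone is discussed below)
[self performSelectorOnMainThread:@selector(run:)
                       withObject:@"param"
                    waitUntilDone:NO];

// Perform a selector on a specific, already-running thread
[self performSelector:@selector(run:)
             onThread:someThread
           withObject:@"param"
        waitUntilDone:NO];

// Perform a selector on a newly created background thread
[self performSelectorInBackground:@selector(run:) withObject:@"param"];
```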
Let's talk about the BOOL parameter waitUntilDone mentioned above. This parameter effectively controls whether the aSelector task is executed synchronously with respect to the calling thread: with YES, the caller blocks until aSelector has finished; with NO, it continues immediately, so the output order can differ between the two cases. Regarding NSThread, we will not go through the meanings of its remaining methods one by one; if you have other questions about NSThread, you can add me on QQ.
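A minimal sketch of the YES case, assuming `demo` is called from a background thread (`log:` is a hypothetical helper):

```objc
- (void)demo {
    NSLog(@"step 1");
    // waitUntilDone:YES -- the calling (background) thread blocks here
    // until -log: has finished running on the main thread,
    // so "step 2" always prints before "step 3".
    [self performSelectorOnMainThread:@selector(log:)
                           withObject:@"step 2"
                        waitUntilDone:YES];
    NSLog(@"step 3");
}

- (void)log:(NSString *)text {
    NSLog(@"%@", text);
}
```

With waitUntilDone:NO, "step 3" may print before "step 2", because the calling thread does not wait.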
NSOperation

We also have to mention NSOperation when talking about multithreading; it may be more powerful than we think. There is a lot to say about NSOperation, just as there was for NSThread, and each could fill an article of its own; perhaps later, as we gain more exposure to multithreading, we will give it an independent write-up. First, understand that NSOperation is built on top of GCD. We rarely use the NSOperation class by itself, since it is an abstract class; we focus more on NSInvocationOperation and NSBlockOperation, which Apple has packaged for us. You can command-click into NSOperation's header and take a look. The main points to understand are the following methods:
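A sketch of the NSOperation members worth knowing (`op1` and `op2` stand for any concrete operations, e.g. NSBlockOperation instances):

```objc
// Dependencies: op2 will not start until op1 has finished
[op2 addDependency:op1];

// Completion callback, invoked after the operation finishes
op1.completionBlock = ^{
    NSLog(@"op1 finished");
};

// State flags
BOOL executing = op1.isExecuting;
BOOL finished  = op1.isFinished;
BOOL cancelled = op1.isCancelled;

// Cancel a not-yet-started (or cooperatively cancellable) operation
[op1 cancel];

// Used without a queue, -start runs the operation synchronously
// on the current thread
[op1 start];
```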
Let's talk specifically about the two classes just mentioned, NSInvocationOperation and NSBlockOperation. First, the initialization of NSInvocationOperation:
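Its initialization, in sketch form (`task` is a hypothetical selector):

```objc
NSInvocationOperation *op =
    [[NSInvocationOperation alloc] initWithTarget:self
                                         selector:@selector(task)
                                           object:nil];

// Calling -start directly executes -task synchronously
// on the current thread -- no new thread is created.
[op start];
```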
A side note: NS_DESIGNATED_INITIALIZER marks a class's designated initializer; it is aimed at the class's internal implementation rather than at callers, and you can click into the macro to learn more. An NSInvocationOperation started on its own executes synchronously, so it is of limited value used alone; it achieves multithreading only in combination with NSOperationQueue, which we will cover in detail later. Now let's look at NSBlockOperation. Its two important methods are the following:
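The two methods, sketched:

```objc
// 1. Create the operation with an initial block
NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
    NSLog(@"block 1 on %@", [NSThread currentThread]);
}];

// 2. Append additional blocks; they may run concurrently
//    on different threads when the operation executes
[op addExecutionBlock:^{
    NSLog(@"block 2 on %@", [NSThread currentThread]);
}];

[op start]; // returns after all execution blocks have finished
```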
One more thing about NSBlockOperation: it has a maximum concurrency, and the actual maximum depends on the runtime environment; peers have summarized and verified the specifics, which we will not repeat here for reasons of space. The material above is still not quite enough for daily use, because there is another class that brings both of the above to life: NSOperationQueue. Its use is straightforward, so I will not write a dedicated demo. That's all for NSOperation; now let's focus on GCD.

Protagonist GCD -- Main Thread

1. Let's start with the main queue and the main thread. We can get the main queue through the following method:
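Two sketches for the points above: typical NSOperationQueue usage, and the call that returns the main queue:

```objc
// NSOperationQueue drives operations on threads it manages
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 3; // cap concurrency
[queue addOperationWithBlock:^{
    NSLog(@"runs on a thread managed by the queue");
}];
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    NSLog(@"runs on the main thread");
}];

// GCD: obtaining the main queue
dispatch_queue_t mainQueue = dispatch_get_main_queue();
```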
2. Executing a task synchronously on the main thread. Here are the run result and the printed information. Let's explain why executing a synchronous task on the main thread produces this result, sorting out the execution process step by step:
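A sketch of the scenario, assuming `initMainQueue` is called on the main thread:

```objc
- (void)initMainQueue {
    NSLog(@"1");
    // dispatch_sync blocks the main thread until the block runs,
    // but the block is queued behind the currently running work
    // on the main queue -- each waits for the other: deadlock.
    dispatch_sync(dispatch_get_main_queue(), ^{
        NSLog(@"2"); // never reached
    });
    NSLog(@"3");     // never reached
}
```

Only "1" is printed before the app hangs.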
Here we may think: the main queue is a serial queue, so what if, on the main thread, we create a serial queue and add a synchronous task to it? Isn't that the same as adding a synchronous task to the main queue from the main thread? What is the result? Take a look at the following print, and let's explain the execution steps as before:
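The variant with a newly created serial queue, as a sketch:

```objc
- (void)serialSyncOnMainThread {
    dispatch_queue_t serialQueue =
        dispatch_queue_create("com.demo.serial", DISPATCH_QUEUE_SERIAL);
    NSLog(@"1");
    // The block still runs on the main thread, but it belongs to
    // serialQueue, not the main queue, so nothing waits on itself.
    dispatch_sync(serialQueue, ^{
        NSLog(@"2");
    });
    NSLog(@"3");
}
// prints 1, 2, 3 in order -- no deadlock
```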
Understanding: comparing this execution process with the previous one, the difference is that the previous task was added to the main queue, while here it is added to a newly created serial queue. A task appended to the main queue is inserted at the end of that queue, so all the tasks already in the main queue must finish before it can run. But the main thread, while executing the initMainQueue method, is waiting for the synchronous task added inside that method to finish, and that synchronous task is in turn waiting for the main thread to finish first, so the two wait on each other: deadlock. When the main thread instead executes a synchronous task that is not on the main queue, it does not need to finish all of its own work before running that task; the task added to the serial queue can start right away, and since nothing waits on anything else, no deadlock occurs. This question is asked often and explained in many ways; I hope I have made it clear!

3. Main thread: one more point, the simple passing of work between threads. As mentioned earlier, the main thread is also called the UI thread, and everything UI-related should be updated on it. Time-consuming operations such as downloading data or accessing a database should be done on a child thread, and when the child thread finishes, it returns to the main thread to update the UI. This pattern deserves attention and is in fact the one used most often; I believe everyone understands it.

The protagonist GCD -- serial queue

We will not go into the conceptual details of serial queues here. Both serial queue + synchronous task and serial queue + asynchronous task are simple; if you are interested, write them yourself. The analysis below will mention their specific uses.
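The work-in-background, update-on-main pattern just described can be sketched as follows (the two helpers are hypothetical):

```objc
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // time-consuming work off the main thread,
    // e.g. downloading data or hitting the database
    NSData *data = [self fetchDataSomehow]; // hypothetical helper

    dispatch_async(dispatch_get_main_queue(), ^{
        // back on the main thread: safe to update the UI
        [self updateUIWithData:data];       // hypothetical helper
    });
});
```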
Now for a slightly more complicated problem than the previous one: serial queue + asynchronous task + nested synchronous task. Before looking further down, try to work out the execution result of the following code yourself.
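The code in question looks roughly like this:

```objc
dispatch_queue_t serialQueue =
    dispatch_queue_create("com.demo.serial", DISPATCH_QUEUE_SERIAL);

dispatch_async(serialQueue, ^{
    NSLog(@"async task begins");
    // dispatch_sync onto the SAME serial queue we are already on:
    // the sync block waits for the current block to finish, while
    // the current block waits for the sync block -- deadlock.
    dispatch_sync(serialQueue, ^{
        NSLog(@"sync task");       // never reached
    });
    NSLog(@"async task ends");      // never reached
});
```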
I don't know what result your analysis produced. Let's look at the actual result, then walk through the entire execution process step by step to find the answer. The answer is a crash, which is in fact a deadlock. Let's go through the whole process and analyze where the deadlock occurs:
After the analysis above, you can see that this scenario is the same as synchronously adding a task to the main queue from the main thread. Think the whole process through carefully and work out why the earlier main thread + serial queue + synchronous task case does not deadlock; by comparing the two you can understand the whole problem clearly.

The protagonist GCD -- concurrent queue

Next, the concurrent queue. There is not much to say about concurrent queue + synchronous execution or concurrent queue + asynchronous execution on their own, so let's focus on the points that deserve attention in concurrent + asynchronous. Have you ever wondered: if the queue is concurrent, many tasks execute together, and an asynchronous task may create a new thread; so if we add ten asynchronous tasks, will ten threads be created? Would a hundred asynchronous tasks then require a hundred threads? The answer is definitely no. Let's talk about how the system handles it, drawing on the explanation in the advanced programming book.
When processing concurrent asynchronous tasks, threads can be reused: for example, after the thread running task 1 has finished it, that same thread can go on to execute later, not-yet-executed tasks. That is all there is to it.

When discussing serial queues, we saw that serial queue + asynchronous task with a nested synchronous task deadlocks. What if we change the serial queue into a concurrent queue? What would the result be? Look at the execution results of the following code. As the results show, there is no problem. Rather than stepping through the whole process again, let's explain why the concurrent queue is fine where the serial queue deadlocks: adding an asynchronous task to a concurrent queue creates a new thread, and a synchronous task is then added to the same concurrent queue from inside that task. Because the queue is concurrent, the synchronous task does not have to wait for the enclosing asynchronous task to finish; it can start right away, which is why the print output appears. After understanding these cases, I believe you will have a much deeper grasp of adding synchronous or asynchronous tasks to serial or concurrent queues. Let's continue.

GCD is not only these

The following GCD facilities are also worth our attention. Let's go through them one by one.

1. dispatch_barrier_async

The dispatch_barrier_async function is what we commonly call the "fence" (barrier) method. A fence separates what is before it from what is after it, and that is exactly what this function does: it splits the tasks around the inserted barrier. Once the tasks before the barrier have finished, the task submitted by the barrier function executes; once the barrier task has finished, the tasks after the barrier execute. The simple demo below should make it clear.
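Two sketches for the cases above. First, the nesting experiment with a concurrent queue in place of the serial one:

```objc
dispatch_queue_t concurrentQueue =
    dispatch_queue_create("com.demo.concurrent", DISPATCH_QUEUE_CONCURRENT);

dispatch_async(concurrentQueue, ^{
    NSLog(@"async task begins");
    // On a concurrent queue the sync block does not have to wait
    // for the enclosing block to finish, so no deadlock occurs.
    dispatch_sync(concurrentQueue, ^{
        NSLog(@"sync task");
    });
    NSLog(@"async task ends");
});
```

And the barrier demo (note that dispatch_barrier_async acts as a barrier only on a custom concurrent queue, not on a global queue):

```objc
dispatch_async(concurrentQueue, ^{ NSLog(@"task 1"); });
dispatch_async(concurrentQueue, ^{ NSLog(@"task 2"); });

dispatch_barrier_async(concurrentQueue, ^{
    // runs only after tasks 1 and 2 have both finished
    NSLog(@"barrier task");
});

dispatch_async(concurrentQueue, ^{ NSLog(@"task 3"); }); // runs after the barrier
```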
As you can see above, the barrier is inserted in the position of the third task, meaning it waits until the first two tasks are complete before executing. If the first two tasks are so simple that they finish almost instantly, try making their workloads a little heavier; that will help you observe this "fence" behavior more clearly.

2. dispatch_after

dispatch_after performs a delayed operation. If you want a task to execute after a certain amount of time, this function handles it. The usage is very simple because it is already wrapped for us; look at the following two lines of code:
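The two lines, in sketch form (a 2-second delay onto the main queue):

```objc
dispatch_time_t when = dispatch_time(DISPATCH_TIME_NOW,
                                     (int64_t)(2.0 * NSEC_PER_SEC));
dispatch_after(when, dispatch_get_main_queue(), ^{
    NSLog(@"runs roughly 2 seconds later on the main queue");
});
```

Note that dispatch_after enqueues the block after the delay; it does not guarantee exact timing.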
3. dispatch_apply

dispatch_apply is similar to a for loop: it runs a block task n times on the specified dispatch queue. If the queue is a concurrent queue, the iterations execute concurrently. dispatch_apply is a synchronous call; it returns only after the block has executed all n times. Precisely because it is synchronous, calling it on the queue you are currently running on causes a problem (a deadlock), whereas dispatching it to a separate serial or concurrent queue works fine.

4. dispatch_group_t

First, the role of dispatch_group_t: it often happens that you want to run some finishing work once all the tasks appended to dispatch queues are complete. With a single serial dispatch queue this is easy: append all the operations you want to perform, then append the finishing task at the end. But with a concurrent dispatch queue, or with several dispatch queues at once, it gets more complicated, and that is where a dispatch group comes in. Take a look at the following code:
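A sketch of both facilities, with printing standing in for real work:

```objc
dispatch_queue_t globalQueue =
    dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

// dispatch_apply: a synchronous "parallel for".
// On a concurrent queue the iterations may interleave,
// but the call itself returns only after all 10 have run.
dispatch_apply(10, globalQueue, ^(size_t i) {
    NSLog(@"apply iteration %zu", i);
});
NSLog(@"apply done"); // always printed last

// dispatch_group: run a completion once several async tasks finish
dispatch_group_t group = dispatch_group_create();
dispatch_group_async(group, globalQueue, ^{ NSLog(@"group task 1"); });
dispatch_group_async(group, globalQueue, ^{ NSLog(@"group task 2"); });
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    NSLog(@"all group tasks finished"); // after both tasks complete
});
```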
The intent of this code is obvious, and the printed output makes it clear.

Summary: this is my summary of the most basic multithreading issues for the time being. There are still many topics I am working through, such as thread locks, and I will share them once those summaries are ready!