Understanding iOS Multithreading from a Practical Perspective

Preface

Most developers already understand multithreading to some degree. There were parts of it I never understood thoroughly myself, and after accumulating some experience I wanted to summarize it properly. I knew multithreading was important when I first came to iOS, but at the time my ability was limited and I could not grasp it well. Conceptual definitions are easy to find all over the Internet, and repeating them here would be meaningless, so when I conceived this article I hoped to write it from a different angle: summarizing multithreading from a practical perspective, based on examples I have seen from third parties and written myself, as well as the books on multithreading and memory management I read some time ago.


These concepts are easy to confuse

1. Processes: a process is a container for threads. When you open an app, you start a process: QQ has its own process, and so does WeChat. A process can contain multiple threads. If you compare a process to a highway, threads are its lanes; it is precisely because there are multiple lanes that traffic flows more efficiently, and likewise it is multithreading that makes the whole system run more efficiently.

2. Threads: a thread is an execution unit we open inside a process to do work for us. Simply put, a thread is a channel within the process along which our "things" (tasks) run. A single thread can only do one thing at any point in time, while multiple threads can do multiple things at the same time. The highway example still applies.

A highway is a process and each lane is a thread. At the toll booth, if the process has only one thread — a single-lane highway — cars can only queue and pass one at a time; two cars can never pass simultaneously. But if the process has multiple threads — several lanes and several toll windows — two cars can pay and pass at the same time. I believe this clarifies the relationship between processes and threads.

  • Synchronous execution: a synchronous call blocks the current thread while the "thing" (task) is executed, and control returns to the current thread only after the task completes.
  • Asynchronous execution: an asynchronous call does not block the current thread; the task may be handed off to another thread (possibly a newly opened one) to execute.

3. Queues: a queue manages the "tasks" described below on a first-in-first-out (FIFO) basis, and divides into the serial and parallel (concurrent) kinds classified next. Note that a queue is not itself a thread: one thread can service several queues.

  • Parallel (concurrent) queue: tasks in this queue may execute concurrently (simultaneously). As noted above, synchronous execution does not start a new thread, so tasks submitted to a concurrent queue synchronously still run one after another on the current thread. But when tasks are submitted to a concurrent queue asynchronously, threads are started as needed and many tasks can be in flight at the same time.
  • Serial queue: tasks in this queue execute serially, one after another. Submitted synchronously, each task finishes on the current thread before the next begins; submitted asynchronously, a new thread is opened and our tasks still execute on it serially.

4. Tasks: in my own informal understanding, a task is the "thing" mentioned above. A task must execute on some thread (whether the current thread or another one you have opened), and you can choose to execute it synchronously or asynchronously, which gives the two kinds below.

  • Synchronous task: no new thread is needed; it executes on the current thread.
  • Asynchronous task: a new thread may be opened to execute the task asynchronously.
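The four combinations above can be seen directly in GCD. A minimal sketch (the queue label is made up for illustration):

```objc
#import <Foundation/Foundation.h>

void syncAsyncDemo(void) {
    // A serial queue of our own; the label string is arbitrary.
    dispatch_queue_t queue = dispatch_queue_create("com.example.demo", DISPATCH_QUEUE_SERIAL);

    // Synchronous task: runs on the current thread; this call blocks until the block returns.
    dispatch_sync(queue, ^{
        NSLog(@"sync task on %@", [NSThread currentThread]);
    });

    // Asynchronous task: the call returns immediately; the block runs later,
    // typically on a different thread managed by GCD.
    dispatch_async(queue, ^{
        NSLog(@"async task on %@", [NSThread currentThread]);
    });
}
```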

There is also a special serial queue in iOS, the main queue, whose tasks run on a special thread: the main thread.

The main thread is also called the UI thread and, as the name suggests, its main job is to handle the UI in a timely manner; only the main thread may touch the UI. Time-consuming operations belong on child threads (threads we open ourselves). Note that every thread you open occupies a certain amount of memory, so do not open many threads at once.

The above explains the key concepts in multithreading; if anything is unclear, feel free to get in touch. Next, let's walk through the main threading APIs one by one.

NSThread

In fact, NSThread is used quite a lot in our daily development. Let's talk about some of the things we need to pay attention to step by step. Let's first look at several methods of initialization.

```objc
/*
 Class method that creates and starts a new NSThread; the work is performed in the block.
 + (void)detachNewThreadWithBlock:(void (^)(void))block API_AVAILABLE(macosx(10.12), ios(10.0), watchos(3.0), tvos(10.0));

 Creates and starts a new NSThread using a selector. target is the object the selector
 belongs to, i.e. the selector must be a method of that target object.
 + (void)detachNewThreadSelector:(SEL)selector toTarget:(id)target withObject:(nullable id)argument;

 Instance initializers. The difference from the two class methods above is that these
 return the NSThread object, so you can configure it before it runs.
 The parameters have the same meaning as explained above.
 Remember: a thread created with either of these two initializers must be started manually.
 - (instancetype)initWithTarget:(id)target selector:(SEL)selector object:(nullable id)argument NS_AVAILABLE(10_5, 2_0);

 - (instancetype)initWithBlock:(void (^)(void))block API_AVAILABLE(macosx(10.12), ios(10.0), watchos(3.0), tvos(10.0));
*/
```
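As a sketch, the four initializers above might be exercised like this (the method name `doWork:` is made up for illustration):

```objc
#import <Foundation/Foundation.h>

// Hypothetical worker method; any target/selector pair works the same way.
- (void)doWork:(id)argument {
    NSLog(@"working with %@ on %@", argument, [NSThread currentThread]);
}

- (void)spawnThreads {
    // Class methods: the thread is created and started in one step.
    [NSThread detachNewThreadWithBlock:^{
        NSLog(@"block thread %@", [NSThread currentThread]);
    }];
    [NSThread detachNewThreadSelector:@selector(doWork:) toTarget:self withObject:@"A"];

    // Instance initializers: these threads must be started manually.
    NSThread *t1 = [[NSThread alloc] initWithTarget:self selector:@selector(doWork:) object:@"B"];
    [t1 start];
    NSThread *t2 = [[NSThread alloc] initWithBlock:^{
        NSLog(@"init-with-block thread %@", [NSThread currentThread]);
    }];
    [t2 start];
}
```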

Besides the four methods above, another set of entry points deserves attention: NSObject (NSThreadPerformAdditions), a category on NSObject whose methods we also use very often:

```objc
/*
 aSelector is executed on the main thread.
 Note the BOOL parameter waitUntilDone: it indicates whether the calling thread waits
 until aSelector has finished executing.
 modes is the set of run loop modes in which the selector may run (RunLoop will be
 summarized separately later).
 - (void)performSelectorOnMainThread:(SEL)aSelector withObject:(nullable id)arg waitUntilDone:(BOOL)wait modes:(nullable NSArray *)array;

 - (void)performSelectorOnMainThread:(SEL)aSelector withObject:(nullable id)arg waitUntilDone:(BOOL)wait;
 // equivalent to the first method with kCFRunLoopCommonModes

 The two methods above run on the main thread; the two below run on the thread thr that
 you pass in. The other parameters are the same as explained above.
 - (void)performSelector:(SEL)aSelector onThread:(NSThread *)thr withObject:(nullable id)arg waitUntilDone:(BOOL)wait modes:(nullable NSArray *)array NS_AVAILABLE(10_5, 2_0);

 - (void)performSelector:(SEL)aSelector onThread:(NSThread *)thr withObject:(nullable id)arg waitUntilDone:(BOOL)wait NS_AVAILABLE(10_5, 2_0);
 // equivalent to the first method with kCFRunLoopCommonModes

 - (void)performSelectorInBackground:(SEL)aSelector withObject:(nullable id)arg NS_AVAILABLE(10_5, 2_0);
*/
```

Let's return to the BOOL parameter waitUntilDone mentioned above. It effectively decides whether aSelector executes synchronously with respect to the calling thread: with YES, the calling thread blocks until aSelector has finished on the target thread; with NO, the call returns immediately and aSelector runs at some later point.
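A minimal sketch of the difference (the method names are made up for illustration):

```objc
// Called from a background thread.
- (void)refreshFromBackground {
    NSLog(@"before perform");
    // wait == YES: this background thread blocks until updateUI has finished on the
    // main thread, so "after perform" is always logged after "updating UI".
    [self performSelectorOnMainThread:@selector(updateUI)
                           withObject:nil
                        waitUntilDone:YES];
    NSLog(@"after perform");
    // With waitUntilDone:NO the call returns immediately and the two
    // log lines can interleave in either order.
}

- (void)updateUI {
    NSLog(@"updating UI on %@", [NSThread currentThread]);
}
```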

We will not describe the remaining NSThread methods one by one; the comments below should suffice. If you have any other questions about NSThread, feel free to contact me:

```objc
/*
 Put the thread to sleep until the specified date.
 + (void)sleepUntilDate:(NSDate *)date;

 Sleep for a time interval; commonly seen when extending the launch screen.
 + (void)sleepForTimeInterval:(NSTimeInterval)ti;

 Exit the current thread. Useful when some special condition is reached,
 but be careful never to call it casually on the main thread.
 + (void)exit;

 Get the current thread priority.
 + (double)threadPriority;

 Set the thread priority.
 + (BOOL)setThreadPriority:(double)p;
*/
```

NSOperation

Any discussion of multithreading also has to mention NSOperation, which is more powerful than it may first appear. There is a lot to say about NSOperation, as there was for NSThread; covered in full detail, each could fill an article of its own, and as I get more exposure to multithreading I may give it an independent write-up.

First of all, NSOperation is an encapsulation built on top of GCD. We rarely use the NSOperation class itself directly; we focus instead on the subclasses Apple has encapsulated for us, NSInvocationOperation and NSBlockOperation.

Command-click into NSOperation.h and have a look. The main points to understand are the following methods:

```objc
NSOperation *operation = [[NSOperation alloc] init];
[operation start];   // start the operation
[operation cancel];  // cancel the operation
[operation setCompletionBlock:^{
    // runs after the operation has finished
}];
```

Let's talk specifically about the two classes we mentioned above: NSInvocationOperation and NSBlockOperation. First, let's look at the initialization of NSInvocationOperation:

```objc
/*
 Initializer. After the earlier discussion, its target, sel and arg parameters
 should be easy to understand.
 - (nullable instancetype)initWithTarget:(id)target selector:(SEL)sel object:(nullable id)arg;

 - (instancetype)initWithInvocation:(NSInvocation *)inv NS_DESIGNATED_INITIALIZER;
*/
```

Supplement: NS_DESIGNATED_INITIALIZER marks the designated initializer — the one that the other (convenience) initializers are expected to funnel into internally; you can look it up for more detail. Note that when started on its own, an NSInvocationOperation executes synchronously on the current thread, so it is of limited value used alone; it achieves multithreading only in combination with NSOperationQueue, which we discuss in detail later.

Now let's look at NSBlockOperation. Its two important methods are the following:

```objc
/*
 Initializer.
 + (instancetype)blockOperationWithBlock:(void (^)(void))block;

 Adds another executable block to an already initialized NSBlockOperation.
 - (void)addExecutionBlock:(void (^)(void))block;
*/
```

One more point about NSBlockOperation: its maximum concurrency — how many of its blocks run at once — is decided by the system and depends on the runtime environment. Peers have summarized and verified the specifics elsewhere; we will not go into details here due to space constraints.
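A short sketch of the two methods above: blocks added with addExecutionBlock: may run concurrently on different threads, with the degree of concurrency decided by the system:

```objc
#import <Foundation/Foundation.h>

void blockOperationDemo(void) {
    NSBlockOperation *op = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"block 0 on %@", [NSThread currentThread]);
    }];
    // Extra blocks may be spread across several threads when the operation runs.
    [op addExecutionBlock:^{ NSLog(@"block 1 on %@", [NSThread currentThread]); }];
    [op addExecutionBlock:^{ NSLog(@"block 2 on %@", [NSThread currentThread]); }];

    // -start returns only after every block has finished.
    [op start];
}
```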

The material above is still not quite enough for daily use: there is one more class, which brings both of the previous ones to life, that we have to talk about: NSOperationQueue. A general description follows; it is quite simple, so I will not write a dedicated demo.
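Still, a minimal sketch of pairing the two operation classes with NSOperationQueue may help (the selector `download:` and the URL are made up for illustration):

```objc
#import <Foundation/Foundation.h>

- (void)queueDemo {
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    queue.maxConcurrentOperationCount = 2; // optionally throttle concurrency

    NSInvocationOperation *invOp =
        [[NSInvocationOperation alloc] initWithTarget:self
                                             selector:@selector(download:)
                                               object:@"https://example.com"];
    NSBlockOperation *blockOp = [NSBlockOperation blockOperationWithBlock:^{
        NSLog(@"block operation on %@", [NSThread currentThread]);
    }];

    // Operations added to a queue are started by the queue itself,
    // on background threads; no manual -start is needed.
    [queue addOperation:invOp];
    [queue addOperation:blockOp];
}

- (void)download:(NSString *)urlString {
    NSLog(@"downloading %@ on %@", urlString, [NSThread currentThread]);
}
```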

That’s all we have to say about NSOperation. Let’s focus on GCD.

Protagonist GCD -- Main Thread

1. Let's start with the main queue and the main thread. The main queue is obtained as follows:

```objc
dispatch_queue_t mainqueue = dispatch_get_main_queue();
```

2. Now suppose we execute a task synchronously on the main queue from the main thread. The result is that the app hangs and the block never runs. Let's sort out the execution process step by step to see why:

  1. The first log line is printed on the main thread, before the dispatch; no problem there.
  2. dispatch_sync is called. The main queue is a serial queue, so the function appends the block to the end of the main queue.
  3. Because the call is synchronous, the main thread must wait for the inserted task to complete before executing anything after the dispatch_sync call.
  4. But since the block sits at the end of the main queue, it cannot run until the task ahead of it finishes — and that task is the very method (initMainQueue) the main thread is currently executing.
  5. So they wait on each other: the main thread is waiting for the block to return, while the block is waiting for the main thread to become free. That is a deadlock, and the log confirms the block is never executed.
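A minimal sketch of the scenario just described, using the initMainQueue name from the steps above (do not run this on the main thread; it hangs):

```objc
// Called on the main thread, e.g. from viewDidLoad.
- (void)initMainQueue {
    NSLog(@"1 -- %@", [NSThread currentThread]);     // printed
    dispatch_sync(dispatch_get_main_queue(), ^{
        NSLog(@"2 -- %@", [NSThread currentThread]); // never reached: deadlock
    });
    NSLog(@"3 -- %@", [NSThread currentThread]);     // never reached
}
```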

Here we may think: the main queue is a serial queue, so what if, on the main thread, we create our own serial queue and add a synchronous task to it? Isn't that the same as adding a synchronous task to the main queue from the main thread? Let's see what actually happens:

Let's explain the execution steps as before:

  • On the main thread, the initSerialQueue method (running on the main queue) creates a serial queue — note, not the main queue — and prints the first message.
  • dispatch_sync adds a synchronous task to that serial queue; since nothing is ahead of it, the task can execute immediately.
  • The main thread executes the synchronous task in this queue directly and prints the second message.
  • The main thread then continues and prints the third message.

Understanding: compare this execution process with the previous one. The difference is that previously the block was appended to the main queue, whereas here it is appended to a newly created serial queue. A block appended to the end of the main queue cannot run until everything ahead of it in the main queue has finished — but the main thread, inside initMainQueue, is itself waiting for that block to finish before it can proceed, so the two wait on each other. When the main thread synchronously executes a task on a queue that is not the main queue, however, nothing is ahead of the task in that queue, so nothing has to finish first; there is no mutual waiting and therefore no deadlock.
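The no-deadlock case can be sketched like this, using the initSerialQueue name from the steps above (the queue label is made up):

```objc
// Called on the main thread; this does NOT deadlock.
- (void)initSerialQueue {
    // A serial queue of our own, distinct from the main queue.
    dispatch_queue_t serialQueue =
        dispatch_queue_create("com.example.serial", DISPATCH_QUEUE_SERIAL);
    NSLog(@"1 -- %@", [NSThread currentThread]);
    dispatch_sync(serialQueue, ^{
        // The queue is empty, so the main thread executes this block right away.
        NSLog(@"2 -- %@", [NSThread currentThread]);
    });
    NSLog(@"3 -- %@", [NSThread currentThread]);
}
```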

The above question is often asked, and there are many explanations. I hope I can make it clear!

3. One more point about the main thread: simple communication between threads.

We said earlier that the main thread is also called the UI thread: all UI updates happen on the main thread, while time-consuming operations such as downloading data or accessing a database belong on child threads. After a child thread finishes such an operation, it returns to the main thread to update the UI. This pattern deserves attention and is in fact the one used most often; I believe everyone is familiar with it.

The protagonist GCD -- serial queue

We will not repeat the serial-queue concepts here. Serial queue + synchronous tasks and serial queue + asynchronous tasks are both straightforward; try them yourself if interested, and the analysis below will touch on their uses. Instead, consider a slightly trickier case: serial queue + asynchronous task + nested synchronous task. Before reading on, try to predict the execution result of the following code.

```objc
// The original declared `static void *DISPATCH_QUEUE_SERIAL_IDENTIFY;` and passed it as
// the label; dispatch_queue_create expects a C string, so a literal label is used here.
static const char *kSerialQueueLabel = "com.example.serial";

- (void)initDispatchQueue {
    dispatch_queue_t serialQueue = dispatch_queue_create(kSerialQueueLabel, DISPATCH_QUEUE_SERIAL);
    dispatch_async(serialQueue, ^{
        NSLog(@"Contents of the asynchronous task %@", [NSThread currentThread]);
        dispatch_sync(serialQueue, ^{
            NSLog(@"Contents of the synchronous task %@", [NSThread currentThread]);
        });
    });
}
```

Whatever your analysis concluded, the actual result is a crash — in substance, a deadlock. Let's go through the whole process step by step and find where the deadlock occurs:

  1. The main thread executes the initDispatchQueue task on the main queue and enters the method, which creates a serial queue. I believe everyone is fine with this step.
  2. An asynchronous task is added to the serial queue. Since it is asynchronous, a new thread is started; for convenience call it thread A and the task task A. Also because it is asynchronous, the main thread does not wait for the task to return and continues with its own work.
  3. The rest happens on thread A. Once added to the serial queue, task A starts executing on thread A and prints the first message — which also proves it is not on the main thread. So far so good.
  4. Inside task A, dispatch_sync adds another task — call it task B — to the same serial queue. Because it is added synchronously, it must complete before task A can continue, so task A waits for it.
  5. But task B was appended to the end of the serial queue, and the serial queue will not start it until the current task A finishes. So task A waits for task B to complete while task B waits for task A to finish: deadlock.

After this analysis you can see the scenario is exactly analogous to synchronously dispatching to the main queue from the main thread. Think the whole process through and work out why main thread + newly created serial queue + synchronous task did not deadlock; comparing the two cases will make the whole problem clear.

The protagonist GCD -- parallel queue

Next, the parallel (concurrent) queue. Parallel queue + synchronous execution and parallel queue + asynchronous execution hold no surprises; here we focus on what deserves attention in parallel + asynchronous. Have you ever wondered: if the queue is concurrent, many tasks execute together, and if each asynchronous task can open a new thread, will adding ten asynchronous tasks open ten threads? Would a hundred tasks need a hundred threads? Definitely not. Here is how the system handles it, following the explanation in the advanced-programming book mentioned earlier:

  • With DISPATCH_QUEUE_CONCURRENT, a later task can start executing without waiting for earlier tasks to finish.
  • The number of tasks processed in parallel depends on the current system state: iOS and OS X decide how many tasks in a DISPATCH_QUEUE_CONCURRENT queue run in parallel based on factors such as the number of tasks in the Dispatch Queue, the number of CPU cores, and the CPU load.
  • The XNU kernel, the core of iOS and OS X, decides how many threads to use and spawns only the threads required to perform the processing.
  • When processing completes and the number of tasks to execute drops, the XNU kernel terminates the threads that are no longer needed.

When processing parallel asynchronous tasks, threads are reused: for example, once the thread running task 1 has finished it, that same thread can go on to execute tasks that have not yet run.

That covers the mechanics. Recall that with serial queues, a synchronous task nested inside an asynchronous task on the same queue causes a deadlock. What if we change the serial queue to a concurrent queue? What would the result be? Consider the following code:

As the results show, there is no problem. Rather than stepping through the whole execution again, let's explain why the concurrent queue is fine where the serial queue deadlocks:

Adding an asynchronous task to the concurrent queue still opens a new thread, and inside that task a synchronous task is added to the same concurrent queue. Because the queue is concurrent, the synchronous task does not have to wait for the enclosing asynchronous task to finish; it can start right away, hence the successful log output. Working through these cases should give you a much deeper feel for adding synchronous or asynchronous tasks to serial or concurrent queues. Let's continue.
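The concurrent-queue variant being discussed can be sketched like this (method name and queue label are made up):

```objc
- (void)initConcurrentQueue {
    dispatch_queue_t concurrentQueue =
        dispatch_queue_create("com.example.concurrent", DISPATCH_QUEUE_CONCURRENT);
    dispatch_async(concurrentQueue, ^{
        NSLog(@"async task %@", [NSThread currentThread]);
        // On a concurrent queue the nested sync block need not wait for the
        // enclosing task to finish, so — unlike the serial case — no deadlock.
        dispatch_sync(concurrentQueue, ^{
            NSLog(@"sync task %@", [NSThread currentThread]);
        });
    });
}
```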

GCD is not only these

The following contents about GCD are worthy of our attention. Let's start talking about them one by one:

1. dispatch_barrier_async

The dispatch_barrier_async function is what we commonly call the "fence" (barrier) method. A fence separates what is before it from what is after it, and that is exactly what this function does: tasks submitted before the barrier run first; once they have all finished, the barrier task runs by itself; and only after the barrier task finishes do the tasks submitted after it begin.

In the original demo the barrier was inserted in the third position, meaning the first two tasks had to finish completely before the third could run. If the first two tasks are too quick for this to be observable, try giving them a larger workload; it helps to see the "fence" in action.
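A minimal barrier sketch (queue label made up); note that a barrier is only meaningful on a concurrent queue you created yourself, not on a global queue:

```objc
- (void)barrierDemo {
    dispatch_queue_t queue =
        dispatch_queue_create("com.example.barrier", DISPATCH_QUEUE_CONCURRENT);
    dispatch_async(queue, ^{ NSLog(@"task 1"); });
    dispatch_async(queue, ^{ NSLog(@"task 2"); });
    // Waits for tasks 1 and 2 to finish, runs alone, then lets task 3 proceed.
    dispatch_barrier_async(queue, ^{ NSLog(@"barrier task"); });
    dispatch_async(queue, ^{ NSLog(@"task 3"); });
}
```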

2. dispatch_after

dispatch_after delayed operation

If you want to wait a certain amount of time before executing a task, you can use this function to handle it. The writing method is very simple because it has been encapsulated for us. Look at the following two lines of code:

```objc
// DISPATCH_TIME_NOW: the delay is measured from the current time.
// NSEC_PER_SEC: a macro converting seconds to nanoseconds.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(10.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    NSLog(@"executed after a 10 second delay");
});
```

3. dispatch_apply

dispatch_apply is similar to a for loop: it runs a block n times on the specified dispatch queue. If the queue is concurrent, the iterations execute concurrently. dispatch_apply itself is a synchronous call; it returns only after all n iterations have finished. Precisely because it is synchronous, there is a problem if we target the queue we are currently running on — for example, calling dispatch_apply on the main queue from the main thread deadlocks, just like dispatch_sync. Targeting a different serial queue, or better a concurrent queue such as a global queue, works fine.
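The safe pattern can be sketched like this (method name made up):

```objc
- (void)applyDemo {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    // Runs the block 5 times on the global concurrent queue; iterations may
    // interleave, but dispatch_apply returns only after all of them finish.
    dispatch_apply(5, queue, ^(size_t index) {
        NSLog(@"iteration %zu on %@", index, [NSThread currentThread]);
    });
    NSLog(@"dispatch_apply finished"); // always logged last
}
```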

4. dispatch_group_t

First, the role of dispatch_group_t. It often happens that you want to run some finishing work only after several tasks appended to dispatch queues have all completed. With a single serial dispatch queue this is easy: append all the operations you want, then append the finishing work at the end. But with a concurrent dispatch queue, or several queues at once, it is more complicated, and that is where a dispatch group earns its keep. Take a look at the following code:

```objc
- (void)testDispatch_group_t {
    dispatch_group_t group_t = dispatch_group_create();
    dispatch_queue_t queue_t = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_async(group_t, queue_t, ^{
        NSLog(@"1 -- current thread %@", [NSThread currentThread]);
    });
    dispatch_group_async(group_t, queue_t, ^{
        NSLog(@"2 -- current thread %@", [NSThread currentThread]);
    });
    dispatch_group_async(group_t, queue_t, ^{
        NSLog(@"3 -- current thread %@", [NSThread currentThread]);
    });
    dispatch_group_async(group_t, queue_t, ^{
        for (int i = 1; i < 10; i++) {
            NSLog(@"4 -- current thread %@", [NSThread currentThread]);
        }
    });
    // Runs once all the tasks in the group have completed.
    dispatch_group_notify(group_t, queue_t, ^{
        NSLog(@"All previous operations have completed %@", [NSThread currentThread]);
    });
}
```

The intent of this code is obvious, and the log output confirms it: the dispatch_group_notify block runs only after all four grouped tasks have completed.

Summary: these are the most basic issues in multithreading for now. There are still many topics I am summarizing, such as thread locks; I will share them once those summaries are in better shape!
