A performance optimization for MKMapView

A recent project of ours is an LBS app centered on member positioning. Our UI design is shown below.

At first glance it looks fine: each member shows a different avatar. But when many members cluster together, problems arise.

With many annotations on screen (as in the picture above), the map visibly stutters while scrolling, and the jerkiness is painful, so naturally we had to fix it. (And before anyone asks why we don't use annotation clustering: the map had already shipped on the *** in this form, so clustering is out of scope for this discussion.)

Analysis

First, let's look at how I implemented this annotationView. Since it has a special shape (one that can't be produced just by setting rounded corners) and the image inside varies per user, my solution was to clip it with layer.mask. The code is as follows:

    @implementation MMAnnotationView

    - (instancetype)initWithAnnotation:(id<MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier
    {
        self = [super initWithAnnotation:annotation reuseIdentifier:reuseIdentifier];
        if (self)
        {
            self.frame = CGRectMake(0, 0, TRACK_ANNOTATION_SIZE.width, TRACK_ANNOTATION_SIZE.height);
            self.centerOffset = CGPointMake(0, -(TRACK_ANNOTATION_SIZE.height - 3) / 2);
            self.canShowCallout = NO;

            self.avatarView = [[UIImageView alloc] initWithFrame:self.bounds];
            [self addSubview:self.avatarView];
            self.avatarView.contentMode = UIViewContentModeScaleAspectFill;

            // Clip the avatar to the bubble shape with a mask layer
            CAShapeLayer *shapelayer = [CAShapeLayer layer];
            shapelayer.frame = self.bounds;
            shapelayer.path = self.framePath.CGPath;
            self.avatarView.layer.mask = shapelayer;

            // Reuse the same path for the shadow
            self.layer.shadowPath = self.framePath.CGPath;
            self.layer.shadowRadius = 1.0f;
            self.layer.shadowColor = [UIColor colorWithHex:0x666666FF].CGColor;
            self.layer.shadowOpacity = 1.0f;
            self.layer.shadowOffset = CGSizeMake(0, 0);
            self.layer.masksToBounds = NO;
        }
        return self;
    }

    // Mask path: a rounded rect with a small arrow at the bottom
    - (UIBezierPath *)framePath
    {
        if (!_framePath)
        {
            CGFloat arrowWidth = 14;
            CGMutablePathRef path = CGPathCreateMutable();
            CGRect rectangle = CGRectInset(CGRectMake(0, 0, CGRectGetWidth(self.bounds), CGRectGetWidth(self.bounds)), 3, 3);
            CGPoint p[3] = {
                {CGRectGetMidX(self.bounds) - arrowWidth / 2, CGRectGetWidth(self.bounds) - 6},
                {CGRectGetMidX(self.bounds) + arrowWidth / 2, CGRectGetWidth(self.bounds) - 6},
                {CGRectGetMidX(self.bounds), CGRectGetHeight(self.bounds) - 4}
            };
            CGPathAddRoundedRect(path, NULL, rectangle, 5, 5);
            CGPathAddLines(path, NULL, p, 3);
            CGPathCloseSubpath(path);
            _framePath = [UIBezierPath bezierPathWithCGPath:path];
            CGPathRelease(path);
        }
        return _framePath;
    }

    @end
I generate the shape path in code and use it both for the layer's mask and for its shadowPath.

When setting the avatar, SDWebImage is used directly:

    [annotationView.avatarView sd_setImageWithURL:[NSURL URLWithString:avatarURL] placeholderImage:placeHolderImage];

Next, let's analyze the problem with tooling. For performance work the natural choice is Instruments (I won't cover its usage here). Open the Core Animation template, run the app, and slide the map around; the results are as follows.

It turns out the average frame rate is below 30 fps, far from our 60 fps target.

Then let's dig deeper with the Debug Options.

Since this is an MKMapView, the options we mainly care about are:

Color Blended Layers

Color Misaligned Images

Color Offscreen-Rendered Yellow

The results of opening these options separately are as follows

You can see

Color Blended Layers shows no problems, which is expected: the mask leaves no transparent areas to blend.

Color Misaligned Images flags every avatar except the default one. The images on the server don't match the displayed size, so they must be rescaled at composite time; the default avatar's size does match, so it isn't flagged.
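As a rough model of what this overlay checks (a sketch in plain C; the helper name and rounding are my own, not Apple's actual implementation): a bitmap is "aligned" when its pixel dimensions equal the view's point dimensions multiplied by the screen scale, and anything else forces Core Animation to resample it during compositing.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical helper: true when the image maps 1:1 onto screen pixels.
 * A 120x120 px image in a 60x60 pt view on a 2x screen is aligned;
 * a 200x200 px server image in the same view is not. */
static bool is_pixel_aligned(int image_px_w, int image_px_h,
                             double view_pt_w, double view_pt_h,
                             double screen_scale)
{
    return image_px_w == (int)(view_pt_w * screen_scale + 0.5) &&
           image_px_h == (int)(view_pt_h * screen_scale + 0.5);
}
```

This is why only the correctly sized default avatar stayed unflagged.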

Color Offscreen-Rendered Yellow lights up on every annotation because of the masks. All that offscreen rendering is the main cause of the performance drop.

Solution

Now that the cause of the problem has been found, how can we solve it?

First, the mask clearly has to go.

Second, downloaded images need to be pre-scaled to their actual display size.

So why not composite the downloaded picture, up front, into the final bitmap we want to display? Let's try:

    - (void)loadAnnotationImageWithURL:(NSString *)url imageView:(UIImageView *)imageView
    {
        // Look up the pre-composited image in the cache first
        NSString *annoImageURL = url;
        NSString *annoImageCacheURL = [annoImageURL stringByAppendingString:@"cache"];
        UIImage *cacheImage = [[SDImageCache sharedImageCache] imageFromDiskCacheForKey:annoImageCacheURL];
        if (cacheImage)
        {
            // Cache hit
            imageView.image = cacheImage;
        }
        else
        {
            // Cache miss: download, composite once, then store the result
            [imageView sd_setImageWithURL:[NSURL URLWithString:annoImageURL]
                         placeholderImage:placeHolderImage
                                completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
                if (!error)
                {
                    UIImage *annoImage = [image annotationImage];
                    imageView.image = annoImage;
                    [[SDImageCache sharedImageCache] storeImage:annoImage forKey:annoImageCacheURL];
                }
            }];
        }
    }

    @implementation UIImage (LJC)

    - (UIImage *)annotationImage
    {
        // An offscreen view hierarchy with the mask and shadow, built once and reused
        static UIView *snapshotView = nil;
        static UIImageView *imageView = nil;
        if (!snapshotView)
        {
            snapshotView = [UIView new];
            snapshotView.frame = CGRectMake(0, 0, TRACK_ANNOTATION_SIZE.width, TRACK_ANNOTATION_SIZE.height);
            imageView = [UIImageView new];
            [snapshotView addSubview:imageView];
            imageView.clipsToBounds = YES;
            imageView.frame = snapshotView.bounds;
            imageView.contentMode = UIViewContentModeScaleAspectFill;

            // Same bubble path as before: a rounded rect plus a bottom arrow
            CGFloat arrowWidth = 14;
            CGMutablePathRef path = CGPathCreateMutable();
            CGRect rectangle = CGRectInset(CGRectMake(0, 0, CGRectGetWidth(imageView.bounds), CGRectGetWidth(imageView.bounds)), 3, 3);
            CGPoint p[3] = {
                {CGRectGetMidX(imageView.bounds) - arrowWidth / 2, CGRectGetWidth(imageView.bounds) - 6},
                {CGRectGetMidX(imageView.bounds) + arrowWidth / 2, CGRectGetWidth(imageView.bounds) - 6},
                {CGRectGetMidX(imageView.bounds), CGRectGetHeight(imageView.bounds) - 4}
            };
            CGPathAddRoundedRect(path, NULL, rectangle, 5, 5);
            CGPathAddLines(path, NULL, p, 3);
            CGPathCloseSubpath(path);

            CAShapeLayer *shapelayer = [CAShapeLayer layer];
            shapelayer.frame = imageView.bounds;
            shapelayer.path = path;
            imageView.layer.mask = shapelayer;

            snapshotView.layer.shadowPath = path;
            snapshotView.layer.shadowRadius = 1.0f;
            snapshotView.layer.shadowColor = [UIColor colorWithHex:0x666666FF].CGColor;
            snapshotView.layer.shadowOpacity = 1.0f;
            snapshotView.layer.shadowOffset = CGSizeMake(0, 0);
            CGPathRelease(path);
        }

        // Render the masked, shadowed view into a plain bitmap
        imageView.image = self;
        UIGraphicsBeginImageContextWithOptions(TRACK_ANNOTATION_SIZE, NO, 0);
        [snapshotView.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *copied = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return copied;
    }

    @end

Then using it is a single call:

    [self loadAnnotationImageWithURL:avatarURL imageView:annotationView.avatarView];

Let's see how Instruments looks after the change.

Color Blended Layers is now unavoidable, because what we display is an image with transparency. But given how the map is used (avatar positions change only at long intervals, so compositing is rarely re-triggered and nothing animates), this is not a problem.

Color Misaligned Images no longer flags anything, because every avatar has been pre-scaled to the display size.
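The pre-scaling follows aspect-fill semantics, the same behavior as UIViewContentModeScaleAspectFill: pick the larger of the two axis scales so the scaled image covers the whole target rect, and let the overflow be cropped. A minimal sketch of that math in plain C (the helper name is mine, for illustration only):

```c
#include <assert.h>

/* Hypothetical helper mirroring aspect-fill: a 200x100 source filling a
 * 50x50 target scales by max(50/200, 50/100) = 0.5, drawing at 100x50
 * with the horizontal overflow cropped. */
static void aspect_fill_size(double src_w, double src_h,
                             double dst_w, double dst_h,
                             double *out_w, double *out_h)
{
    double scale_w = dst_w / src_w;
    double scale_h = dst_h / src_h;
    double scale = scale_w > scale_h ? scale_w : scale_h;
    *out_w = src_w * scale;
    *out_h = src_h * scale;
}
```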

Color Offscreen-Rendered Yellow is clean: we are simply displaying a flat image, so nothing needs to be rendered offscreen.

Now let's look at the frame rate.

Oh yeah~ Not only does the frame rate hit our 60 fps target (background business-logic threads keep it from being perfectly stable), the average running time has also dropped a lot. Even with dozens more people on the map, it holds up fine.

Summary

This technique is not specific to MKMapView; many other places, UITableView included, can be optimized the same way. The core idea is pre-compositing + caching. Of course, compositing itself costs something, so it is best suited to small resources such as avatars.
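Stripped of the UIKit specifics, the pattern can be sketched in a few lines of C (all names and the fixed-size table are illustrative stand-ins for the SDImageCache-based code above, not production cache code): the expensive render runs once per key, and every later lookup returns the stored result.

```c
#include <assert.h>
#include <string.h>

#define CACHE_SLOTS 8

/* Tiny illustrative cache: key -> composed result. */
struct entry { char key[128]; int value; };
static struct entry cache_tbl[CACHE_SLOTS];
static int cache_len = 0;
static int compose_calls = 0;   /* counts expensive renders */

/* Stand-in for the expensive mask + shadow render. */
static int compose_image(const char *url)
{
    compose_calls++;
    return (int)strlen(url);    /* dummy "image" */
}

static int cached_annotation_image(const char *url)
{
    for (int i = 0; i < cache_len; i++)
        if (strcmp(cache_tbl[i].key, url) == 0)
            return cache_tbl[i].value;      /* cache hit: no render */
    int img = compose_image(url);           /* cache miss: render once */
    if (cache_len < CACHE_SLOTS) {
        strncpy(cache_tbl[cache_len].key, url, sizeof cache_tbl[0].key - 1);
        cache_tbl[cache_len].value = img;
        cache_len++;
    }
    return img;
}
```

The trade-off is exactly the one noted above: the first request per key still pays the compositing cost, so the win comes from small, frequently reused resources.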

For more on graphics performance optimization, there is a good article worth reading (and if the Debug Options mentioned here are unfamiliar, it explains them in detail).
