Small flaws in iOS native image markup that you may not have noticed

As we spend more time on our smartphones and use them in more scenarios, many niche features have become important. Demand for image annotation alone has spawned a large number of iOS image-markup apps, which handle simple but necessary editing tasks for iOS users, such as circling a point of interest, obscuring sensitive information, applying a local magnifier, and adding text notes.

Apple did not add a native image markup feature until iOS 10. Open a photo in the Photos app, enter edit mode, and tap the menu to bring up the markup tool. It is a system-provided annotation tool similar in function to apps such as "Circle", "PointOut", and "Pinpoint": you can add hand-drawn lines, handwritten text, typed text, emoji, a magnifier, and more. Every annotation exists as an independent element, so you can drag it into position, scale it up or down, or remove it entirely.

The feature was further refined in iOS 11. Although it is not as capable as dedicated paid apps, it is enough for everyday use. The one regret is that native markup lacks an important tool: pixelation (mosaic) for hiding sensitive information.

In the absence of a mosaic tool, many users of native markup simply scribble over sensitive information with the black pen instead, which seems to do the job.

However, be warned that the black pen in the iOS markup tool does not actually redact anything: its strokes are semi-transparent and never fully cover the underlying content. Even after going over the area several times, anyone who wants the hidden information can use a simple trick to read the original text and numbers.

For example, in the picture below the payment amount should have been completely blacked out and invisible, yet with nothing more than the iPhone's built-in photo editor, pushing the brightness to maximum and zooming in on the details, the recipient's information is still easy to read.
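
The same trick can be reproduced programmatically. The following is my own minimal Swift sketch (not anything from Apple's Markup code; the function name `brighten` and the default amount are arbitrary) that boosts brightness with Core Image, illustrating why a translucent black stroke is not real redaction: the covered pixels are still in the file and reappear once the image is brightened.

```swift
import UIKit
import CoreImage

// Sketch: brighten an image with CIColorControls. On a screenshot "redacted"
// with a translucent black pen, the hidden text often becomes readable again.
func brighten(_ image: UIImage, by amount: CGFloat = 0.8) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let filter = CIFilter(name: "CIColorControls")!
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(amount, forKey: kCIInputBrightnessKey) // 0 = unchanged, higher = brighter

    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}
```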

So if you really need to hide important information in an image, do not rely on the native markup tool's black pen. To be safe, use a dedicated mosaic (pixelation) tool in another app, which destroys the underlying pixels rather than merely covering them.
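
For comparison, here is a hedged sketch of what proper pixelation looks like under the hood, again my own illustration rather than any particular app's implementation (the function name `pixelate`, the `region` parameter, and the default block scale are assumptions). It replaces the pixels in the sensitive region with coarse mosaic blocks, so brightening the result recovers nothing.

```swift
import UIKit
import CoreImage

// Sketch: destroy the pixels in a sensitive region with CIPixellate instead of
// covering them. `region` is given in UIKit (top-left origin) coordinates.
func pixelate(_ image: UIImage, in region: CGRect, scale: CGFloat = 24) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // Pixelate the whole image first.
    let pixellate = CIFilter(name: "CIPixellate")!
    pixellate.setValue(input, forKey: kCIInputImageKey)
    pixellate.setValue(scale, forKey: kCIInputScaleKey)
    guard let mosaic = pixellate.outputImage else { return nil }

    // Core Image uses a bottom-left origin, so flip the rect vertically.
    let flipped = CGRect(x: region.origin.x,
                         y: input.extent.height - region.origin.y - region.height,
                         width: region.width,
                         height: region.height)

    // Keep only the pixelated patch and composite it back over the original.
    let output = mosaic.cropped(to: flipped).composited(over: input)

    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}
```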

At the same time, we hope Apple will add a proper mosaic tool to native image markup to fix this small defect, especially with iOS 12 due to be unveiled at WWDC in June. Users naturally hope the new system will polish details like this to better meet everyday needs.
