Small flaws in iOS native image tagging that you may not have noticed

As smartphones take over more of our daily scenarios, even narrowly scoped features have become important. The demand for image annotation alone has spawned a large number of iOS image-markup apps, which handle simple but essential edits for iOS users: circling a point of interest, blurring sensitive information, applying a partial magnifier, and adding text annotations.

Apple did not ship a native image markup feature until iOS 10. Open a photo in the Photos app, enter edit mode, and the markup tool can be called up from the menu. It is a system-provided annotation tool similar to apps such as "Circle", "PointOut", and "Pinpoint": you can add hand-drawn lines, handwritten text, typed text, Emoji, and a magnifier. Every annotation exists as an independent element, so users can drag it into position, scale it up or down, or remove it entirely.

The feature was further refined in iOS 11. Although it is not as powerful as dedicated paid apps, it is enough for everyday situations. The one real omission is an important tool: pixelation (mosaic) for hiding sensitive information.

In the absence of a pixelation tool, many users of the native markup feature choose to paint over sensitive information with the black brush instead. This seems to work.

However, be warned: the black brush in the iOS markup tool cannot substitute for pixelation, because its strokes do not fully cover the underlying pixels. Even after several passes, anyone who wants the hidden information can use a simple trick to read the original text and numbers.

For example, in the picture below, the payment amount should have been completely blacked out and invisible. Yet with nothing more than the iPhone's built-in photo editor, raising the brightness to maximum and zooming in on the details, the recipient's information is still easy to recognize.
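The recovery trick works because a semi-opaque stroke only scales the original pixel values rather than erasing them. The sketch below illustrates this with standard "over" alpha compositing; the brush opacity is a hypothetical number, since the real compositing parameters of the markup tool are not documented:

```python
# Sketch of "over" alpha compositing with a black brush:
#   covered = (1 - alpha) * original + alpha * 0
# A residue of the original pixel survives whenever alpha < 1.

alpha = 0.9  # hypothetical brush opacity, not a documented value

text_pixel = 20     # dark ink of the original text
paper_pixel = 240   # light paper background around it

covered_text = (1 - alpha) * text_pixel    # roughly 2: looks black
covered_paper = (1 - alpha) * paper_pixel  # roughly 24: also looks black

# Maxing out brightness (multiply, then clamp to 255) re-amplifies the
# surviving contrast, making the hidden text readable again.
gain = 10
restored_text = min(255, covered_text * gain)
restored_paper = min(255, covered_paper * gain)
print(restored_text, restored_paper)  # text/paper contrast is back
```

With a truly opaque brush (alpha = 1) both pixels would collapse to 0 and nothing could be recovered, which is why the flaw comes down to the stroke's translucency.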

So if you really need to hide important information in an image, do not black it out with the native markup tool. To be safe, use a proper mosaic tool in another app.
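By contrast, a mosaic filter is destructive: each block of pixels is replaced by its average, so no brightness adjustment can bring the per-pixel detail back. A minimal one-dimensional sketch (the function name and block size are illustrative, not any real API):

```python
# Sketch: pixelation replaces detail with block averages, irreversibly.

def pixelate(row, block=4):
    """Replace each block of pixels in a 1-D row with its average."""
    out = []
    for i in range(0, len(row), block):
        chunk = row[i:i + block]
        avg = sum(chunk) // len(chunk)
        out.extend([avg] * len(chunk))
    return out

secret = [20, 240, 20, 240, 240, 20, 240, 20]  # dark/light "text" pattern
print(pixelate(secret))  # every block collapses to a single flat value
```

In this example both blocks happen to average to the same value, so even the pattern that distinguished "text" from "paper" is gone; no multiplication or contrast stretch can undo that.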

We also hope Apple will add a dedicated pixelation tool to the native image markup in iOS to close this small gap, especially with iOS 12 due to be unveiled at WWDC in June. Users naturally hope the new system will polish details like this to better meet everyday needs.
