How will Apple support the future of its five major OS?

At 10:00 a.m. Pacific Time on June 3, 2019, at the McEnery Convention Center in downtown San Jose, the keynote session of Apple's WWDC 2019 Worldwide Developers Conference officially opened. Alongside the hardware announcements of the Mac Pro and Pro Display XDR, Apple released a series of tools for developers, including ARKit 3, RealityKit, Core ML 3, SiriKit, and more.

What are the highlights of these developer tools? Leiphone.com takes you through them.

AR: More diverse capabilities

Looking back at the WWDC conferences of the past two years, Apple's emphasis on AR has only increased; at WWDC 2019, in addition to upgrading ARKit, Apple also announced a new advanced AR framework RealityKit and a new application Reality Composer that can easily create AR experiences.

ARKit was launched in 2017 and was Apple's first step into AR. In 2018, Apple upgraded it to ARKit 2, with two major updates: the new file format USDZ in cooperation with Pixar and multiplayer AR. Now, ARKit has been upgraded again, ushering in ARKit 3.

ARKit 3 introduces real-time people occlusion: it understands where people are relative to virtual objects and occludes AR content appropriately, so virtual objects can pass naturally behind people in the scene. Beyond that, it can track human movement through motion capture and use it as input for AR scenes. ARKit 3 also allows the front and rear cameras of a device to be used simultaneously, so the user's facial expressions can become part of the AR experience.

In addition to simultaneous use of both cameras, multi-face tracking and real-time collaborative sessions between multiple people are also highlights of ARKit 3, giving users a more varied AR experience.
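As a rough sketch of how these capabilities are switched on, the following assumes an existing `arView` hosting an AR session (an assumption for illustration, not Apple's demo code); people occlusion is enabled by adding a frame semantic to the world-tracking configuration:

```swift
import ARKit

// Minimal sketch: enabling ARKit 3 people occlusion.
// Requires an A12-or-later device running iOS 13.
let configuration = ARWorldTrackingConfiguration()

// Ask ARKit to segment people (with depth) so virtual content
// is correctly occluded by real people in the camera image.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// `arView` is assumed to be an existing ARView elsewhere in the app.
arView.session.run(configuration)
```

Motion capture, by contrast, uses a dedicated `ARBodyTrackingConfiguration` rather than a frame semantic on world tracking.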

ARKit 3 builds on the original ARKit; by contrast, RealityKit and Reality Composer, both announced for the first time this year, are entirely new.

RealityKit is a new high-level framework built specifically for augmented reality with features such as photorealistic rendering, camera effects, animation, physics, etc. It can handle networking for multiplayer AR applications, which means developers don’t need to be network engineers to develop shared AR experiences.

Reality Composer is a new authoring tool available not only on iOS but also on macOS. It lets developers build AR scenes visually and add animations such as movement, scaling, and rotation. Developers can also attach triggers: an action can fire when the user taps an object, approaches it, or activates some other trigger.
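A scene authored in Reality Composer can then be loaded into a RealityKit view in code. The sketch below assumes a bundled resource named "Scene" (a hypothetical name for illustration) and a storyboard-connected `ARView`:

```swift
import UIKit
import RealityKit

// Sketch: loading a Reality Composer scene into a RealityKit ARView.
// "Scene" is a hypothetical resource name bundled with the app.
class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Entity.load(named:) reads an entity hierarchy from the app bundle.
        if let sceneEntity = try? Entity.load(named: "Scene") {
            // Anchor the content to the first horizontal plane ARKit finds.
            let anchor = AnchorEntity(plane: .horizontal)
            anchor.addChild(sceneEntity)
            arView.scene.addAnchor(anchor)
        }
    }
}
```

RealityKit handles rendering, physics, and (for shared experiences) networking behind this scene graph, which is what lets developers avoid writing the networking layer themselves.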

Core ML 3: Support for advanced neural networks

At WWDC 2019, Apple introduced Core ML 3, the latest version of Apple's machine learning model framework.

Core ML is a high-performance machine learning framework that can be used on Apple products. It can help developers quickly integrate multiple machine learning models into apps. It was launched in 2017 and upgraded to Core ML 2 in 2018, with a 30% increase in processing speed.

Now Core ML has been upgraded to Core ML 3, which for the first time supports training machine learning models on the device itself. Because a model can be updated with user data without that data ever leaving the device, Core ML 3 helps models stay relevant to user behavior without compromising privacy.

Not only that, Core ML 3 also supports advanced neural networks and more than 100 layer types, which makes it perform better in image and sound recognition. In addition, it can seamlessly utilize the CPU, GPU, and neural engine to provide maximum performance and efficiency.
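A hedged sketch of what on-device training looks like with Core ML 3's `MLUpdateTask`: here `modelURL` (a compiled, updatable model in the app bundle) and `trainingBatch` (a batch of user examples) are assumptions for illustration:

```swift
import CoreML

// Sketch: personalizing a Core ML model on-device (Core ML 3 / iOS 13).
// `modelURL` and `trainingBatch` are assumed inputs, not real resources.
func updateModel(at modelURL: URL, with trainingBatch: MLBatchProvider) throws {
    let task = try MLUpdateTask(
        forModelAt: modelURL,          // compiled .mlmodelc marked as updatable
        trainingData: trainingBatch,   // user examples gathered on the device
        configuration: nil,
        completionHandler: { context in
            // Persist the personalized model; the training data never
            // leaves the device, which is the privacy point Apple stresses.
            let updatedURL = modelURL
                .deletingLastPathComponent()
                .appendingPathComponent("Updated.mlmodelc")
            try? context.model.write(to: updatedURL)
        }
    )
    task.resume()
}
```

Subsequent predictions would then load the updated model from `updatedURL` instead of the original bundled one.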

SiriKit: Better App Extensions

Siri is Apple's first AI application and one of the most popular voice-activated assistants in the world. At WWDC 2019, Siri was also upgraded.

One of the most noticeable changes is that iOS 13 adopts Neural Text-to-Speech (TTS) technology, making Siri's voice sound more natural. It also means Siri's speech is now generated entirely in software rather than assembled from recorded human voice samples.

The combination of Siri and AirPods is another highlight: when a text message arrives, users can have Siri read it aloud directly through AirPods, and they can reply just as quickly by voice. The Siri experience on HomePod has also improved considerably and become more personal; HomePod can now recognize different members of a household, and when a given user's phone is near the HomePod, it knows that user's favorite podcasts and music.

It is also worth noting that Apple highlighted SiriKit, first introduced in 2016, at this year's conference. SiriKit comprises the Intents and Intents UI frameworks, which developers can use to build app extensions; once an app adopts SiriKit, Siri can invoke its functionality through such an extension even when the app itself is not running.
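The shape of such an extension can be sketched as follows, using the send-message intent as an example (the choice of `INSendMessageIntent` is illustrative; a real app adopts whichever intents match its features):

```swift
import Intents

// Sketch: the entry point of a SiriKit Intents app extension.
// Siri calls into this extension even when the host app is not running.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    // Route each incoming intent to an object that can handle it.
    override func handler(for intent: INIntent) -> Any {
        return self
    }

    // Called once Siri has resolved the user's request into an intent.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would hand the message to its messaging backend here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```

The optional Intents UI framework then lets the app draw a custom interface inside the Siri conversation for that response.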

HomeKit: Strengthening privacy protection

HomeKit is Apple's smart home framework, introduced in 2014 alongside iOS 8; it is used to communicate with and control connected accessories in a user's home.

At this WWDC, Apple mainly emphasized HomeKit's protection of user privacy. For example, it introduced HomeKit Secure Video, which analyzes footage from home security cameras locally on a home hub device and only then encrypts it and uploads it to iCloud.

Also making their debut are HomeKit-enabled routers from a range of third-party partners, which can isolate individual accessories to protect the rest of the home network from attack.

The privacy protections offered by HomeKit routers go well beyond home security cameras: their automatic firewall wraps each connected HomeKit accessory, so even if one accessory is compromised, the intruder cannot reach other devices on the network, preventing the leakage of personal information.

SwiftUI: From a hundred lines of code to a dozen

At this year's Worldwide Developers Conference, Apple released SwiftUI, a framework based on the development language Swift.

Swift is a programming language that Apple released at WWDC 2014. It runs on macOS and iOS alongside Objective-C and is used to build applications for Apple platforms. Swift was designed with safety as its starting point, heading off many common classes of programming errors; Apple open-sourced the language in 2015.

SwiftUI builds on the Swift language, using a single set of tools and APIs to provide a unified UI framework across Apple's platforms, including iOS, iPadOS, watchOS, tvOS, and macOS. It automatically supports Dynamic Type, Dark Mode, localization, and accessibility.

SwiftUI also brings a new interactive developer experience: a preview on a simulated device updates immediately as the developer edits code. Developers can drag in graphical modules that insert code snippets, and drop-down menus make it easy to change parameters. With one click they can switch to the simulator, and the app can be pushed to real hardware almost immediately.

At the keynote, Apple's Craig Federighi also demonstrated how SwiftUI can shrink roughly a hundred lines of code down to about a dozen, greatly streamlining development.
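To give a feel for that compactness, here is a hedged sketch of the kind of UI SwiftUI makes possible in about a dozen lines: a scrollable, Dark Mode-aware, accessible list. The `Room` type and sample data are hypothetical names for illustration, not the code shown on stage:

```swift
import SwiftUI

// A hypothetical model type for the list rows.
struct Room: Identifiable {
    let id = UUID()
    let name: String
}

// A complete, scrollable list view in roughly a dozen lines.
// Dynamic Type, Dark Mode, and accessibility come for free.
struct RoomList: View {
    let rooms = [Room(name: "Living Room"), Room(name: "Kitchen")]

    var body: some View {
        List(rooms) { room in
            Text(room.name)
        }
    }
}
```

The declarative `body` describes *what* the UI is, and SwiftUI takes over the layout, cell reuse, and platform adaptation that would otherwise be written by hand.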

It's worth mentioning that SwiftUI also integrates with other frameworks such as ARKit, automatically adapts to right-to-left languages, and, of course, natively supports Dark Mode.

Summary

Judging from the development kits released at this conference, Apple is focused on two fronts: building out its technical ecosystem in AR and AI, and improving the cross-platform development experience across its operating systems, covering macOS, watchOS, iOS, tvOS, and iPadOS. This not only promises a better user experience but also ties the parts of Apple's operating system ecosystem more closely together, making it more attractive.

It can be said that through this WWDC, we have already vaguely seen the future of the entire Apple application ecosystem.

This article is reproduced from Leiphone.com. If you need to reprint it, please go to Leiphone.com official website to apply for authorization.
