What’s New for Devs in iOS 12 and Xcode 10?
Amidst all the fanfare of another WWDC, Apple introduced us to iOS 12. This is one of the most focused releases for both consumers and developers, with an emphasis on performance and optimization. Alongside that focus, iOS 12 also iterates on many of the frameworks we know and love, including the evolution of emoji (with Memoji), Siri Shortcuts, augmented reality, and machine learning.
For developers, Apple has focused on providing its community with a plethora of new tools and SDKs to help power the next generation of iOS apps. Developers also get a lot of improvements to Xcode, and Apple’s newest frameworks—ARKit, Core ML, and SiriKit—have received significant feature upgrades.
Machine learning has evolved rapidly since Core ML was first introduced in 2017, with Core ML 2 making it easy for mainstream developers to design, train, and test models. ARKit 2, Apple’s second iteration of its augmented reality library, has also improved markedly this year, thanks to three prominent new features:
- shared experiences enabling multiplayer AR experiences
- persistent experiences which allow for the saving and restoring of states
- 3D object detection, in addition to tracking of moving 2D images.
Apple has also improved notifications by adding the ability to group and prioritize them contextually, and it has created the new AuthenticationServices framework for integrating third-party password management apps with Password AutoFill.
Siri Shortcuts is Apple’s addition to SiriKit, allowing third-party developers to surface custom intelligent shortcuts on the lock screen and in Spotlight search, and letting users invoke those custom actions by voice.
Xcode 10 also introduces a whole range of improvements, from dark mode to multi-cursor editing, new source control visuals in the source editor, and an entirely new build system. It also supports parallel testing, drastically reducing the time tests take to complete.
Objectives of This Article
In this article, you will learn all about the new changes that are coming to iOS 12 for developers. We will be covering the following:
- machine learning with Core ML 2 and Natural Language framework
- augmented reality with ARKit 2
- interactive notifications
- Siri shortcuts with SiriKit
- Authentication Services and Password AutoFill
- Xcode 10
- other changes
Machine learning, introduced with Core ML in iOS 11, has not merely evolved this year but has taken massive strides toward the mainstream. Core ML 2, in keeping with the theme of performance and speed, is now 30% faster, with model sizes reduced by up to 75%. More significantly, Apple has drastically simplified the libraries and tools so that anyone can adopt them without a background in mathematics or machine learning.
Core ML showed a lot of potential when first introduced, but it wasn’t embraced by the broader iOS developer community. However, with the introduction of Create ML this year (a macOS framework), anyone can create Core ML models for use in their apps with greater ease. This new framework integrates with playgrounds in the new Xcode to allow you to visually interact with your model creation workflows in real time, merely by adding a few lines of Swift code.
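As a sketch of how little code this takes, a text classifier can be trained in an Xcode 10 macOS playground roughly as follows. The file paths, column names, and dataset here are hypothetical, not from any Apple sample:

```swift
import CreateML   // macOS-only framework, available in Xcode 10 playgrounds
import Foundation

// Hypothetical JSON file of labeled text, e.g. [{"text": "...", "label": "positive"}, ...]
let dataURL = URL(fileURLWithPath: "/path/to/reviews.json")
let data = try MLDataTable(contentsOf: dataURL)

// Hold back 20% of the rows for evaluation.
let (training, _) = data.randomSplit(by: 0.8)

// Train a simple text classifier on the hypothetical columns "text" and "label".
let classifier = try MLTextClassifier(trainingData: training,
                                      textColumn: "text",
                                      labelColumn: "label")

// Export a .mlmodel file ready to drop into an iOS project.
try classifier.write(to: URL(fileURLWithPath: "/path/to/Reviews.mlmodel"))
```

The exported `.mlmodel` can then be added to an iOS app target like any other Core ML model.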
In addition to supporting extensive deep learning with over 30 layer types, it also supports standard models such as tree ensembles, SVMs, and generalized linear models. … Because it’s built on top of low-level technologies like Metal and Accelerate, Core ML seamlessly takes advantage of the CPU and GPU to provide maximum performance and efficiency. (WWDC)
Many of Apple’s own products already take advantage of Core ML 2, including Siri and Camera, through computer vision and the brand-new Natural Language framework. The Natural Language framework, new to iOS 12, supports analyzing natural language text to deduce specific metadata, and it works alongside Create ML for training and deploying custom natural language processing models. For more information on creating and running Core ML models, consult Apple’s Core ML reference documentation.
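As a small illustration of the Natural Language framework, identifying the dominant language of a piece of text takes only a few lines (the sample string is arbitrary):

```swift
import NaturalLanguage

// NLLanguageRecognizer guesses the language of a string (iOS 12+).
let recognizer = NLLanguageRecognizer()
recognizer.processString("Bonjour tout le monde")

if let language = recognizer.dominantLanguage {
    print(language.rawValue)   // a BCP 47 code such as "fr"
}
```

The same framework also offers `NLTagger` for tokenization, part-of-speech tagging, and named entity recognition.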
ARKit, also first announced last year, gained prominence on stage at this year’s WWDC, with ARKit 2 immediately capturing consumer and engineering imaginations thanks to three prominent new features:
- shared AR experiences
- persistent AR experiences
- 3D object detection capabilities
Demonstrated live on stage at the WWDC keynote this year, shared AR experiences let you incorporate multiple users into the same augmented reality experience simultaneously. In other words, you can now quickly create multiplayer augmented reality experiences in which multiple users play together while third-person bystanders observe the same scene.
Persistent AR experiences let developers save a session in a given state and resume it later. If you are building a virtual LEGO sculpture, for instance, you can save and restore the state, and the virtual objects will persist in the same place without your having to start over.
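Under the hood, persistence is built on the new `ARWorldMap` type, which you can capture from a running session and archive to disk. A minimal sketch, with the save location as a placeholder:

```swift
import ARKit

// Sketch: capture the current session's world map and archive it for
// later restoration (iOS 12+). Error handling is kept minimal here.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}
```

To resume, you unarchive the map and assign it to `ARWorldTrackingConfiguration.initialWorldMap` before running the session again.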
Finally, detection and tracking capabilities have improved since ARKit 1.5, with the ability to fully track 2D images, including movable objects such as postcards, newspapers, or magazines. You can, for instance, hold a postcard of a location and move it around while ARKit continuously tracks and recognizes it. ARKit 2 also lets you detect known real-world 3D objects such as televisions, furniture, or sculptures.
Notifications have been enhanced significantly in iOS 12, with the ability to prioritize and group messages, a feature that users and developers have been anticipating for a long time.
The ability to group notifications contextually gives end users a more concise experience, one that has been available to Android users for years. A conversation thread in a messaging app, or a game’s notifications, can now form a single group rather than individual notifications taking up the entire screen, and a single swipe dismisses the whole group.
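Grouping hangs off a notification’s thread identifier: iOS 12 stacks notifications that share the same one. A brief sketch, with the identifiers and message content invented for illustration:

```swift
import UserNotifications

// Notifications sharing a threadIdentifier are stacked into one group
// on the iOS 12 lock screen and in Notification Center.
let content = UNMutableNotificationContent()
content.title = "New message"
content.body = "Alex: See you at 6?"
content.threadIdentifier = "chat-alex"   // hypothetical per-conversation ID

let request = UNNotificationRequest(identifier: UUID().uuidString,
                                    content: content,
                                    trigger: nil)   // deliver immediately
UNUserNotificationCenter.current().add(request)
```

Every message in the same conversation would reuse `"chat-alex"`, while a different conversation would get its own identifier and thus its own stack.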
Notifications are also prioritized by importance, so essential notifications such as messages, emails, or calendar alerts appear above social media notifications. Third-party developers can also classify a notification as a critical alert with elevated priority, subject to approval from the Apple review team.
SiriKit & Intents
SiriKit has also received some attention this WWDC with the addition of Siri Shortcuts. This feature intelligently suggests shortcuts right when they are needed, by learning and predicting user routines and behaviors. Through the new Shortcuts API, users can accomplish tasks from the lock screen or search screen by tapping a suggested shortcut, trigger them with Siri voice shortcuts, or create shortcuts of their own.
For example, a coffee shop’s suggestion could appear on the lock screen because the user habitually walks outside in the morning and orders a mint mojito beverage. Beyond the visual suggestion, the user could also say to Siri, “Order my favorite coffee beverage from Philz,” or create a custom shortcut that triggers that action.
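The simplest way for an app to feed this system is to donate an `NSUserActivity` each time the user performs the action. A sketch of that donation, where the activity type, titles, and phrase are all illustrative:

```swift
import UIKit
import Intents

// Sketch: donate a shortcut via NSUserActivity so iOS 12 can suggest it
// on the lock screen and in Search. Identifiers here are hypothetical.
func donateOrderShortcut(on viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.order-coffee")
    activity.title = "Order my favorite coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true          // new in iOS 12
    activity.suggestedInvocationPhrase = "Order my coffee"

    // Assigning the activity to the current view controller donates it
    // and keeps a strong reference while the screen is visible.
    viewController.userActivity = activity
}
```

For richer, parameterized shortcuts, apps can instead define a custom intent and donate an `INInteraction`.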
Apple has introduced a whole new AuthenticationServices framework that integrates password managers like LastPass or 1Password with the operating system’s Password AutoFill. Previously, users only had access to credentials stored in the iCloud Keychain, and reaching passwords stored in third-party apps like 1Password meant switching between both apps to copy the username and password.
Within the QuickType bar, users can now access credentials from third-party password management apps. Developers can also use the new ASWebAuthenticationSession class to share login state between an active session in Safari and their app. This works as a sort of password handoff.
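A minimal sketch of that flow, assuming a hypothetical authentication endpoint and callback URL scheme:

```swift
import AuthenticationServices

// Sketch: run a web-based login that shares Safari's session state (iOS 12+).
// The URL and callback scheme below are placeholders.
let authURL = URL(string: "https://example.com/auth")!

let session = ASWebAuthenticationSession(url: authURL,
                                         callbackURLScheme: "exampleapp") { callbackURL, error in
    guard let callbackURL = callbackURL else { return }
    // Extract the auth token from the callback URL's query items here.
    print(callbackURL)
}

// Keep a strong reference to the session for its whole lifetime,
// then start it to present the authentication sheet.
session.start()
```

Because the session shares cookies with Safari, a user already signed in on the web may not need to re-enter credentials at all.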
Apple has also empowered developers beyond new SDKs and frameworks, with notable improvements to Xcode and the Xcode build toolchain. The source editor, for example, has gained some great refactoring improvements, including multi-cursor editing so that you can rapidly edit multiple lines of code (e.g. across different methods) at the same time.
Xcode 9 first introduced integration with GitHub, from creating and opening GitHub repos from within the IDE to interacting with code more collaboratively. Xcode 10 extends beyond GitHub to work with two other popular repository vendors, GitLab and Bitbucket.
On the source editor side, changes in the local repository or upstream are now highlighted in the left-hand gutter, letting you quickly see edits to your code, changes not yet pushed, upstream changes others have made, and potential conflicts to resolve before committing. Finally, Xcode 10 generates SSH keys for you if needed and uploads them to your source control provider.
Moving to the toolchain, Xcode 10 introduces a whole new build system for developers, and in keeping with the theme of the conference, it gains performance and speed improvements. For example, developers testing their apps in Simulator can now leverage test parallelization, the ability to run multiple tests at the same time.
In addition to unit tests, developers can now run continuous integration tests across many different simulated device types, spawning clones of a single simulated device, so that test runs complete in a fraction of the time Xcode 9 would take. Developers can either take advantage of their Mac’s CPU cores to run tests in parallel or allocate another Mac on the network running Xcode Server to automate building and testing in parallel.
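From the command line, the same parallel testing can be switched on with `xcodebuild` flags. The project, scheme, and destination names below are placeholders:

```shell
# Run the test action with Xcode 10's parallel testing enabled,
# cloning the simulator across four workers.
xcodebuild test \
  -project MyApp.xcodeproj \
  -scheme MyApp \
  -destination 'platform=iOS Simulator,name=iPhone X' \
  -parallel-testing-enabled YES \
  -parallel-testing-worker-count 4
```

This is handy on CI machines, where the scheme’s in-IDE settings aren’t available.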
Apple now steers developers away from print() statements in favor of the new os_signpost API, which is far faster for debugging than print(). Developers can mark significant points throughout their code with signposts, which are then tracked in Instruments, appearing alongside other analysis events (such as CPU, memory, or network activity) and giving developers even greater insight when troubleshooting their apps.
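As a sketch of the signpost API, an interval of work can be bracketed like this (the subsystem, category, and interval name are made up for illustration):

```swift
import os.signpost

// Signposts show up as labeled intervals in Instruments (Xcode 10+).
let log = OSLog(subsystem: "com.example.app", category: "Networking")
let signpostID = OSSignpostID(log: log)

os_signpost(.begin, log: log, name: "Fetch Feed", signpostID: signpostID)
// ... perform the work being measured here ...
os_signpost(.end, log: log, name: "Fetch Feed", signpostID: signpostID)
```

Matching `.begin`/`.end` pairs with the same signpost ID let Instruments compute and chart the interval’s duration.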
Xcode 10 also includes templates to help developers build custom instruments with their own visualization and data analysis, which can be reused, shared with other team members, or published with your frameworks. There are also minor changes to Interface Builder, including moving the library content to a new overlay window instead of the bottom of the inspector, along with updated icons and layout.
iTunes Connect has been rebranded appropriately as App Store Connect, along with some enhancements to make the user interface more efficient yet user-friendly. Through the new web interface, developers can create product pages with screenshots and previews, toggle in-app purchases, and set up subscriptions, pre-orders, and introductory pricing.
Finally, TestFlight has received a prominent enhancement with a new feature called TestFlight Public Link, which makes the process of inviting people to test your app a whole lot easier. Where previously developers had to invite testers manually by email address, a cumbersome process, they can now create an “open invitation” through a public URL.
iOS 12 is a very focused release for both consumers and developers, with an emphasis on performance optimization. For developers, Xcode and the build and testing tools have been optimized with an emphasis on speed at the forefront, and Apple’s newest frameworks such as ARKit, Core ML and SiriKit have received significant feature upgrades.
In the space of two years, we are starting to see Apple’s foray into machine learning evolve and mature, with benefits rolling over into their other technology frameworks, such as SiriKit, ARKit, and Photos, enabling developers to create more intelligent user experiences. More importantly, machine learning has moved from being a niche library to one where any developer—regardless of machine learning experience—can train and implement models.
With WWDC serving as a prelude to Apple’s hardware announcements in September, it will be interesting to see how the developer community embraces ARKit in its second iteration, in the space of multiplayer gaming.