#18 iOS 14 – Privacy Upgrades for Users & Developers

iOS 14 is the newest iteration of Apple’s iPhone operating system. In this article we will focus on the changes we need to make in our apps to support iOS 14, and on its key new features.

First, let’s check which devices iOS 14 can run on.

iOS 14 supported devices

  • iPhone 11, iPhone 11 Pro, iPhone 11 Pro Max
  • iPhone XS, iPhone XS Max
  • iPhone XR, iPhone X
  • iPhone 8, iPhone 8 Plus
  • iPhone 7, iPhone 7 Plus
  • iPhone 6s, iPhone 6s Plus
  • iPhone SE (1st generation), iPhone SE (2nd generation)
  • iPod touch (7th generation)

Below is the software configuration required to develop and run iOS 14 apps.

Development requirements

Use Xcode 12 to build your apps for iOS 14. Xcode 12 requires an Intel-based Mac running macOS Catalina 10.15.4 or later. Xcode 12 includes Swift 5.3 and SDKs for iOS 14, iPadOS 14, tvOS 14, watchOS 7, and macOS Big Sur.

Now, let’s look at the main changes this release brings and what we need to take care of.

Changes in iOS 14

While not directly related to iOS 14, now is the time to finally remove UIWebView from your app and switch to WKWebView, because Apple stopped accepting new app submissions using UIWebView back in April.

Below are the changes you should make in your app to support iOS 14.

Data Privacy Changes

In iOS 14 you will be required to disclose what kinds of data your app collects and whether that data is used to track users. The information you provide will be shown on your app’s App Store page. You can provide it in App Store Connect when you submit a new version of your app for review.


Tracking Permission

With iOS 14, access to the user’s IDFA (identifier for advertisers) is now opt-in. You need to explicitly ask the user for consent via the App Tracking Transparency framework in the following scenarios:

  1. displaying targeted advertising
  2. sharing personal data with a third-party (e.g., location)
  3. using a third-party analytics SDK that collects data to serve advertising or measure advertising efficiency

If the data collected by your app never leaves the user’s device, or if it is used only for security purposes or fraud prevention, you do not need to ask for this permission.

If the user denies consent, the advertising identifier will return a string of zeros, rendering it useless. For more details on the implications this may have on your app, please check the article “Apple’s Changes to IDFA in iOS 14”.
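The consent request described above can be sketched as follows. This is a minimal example, assuming you have added an `NSUserTrackingUsageDescription` string to your Info.plist (the system uses it in the prompt, and omitting it will cause a crash):

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch: ask for tracking consent before reading the IDFA.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Consent granted: the real IDFA is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("IDFA: \(idfa.uuidString)")
        case .denied, .restricted, .notDetermined:
            // No consent: the identifier will be all zeros.
            print("Tracking not authorized")
        @unknown default:
            break
        }
    }
}
```

Call this only at a moment where the user understands why tracking is being requested; the system shows the prompt just once.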

Photo Library Changes

Starting with iOS 14, there are now two ways you can access the user’s photo library:

  • PHPicker (which replaces UIImagePickerController)
  • the limited photo picker

PHPicker offers an integrated search and allows for multiple selection. This is particularly useful as it forgoes the need to develop or integrate a custom photo picker.
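A minimal PHPicker sketch looks like this. Because the picker runs out of process and only hands back what the user selects, it needs no photo-library permission at all:

```swift
import PhotosUI
import UIKit

// Sketch of presenting PHPicker with multiple selection enabled.
class PhotoViewController: UIViewController, PHPickerViewControllerDelegate {

    func presentPicker() {
        var config = PHPickerConfiguration()
        config.selectionLimit = 0      // 0 means unlimited selection
        config.filter = .images        // only show photos
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController,
                didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        for result in results {
            result.itemProvider.loadObject(ofClass: UIImage.self) { image, _ in
                // Use the returned UIImage (dispatch UI work to the main queue).
            }
        }
    }
}
```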

With the limited picker, your app only sees a (very) limited subset of the user’s photo library. This becomes the standard way of asking for permission and is not opt-in for apps. It also affects apps built with older SDKs: the user will see a new dialog when your app tries to access the photo library.

Apps built with an older SDK will always see an “authorized” status, even if the user grants only limited access; this is for backward compatibility. Such apps keep working without changes, but the user experience may be impacted.

With the limited picker, the recommended practice is to request authorization every time you need access. If you ask only once and the user selects a few photos, every subsequent access to the photo library will see only those exact same photos, because access was granted to them alone.
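The new authorization call that surfaces the limited status is sketched below. Passing an access level is what opts your code into seeing `.limited`; the older `requestAuthorization(_:)` keeps returning “authorized” for compatibility:

```swift
import Photos

// Sketch of the iOS 14 photo-library authorization request.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .limited:
        print("User granted access to a subset of photos")
    case .authorized:
        print("Full library access")
    case .denied, .restricted, .notDetermined:
        print("No access")
    @unknown default:
        break
    }
}
```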

Approximate Location

Users can now limit an app’s location access to an approximate location, choosing between granting precise or approximate access. You can also prompt for one-time precise location if needed. The approximate area is a few miles in diameter and grows or shrinks dynamically depending on the context.

Your app will remain functional, but if you require precise location at all times you must convince users that it is necessary, and your app should handle the scenario gracefully if the user declines. Users can also change the precise location setting later in the Settings app.
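Checking the accuracy setting and asking for one-time precise location can be sketched like this. The purpose key `"NavigationPurposeKey"` is a placeholder; it must match an entry you define in Info.plist under `NSLocationTemporaryUsageDescriptionDictionary`:

```swift
import CoreLocation

// Sketch: detect reduced accuracy and request temporary full accuracy.
let manager = CLLocationManager()

if manager.accuracyAuthorization == .reducedAccuracy {
    manager.requestTemporaryFullAccuracyAuthorization(
        withPurposeKey: "NavigationPurposeKey"  // placeholder key
    ) { error in
        // Re-check accuracyAuthorization here; the user may still decline.
    }
}
```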

There are many more API-level changes; for the full list, visit the iOS release notes.

It’s time to check some cool iOS 14 key features 🙂

iOS 14 Key Features

App Clips

An app clip is a lightweight version of an app that offers users some flavour of its functionality. You can open app clips from a number of places like Safari, Maps, and Messages, or through QR codes and NFC tags. From an app clip, users can also download the full app from the App Store.

App clips are designed to perform a single task. They should be lightweight and fast, without needing the full app to be installed.

Your full app can have only one App Clip, and the full app must support all of the App Clip’s functionality. App Clips must be small (no more than 10 MB uncompressed) so they can launch instantly.

There are some limitations to ensure a fast launch experience, preserve resources, and protect user privacy. To create an App Clip, first review the technology available to App Clips and identify your app’s most important task to put in the App Clip. Then you can start with the following tasks:

  • Making changes to your app’s Xcode project and your code; for example, adding an App Clip target, sharing code between your App Clip and full app, and so on.
  • Associating your App Clip with your website to allow the system to verify your App Clip.
  • Creating your App Clip’s launch experience in App Store Connect and adding code to respond to invocations.
  • Creating App Clip Codes that offer the best experience for users to discover and launch your App Clip.

For more information on App Clip implementation you can check App Clip documentation.
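To illustrate the invocation step, here is a hedged SwiftUI sketch of an App Clip target receiving its invocation URL via `NSUserActivity` (the app and view names are hypothetical):

```swift
import SwiftUI

struct ContentView: View {
    var body: some View { Text("App Clip") }
}

// Sketch: an App Clip's invocation URL arrives as a web-browsing user activity.
@main
struct FoodOrderClip: App {   // hypothetical clip name
    var body: some Scene {
        WindowGroup {
            ContentView()
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL else { return }
                    // Parse the URL's query items to decide which single
                    // task this launch should present.
                    print("Invoked with \(url)")
                }
        }
    }
}
```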


Widgets

Widgets give you timely, at-a-glance information from your app right on the iOS Home screen. If you want to build your own widgets, note that they must be written in SwiftUI.

WidgetKit gives users ready access to content in your app by putting widgets on the iOS Home screen or macOS Notification Center. Your widgets stay up to date so users always have the latest information at a glance. When they need more details, your widget takes them directly to the appropriate place in your app.

To implement a widget, you add a widget extension to your app. You configure the widget with a timeline provider, and use SwiftUI views to display the widget’s content. The timeline provider tells WidgetKit when to update your widget’s content.

To implement widgets, check Creating a Widget Extension.
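The timeline-provider pattern described above can be sketched as a minimal widget that shows the current time and asks WidgetKit to refresh hourly (the widget kind and names are hypothetical):

```swift
import WidgetKit
import SwiftUI

struct SimpleEntry: TimelineEntry {
    let date: Date
}

// The provider tells WidgetKit what to show and when to refresh.
struct Provider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: Date())
    }
    func getSnapshot(in context: Context,
                     completion: @escaping (SimpleEntry) -> Void) {
        completion(SimpleEntry(date: Date()))
    }
    func getTimeline(in context: Context,
                     completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        let entry = SimpleEntry(date: Date())
        let next = Calendar.current.date(byAdding: .hour, value: 1,
                                         to: entry.date)!
        // Request a new timeline one hour from now.
        completion(Timeline(entries: [entry], policy: .after(next)))
    }
}

@main
struct ClockWidget: Widget {   // hypothetical widget
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ClockWidget", provider: Provider()) { entry in
            Text(entry.date, style: .time)   // SwiftUI view for the widget
        }
        .configurationDisplayName("Clock")
        .description("Shows the current time.")
    }
}
```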

Augmented Reality

ARKit 4 introduces Location Anchors, which leverage the higher-resolution data in the new Apple Maps to enable rear-camera AR experiences at specific geographic locations. You can anchor your AR creations at specific latitude, longitude, and altitude coordinates. Location Anchors require an iPhone XS, iPhone XS Max, iPhone XR, or later, and for now are available only in certain cities.
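Placing a Location Anchor can be sketched as below. The coordinates are made up for illustration, and because geo tracking works only on supported devices and in supported cities, availability has to be checked first:

```swift
import ARKit
import CoreLocation

// Hedged sketch: run geo tracking and drop an anchor at fixed coordinates.
func placeGeoAnchor(in session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else { return }

    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }   // not every city is covered yet
        session.run(ARGeoTrackingConfiguration())

        let coordinate = CLLocationCoordinate2D(latitude: 37.7955,
                                                longitude: -122.3937)
        // Omitting altitude lets ARKit estimate ground level.
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
    }
}
```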

A new Depth API lets you access even more precise distance and depth information. It is specific to devices equipped with the LiDAR Scanner (iPad Pro 11-inch (2nd generation), iPad Pro 12.9-inch (4th generation), iPhone 12 Pro, and iPhone 12 Pro Max).

Face Tracking extends to the front-facing camera on all devices equipped with a front-facing camera and the Apple Neural Engine (A12 Bionic and later).

For more, check the ARKit 4 features page and the ARKit framework documentation.

Machine Learning

Your machine learning apps gain new functionality, flexibility, and security with the updates in iOS 14.

Core ML adds model deployment with a dashboard for hosting and deploying models using CloudKit, so you can easily update your models without updating your app or hosting the models yourself. Core ML model encryption adds another layer of security for your models, handling the encryption process and key management for you. The Core ML converter supports direct conversion of PyTorch models to Core ML. For more information, see the Core ML documentation.
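Fetching a deployed model collection can be sketched as below. The collection identifier `"TextClassifiers"` and the model name `"sentiment"` are hypothetical; they would correspond to a deployment you configure in the Core ML model deployment dashboard:

```swift
import CoreML

// Hedged sketch: download a model collection deployed via CloudKit.
_ = MLModelCollection.beginAccessing(identifier: "TextClassifiers") { result in
    switch result {
    case .success(let collection):
        if let entry = collection.entries["sentiment"] {   // hypothetical model
            // Load the downloaded compiled model from its local URL.
            let model = try? MLModel(contentsOf: entry.modelURL)
            print("Deployed model loaded:", model != nil)
        }
    case .failure(let error):
        print("Model collection unavailable:", error)
    }
}
```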

The Create ML app’s new Style Transfer template stylizes photos and videos in real time, and the new Action Classification template classifies a single person’s actions in a video clip. For more information, see Create ML.


Siri Event Suggestions

You can use Siri Event Suggestions Markup to provide event details on a webpage and in email. Siri parses travel arrangements, movies, sporting events, live shows, restaurant reservations, and social events. Once parsed, Siri can suggest driving directions, a ride share to a scheduled event, or activating Do Not Disturb just before a show starts. To learn how to integrate your own events with Siri, see the Siri Event Suggestions Markup documentation.


iOS 14 is mostly focused on privacy. It is essential to make sure your app is privacy-oriented while still providing a unique experience to your customers. Developers who add support for new features (like widgets) early will likely get a healthy boost in engagement and be in a prime position.

Explore more @Teknonauts.com
