Earlier today, Apple announced iOS 7, the most dramatic redesign of iOS since the platform debuted in 2007. With it comes a completely different look and many new features. For developers, the first beta was released a little while ago. You’ll find the developer overview for that below (note: it’s incredibly long). For general information about the changes in iOS 7, visit Apple’s iOS 7 product page.
iOS 7 is a major update with compelling features for developers to incorporate into their apps. The UI has been completely redesigned. In addition, iOS 7 introduces a new animation system for creating 2D and 2.5D games. Multitasking enhancements, peer-to-peer connectivity, and many other important features make iOS 7 the most significant release since the first iPhone SDK.
This article summarizes the key developer-related features introduced in iOS 7. This version of the operating system runs on current iOS-based devices. In addition to describing the key new features, this article lists the documents that describe those features in more detail.
For late-breaking news and information about known issues, see iOS 7 Release Notes. For the complete list of new APIs added in iOS 7, see iOS 7.0 API Diffs.
User Interface Changes
iOS 7 includes many new features intended to help you create great user interfaces.
A New Design
The iOS 7 UI has been completely redesigned. Throughout the system, a sharpened focus on functionality and on the user’s content informs every aspect of design. Translucency, refined visual touches—and fluid, realistic motion—impart clarity, depth, and vitality to the user experience. Whether you’re creating a new app or updating an existing one, keep these qualities in mind as you work on the design.
Apps compiled against the iOS 7 SDK automatically receive the new appearance for any standard system views when the app is run on iOS 7. If you use Auto Layout to set the size and position of your views, those views are repositioned as needed. But there may still be additional work to do to make sure your interface has the appearance you want. Similarly, if you customize your app’s views, you may need to make changes to support the new appearance fully.
For guidance on how to design apps that take full advantage of the new look in iOS 7, see iOS 7 Design Resources.
UIKit Dynamics
Apps can now specify dynamic behaviors for UIView objects and for other objects that conform to the UIDynamicItem protocol. (Objects that conform to this protocol are called dynamic items.) Dynamic behaviors offer a way to improve the user experience of your app by incorporating real-world behavior and characteristics, such as gravity, into your app’s animations. UIKit supports the following types of dynamic behaviors:
- A UIAttachmentBehavior object specifies a connection between two dynamic items or between an item and a point. When one item (or point) moves, the attached item also moves. The connection is not completely static, though: an attachment behavior has damping and oscillation properties that determine how the behavior changes over time.
- A UICollisionBehavior object lets dynamic items participate in collisions with each other and with the behavior’s specified boundaries. The behavior also lets those items respond appropriately to collisions.
- A UIGravityBehavior object specifies a gravity vector for its dynamic items. Dynamic items accelerate in the vector’s direction until they collide with other appropriately configured items or with a boundary.
- A UIPushBehavior object specifies a continuous or instantaneous force vector for its dynamic items.
- A UISnapBehavior object specifies a snap point for a dynamic item. The item snaps to the point with a configured effect. For example, it can snap to the point as if it were attached to a spring.
Dynamic behaviors become active when you add them to an animator object, which is an instance of the UIDynamicAnimator class. The animator provides the context in which dynamic behaviors execute. A given dynamic item can have multiple behaviors, but all of those behaviors must be animated by the same animator object.
For information about the behaviors you can apply, see UIKit Framework Reference.
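As a rough sketch of how the pieces fit together (the view and property names here are placeholders, not from Apple's documentation), a view controller could make a view fall under gravity and collide with the screen edges like this:

```objc
// Assumes a view controller with a strong property holding the animator,
// since UIDynamicAnimator must be kept alive for the animation to run.
@property (nonatomic, strong) UIDynamicAnimator *animator;

- (void)viewDidLoad {
    [super viewDidLoad];
    UIView *box = [[UIView alloc] initWithFrame:CGRectMake(100, 0, 50, 50)];
    box.backgroundColor = [UIColor blueColor];
    [self.view addSubview:box];

    self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

    UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:@[box]];
    UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[box]];
    collision.translatesReferenceBoundsIntoBoundary = YES; // bounce off the reference view's edges

    [self.animator addBehavior:gravity];
    [self.animator addBehavior:collision];
}
```

Both behaviors are attached to the same animator, per the requirement above that all of a dynamic item's behaviors share one animator object.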
Text Kit
Text Kit is a full-featured, high-level framework for apps that need to handle text that has all the characteristics of fine typography. Text Kit can lay out styled text into paragraphs, columns, and pages; it easily flows text around arbitrary regions such as graphics; and it manages multiple fonts. Text Kit is integrated with all UIKit text-based controls to enable applications to create, edit, display, and store text more easily—and with less code than was previously possible in iOS.
Text Kit comprises new classes plus extensions to existing classes, including the following:
- The NSAttributedString class has been extended to support new attributes.
- The NSLayoutManager class generates glyphs and lays out text.
- The NSTextContainer class defines a region where text is laid out.
- The NSTextStorage class defines the fundamental interface for managing text-based content.
For more information about Text Kit, see Text Programming Guide for iOS.
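As a hedged illustration of how these classes relate (the string and sizes here are arbitrary), you might assemble a Text Kit stack by hand and attach it to a text view:

```objc
// Storage holds the styled text; the layout manager generates glyphs;
// the container defines the region the text is laid out into.
NSTextStorage *storage = [[NSTextStorage alloc] initWithString:@"Hello, Text Kit."];
NSLayoutManager *layoutManager = [[NSLayoutManager alloc] init];
[storage addLayoutManager:layoutManager];

NSTextContainer *container = [[NSTextContainer alloc] initWithSize:CGSizeMake(200, 400)];
[layoutManager addTextContainer:container];

// UITextView can be created directly on top of a custom container in iOS 7.
UITextView *textView = [[UITextView alloc] initWithFrame:CGRectMake(0, 0, 200, 400)
                                           textContainer:container];
[self.view addSubview:textView];
```

In most apps you never build this stack yourself; the UIKit text controls do it for you. Assembling it manually is useful when you need multiple containers (for example, multicolumn layout) sharing one layout manager.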
Multitasking Enhancements
iOS 7 supports the following new background execution modes for apps:
- Apps that regularly require new content can register with the system and be woken up or launched periodically to download that content in the background. To register, include the UIBackgroundModes key with the fetch value in your app’s Info.plist file and set the minimum time you want between fetch operations using the setMinimumBackgroundFetchInterval: method. You must also implement the application:performFetchWithCompletionHandler: method in your app delegate to perform any downloads.
- Apps that use push notifications to notify the user that new content is available can now use those notifications to initiate background download operations. To support this mode, include the UIBackgroundModes key with the remote-notification value in your app’s Info.plist file. Your app delegate must also implement the application:didReceiveRemoteNotification:fetchCompletionHandler: method.
Apps supporting either the fetch or remote-notification background modes may be launched or moved from the suspended to the background state at appropriate times. In the case of the fetch background mode, the system uses available information to determine the best time to launch or wake apps. For example, it does so when networking conditions are good or when the device is already awake. Apps supporting the remote-notification background mode may be woken up when a new push notification arrives but before that notification is delivered to the user. The app can use the interval to download new content and have that content ready to present to the user when the notification is subsequently delivered.
To handle the downloading of content in the background, apps should use the new NSURLSession class. This class improves on the existing NSURLConnection class by providing a simple, task-based interface for initiating and processing NSURLRequest objects. A single NSURLSession object can initiate multiple download and upload tasks, and through its delegate it can handle any authentication requests coming from the server.
For more information about the new background modes, see “App States and Multitasking” in iOS App Programming Guide.
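A minimal sketch of the fetch mode described above, combining the app delegate callback with NSURLSession (the URL is a placeholder and the parsing step is omitted):

```objc
// Assumes UIBackgroundModes in Info.plist includes the "fetch" value.
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Let the system pick the fetch cadence.
    [application setMinimumBackgroundFetchInterval:UIApplicationBackgroundFetchIntervalMinimum];
    return YES;
}

- (void)application:(UIApplication *)application
    performFetchWithCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {
    NSURL *url = [NSURL URLWithString:@"https://example.com/latest.json"]; // placeholder URL
    [[[NSURLSession sharedSession] dataTaskWithURL:url
        completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (error) {
            completionHandler(UIBackgroundFetchResultFailed);
        } else {
            // Parse and store the new content here before reporting success.
            completionHandler(UIBackgroundFetchResultNewData);
        }
    }] resume];
}
```

Calling the completion handler promptly matters: the system uses the result you report to tune how often your app is woken for future fetches.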
Games
iOS 7 includes enhanced support for games.
Sprite Kit Framework
The Sprite Kit framework (SpriteKit.framework) provides a hardware-accelerated animation system optimized for creating 2D and 2.5D games. Sprite Kit provides the infrastructure that most games need, including a graphics rendering and animation system, sound playback support, and a physics simulation engine. Using Sprite Kit frees you from creating these things yourself and lets you focus on the design of your content and the high-level interactions for that content.
Content in a Sprite Kit app is organized into scenes. A scene can include textured objects, video, path-based shapes, Core Image filters, and other special effects. Sprite Kit takes those objects and determines the most efficient way to render them onscreen. When it comes time to animate the content in your scenes, you can use Sprite Kit to specify explicit actions you want to perform or use the physics simulation engine to define physical behaviors (such as gravity, attraction, or repulsion) for your objects.
In addition to the Sprite Kit framework, there are Xcode tools for creating particle emitter effects and texture atlases. You can use the Xcode tools to manage app assets and update Sprite Kit scenes quickly.
For more information about how to use Sprite Kit, see Sprite Kit Programming Guide. To see an example of how to use Sprite Kit to build a working app, see code:Explained Adventure.
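As a hedged sketch of the scene model described above (the texture name is a hypothetical asset, and the scene size is arbitrary), a scene can mix explicit actions with the physics simulation:

```objc
// A minimal scene containing one sprite that falls under simulated gravity.
SKScene *scene = [SKScene sceneWithSize:CGSizeMake(320, 568)];

SKSpriteNode *ship = [SKSpriteNode spriteNodeWithImageNamed:@"ship"]; // placeholder asset
ship.position = CGPointMake(160, 400);
ship.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:ship.size]; // joins the physics simulation
[scene addChild:ship];

// An explicit action runs alongside the physics behavior.
[ship runAction:[SKAction rotateByAngle:M_PI duration:1.0]];

// Present the scene from an SKView (for example, the view controller's view).
SKView *skView = (SKView *)self.view;
[skView presentScene:scene];
```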
Game Controller Framework
The Game Controller framework (GameController.framework) lets you discover and configure Made-for-iPhone/iPod/iPad (MFi) game controller hardware in your app. Game controllers can be devices connected physically to an iOS device or connected wirelessly over Bluetooth. The Game Controller framework notifies your app when controllers become available and lets you specify which controller inputs are relevant to your app.
For more information about supporting game controllers, see Game Controller Programming Guide.
Game Center Improvements
Game Center includes the following improvements:
- Exchanges give players an opportunity to initiate actions with other players, even when it is not their turn. You can use this feature to implement simultaneous turns, player chats, and trading between players.
- The limit on per-app leaderboards has been raised from 25 to 100. You can also organize your leaderboards using a GKLeaderboardSet object, which increases the limit to 500.
- You can add conditions to challenges that define when the challenge has been met. For example, a challenge to beat a time in a driving game might stipulate that other players must use the same vehicle.
- The framework has improved its authentication support and added other features to prevent cheating.
For more information about how to use the new Game Center features, see Game Center Programming Guide. For information about the classes of the Game Kit framework, see Game Kit Framework Reference.
Maps
The Map Kit framework (MapKit.framework) includes numerous improvements and features for apps that use map-based information. Apps that use maps to display location-based information can now take full advantage of the 3D map support found in the Maps app, including controlling the viewing perspective programmatically. Map Kit includes other changes that enhance maps in your app.
- Overlays can be placed at different levels in the map content, so that they appear above or below other relevant data.
- You can apply an MKMapCamera object to a map to add position, tilt, and heading information to its appearance. Camera information imparts a 3D viewing experience onto your map like the one provided by the Maps app.
- The MKDirections class lets you ask for direction-related route information from Apple. You can use that route information to create overlays for display on your own maps.
- The MKGeodesicPolyline class lets you create a line-based overlay that follows the curvature of the earth.
- Apps can use the MKMapSnapshotter class to capture map-based images.
- The visual representation of overlays is now based on the MKOverlayRenderer class, which replaces overlay views and offers a simpler rendering approach.
- Apps can now supplement or replace a map’s existing tiles using the MKTileOverlay class.
For more information about the classes of the Map Kit framework, see Map Kit Framework Reference.
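As a rough sketch of the MKDirections flow mentioned above (the destination coordinate is arbitrary, and mapView stands in for an MKMapView you already have), requesting a route and drawing it as an overlay might look like this:

```objc
MKDirectionsRequest *request = [[MKDirectionsRequest alloc] init];
request.source = [MKMapItem mapItemForCurrentLocation];

// Hypothetical destination coordinate.
CLLocationCoordinate2D coord = CLLocationCoordinate2DMake(37.3318, -122.0312);
MKPlacemark *placemark = [[MKPlacemark alloc] initWithCoordinate:coord
                                               addressDictionary:nil];
request.destination = [[MKMapItem alloc] initWithPlacemark:placemark];

MKDirections *directions = [[MKDirections alloc] initWithRequest:request];
[directions calculateDirectionsWithCompletionHandler:
    ^(MKDirectionsResponse *response, NSError *error) {
    // Each route carries a polyline suitable for display as an overlay.
    for (MKRoute *route in response.routes) {
        [mapView addOverlay:route.polyline level:MKOverlayLevelAboveRoads];
    }
}];
```

Note the use of the new addOverlay:level: method, which places the route above the roads but below labels, per the overlay-level feature listed above.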
AirDrop
AirDrop lets users share photos, documents, URLs, and other kinds of data with nearby devices. AirDrop support is now built in to the existing UIActivityViewController class. This class displays different options for sharing the content that you specify. If you are not yet using this class, you should consider adding it to your interface.
For more information about using an activity view controller to share data, see UIActivityViewController Class Reference.
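Adopting the activity view controller is a few lines; in this sketch the URL and image are placeholder items:

```objc
NSURL *url = [NSURL URLWithString:@"https://example.com"]; // placeholder item
UIImage *photo = [UIImage imageNamed:@"photo"];            // hypothetical asset

// The system presents AirDrop alongside the other sharing activities
// appropriate for the item types you pass in.
UIActivityViewController *activityVC =
    [[UIActivityViewController alloc] initWithActivityItems:@[url, photo]
                                      applicationActivities:nil];
[self presentViewController:activityVC animated:YES completion:nil];
```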
Inter-App Audio
The Audio Unit framework (AudioUnit.framework) adds support for Inter-App Audio, which lets apps send MIDI commands and stream audio to other apps on the same device. For example, you might use this feature to record music from an app acting as an instrument, or to send audio to another app for processing. To vend your app’s audio data, publish an AURemoteIO instance as an audio component that is visible to other processes. To use audio features from another app, use the audio component discovery interfaces in iOS 7.
For information about the new interfaces, see the framework header files. For general information about the interfaces of this framework, see Audio Toolbox Framework Reference.
Peer-to-Peer Connectivity
The Multipeer Connectivity framework (MultipeerConnectivity.framework) supports the discovery of nearby devices and direct communication with those devices without requiring Internet connectivity. This framework makes it possible to create multipeer sessions easily and to support reliable in-order data transmission and real-time data transmission. With this framework, your app can communicate with nearby devices and seamlessly exchange data.
The framework provides programmatic and UI-based options for discovering and managing network services. Apps can integrate the MCPeerPickerViewController class into their UI to display a list of peer devices for the user to choose from. Alternatively, you can use the MCNearbyServiceBrowser class to look for and manage peer devices programmatically.
For more information about the interfaces of this framework, see Multipeer Connectivity Framework Reference.
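A hedged sketch of the programmatic path (the service type string is a placeholder, and the delegate conformances are assumed to be implemented elsewhere): one device advertises a service while another browses for it.

```objc
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[[UIDevice currentDevice] name]];
MCSession *session = [[MCSession alloc] initWithPeer:peerID];
session.delegate = self; // adopt MCSessionDelegate to receive data and state changes

// Advertise the service on one device...
MCNearbyServiceAdvertiser *advertiser =
    [[MCNearbyServiceAdvertiser alloc] initWithPeer:peerID
                                      discoveryInfo:nil
                                        serviceType:@"my-service"]; // placeholder type
advertiser.delegate = self;
[advertiser startAdvertisingPeer];

// ...and browse for it on another.
MCNearbyServiceBrowser *browser =
    [[MCNearbyServiceBrowser alloc] initWithPeer:peerID serviceType:@"my-service"];
browser.delegate = self;
[browser startBrowsingForPeers];
```

Once the browser invites a found peer into the session, data flows through the MCSession delegate callbacks with no Internet connection required.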
iOS 7 includes the following new frameworks:
- The Game Controller framework provides an interface for communicating with game-related hardware; see “Game Controller Framework.”
- The Sprite Kit framework provides support for sprite-based animations and graphics rendering; see “Sprite Kit Framework.”
- The Multipeer Connectivity framework provides peer-to-peer networking for apps; see “Peer-to-Peer Connectivity.”
- The Media Accessibility framework (MediaAccessibility.framework) manages the presentation of closed-caption content in your media files. This framework works in conjunction with new settings that let the user enable the display of closed captions.
- The Safari Services framework (SafariServices.framework) provides support for programmatically adding URLs to the user’s Safari reading list. For information about the class provided by this framework, see the framework header files.
Additional Framework Enhancements
In addition to the items discussed in the preceding sections, the following frameworks have significant enhancements. For a complete list of new interfaces, see iOS 7.0 API Diffs.
UIKit Framework
The UIKit framework (UIKit.framework) includes the following enhancements:
- All UI elements have been updated to present the new look associated with iOS 7.
- UIKit Dynamics lets you mimic real-world effects such as gravity in your animations; see “UIKit Dynamics.”
- Text Kit provides sophisticated text editing and display capabilities; see “Text Kit.”
- The UIView class defines the following additions:
  - A tintColor property applies a tint color that affects both the view and its subviews. For information on how to apply tint colors, see iOS 7 UI Transition Guide.
  - You can create keyframe-based animations using views. You can also make changes to your views and specifically prevent any animations from being performed.
- The UIViewController class defines the following additions:
  - View controller transitions can be customized, driven interactively, or replaced altogether with ones you designate.
  - View controllers can now specify their preferred status bar style and visibility. The system uses the provided information to manage the status bar style as new view controllers appear. You can also control how this behavior is applied using the UIViewControllerBasedStatusBarAppearance key in your app’s Info.plist file.
- The UIMotionEffect class defines the basic behavior for motion effects, which are objects that define how a view responds to device-based motion.
- Collection views support UIKit Dynamics. Using this support, you can apply behavior objects to layout attributes to animate the corresponding items in the collection.
- The UIImage class supports retrieving images stored in image asset catalogs, which are a way to manage and optimize assets that have multiple sizes and resolutions. You create image asset catalogs in Xcode.
- There is a new API through which a UIScreen object creates a view that you can use to present your app’s content.
- The UIKeyCommand class wraps keyboard events received from an external hardware keyboard. These events are delivered to the app’s responder chain for processing.
- A UIFontDescriptor object describes a font using a dictionary of attributes. Use font descriptors to interoperate with other platforms.
- The UIFont and UIFontDescriptor classes support dynamic text sizing, which improves legibility for text in apps. With this feature, the user controls the desired font size that all apps in the system should use.
- The UIActivity class now supports new activity types, including activities for sending items via AirDrop, adding items to a Safari reading list, and posting content to Flickr, Tencent Weibo, and Vimeo.
- The UIApplicationDelegate protocol adds methods for handling background fetch behaviors.
- UIKit adds support for running in a guided-access mode, which allows an app to lock itself to prevent modification by the user. This mode is intended for institutions such as schools where users bring their own devices but need to run apps provided by the institution.
- State restoration now allows the saving and restoration of any object. Objects adopting the UIStateRestoring protocol can write out state information when the app moves to the background and have that state restored during subsequent launches.
- Table views now support estimating the height of rows and other elements, which improves scrolling performance.
For information about the classes of this framework, see UIKit Framework Reference.
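The keyframe-based view animations mentioned above can be sketched as follows (someView is a hypothetical view, and the timings are arbitrary); each keyframe's start time and duration are expressed as fractions of the overall duration:

```objc
[UIView animateKeyframesWithDuration:2.0
                               delay:0.0
                             options:UIViewKeyframeAnimationOptionCalculationModeLinear
                          animations:^{
    // First half: move the view.
    [UIView addKeyframeWithRelativeStartTime:0.0 relativeDuration:0.5 animations:^{
        someView.center = CGPointMake(200, 100); // placeholder destination
    }];
    // Second half: fade it out.
    [UIView addKeyframeWithRelativeStartTime:0.5 relativeDuration:0.5 animations:^{
        someView.alpha = 0.0;
    }];
} completion:nil];
```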
Store Kit Framework
The Store Kit framework (StoreKit.framework) has migrated to a new receipt system that developers can use to verify in-app purchases on the device itself. You can also use it to verify the app purchase receipt on the server.
For more information about how to use this new receipt system, see Receipt Validation Programming Guide.
Security Framework
The Security framework (Security.framework) adds support for syncing passwords between a user’s devices via iCloud. Apps can mark their keychain items for iCloud syncing via a new keychain attribute (kSecAttrSynchronizable).
For more information about this attribute, see the framework header files. For general information about the keychain, see the Keychain Services Programming Guide.
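As a sketch of how an item might opt in to syncing (the service and account strings are placeholders), you add the attribute to the dictionary passed to SecItemAdd:

```objc
NSDictionary *item = @{
    (__bridge id)kSecClass:          (__bridge id)kSecClassGenericPassword,
    (__bridge id)kSecAttrService:    @"com.example.myapp",  // placeholder service
    (__bridge id)kSecAttrAccount:    @"user@example.com",   // placeholder account
    (__bridge id)kSecValueData:      [@"secret" dataUsingEncoding:NSUTF8StringEncoding],
    (__bridge id)kSecAttrSynchronizable: @YES  // sync this item via iCloud Keychain
};
OSStatus status = SecItemAdd((__bridge CFDictionaryRef)item, NULL);
// Check status against errSecSuccess before assuming the item was stored.
```

Remember that queries for synchronizable items must also include kSecAttrSynchronizable, or the items will not be found.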
Pass Kit Framework
The Pass Kit framework (PassKit.framework) includes new APIs for adding multiple passes at once, along with these additions to the pass file format:
- New keys specify the expiration date for a pass.
- You can specify that a pass is relevant only when it is in the vicinity of specific Bluetooth beacons.
- New attributes control how a pass is displayed. You can group passes together, display links with custom text on the back of a pass, and control how time values are displayed on the pass.
- You can now associate extra data with a pass. This data is available to your app but not displayed to the user.
- You can designate which data detectors to apply to the fields of your passes.
For information about how to use Pass Kit in your app, see Passbook Programming Guide. For information about the pass file format, see Passbook Package Format Reference.
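The new batch-add API mentioned above can be sketched like this (passes stands in for an array of PKPass objects your app has already built from downloaded .pkpass data):

```objc
if ([PKPassLibrary isPassLibraryAvailable]) {
    PKPassLibrary *library = [[PKPassLibrary alloc] init];
    // passes: a hypothetical NSArray of PKPass objects.
    [library addPasses:passes
        withCompletionHandler:^(PKPassLibraryAddPassesStatus status) {
        if (status == PKPassLibraryShouldReviewPasses) {
            // Present a PKAddPassesViewController so the user can
            // review the passes before they are added.
        }
    }];
}
```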
OpenGL ES includes the following new extensions:
- The EXT_sRGB extension adds support for sRGB framebuffer operations.
- The GL_APPLE_pvrtc_sRGB extension adds support for sRGB texture data compressed in the PVRTC texture format.
- The GL_APPLE_draw_instanced and GL_APPLE_instanced_arrays extensions can improve rendering performance when your app draws multiple instances of the same object. You use a single call to draw instances of the same object. You add variation to each instance by specifying how fast each vertex attribute advances or by referencing an ID for each instance in your shader.
As always, check for the existence of an extension before using it in your app.
Also, textures can now be accessed in vertex shaders; query the value of the MAX_VERTEX_TEXTURE_IMAGE_UNITS attribute to determine the exact number of textures you can access.
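The extension check advised above can be done against the extension string (this simple sketch assumes a current EAGL context; a naive substring match can produce false positives when one extension name is a prefix of another, so production code should tokenize the string):

```objc
// C-style helper, usable from any Objective-C file that imports
// <OpenGLES/ES2/gl.h>. Requires a current OpenGL ES context.
static BOOL HasExtension(const char *name) {
    const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
    return extensions != NULL && strstr(extensions, name) != NULL;
}

// Usage: gate sRGB framebuffer operations on the extension's presence.
if (HasExtension("GL_EXT_sRGB")) {
    // Safe to use sRGB framebuffer operations here.
}
```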
Message UI Framework
In the Message UI framework, the MFMessageComposeViewController class adds support for attaching files to messages. For information about the new interfaces, see the framework header files.
For information about the classes of this framework, see Message UI Framework Reference.
Media Player Framework
In the Media Player framework, the MPVolumeView class provides support for determining whether wireless routes such as AirPlay and Bluetooth are available for the user to select. You can also determine whether one of these wireless routes is currently active. For information about the new interfaces, see the framework header files. For information about the classes of the Media Player framework, see Media Player Framework Reference.
Map Kit Framework
The Map Kit framework (MapKit.framework) includes changes that are described in “Maps.”
For information about the classes of this framework, see Map Kit Framework Reference.
Image I/O Framework
The Image I/O framework (ImageIO.framework) now has interfaces for getting and setting image metadata.
For information about the new interfaces, see the framework header files. For information about the classes of this framework, see Image I/O Reference Collection.
iAd Framework
The iAd framework (iAd.framework) includes two extensions to other frameworks that make it easier to incorporate ads into your app’s content:
- The framework introduces new methods on the MPMoviePlayerController class that let you run ads before a movie.
- The framework extends the UIViewController class to make it easier to create ad-supported content. You can now configure your view controllers to display ads before displaying the actual content they manage.
For information about the new interfaces, see the framework header files. For information about the classes of this framework, see iAd Framework Reference.
Game Kit Framework
The Game Kit framework (GameKit.framework) includes numerous changes, which are described in “Game Center Improvements.”
For information about the classes of this framework, see Game Kit Framework Reference.
Foundation Framework
The Foundation framework (Foundation.framework) includes the following enhancements:
- The NSURLSession class is a new class for managing the acquisition of network-based resources in the background. It serves as a replacement for the NSURLConnection class and its delegate; it also replaces the NSURLDownload class and its delegate.
- The NSURLComponents class is a new class for parsing the components of a URL. This class supports the URI standard (RFC 3986/STD 66) for parsing URLs.
- The NSNetService and NSNetServiceBrowser classes support peer-to-peer discovery over Bluetooth and Wi-Fi.
- The NSURLCredential and NSURLCredentialStorage classes let you create credentials with a synchronizable policy and provide the option of removing credentials with a synchronizable policy from iCloud.
- The NSHTTPCookie and NSHTTPCookieStorage classes now support the asynchronous processing of storage requests.
- The NSCalendar class supports new calendar types.
For information about the new interfaces, see the framework header files and the Foundation release notes. For general information about the classes of this framework, see Foundation Framework Reference.
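The new NSURLComponents class mentioned above breaks a URL into individually editable pieces; a brief sketch (the URL is arbitrary):

```objc
NSURLComponents *components =
    [NSURLComponents componentsWithString:@"https://example.com/search?q=ios7"];
NSLog(@"host = %@", components.host);    // example.com
NSLog(@"path = %@", components.path);    // /search
NSLog(@"query = %@", components.query);  // q=ios7

// Components can be edited and reassembled into a URL.
components.query = @"q=textkit";
NSURL *url = [components URL];
```

This is safer than string manipulation on absoluteString, because each component is parsed and re-encoded according to the RFC 3986 rules.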
Core Telephony Framework
The Core Telephony framework (CoreTelephony.framework) lets you get information about the type of radio technology in use by the device, the current signal strength, and the cell ID serving the device. Apps developed in conjunction with a carrier can also authenticate their apps against a particular subscriber for that carrier.
For information about the new interfaces, see the framework header files. For general information about the classes of the Core Telephony framework, see Core Telephony Framework Reference.
Core Motion Framework
The Core Motion framework (CoreMotion.framework) adds support for step counting and motion tracking. With step counting, the framework detects movements that correspond to user motion and uses that information to report the number of steps to your app. Because the system detects the motion, it can continue to gather step data even when your app is not running. The framework can also distinguish different types of motion, including travel by walking, by running, or by automobile. Navigation apps might use that data to change the type of directions they give to users.
For information about the new interfaces, see the framework header files. For general information about the classes of this framework, see Core Motion Framework Reference.
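The step-counting support described above is exposed through the CMStepCounter class; a hedged sketch (the update cadence of 10 steps is arbitrary):

```objc
// Step counting requires the motion coprocessor, so check availability first.
if ([CMStepCounter isStepCountingAvailable]) {
    CMStepCounter *stepCounter = [[CMStepCounter alloc] init];
    [stepCounter startStepCountingUpdatesToQueue:[NSOperationQueue mainQueue]
                                        updateOn:10 // deliver updates every 10 steps
                                     withHandler:^(NSInteger numberOfSteps,
                                                   NSDate *timestamp,
                                                   NSError *error) {
        if (!error) {
            NSLog(@"Steps so far: %ld", (long)numberOfSteps);
        }
    }];
}
```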
Core Location Framework
The Core Location framework (CoreLocation.framework) supports ranging using Bluetooth devices. Ranging lets you determine the relative range of nearby Bluetooth devices and respond appropriately. For example, a museum might place Bluetooth beacons in its galleries and provide visitors with an app that displays information as the user enters and exits those galleries. The framework also supports deferring the delivery of location updates until a specific time has elapsed or the user has moved a minimum distance.
For general information about the classes of this framework, see Core Location Framework Reference.
Core Foundation Framework
The Core Foundation framework (CoreFoundation.framework) now supports scheduling stream objects on dispatch queues. For information about the new interfaces, see the framework header files.
For general information about the interfaces of this framework, see Core Foundation Framework Reference.
Core Bluetooth Framework
The Core Bluetooth framework (CoreBluetooth.framework) includes the following enhancements:
- The framework supports saving state information for central and peripheral objects and restoring that state at app launch time. You can use this feature to support long-term actions involving Bluetooth devices.
- The central and peripheral classes now use an NSUUID object to store unique identifiers.
- You can now retrieve peripheral objects from a central manager synchronously.
For information about the new interfaces, see the framework header files. For general information about the classes of this framework, see Core Bluetooth Framework Reference.
AV Foundation Framework
The AV Foundation framework (AVFoundation.framework) includes the following enhancements:
- The AVAudioSession class supports the following new behaviors:
  - Support for selecting the preferred audio input, including audio from built-in microphones
  - Support for multichannel input and output
- The AVVideoCompositing protocol and related classes let you support custom video compositors.
- The AVSpeechSynthesizer and related classes provide speech synthesis capabilities.
- The capture classes add support and interfaces for the following features:
- Discovery of a camera’s supported formats
- Support for 60 fps recording
- Video zoom (true and digital) in recordings and video preview, including custom ramping
- Real-time discovery of machine-readable barcode metadata
- Autofocus range restriction
- Access to the clocks used during capture
- There are new metadata key spaces for supported ISO formats such as MPEG-4 and 3GPP, and improved support for filtering metadata items when copying those items from source assets to output files using the AVAssetExportSession class.
- The AVAssetWriter class provides assistance in formulating output settings, and there are new level constants for H.264 encoding.
- The AVPlayerLayer class adds the videoRect property, which you can use to get the size and position of the video image.
- The AVPlayerItem class supports the following changes:
  - Asset properties can be loaded automatically when AVPlayerItem objects are prepared for playback.
  - When you link your app against the iOS 7 SDK, the behavior when getting the values of player item properties (such as the presentationSize property) is different than in previous versions of iOS. The properties of this class now return a default value and no longer block your app if the AVPlayerItem object is not yet ready to play. As soon as the player item’s status changes to AVPlayerItemStatusReadyToPlay, the getters reflect the actual values of the underlying media resource. If you use key-value observing to monitor changes to the properties, your observers are notified as soon as changes are available.
- The AVPlayerItemLegibleOutput class can process timed text from media files.
- The AVAssetResourceLoadingDelegate protocol now supports loading of arbitrary ranges of bytes from a media resource.
For information about the new interfaces, see the framework header files. For general information about the classes of this framework, see AV Foundation Framework Reference.
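The new speech synthesis support mentioned above takes only a few lines; in this sketch the spoken string and rate are arbitrary choices:

```objc
AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];

AVSpeechUtterance *utterance =
    [AVSpeechUtterance speechUtteranceWithString:@"Hello from iOS 7."];
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];
utterance.rate = 0.3f; // slower than the default speaking rate

// Utterances are queued; subsequent calls speak after the current one finishes.
[synthesizer speakUtterance:utterance];
```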
Accelerate Framework
The Accelerate framework (Accelerate.framework) includes the following enhancements:
- Improved support for manipulating Core Graphics data types
- Support for working with grayscale images of 1, 2, or 4 bits per pixel
- New routines for converting images between different formats and transforming image contents
- Support for biquad (IIR) operations
For information about the new interfaces, see the framework header files. For general information about the functions and types of this framework, see Accelerate Framework Reference.
Feel free to follow Brian and Gadget Unit on Twitter.