# AVFoundation iOS xcode26.0 b1
# AVFoundation.framework
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h 2025-04-19 05:01:52
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h 2025-05-24 01:34:50
@@ -19,38 +19,21 @@
#import <CoreGraphics/CGAffineTransform.h>
#import <CoreMedia/CMTime.h>
+#import <UniformTypeIdentifiers/UniformTypeIdentifiers.h>
#pragma mark --- AVAsset ---
-/*!
- @class AVAsset
- @abstract
- An AVAsset is an abstract class that defines AVFoundation's model for timed audiovisual media.
-
- Each asset contains a collection of tracks that are intended to be presented or processed together, each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles.
-
- @discussion
- AVAssets are often instantiated via its concrete subclass AVURLAsset with NSURLs that refer to audiovisual media resources, such as streams (including HTTP live streams), QuickTime movie files, MP3 files, and files of other types.
-
- They can also be instantiated using other concrete subclasses that extend the basic model for audiovisual media in useful ways, as AVComposition does for temporal editing.
-
- Properties of assets as a whole are defined by AVAsset. Additionally, references to instances of AVAssetTracks representing tracks of the collection can be obtained, so that each of these can be examined independently.
-
- Because of the nature of timed audiovisual media, upon successful initialization of an AVAsset some or all of the values for its keys may not be immediately available. The value of any key can be requested at any time, and AVAsset will always return its value synchronously, although it may have to block the calling thread in order to do so.
-
- In order to avoid blocking, clients can register their interest in particular keys and to become notified when their values become available. For further details, see AVAsynchronousKeyValueLoading.h. For clients who want to examine a subset of the tracks, metadata, and other parts of the asset, asynchronous methods like -loadTracksWithMediaType:completionHandler: can be used to load this information without blocking. When using these asynchronous methods, it is not necessary to load the associated property beforehand. Swift clients can also use the load(:) method to load properties in a type safe manner.
-
- On platforms other than macOS, it is particularly important to avoid blocking. To preserve responsiveness, a synchronous request that blocks for too long (eg, a property request on an asset on a slow HTTP server) may lead to media services being reset.
-
- To play an instance of AVAsset, initialize an instance of AVPlayerItem with it, use the AVPlayerItem to set up its presentation state (such as whether only a limited timeRange of the asset should be played, etc.), and provide the AVPlayerItem to an AVPlayer according to whether the items is to be played by itself or together with a collection of other items. Full details available in AVPlayerItem.h and AVPlayer.h.
-
- AVAssets can also be inserted into AVMutableCompositions in order to assemble audiovisual constructs from one or more source assets.
-
-*/
-
NS_ASSUME_NONNULL_BEGIN
@class AVAssetTrack;
+#if AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER
+@class AVURLAssetTrack;
+#else // ! AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER
+typedef AVAssetTrack AVURLAssetTrack;
+#if __swift__
+typedef AVAssetTrack *AVURLAssetTrackRef NS_SWIFT_NAME(AVURLAssetTrack);
+#endif // __swift__
+#endif // AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER
@class AVFragmentedAssetTrack;
@class AVMetadataItem;
@class AVMediaSelection;
@@ -58,6 +41,25 @@
@class AVDisplayCriteria;
@class AVAssetInternal;
+/// An AVAsset is an abstract class that defines AVFoundation's model for timed audiovisual media.
+///
+/// Each asset contains a collection of tracks that are intended to be presented or processed together, each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles.
+///
+/// AVAssets are often instantiated via its concrete subclass AVURLAsset with NSURLs that refer to audiovisual media resources, such as streams (including HTTP live streams), QuickTime movie files, MP3 files, and files of other types.
+///
+/// They can also be instantiated using other concrete subclasses that extend the basic model for audiovisual media in useful ways, as AVComposition does for temporal editing.
+///
+/// Properties of assets as a whole are defined by AVAsset. Additionally, references to instances of AVAssetTracks representing tracks of the collection can be obtained, so that each of these can be examined independently.
+///
+/// Because of the nature of timed audiovisual media, upon successful initialization of an AVAsset some or all of the values for its keys may not be immediately available. The value of any key can be requested at any time, and AVAsset will always return its value synchronously, although it may have to block the calling thread in order to do so.
+///
+/// In order to avoid blocking, clients can register their interest in particular keys and to become notified when their values become available. For further details, see AVAsynchronousKeyValueLoading.h. For clients who want to examine a subset of the tracks, metadata, and other parts of the asset, asynchronous methods like -loadTracksWithMediaType:completionHandler: can be used to load this information without blocking. When using these asynchronous methods, it is not necessary to load the associated property beforehand. Swift clients can also use the load(:) method to load properties in a type safe manner.
+///
+/// On platforms other than macOS, it is particularly important to avoid blocking. To preserve responsiveness, a synchronous request that blocks for too long (eg, a property request on an asset on a slow HTTP server) may lead to media services being reset.
+///
+/// To play an instance of AVAsset, initialize an instance of AVPlayerItem with it, use the AVPlayerItem to set up its presentation state (such as whether only a limited timeRange of the asset should be played, etc.), and provide the AVPlayerItem to an AVPlayer according to whether the items is to be played by itself or together with a collection of other items. Full details available in AVPlayerItem.h and AVPlayer.h.
+///
+/// AVAssets can also be inserted into AVMutableCompositions in order to assemble audiovisual constructs from one or more source assets.
API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0), watchos(1.0), visionos(1.0))
@interface AVAsset : NSObject <NSCopying, AVAsynchronousKeyValueLoading>
{
@@ -65,40 +67,35 @@
AVAssetInternal *_asset;
}
-/*!
- @method assetWithURL:
- @abstract Returns an instance of AVAsset for inspection of a media resource.
- @param URL
- An instance of NSURL that references a media resource.
- @result An instance of AVAsset.
- @discussion Returns a newly allocated instance of a subclass of AVAsset initialized with the specified URL.
-*/
+/// Returns an instance of AVAsset for inspection of a media resource.
+///
+/// Returns a newly allocated instance of a subclass of AVAsset initialized with the specified URL.
+///
+/// - Parameter URL: An instance of NSURL that references a media resource.
+///
+/// - Returns: An instance of AVAsset.
+ (instancetype)assetWithURL:(NSURL *)URL AVF_DEPRECATED_FOR_SWIFT_ONLY("Use AVURLAsset(url:) instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0), visionos(1.0, 2.0));
-/* Indicates the duration of the asset. If @"providesPreciseDurationAndTiming" is NO, a best-available estimate of the duration is returned. The degree of precision preferred for timing-related properties can be set at initialization time for assets initialized with URLs. See AVURLAssetPreferPreciseDurationAndTimingKey for AVURLAsset below.
-*/
+/// Indicates the duration of the asset.
+///
+/// If @"providesPreciseDurationAndTiming" is NO, a best-available estimate of the duration is returned. The degree of precision preferred for timing-related properties can be set at initialization time for assets initialized with URLs
+///
+/// - Seealso: AVURLAssetPreferPreciseDurationAndTimingKey for AVURLAsset below.
@property (nonatomic, readonly) CMTime duration AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.duration) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
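For reference, a minimal Objective-C sketch of the loading pattern the comments above describe: the asset is created with the precise-timing option and `duration` is read only after its key has loaded. The file path is a placeholder.

```objc
#import <AVFoundation/AVFoundation.h>

static void InspectDuration(void) {
    // Placeholder URL; substitute a real media resource.
    NSURL *fileURL = [NSURL fileURLWithPath:@"/path/to/movie.mov"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL
                                            options:@{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES }];

    // Register interest in the key instead of reading it synchronously,
    // which could block the calling thread (see AVAsynchronousKeyValueLoading.h).
    [asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
        NSError *error = nil;
        if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
            NSLog(@"Duration: %.2f seconds", CMTimeGetSeconds(asset.duration));
        } else {
            NSLog(@"Duration could not be loaded: %@", error);
        }
    }];
}
```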
-/* indicates the natural rate at which the asset is to be played; often but not always 1.0
-*/
+/// Indicates the natural rate at which the asset is to be played; often but not always 1.0
@property (nonatomic, readonly) float preferredRate AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.preferredRate) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the preferred volume at which the audible media of an asset is to be played; often but not always 1.0
-*/
+/// Indicates the preferred volume at which the audible media of an asset is to be played; often but not always 1.0
@property (nonatomic, readonly) float preferredVolume AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.preferredVolume) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the preferred transform to apply to the visual content of the asset for presentation or processing; the value is often but not always the identity transform
-*/
+/// Indicates the preferred transform to apply to the visual content of the asset for presentation or processing; the value is often but not always the identity transform
@property (nonatomic, readonly) CGAffineTransform preferredTransform AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.preferredTransform) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* The following property is deprecated. Instead, use the naturalSize and preferredTransform, as appropriate, of the receiver's video tracks. See -tracksWithMediaType: below.
-*/
+/// The following property is deprecated. Instead, use the naturalSize and preferredTransform, as appropriate, of the receiver's video tracks. See -tracksWithMediaType: below.
@property (nonatomic, readonly) CGSize naturalSize API_DEPRECATED("Use the naturalSize and preferredTransform, as appropriate, of the receiver's video tracks. See -tracksWithMediaType:", macos(10.7, 10.8), ios(4.0, 5.0), tvos(9.0, 9.0)) API_UNAVAILABLE(watchos, visionos);
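A sketch of the replacement the deprecation message points to: derive the presentation size from a video track's naturalSize and preferredTransform. The `videoTrack` parameter is assumed to have those keys already loaded.

```objc
#import <AVFoundation/AVFoundation.h>

// `videoTrack` is assumed to be an AVAssetTrack of media type video whose
// naturalSize and preferredTransform keys have already been loaded.
static CGSize PresentationSizeForTrack(AVAssetTrack *videoTrack) {
    CGSize transformed = CGSizeApplyAffineTransform(videoTrack.naturalSize, videoTrack.preferredTransform);
    // A 90/270 degree rotation can yield negative components; take absolute values.
    return CGSizeMake(fabs(transformed.width), fabs(transformed.height));
}
```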
-/*!
- @property preferredDisplayCriteria
- @abstract Guides to a display mode that is optimal for playing this particular asset.
- */
+/// Guides to a display mode that is optimal for playing this particular asset.
@property (nonatomic, readonly) AVDisplayCriteria *preferredDisplayCriteria
#if __swift__
API_DEPRECATED("Use load(.preferredDisplayCriteria) instead", tvos(11.2, 16.0)) API_UNAVAILABLE(ios, visionos) API_UNAVAILABLE(macos, watchos);
@@ -106,11 +103,9 @@
API_AVAILABLE(tvos(11.2), visionos(1.0)) API_UNAVAILABLE(ios) API_UNAVAILABLE(macos, watchos);
#endif
-/*!
- @property minimumTimeOffsetFromLive
- @abstract Indicates how close to the latest content in a live stream playback can be sustained.
- @discussion For non-live assets this value is kCMTimeInvalid.
- */
+/// Indicates how close to the latest content in a live stream playback can be sustained.
+///
+/// For non-live assets this value is kCMTimeInvalid.
@property (nonatomic, readonly) CMTime minimumTimeOffsetFromLive
#if __swift__
API_DEPRECATED("Use load(.minimumTimeOffsetFromLive) instead", macos(10.15, 13.0), ios(13.0, 16.0), tvos(13.0, 16.0), watchos(6.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -123,15 +118,12 @@
@interface AVAsset (AVAssetAsynchronousLoading)
-/* Indicates that the asset provides precise timing. See @"duration" above and AVURLAssetPreferPreciseDurationAndTimingKey below.
-*/
+/// Indicates that the asset provides precise timing. See @"duration" above and AVURLAssetPreferPreciseDurationAndTimingKey below.
@property (nonatomic, readonly) BOOL providesPreciseDurationAndTiming AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.providesPreciseDurationAndTiming) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/*!
- @method cancelLoading
- @abstract Cancels the loading of all values for all observers.
- @discussion Deallocation or finalization of an instance of AVAsset will implicitly cancel loading if any loading requests are still outstanding.
-*/
+/// Cancels the loading of all values for all observers.
+///
+/// Deallocation or finalization of an instance of AVAsset will implicitly cancel loading if any loading requests are still outstanding.
- (void)cancelLoading;
@end
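A brief sketch of how cancelLoading pairs with an outstanding asynchronous request, for example when the object that issued the request goes away. The class and property names here are purely illustrative.

```objc
#import <AVFoundation/AVFoundation.h>

@interface AssetInspector : NSObject
@property (nonatomic, strong) AVAsset *asset; // illustrative property
@end

@implementation AssetInspector

- (void)beginInspection {
    [self.asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
        // Check statusOfValueForKey:error: for each key before using its value.
    }];
}

- (void)dealloc {
    // Explicit cancellation; deallocation of the asset itself would also
    // implicitly cancel any outstanding loading requests.
    [_asset cancelLoading];
}

@end
```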
@@ -139,40 +131,27 @@
@interface AVAsset (AVAssetReferenceRestrictions)
-/*!
- @enum AVAssetReferenceRestrictions
- @abstract These constants can be passed in to AVURLAssetReferenceRestrictionsKey to control the resolution of references to external media data.
-
- @constant AVAssetReferenceRestrictionForbidNone
- Indicates that all types of references should be followed.
- @constant AVAssetReferenceRestrictionForbidRemoteReferenceToLocal
- Indicates that references from a remote asset (e.g. referenced via http URL) to local media data (e.g. stored in a local file) should not be followed.
- @constant AVAssetReferenceRestrictionForbidLocalReferenceToRemote
- Indicates that references from a local asset to remote media data should not be followed.
- @constant AVAssetReferenceRestrictionForbidCrossSiteReference
- Indicates that references from a remote asset to remote media data stored at a different site should not be followed.
- @constant AVAssetReferenceRestrictionForbidLocalReferenceToLocal
- Indicates that references from a local asset to local media data stored outside the asset's container file should not be followed.
- @constant AVAssetReferenceRestrictionForbidAll
- Indicates that only references to media data stored within the asset's container file should be allowed.
-*/
+/// These constants can be passed in to AVURLAssetReferenceRestrictionsKey to control the resolution of references to external media data.
typedef NS_OPTIONS(NSUInteger, AVAssetReferenceRestrictions) {
+ /// Indicates that all types of references should be followed.
AVAssetReferenceRestrictionForbidNone = 0UL,
+ /// Indicates that references from a remote asset (e.g. referenced via http URL) to local media data (e.g. stored in a local file) should not be followed.
AVAssetReferenceRestrictionForbidRemoteReferenceToLocal = (1UL << 0),
+ /// Indicates that references from a local asset to remote media data should not be followed.
AVAssetReferenceRestrictionForbidLocalReferenceToRemote = (1UL << 1),
+ /// Indicates that references from a remote asset to remote media data stored at a different site should not be followed.
AVAssetReferenceRestrictionForbidCrossSiteReference = (1UL << 2),
+ /// Indicates that references from a local asset to local media data stored outside the asset's container file should not be followed.
AVAssetReferenceRestrictionForbidLocalReferenceToLocal = (1UL << 3),
+ /// Indicates that only references to media data stored within the asset's container file should be allowed.
AVAssetReferenceRestrictionForbidAll = 0xFFFFUL,
AVAssetReferenceRestrictionDefaultPolicy = AVAssetReferenceRestrictionForbidLocalReferenceToRemote
};
-/*!
- @property referenceRestrictions
- @abstract Indicates the reference restrictions being used by the receiver.
- @discussion
- For AVURLAsset, this property reflects the value passed in for AVURLAssetReferenceRestrictionsKey, if any. See AVURLAssetReferenceRestrictionsKey below for a full discussion of reference restrictions. The default value for this property is AVAssetReferenceRestrictionForbidLocalReferenceToRemote.
-*/
+/// Indicates the reference restrictions being used by the receiver.
+///
+/// For AVURLAsset, this property reflects the value passed in for AVURLAssetReferenceRestrictionsKey, if any. See AVURLAssetReferenceRestrictionsKey below for a full discussion of reference restrictions. The default value for this property is AVAssetReferenceRestrictionForbidLocalReferenceToRemote.
@property (nonatomic, readonly) AVAssetReferenceRestrictions referenceRestrictions API_AVAILABLE(macos(10.7), ios(5.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
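A sketch of how these restrictions are typically applied: the option is supplied at AVURLAsset creation time and later reflected by referenceRestrictions. The URL is a placeholder supplied by the caller.

```objc
#import <AVFoundation/AVFoundation.h>

static AVURLAsset *MakeRestrictedAsset(NSURL *movieURL) {
    // Forbid a local movie file from pulling in remote media data, and forbid
    // cross-site references for remote assets.
    NSDictionary<NSString *, id> *options = @{
        AVURLAssetReferenceRestrictionsKey : @(AVAssetReferenceRestrictionForbidLocalReferenceToRemote |
                                               AVAssetReferenceRestrictionForbidCrossSiteReference)
    };
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:options];
    // referenceRestrictions reflects the value passed above.
    return asset;
}
```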
@@ -182,20 +161,16 @@
@interface AVAsset (AVAssetTrackInspection)
-/*!
- @property tracks
- @abstract Provides the array of AVAssetTracks contained by the asset
-*/
+/// Provides the array of AVAssetTracks contained by the asset
@property (nonatomic, readonly) NSArray<AVAssetTrack *> *tracks AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.tracks) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/*!
- @method trackWithTrackID:
- @abstract Provides an instance of AVAssetTrack that represents the track of the specified trackID.
- @param trackID
- The trackID of the requested AVAssetTrack.
- @result An instance of AVAssetTrack; may be nil if no track of the specified trackID is available.
- @discussion Becomes callable without blocking when the key @"tracks" has been loaded
-*/
+/// Provides an instance of AVAssetTrack that represents the track of the specified trackID.
+///
+/// Becomes callable without blocking when the key @"tracks" has been loaded
+///
+/// - Parameter trackID: The trackID of the requested AVAssetTrack.
+///
+/// - Returns: An instance of AVAssetTrack; may be nil if no track of the specified trackID is available.
- (nullable AVAssetTrack *)trackWithTrackID:(CMPersistentTrackID)trackID
#if __swift__
API_DEPRECATED("Use loadTrack(withTrackID:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -203,24 +178,19 @@
API_DEPRECATED("Use loadTrackWithTrackID:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadTrackWithTrackID:completionHandler:
- @abstract Loads an instance of AVAssetTrack that represents the track of the specified trackID.
- @param trackID
- The trackID of the requested AVAssetTrack.
- @param completionHandler
- A block that is called when the loading is finished, with either the loaded track (which may be nil if no track of the specified trackID is available) or an error.
-*/
+/// Loads an instance of AVAssetTrack that represents the track of the specified trackID.
+///
+/// - Parameter trackID: The trackID of the requested AVAssetTrack.
+/// - Parameter completionHandler: A block that is called when the loading is finished, with either the loaded track (which may be nil if no track of the specified trackID is available) or an error.
- (void)loadTrackWithTrackID:(CMPersistentTrackID)trackID completionHandler:(void (^ NS_SWIFT_SENDABLE)(AVAssetTrack * _Nullable_result, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @method tracksWithMediaType:
- @abstract Provides an array of AVAssetTracks of the asset that present media of the specified media type.
- @param mediaType
- The media type according to which AVAsset filters its AVAssetTracks. (Media types are defined in AVMediaFormat.h.)
- @result An NSArray of AVAssetTracks; may be empty if no tracks of the specified media type are available.
- @discussion Becomes callable without blocking when the key @"tracks" has been loaded
-*/
+/// Provides an array of AVAssetTracks of the asset that present media of the specified media type.
+///
+/// Becomes callable without blocking when the key @"tracks" has been loaded
+///
+/// - Parameter mediaType: The media type according to which AVAsset filters its AVAssetTracks. (Media types are defined in AVMediaFormat.h.)
+///
+/// - Returns: An NSArray of AVAssetTracks; may be empty if no tracks of the specified media type are available.
- (NSArray<AVAssetTrack *> *)tracksWithMediaType:(AVMediaType)mediaType
#if __swift__
API_DEPRECATED("Use loadTracks(withMediaType:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -228,24 +198,19 @@
API_DEPRECATED("Use loadTracksWithMediaType:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadTracksWithMediaType:completionHandler:
- @abstract Loads an array of AVAssetTracks of the asset that present media of the specified media type.
- @param mediaType
- The media type according to which AVAsset filters its AVAssetTracks. (Media types are defined in AVMediaFormat.h.)
- @param completionHandler
- A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks of the specified media type are available) or an error.
-*/
+/// Loads an array of AVAssetTracks of the asset that present media of the specified media type.
+///
+/// - Parameter mediaType: The media type according to which AVAsset filters its AVAssetTracks. (Media types are defined in AVMediaFormat.h.)
+/// - Parameter completionHandler: A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks of the specified media type are available) or an error.
- (void)loadTracksWithMediaType:(AVMediaType)mediaType completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVAssetTrack *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
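A minimal sketch of the non-blocking track inspection described above, using the completion-handler variant. The handler may run on an arbitrary queue, so dispatch to the main queue before touching UI.

```objc
#import <AVFoundation/AVFoundation.h>

static void LogVideoTrackInfo(AVAsset *asset) {
    [asset loadTracksWithMediaType:AVMediaTypeVideo
                 completionHandler:^(NSArray<AVAssetTrack *> * _Nullable tracks, NSError * _Nullable error) {
        if (tracks == nil) {
            NSLog(@"Track loading failed: %@", error);
            return;
        }
        // The array may be empty if the asset carries no video.
        for (AVAssetTrack *track in tracks) {
            NSLog(@"Video track %d: %.0fx%.0f",
                  track.trackID, track.naturalSize.width, track.naturalSize.height);
        }
    }];
}
```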
-/*!
- @method tracksWithMediaCharacteristic:
- @abstract Provides an array of AVAssetTracks of the asset that present media with the specified characteristic.
- @param mediaCharacteristic
- The media characteristic according to which AVAsset filters its AVAssetTracks. (Media characteristics are defined in AVMediaFormat.h.)
- @result An NSArray of AVAssetTracks; may be empty if no tracks with the specified characteristic are available.
- @discussion Becomes callable without blocking when the key @"tracks" has been loaded
-*/
+/// Provides an array of AVAssetTracks of the asset that present media with the specified characteristic.
+///
+/// Becomes callable without blocking when the key @"tracks" has been loaded
+///
+/// - Parameter mediaCharacteristic: The media characteristic according to which AVAsset filters its AVAssetTracks. (Media characteristics are defined in AVMediaFormat.h.)
+///
+/// - Returns: An NSArray of AVAssetTracks; may be empty if no tracks with the specified characteristic are available.
- (NSArray<AVAssetTrack *> *)tracksWithMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic
#if __swift__
API_DEPRECATED("Use loadTracks(withMediaCharacteristic:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -253,24 +218,15 @@
API_DEPRECATED("Use loadTracksWithMediaCharacteristic:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadTracksWithMediaCharacteristic:completionHandler:
- @abstract Loads an array of AVAssetTracks of the asset that present media with the specified characteristic.
- @param mediaCharacteristic
- The media characteristic according to which AVAsset filters its AVAssetTracks. (Media characteristics are defined in AVMediaFormat.h.)
- @param completionHandler
- A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks with the specified characteristic are available) or an error.
-*/
+/// Loads an array of AVAssetTracks of the asset that present media with the specified characteristic.
+///
+/// - Parameter mediaCharacteristic: The media characteristic according to which AVAsset filters its AVAssetTracks. (Media characteristics are defined in AVMediaFormat.h.)
+/// - Parameter completionHandler: A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks with the specified characteristic are available) or an error.
- (void)loadTracksWithMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVAssetTrack *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @property trackGroups
- @abstract
- All track groups in the receiver.
-
- @discussion
- The value of this property is an NSArray of AVAssetTrackGroups, each representing a different grouping of tracks in the receiver.
- */
+/// All track groups in the receiver.
+///
+/// The value of this property is an NSArray of AVAssetTrackGroups, each representing a different grouping of tracks in the receiver.
@property (nonatomic, readonly) NSArray<AVAssetTrackGroup *> *trackGroups
#if __swift__
API_DEPRECATED("Use load(.trackGroups) instead", macos(10.9, 13.0), ios(7.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -285,8 +241,7 @@
// high-level access to selected metadata of common interest
-/* Indicates the creation date of the asset as an AVMetadataItem. May be nil. If a creation date has been stored by the asset in a form that can be converted to an NSDate, the dateValue property of the AVMetadataItem will provide an instance of NSDate. Otherwise the creation date is available only as a string value, via -[AVMetadataItem stringValue].
-*/
+/// Indicates the creation date of the asset as an AVMetadataItem. May be nil. If a creation date has been stored by the asset in a form that can be converted to an NSDate, the dateValue property of the AVMetadataItem will provide an instance of NSDate. Otherwise the creation date is available only as a string value, via -[AVMetadataItem stringValue].
@property (nonatomic, readonly, nullable) AVMetadataItem *creationDate
#if __swift__
API_DEPRECATED("Use load(.creationDate) instead", macos(10.8, 13.0), ios(5.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -294,16 +249,13 @@
API_AVAILABLE(macos(10.8), ios(5.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/* Provides access to the lyrics of the asset suitable for the current locale.
-*/
+/// Provides access to the lyrics of the asset suitable for the current locale.
@property (nonatomic, readonly, nullable) NSString *lyrics AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.lyrics) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* Provides access to an array of AVMetadataItems for each common metadata key for which a value is available; items can be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:] and according to identifier via +[AVMetadataItem metadataItemsFromArray:filteredByIdentifier:].
-*/
+/// Provides access to an array of AVMetadataItems for each common metadata key for which a value is available; items can be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:] and according to identifier via +[AVMetadataItem metadataItemsFromArray:filteredByIdentifier:].
@property (nonatomic, readonly) NSArray<AVMetadataItem *> *commonMetadata AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.commonMetadata) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* Provides access to an array of AVMetadataItems for all metadata identifiers for which a value is available; items can be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:] and according to identifier via +[AVMetadataItem metadataItemsFromArray:filteredByIdentifier:].
-*/
+/// Provides access to an array of AVMetadataItems for all metadata identifiers for which a value is available; items can be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:] and according to identifier via +[AVMetadataItem metadataItemsFromArray:filteredByIdentifier:].
@property (nonatomic, readonly) NSArray<AVMetadataItem *> *metadata
#if __swift__
API_DEPRECATED("Use load(.metadata) instead", macos(10.10, 13.0), ios(8.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -311,18 +263,16 @@
API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/* Provides an NSArray of NSStrings, each representing a metadata format that's available to the asset (e.g. ID3, iTunes metadata, etc.). Metadata formats are defined in AVMetadataFormat.h.
-*/
+/// Provides an NSArray of NSStrings, each representing a metadata format that's available to the asset (e.g. ID3, iTunes metadata, etc.). Metadata formats are defined in AVMetadataFormat.h.
@property (nonatomic, readonly) NSArray<AVMetadataFormat> *availableMetadataFormats AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.availableMetadataFormats) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/*!
- @method metadataForFormat:
- @abstract Provides an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format; can subsequently be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:], according to locale via +[AVMetadataItem metadataItemsFromArray:withLocale:], or according to key via +[AVMetadataItem metadataItemsFromArray:withKey:keySpace:].
- @param format
- The metadata format for which items are requested.
- @result An NSArray containing AVMetadataItems; may be empty if there is no metadata of the specified format.
- @discussion Becomes callable without blocking when the key @"availableMetadataFormats" has been loaded
-*/
+/// Provides an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format; can subsequently be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:], according to locale via +[AVMetadataItem metadataItemsFromArray:withLocale:], or according to key via +[AVMetadataItem metadataItemsFromArray:withKey:keySpace:].
+///
+/// Becomes callable without blocking when the key @"availableMetadataFormats" has been loaded
+///
+/// - Parameter format: The metadata format for which items are requested.
+///
+/// - Returns: An NSArray containing AVMetadataItems; may be empty if there is no metadata of the specified format.
- (NSArray<AVMetadataItem *> *)metadataForFormat:(AVMetadataFormat)format
#if __swift__
API_DEPRECATED("Use loadMetadata(for:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -330,14 +280,10 @@
API_DEPRECATED("Use loadMetadataForFormat:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadMetadataForFormat:completionHandler:
- @abstract Loads an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format; can subsequently be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:], according to locale via +[AVMetadataItem metadataItemsFromArray:withLocale:], or according to key via +[AVMetadataItem metadataItemsFromArray:withKey:keySpace:].
- @param format
- The metadata format for which items are requested.
- @param completionHandler
- A block that is invoked when loading is complete, vending the array of metadata items (which may be empty if there is no metadata of the specified format) or an error.
-*/
+/// Loads an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format; can subsequently be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:], according to locale via +[AVMetadataItem metadataItemsFromArray:withLocale:], or according to key via +[AVMetadataItem metadataItemsFromArray:withKey:keySpace:].
+///
+/// - Parameter format: The metadata format for which items are requested.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending the array of metadata items (which may be empty if there is no metadata of the specified format) or an error.
- (void)loadMetadataForFormat:(AVMetadataFormat)format completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVMetadataItem *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
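A sketch of loading format-specific metadata and filtering it as the comment suggests; the iTunes metadata format and the song-name identifier are used purely as an example.

```objc
#import <AVFoundation/AVFoundation.h>

static void LogSongName(AVAsset *asset) {
    [asset loadMetadataForFormat:AVMetadataFormatiTunesMetadata
               completionHandler:^(NSArray<AVMetadataItem *> * _Nullable items, NSError * _Nullable error) {
        if (items == nil) {
            NSLog(@"Metadata loading failed: %@", error);
            return;
        }
        NSArray<AVMetadataItem *> *titles =
            [AVMetadataItem metadataItemsFromArray:items
                              filteredByIdentifier:AVMetadataIdentifieriTunesMetadataSongName];
        NSLog(@"Song name: %@", titles.firstObject.stringValue);
    }];
}
```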
@end
@@ -347,8 +293,7 @@
@interface AVAsset (AVAssetChapterInspection)
-/* array of NSLocale
-*/
+/// array of NSLocale
@property (readonly) NSArray<NSLocale *> *availableChapterLocales
#if __swift__
API_DEPRECATED("Use load(.availableChapterLocales) instead", macos(10.7, 13.0), ios(4.3, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -356,22 +301,18 @@
API_AVAILABLE(macos(10.7), ios(4.3), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/*!
- @method chapterMetadataGroupsWithTitleLocale:containingItemsWithCommonKeys:
- @abstract Provides an array of chapters.
- @param locale
- Locale of the metadata items carrying chapter titles to be returned (supports the IETF BCP 47 specification).
- @param commonKeys
- Array of common keys of AVMetadataItem to be included; can be nil.
- AVMetadataCommonKeyArtwork is the only supported key for now.
- @result An NSArray of AVTimedMetadataGroup.
- @discussion
- This method returns an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
-
- An AVMetadataItem with the specified common key will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and the metadata group overlaps. The locale of items not carrying chapter titles need not match the specified locale parameter.
-
- Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
-*/
+/// Provides an array of chapters.
+///
+/// This method returns an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
+///
+/// An AVMetadataItem with the specified common key will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and the metadata group overlaps. The locale of items not carrying chapter titles need not match the specified locale parameter.
+///
+/// Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
+///
+/// - Parameter locale: Locale of the metadata items carrying chapter titles to be returned (supports the IETF BCP 47 specification).
+/// - Parameter commonKeys: Array of common keys of AVMetadataItem to be included; can be nil. AVMetadataCommonKeyArtwork is the only supported key for now.
+///
+/// - Returns: An NSArray of AVTimedMetadataGroup.
- (NSArray<AVTimedMetadataGroup *> *)chapterMetadataGroupsWithTitleLocale:(NSLocale *)locale containingItemsWithCommonKeys:(nullable NSArray<AVMetadataKey> *)commonKeys
#if __swift__
API_DEPRECATED("Use loadChapterMetadataGroups(withTitleLocale:containingItemsWithCommonKeys:) instead", macos(10.7, 13.0), ios(4.3, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -379,41 +320,32 @@
API_DEPRECATED("Use loadChapterMetadataGroupsWithTitleLocale:containingItemsWithCommonKeys:completionHandler: instead", macos(10.7, 15.0), ios(4.3, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadChapterMetadataGroupsWithTitleLocale:containingItemsWithCommonKeys:completionHandler:
- @abstract Loads an array of chapters.
- @param locale
- Locale of the metadata items carrying chapter titles to be returned (supports the IETF BCP 47 specification).
- @param commonKeys
- Array of common keys of AVMetadataItem to be included; if no common keys are required, send an empty list.
- AVMetadataCommonKeyArtwork is the only supported key for now.
- @param completionHandler
- A block that is invoked when loading is complete, vending the array of timed metadata groups or an error.
- @discussion
- This method vends an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
-
- An AVMetadataItem with the specified common key will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and the metadata group overlaps. The locale of items not carrying chapter titles need not match the specified locale parameter.
-
- Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
-*/
+/// Loads an array of chapters.
+///
+/// This method vends an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
+///
+/// An AVMetadataItem with the specified common key will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and the metadata group overlaps. The locale of items not carrying chapter titles need not match the specified locale parameter.
+///
+/// Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
+///
+/// - Parameter locale: Locale of the metadata items carrying chapter titles to be returned (supports the IETF BCP 47 specification).
+/// - Parameter commonKeys: Array of common keys of AVMetadataItem to be included; if no common keys are required, send an empty list. AVMetadataCommonKeyArtwork is the only supported key for now.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending the array of timed metadata groups or an error.
- (void)loadChapterMetadataGroupsWithTitleLocale:(NSLocale *)locale containingItemsWithCommonKeys:(NSArray<AVMetadataKey> *)commonKeys completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVTimedMetadataGroup *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @method chapterMetadataGroupsBestMatchingPreferredLanguages:
- @abstract Tests, in order of preference, for a match between language identifiers in the specified array of preferred languages and the available chapter locales, and returns the array of chapters corresponding to the first match that's found.
- @param preferredLanguages
- An array of language identifiers in order of preference, each of which is an IETF BCP 47 (RFC 4646) language identifier. Use +[NSLocale preferredLanguages] to obtain the user's list of preferred languages.
- @result An NSArray of AVTimedMetadataGroup.
- @discussion
- Safe to call without blocking when the AVAsset key availableChapterLocales has status AVKeyValueStatusLoaded.
-
- Returns an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
-
- All of the available chapter metadata is included in the metadata groups, including items with the common key AVMetadataCommonKeyArtwork, if such items are present. Items not carrying chapter titles will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and that of the metadata group overlaps. The locale of such items need not match the locale of the chapter titles.
-
- Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
-.
-*/
+/// Tests, in order of preference, for a match between language identifiers in the specified array of preferred languages and the available chapter locales, and returns the array of chapters corresponding to the first match that's found.
+///
+/// Safe to call without blocking when the AVAsset key availableChapterLocales has status AVKeyValueStatusLoaded.
+///
+/// Returns an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
+///
+/// All of the available chapter metadata is included in the metadata groups, including items with the common key AVMetadataCommonKeyArtwork, if such items are present. Items not carrying chapter titles will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and that of the metadata group overlaps. The locale of such items need not match the locale of the chapter titles.
+///
+/// Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
+///
+/// - Parameter preferredLanguages: An array of language identifiers in order of preference, each of which is an IETF BCP 47 (RFC 4646) language identifier. If your goal is to provide the best match for the end user's preferred languages without consideration of your app's available localizations, pass [NSLocale preferredLanguages] as the value of preferredLanguages. However, if you want to filter the available choices in order to obtain the best match among the localizations that are available for your app, pass [NSBundle preferredLocalizationsFromArray:[[NSBundle mainBundle] localizations] forPreferences:[NSLocale preferredLanguages]] instead. The latter choice is normally more appropriate for strings intended for display as part of the app's UI.
+///
+/// - Returns: An NSArray of AVTimedMetadataGroup.
- (NSArray<AVTimedMetadataGroup *> *)chapterMetadataGroupsBestMatchingPreferredLanguages:(NSArray<NSString *> *)preferredLanguages
#if __swift__
API_DEPRECATED("Use loadChapterMetadataGroups(bestMatchingPreferredLanguages:) instead", macos(10.8, 13.0), ios(6.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -421,20 +353,16 @@
API_DEPRECATED("Use loadChapterMetadataGroupsBestMatchingPreferredLanguages:completionHandler: instead", macos(10.8, 15.0), ios(6.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadChapterMetadataGroupsBestMatchingPreferredLanguages:completionHandler:
- @abstract Tests, in order of preference, for a match between language identifiers in the specified array of preferred languages and the available chapter locales, and loads the array of chapters corresponding to the first match that's found.
- @param preferredLanguages
- An array of language identifiers in order of preference, each of which is an IETF BCP 47 (RFC 4646) language identifier. Use +[NSLocale preferredLanguages] to obtain the user's list of preferred languages.
- @param completionHandler
- A block that is invoked when loading is complete, vending the array of timed metadata groups or an error.
- @discussion
- Returns an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
-
- All of the available chapter metadata is included in the metadata groups, including items with the common key AVMetadataCommonKeyArtwork, if such items are present. Items not carrying chapter titles will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and that of the metadata group overlaps. The locale of such items need not match the locale of the chapter titles.
-
- Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
-*/
+/// Tests, in order of preference, for a match between language identifiers in the specified array of preferred languages and the available chapter locales, and loads the array of chapters corresponding to the first match that's found.
+///
+/// Returns an array of AVTimedMetadataGroup objects. Each object in the array always contains an AVMetadataItem representing the chapter title; the timeRange property of the AVTimedMetadataGroup object is equal to the time range of the chapter title item.
+///
+/// All of the available chapter metadata is included in the metadata groups, including items with the common key AVMetadataCommonKeyArtwork, if such items are present. Items not carrying chapter titles will be added to an existing AVTimedMetadataGroup object if the time range (timestamp and duration) of the metadata item and that of the metadata group overlaps. The locale of such items need not match the locale of the chapter titles.
+///
+/// Further filtering of the metadata items in AVTimedMetadataGroups according to language can be accomplished using +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:]; filtering of the metadata items according to locale can be accomplished using +[AVMetadataItem metadataItemsFromArray:withLocale:].
+///
+/// - Parameter preferredLanguages: An array of language identifiers in order of preference, each of which is an IETF BCP 47 (RFC 4646) language identifier. If your goal is to provide the best match for the end user's preferred languages without consideration of your app's available localizations, pass [NSLocale preferredLanguages] as the value of preferredLanguages. However, if you want to filter the available choices in order to obtain the best match among the localizations that are available for your app, pass [NSBundle preferredLocalizationsFromArray:[[NSBundle mainBundle] localizations] forPreferences:[NSLocale preferredLanguages]] instead. The latter choice is normally more appropriate for strings intended for display as part of the app's UI.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending the array of timed metadata groups or an error.
- (void)loadChapterMetadataGroupsBestMatchingPreferredLanguages:(NSArray<NSString *> *)preferredLanguages completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVTimedMetadataGroup *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
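A sketch of chapter loading using the bundle-filtered language list that the parameter discussion above recommends for strings shown in the app's UI.

```objc
#import <AVFoundation/AVFoundation.h>

static void LogChapters(AVAsset *asset) {
    // Restrict the user's preferred languages to the localizations this app ships.
    NSArray<NSString *> *languages =
        [NSBundle preferredLocalizationsFromArray:[[NSBundle mainBundle] localizations]
                                   forPreferences:[NSLocale preferredLanguages]];
    [asset loadChapterMetadataGroupsBestMatchingPreferredLanguages:languages
                                                 completionHandler:^(NSArray<AVTimedMetadataGroup *> * _Nullable groups, NSError * _Nullable error) {
        if (groups == nil) {
            NSLog(@"Chapter loading failed: %@", error);
            return;
        }
        for (AVTimedMetadataGroup *group in groups) {
            // Each group contains the chapter title item plus any overlapping
            // items (such as artwork) whose time ranges intersect the chapter.
            NSLog(@"Chapter starting at %.1fs with %lu item(s)",
                  CMTimeGetSeconds(group.timeRange.start), (unsigned long)group.items.count);
        }
    }];
}
```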
@end
@@ -444,8 +372,7 @@
@interface AVAsset (AVAssetMediaSelection)
-/* Provides an NSArray of NSStrings, each NSString indicating a media characteristic for which a media selection option is available.
-*/
+/// Provides an NSArray of NSStrings, each NSString indicating a media characteristic for which a media selection option is available.
@property (nonatomic, readonly) NSArray<AVMediaCharacteristic> *availableMediaCharacteristicsWithMediaSelectionOptions
#if __swift__
API_DEPRECATED("Use load(.availableMediaCharacteristicsWithMediaSelectionOptions) instead", macos(10.8, 13.0), ios(5.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -453,23 +380,20 @@
API_AVAILABLE(macos(10.8), ios(5.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/*!
- @method mediaSelectionGroupForMediaCharacteristic:
- @abstract Provides an instance of AVMediaSelectionGroup that contains one or more options with the specified media characteristic.
- @param mediaCharacteristic
- A media characteristic for which you wish to obtain the available media selection options. AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, and AVMediaCharacteristicVisual are currently supported.
-
- Pass AVMediaCharacteristicAudible to obtain the group of available options for audio media in various languages and for various purposes, such as descriptive audio.
- Pass AVMediaCharacteristicLegible to obtain the group of available options for subtitles in various languages and for various purposes.
- Pass AVMediaCharacteristicVisual to obtain the group of available options for video media.
- @result An instance of AVMediaSelectionGroup. May be nil.
- @discussion
- Becomes callable without blocking when the key @"availableMediaCharacteristicsWithMediaSelectionOptions" has been loaded.
-
- If the asset has no AVMediaSelectionGroup containing options with the specified media characteristic, the return value will be nil.
-
- Filtering of the options in the returned AVMediaSelectionGroup according to playability, locale, and additional media characteristics can be accomplished using the category AVMediaSelectionOptionFiltering defined on AVMediaSelectionGroup.
-*/
+/// Provides an instance of AVMediaSelectionGroup that contains one or more options with the specified media characteristic.
+///
+/// Becomes callable without blocking when the key @"availableMediaCharacteristicsWithMediaSelectionOptions" has been loaded.
+///
+/// If the asset has no AVMediaSelectionGroup containing options with the specified media characteristic, the return value will be nil.
+///
+/// Filtering of the options in the returned AVMediaSelectionGroup according to playability, locale, and additional media characteristics can be accomplished using the category AVMediaSelectionOptionFiltering defined on AVMediaSelectionGroup.
+///
+/// - Parameter mediaCharacteristic: A media characteristic for which you wish to obtain the available media selection options. AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, and AVMediaCharacteristicVisual are currently supported.
+/// Pass AVMediaCharacteristicAudible to obtain the group of available options for audio media in various languages and for various purposes, such as descriptive audio.
+/// Pass AVMediaCharacteristicLegible to obtain the group of available options for subtitles in various languages and for various purposes.
+/// Pass AVMediaCharacteristicVisual to obtain the group of available options for video media.
+///
+/// - Returns: An instance of AVMediaSelectionGroup. May be nil.
- (nullable AVMediaSelectionGroup *)mediaSelectionGroupForMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic
#if __swift__
API_DEPRECATED("Use loadMediaSelectionGroup(for:) instead", macos(10.8, 13.0), ios(5.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -477,27 +401,20 @@
API_DEPRECATED("Use loadMediaSelectionGroupForMediaCharacteristic:completionHandler: instead", macos(10.8, 15.0), ios(5.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadMediaSelectionGroupForMediaCharacteristic:completionHandler:
- @abstract Loads an instance of AVMediaSelectionGroup that contains one or more options with the specified media characteristic.
- @param mediaCharacteristic
- A media characteristic for which you wish to obtain the available media selection options. AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, and AVMediaCharacteristicVisual are currently supported.
-
- Pass AVMediaCharacteristicAudible to obtain the group of available options for audio media in various languages and for various purposes, such as descriptive audio.
- Pass AVMediaCharacteristicLegible to obtain the group of available options for subtitles in various languages and for various purposes.
- Pass AVMediaCharacteristicVisual to obtain the group of available options for video media.
- @param completionHandler
- A block that is invoked when loading is complete, vending an instance of AVMediaSelectionGroup (which may be nil) or an error.
- @discussion
- If the asset has no AVMediaSelectionGroup containing options with the specified media characteristic, the return value will be nil.
-
- Filtering of the options in the returned AVMediaSelectionGroup according to playability, locale, and additional media characteristics can be accomplished using the category AVMediaSelectionOptionFiltering defined on AVMediaSelectionGroup.
-*/
+/// Loads an instance of AVMediaSelectionGroup that contains one or more options with the specified media characteristic.
+///
+/// If the asset has no AVMediaSelectionGroup containing options with the specified media characteristic, the return value will be nil.
+///
+/// Filtering of the options in the returned AVMediaSelectionGroup according to playability, locale, and additional media characteristics can be accomplished using the category AVMediaSelectionOptionFiltering defined on AVMediaSelectionGroup.
+///
+/// - Parameter mediaCharacteristic: A media characteristic for which you wish to obtain the available media selection options. AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, and AVMediaCharacteristicVisual are currently supported.
+/// Pass AVMediaCharacteristicAudible to obtain the group of available options for audio media in various languages and for various purposes, such as descriptive audio.
+/// Pass AVMediaCharacteristicLegible to obtain the group of available options for subtitles in various languages and for various purposes
+/// Pass AVMediaCharacteristicVisual to obtain the group of available options for video media.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending an instance of AVMediaSelectionGroup (which may be nil) or an error.
- (void)loadMediaSelectionGroupForMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic completionHandler:(void (^ NS_SWIFT_SENDABLE)(AVMediaSelectionGroup * _Nullable_result, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
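A sketch of loading the legible (subtitle) selection group and applying an option to a player item; the `playerItem` parameter is assumed to have been created elsewhere from the same asset.

```objc
#import <AVFoundation/AVFoundation.h>

static void SelectSubtitles(AVAsset *asset, AVPlayerItem *playerItem) {
    [asset loadMediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible
                                       completionHandler:^(AVMediaSelectionGroup * _Nullable group, NSError * _Nullable error) {
        if (group == nil) {
            // nil means no legible options are available, or loading failed (see `error`).
            return;
        }
        NSArray<AVMediaSelectionOption *> *options =
            [AVMediaSelectionGroup mediaSelectionOptionsFromArray:group.options
                         filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]];
        if (options.firstObject != nil) {
            [playerItem selectMediaOption:options.firstObject inMediaSelectionGroup:group];
        }
    }];
}
```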
-/*!
- @property preferredMediaSelection
- @abstract Provides an instance of AVMediaSelection with default selections for each of the receiver's media selection groups.
-*/
+
+/// Provides an instance of AVMediaSelection with default selections for each of the receiver's media selection groups.
@property (nonatomic, readonly) AVMediaSelection *preferredMediaSelection
#if __swift__
API_DEPRECATED("Use load(.preferredMediaSelection) instead", macos(10.11, 13.0), ios(9.0, 16.0), tvos(9.0, 16.0), watchos(2.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -505,10 +422,7 @@
API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
#endif
-/*!
- @property allMediaSelections
- @abstract Provides an array of all permutations of AVMediaSelection for this asset.
-*/
+/// Provides an array of all permutations of AVMediaSelection for this asset.
@property (nonatomic, readonly) NSArray <AVMediaSelection *> *allMediaSelections
#if __swift__
API_DEPRECATED("Use load(.allMediaSelections) instead", macos(10.13, 13.0), ios(11.0, 16.0), tvos(11.0, 16.0), watchos(4.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -521,11 +435,9 @@
@interface AVAsset (AVAssetProtectedContent)
-/*!
- @property hasProtectedContent
- @abstract Indicates whether or not the asset has protected content.
- @discussion Assets containing protected content may not be playable without successful authorization, even if the value of the "playable" property is YES. See the properties in the AVAssetUsability category for details on how such an asset may be used. On macOS, clients can use the interfaces in AVPlayerItemProtectedContentAdditions.h to request authorization to play the asset.
-*/
+/// Indicates whether or not the asset has protected content.
+///
+/// Assets containing protected content may not be playable without successful authorization, even if the value of the "playable" property is YES. See the properties in the AVAssetUsability category for details on how such an asset may be used. On macOS, clients can use the interfaces in AVPlayerItemProtectedContentAdditions.h to request authorization to play the asset.
@property (nonatomic, readonly) BOOL hasProtectedContent
#if __swift__
API_DEPRECATED("Use load(.hasProtectedContent) instead", macos(10.7, 13.0), ios(4.2, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(watchos, visionos);
@@ -538,12 +450,9 @@
@interface AVAsset (AVAssetFragments)
-/*!
- @property canContainFragments
- @abstract Indicates whether the asset is capable of being extended by fragments.
- @discussion For QuickTime movie files and MPEG-4 files, the value of canContainFragments is YES if an 'mvex' box is present in the 'moov' box. For those types, the 'mvex' box signals the possible presence of later 'moof' boxes.
-*/
-
+/// Indicates whether the asset is capable of being extended by fragments.
+///
+/// For QuickTime movie files and MPEG-4 files, the value of canContainFragments is YES if an 'mvex' box is present in the 'moov' box. For those types, the 'mvex' box signals the possible presence of later 'moof' boxes.
@property (nonatomic, readonly) BOOL canContainFragments
#if __swift__
API_DEPRECATED("Use load(.canContainFragments) instead", macos(10.11, 13.0), ios(9.0, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(watchos, visionos);
@@ -551,11 +460,9 @@
API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
#endif
-/*!
- @property containsFragments
- @abstract Indicates whether the asset is extended by at least one fragment.
- @discussion For QuickTime movie files and MPEG-4 files, the value of this property is YES if canContainFragments is YES and at least one 'moof' box is present after the 'moov' box.
-*/
+/// Indicates whether the asset is extended by at least one fragment.
+///
+/// For QuickTime movie files and MPEG-4 files, the value of this property is YES if canContainFragments is YES and at least one 'moof' box is present after the 'moov' box.
@property (nonatomic, readonly) BOOL containsFragments
#if __swift__
API_DEPRECATED("Use load(.containsFragments) instead", macos(10.11, 13.0), ios(9.0, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(watchos, visionos);
@@ -563,11 +470,9 @@
API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
#endif
-/*!
- @property overallDurationHint
- @abstract Indicates the total duration of fragments that either exist now or may be appended in the future in order to extend the duration of the asset.
- @discussion For QuickTime movie files and MPEG-4 files, the value of this property is obtained from the 'mehd' box of the 'mvex' box, if present. If no total fragment duration hint is available, the value of this property is kCMTimeInvalid.
-*/
+/// Indicates the total duration of fragments that either exist now or may be appended in the future in order to extend the duration of the asset.
+///
+/// For QuickTime movie files and MPEG-4 files, the value of this property is obtained from the 'mehd' box of the 'mvex' box, if present. If no total fragment duration hint is available, the value of this property is kCMTimeInvalid.
@property (nonatomic, readonly) CMTime overallDurationHint
#if __swift__
API_DEPRECATED("Use load(.overallDurationHint) instead", macos(10.12.2, 13.0), ios(10.2, 16.0), tvos(10.2, 16.0), watchos(3.2, 9.0)) API_UNAVAILABLE(visionos);
@@ -580,11 +485,9 @@
@interface AVAsset (AVAssetUsability)
-/*!
- @property playable
- @abstract Indicates whether an AVPlayer can play the contents of the asset in a manner that meets user expectations.
- @discussion A client can attempt playback when playable is NO, this however may lead to a substandard playback experience.
-*/
+/// Indicates whether an AVPlayer can play the contents of the asset in a manner that meets user expectations.
+///
+/// A client can attempt playback when playable is NO; however, this may lead to a substandard playback experience.
@property (nonatomic, readonly, getter=isPlayable) BOOL playable
#if __swift__
API_DEPRECATED("Use load(.isPlayable) instead", macos(10.7, 13.0), ios(4.3, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -592,8 +495,7 @@
API_AVAILABLE(macos(10.7), ios(4.3), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/* indicates whether an AVAssetExportSession can be used with the receiver for export
-*/
+/// Indicates whether an AVAssetExportSession can be used with the receiver for export.
@property (nonatomic, readonly, getter=isExportable) BOOL exportable
#if __swift__
API_DEPRECATED("Use load(.isExportable) instead", macos(10.7, 13.0), ios(4.3, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(watchos, visionos);
@@ -601,8 +503,7 @@
API_AVAILABLE(macos(10.7), ios(4.3), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
#endif
-/* indicates whether an AVAssetReader can be used with the receiver for extracting media data
-*/
+/// Indicates whether an AVAssetReader can be used with the receiver for extracting media data.
@property (nonatomic, readonly, getter=isReadable) BOOL readable
#if __swift__
API_DEPRECATED("Use load(.isReadable) instead", macos(10.7, 13.0), ios(4.3, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(watchos, visionos);
@@ -610,8 +511,7 @@
API_AVAILABLE(macos(10.7), ios(4.3), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
#endif
-/* indicates whether the receiver can be used to build an AVMutableComposition
-*/
+/// Indicates whether the receiver can be used to build an AVMutableComposition.
@property (nonatomic, readonly, getter=isComposable) BOOL composable
#if __swift__
API_DEPRECATED("Use load(.isComposable) instead", macos(10.7, 13.0), ios(4.3, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -619,8 +519,7 @@
API_AVAILABLE(macos(10.7), ios(4.3), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/* indicates whether the receiver can be written to the saved photos album
-*/
+/// Indicates whether the receiver can be written to the saved photos album.
@property (nonatomic, readonly, getter=isCompatibleWithSavedPhotosAlbum) BOOL compatibleWithSavedPhotosAlbum
#if __swift__
API_DEPRECATED("Use load(.isCompatibleWithSavedPhotosAlbum) instead", ios(5.0, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(macos, watchos, visionos);
@@ -628,11 +527,9 @@
API_AVAILABLE(ios(5.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(macos, watchos);
#endif
-/*!
- @property compatibleWithAirPlayVideo
- @abstract Indicates whether the asset is compatible with AirPlay Video.
- @discussion YES if an AVPlayerItem initialized with the receiver can be played by an external device via AirPlay Video.
- */
+/// Indicates whether the asset is compatible with AirPlay Video.
+///
+/// The value is YES if an AVPlayerItem initialized with the receiver can be played by an external device via AirPlay Video.
@property (nonatomic, readonly, getter=isCompatibleWithAirPlayVideo) BOOL compatibleWithAirPlayVideo
#if __swift__
API_DEPRECATED("Use load(.isCompatibleWithAirPlayVideo) instead", macos(10.11, 13.0), ios(9.0, 16.0), tvos(9.0, 16.0)) API_UNAVAILABLE(watchos, visionos);
@@ -647,140 +544,100 @@
#pragma mark --- AVURLAsset ---
// Keys for options dictionary for use with -[AVURLAsset initWithURL:options:]
-/*!
- @constant AVURLAssetPreferPreciseDurationAndTimingKey
- @abstract
- Indicates whether the asset should be prepared to indicate a precise duration and provide precise random access by time.
- The value for this key is a boolean NSNumber.
- @discussion
- If nil is passed as the value of the options parameter to -[AVURLAsset initWithURL:options:], or if a dictionary that lacks a value for the key AVURLAssetPreferPreciseDurationAndTimingKey is passed instead, a default value of NO is assumed. If the asset is intended to be played only, because AVPlayer will support approximate random access by time when full precision isn't available, the default value of NO will suffice.
- Pass YES if longer loading times are acceptable in cases in which precise timing is required. If the asset is intended to be inserted into an AVMutableComposition, precise random access is typically desirable and the value of YES is recommended.
- Note that such precision may require additional parsing of the resource in advance of operations that make use of any portion of it, depending on the specifics of its container format. Many container formats provide sufficient summary information for precise timing and do not require additional parsing to prepare for it; QuickTime movie files and MPEG-4 files are examples of such formats. Other formats do not provide sufficient summary information, and precise random access for them is possible only after a preliminary examination of a file's contents.
- If you pass YES for an asset that you intend to play via an instance of AVPlayerItem and you are prepared for playback to commence before the value of -[AVPlayerItem duration] becomes available, you can omit the key @"duration" from the array of AVAsset keys you pass to -[AVPlayerItem initWithAsset:automaticallyLoadedAssetKeys:] in order to prevent AVPlayerItem from automatically loading the value of duration while the item becomes ready to play.
- If precise duration and timing is not possible for the timed media resource referenced by the asset's URL, AVAsset.providesPreciseDurationAndTiming will be NO even if precise timing is requested via the use of this key.
-
-*/
+/// Indicates whether the asset should be prepared to indicate a precise duration and provide precise random access by time.
+///
+/// The value for this key is a boolean NSNumber.
+///
+/// If nil is passed as the value of the options parameter to -[AVURLAsset initWithURL:options:], or if a dictionary that lacks a value for the key AVURLAssetPreferPreciseDurationAndTimingKey is passed instead, a default value of NO is assumed. If the asset is intended to be played only, because AVPlayer will support approximate random access by time when full precision isn't available, the default value of NO will suffice.
+/// Pass YES if longer loading times are acceptable in cases in which precise timing is required. If the asset is intended to be inserted into an AVMutableComposition, precise random access is typically desirable and the value of YES is recommended.
+/// Note that such precision may require additional parsing of the resource in advance of operations that make use of any portion of it, depending on the specifics of its container format. Many container formats provide sufficient summary information for precise timing and do not require additional parsing to prepare for it; QuickTime movie files and MPEG-4 files are examples of such formats. Other formats do not provide sufficient summary information, and precise random access for them is possible only after a preliminary examination of a file's contents.
+/// If you pass YES for an asset that you intend to play via an instance of AVPlayerItem and you are prepared for playback to commence before the value of -[AVPlayerItem duration] becomes available, you can omit the key @"duration" from the array of AVAsset keys you pass to -[AVPlayerItem initWithAsset:automaticallyLoadedAssetKeys:] in order to prevent AVPlayerItem from automatically loading the value of duration while the item becomes ready to play.
+/// If precise duration and timing is not possible for the timed media resource referenced by the asset's URL, AVAsset.providesPreciseDurationAndTiming will be NO even if precise timing is requested via the use of this key.
AVF_EXPORT NSString *const AVURLAssetPreferPreciseDurationAndTimingKey API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0), watchos(1.0), visionos(1.0));
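
For illustration, a sketch of passing this option when creating an asset intended for composition editing; the URL and function name are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

// Precise timing is worth the extra parsing cost when the asset will be
// inserted into an AVMutableComposition.
static AVURLAsset *MakeEditingAsset(NSURL *movieURL) {
    return [AVURLAsset URLAssetWithURL:movieURL
                               options:@{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES }];
}
```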
-/*!
- @constant AVURLAssetOverrideMIMETypeKey
- @abstract
- Indicates the MIME type that should be used to identify the format of the media resource.
- @discussion
- When a value for this key is provided, only the specified MIME type is considered in determining how to handle or parse the media resource. Any other information that may be available, such as the URL path extension or a server-provided MIME type, is ignored.
-*/
+/// Indicates the MIME type that should be used to identify the format of the media resource.
+///
+/// When a value for this key is provided, only the specified MIME type is considered in determining how to handle or parse the media resource. Any other information that may be available, such as the URL path extension or a server-provided MIME type, is ignored.
AVF_EXPORT NSString *const AVURLAssetOverrideMIMETypeKey API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0));
-/*!
- @constant AVURLAssetReferenceRestrictionsKey
- @abstract
- Indicates the restrictions used by the asset when resolving references to external media data. The value of this key is an NSNumber wrapping an AVAssetReferenceRestrictions enum value or the logical combination of multiple such values.
- @discussion
- Some assets can contain references to media data stored outside the asset's container file, for example in another file. This key can be used to specify a policy to use when these references are encountered. If an asset contains one or more references of a type that is forbidden by the reference restrictions, loading of asset properties will fail. In addition, such an asset cannot be used with other AVFoundation modules, such as AVPlayerItem or AVAssetExportSession.
-*/
+/// Indicates the restrictions used by the asset when resolving references to external media data. The value of this key is an NSNumber wrapping an AVAssetReferenceRestrictions enum value or the logical combination of multiple such values.
+///
+/// Some assets can contain references to media data stored outside the asset's container file, for example in another file. This key can be used to specify a policy to use when these references are encountered. If an asset contains one or more references of a type that is forbidden by the reference restrictions, loading of asset properties will fail. In addition, such an asset cannot be used with other AVFoundation modules, such as AVPlayerItem or AVAssetExportSession.
AVF_EXPORT NSString *const AVURLAssetReferenceRestrictionsKey API_AVAILABLE(macos(10.7), ios(5.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @constant AVURLAssetHTTPCookiesKey
- @abstract
- HTTP cookies that the AVURLAsset may send with HTTP requests
- Standard cross-site policy still applies: cookies will only be sent to domains to which they apply.
- @discussion
- By default, an AVURLAsset will only have access to cookies in the client's default cookie storage
- that apply to the AVURLAsset's URL. You can supplement the cookies available to the asset
- via use of this initialization option
-
- HTTP cookies do not apply to non-HTTP(S) URLS.
- In HLS, many HTTP requests (e.g., media, crypt key, variant index) might be issued to different paths or hosts.
- In both of these cases, HTTP requests will be missing any cookies that do not apply to the AVURLAsset's URL.
- This init option allows the AVURLAsset to use additional HTTP cookies for those HTTP(S) requests.
- */
+/// HTTP cookies that the AVURLAsset may send with HTTP requests.
+///
+/// Standard cross-site policy still applies: cookies will only be sent to domains to which they apply.
+///
+/// By default, an AVURLAsset will only have access to cookies in the client's default cookie storage
+/// that apply to the AVURLAsset's URL. You can supplement the cookies available to the asset
+/// via use of this initialization option.
+///
+/// HTTP cookies do not apply to non-HTTP(S) URLs.
+/// In HLS, many HTTP requests (e.g., media, crypt key, variant index) might be issued to different paths or hosts.
+/// In both of these cases, HTTP requests will be missing any cookies that do not apply to the AVURLAsset's URL.
+/// This init option allows the AVURLAsset to use additional HTTP cookies for those HTTP(S) requests.
AVF_EXPORT NSString *const AVURLAssetHTTPCookiesKey API_AVAILABLE(macos(10.15), ios(8.0), tvos(9.0), watchos(1.0), visionos(1.0));
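
A sketch of supplementing an HLS asset with the app's existing cookies, as described above; the stream URL is a placeholder and the array-of-NSHTTPCookie value is the commonly used form for this key:

```objc
#import <AVFoundation/AVFoundation.h>

// Supplements the asset's HTTP requests (media, key, and playlist fetches in
// HLS) with the cookies currently held in the shared cookie storage.
static AVURLAsset *MakeStreamAssetWithCookies(NSURL *streamURL) {
    NSArray<NSHTTPCookie *> *cookies = NSHTTPCookieStorage.sharedHTTPCookieStorage.cookies;
    return [AVURLAsset URLAssetWithURL:streamURL
                               options:@{ AVURLAssetHTTPCookiesKey : cookies }];
}
```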
-/*!
- @constant AVURLAssetAllowsCellularAccessKey
- @abstract Indicates whether network requests on behalf of this asset are allowed to use the cellular interface.
- @discussion
- Default is YES.
-*/
+/// Indicates whether network requests on behalf of this asset are allowed to use the cellular interface.
+///
+/// Default is YES.
AVF_EXPORT NSString *const AVURLAssetAllowsCellularAccessKey API_AVAILABLE(macos(10.15), ios(10.0), tvos(10.0), watchos(3.0), visionos(1.0));
-/*!
- @constant AVURLAssetAllowsExpensiveNetworkAccessKey
- @abstract Indicates whether network requests on behalf of this asset are allowed to use the expensive interface (e.g. cellular, tethered, constrained).
- @discussion
- Default is YES.
- */
+/// Indicates whether network requests on behalf of this asset are allowed to use the expensive interface (e.g. cellular, tethered, constrained).
+///
+/// Default is YES.
AVF_EXPORT NSString *const AVURLAssetAllowsExpensiveNetworkAccessKey API_AVAILABLE(macos(10.15), ios(13.0), tvos(13.0), watchos(6.0), visionos(1.0));
-/*!
- @constant AVURLAssetAllowsConstrainedNetworkAccessKey
- @abstract Indicates whether network requests on behalf of this asset are allowed to use the constrained interface (e.g. interfaces marked as being in data saver mode).
- @discussion
- Default is YES.
- */
+/// Indicates whether network requests on behalf of this asset are allowed to use the constrained interface (e.g. interfaces marked as being in data saver mode).
+///
+/// Default is YES.
AVF_EXPORT NSString *const AVURLAssetAllowsConstrainedNetworkAccessKey API_AVAILABLE(macos(10.15), ios(13.0), tvos(13.0), watchos(6.0), visionos(1.0));
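
These three network-access keys (each defaulting to YES per the discussions above) can be combined; a sketch with a placeholder URL:

```objc
#import <AVFoundation/AVFoundation.h>

// Restricts network loading for this asset to non-cellular, non-expensive,
// non-constrained interfaces.
static AVURLAsset *MakeUnmeteredOnlyAsset(NSURL *url) {
    return [AVURLAsset URLAssetWithURL:url options:@{
        AVURLAssetAllowsCellularAccessKey : @NO,
        AVURLAssetAllowsExpensiveNetworkAccessKey : @NO,
        AVURLAssetAllowsConstrainedNetworkAccessKey : @NO,
    }];
}
```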
-/*!
- @constant AVURLAssetShouldSupportAliasDataReferencesKey
- @abstract Indicates whether alias data references in the asset should be parsed and resolved.
- @discussion
- Default is NO. Although the majority of QuickTime movie files contain all of the media data they require, some contain references to media stored in other files. While AVFoundation and CoreMedia typically employ a URL reference for this purpose, older implementations such as QuickTime 7 have commonly employed a Macintosh alias instead, as documented in the QuickTime File Format specification. If your application must work with legacy QuickTime movie files containing alias-based references to media data stored in other files, the use of this AVURLAsset initialization option is appropriate.
-
- If you provide a value for AVURLAssetReferenceRestrictionsKey, restrictions will be observed for resolved alias references just as they are for URL references.
-
- For more details about alias resolution, consult documentation of the bookmark-related interfaces of NSURL.
- */
+/// Indicates whether alias data references in the asset should be parsed and resolved.
+///
+/// Default is NO. Although the majority of QuickTime movie files contain all of the media data they require, some contain references to media stored in other files. While AVFoundation and CoreMedia typically employ a URL reference for this purpose, older implementations such as QuickTime 7 have commonly employed a Macintosh alias instead, as documented in the QuickTime File Format specification. If your application must work with legacy QuickTime movie files containing alias-based references to media data stored in other files, the use of this AVURLAsset initialization option is appropriate.
+///
+/// If you provide a value for AVURLAssetReferenceRestrictionsKey, restrictions will be observed for resolved alias references just as they are for URL references.
+///
+/// For more details about alias resolution, consult documentation of the bookmark-related interfaces of NSURL.
AVF_EXPORT NSString *const AVURLAssetShouldSupportAliasDataReferencesKey API_AVAILABLE(macos(10.10)) API_UNAVAILABLE(ios, tvos, watchos, visionos);
-/*!
- @constant AVURLAssetURLRequestAttributionKey
- @abstract
- Specifies the attribution of the URLs requested by this asset.
- @discussion
- Value is an NSNumber whose value is an NSURLRequestAttribution (see NSURLRequest.h).
- Default value is NSURLRequestAttributionDeveloper.
- All NSURLRequests issed on behalf of this AVURLAsset will be attributed with this value and follow the App Privacy Policy accordingly.
-*/
+/// Specifies the attribution of the URLs requested by this asset.
+///
+/// Value is an NSNumber whose value is an NSURLRequestAttribution (see NSURLRequest.h).
+/// Default value is NSURLRequestAttributionDeveloper.
+/// All NSURLRequests issued on behalf of this AVURLAsset will be attributed with this value and follow the App Privacy Policy accordingly.
AVF_EXPORT NSString *const AVURLAssetURLRequestAttributionKey API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @constant AVURLAssetHTTPUserAgentKey
- @abstract
- Specifies the value of the User-Agent header to add to HTTP requests made by this asset.
- @discussion
- Value is an NSString
- Default value is the systems's default User-Agent.
-*/
+/// Specifies the value of the User-Agent header to add to HTTP requests made by this asset.
+///
+/// Value is an NSString.
+/// Default value is the system's default User-Agent.
AVF_EXPORT NSString *const AVURLAssetHTTPUserAgentKey API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @constant AVURLAssetPrimarySessionIdentifierKey
- @abstract
- Specifies a UUID to append as the value of the query parameter "_HLS_primary_id" to selected HTTP requests issued on behalf of the asset. Supported for HLS assets only.
- @discussion
- Value is an NSUUID. Its UUID string value will be used as the query parameter.
- If you create AVURLAssets for the templateItems of AVPlayerInterstitialEvents and you want the instances of AVURLAsset that you create to be used during interstitial playback rather than equivalent AVURLAssets with the same URL, you must provide a value for this key that's equal to the httpSessionIdentifier of the primary AVPlayerItem's asset. See AVPlayerInterstitialEventController.h. This is especially useful if you require the use of a custom AVAssetResourceLoader delegate for interstitial assets.
-*/
+/// Specifies a UUID to append as the value of the query parameter "_HLS_primary_id" to selected HTTP requests issued on behalf of the asset. Supported for HLS assets only.
+///
+/// Value is an NSUUID. Its UUID string value will be used as the query parameter.
+/// If you create AVURLAssets for the templateItems of AVPlayerInterstitialEvents and you want the instances of AVURLAsset that you create to be used during interstitial playback rather than equivalent AVURLAssets with the same URL, you must provide a value for this key that's equal to the httpSessionIdentifier of the primary AVPlayerItem's asset. See AVPlayerInterstitialEventController.h. This is especially useful if you require the use of a custom AVAssetResourceLoader delegate for interstitial assets.
AVF_EXPORT NSString *const AVURLAssetPrimarySessionIdentifierKey API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
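
A sketch of the interstitial case described above, assuming `primaryAsset` is the AVURLAsset backing the primary AVPlayerItem (its httpSessionIdentifier is declared further down in this header):

```objc
#import <AVFoundation/AVFoundation.h>

// Ensures this interstitial template asset shares the primary asset's
// "_HLS_primary_id" so it is used during interstitial playback.
static AVURLAsset *MakeInterstitialTemplateAsset(NSURL *interstitialURL, AVURLAsset *primaryAsset) {
    return [AVURLAsset URLAssetWithURL:interstitialURL
                               options:@{ AVURLAssetPrimarySessionIdentifierKey : primaryAsset.httpSessionIdentifier }];
}
```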
+/// Indicates whether additional projected media signaling in the asset should be parsed and resolved as format description extensions.
+///
+/// Default is NO.
+AVF_EXPORT NSString *const AVURLAssetShouldParseExternalSphericalTagsKey API_AVAILABLE(macos(26.0), ios(26.0), visionos(26.0)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @class AVURLAsset
-
- @abstract AVURLAsset provides access to the AVAsset model for timed audiovisual media referenced by URL.
-
- @discussion
- Note that although instances of AVURLAsset are immutable, values for its keys may not be immediately available without blocking. See the discussion of the class AVAsset above regarding the availability of values for keys and the use of AVAsynchronousKeyValueLoading.
-
- Once an AVURLAsset's value for a key is available, it will not change. AVPlayerItem provides access to information that can change dynamically during playback; see AVPlayerItem.duration and AVPlayerItem.tracks.
-
- AVURLAssets can be initialized with NSURLs that refer to audiovisual media resources, such as streams (including HTTP live streams), QuickTime movie files, MP3 files, and files of other types.
-
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
@class AVURLAssetInternal;
+/// AVURLAsset provides access to the AVAsset model for timed audiovisual media referenced by URL.
+///
+/// Note that although instances of AVURLAsset are immutable, values for its keys may not be immediately available without blocking. See the discussion of the class AVAsset above regarding the availability of values for keys and the use of AVAsynchronousKeyValueLoading.
+///
+/// Once an AVURLAsset's value for a key is available, it will not change. AVPlayerItem provides access to information that can change dynamically during playback; see AVPlayerItem.duration and AVPlayerItem.tracks.
+///
+/// AVURLAssets can be initialized with NSURLs that refer to audiovisual media resources, such as streams (including HTTP live streams), QuickTime movie files, MP3 files, and files of other types.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0), watchos(1.0), visionos(1.0))
@interface AVURLAsset : AVAsset
@@ -790,84 +647,67 @@
}
AV_INIT_UNAVAILABLE
-/*!
- @method audiovisualTypes
- @abstract Provides the file types the AVURLAsset class understands.
- @result An NSArray of UTIs identifying the file types the AVURLAsset class understands.
-*/
-+ (NSArray<AVFileType> *)audiovisualTypes API_AVAILABLE(macos(10.7), ios(5.0), tvos(9.0), watchos(1.0), visionos(1.0));
+/// Provides the file types the AVURLAsset class understands.
+///
+/// - Returns: An NSArray of UTIs identifying the file types the AVURLAsset class understands.
++ (NSArray<AVFileType> *)audiovisualTypes API_DEPRECATED("Use audiovisualContentTypes instead", macos(10.7, API_TO_BE_DEPRECATED), ios(5.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), watchos(1.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED));
-/*!
- @method audiovisualMIMETypes
- @abstract Provides the MIME types the AVURLAsset class understands.
- @result An NSArray of NSStrings containing MIME types the AVURLAsset class understands.
-*/
+/// Provides the MIME types the AVURLAsset class understands.
+///
+/// - Returns: An NSArray of NSStrings containing MIME types the AVURLAsset class understands.
+ (NSArray<NSString *> *)audiovisualMIMETypes API_AVAILABLE(macos(10.7), ios(5.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/*!
- @method isPlayableExtendedMIMEType:
- @abstract Returns YES if asset is playable with the codec(s) and container type specified in extendedMIMEType. Returns NO otherwise.
- @param extendedMIMEType
- @result YES or NO.
- @discussion
- On releases prior to macOS 14, iOS 17, tvOS 17, and watchOS 10, regardless of the specified MIME type this method interprets all codecs parameters according to the ISO family syntax defined by RFC 6381 and evaluates playability according to whether the indicated codecs are supported when carried in container formats that conform to the ISO BMFF specification, such as the MPEG-4 file format.
- On releases starting with macOS 14, iOS 17, tvOS 17, and watchOS 10, this method interprets codecs parameters according to the syntax and namespace determined by the specified MIME type and evaluates playability according to whether the indicated codecs are supported when carried in the container format indicated by that MIME type. Codecs parameters for each of the following MIME types are supported: video/mp4 (per RFC 6381, ISO/IEC 14496-15 Annex E, et al), video/quicktime (RFC 6381 et al), video/mp2t (ISO/IEC 13818-1), audio/vnd.wave (RFC 2361), audio/aiff (using the CoreAudio AudioFormatID namespace), audio/x-caf (also using the CoreAudio AudioFormatID namespace), and audio/mpeg (e.g. codecs="mp3"). MIME types supported as alternatives for the same container formats, e.g audio/mp4, are equivalently treated. If the indicated MIME type defines no supported syntax and namespace for codecs parameters, when any codecs parameter is present this method returns NO.
-*/
+/// Provides the content types the AVURLAsset class understands.
+///
+/// - Returns: An NSArray of UTTypes identifying the content types the AVURLAsset class understands.
+@property(class, readonly, copy) NSArray<UTType *> *audiovisualContentTypes API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), watchos(26.0), visionos(26.0));
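
A sketch of the UTType-based check that replaces +audiovisualTypes; whether a particular file is usable still depends on its contents, so this is only a coarse class-level filter:

```objc
#import <AVFoundation/AVFoundation.h>
#import <UniformTypeIdentifiers/UniformTypeIdentifiers.h>

// Returns YES if `type` conforms to one of the content types the AVURLAsset
// class declares it understands.
static BOOL AVURLAssetClassUnderstandsType(UTType *type) {
    for (UTType *supported in AVURLAsset.audiovisualContentTypes) {
        if ([type conformsToType:supported]) {
            return YES;
        }
    }
    return NO;
}
```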
+
+/// Returns YES if asset is playable with the codec(s) and container type specified in extendedMIMEType. Returns NO otherwise.
+///
+/// On releases prior to macOS 14, iOS 17, tvOS 17, and watchOS 10, regardless of the specified MIME type this method interprets all codecs parameters according to the ISO family syntax defined by RFC 6381 and evaluates playability according to whether the indicated codecs are supported when carried in container formats that conform to the ISO BMFF specification, such as the MPEG-4 file format.
+/// On releases starting with macOS 14, iOS 17, tvOS 17, and watchOS 10, this method interprets codecs parameters according to the syntax and namespace determined by the specified MIME type and evaluates playability according to whether the indicated codecs are supported when carried in the container format indicated by that MIME type. Codecs parameters for each of the following MIME types are supported: video/mp4 (per RFC 6381, ISO/IEC 14496-15 Annex E, et al), video/quicktime (RFC 6381 et al), video/mp2t (ISO/IEC 13818-1), audio/vnd.wave (RFC 2361), audio/aiff (using the CoreAudio AudioFormatID namespace), audio/x-caf (also using the CoreAudio AudioFormatID namespace), and audio/mpeg (e.g. codecs="mp3"). MIME types supported as alternatives for the same container formats, e.g. audio/mp4, are equivalently treated. If the indicated MIME type defines no supported syntax and namespace for codecs parameters, when any codecs parameter is present this method returns NO.
+///
+/// - Parameter extendedMIMEType: The extended MIME type, including any codecs parameters, whose playability is to be evaluated.
+///
+/// - Returns: YES or NO.
+ (BOOL)isPlayableExtendedMIMEType: (NSString *)extendedMIMEType API_AVAILABLE(macos(10.7), ios(5.0), tvos(9.0), watchos(1.0), visionos(1.0));
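
For example, a playability check for baseline H.264 video with AAC-LC audio in an MP4 container (the codecs string is illustrative):

```objc
#import <AVFoundation/AVFoundation.h>

static BOOL CanPlayBaselineH264WithAAC(void) {
    return [AVURLAsset isPlayableExtendedMIMEType:
            @"video/mp4; codecs=\"avc1.42E01E, mp4a.40.2\""];
}
```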
-/*!
- @method URLAssetWithURL:options:
- @abstract Returns an instance of AVURLAsset for inspection of a media resource.
- @param URL
- An instance of NSURL that references a media resource.
- @param options
- An instance of NSDictionary that contains keys for specifying options for the initialization of the AVURLAsset. See AVURLAssetPreferPreciseDurationAndTimingKey and AVURLAssetReferenceRestrictionsKey above.
- @result An instance of AVURLAsset.
-*/
+/// Returns an instance of AVURLAsset for inspection of a media resource.
+///
+/// - Parameter URL: An instance of NSURL that references a media resource.
+/// - Parameter options: An instance of NSDictionary that contains keys for specifying options for the initialization of the AVURLAsset. See AVURLAssetPreferPreciseDurationAndTimingKey and AVURLAssetReferenceRestrictionsKey above.
+///
+/// - Returns: An instance of AVURLAsset.
+ (instancetype)URLAssetWithURL:(NSURL *)URL options:(nullable NSDictionary<NSString *, id> *)options;
-/*!
- @method initWithURL:options:
- @abstract Initializes an instance of AVURLAsset for inspection of a media resource.
- @param URL
- An instance of NSURL that references a media resource.
- @param options
- An instance of NSDictionary that contains keys for specifying options for the initialization of the AVURLAsset. See AVURLAssetPreferPreciseDurationAndTimingKey and AVURLAssetReferenceRestrictionsKey above.
- @result An instance of AVURLAsset.
-*/
+/// Initializes an instance of AVURLAsset for inspection of a media resource.
+///
+/// - Parameter URL: An instance of NSURL that references a media resource.
+/// - Parameter options: An instance of NSDictionary that contains keys for specifying options for the initialization of the AVURLAsset. See AVURLAssetPreferPreciseDurationAndTimingKey and AVURLAssetReferenceRestrictionsKey above.
+///
+/// - Returns: An instance of AVURLAsset.
- (instancetype)initWithURL:(NSURL *)URL options:(nullable NSDictionary<NSString *, id> *)options NS_DESIGNATED_INITIALIZER;
-/*!
- @property URL
- @abstract
- Indicates the URL with which the instance of AVURLAsset was initialized.
-*/
+/// Indicates the URL with which the instance of AVURLAsset was initialized.
@property (nonatomic, readonly, copy) NSURL *URL;
-/*!
- @property httpSessionIdentifier
- @abstract
- Provides the identifier that's automatically included in any HTTP request issued on behalf of this asset in the HTTP header field "X-Playback-Session-Id".
- @discussion
- The value is an NSUUID from which the UUID string can be obtained.
- Note that copies of an AVURLAsset vend an equivalent httpSessionIdentifier.
- */
- @property (nonatomic, readonly) NSUUID *httpSessionIdentifier API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
+/// Provides the identifier that's automatically included in any HTTP request issued on behalf of this asset in the HTTP header field "X-Playback-Session-Id".
+///
+/// The value is an NSUUID from which the UUID string can be obtained.
+/// Note that copies of an AVURLAsset vend an equivalent httpSessionIdentifier.
+@property (nonatomic, readonly) NSUUID *httpSessionIdentifier API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
@end
-
@class AVAssetResourceLoader;
@interface AVURLAsset (AVURLAssetURLHandling)
-/*!
- @property resourceLoader
- @abstract
- Provides access to an instance of AVAssetResourceLoader, which offers limited control over the handling of URLs that may be loaded in the course of performing operations on the asset, such as playback.
- The loading of file URLs cannot be mediated via use of AVAssetResourceLoader.
- Note that copies of an AVAsset will vend the same instance of AVAssetResourceLoader.
-*/
+/// Provides access to an instance of AVAssetResourceLoader, which offers limited control over the handling of URLs that may be loaded in the course of performing operations on the asset, such as playback.
+///
+/// The loading of file URLs cannot be mediated via use of AVAssetResourceLoader.
+///
+/// Note that copies of an AVAsset will vend the same instance of AVAssetResourceLoader.
@property (nonatomic, readonly) AVAssetResourceLoader *resourceLoader API_AVAILABLE(macos(10.9), ios(6.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
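
A sketch of installing a resource loader delegate for a custom (non-file) URL scheme; the delegate implementation and scheme handling are assumptions and live elsewhere:

```objc
#import <AVFoundation/AVFoundation.h>

// Note: AVAssetResourceLoader holds its delegate weakly, so the caller must
// keep a strong reference to `delegate` for the lifetime of the asset.
static AVURLAsset *MakeAssetWithResourceLoaderDelegate(NSURL *customSchemeURL,
                                                       id<AVAssetResourceLoaderDelegate> delegate) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:customSchemeURL options:nil];
    dispatch_queue_t queue = dispatch_queue_create("resource-loader-queue", DISPATCH_QUEUE_SERIAL);
    [asset.resourceLoader setDelegate:delegate queue:queue];
    return asset;
}
```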
@@ -876,27 +716,21 @@
@interface AVURLAsset (AVURLAssetCache)
-/*!
- @property assetCache
- @abstract Provides access to an instance of AVAssetCache to use for inspection of locally cached media data. Will be nil if an asset has not been configured to store or access media data from disk.
-*/
+/// Provides access to an instance of AVAssetCache to use for inspection of locally cached media data. Will be nil if an asset has not been configured to store or access media data from disk.
@property (nonatomic, readonly, nullable) AVAssetCache *assetCache API_AVAILABLE(macos(10.12), ios(10.0), tvos(10.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
@interface AVURLAsset (AVAssetCompositionUtility )
-/*!
- @method compatibleTrackForCompositionTrack:
- @abstract Provides a reference to an AVAssetTrack of the target from which any timeRange
- can be inserted into a mutable composition track (via -[AVMutableCompositionTrack insertTimeRange:ofTrack:atTime:error:]).
- @param compositionTrack
- The composition track for which a compatible AVAssetTrack is requested.
- @result an instance of AVAssetTrack
- @discussion
- Finds a track of the target with content that can be accommodated by the specified composition track.
- The logical complement of -[AVMutableComposition mutableTrackCompatibleWithTrack:].
-*/
+/// Provides a reference to an AVAssetTrack of the target from which any timeRange can be inserted into a mutable composition track (via -[AVMutableCompositionTrack insertTimeRange:ofTrack:atTime:error:]).
+///
+/// Finds a track of the target with content that can be accommodated by the specified composition track.
+/// The logical complement of -[AVMutableComposition mutableTrackCompatibleWithTrack:].
+///
+/// - Parameter compositionTrack: The composition track for which a compatible AVAssetTrack is requested.
+///
+/// - Returns: An instance of AVAssetTrack; may be nil if no compatible track is available.
- (nullable AVAssetTrack *)compatibleTrackForCompositionTrack:(AVCompositionTrack *)compositionTrack
#if __swift__
API_DEPRECATED("Use findCompatibleTrack(for:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -904,29 +738,22 @@
API_DEPRECATED("Use findCompatibleTrackForCompositionTrack:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method findCompatibleTrackForCompositionTrack:completionHandler:
- @abstract Loads a reference to an AVAssetTrack of the target from which any timeRange
- can be inserted into a mutable composition track (via -[AVMutableCompositionTrack insertTimeRange:ofTrack:atTime:error:]).
- @param compositionTrack
- The composition track for which a compatible AVAssetTrack is requested.
- @param completionHandler
- A block that is invoked when loading is complete, vending an instance of AVAssetTrack or an error.
- @discussion
- Finds a track of the target with content that can be accommodated by the specified composition track.
- The logical complement of -[AVMutableComposition mutableTrackCompatibleWithTrack:].
-*/
+/// Loads a reference to an AVAssetTrack of the target from which any timeRange can be inserted into a mutable composition track (via -[AVMutableCompositionTrack insertTimeRange:ofTrack:atTime:error:]).
+///
+/// Finds a track of the target with content that can be accommodated by the specified composition track.
+/// The logical complement of -[AVMutableComposition mutableTrackCompatibleWithTrack:].
+///
+/// - Parameter compositionTrack: The composition track for which a compatible AVAssetTrack is requested.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending an instance of AVAssetTrack or an error.
- (void)findCompatibleTrackForCompositionTrack:(AVCompositionTrack *)compositionTrack completionHandler:(void (^ NS_SWIFT_SENDABLE)(AVAssetTrack * _Nullable_result, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
@end
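
A sketch pairing the asynchronous lookup above with -[AVMutableCompositionTrack insertTimeRange:ofTrack:atTime:error:]; the time range and insertion point are caller-supplied placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

// Finds a source track of `asset` compatible with `compositionTrack`, then
// inserts `timeRange` of it at `insertionTime`.
static void AppendCompatibleTrack(AVURLAsset *asset,
                                  AVMutableCompositionTrack *compositionTrack,
                                  CMTimeRange timeRange,
                                  CMTime insertionTime) {
    [asset findCompatibleTrackForCompositionTrack:compositionTrack
                                completionHandler:^(AVAssetTrack * _Nullable track, NSError * _Nullable error) {
        if (track == nil) { return; } // no compatible track, or loading failed
        NSError *insertError = nil;
        [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:insertionTime error:&insertError];
    }];
}
```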
@interface AVURLAsset (AVAssetVariantInspection)
-/*!
- @property variants
- @abstract Provides an array of AVAssetVariants contained in the asset
- @discussion Some variants may not be playable according to the current device configuration.
-*/
+/// Provides an array of AVAssetVariants contained in the asset.
+///
+/// Some variants may not be playable according to the current device configuration.
@property (nonatomic, readonly) NSArray<AVAssetVariant *> *variants
#if __swift__
API_DEPRECATED("Use load(.variants) instead", macos(10.12, 13.0), ios(15.0, 16.0), tvos(15.0, 16.0), watchos(8.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -936,59 +763,41 @@
@end
-/*!
- @category AVURLAssetNSItemProvider
- @discussion
- AVURLAssets can be shared through any interface that supports passing NSItemProviders. Note that only AVURLAssets with file URLs can be added to NSItemProviders. Attempting to share assets with non file URLs will result in an error.
-
- AVURLAssets can be retrieved from NSItemProviders by directly requesting an AVURLAsset through -[NSItemProvider loadObjectOfClass:completionHandler:]. Requesting data representations of AVURLAssets is not supported. File representations of AVURLAssets will be sent without copying the underlying media and the receiver will be extended readonly sandbox access to the sender's original URL until the AVURLAsset is deallocated. Use of NSFileCoordinator and NSFilePresenter is recommended for both the sender and receive to coordinate possible changes in the file's state once sharing has been completed.
-*/
+/// AVURLAssets can be shared through any interface that supports passing NSItemProviders. Note that only AVURLAssets with file URLs can be added to NSItemProviders. Attempting to share assets with non-file URLs will result in an error.
+///
+/// AVURLAssets can be retrieved from NSItemProviders by directly requesting an AVURLAsset through -[NSItemProvider loadObjectOfClass:completionHandler:]. Requesting data representations of AVURLAssets is not supported. File representations of AVURLAssets will be sent without copying the underlying media and the receiver will be extended readonly sandbox access to the sender's original URL until the AVURLAsset is deallocated. Use of NSFileCoordinator and NSFilePresenter is recommended for both the sender and receiver to coordinate possible changes in the file's state once sharing has been completed.
@interface AVURLAsset (AVURLAssetNSItemProvider) <NSItemProviderReading, NSItemProviderWriting>
@end
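
A sketch of receiving a shared, file-backed asset from an NSItemProvider as the discussion above describes; error handling and NSFileCoordinator usage are omitted:

```objc
#import <AVFoundation/AVFoundation.h>

// Requests an AVURLAsset object representation directly from the provider.
static void LoadSharedAsset(NSItemProvider *provider,
                            void (^handler)(AVURLAsset * _Nullable asset)) {
    if (![provider canLoadObjectOfClass:[AVURLAsset class]]) {
        handler(nil);
        return;
    }
    [provider loadObjectOfClass:[AVURLAsset class]
              completionHandler:^(id<NSItemProviderReading> _Nullable object, NSError * _Nullable error) {
        handler((AVURLAsset *)object);
    }];
}
```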
-/*!
- @interface AVMediaExtensionProperties
- @abstract A class incorporating properties for a MediaExtension
- @discussion AVMediaExtensionProperties objects are returned from property queries on AVAsset, AVPlayerItemTrack, AVSampleBufferDisplayLayer, or AVSampleBufferVideoRenderer.
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// A class incorporating properties for a MediaExtension.
+///
+/// AVMediaExtensionProperties objects are returned from property queries on AVAsset, AVPlayerItemTrack, AVSampleBufferDisplayLayer, or AVSampleBufferVideoRenderer.
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(15.0)) API_UNAVAILABLE(ios, tvos, watchos, visionos)
@interface AVMediaExtensionProperties : NSObject <NSCopying>
AV_INIT_UNAVAILABLE
-/*!
- @property extensionIdentifier
- @abstract The identifier of the Media Extension.
- @discussion The extension identifier string, corresponding to the ClassImplementationID value from the EXAppExtensionAttributes dictionary in the Info.plist file.
-*/
+/// The identifier of the Media Extension.
+///
+/// The extension identifier string, corresponding to the ClassImplementationID value from the EXAppExtensionAttributes dictionary in the Info.plist file.
@property (nonatomic, readonly) NSString *extensionIdentifier;
-/*!
- @property extensionName
- @abstract The name of the MediaExtension.
- @discussion The localized name of the MediaExtension format reader or video decoder, corresponding to the CFBundleDisplayName.
-*/
+/// The name of the MediaExtension.
+///
+/// The localized name of the MediaExtension format reader or video decoder, corresponding to the CFBundleDisplayName.
@property (nonatomic, readonly) NSString *extensionName;
-/*!
- @property containingBundleName
- @abstract The name of the containing application bundle.
- @discussion The localized name of the application that hosts the MediaExtension.
-*/
+/// The name of the containing application bundle.
+///
+/// The localized name of the application that hosts the MediaExtension.
@property (nonatomic, readonly) NSString *containingBundleName;
-/*!
- @property extensionURL
- @abstract The file URL of the MediaExtension bundle.
-*/
+/// The file URL of the MediaExtension bundle.
@property (nonatomic, readonly) NSURL *extensionURL;
-/*!
- @property containingBundleURL
- @abstract The file URL of the host application for the MediaExtension.
-*/
+/// The file URL of the host application for the MediaExtension.
@property (nonatomic, readonly) NSURL *containingBundleURL;
@end
@@ -996,13 +805,16 @@
API_AVAILABLE(macos(15.0)) API_UNAVAILABLE(ios, tvos, watchos, visionos)
@interface AVURLAsset (AVMediaExtension)
-/*!
- @property mediaExtensionProperties
- @abstract The properties of the MediaExtension format reader for the asset.
- @discussion If the asset is being decoded using a MediaExtension format reader, this property will return a AVMediaExtensionProperties object describing the extension. If the asset is not being decoded with a MediaExtension format reader, this property will return nil.
-*/
+/// The properties of the MediaExtension format reader for the asset.
+///
+/// If the asset is being decoded using a MediaExtension format reader, this property will return an AVMediaExtensionProperties object describing the extension. If the asset is not being decoded with a MediaExtension format reader, this property will return nil.
@property (nonatomic, readonly, nullable) AVMediaExtensionProperties *mediaExtensionProperties;
+/// The sidecar URL used by the MediaExtension.
+///
+/// The sidecar URL is returned only if the MediaExtension format reader supports sidecar files and implements -[MEFileInfo setSidecarFilename:]. Will return nil otherwise.
+@property (nonatomic, readonly, nullable) NSURL *sidecarURL API_AVAILABLE(macos(26.0)) API_UNAVAILABLE(ios, tvos, watchos, visionos) NS_SWIFT_UNAVAILABLE("Use load(.sidecarURL) instead");
+
@end
@@ -1013,61 +825,44 @@
Some of the notifications are also posted by instances of dynamic subclasses, AVFragmentedAsset and AVFragmentedMovie, but these are capable of changing only in well-defined ways and only under specific conditions that you control.
*/
-/*!
- @constant AVAssetDurationDidChangeNotification
- @abstract Posted when the duration of an AVFragmentedAsset changes while it's being minded by an AVFragmentedAssetMinder, but only for changes that occur after the status of the value of @"duration" has reached AVKeyValueStatusLoaded.
-*/
+/// Posted when the duration of an AVFragmentedAsset changes while it's being minded by an AVFragmentedAssetMinder, but only for changes that occur after the status of the value of @"duration" has reached AVKeyValueStatusLoaded.
AVF_EXPORT NSString *const AVAssetDurationDidChangeNotification API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
-/*!
- @constant AVAssetContainsFragmentsDidChangeNotification
- @abstract Posted after the value of @"containsFragments" has already been loaded and the AVFragmentedAsset is added to an AVFragmentedAssetMinder, either when 1) fragments are detected in the asset on disk after it had previously contained none or when 2) no fragments are detected in the asset on disk after it had previously contained one or more.
-*/
+/// Posted after the value of @"containsFragments" has already been loaded and the AVFragmentedAsset is added to an AVFragmentedAssetMinder, either when 1) fragments are detected in the asset on disk after it had previously contained none or when 2) no fragments are detected in the asset on disk after it had previously contained one or more.
AVF_EXPORT NSString *const AVAssetContainsFragmentsDidChangeNotification API_AVAILABLE(macos(10.11), ios(12.0), tvos(12.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @constant AVAssetWasDefragmentedNotification
- @abstract Posted when the asset on disk is defragmented while an AVFragmentedAsset is being minded by an AVFragmentedAssetMinder, but only if the defragmentation occurs after the status of the value of @"canContainFragments" has reached AVKeyValueStatusLoaded.
- @discussion After this notification is posted, the value of the asset properties canContainFragments and containsFragments will both be NO.
-*/
+/// Posted when the asset on disk is defragmented while an AVFragmentedAsset is being minded by an AVFragmentedAssetMinder, but only if the defragmentation occurs after the status of the value of @"canContainFragments" has reached AVKeyValueStatusLoaded.
+///
+/// After this notification is posted, the value of the asset properties canContainFragments and containsFragments will both be NO.
AVF_EXPORT NSString *const AVAssetWasDefragmentedNotification API_AVAILABLE(macos(10.11), ios(12.0), tvos(12.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @constant AVAssetChapterMetadataGroupsDidChangeNotification
- @abstract Posted when the collection of arrays of timed metadata groups representing chapters of an AVAsset change and when any of the contents of the timed metadata groups change, but only for changes that occur after the status of the value of @"availableChapterLocales" has reached AVKeyValueStatusLoaded.
-*/
+/// Posted when the collection of arrays of timed metadata groups representing chapters of an AVAsset change and when any of the contents of the timed metadata groups change, but only for changes that occur after the status of the value of @"availableChapterLocales" has reached AVKeyValueStatusLoaded.
AVF_EXPORT NSString *const AVAssetChapterMetadataGroupsDidChangeNotification API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
-/*!
- @constant AVAssetMediaSelectionGroupsDidChangeNotification
- @abstract Posted when the collection of media selection groups provided by an AVAsset changes and when any of the contents of its media selection groups change, but only for changes that occur after the status of the value of @"availableMediaCharacteristicsWithMediaSelectionOptions" has reached AVKeyValueStatusLoaded.
-*/
+/// Posted when the collection of media selection groups provided by an AVAsset changes and when any of the contents of its media selection groups change, but only for changes that occur after the status of the value of @"availableMediaCharacteristicsWithMediaSelectionOptions" has reached AVKeyValueStatusLoaded.
AVF_EXPORT NSString *const AVAssetMediaSelectionGroupsDidChangeNotification API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
#pragma mark --- AVFragmentedAsset ---
-/*!
- @class AVFragmentedAsset
-
- @abstract A subclass of AVURLAsset that represents media resources that can be extended in total duration without modifying previously existing data structures.
- Such media resources include QuickTime movie files and MPEG-4 files that indicate, via an 'mvex' box in their 'moov' box, that they accommodate additional fragments. Media resources of other types may also be supported. To check whether a given instance of AVFragmentedAsset can be used to monitor the addition of fragments, check the value of the AVURLAsset property canContainFragments.
- An AVFragmentedAsset is capable of changing the values of certain of its properties and those of its tracks, while an operation that appends fragments to the underlying media resource in in progress, if the AVFragmentedAsset is associated with an instance of AVFragmentedAssetMinder.
- @discussion While associated with an AVFragmentedAssetMinder, AVFragmentedAsset posts AVAssetDurationDidChangeNotification whenever new fragments are detected, as appropriate. It may also post AVAssetContainsFragmentsDidChangeNotification and AVAssetWasDefragmentedNotification, as discussed in documentation of those notifications.
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
@protocol AVFragmentMinding
-/*!
- @property associatedWithFragmentMinder
- @abstract Indicates whether an AVAsset that supports fragment minding is currently associated with a fragment minder, e.g. an instance of AVFragmentedAssetMinder.
- @discussion AVAssets that support fragment minding post change notifications only while associated with a fragment minder.
-*/
+/// Indicates whether an AVAsset that supports fragment minding is currently associated with a fragment minder, e.g. an instance of AVFragmentedAssetMinder.
+///
+/// AVAssets that support fragment minding post change notifications only while associated with a fragment minder.
@property (nonatomic, readonly, getter=isAssociatedWithFragmentMinder) BOOL associatedWithFragmentMinder API_AVAILABLE(macos(10.11), ios(12.0), tvos(12.0), watchos(6.0), visionos(1.0));
@end
@class AVFragmentedAssetInternal;
+/// A subclass of AVURLAsset that represents media resources that can be extended in total duration without modifying previously existing data structures.
+///
+/// Such media resources include QuickTime movie files and MPEG-4 files that indicate, via an 'mvex' box in their 'moov' box, that they accommodate additional fragments. Media resources of other types may also be supported. To check whether a given instance of AVFragmentedAsset can be used to monitor the addition of fragments, check the value of the AVURLAsset property canContainFragments.
+///
+/// An AVFragmentedAsset is capable of changing the values of certain of its properties and those of its tracks, while an operation that appends fragments to the underlying media resource is in progress, if the AVFragmentedAsset is associated with an instance of AVFragmentedAssetMinder.
+///
+/// While associated with an AVFragmentedAssetMinder, AVFragmentedAsset posts AVAssetDurationDidChangeNotification whenever new fragments are detected, as appropriate. It may also post AVAssetContainsFragmentsDidChangeNotification and AVAssetWasDefragmentedNotification, as discussed in documentation of those notifications.
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(10.11), ios(12.0), tvos(12.0), watchos(6.0), visionos(1.0))
@interface AVFragmentedAsset : AVURLAsset <AVFragmentMinding>
@@ -1076,36 +871,30 @@
AVFragmentedAssetInternal *_fragmentedAsset __attribute__((unused));
}
-/*!
- @method fragmentedAssetWithURL:options:
- @abstract Returns an instance of AVFragmentedAsset for inspection of a fragmented media resource.
- @param URL
- An instance of NSURL that references a media resource.
- @param options
- An instance of NSDictionary that contains keys for specifying options for the initialization of the AVFragmentedAsset. See AVURLAssetPreferPreciseDurationAndTimingKey and AVURLAssetReferenceRestrictionsKey above.
- @result An instance of AVFragmentedAsset.
-*/
+/// Returns an instance of AVFragmentedAsset for inspection of a fragmented media resource.
+///
+/// - Parameter URL: An instance of NSURL that references a media resource.
+/// - Parameter options: An instance of NSDictionary that contains keys for specifying options for the initialization of the AVFragmentedAsset. See AVURLAssetPreferPreciseDurationAndTimingKey and AVURLAssetReferenceRestrictionsKey above.
+///
+/// - Returns: An instance of AVFragmentedAsset.
+ (instancetype)fragmentedAssetWithURL:(NSURL *)URL options:(nullable NSDictionary<NSString *, id> *)options;
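
A sketch of putting a fragmented asset under a minder and observing the duration notification defined earlier; AVFragmentedAssetMinder is declared later in this header, the minding interval is illustrative, and observer-token cleanup is omitted. The caller is responsible for keeping the returned minder and the asset alive while monitoring:

```objc
#import <AVFoundation/AVFoundation.h>

// Creates a fragmented asset for a growing movie file, registers for duration
// changes, and returns the minder that drives fragment detection.
static AVFragmentedAssetMinder *MindGrowingMovie(NSURL *fileURL) {
    AVFragmentedAsset *asset = [AVFragmentedAsset fragmentedAssetWithURL:fileURL options:nil];
    __unused id observerToken =
        [NSNotificationCenter.defaultCenter addObserverForName:AVAssetDurationDidChangeNotification
                                                         object:asset
                                                          queue:NSOperationQueue.mainQueue
                                                     usingBlock:^(NSNotification *note) {
            AVFragmentedAsset *changed = note.object;
            NSLog(@"New duration: %f", CMTimeGetSeconds(changed.duration));
        }];
    return [AVFragmentedAssetMinder fragmentedAssetMinderWithAsset:asset mindingInterval:10.0];
}
```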
-/*!
- @property tracks
- @abstract The tracks in an asset.
- @discussion The value of this property is an array of tracks the asset contains; the tracks are of type AVFragmentedAssetTrack.
-*/
+/// The tracks in an asset.
+///
+/// The value of this property is an array of tracks the asset contains; the tracks are of type AVFragmentedAssetTrack.
@property (nonatomic, readonly) NSArray<AVFragmentedAssetTrack *> *tracks AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.tracks) instead", macos(10.11, 13.0), ios(12.0, 16.0), tvos(12.0, 16.0), watchos(6.0, 9.0), visionos(1.0, 1.0));
@end
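
A minimal Swift sketch of the inspection flow described above: it creates an AVFragmentedAsset for a local file and loads its tracks with the asynchronous API that the deprecation notes point to. The file URL and the precise-timing option are assumptions for illustration, not part of the header.

```swift
import AVFoundation

// Sketch only: `fileURL` is assumed to point at a fragmented QuickTime movie or
// an MP4 file whose 'moov' box contains an 'mvex' box.
func makeFragmentedAsset(at fileURL: URL) async throws -> AVFragmentedAsset {
    let asset = AVFragmentedAsset(url: fileURL,
                                  options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

    // Prefer the asynchronous loading API over the Swift-deprecated `tracks` property.
    let videoTracks = try await asset.loadTracks(withMediaType: .video)
    print("Found \(videoTracks.count) video track(s) so far")
    return asset
}
```
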
@interface AVFragmentedAsset (AVFragmentedAssetTrackInspection)
-/*!
- @method trackWithTrackID:
- @abstract Provides an instance of AVFragmentedAssetTrack that represents the track of the specified trackID.
- @param trackID
- The trackID of the requested AVFragmentedAssetTrack.
- @result An instance of AVFragmentedAssetTrack; may be nil if no track of the specified trackID is available.
- @discussion Becomes callable without blocking when the key @"tracks" has been loaded
-*/
+/// Provides an instance of AVFragmentedAssetTrack that represents the track of the specified trackID.
+///
+/// Becomes callable without blocking when the key @"tracks" has been loaded.
+///
+/// - Parameter trackID: The trackID of the requested AVFragmentedAssetTrack.
+///
+/// - Returns: An instance of AVFragmentedAssetTrack; may be nil if no track of the specified trackID is available.
- (nullable AVFragmentedAssetTrack *)trackWithTrackID:(CMPersistentTrackID)trackID
#if __swift__
API_DEPRECATED("Use loadTrack(withTrackID:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -1113,24 +902,19 @@
API_DEPRECATED("Use loadTrackWithTrackID:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadTrackWithTrackID:completionHandler:
- @abstract Loads an instance of AVFragmentedAssetTrack that represents the track of the specified trackID.
- @param trackID
- The trackID of the requested AVFragmentedAssetTrack.
- @param completionHandler
- A block that is called when the loading is finished, with either the loaded track (which may be nil if no track of the specified trackID is available) or an error.
-*/
+/// Loads an instance of AVFragmentedAssetTrack that represents the track of the specified trackID.
+///
+/// - Parameter trackID: The trackID of the requested AVFragmentedAssetTrack.
+/// - Parameter completionHandler: A block that is called when the loading is finished, with either the loaded track (which may be nil if no track of the specified trackID is available) or an error.
- (void)loadTrackWithTrackID:(CMPersistentTrackID)trackID completionHandler:(void (^ NS_SWIFT_SENDABLE)(AVFragmentedAssetTrack * _Nullable_result, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @method tracksWithMediaType:
- @abstract Provides an array of AVFragmentedAssetTracks of the asset that present media of the specified media type.
- @param mediaType
- The media type according to which the receiver filters its AVFragmentedAssetTracks. (Media types are defined in AVMediaFormat.h)
- @result An NSArray of AVFragmentedAssetTracks; may be empty if no tracks of the specified media type are available.
- @discussion Becomes callable without blocking when the key @"tracks" has been loaded
-*/
+/// Provides an array of AVFragmentedAssetTracks of the asset that present media of the specified media type.
+///
+/// Becomes callable without blocking when the key @"tracks" has been loaded.
+///
+/// - Parameter mediaType: The media type according to which the receiver filters its AVFragmentedAssetTracks. (Media types are defined in AVMediaFormat.h)
+///
+/// - Returns: An NSArray of AVFragmentedAssetTracks; may be empty if no tracks of the specified media type are available.
- (NSArray<AVFragmentedAssetTrack *> *)tracksWithMediaType:(AVMediaType)mediaType
#if __swift__
API_DEPRECATED("Use loadTracks(withMediaType:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -1138,24 +922,19 @@
API_DEPRECATED("Use loadTracksWithMediaType:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadTracksWithMediaType:completionHandler:
- @abstract Loads an array of AVFragmentedAssetTracks of the asset that present media of the specified media type.
- @param mediaType
- The media type according to which AVAsset filters its AVFragmentedAssetTracks. (Media types are defined in AVMediaFormat.h.)
- @param completionHandler
- A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks of the specified media type are available) or an error.
-*/
+/// Loads an array of AVFragmentedAssetTracks of the asset that present media of the specified media type.
+///
+/// - Parameter mediaType: The media type according to which AVAsset filters its AVFragmentedAssetTracks. (Media types are defined in AVMediaFormat.h.)
+/// - Parameter completionHandler: A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks of the specified media type are available) or an error.
- (void)loadTracksWithMediaType:(AVMediaType)mediaType completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVFragmentedAssetTrack *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @method tracksWithMediaCharacteristic:
- @abstract Provides an array of AVFragmentedAssetTracks of the asset that present media with the specified characteristic.
- @param mediaCharacteristic
- The media characteristic according to which the receiver filters its AVFragmentedAssetTracks. (Media characteristics are defined in AVMediaFormat.h)
- @result An NSArray of AVFragmentedAssetTracks; may be empty if no tracks with the specified characteristic are available.
- @discussion Becomes callable without blocking when the key @"tracks" has been loaded
-*/
+/// Provides an array of AVFragmentedAssetTracks of the asset that present media with the specified characteristic.
+///
+/// Becomes callable without blocking when the key @"tracks" has been loaded.
+///
+/// - Parameter mediaCharacteristic: The media characteristic according to which the receiver filters its AVFragmentedAssetTracks. (Media characteristics are defined in AVMediaFormat.h)
+///
+/// - Returns: An NSArray of AVFragmentedAssetTracks; may be empty if no tracks with the specified characteristic are available.
- (NSArray<AVFragmentedAssetTrack *> *)tracksWithMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic
#if __swift__
API_DEPRECATED("Use loadTracks(withMediaCharacteristic:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -1163,26 +942,19 @@
API_DEPRECATED("Use loadTracksWithMediaCharacteristic:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadTracksWithMediaCharacteristic:completionHandler:
- @abstract Loads an array of AVFragmentedAssetTracks of the asset that present media with the specified characteristic.
- @param mediaCharacteristic
- The media characteristic according to which AVAsset filters its AVFragmentedAssetTracks. (Media characteristics are defined in AVMediaFormat.h.)
- @param completionHandler
- A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks with the specified characteristic are available) or an error.
-*/
+/// Loads an array of AVFragmentedAssetTracks of the asset that present media with the specified characteristic.
+///
+/// - Parameter mediaCharacteristic: The media characteristic according to which AVAsset filters its AVFragmentedAssetTracks. (Media characteristics are defined in AVMediaFormat.h.)
+/// - Parameter completionHandler: A block that is called when the loading is finished, with either the loaded tracks (which may be empty if no tracks with the specified characteristic are available) or an error.
- (void)loadTracksWithMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVFragmentedAssetTrack *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
@end
#pragma mark --- AVFragmentedAssetMinder ---
-/*!
- @class AVFragmentedAssetMinder
- @abstract A class that periodically checks whether additional fragments have been appended to fragmented assets.
-*/
@class AVFragmentedAssetMinderInternal;
+/// A class that periodically checks whether additional fragments have been appended to fragmented assets.
NS_SWIFT_NONSENDABLE
API_AVAILABLE(macos(10.11), ios(12.0), tvos(12.0), watchos(6.0), visionos(1.0))
@interface AVFragmentedAssetMinder : NSObject
@@ -1191,67 +963,49 @@
AVFragmentedAssetMinderInternal *_fragmentedAssetMinder;
}
-/*!
- @method fragmentedAssetMinderWithAsset:mindingInterval:
- @abstract Creates an AVFragmentedAssetMinder, adds the specified asset to it, and sets the mindingInterval to the specified value.
- @param asset
- An instance of AVFragmentedAsset to add to the AVFragmentedAssetMinder
- @param mindingInterval
- The initial minding interval of the AVFragmentedAssetMinder.
- @result A new instance of AVFragmentedAssetMinder.
-*/
+/// Creates an AVFragmentedAssetMinder, adds the specified asset to it, and sets the mindingInterval to the specified value.
+///
+/// - Parameter asset: An instance of AVFragmentedAsset to add to the AVFragmentedAssetMinder
+/// - Parameter mindingInterval: The initial minding interval of the AVFragmentedAssetMinder.
+///
+/// - Returns: A new instance of AVFragmentedAssetMinder.
+ (instancetype)fragmentedAssetMinderWithAsset:(AVAsset<AVFragmentMinding> *)asset mindingInterval:(NSTimeInterval)mindingInterval;
-/*!
- @method initWithAsset:mindingInterval:
- @abstract Creates an AVFragmentedAssetMinder, adds the specified asset to it, and sets the mindingInterval to the specified value.
- @param asset
- An instance of AVFragmentedAsset to add to the AVFragmentedAssetMinder
- @param mindingInterval
- The initial minding interval of the AVFragmentedAssetMinder.
- @result A new instance of AVFragmentedAssetMinder.
-*/
+/// Creates an AVFragmentedAssetMinder, adds the specified asset to it, and sets the mindingInterval to the specified value.
+///
+/// - Parameter asset: An instance of AVFragmentedAsset to add to the AVFragmentedAssetMinder
+/// - Parameter mindingInterval: The initial minding interval of the AVFragmentedAssetMinder.
+///
+/// - Returns: A new instance of AVFragmentedAssetMinder.
- (instancetype)initWithAsset:(AVAsset<AVFragmentMinding> *)asset mindingInterval:(NSTimeInterval)mindingInterval;
-/*!
- @property mindingInterval
- @abstract An NSTimeInterval indicating how often a check for additional fragments should be performed. The default interval is 10.0.
- @discussion This property throws an excepion if a value is set less than one millisecond (0.001) in duration.
-*/
+/// An NSTimeInterval indicating how often a check for additional fragments should be performed. The default interval is 10.0.
+///
+/// This property throws an exception if it is set to a value less than one millisecond (0.001) in duration.
@property (nonatomic) NSTimeInterval mindingInterval;
-/*!
- @property assets
- @abstract An NSArray of the AVFragmentedAsset objects being minded.
-*/
+/// An NSArray of the AVFragmentedAsset objects being minded.
@property (nonatomic, readonly) NSArray<AVAsset<AVFragmentMinding> *> *assets;
-/*!
- @method addFragmentedAsset:
- @abstract Adds a fragmented asset to the array of assets being minded.
- @param asset
- The fragmented asset to add to the minder.
- @discussion This method throws an exception if the asset is not a supported type (AVFragmentedAsset, AVFragmentedMovie), or if the asset is already being minded by another fragment minder.
-*/
+/// Adds a fragmented asset to the array of assets being minded.
+///
+/// This method throws an exception if the asset is not a supported type (AVFragmentedAsset, AVFragmentedMovie), or if the asset is already being minded by another fragment minder.
+///
+/// - Parameter asset: The fragmented asset to add to the minder.
- (void)addFragmentedAsset:(AVAsset<AVFragmentMinding> *)asset;
-/*!
- @method removeFragmentedAsset:
- @abstract Removes a fragmented asset from the array of assets being minded.
- @param asset
- The fragmented asset to remove from the minder.
- @discussion This method throws an exception if the asset is not a supported type (AVFragmentedAsset, AVFragmentedMovie).
-*/
+/// Removes a fragmented asset from the array of assets being minded.
+///
+/// This method throws an exception if the asset is not a supported type (AVFragmentedAsset, AVFragmentedMovie).
+///
+/// - Parameter asset: The fragmented asset to remove from the minder.
- (void)removeFragmentedAsset:(AVAsset<AVFragmentMinding> *)asset;
@end
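
A minimal Swift sketch of the minding flow described above: the asset only posts change notifications while it is associated with a minder. The 2-second interval and the notification handling are assumptions for illustration.

```swift
import AVFoundation

// Sketch only: `asset` is assumed to be backed by a file that another process
// is still appending fragments to.
func startMinding(_ asset: AVFragmentedAsset) -> (AVFragmentedAssetMinder, NSObjectProtocol) {
    let minder = AVFragmentedAssetMinder(asset: asset, mindingInterval: 2.0)

    // Duration changes are posted only while the asset is associated with the minder.
    let token = NotificationCenter.default.addObserver(forName: .AVAssetDurationDidChange,
                                                       object: asset,
                                                       queue: .main) { _ in
        print("New fragments detected; duration changed")
    }

    // Keep both values alive; call minder.removeFragmentedAsset(asset) when done.
    return (minder, token)
}
```
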
@interface AVURLAsset (AVURLAssetContentKeyEligibility) <AVContentKeyRecipient>
-/*!
- @property mayRequireContentKeysForMediaDataProcessing
- @abstract Allows AVURLAsset to be added as a content key recipient to an AVContentKeySession.
-*/
+/// Allows AVURLAsset to be added as a content key recipient to an AVContentKeySession.
@property (nonatomic, readonly) BOOL mayRequireContentKeysForMediaDataProcessing API_AVAILABLE(macos(10.12.4), ios(10.3), tvos(10.2), watchos(3.3), visionos(1.0));
@end
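
A minimal Swift sketch of what mayRequireContentKeysForMediaDataProcessing enables: adding an AVURLAsset as a recipient of an AVContentKeySession. The FairPlay key system, the queue label, and the `keyDelegate` object are assumptions; the delegate implementation is the app's responsibility.

```swift
import AVFoundation

// Sketch only: `keyDelegate` is an app-provided AVContentKeySessionDelegate.
func attachContentKeys(to asset: AVURLAsset,
                       keyDelegate: AVContentKeySessionDelegate) -> AVContentKeySession {
    let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)
    keySession.setDelegate(keyDelegate, queue: DispatchQueue(label: "content-key-queue"))

    // AVURLAsset conforms to AVContentKeyRecipient, so it can be added directly.
    keySession.addContentKeyRecipient(asset)
    return keySession
}
```
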
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetCache.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetCache.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetCache.h 2025-04-19 06:23:33
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetCache.h 2025-05-31 21:11:58
@@ -15,39 +15,37 @@
@class AVMediaSelectionGroup;
@class AVMediaSelectionOption;
+@class AVMediaPresentationSelector;
+@class AVMediaPresentationSetting;
-/*!
- @class AVAssetCache
-
- @abstract
- AVAssetCache is a class vended by an AVAsset used for the inspection of locally available media data.
-
- @discussion
- AVAssetCaches are vended by AVURLAsset's assetCache property.
-
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// AVAssetCache is a class vended by an AVAsset used for the inspection of locally available media data.
+///
+/// AVAssetCaches are vended by AVURLAsset's assetCache property.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(10.12), ios(10.0), tvos(10.0), watchos(10.0), visionos(1.0))
@interface AVAssetCache : NSObject
-/*
- @property playableOffline
- @abstract
- Returns YES if a complete rendition of an AVAsset is available to be played without a network connection.
- @discussion
- An answer of YES does not indicate that any given media selection is available for offline playback. To determine if a specific media selection is available offline, see mediaSelectionOptionsInMediaSelectionGroup:.
-*/
+/// Returns YES if a complete rendition of an AVAsset is available to be played without a network connection.
+///
+/// An answer of YES does not indicate that any given media selection is available for offline playback. To determine if a specific media selection is available offline, see mediaSelectionOptionsInMediaSelectionGroup:.
@property (nonatomic, readonly, getter=isPlayableOffline) BOOL playableOffline;
-/*
- @method mediaSelectionOptionsInMediaSelectionGroup:
- @abstract
- Returns an array of AVMediaSelectionOptions in an AVMediaSelectionGroup that are available for offline operations, e.g. playback.
-*/
+/// Returns an array of AVMediaSelectionOptions in an AVMediaSelectionGroup that are available for offline operations, e.g. playback.
- (NSArray<AVMediaSelectionOption *> *)mediaSelectionOptionsInMediaSelectionGroup:(AVMediaSelectionGroup *)mediaSelectionGroup;
AV_INIT_UNAVAILABLE
+
+@end
+
+@interface AVAssetCache (AVAssetCacheCustomMediaSelectionScheme)
+
+/// For each AVMediaPresentationSelector defined by the AVCustomMediaSelectionScheme of an AVMediaSelectionGroup, returns the AVMediaPresentationSettings that can be satisfied for offline operations, e.g. playback.
+- (NSDictionary <AVMediaPresentationSelector *, NSArray<AVMediaPresentationSetting *> *> *)mediaPresentationSettingsForMediaSelectionGroup:(AVMediaSelectionGroup *)mediaSelectionGroup API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), watchos(26.0), visionos(26.0));
+
+/// Returns an array of extended language tags for languages that can be selected for offline operations via use of the AVMediaSelectionGroup's AVCustomMediaSelectionScheme.
+- (NSArray <NSString *> *)mediaPresentationLanguagesForMediaSelectionGroup:(AVMediaSelectionGroup *)mediaSelectionGroup API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), watchos(26.0), visionos(26.0));
@end
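
A minimal Swift sketch of inspecting an asset's cache as described above: check isPlayableOffline first, then ask which audible options are actually available offline. The choice of the audible characteristic is an assumption for illustration.

```swift
import AVFoundation

// Sketch only: `downloadedAsset` is assumed to have been created from the file URL
// previously delivered by an asset download task.
func offlineAudioOptions(for downloadedAsset: AVURLAsset) async throws -> [AVMediaSelectionOption] {
    guard let cache = downloadedAsset.assetCache, cache.isPlayableOffline else { return [] }

    // Only the options reported here are guaranteed to play without a network connection.
    guard let audibleGroup = try await downloadedAsset.loadMediaSelectionGroup(for: .audible) else {
        return []
    }
    return cache.mediaSelectionOptions(in: audibleGroup)
}
```
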
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadStorageManager.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadStorageManager.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadStorageManager.h 2025-04-19 03:33:09
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadStorageManager.h 2025-05-31 21:42:12
@@ -8,120 +8,80 @@
*/
-/*!
- @class AVAssetDownloadStorageManager
-
- @abstract An AVAssetDownloadStorageManager manages the policy for automatic purging of downloaded AVAssets. The policy is vended as AVAssetDownloadStorageManagementPolicy object.
-
- @discussion When a storage management policy needs to be set on an asset, sharedDownloadStorageManager singleton needs to be fetched.
- The new policy can then be set by using setStorageManagementPolicy and the location of the downloaded asset.
- */
-
#import <AVFoundation/AVBase.h>
#import <Foundation/Foundation.h>
NS_ASSUME_NONNULL_BEGIN
@class AVAssetDownloadStorageManagementPolicy;
-/*!
- @group AVAssetDownloadedAssetEvictionPriority string constants
- @brief Used by AVAssetDownloadStorageManagementPolicy.
-*/
-typedef NSString *AVAssetDownloadedAssetEvictionPriority NS_STRING_ENUM API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos);
+// Used by AVAssetDownloadStorageManagementPolicy.
-/*!
- @enum AVAssetDownloadedAssetEvictionPriority
- @abstract These constants represents the eviction priority of downloaded assets.
-
- @constant AVAssetDownloadedAssetEvictionPriorityImportant
- Used to mark assets with the highest priority. They will be the last to be purged.
- @constant AVAssetDownloadedAssetEvictionPriorityDefault
- Used to mark assets have the default priority. They will be the first to be purged.
-*/
+/// These constants represent the eviction priority of downloaded assets.
+typedef NSString *AVAssetDownloadedAssetEvictionPriority NS_STRING_ENUM API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos);
+/// Used to mark assets with the highest priority. They will be the last to be purged.
AVF_EXPORT AVAssetDownloadedAssetEvictionPriority const AVAssetDownloadedAssetEvictionPriorityImportant API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos);
+/// Used to mark assets that have the default priority. They will be the first to be purged.
AVF_EXPORT AVAssetDownloadedAssetEvictionPriority const AVAssetDownloadedAssetEvictionPriorityDefault API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos);
+/// An AVAssetDownloadStorageManager manages the policy for automatic purging of downloaded AVAssets. The policy is vended as an AVAssetDownloadStorageManagementPolicy object.
+///
+/// When a storage management policy needs to be set on an asset, fetch the sharedDownloadStorageManager singleton.
+/// The new policy can then be set using setStorageManagementPolicy:forURL: with the location of the downloaded asset.
API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos)
@interface AVAssetDownloadStorageManager : NSObject
-/*!
- @method sharedDownloadStorageManager
- @abstract returns singleton instance.
-*/
+/// Returns the singleton instance.
+ (AVAssetDownloadStorageManager *)sharedDownloadStorageManager;
-/*!
- @method setStorageManagementPolicy: forURL
- @abstract Sets the policy for asset with disk backing at downloadStorageURL.
- @param downloadStorageURL
- The location of downloaded asset.
-*/
+/// Sets the policy for the asset with disk backing at downloadStorageURL.
+///
+/// - Parameter downloadStorageURL: The location of the downloaded asset.
- (void)setStorageManagementPolicy:(AVAssetDownloadStorageManagementPolicy *)storageManagementPolicy forURL:(NSURL *)downloadStorageURL;
-/*!
- @method storageManagementPolicyForURL:downloadStorageURL
- @abstract Returns the storage management policy for asset downloaded at downloadStorageURL.
- This may be nil if a storageManagementPolicy was never set on the downloaded asset.
- @param downloadStorageURL
- The location of downloaded asset.
-*/
+/// Returns the storage management policy for the asset downloaded at downloadStorageURL. This may be nil if a storageManagementPolicy was never set on the downloaded asset.
+///
+/// - Parameter downloadStorageURL: The location of the downloaded asset.
- (nullable AVAssetDownloadStorageManagementPolicy *)storageManagementPolicyForURL:(NSURL *)downloadStorageURL;
@end
-/*!
- @class AVAssetDownloadStorageManagementPolicy
-
- @abstract A class to inform the system of a policy for automatic purging of downloaded AVAssets.
-
- @discussion System will put in best-effort to evict all the assets based on expirationDate before evicting based on priority.
- */
+
@class AVAssetDownloadStorageManagementPolicyInternal;
+/// A class to inform the system of a policy for automatic purging of downloaded AVAssets.
+///
+/// The system will make a best effort to evict all assets based on expirationDate before evicting based on priority.
API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos)
@interface AVAssetDownloadStorageManagementPolicy : NSObject <NSCopying, NSMutableCopying> {
@private
AVAssetDownloadStorageManagementPolicyInternal *_storageManagementPolicy;
}
-/*
- @property priority
- @abstract Indicates the eviction priority of downloaded asset.
- @discussion Assets with default priority will be purged first before assets with higher priorities.
- In case this is not set, default priority is used.
- */
+/// Indicates the eviction priority of the downloaded asset.
+///
+/// Assets with the default priority will be purged before assets with higher priorities.
+/// If this is not set, the default priority is used.
@property (nonatomic, readonly, copy) AVAssetDownloadedAssetEvictionPriority priority;
-/*
- @property expirationDate
- @abstract Returns the expiration date of asset.
- */
+/// Returns the expiration date of the asset.
@property (nonatomic, readonly, copy) NSDate *expirationDate;
@end
-/*!
- @class AVMutableAssetDownloadStorageManagementPolicy
-
- @abstract A mutable subclass of AVAssetDownloadStorageManagementPolicy.
-
- @discussion System will put in best-effort to evict all the assets based on expirationDate before evicting based on priority.
- */
+
+/// A mutable subclass of AVAssetDownloadStorageManagementPolicy.
+///
+/// The system will make a best effort to evict all assets based on expirationDate before evicting based on priority.
NS_SWIFT_NONSENDABLE
API_AVAILABLE(macos(10.15), ios(11.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos)
@interface AVMutableAssetDownloadStorageManagementPolicy : AVAssetDownloadStorageManagementPolicy
-/*
- @property priority
- @abstract Indicates the eviction priority of downloaded asset.
- @discussion Assets with default priority will be purged first before assets with higher priorities.
- In case this is not set, default priority is used.
- */
+/// Indicates the eviction priority of the downloaded asset.
+///
+/// Assets with the default priority will be purged before assets with higher priorities.
+/// If this is not set, the default priority is used.
@property (nonatomic, copy) AVAssetDownloadedAssetEvictionPriority priority;
-/*
- @property expirationDate
- @abstract Returns the expiration date of asset.
- */
+/// Returns the expiration date of the asset.
@property (nonatomic, copy) NSDate *expirationDate;
@end
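
A minimal Swift sketch of applying a storage management policy as described above. The one-week expiration and the "important" priority are assumptions; `downloadLocation` stands in for the URL previously delivered by the download delegate.

```swift
import AVFoundation

// Sketch only: `downloadLocation` is the file URL reported by
// URLSession(_:assetDownloadTask:willDownloadTo:).
func keepDownloadAroundLonger(at downloadLocation: URL) {
    let policy = AVMutableAssetDownloadStorageManagementPolicy()
    policy.priority = .important
    policy.expirationDate = Date().addingTimeInterval(7 * 24 * 60 * 60) // roughly one week

    AVAssetDownloadStorageManager.shared()
        .setStorageManagementPolicy(policy, for: downloadLocation)
}
```
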
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadTask.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadTask.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadTask.h 2025-04-19 03:37:56
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetDownloadTask.h 2025-05-31 21:50:41
@@ -13,6 +13,7 @@
#import <AVFoundation/AVAsset.h>
#import <AVFoundation/AVMediaSelection.h>
#import <AVFoundation/AVAssetVariant.h>
+#import <AVFoundation/AVMetrics.h>
#import <CoreMedia/CMTimeRange.h>
#import <AVFoundation/AVPlayerMediaSelectionCriteria.h>
@@ -23,87 +24,62 @@
// Keys for options dictionary for use with -[AVAssetDownloadURLSession assetDownloadTaskWithURLAsset:assetTitle:assetArtworkData:options:]
-/*!
- @constant AVAssetDownloadTaskMinimumRequiredMediaBitrateKey
- @abstract The lowest media bitrate greater than or equal to this value will be selected. Value should be a NSNumber in bps. If no suitable media bitrate is found, the highest media bitrate will be selected.
- The value for this key should be a NSNumber.
- @discussion By default, the highest media bitrate will be selected for download.
-*/
+/// The lowest media bitrate greater than or equal to this value will be selected. If no suitable media bitrate is found, the highest media bitrate will be selected.
+/// The value for this key should be an NSNumber in bps.
+///
+/// By default, the highest media bitrate will be selected for download.
AVF_EXPORT NSString *const AVAssetDownloadTaskMinimumRequiredMediaBitrateKey API_DEPRECATED("Use AVAssetDownloadConfiguration:variantQualifiers with assetVariantQualifierWithPredicate using desired comparison value against averageBitRate/peakBitRate instead", macos(10.15, API_TO_BE_DEPRECATED), ios(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @constant AVAssetDownloadTaskMinimumRequiredPresentationSizeKey
- @abstract The lowest media presentation size greater than or equal to this value will be selected. If no suitable media presentation size is found, the highest media presentation size will be selected.
- The value for this key should be a NSValue of CGSize.
- @discussion By default, the highest media presentation size will be selected for download.
-*/
+/// The lowest media presentation size greater than or equal to this value will be selected. If no suitable media presentation size is found, the highest media presentation size will be selected.
+/// The value for this key should be an NSValue of CGSize.
+///
+/// By default, the highest media presentation size will be selected for download.
AVF_EXPORT NSString *const AVAssetDownloadTaskMinimumRequiredPresentationSizeKey API_DEPRECATED("Use AVAssetDownloadConfiguration:variantQualifiers with predicateForPresentationWidth and predicateForPresentationHeight instead", macos(11.0, API_TO_BE_DEPRECATED), ios(14.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @constant AVAssetDownloadTaskMediaSelectionKey
- @abstract The media selection for this download.
- The value for this key should be an AVMediaSelection.
- @discussion By default, media selections for AVAssetDownloadTask will be automatically selected.
-*/
+/// The media selection for this download.
+/// The value for this key should be an AVMediaSelection.
+///
+/// By default, media selections for AVAssetDownloadTask will be automatically selected.
AVF_EXPORT NSString *const AVAssetDownloadTaskMediaSelectionKey API_DEPRECATED("Use AVAssetDownloadConfiguration:mediaSelections instead", macos(10.15, API_TO_BE_DEPRECATED), ios(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @constant AVAssetDownloadTaskMediaSelectionPrefersMultichannelKey
- @abstract Download the specified media selections with or without support for multichannel playback.
- The value for this key should be an NSNumber representing a BOOL.
- @discussion By default AVAssetDownloadTask will prefer multichannel by downloading the most capable multichannel rendition available in additon to stereo.
-*/
+/// Download the specified media selections with or without support for multichannel playback.
+/// The value for this key should be an NSNumber representing a BOOL.
+///
+/// By default, AVAssetDownloadTask will prefer multichannel by downloading the most capable multichannel rendition available in addition to stereo.
AVF_EXPORT NSString *const AVAssetDownloadTaskMediaSelectionPrefersMultichannelKey API_DEPRECATED("Use AVAssetDownloadConfiguration:variantQualifiers with predicateForChannelCount instead", macos(10.15, API_TO_BE_DEPRECATED), ios(13.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
-@constant AVAssetDownloadTaskPrefersLosslessAudioKey
-@abstract Download the specified media selections in lossless audio representation.
- The value for this key should be an NSNumber representing a BOOL.
-@discussion By default AVAssetDownloadTask will prefer lossy audio representation.
- */
+/// Download the specified media selections in lossless audio representation.
+/// The value for this key should be an NSNumber representing a BOOL.
+///
+/// By default AVAssetDownloadTask will prefer lossy audio representation.
AVF_EXPORT NSString *const AVAssetDownloadTaskPrefersLosslessAudioKey API_DEPRECATED("Use AVAssetDownloadConfiguration:variantQualifiers with assetVariantQualifierWithPredicate using [NSPredicate predicateWithFormat:@'%d in audioAttributes.formatIDs', kAudioFormatAppleLossless]", macos(11.3, API_TO_BE_DEPRECATED), ios(14.5, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
-@constant AVAssetDownloadTaskPrefersHDRKey
-@abstract Download the specified media selections with or without HDR content.
- The value for this key should be an NSNumber representing a BOOL.
-@discussion By default AVAssetDownloadTask will prefer HDR content.
- */
+/// Download the specified media selections with or without HDR content.
+/// The value for this key should be an NSNumber representing a BOOL.
+///
+/// By default AVAssetDownloadTask will prefer HDR content.
AVF_EXPORT NSString *const AVAssetDownloadTaskPrefersHDRKey API_DEPRECATED("Use AVAssetDownloadConfiguration:variantQualifiers with assetVariantQualifierWithPredicate using [NSPredicate predicateWithFormat:@'videoAttributes.videoRange == %@', AVVideoRangePQ]", macos(11.0, API_TO_BE_DEPRECATED), ios(14.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @class AVAssetDownloadTask
- @abstract A NSURLSessionTask that accepts remote AVURLAssets to download locally.
- @discussion Should be created with -[AVAssetDownloadURLSession assetDownloadTaskWithURLAsset:assetTitle:assetArtworkData:options:]. To utilize local data for playback for downloads that are in-progress, re-use the URLAsset supplied in initialization. An AVAssetDownloadTask may be instantiated with a destinationURL pointing to an existing asset on disk, for the purpose of completing or augmenting a downloaded asset.
-*/
-
+/// An NSURLSessionTask that accepts remote AVURLAssets to download locally.
+///
+/// Should be created with -[AVAssetDownloadURLSession assetDownloadTaskWithURLAsset:assetTitle:assetArtworkData:options:]. To utilize local data for playback of downloads that are in progress, re-use the URLAsset supplied at initialization. An AVAssetDownloadTask may be instantiated with a destinationURL pointing to an existing asset on disk, for the purpose of completing or augmenting a downloaded asset.
API_AVAILABLE(macos(10.15), ios(9.0), watchos(10.0), visionos(1.0)) API_UNAVAILABLE(tvos)
@interface AVAssetDownloadTask : NSURLSessionTask
-/*!
- @property URLAsset
- @abstract The asset supplied to the download task upon initialization.
-*/
+/// The asset supplied to the download task upon initialization.
@property (nonatomic, readonly) AVURLAsset *URLAsset;
-/*!
- @property destinationURL
- @abstract The file URL supplied to the download task upon initialization.
- @discussion This URL may have been appended with the appropriate extension for the asset.
-*/
+/// The file URL supplied to the download task upon initialization.
+///
+/// This URL may have been appended with the appropriate extension for the asset.
@property (nonatomic, readonly) NSURL *destinationURL API_DEPRECATED("Use the URL property of URLAsset instead", ios(9.0, 10.0)) API_UNAVAILABLE(tvos, watchos, visionos) API_UNAVAILABLE(macos);
-/*!
- @property options
- @abstract The options supplied to the download task upon initialization.
-*/
+/// The options supplied to the download task upon initialization.
@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> *options API_DEPRECATED("Use AVAssetDownloadConfiguration instead", macos(10.15, API_TO_BE_DEPRECATED), ios(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @property loadedTimeRanges
- @abstract This property provides a collection of time ranges for which the download task has media data already downloaded and playable. The ranges provided might be discontinuous.
- @discussion Returns an NSArray of NSValues containing CMTimeRanges.
-*/
+/// This property provides a collection of time ranges for which the download task has media data already downloaded and playable. The ranges provided might be discontinuous.
+///
+/// Returns an NSArray of NSValues containing CMTimeRanges.
@property (nonatomic, readonly) NSArray<NSValue *> *loadedTimeRanges API_DEPRECATED("Use NSURLSessionTask.progress instead", macos(10.15, API_TO_BE_DEPRECATED), ios(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
// NSURLRequest and NSURLResponse objects are not available for AVAssetDownloadTask
@@ -114,118 +90,86 @@
@end
-/*!
- @class AVAssetDownloadConfiguration
- @abstract Configuration parameters for the download task.
- @discussion Download configuration consists of primary and auxiliary content configurations. Primary content configuration represents the primary set of renditions essential for offline playback. Auxiliary content configurations represent additional configurations to complement the primary.
- For example, the primary content configuration may represent stereo audio renditions and auxiliary configuration may represent complementing multichannel audio renditions.
-
- It is important to configure your download configuration object appropriately before using it to create a download task. Download task makes a copy of the configuration settings you provide and use those settings to configure the task. Once configured, the task object ignores any changes you make to the NSURLSessionConfiguration object. If you need to modify your settings, you must update the download configuration object and use it to create a new download task object.
- */
+/// Configuration parameters for the download task.
+///
+/// Download configuration consists of primary and auxiliary content configurations. Primary content configuration represents the primary set of renditions essential for offline playback. Auxiliary content configurations represent additional configurations to complement the primary.
+/// For example, the primary content configuration may represent stereo audio renditions and auxiliary configuration may represent complementing multichannel audio renditions.
+///
+/// It is important to configure your download configuration object appropriately before using it to create a download task. The download task makes a copy of the configuration settings you provide and uses those settings to configure the task. Once configured, the task object ignores any changes you make to the configuration object. If you need to modify your settings, you must update the download configuration object and use it to create a new download task object.
NS_SWIFT_NONSENDABLE
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(10.0), visionos(1.0))
@interface AVAssetDownloadConfiguration : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @method downloadConfigurationWithAsset:title:
- @abstract Creates and initializes a download configuration object.
- @discussion This method will throw an exception if AVURLAsset has been invalidated.
- @param asset
- The asset to create the download configuration for.
- @param title
- A human readable title for this asset, expected to be as suitable as possible for the user's preferred languages. Will show up in the usage pane of the settings app.
- */
+/// Creates and initializes a download configuration object.
+///
+/// This method will throw an exception if the AVURLAsset has been invalidated.
+///
+/// - Parameter asset: The asset to create the download configuration for.
+/// - Parameter title: A human-readable title for this asset, expected to be as suitable as possible for the user's preferred languages. It will show up in the usage pane of the Settings app.
+ (instancetype)downloadConfigurationWithAsset:(AVURLAsset *)asset title:(NSString *)title;
-/*!
- @property assetArtworkData
- @abstract NSData representing artwork data for this asset. Optional. May be displayed, for example, by the usage pane of the Settings app. Must work with +[UIImage imageWithData:].
- */
+/// NSData representing artwork data for this asset. Optional. May be displayed, for example, by the usage pane of the Settings app. Must work with +[UIImage imageWithData:].
@property (nonatomic, copy, nullable) NSData *artworkData;
-/*!
- @property primaryContentConfiguration
- @abstract The primary content for the download.
- */
+/// The primary content for the download.
@property (nonatomic, readonly) AVAssetDownloadContentConfiguration *primaryContentConfiguration;
-/*!
- @property auxiliaryContentConfigurations
- @abstract The auxiliary content for the download. Optional.
- @discussion By default, auxiliaryContentConfigurations will have one or more default auxiliary content configurations. These content configurations can be augmented with additional content configurations or removed entirely if no auxiliary content is desired.
- */
+/// The auxiliary content for the download. Optional.
+///
+/// By default, auxiliaryContentConfigurations will have one or more default auxiliary content configurations. These content configurations can be augmented with additional content configurations or removed entirely if no auxiliary content is desired.
@property (nonatomic, copy) NSArray <AVAssetDownloadContentConfiguration *> *auxiliaryContentConfigurations;
-/*!
- @property optimizesAuxiliaryContentConfigurations
- @abstract Optimizes auxiliary content selection depending on the primary to minimize total number of video renditions downloaded. True by default.
- @discussion For example, if the primary content configuration represents stereo renditions and auxiliary content configuration represents multichannel audio renditions, auxiliary multichannel variant will be chosen so as to avoid downloading duplicate video renditions.
- */
+/// Optimizes auxiliary content selection depending on the primary to minimize the total number of video renditions downloaded. True by default.
+///
+/// For example, if the primary content configuration represents stereo renditions and the auxiliary content configuration represents multichannel audio renditions, the auxiliary multichannel variant will be chosen so as to avoid downloading duplicate video renditions.
@property (nonatomic) BOOL optimizesAuxiliaryContentConfigurations;
-/*!
- @property downloadsInterstitialAssets
- @abstract Download interstitial assets as listed in the index file. False by default.
- @discussion Ordinarily, interstitial assets are skipped when downloading content for later playback. Setting this property to true will cause interstitial assets to be downloaded as well. Playback of the downloaded content can then match the experience of online streaming playback as closely as possible.
- */
+/// Download interstitial assets as listed in the index file. False by default.
+///
+/// Ordinarily, interstitial assets are skipped when downloading content for later playback. Setting this property to true will cause interstitial assets to be downloaded as well. Playback of the downloaded content can then match the experience of online streaming playback as closely as possible.
@property (nonatomic) BOOL downloadsInterstitialAssets API_UNAVAILABLE(macos, ios, tvos, watchos, visionos);
-/*!
- @method setInterstitialMediaSelectionCriteria:forMediaCharacteristic:
- @abstract Sets media selection on interstitials for this asset
- @discussion Typically, interstitial assets have not been discovered when the main download is initiated.
- This method allows the user to specify AVMediaSelectionCriteria for all interstitials that are discovered.
- Each AVPlayerMediaSelectionCriteria in the array of criteria specfies a set of criteria for a variant to download.
- @param criteria
- The array of selection criteria to set
- @param mediaCharacteristic
- The AVMediaCharacteristic to which the criteria will be applied
-*/
+/// Sets media selection criteria on interstitials for this asset.
+///
+/// Typically, interstitial assets have not been discovered when the main download is initiated.
+/// This method allows the user to specify AVMediaSelectionCriteria for all interstitials that are discovered.
+/// Each AVPlayerMediaSelectionCriteria in the array of criteria specifies a set of criteria for a variant to download.
+///
+/// - Parameter criteria: The array of selection criteria to set.
+/// - Parameter mediaCharacteristic: The AVMediaCharacteristic to which the criteria will be applied.
- (void)setInterstitialMediaSelectionCriteria:(NSArray<AVPlayerMediaSelectionCriteria *> *) criteria forMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic API_AVAILABLE(macos(15.4), ios(18.4), tvos(18.4), watchos(11.4), visionos(2.4));
@end
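
A minimal Swift sketch of creating a download configuration and a task from it, as the discussion above describes. The session identifier, asset title, and the `downloadDelegate` object are assumptions; a delegate sketch appears further below.

```swift
import AVFoundation

// Sketch only: `hlsURL` points at an HLS asset; `downloadDelegate` conforms to
// AVAssetDownloadDelegate.
func startDownload(of hlsURL: URL,
                   delegate downloadDelegate: AVAssetDownloadDelegate) -> AVAssetDownloadTask {
    let asset = AVURLAsset(url: hlsURL)
    let configuration = AVAssetDownloadConfiguration(asset: asset, title: "Example Movie")

    let session = AVAssetDownloadURLSession(
        configuration: .background(withIdentifier: "example.asset-download"),
        assetDownloadDelegate: downloadDelegate,
        delegateQueue: .main)

    let task = session.makeAssetDownloadTask(downloadConfiguration: configuration)
    task.resume()
    return task
}
```
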
-/*!
- @class AVAssetDownloadContentConfiguration
- @abstract Represents the configuration consisting of variant and the variant's media options.
-*/
+/// Represents the configuration consisting of a variant and the variant's media options.
NS_SWIFT_NONSENDABLE
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(10.0), visionos(1.0))
@interface AVAssetDownloadContentConfiguration : NSObject <NSCopying>
-/*!
- @property variantQualifiers
- @abstract An array of variant qualifiers.
- @discussion The qualifiers are expected to be added in the preferential order and will be evaluated in that order until the qualifier matches one or more AVAssetVariants. Only those variants which can be played on the current device configuration will be initially chosen for evaluation. If there is more than one match, automatic variant selection will be used to choose among the matched.
- If a variant qualifier is constructed to explicitly choose a variant, no evaluation is performed and the variant provided will be downloaded as is, even if it is not playable on current device configuration.
- If a variant qualifier has not been provided, or if the variant qualifier when evaluated does not match any of the variants which can be played according to the current device configuration, automatic variant selection will be used.
-*/
+/// An array of variant qualifiers.
+///
+/// The qualifiers are expected to be added in the preferential order and will be evaluated in that order until the qualifier matches one or more AVAssetVariants. Only those variants which can be played on the current device configuration will be initially chosen for evaluation. If there is more than one match, automatic variant selection will be used to choose among the matched.
+/// If a variant qualifier is constructed to explicitly choose a variant, no evaluation is performed and the variant provided will be downloaded as is, even if it is not playable on the current device configuration.
+/// If a variant qualifier has not been provided, or if the variant qualifier when evaluated does not match any of the variants which can be played according to the current device configuration, automatic variant selection will be used.
@property (nonatomic, copy) NSArray<AVAssetVariantQualifier *> *variantQualifiers;
-/*!
- @property mediaSelections
- @abstract An array of media selections obtained from the AVAsset.
- @discussion If a media selection is not provided, automatic media selection associated with the asset will be used.
-*/
+/// An array of media selections obtained from the AVAsset.
+///
+/// If a media selection is not provided, automatic media selection associated with the asset will be used.
@property (nonatomic, copy) NSArray<AVMediaSelection *> *mediaSelections;
@end
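
A minimal Swift sketch of the variant qualifiers described above, using the averageBitRate predicate key suggested by the deprecation notes earlier in this header. The 2 Mbps threshold is an assumption; if nothing matches, automatic variant selection applies.

```swift
import AVFoundation

// Sketch only: `configuration` is the AVAssetDownloadConfiguration from the sketch above.
func preferHigherBitrate(in configuration: AVAssetDownloadConfiguration) {
    let atLeast2Mbps = AVAssetVariantQualifier(
        predicate: NSPredicate(format: "averageBitRate >= 2000000"))
    configuration.primaryContentConfiguration.variantQualifiers = [atLeast2Mbps]
}
```
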
-/*!
- @class AVAggregateAssetDownloadTask
- @abstract An AVAssetDownloadTask used for downloading multiple AVMediaSelections for a single AVAsset, under the umbrella of a single download task.
- @discussion Should be created with -[AVAssetDownloadURLSession aggregateAssetDownloadTaskWithURLAsset:mediaSelections:assetTitle:assetArtworkData:options:. For progress tracking, monitor the delegate callbacks for each childAssetDownloadTask.
-
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// An AVAssetDownloadTask used for downloading multiple AVMediaSelections for a single AVAsset, under the umbrella of a single download task.
+///
+/// Should be created with -[AVAssetDownloadURLSession aggregateAssetDownloadTaskWithURLAsset:mediaSelections:assetTitle:assetArtworkData:options:]. For progress tracking, monitor the delegate callbacks for each childAssetDownloadTask.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
API_DEPRECATED("Use assetDownloadTaskWithConfiguration: instead", macos(10.15, API_TO_BE_DEPRECATED), ios(11.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos)
@interface AVAggregateAssetDownloadTask : NSURLSessionTask
-/*!
- @property URLAsset
- @abstract The asset supplied to the download task upon initialization.
-*/
+/// The asset supplied to the download task upon initialization.
@property (nonatomic, readonly) AVURLAsset *URLAsset;
// NSURLRequest and NSURLResponse objects are not available for AVAggregateAssetDownloadTask
@@ -236,198 +180,136 @@
@end
-/*!
- @protocol AVAssetDownloadDelegate
- @abstract Delegate methods to implement when adopting AVAssetDownloadTask.
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
-
+/// Delegate methods to implement when adopting AVAssetDownloadTask. Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(10.15), ios(9.0), watchos(10.0), visionos(1.0)) API_UNAVAILABLE(tvos)
@protocol AVAssetDownloadDelegate <NSURLSessionTaskDelegate>
@optional
-/*!
- @method URLSession:assetDownloadTask:didFinishDownloadingToURL:
- @abstract Sent when a download task that has completed a download.
- @discussion Unlike NSURLSessionDownloadDelegate, the delegate should NOT move the file from this directory after it has been called. Downloaded assets must remain at the system provided URL. URLSession:task:didCompleteWithError: will still be called.
- @param session
- The session the asset download task is on.
- @param assetDownloadTask
- The AVAssetDownloadTask whose downloaded completed.
- @param location
- The location the asset has been downloaded to.
-*/
+
+/// Sent when a download task has completed a download.
+///
+/// Unlike NSURLSessionDownloadDelegate, the delegate should NOT move the file from this directory after it has been called. Downloaded assets must remain at the system-provided URL. URLSession:task:didCompleteWithError: will still be called.
+///
+/// - Parameter session: The session the asset download task is on.
+/// - Parameter assetDownloadTask: The AVAssetDownloadTask whose download completed.
+/// - Parameter location: The location the asset has been downloaded to.
- (void)URLSession:(NSURLSession *)session assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask didFinishDownloadingToURL:(NSURL *)location API_DEPRECATED("Use URLSession:assetDownloadTask:willDownloadToURL: instead", macos(10.15, API_TO_BE_DEPRECATED), ios(10.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED), watchos(10.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos);
-/*!
- @method URLSession:assetDownloadTask:didLoadTimeRange:totalTimeRangesLoaded:timeRangeExpectedToLoad:
- @abstract Method to adopt to subscribe to progress updates of an AVAssetDownloadTask.
- @param session
- The session the asset download task is on.
- @param assetDownloadTask
- The AVAssetDownloadTask which is being updated.
- @param timeRange
- A CMTimeRange indicating the time range loaded since the last time this method was called.
- @param loadedTimeRanges
- A NSArray of NSValues of CMTimeRanges indicating all the time ranges loaded by this asset download task.
- @param timeRangeExpectedToLoad
- A CMTimeRange indicating the single time range that is expected to be loaded when the download is complete.
-*/
+/// Method to adopt to subscribe to progress updates of an AVAssetDownloadTask.
+///
+/// - Parameter session: The session the asset download task is on.
+/// - Parameter assetDownloadTask: The AVAssetDownloadTask which is being updated.
+/// - Parameter timeRange: A CMTimeRange indicating the time range loaded since the last time this method was called.
+/// - Parameter loadedTimeRanges: An NSArray of NSValues of CMTimeRanges indicating all the time ranges loaded by this asset download task.
+/// - Parameter timeRangeExpectedToLoad: A CMTimeRange indicating the single time range that is expected to be loaded when the download is complete.
- (void)URLSession:(NSURLSession *)session assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask didLoadTimeRange:(CMTimeRange)timeRange totalTimeRangesLoaded:(NSArray<NSValue *> *)loadedTimeRanges timeRangeExpectedToLoad:(CMTimeRange)timeRangeExpectedToLoad API_DEPRECATED("Use NSURLSessionTask.progress instead", macos(10.15, API_TO_BE_DEPRECATED), ios(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*
- @method URLSession:assetDownloadTask:didResolveMediaSelection:
- @abstract Method called when the media selection for the download is fully resolved, including any automatic selections.
- @param session
- The session the asset download task is on.
- @param assetDownloadTask
- The AVAssetDownloadTask which is being updated.
- @param resolvedMediaSelection
- The resolved media selection for the download task. For the best chance of playing back downloaded content without further network I/O, apply this selection to subsequent AVPlayerItems.
-*/
+
+/// Method called when the media selection for the download is fully resolved, including any automatic selections.
+///
+/// - Parameter session: The session the asset download task is on.
+/// - Parameter assetDownloadTask: The AVAssetDownloadTask which is being updated.
+/// - Parameter resolvedMediaSelection: The resolved media selection for the download task. For the best chance of playing back downloaded content without further network I/O, apply this selection to subsequent AVPlayerItems.
- (void)URLSession:(NSURLSession *)session assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask didResolveMediaSelection:(AVMediaSelection *)resolvedMediaSelection API_AVAILABLE(macos(10.15), ios(9.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos);
-/*
- @method URLSession:assetDownloadTask:willDownloadToURL:
- @abstract Method called when the asset download task determines the location this asset will be downloaded to.
- @discussion This URL should be saved for future instantiations of AVAsset. While an AVAsset already exists for this content, it is advisable to re-use that instance.
- @param session
- The session the asset download task is on.
- @param assetDownloadTask
- The AVAssetDownloadTask.
- @param location
- The file URL this task will download media data to.
-*/
+/// Method called when the asset download task determines the location this asset will be downloaded to.
+///
+/// This URL should be saved for future instantiations of AVAsset. While an AVAsset already exists for this content, it is advisable to re-use that instance.
+///
+/// - Parameter session: The session the asset download task is on.
+/// - Parameter assetDownloadTask: The AVAssetDownloadTask.
+/// - Parameter location: The file URL this task will download media data to.
- (void)URLSession:(NSURLSession *)session assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask willDownloadToURL:(NSURL *)location API_AVAILABLE(macos(14.0), ios(18.0), watchos(10.0), visionos(1.0)) API_UNAVAILABLE(tvos);
-/*
- @method URLSession:aggregateAssetDownloadTask:willDownloadToURL:
- @abstract Method called when the aggregate download task determines the location this asset will be downloaded to.
- @discussion This URL should be saved for future instantiations of AVAsset. While an AVAsset already exists for this content, it is advisable to re-use that instance.
- @param session
- The session the aggregate asset download task is on.
- @param aggregateAssetDownloadTask
- The AVAggregateAssetDownloadTask.
- @param location
- The file URL this task will download media data to.
-*/
+/// Method called when the aggregate download task determines the location this asset will be downloaded to.
+///
+/// This URL should be saved for future instantiations of AVAsset. While an AVAsset already exists for this content, it is advisable to re-use that instance.
+///
+/// - Parameter session: The session the aggregate asset download task is on.
+/// - Parameter aggregateAssetDownloadTask: The AVAggregateAssetDownloadTask.
+/// - Parameter location: The file URL this task will download media data to.
- (void)URLSession:(NSURLSession *)session aggregateAssetDownloadTask:(AVAggregateAssetDownloadTask *)aggregateAssetDownloadTask willDownloadToURL:(NSURL *)location API_DEPRECATED("Use URLSession:assetDownloadTask:willDownloadToURL: instead", macos(10.15, API_TO_BE_DEPRECATED), ios(11.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*
- @method URLSession:aggregateAssetDownloadTask:didCompleteForMediaSelection:
- @abstract Method called when a child AVAssetDownloadTask completes.
- @param session
- The session the aggregate asset download task is on.
- @param aggregateAssetDownloadTask
- The AVAggregateAssetDownloadTask.
- @param mediaSelection
- The AVMediaSelection which is now fully available for offline use.
-*/
+/// Method called when a child AVAssetDownloadTask completes.
+///
+/// - Parameter session: The session the aggregate asset download task is on.
+/// - Parameter aggregateAssetDownloadTask: The AVAggregateAssetDownloadTask.
+/// - Parameter mediaSelection: The AVMediaSelection which is now fully available for offline use.
- (void)URLSession:(NSURLSession *)session aggregateAssetDownloadTask:(AVAggregateAssetDownloadTask *)aggregateAssetDownloadTask didCompleteForMediaSelection:(AVMediaSelection *)mediaSelection API_DEPRECATED("Use the NSURLSessionDownloadDelegate method instead, URLSession:task:didCompleteWithError:" ,macos(10.15, API_TO_BE_DEPRECATED), ios(11.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*
- @method URLSession:aggregateAssetDownloadTask:didLoadTimeRange:totalTimeRangesLoaded:timeRangeExpectedToLoad:forMediaSelection:
- @abstract Method to adopt to subscribe to progress updates of an AVAggregateAssetDownloadTask
- @param session
- The session the asset download task is on.
- @param aggregateAssetDownloadTask
- The AVAggregateAssetDownloadTask.
- @param timeRange
- A CMTimeRange indicating the time range loaded for the media selection being downloaded.
- @param loadedTimeRanges
- A NSArray of NSValues of CMTimeRanges indicating all the time ranges loaded for the media selection being downloaded.
- @param timeRangeExpectedToLoad
- A CMTimeRange indicating the single time range that is expected to be loaded when the download is complete for the media selection being downloaded.
- @param mediaSelection
- The media selection which has additional media data loaded for offline use.
-*/
+/// Method to adopt to subscribe to progress updates of an AVAggregateAssetDownloadTask
+///
+/// - Parameter session: The session the asset download task is on.
+/// - Parameter aggregateAssetDownloadTask: The AVAggregateAssetDownloadTask.
+/// - Parameter timeRange: A CMTimeRange indicating the time range loaded for the media selection being downloaded.
+/// - Parameter loadedTimeRanges: A NSArray of NSValues of CMTimeRanges indicating all the time ranges loaded for the media selection being downloaded.
+/// - Parameter timeRangeExpectedToLoad: A CMTimeRange indicating the single time range that is expected to be loaded when the download is complete for the media selection being downloaded.
+/// - Parameter mediaSelection: The media selection which has additional media data loaded for offline use.
- (void)URLSession:(NSURLSession *)session aggregateAssetDownloadTask:(AVAggregateAssetDownloadTask *)aggregateAssetDownloadTask didLoadTimeRange:(CMTimeRange)timeRange totalTimeRangesLoaded:(NSArray<NSValue *> *)loadedTimeRanges timeRangeExpectedToLoad:(CMTimeRange)timeRangeExpectedToLoad forMediaSelection:(AVMediaSelection *)mediaSelection API_DEPRECATED("Use NSURLSessionTask.progress: instead", macos(10.15, API_TO_BE_DEPRECATED), ios(11.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
@optional
-/*!
- @method URLSession:assetDownloadTask:willDownloadVariants:
- @abstract Sent when a download task has completed the variant selection.
- @param session
- The session the asset download task is on.
- @param assetDownloadTask
- The asset download task.
- @param variants
- The variants chosen. Depends on the environmental condition when the download starts.
-*/
+
+/// Sent when a download task has completed the variant selection.
+///
+/// - Parameter session: The session the asset download task is on.
+/// - Parameter assetDownloadTask: The asset download task.
+/// - Parameter variants: The variants chosen. Depends on the environmental condition when the download starts.
- (void)URLSession:(NSURLSession *)session assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask willDownloadVariants:(NSArray <AVAssetVariant *> *)variants API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(10.0), visionos(1.0));
+/// Sent when a download task receives an AVMetricEvent.
+///
+/// - Parameter session: The NSURLSession corresponding to this AVAssetDownloadTask.
+/// - Parameter assetDownloadTask: The asset download task.
+/// - Parameter metricEvent: The metric event received.
+- (void)URLSession:(NSURLSession *)session assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask didReceiveMetricEvent:(AVMetricEvent *)metricEvent API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), watchos(28.0), visionos(26.0));
+
@end
-/*!
- @class AVAssetDownloadURLSession
- @abstract A subclass of NSURLSession to support AVAssetDownloadTask.
-*/
+/// A subclass of NSURLSession to support AVAssetDownloadTask.
API_AVAILABLE(macos(10.15), ios(9.0), watchos(10.0), visionos(1.0)) API_UNAVAILABLE(tvos)
@interface AVAssetDownloadURLSession : NSURLSession
-/*!
- @method sessionWithConfiguration:assetDownloadDelegate:delegateQueue:
- @abstract Creates and initializes an AVAssetDownloadURLSession for use with AVAssetDownloadTasks.
- @param configuration
- The configuration for this URLSession. Must be a background configuration.
- @param delegate
- The delegate object to handle asset download progress updates and other session related events.
- @param delegateQueue
- The queue to receive delegate callbacks on. If nil, a serial queue will be provided.
-*/
+/// Creates and initializes an AVAssetDownloadURLSession for use with AVAssetDownloadTasks.
+///
+/// - Parameter configuration: The configuration for this URLSession. Must be a background configuration.
+/// - Parameter delegate: The delegate object to handle asset download progress updates and other session related events.
+/// - Parameter delegateQueue: The queue to receive delegate callbacks on. If nil, a serial queue will be provided.
+ (AVAssetDownloadURLSession *)sessionWithConfiguration:(NSURLSessionConfiguration *)configuration assetDownloadDelegate:(nullable id <AVAssetDownloadDelegate>)delegate delegateQueue:(nullable NSOperationQueue *)delegateQueue;
-/*!
- @method assetDownloadTaskWithURLAsset:destinationURL:options:
- @abstract Creates and initializes an AVAssetDownloadTask to be used with this AVAssetDownloadURLSession.
- @discussion This method may return nil if the URLSession has been invalidated.
- @param URLAsset
- The AVURLAsset to download locally.
- @param destinationURL
- The local URL to download the asset to. This must be a file URL.
- @param options
- See AVAssetDownloadTask*Key above. Configures non-default behavior for the download task. Using this parameter is required for downloading non-default media selections for HLS assets.
-*/
+/// Creates and initializes an AVAssetDownloadTask to be used with this AVAssetDownloadURLSession.
+///
+/// This method may return nil if the URLSession has been invalidated.
+///
+/// - Parameter URLAsset: The AVURLAsset to download locally.
+/// - Parameter destinationURL: The local URL to download the asset to. This must be a file URL.
+/// - Parameter options: See AVAssetDownloadTask*Key above. Configures non-default behavior for the download task. Using this parameter is required for downloading non-default media selections for HLS assets.
- (nullable AVAssetDownloadTask *)assetDownloadTaskWithURLAsset:(AVURLAsset *)URLAsset destinationURL:(NSURL *)destinationURL options:(nullable NSDictionary<NSString *, id> *)options API_DEPRECATED("Use assetDownloadTaskWithURLAsset:assetTitle:assetArtworkData:options: instead", ios(9.0, 10.0)) API_UNAVAILABLE(tvos, watchos, visionos) API_UNAVAILABLE(macos);
-/*!
- @method assetDownloadTaskWithURLAsset:assetTitle:assetArtworkData:options:
- @abstract Creates and initializes an AVAssetDownloadTask to be used with this AVAssetDownloadURLSession.
- @discussion This method may return nil if the URLSession has been invalidated.
- @param URLAsset
- The AVURLAsset to download locally.
- @param title
- A human readable title for this asset, expected to be as suitable as possible for the user's preferred languages. Will show up in the usage pane of the settings app.
- @param artworkData
- NSData representing artwork data for this asset. Optional. Will show up in the usage pane of the settings app. Must work with +[UIImage imageWithData:].
- @param options
- See AVAssetDownloadTask*Key above. Configures non-default behavior for the download task. Using this parameter is required for downloading non-default media selections for HLS assets.
-*/
+/// Creates and initializes an AVAssetDownloadTask to be used with this AVAssetDownloadURLSession.
+///
+/// This method may return nil if the URLSession has been invalidated.
+///
+/// - Parameter URLAsset: The AVURLAsset to download locally.
+/// - Parameter title: A human readable title for this asset, expected to be as suitable as possible for the user's preferred languages. Will show up in the usage pane of the settings app.
+/// - Parameter artworkData: NSData representing artwork data for this asset. Optional. Will show up in the usage pane of the settings app. Must work with +[UIImage imageWithData:].
+/// - Parameter options: See AVAssetDownloadTask*Key above. Configures non-default behavior for the download task. Using this parameter is required for downloading non-default media selections for HLS assets.
- (nullable AVAssetDownloadTask *)assetDownloadTaskWithURLAsset:(AVURLAsset *)URLAsset assetTitle:(NSString *)title assetArtworkData:(nullable NSData *)artworkData options:(nullable NSDictionary<NSString *, id> *)options API_DEPRECATED("Use assetDownloadTaskWithConfiguration: instead", macos(10.15, API_TO_BE_DEPRECATED), ios(10.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @method aggregateAssetDownloadTaskWithURLAsset:mediaSelections:assetTitle:assetArtworkData:options:
- @abstract Creates and initializes an AVAggregateAssetDownloadTask to download multiple AVMediaSelections on an AVURLAsset.
- @discussion This method may return nil if the URLSession has been invalidated. The value of AVAssetDownloadTaskMediaSelectionKey will be ignored.
- @param URLAsset
- The AVURLAsset to download locally.
- @param mediaSelections
- A list of AVMediaSelections. Each AVMediaSelection will correspond to a childAssetDownloadTask. Use -[AVAsset allMediaSelections] to download all AVMediaSelections on this AVAsset.
- @param title
- A human readable title for this asset, expected to be as suitable as possible for the user's preferred languages. Will show up in the usage pane of the settings app.
- @param artworkData
- Artwork data for this asset. Optional. Will show up in the usage pane of the settings app.
- @param options
- See AVAssetDownloadTask*Key above. Configures non-default behavior for the download task.
-*/
+/// Creates and initializes an AVAggregateAssetDownloadTask to download multiple AVMediaSelections on an AVURLAsset.
+///
+/// This method may return nil if the URLSession has been invalidated. The value of AVAssetDownloadTaskMediaSelectionKey will be ignored.
+///
+/// - Parameter URLAsset: The AVURLAsset to download locally.
+/// - Parameter mediaSelections: A list of AVMediaSelections. Each AVMediaSelection will correspond to a childAssetDownloadTask. Use -[AVAsset allMediaSelections] to download all AVMediaSelections on this AVAsset.
+/// - Parameter title: A human readable title for this asset, expected to be as suitable as possible for the user's preferred languages. Will show up in the usage pane of the settings app.
+/// - Parameter artworkData: Artwork data for this asset. Optional. Will show up in the usage pane of the settings app.
+/// - Parameter options: See AVAssetDownloadTask*Key above. Configures non-default behavior for the download task.
- (nullable AVAggregateAssetDownloadTask *)aggregateAssetDownloadTaskWithURLAsset:(AVURLAsset *)URLAsset mediaSelections:(NSArray <AVMediaSelection *> *)mediaSelections assetTitle:(NSString *)title assetArtworkData:(nullable NSData *)artworkData options:(nullable NSDictionary<NSString *, id> *)options API_DEPRECATED("Use assetDownloadTaskWithConfiguration: instead", macos(10.15, API_TO_BE_DEPRECATED), ios(11.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED)) API_UNAVAILABLE(tvos, watchos);
-/*!
- @method assetDownloadTaskWithConfiguration:
- @abstract Creates and initializes an AVAssetDownloadTask to be used with this AVAssetDownloadURLSession.
- @discussion This method will throw an exception if the URLSession has been invalidated.
- @param downloadConfiguration
- The configuration to be used to create the download task.
-*/
+/// Creates and initializes an AVAssetDownloadTask to be used with this AVAssetDownloadURLSession.
+///
+/// This method will throw an exception if the URLSession has been invalidated.
+///
+/// - Parameter downloadConfiguration: The configuration to be used to create the download task.
- (AVAssetDownloadTask *)assetDownloadTaskWithConfiguration:(AVAssetDownloadConfiguration *)downloadConfiguration API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
// only AVAssetDownloadTasks can be created with AVAssetDownloadURLSession
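
A minimal usage sketch for the download API shown above (plain AVFoundation, not the binding): it creates a background `AVAssetDownloadURLSession` and starts a task via the non-deprecated `assetDownloadTaskWithConfiguration:`. The session identifier, URL, and title are illustrative placeholders, and `downloadConfigurationWithAsset:title:` is assumed from the existing `AVAssetDownloadConfiguration` API.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: start an HLS download using assetDownloadTaskWithConfiguration:.
// The delegate later receives -URLSession:assetDownloadTask:willDownloadToURL:
// with the final file URL, which should be persisted for later playback.
static AVAssetDownloadTask *StartExampleDownload(id<AVAssetDownloadDelegate> delegate) {
    NSURLSessionConfiguration *config =
        [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"com.example.asset-downloads"];

    AVAssetDownloadURLSession *session =
        [AVAssetDownloadURLSession sessionWithConfiguration:config
                                      assetDownloadDelegate:delegate
                                              delegateQueue:[NSOperationQueue mainQueue]];

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:@"https://example.com/stream.m3u8"] options:nil];
    AVAssetDownloadConfiguration *downloadConfiguration =
        [AVAssetDownloadConfiguration downloadConfigurationWithAsset:asset title:@"Example stream"];

    AVAssetDownloadTask *task = [session assetDownloadTaskWithConfiguration:downloadConfiguration];
    [task resume];
    return task;
}
```
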
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetExportSession.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetExportSession.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetExportSession.h 2025-04-19 04:45:47
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetExportSession.h 2025-05-31 00:03:08
@@ -4,7 +4,7 @@
Framework: AVFoundation
- Copyright 2010-2023 Apple Inc. All rights reserved.
+ Copyright 2010-2025 Apple Inc. All rights reserved.
*/
@@ -53,10 +53,13 @@
AVF_EXPORT NSString *const AVAssetExportPresetHEVC1920x1080WithAlpha API_AVAILABLE(macos(10.15), ios(13.0), tvos(13.0), visionos(1.0)) API_UNAVAILABLE(watchos);
AVF_EXPORT NSString *const AVAssetExportPresetHEVC3840x2160 API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0), visionos(1.0)) API_UNAVAILABLE(watchos);
AVF_EXPORT NSString *const AVAssetExportPresetHEVC3840x2160WithAlpha API_AVAILABLE(macos(10.15), ios(13.0), tvos(13.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-AVF_EXPORT NSString *const AVAssetExportPresetHEVC7680x4320 API_AVAILABLE(macos(12.1)) API_UNAVAILABLE(ios, tvos, watchos, visionos);
+AVF_EXPORT NSString *const AVAssetExportPresetHEVC4320x2160 API_AVAILABLE(macos(26.0), ios(26.0), visionos(26.0)) API_UNAVAILABLE(tvos, watchos);
+AVF_EXPORT NSString *const AVAssetExportPresetHEVC7680x4320 API_AVAILABLE(macos(12.1), ios(26.0), visionos(26.0)) API_UNAVAILABLE(tvos, watchos);
AVF_EXPORT NSString *const AVAssetExportPresetMVHEVC960x960 API_AVAILABLE(macos(14.0), ios(17.0), visionos(1.0)) API_UNAVAILABLE(tvos) API_UNAVAILABLE(watchos);
AVF_EXPORT NSString *const AVAssetExportPresetMVHEVC1440x1440 API_AVAILABLE(macos(14.0), ios(17.0), visionos(1.0)) API_UNAVAILABLE(tvos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVAssetExportPresetMVHEVC4320x4320 API_AVAILABLE(macos(26.0), ios(26.0), visionos(26.0)) API_UNAVAILABLE(tvos, watchos);
+AVF_EXPORT NSString *const AVAssetExportPresetMVHEVC7680x7680 API_AVAILABLE(macos(26.0), ios(26.0), visionos(26.0)) API_UNAVAILABLE(tvos, watchos);
/* This export option will produce an audio-only .m4a file with appropriate iTunes gapless playback data */
AVF_EXPORT NSString *const AVAssetExportPresetAppleM4A API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
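
A hedged sketch of how one of the presets declared above would be used. Availability of the larger HEVC/MV-HEVC presets depends on OS version and hardware, so compatibility is checked before exporting; the output file type and paths are illustrative choices.

```objc
// Sketch: export using AVAssetExportPresetHEVC7680x4320 after a compatibility check.
static void ExportAs8K(AVAsset *asset, NSURL *outputURL) {
    NSString *preset = AVAssetExportPresetHEVC7680x4320;
    [AVAssetExportSession determineCompatibilityOfExportPreset:preset
                                                     withAsset:asset
                                                outputFileType:AVFileTypeQuickTimeMovie
                                             completionHandler:^(BOOL compatible) {
        if (!compatible)
            return;
        AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset presetName:preset];
        exportSession.outputURL = outputURL;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            // Inspect exportSession.status and exportSession.error here.
        }];
    }];
}
```
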
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetImageGenerator.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetImageGenerator.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetImageGenerator.h 2025-04-19 03:33:10
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetImageGenerator.h 2025-05-24 01:45:47
@@ -123,6 +123,7 @@
- "renderSize" width or height is less than zero
- "frameDuration" is invalid or less than or equal to zero
- "sourceTrackIDForFrameTiming" is less than zero
+ - "outputBufferDescription" is non-nil
*/
@property (nonatomic, copy, nullable) AVVideoComposition *videoComposition;
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetPlaybackAssistant.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetPlaybackAssistant.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetPlaybackAssistant.h 2025-04-19 06:23:35
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetPlaybackAssistant.h 2025-05-31 21:11:59
@@ -4,7 +4,7 @@
Framework: AVFoundation
- Copyright 2021 Apple Inc. All rights reserved.
+ Copyright 2021-2025 Apple Inc. All rights reserved.
*/
@@ -17,54 +17,52 @@
typedef NSString *AVAssetPlaybackConfigurationOption NS_STRING_ENUM API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @constant AVAssetPlaybackConfigurationOptionStereoVideo
- @abstract Indicates whether or not the asset can be rendered as stereo video.
- @discussion Clients may use this property to determine whether to configure stereo video rendering.
-*/
+/// Indicates whether or not the asset can be rendered as stereo video.
+///
+/// Clients may use this property to determine whether to configure stereo video rendering.
AVF_EXPORT AVAssetPlaybackConfigurationOption const AVAssetPlaybackConfigurationOptionStereoVideo API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @constant AVAssetPlaybackConfigurationOptionStereoMultiviewVideo
- @abstract Indicates whether or not the asset can rendered as stereo video and is also in a multiview compression format.
- @discussion Clients may use this property to determine whether to configure stereo video rendering.
-*/
+/// Indicates whether or not the asset can be rendered as stereo video and is also in a multiview compression format.
+///
+/// Clients may use this property to determine whether to configure stereo video rendering.
AVF_EXPORT AVAssetPlaybackConfigurationOption const AVAssetPlaybackConfigurationOptionStereoMultiviewVideo API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @constant AVAssetPlaybackConfigurationOptionSpatialVideo
- @abstract Indicates whether or not the asset can be rendered as spatial video.
- @discussion Clients may use this property to determine whether to configure spatial video rendering.
- */
+/// Indicates whether or not the asset can be rendered as spatial video.
+///
+/// Clients may use this property to determine whether to configure spatial video rendering.
AVF_EXPORT AVAssetPlaybackConfigurationOption const AVAssetPlaybackConfigurationOptionSpatialVideo API_AVAILABLE(macos(15.0), ios(18.0), tvos(18.0), watchos(11.0), visionos(2.0));
-/*!
- @class AVAssetPlaybackAssistant
- @abstract AVAssetPlaybackAssistant provides playback information for an asset.
- @discussion Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// Indicates whether the asset calls for the use of a non-rectilinear projection for rendering video.
+///
+/// Clients may use this property to determine whether to configure a non-rectilinear projection when displaying video.
+AVF_EXPORT AVAssetPlaybackConfigurationOption const AVAssetPlaybackConfigurationOptionNonRectilinearProjection API_AVAILABLE(macos(26.0), ios(26.0), visionos(26.0)) API_UNAVAILABLE(tvos, watchos);
+
+/// Indicates whether the asset is Apple Immersive Video.
+///
+/// Clients may use this property to switch into specific display and control modes for Apple Immersive Video playback.
+AVF_EXPORT AVAssetPlaybackConfigurationOption const AVAssetPlaybackConfigurationOptionAppleImmersiveVideo API_AVAILABLE(macos(26.0), visionos(26.0)) API_UNAVAILABLE(ios, tvos, watchos);
+
+/// AVAssetPlaybackAssistant provides playback information for an asset.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0))
@interface AVAssetPlaybackAssistant : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @method assetPlaybackAssistantWithAsset:
- @abstract Returns an instance of AVAssetPlaybackAssistant for inspection of an AVAsset object.
- @param asset
- An instance of AVAsset.
- @result An instance of AVAssetPlaybackAssistant.
-*/
+/// Returns an instance of AVAssetPlaybackAssistant for inspection of an AVAsset object.
+///
+/// - Parameter asset: An instance of AVAsset.
+///
+/// - Returns: An instance of AVAssetPlaybackAssistant.
+ (instancetype)assetPlaybackAssistantWithAsset:(AVAsset *)asset;
-/*!
- @method loadPlaybackConfigurationOptionsWithCompletionHandler:
- @abstract Calls the completionHandler with information about the asset.
- @param completionHandler
- Called with an array of AVAssetPlaybackConfigurationOption values describing capabilities of the asset.
- @discussion completionHandler is called when all of the AVAssetPlaybackConfigurationOption values have been loaded. If AVAssetPlaybackAssistant encounters failures when inspecting the contents of the asset, it will return no AVAssetPlaybackConfigurationOptions associated with those contents.
-*/
+/// Calls the completionHandler with information about the asset.
+///
+/// completionHandler is called when all of the AVAssetPlaybackConfigurationOption values have been loaded. If AVAssetPlaybackAssistant encounters failures when inspecting the contents of the asset, it will return no AVAssetPlaybackConfigurationOptions associated with those contents.
+///
+/// - Parameter completionHandler: Called with an array of AVAssetPlaybackConfigurationOption values describing capabilities of the asset.
- (void)loadPlaybackConfigurationOptionsWithCompletionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVAssetPlaybackConfigurationOption> *playbackConfigurationOptions))completionHandler NS_SWIFT_ASYNC_NAME(getter:playbackConfigurationOptions());
@end
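
A minimal sketch using only the declarations above: query the playback configuration options asynchronously and check for the newly added non-rectilinear projection option alongside the existing spatial video option.

```objc
// Sketch: inspect AVAssetPlaybackConfigurationOption values before configuring rendering.
static void InspectPlaybackOptions(AVAsset *asset) {
    AVAssetPlaybackAssistant *assistant = [AVAssetPlaybackAssistant assetPlaybackAssistantWithAsset:asset];
    [assistant loadPlaybackConfigurationOptionsWithCompletionHandler:^(NSArray<AVAssetPlaybackConfigurationOption> *options) {
        BOOL spatial = [options containsObject:AVAssetPlaybackConfigurationOptionSpatialVideo];
        BOOL nonRectilinear = [options containsObject:AVAssetPlaybackConfigurationOptionNonRectilinearProjection];
        NSLog(@"spatial=%d nonRectilinear=%d", spatial, nonRectilinear);
    }];
}
```
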
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReader.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReader.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReader.h 2025-04-19 04:45:47
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReader.h 2025-05-31 21:25:21
@@ -177,7 +177,14 @@
This method throws an exception if the output has already been added to an AVAssetReader or if reading has started (`status` has progressed beyond AVAssetReaderStatusUnknown).
*/
-- (void)addOutput:(AVAssetReaderOutput *)output;
+- (void)addOutput:(AVAssetReaderOutput *)output
+#if __swift__
+API_DEPRECATED("Use the appropriate AVAssetReader.outputProvider(for:...) overload for your output and optional adaptor instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
/*!
@method startReading
@@ -194,7 +201,14 @@
This method throws an exception if reading has already started (`status` has progressed beyond AVAssetReaderStatusUnknown).
*/
-- (BOOL)startReading;
+- (BOOL)startReading
+#if __swift__
+API_DEPRECATED("Use start() instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
/*!
@method cancelReading
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReaderOutput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReaderOutput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReaderOutput.h 2025-04-19 04:45:47
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetReaderOutput.h 2025-05-31 00:24:55
@@ -67,7 +67,14 @@
This property throws an exception if a value is set after reading has started (the asset reader has progressed beyond AVAssetReaderStatusUnknown).
*/
-@property (nonatomic) BOOL alwaysCopiesSampleData API_AVAILABLE(macos(10.8), ios(5.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
+@property (nonatomic) BOOL alwaysCopiesSampleData
+#if __swift__
+API_DEPRECATED("It is not necessary to copy the sample data in order to make it safe to use the vended buffer", macos(10.8, API_TO_BE_DEPRECATED), ios(5.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.8), ios(5.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
/*!
@method copyNextSampleBuffer
@@ -84,7 +91,12 @@
This method throws an exception if this output is not added to an instance of AVAssetReader (using -addOutput:) and -startReading is not called on that asset reader.
*/
-- (nullable CMSampleBufferRef)copyNextSampleBuffer CF_RETURNS_RETAINED;
+- (nullable CMSampleBufferRef)copyNextSampleBuffer CF_RETURNS_RETAINED
+#if __swift__
+API_DEPRECATED("Use AVAssetReaderOutput.Provider.next() instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#endif
+;
@end
@@ -102,9 +114,16 @@
The default value is NO, which means that the asset reader output may not be reconfigured once reading has begun. When the value of this property is NO, AVAssetReader may be able to read media data more efficiently, particularly when multiple asset reader outputs are attached.
- This property throws an exception if a value is set after reading has started (the asset reader has progressed beyond AVAssetReaderStatusUnknown).
+ This property throws an exception if a value is set after reading has started (the asset reader has progressed beyond AVAssetReaderStatusUnknown) or after an AVAssetReaderOutput.Provider is attached.
*/
-@property (nonatomic) BOOL supportsRandomAccess API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
+@property (nonatomic) BOOL supportsRandomAccess
+#if __swift__
+API_DEPRECATED("Use AVAssetReader.outputProviderWithRandomAccess(for:) instead", macos(10.10, API_TO_BE_DEPRECATED), ios(8.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
/*!
@method resetForReadingTimeRanges:
@@ -135,7 +154,14 @@
- cannot be called without setting "supportsRandomAccess" to YES
- cannot be called after calling -markConfigurationAsFinal
*/
-- (void)resetForReadingTimeRanges:(NSArray<NSValue *> *)timeRanges API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
+- (void)resetForReadingTimeRanges:(NSArray<NSValue *> *)timeRanges
+#if __swift__
+API_DEPRECATED("Use RandomAccessController.resetForReading(timeRanges:) instead", macos(10.10, API_TO_BE_DEPRECATED), ios(8.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
/*!
@method markConfigurationAsFinal
@@ -149,7 +175,14 @@
Once this method has been called, further invocations of -resetForReadingTimeRanges: are disallowed.
*/
-- (void)markConfigurationAsFinal API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
+- (void)markConfigurationAsFinal
+#if __swift__
+API_DEPRECATED("Use RandomAccessController.markConfigurationAsFinal() instead", macos(10.10, API_TO_BE_DEPRECATED), ios(8.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
@end
@@ -530,7 +563,12 @@
Defines an interface for reading metadata, packaged as instances of AVTimedMetadataGroup, from a single AVAssetReaderTrackOutput object.
*/
+#if __swift__
+API_DEPRECATED("Use AVAssetReader.outputMetadataProvider(for:) instead", macos(10.10, API_TO_BE_DEPRECATED), ios(8.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
@interface AVAssetReaderOutputMetadataAdaptor : NSObject
{
@private
@@ -608,7 +646,12 @@
An adaptor class for reading instances of AVCaptionGroup from a track containing timed text (i.e. subtitles or closed captions).
*/
+#if __swift__
+API_DEPRECATED("Use AVAssetReader.outputCaptionProvider(for:validationDelegate:) instead", macos(12.0, API_TO_BE_DEPRECATED), ios(18.0, API_TO_BE_DEPRECATED), macCatalyst(15.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(tvos, watchos, visionos)
+#else
API_AVAILABLE(macos(12.0), ios(18.0), macCatalyst(15.0)) API_UNAVAILABLE(tvos, watchos, visionos)
+#endif
@interface AVAssetReaderOutputCaptionAdaptor : NSObject
{
@private
@@ -674,7 +717,7 @@
The returned array contains the set of captions in the given group whose time ranges have the same start time as the group. This method is provided as a convenience for clients who want to process captions one-by-one and do not need a complete view of the set of captions active at a given time.
*/
- (NSArray<AVCaption *> *)captionsNotPresentInPreviousGroupsInCaptionGroup:(AVCaptionGroup *)captionGroup;
-
+
@end
/*!
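
The Swift-only deprecations in the two reader headers above point clients at a new provider-based Swift API; the Objective-C path is unchanged. A minimal sketch of that unchanged Objective-C read loop, assuming the asset and one of its video tracks have already been loaded:

```objc
// Sketch: add an output, start reading, and pull sample buffers in a loop.
static void ReadFirstVideoTrack(AVAsset *asset, AVAssetTrack *videoTrack) {
    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    if (reader == nil)
        return;

    // nil outputSettings vends samples in their stored (compressed) format.
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
    [reader addOutput:output];

    if (![reader startReading])
        return;

    CMSampleBufferRef sampleBuffer = NULL;
    while ((sampleBuffer = [output copyNextSampleBuffer]) != NULL) {
        // Process the sample buffer here.
        CFRelease(sampleBuffer);
    }
    // reader.status / reader.error indicate whether reading finished or failed.
}
```
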
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetSegmentReport.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetSegmentReport.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetSegmentReport.h 2025-04-19 05:01:49
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetSegmentReport.h 2025-05-31 21:25:21
@@ -37,9 +37,7 @@
/*!
@class AVAssetSegmentReport
@abstract This class provides information on a segment data.
- @discussion Clients may get an instance of AVAssetSegmentReport through the -assetWriter:didOutputSegmentData:segmentType:segmentReport: delegate method, which is defined in AVAssetWriter.h.
-
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
+ @discussion Clients may get an instance of AVAssetSegmentReport through the -assetWriter:didOutputSegmentData:segmentType:segmentReport: delegate method, which is defined in AVAssetWriter.h. Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
*/
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(11.0), ios(14.0), tvos(14.0), visionos(1.0)) API_UNAVAILABLE(watchos)
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetTrack.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetTrack.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetTrack.h 2025-04-19 03:37:56
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetTrack.h 2025-05-31 00:03:09
@@ -4,21 +4,10 @@
Framework: AVFoundation
- Copyright 2010-2022 Apple Inc. All rights reserved.
+ Copyright 2010-2025 Apple Inc. All rights reserved.
*/
-/*!
- @class AVAssetTrack
-
- @abstract An AVAssetTrack object provides provides the track-level inspection interface for all assets.
-
- @discussion
- AVAssetTrack adopts the AVAsynchronousKeyValueLoading protocol. Methods in the protocol should be used to access a track's properties without blocking the current thread. To cancel load requests for all keys of AVAssetTrack one must message the parent AVAsset object (for example, [track.asset cancelLoading]).
-
- For clients who want to examine a subset of the metadata or other parts of the track, asynchronous methods like -loadMetadataForFormat:completionHandler: can be used to load this information without blocking. When using these asynchronous methods, it is not necessary to load the associated property beforehand. Swift clients can also use the load(:) method to load properties in a type safe manner.
-*/
-
#import <AVFoundation/AVBase.h>
#import <AVFoundation/AVAsynchronousKeyValueLoading.h>
#import <AVFoundation/AVAsset.h>
@@ -31,6 +20,11 @@
@class AVAssetTrackInternal;
+/// An AVAssetTrack object provides the track-level inspection interface for all assets.
+///
+/// AVAssetTrack adopts the AVAsynchronousKeyValueLoading protocol. Methods in the protocol should be used to access a track's properties without blocking the current thread. To cancel load requests for all keys of AVAssetTrack one must message the parent AVAsset object (for example, [track.asset cancelLoading]).
+///
+/// For clients who want to examine a subset of the metadata or other parts of the track, asynchronous methods like -loadMetadataForFormat:completionHandler: can be used to load this information without blocking. When using these asynchronous methods, it is not necessary to load the associated property beforehand. Swift clients can also use the load(:) method to load properties in a type safe manner.
API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0), watchos(1.0), visionos(1.0))
@interface AVAssetTrack : NSObject <NSCopying, AVAsynchronousKeyValueLoading>
{
@@ -39,10 +33,10 @@
}
AV_INIT_UNAVAILABLE
-/* provides a reference to the AVAsset of which the AVAssetTrack is a part */
+/// Provides a reference to the AVAsset of which the AVAssetTrack is a part
@property (nonatomic, readonly, weak) AVAsset *asset;
-/* indicates the persistent unique identifier for this track of the asset */
+/// Indicates the persistent unique identifier for this track of the asset
@property (nonatomic, readonly) CMPersistentTrackID trackID;
/* Note that cancellation of loading requests for all keys of AVAssetTrack must be made on the parent AVAsset, e.g. [[track.asset] cancelLoading] */
@@ -52,16 +46,13 @@
@interface AVAssetTrack (AVAssetTrackBasicPropertiesAndCharacteristics)
-/* indicates the media type for this track, e.g. AVMediaTypeVideo, AVMediaTypeAudio, etc., as defined in AVMediaFormat.h. */
+/// Indicates the media type for this track, e.g. AVMediaTypeVideo, AVMediaTypeAudio, etc., as defined in AVMediaFormat.h.
@property (nonatomic, readonly) AVMediaType mediaType;
-/* provides an array of CMFormatDescriptions
- each of which indicates the format of media samples referenced by the track;
- a track that presents uniform media, e.g. encoded according to the same encoding settings,
- will provide an array with a count of 1 */
+/// Provides an array of CMFormatDescriptions each of which indicates the format of media samples referenced by the track; a track that presents uniform media, e.g. encoded according to the same encoding settings, will provide an array with a count of 1.
@property (nonatomic, readonly) NSArray *formatDescriptions AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.formatDescriptions) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* Indicates whether the receiver is playable in the current environment; if YES, an AVPlayerItemTrack of an AVPlayerItem initialized with the receiver's asset can be enabled for playback. */
+/// Indicates whether the receiver is playable in the current environment; if YES, an AVPlayerItemTrack of an AVPlayerItem initialized with the receiver's asset can be enabled for playback.
@property (nonatomic, readonly, getter=isPlayable) BOOL playable
#if __swift__
API_DEPRECATED("Use load(.isPlayable) instead", macos(10.8, 13.0), ios(5.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -69,7 +60,7 @@
API_AVAILABLE(macos(10.8), ios(5.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/* Indicates whether the receiver is decodable in the current environment; if YES, the track can be decoded even though decoding may be too slow for real time playback. */
+/// Indicates whether the receiver is decodable in the current environment; if YES, the track can be decoded even though decoding may be too slow for real time playback.
@property (nonatomic, readonly, getter=isDecodable) BOOL decodable
#if __swift__
API_DEPRECATED("Use load(.isDecodable) instead", macos(10.13, 13.0), ios(11.0, 16.0), tvos(11.0, 16.0), watchos(4.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -77,24 +68,20 @@
API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0), watchos(4.0), visionos(1.0));
#endif
-/* indicates whether the track is enabled according to state stored in its container or construct;
- note that its presentation state can be changed from this default via AVPlayerItemTrack */
+/// Indicates whether the track is enabled according to state stored in its container or construct; note that its presentation state can be changed from this default via AVPlayerItemTrack
@property (nonatomic, readonly, getter=isEnabled) BOOL enabled AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.isEnabled) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates whether the track references sample data only within its storage container */
+/// Indicates whether the track references sample data only within its storage container
@property (nonatomic, readonly, getter=isSelfContained) BOOL selfContained AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.isSelfContained) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the total number of bytes of sample data required by the track */
+/// Indicates the total number of bytes of sample data required by the track
@property (nonatomic, readonly) long long totalSampleDataLength AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.totalSampleDataLength) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/*!
- @method hasMediaCharacteristic:
- @abstract Reports whether the track references media with the specified media characteristic.
- @param mediaCharacteristic
- The media characteristic of interest, e.g. AVMediaCharacteristicVisual, AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, etc.,
- as defined above.
- @result YES if the track references media with the specified characteristic, otherwise NO.
-*/
+/// Reports whether the track references media with the specified media characteristic.
+///
+/// - Parameter mediaCharacteristic: The media characteristic of interest, e.g. AVMediaCharacteristicVisual, AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, etc., as defined above.
+///
+/// - Returns: YES if the track references media with the specified characteristic, otherwise NO.
- (BOOL)hasMediaCharacteristic:(AVMediaCharacteristic)mediaCharacteristic AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.mediaCharacteristics) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
@end
@@ -102,14 +89,13 @@
@interface AVAssetTrack (AVAssetTrackTemporalProperties)
-/* Indicates the timeRange of the track within the overall timeline of the asset;
- a track with CMTIME_COMPARE_INLINE(timeRange.start, >, kCMTimeZero) will initially present an empty interval. */
+/// Indicates the timeRange of the track within the overall timeline of the asset; a track with CMTIME_COMPARE_INLINE(timeRange.start, >, kCMTimeZero) will initially present an empty interval.
@property (nonatomic, readonly) CMTimeRange timeRange AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.timeRange) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates a timescale in which time values for the track can be operated upon without extraneous numerical conversion */
+/// Indicates a timescale in which time values for the track can be operated upon without extraneous numerical conversion
@property (nonatomic, readonly) CMTimeScale naturalTimeScale AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.naturalTimeScale) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the estimated data rate of the media data referenced by the track, in units of bits per second */
+/// Indicates the estimated data rate of the media data referenced by the track, in units of bits per second
@property (nonatomic, readonly) float estimatedDataRate AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.estimatedDataRate) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
@end
@@ -117,12 +103,10 @@
@interface AVAssetTrack (AVAssetTrackLanguageProperties)
-/* indicates the language associated with the track, as an ISO 639-2/T language code;
- may be nil if no language is indicated */
+/// Indicates the language associated with the track, as an ISO 639-2/T language code; may be nil if no language is indicated
@property (nonatomic, readonly, nullable) NSString *languageCode AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.languageCode) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the language tag associated with the track, as an IETF BCP 47 (RFC 4646) language identifier;
- may be nil if no language tag is indicated */
+/// Indicates the language tag associated with the track, as an IETF BCP 47 (RFC 4646) language identifier; may be nil if no language tag is indicated
@property (nonatomic, readonly, nullable) NSString *extendedLanguageTag AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.extendedLanguageTag) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
@end
@@ -130,11 +114,10 @@
@interface AVAssetTrack (AVAssetTrackPropertiesForVisualCharacteristic)
-/* indicates the natural dimensions of the media data referenced by the track as a CGSize */
+/// Indicates the natural dimensions of the media data referenced by the track as a CGSize
@property (nonatomic, readonly) CGSize naturalSize AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.naturalSize) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the transform specified in the track's storage container as the preferred transformation of the visual media data for display purposes;
- its value is often but not always CGAffineTransformIdentity */
+/// Indicates the transform specified in the track's storage container as the preferred transformation of the visual media data for display purposes; its value is often but not always CGAffineTransformIdentity
@property (nonatomic, readonly) CGAffineTransform preferredTransform AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.preferredTransform) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
@end
@@ -142,10 +125,10 @@
@interface AVAssetTrack (AVAssetTrackPropertiesForAudibleCharacteristic)
-/* indicates the volume specified in the track's storage container as the preferred volume of the audible media data */
+/// Indicates the volume specified in the track's storage container as the preferred volume of the audible media data
@property (nonatomic, readonly) float preferredVolume AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.preferredVolume) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates whether this audio track has dependencies (e.g. kAudioFormatMPEGD_USAC) */
+/// Indicates whether this audio track has dependencies (e.g. kAudioFormatMPEGD_USAC)
@property (nonatomic, readonly) BOOL hasAudioSampleDependencies
#if __swift__
API_DEPRECATED("Use load(.hasAudioSampleDependencies) instead", macos(10.15, 13.0), ios(13.0, 16.0), tvos(13.0, 16.0), watchos(6.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -158,14 +141,12 @@
@interface AVAssetTrack (AVAssetTrackPropertiesForFrameBasedCharacteristic)
-/*!
- @property nominalFrameRate
- @abstract For tracks that carry a full frame per media sample, indicates the frame rate of the track in units of frames per second.
- @discussion For field-based video tracks that carry one field per media sample, the value of this property is the field rate, not the frame rate.
-*/
+/// For tracks that carry a full frame per media sample, indicates the frame rate of the track in units of frames per second.
+///
+/// For field-based video tracks that carry one field per media sample, the value of this property is the field rate, not the frame rate.
@property (nonatomic, readonly) float nominalFrameRate AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.nominalFrameRate) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* indicates the minimum duration of the track's frames; the value will be kCMTimeInvalid if the minimum frame duration is not known or cannot be calculated */
+/// Indicates the minimum duration of the track's frames; the value will be kCMTimeInvalid if the minimum frame duration is not known or cannot be calculated
@property (nonatomic, readonly) CMTime minFrameDuration
#if __swift__
API_DEPRECATED("Use load(.minFrameDuration) instead", macos(10.10, 13.0), ios(7.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -173,10 +154,7 @@
API_AVAILABLE(macos(10.10), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/*!
- @property requiresFrameReordering
- @abstract Indicates whether samples in the track may have different values for their presentation and decode timestamps.
-*/
+/// Indicates whether samples in the track may have different values for their presentation and decode timestamps.
@property (nonatomic, readonly) BOOL requiresFrameReordering
#if __swift__
API_DEPRECATED("Use load(.requiresFrameReordering) instead", macos(10.10, 13.0), ios(8.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -189,18 +167,16 @@
@interface AVAssetTrack (AVAssetTrackSegments)
-/* Provides an array of AVAssetTrackSegments with time mappings from the timeline of the track's media samples to the timeline of the track.
- Empty edits, i.e. timeRanges for which no media data is available to be presented, have a value of AVAssetTrackSegment.empty equal to YES. */
+/// Provides an array of AVAssetTrackSegments with time mappings from the timeline of the track's media samples to the timeline of the track. Empty edits, i.e. timeRanges for which no media data is available to be presented, have a value of AVAssetTrackSegment.empty equal to YES.
@property (nonatomic, copy, readonly) NSArray<AVAssetTrackSegment *> *segments AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.segments) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/*!
- @method segmentForTrackTime:
- @abstract Supplies the AVAssetTrackSegment from the segments array with a target timeRange that either contains the specified track time or is the closest to it among the target timeRanges of the track's segments.
- @param trackTime
- The trackTime for which an AVAssetTrackSegment is requested.
- @result An AVAssetTrackSegment.
- @discussion If the trackTime does not map to a sample presentation time (e.g. it's outside the track's timeRange), the segment closest in time to the specified trackTime is returned.
-*/
+/// Supplies the AVAssetTrackSegment from the segments array with a target timeRange that either contains the specified track time or is the closest to it among the target timeRanges of the track's segments.
+///
+/// If the trackTime does not map to a sample presentation time (e.g. it's outside the track's timeRange), the segment closest in time to the specified trackTime is returned.
+///
+/// - Parameter trackTime: The trackTime for which an AVAssetTrackSegment is requested.
+///
+/// - Returns: An AVAssetTrackSegment.
- (nullable AVAssetTrackSegment *)segmentForTrackTime:(CMTime)trackTime
#if __swift__
API_DEPRECATED("Use loadSegment(forTrackTime:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -208,24 +184,19 @@
API_DEPRECATED("Use loadSegmentForTrackTime:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadSegmentForTrackTime:completionHandler:
- @abstract Loads the AVAssetTrackSegment from the segments array with a target timeRange that either contains the specified track time or is the closest to it among the target timeRanges of the track's segments.
- @param trackTime
- The trackTime for which an AVAssetTrackSegment is requested.
- @param completionHandler
- A block that is invoked when loading is complete, vending an AVAssetTrackSegment or an error.
- @discussion If the trackTime does not map to a sample presentation time (e.g. it's outside the track's timeRange), the segment closest in time to the specified trackTime is returned.
-*/
+/// Loads the AVAssetTrackSegment from the segments array with a target timeRange that either contains the specified track time or is the closest to it among the target timeRanges of the track's segments.
+///
+/// If the trackTime does not map to a sample presentation time (e.g. it's outside the track's timeRange), the segment closest in time to the specified trackTime is returned.
+///
+/// - Parameter trackTime: The trackTime for which an AVAssetTrackSegment is requested.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending an AVAssetTrackSegment or an error.
- (void)loadSegmentForTrackTime:(CMTime)trackTime completionHandler:(void (^ NS_SWIFT_SENDABLE)(AVAssetTrackSegment * _Nullable_result, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
-/*!
- @method samplePresentationTimeForTrackTime:
- @abstract Maps the specified trackTime through the appropriate time mapping and returns the resulting sample presentation time.
- @param trackTime
- The trackTime for which a sample presentation time is requested.
- @result A CMTime; will be invalid if the trackTime is out of range
-*/
+/// Maps the specified trackTime through the appropriate time mapping and returns the resulting sample presentation time.
+///
+/// - Parameter trackTime: The trackTime for which a sample presentation time is requested.
+///
+/// - Returns: A CMTime; will be invalid if the trackTime is out of range
- (CMTime)samplePresentationTimeForTrackTime:(CMTime)trackTime
#if __swift__
API_DEPRECATED("Use loadSamplePresentationTime(forTrackTime:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -233,14 +204,10 @@
API_DEPRECATED("Use loadSamplePresentationTimeForTrackTime:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadSamplePresentationTimeForTrackTime:completionHandler:
- @abstract Maps the specified trackTime through the appropriate time mapping and loads the resulting sample presentation time.
- @param trackTime
- The trackTime for which a sample presentation time is requested.
- @param completionHandler
- A block that is invoked when loading is complete, vending a CMTime (which will be invalid if the trackTime is out of range) or an error.
-*/
+/// Maps the specified trackTime through the appropriate time mapping and loads the resulting sample presentation time.
+///
+/// - Parameter trackTime: The trackTime for which a sample presentation time is requested.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending a CMTime (which will be invalid if the trackTime is out of range) or an error.
- (void)loadSamplePresentationTimeForTrackTime:(CMTime)trackTime completionHandler:(void (^ NS_SWIFT_SENDABLE)(CMTime, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
@end
@@ -250,11 +217,10 @@
// high-level access to selected metadata of common interest
-/* provides access to an array of AVMetadataItems for each common metadata key for which a value is available */
+/// Provides access to an array of AVMetadataItems for each common metadata key for which a value is available
@property (nonatomic, readonly) NSArray<AVMetadataItem *> *commonMetadata AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.commonMetadata) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/* Provides access to an array of AVMetadataItems for all metadata identifiers for which a value is available; items can be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:] and according to identifier via +[AVMetadataItem metadataItemsFromArray:filteredByIdentifier:].
-*/
+/// Provides access to an array of AVMetadataItems for all metadata identifiers for which a value is available; items can be filtered according to language via +[AVMetadataItem metadataItemsFromArray:filteredAndSortedAccordingToPreferredLanguages:] and according to identifier via +[AVMetadataItem metadataItemsFromArray:filteredByIdentifier:].
@property (nonatomic, readonly) NSArray<AVMetadataItem *> *metadata
#if __swift__
API_DEPRECATED("Use load(.metadata) instead", macos(10.10, 13.0), ios(8.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -262,18 +228,16 @@
API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/* provides an NSArray of NSStrings, each representing a format of metadata that's available for the track (e.g. QuickTime userdata, etc.)
- Metadata formats are defined in AVMetadataItem.h. */
+/// Provides an NSArray of NSStrings, each representing a format of metadata that's available for the track (e.g. QuickTime userdata, etc.) Metadata formats are defined in AVMetadataItem.h.
@property (nonatomic, readonly) NSArray<AVMetadataFormat> *availableMetadataFormats AVF_DEPRECATED_FOR_SWIFT_ONLY("Use load(.availableMetadataFormats) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0), visionos(1.0, 1.0));
-/*!
- @method metadataForFormat:
- @abstract Provides an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format.
- @param format
- The metadata format for which items are requested.
- @result An NSArray containing AVMetadataItems.
- @discussion Becomes callable without blocking when the key @"availableMetadataFormats" has been loaded
-*/
+/// Provides an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format.
+///
+/// Becomes callable without blocking when the key @"availableMetadataFormats" has been loaded
+///
+/// - Parameter format: The metadata format for which items are requested.
+///
+/// - Returns: An NSArray containing AVMetadataItems.
- (NSArray<AVMetadataItem *> *)metadataForFormat:(AVMetadataFormat)format
#if __swift__
API_DEPRECATED("Use loadMetadata(for:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -281,14 +245,10 @@
API_DEPRECATED("Use loadMetadataForFormat:completionHandler: instead", macos(10.7, 15.0), ios(4.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadMetadataForFormat:completionHandler:
- @abstract Loads an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format.
- @param format
- The metadata format for which items are requested.
- @param completionHandler
- A block that is invoked when loading is complete, vending the array of metadata items (which may be empty if there is no metadata of the specified format) or an error.
-*/
+/// Loads an NSArray of AVMetadataItems, one for each metadata item in the container of the specified format.
+///
+/// - Parameter format: The metadata format for which items are requested.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending the array of metadata items (which may be empty if there is no metadata of the specified format) or an error.
- (void)loadMetadataForFormat:(AVMetadataFormat)format completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVMetadataItem *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
@end
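
A short sketch of loading format-specific metadata without blocking; the track is assumed to come from an already-created asset, and the QuickTime user data format is just an arbitrary example value:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: load the QuickTime user data items of a track asynchronously.
// The format is an arbitrary example; any value from availableMetadataFormats works.
static void LoadTrackMetadataExample(AVAssetTrack *track) {
    [track loadMetadataForFormat:AVMetadataFormatQuickTimeUserData
               completionHandler:^(NSArray<AVMetadataItem *> *items, NSError *error) {
        if (items == nil) {
            NSLog(@"Metadata load failed: %@", error);
            return;
        }
        for (AVMetadataItem *item in items) {
            NSLog(@"%@ = %@", item.identifier, item.value);
        }
    }];
}
```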
@@ -296,87 +256,62 @@
@interface AVAssetTrack (AVAssetTrackTrackAssociations)
-/*!
- @typedef AVTrackAssociationType
- @abstract
- The type of a track association.
-*/
+/// The type of a track association.
typedef NSString * AVTrackAssociationType NS_STRING_ENUM;
-/*
- @constant AVTrackAssociationTypeAudioFallback
- @abstract Indicates an association between an audio track with another audio track that contains the same content but is typically encoded in a different format that's more widely supported, used to nominate a track that should be used in place of an unsupported track.
-
- @discussion
- Associations of type AVTrackAssociationTypeAudioFallback are supported only between audio tracks. This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding track that has content that's less widely supported, and the input parameter should be an instance of AVAssetWriterInput with a corresponding track that has content that's more widely supported.
-
- Example: Using AVTrackAssociationTypeAudioFallback, a stereo audio track with media subtype kAudioFormatMPEG4AAC could be nominated as the "fallback" for an audio track encoding the same source material but with media subtype kAudioFormatAC3 and a 5.1 channel layout. This would ensure that all clients are capable of playing back some form of the audio.
-
- */
+/// Indicates an association between an audio track and another audio track that contains the same content but is typically encoded in a different format that's more widely supported, used to nominate a track that should be used in place of an unsupported track.
+///
+/// Associations of type AVTrackAssociationTypeAudioFallback are supported only between audio tracks. This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding track that has content that's less widely supported, and the input parameter should be an instance of AVAssetWriterInput with a corresponding track that has content that's more widely supported.
+///
+/// Example: Using AVTrackAssociationTypeAudioFallback, a stereo audio track with media subtype kAudioFormatMPEG4AAC could be nominated as the "fallback" for an audio track encoding the same source material but with media subtype kAudioFormatAC3 and a 5.1 channel layout. This would ensure that all clients are capable of playing back some form of the audio.
AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeAudioFallback API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/*
- @constant AVTrackAssociationTypeChapterList
- @abstract Indicates an association between a track with another track that contains chapter information. The track containing chapter information may be a text track, a video track, or a timed metadata track.
-
- @discussion
- This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding track that has renderable content while the input parameter should be an instance of AVAssetWriterInput with a corresponding track that contains chapter metadata.
- */
+/// Indicates an association between a track and another track that contains chapter information. The track containing chapter information may be a text track, a video track, or a timed metadata track.
+///
+/// This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding track that has renderable content while the input parameter should be an instance of AVAssetWriterInput with a corresponding track that contains chapter metadata.
AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeChapterList API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/*
- @constant AVTrackAssociationTypeForcedSubtitlesOnly
- @abstract Indicates an association between a subtitle track typically containing both forced and non-forced subtitles with another subtitle track that contains only forced subtitles, for use when the user indicates that only essential subtitles should be displayed. When such an association is established, the forced subtitles in both tracks are expected to present the same content in the same language but may have different timing.
-
- @discussion
- Associations of type AVTrackAssociationTypeForcedSubtitlesOnly are supported only between subtitle tracks. This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding subtitle track that contains non-forced subtitles, and the input parameter should be an instance of AVAssetWriterInput with a corresponding subtitle track that contains forced subtitles only.
- */
+/// Indicates an association between a subtitle track typically containing both forced and non-forced subtitles and another subtitle track that contains only forced subtitles, for use when the user indicates that only essential subtitles should be displayed. When such an association is established, the forced subtitles in both tracks are expected to present the same content in the same language but may have different timing.
+///
+/// Associations of type AVTrackAssociationTypeForcedSubtitlesOnly are supported only between subtitle tracks. This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding subtitle track that contains non-forced subtitles, and the input parameter should be an instance of AVAssetWriterInput with a corresponding subtitle track that contains forced subtitles only.
AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeForcedSubtitlesOnly API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/*
- @constant AVTrackAssociationTypeSelectionFollower
- @abstract Indicates an association between a pair of tracks that specifies that, when the first of the pair is selected, the second of the pair should be considered an appropriate default for selection also. Example: a subtitle track in the same language as an audio track may be associated with that audio track using AVTrackAssociationTypeSelectionFollower, to indicate that selection of the subtitle track, in the absence of a directive for subtitle selection from the user, can "follow" the selection of the audio track.
-
- @discussion
- This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the input parameter should be an instance of AVAssetWriterInput whose selection may depend on the selection of the receiver. In the example above, the receiver would be the instance of AVAssetWriterInput corresponding with the audio track and the input parameter would be the instance of AVAssetWriterInput corresponding with the subtitle track.
- */
+/// Indicates an association between a pair of tracks that specifies that, when the first of the pair is selected, the second of the pair should be considered an appropriate default for selection also. Example: a subtitle track in the same language as an audio track may be associated with that audio track using AVTrackAssociationTypeSelectionFollower, to indicate that selection of the subtitle track, in the absence of a directive for subtitle selection from the user, can "follow" the selection of the audio track.
+///
+/// This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the input parameter should be an instance of AVAssetWriterInput whose selection may depend on the selection of the receiver. In the example above, the receiver would be the instance of AVAssetWriterInput corresponding with the audio track and the input parameter would be the instance of AVAssetWriterInput corresponding with the subtitle track.
AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeSelectionFollower API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/*
- @constant AVTrackAssociationTypeTimecode
- @abstract Indicates an association between a track with another track that contains timecode information. The track containing timecode information should be a timecode track.
-
- @discussion
- This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding track that may be a video track or an audio track while the input parameter should be an instance of AVAssetWriterInput with a corresponding timecode track.
- */
+/// Indicates an association between a track and another track that contains timecode information. The track containing timecode information should be a timecode track.
+///
+/// This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with a corresponding track that may be a video track or an audio track while the input parameter should be an instance of AVAssetWriterInput with a corresponding timecode track.
AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeTimecode API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/*
-@constant AVTrackAssociationTypeMetadataReferent
-@abstract Indicates an association between a metadata track and the track that's described or annotated via the contents of the metadata track.
-
-@discussion
- This track association is optional for AVAssetTracks with the mediaType AVMediaTypeMetadata. When a metadata track lacks this track association, its contents are assumed to describe or annotate the asset as a whole.
- This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with mediaType AVMediaTypeMetadata while the input parameter should be an instance of AVAssetWriterInput that's used to create the track to which the contents of the receiver's corresponding metadata track refer.
-*/
+/// Indicates an association between a metadata track and the track that's described or annotated via the contents of the metadata track.
+///
+/// This track association is optional for AVAssetTracks with the mediaType AVMediaTypeMetadata. When a metadata track lacks this track association, its contents are assumed to describe or annotate the asset as a whole.
+/// This association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with mediaType AVMediaTypeMetadata while the input parameter should be an instance of AVAssetWriterInput that's used to create the track to which the contents of the receiver's corresponding metadata track refer.
AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeMetadataReferent API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), watchos(1.0), visionos(1.0));
-/* Provides an NSArray of NSStrings, each representing a type of track association that the receiver has with one or more of the other tracks of the asset (e.g. AVTrackAssociationTypeChapterList, AVTrackAssociationTypeTimecode, etc.).
- Track association types are defined immediately above. */
+/// Indicates an association between a metadata track and another track where the metadata provides additional information for rendering of that track.
+///
+/// This track association is not symmetric; when used with -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:], the receiver should be an instance of AVAssetWriterInput with mediaType AVMediaTypeMetadata, while the input parameter should be an instance of AVAssetWriterInput for the target track that would be rendered (for example, a video track).
+AVF_EXPORT AVTrackAssociationType const AVTrackAssociationTypeRenderMetadataSource API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), watchos(26.0), visionos(26.0));
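
A sketch of how this new render-metadata association might be declared at authoring time, reusing the existing -[AVAssetWriterInput addTrackAssociationWithTrackOfInput:type:] API; both writer inputs are assumed to exist and to be attached to the same AVAssetWriter, and the constant requires the 26.0 SDKs:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: mark a timed-metadata writer input as a render metadata source for a
// video writer input, mirroring the receiver/input roles described above.
static void AssociateRenderMetadataExample(AVAssetWriterInput *metadataInput,
                                           AVAssetWriterInput *videoInput) {
    if ([metadataInput canAddTrackAssociationWithTrackOfInput:videoInput
                                                         type:AVTrackAssociationTypeRenderMetadataSource]) {
        [metadataInput addTrackAssociationWithTrackOfInput:videoInput
                                                      type:AVTrackAssociationTypeRenderMetadataSource];
    }
}
```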
+
+/// Provides an NSArray of NSStrings, each representing a type of track association that the receiver has with one or more of the other tracks of the asset (e.g. AVTrackAssociationTypeChapterList, AVTrackAssociationTypeTimecode, etc.). Track association types are defined immediately above.
@property (nonatomic, readonly) NSArray<AVTrackAssociationType> *availableTrackAssociationTypes
#if __swift__
API_DEPRECATED("Use load(.availableTrackAssociationTypes) instead", macos(10.9, 13.0), ios(7.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
#else
API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), watchos(1.0), visionos(1.0));
#endif
-/*!
- @method associatedTracksOfType:
- @abstract Provides an NSArray of AVAssetTracks, one for each track associated with the receiver with the specified type of track association.
- @param trackAssociationType
- The type of track association for which associated tracks are requested.
- @result An NSArray containing AVAssetTracks; may be empty if there is no associated tracks of the specified type.
- @discussion Becomes callable without blocking when the key @"availableTrackAssociationTypes" has been loaded.
-*/
+
+/// Provides an NSArray of AVAssetTracks, one for each track associated with the receiver with the specified type of track association.
+///
+/// Becomes callable without blocking when the key @"availableTrackAssociationTypes" has been loaded.
+///
+/// - Parameter trackAssociationType: The type of track association for which associated tracks are requested.
+///
+/// - Returns: An NSArray containing AVAssetTracks; may be empty if there are no associated tracks of the specified type.
- (NSArray<AVAssetTrack *> *)associatedTracksOfType:(AVTrackAssociationType)trackAssociationType
#if __swift__
API_DEPRECATED("Use loadAssociatedTracks(ofType:) instead", macos(10.9, 13.0), ios(7.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0)) API_UNAVAILABLE(visionos);
@@ -384,14 +319,10 @@
API_DEPRECATED("Use loadAssociatedTracksOfType:completionHandler: instead", macos(10.9, 15.0), ios(7.0, 18.0), tvos(9.0, 18.0), watchos(1.0, 11.0)) API_UNAVAILABLE(visionos);
#endif
-/*!
- @method loadAssociatedTracksOfType:completionHandler:
- @abstract Provides an NSArray of AVAssetTracks, one for each track associated with the receiver with the specified type of track association.
- @param trackAssociationType
- The type of track association for which associated tracks are requested.
- @param completionHandler
- A block that is invoked when loading is comlete, vending an array of tracks (which may be empty if there is no associated tracks of the specified type) or an error.
-`*/
+/// Provides an NSArray of AVAssetTracks, one for each track associated with the receiver with the specified type of track association.
+///
+/// - Parameter trackAssociationType: The type of track association for which associated tracks are requested.
+/// - Parameter completionHandler: A block that is invoked when loading is complete, vending an array of tracks (which may be empty if there are no associated tracks of the specified type) or an error.
- (void)loadAssociatedTracksOfType:(AVTrackAssociationType)trackAssociationType completionHandler:(void (^ NS_SWIFT_SENDABLE)(NSArray<AVAssetTrack *> * _Nullable, NSError * _Nullable))completionHandler API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0));
@end
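
A sketch of loading associated tracks asynchronously, using the chapter-list association as an example; the track is assumed to come from an already-created asset, and the helper name is illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: find the track(s) that carry chapter information for a given track.
static void LoadChapterTracksExample(AVAssetTrack *track) {
    [track loadAssociatedTracksOfType:AVTrackAssociationTypeChapterList
                    completionHandler:^(NSArray<AVAssetTrack *> *associated, NSError *error) {
        if (associated == nil) {
            NSLog(@"Loading associated tracks failed: %@", error);
        } else if (associated.count == 0) {
            NSLog(@"No chapter list is associated with this track.");
        } else {
            NSLog(@"Chapter information is carried by track ID %d", associated.firstObject.trackID);
        }
    }];
}
```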
@@ -401,7 +332,7 @@
@interface AVAssetTrack (AVAssetTrackSampleCursorProvision)
-/* Indicates whether the receiver can provide instances of AVSampleCursor for traversing its media samples and discovering information about them. */
+/// Indicates whether the receiver can provide instances of AVSampleCursor for traversing its media samples and discovering information about them.
@property (nonatomic, readonly) BOOL canProvideSampleCursors
#if __swift__
API_DEPRECATED("Use load(.canProvideSampleCursors) instead", macos(10.10, 13.0)) API_UNAVAILABLE(ios, tvos, watchos, visionos);
@@ -409,32 +340,29 @@
API_AVAILABLE(macos(10.10), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
#endif
-/*!
- @method makeSampleCursorWithPresentationTimeStamp:
- @abstract Creates an instance of AVSampleCursor and positions it at or near the specified presentation timestamp.
- @param presentationTimeStamp
- The desired initial presentation timestamp of the returned AVSampleCursor.
- @result An instance of AVSampleCursor.
- @discussion If the receiver's asset has a value of YES for providesPreciseDurationAndTiming, the sample cursor will be accurately positioned at the receiver's last media sample with presentation timestamp less than or equal to the desired timestamp, or, if there are no such samples, the first sample in presentation order.
- If the receiver's asset has a value of NO for providesPreciseDurationAndTiming, and it is prohibitively expensive to locate the precise sample at the desired timestamp, the sample cursor may be approximately positioned.
- This method will return nil if there are no samples in the track.
-*/
+/// Creates an instance of AVSampleCursor and positions it at or near the specified presentation timestamp.
+///
+/// If the receiver's asset has a value of YES for providesPreciseDurationAndTiming, the sample cursor will be accurately positioned at the receiver's last media sample with presentation timestamp less than or equal to the desired timestamp, or, if there are no such samples, the first sample in presentation order.
+/// If the receiver's asset has a value of NO for providesPreciseDurationAndTiming, and it is prohibitively expensive to locate the precise sample at the desired timestamp, the sample cursor may be approximately positioned.
+/// This method will return nil if there are no samples in the track.
+///
+/// - Parameter presentationTimeStamp: The desired initial presentation timestamp of the returned AVSampleCursor.
+///
+/// - Returns: An instance of AVSampleCursor.
- (nullable AVSampleCursor *)makeSampleCursorWithPresentationTimeStamp:(CMTime)presentationTimeStamp API_AVAILABLE(macos(10.10), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @method makeSampleCursorAtFirstSampleInDecodeOrder:
- @abstract Creates an instance of AVSampleCursor and positions it at the receiver's first media sample in decode order.
- @result An instance of AVSampleCursor.
- @discussion This method will return nil if there are no samples in the track.
-*/
+/// Creates an instance of AVSampleCursor and positions it at the receiver's first media sample in decode order.
+///
+/// This method will return nil if there are no samples in the track.
+///
+/// - Returns: An instance of AVSampleCursor.
- (nullable AVSampleCursor *)makeSampleCursorAtFirstSampleInDecodeOrder API_AVAILABLE(macos(10.10), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @method makeSampleCursorAtLastSampleInDecodeOrder:
- @abstract Creates an instance of AVSampleCursor and positions it at the receiver's last media sample in decode order.
- @result An instance of AVSampleCursor.
- @discussion This method will return nil if there are no samples in the track.
-*/
+/// Creates an instance of AVSampleCursor and positions it at the receiver's last media sample in decode order.
+///
+/// This method will return nil if there are no samples in the track.
+///
+/// - Returns: An instance of AVSampleCursor.
- (nullable AVSampleCursor *)makeSampleCursorAtLastSampleInDecodeOrder API_AVAILABLE(macos(10.10), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
@end
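
A sketch of the sample-cursor traversal described above; it assumes a local track whose asset provides precise timing, and simply prints the first few presentation timestamps:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: walk the first few samples of a track in decode order and print
// their presentation timestamps.
static void WalkSamplesExample(AVAssetTrack *track) {
    if (!track.canProvideSampleCursors) {
        return;
    }
    AVSampleCursor *cursor = [track makeSampleCursorAtFirstSampleInDecodeOrder];
    if (cursor == nil) {
        return;   // the track contains no samples
    }
    for (int i = 0; i < 5; i++) {
        CMTimeShow(cursor.presentationTimeStamp);
        if ([cursor stepInDecodeOrderByCount:1] != 1) {
            break;   // reached the last sample
        }
    }
}
```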
@@ -446,34 +374,23 @@
Some of the notifications are also posted by instances of dynamic subclasses, AVFragmentedAssetTrack and AVFragmentedMovieTrack, but these are capable of changing only in well-defined ways and only under specific conditions that you control.
*/
-/*!
- @constant AVAssetTrackTimeRangeDidChangeNotification
- @abstract Posted when the timeRange of an AVFragmentedAssetTrack changes while the associated instance of AVFragmentedAsset is being minded by an AVFragmentedAssetMinder, but only for changes that occur after the status of the value of @"timeRange" has reached AVKeyValueStatusLoaded.
-*/
+/// Posted when the timeRange of an AVFragmentedAssetTrack changes while the associated instance of AVFragmentedAsset is being minded by an AVFragmentedAssetMinder, but only for changes that occur after the status of the value of @"timeRange" has reached AVKeyValueStatusLoaded.
AVF_EXPORT NSString *const AVAssetTrackTimeRangeDidChangeNotification API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
-/*!
- @constant AVAssetTrackSegmentsDidChangeNotification
- @abstract Posted when the array of segments of an AVFragmentedAssetTrack changes while the associated instance of AVFragmentedAsset is being minded by an AVFragmentedAssetMinder, but only for changes that occur after the status of the value of @"segments" has reached AVKeyValueStatusLoaded.
-*/
+/// Posted when the array of segments of an AVFragmentedAssetTrack changes while the associated instance of AVFragmentedAsset is being minded by an AVFragmentedAssetMinder, but only for changes that occur after the status of the value of @"segments" has reached AVKeyValueStatusLoaded.
AVF_EXPORT NSString *const AVAssetTrackSegmentsDidChangeNotification API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
-/*!
- @constant AVAssetTrackTrackAssociationsDidChangeNotification
- @abstract Posted when the collection of track associations of an AVAssetTrack changes, but only for changes that occur after the status of the value of @"availableTrackAssociationTypes" has reached AVKeyValueStatusLoaded.
-*/
+/// Posted when the collection of track associations of an AVAssetTrack changes, but only for changes that occur after the status of the value of @"availableTrackAssociationTypes" has reached AVKeyValueStatusLoaded.
AVF_EXPORT NSString *const AVAssetTrackTrackAssociationsDidChangeNotification API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0), visionos(1.0));
#pragma mark --- AVFragmentedAssetTrack ---
-/*!
- @class AVFragmentedAssetTrack
- @abstract A subclass of AVAssetTrack for handling tracks of fragmented assets. An AVFragmentedAssetTrack is capable of changing the values of certain of its properties, if its parent asset is associated with an instance of AVFragmentedAssetMinder when one or more fragments are appended to the underlying media resource.
- @discussion While its parent asset is associated with an AVFragmentedAssetMinder, AVFragmentedAssetTrack posts AVAssetTrackTimeRangeDidChangeNotification and AVAssetTrackSegmentsDidChangeNotification whenever new fragments are detected, as appropriate.
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
@class AVFragmentedAssetTrackInternal;
+/// A subclass of AVAssetTrack for handling tracks of fragmented assets. An AVFragmentedAssetTrack is capable of changing the values of certain of its properties, if its parent asset is associated with an instance of AVFragmentedAssetMinder when one or more fragments are appended to the underlying media resource.
+///
+/// While its parent asset is associated with an AVFragmentedAssetMinder, AVFragmentedAssetTrack posts AVAssetTrackTimeRangeDidChangeNotification and AVAssetTrackSegmentsDidChangeNotification whenever new fragments are detected, as appropriate.
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(10.11), ios(12.0), tvos(12.0), watchos(6.0), visionos(1.0))
@interface AVFragmentedAssetTrack : AVAssetTrack
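
A sketch of how the fragment-related notifications above are typically observed while a growing movie file is being minded; the file URL, the one-second minding interval, and the helper name are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: mind a growing (fragmented) movie file and log when a track's
// timeRange changes. The returned minder must be kept alive for the
// notifications to keep arriving.
static AVFragmentedAssetMinder *StartMindingExample(NSURL *growingMovieURL) {
    AVFragmentedAsset *asset = [AVFragmentedAsset fragmentedAssetWithURL:growingMovieURL options:nil];
    AVFragmentedAssetMinder *minder =
        [AVFragmentedAssetMinder fragmentedAssetMinderWithAsset:asset mindingInterval:1.0];

    [[NSNotificationCenter defaultCenter] addObserverForName:AVAssetTrackTimeRangeDidChangeNotification
                                                      object:nil
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        AVFragmentedAssetTrack *track = note.object;
        NSLog(@"Track %d now spans %.2f seconds",
              track.trackID, CMTimeGetSeconds(track.timeRange.duration));
    }];
    return minder;
}
```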
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetVariant.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetVariant.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetVariant.h 2025-04-19 03:37:57
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetVariant.h 2025-05-31 21:50:42
@@ -23,346 +23,234 @@
@class AVAssetVariantAudioAttributes;
@class AVAssetVariantAudioRenditionSpecificAttributes;
-/*!
- @class AVAssetVariant
- @abstract An AVAssetVariant represents a bit rate variant.
- Each asset contains a collection of variants that represent a combination of audio, video, text, closed captions, and subtitles for a particular bit rate.
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// An AVAssetVariant represents a bit rate variant. Each asset contains a collection of variants that represent a combination of audio, video, text, closed captions, and subtitles for a particular bit rate. Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0))
@interface AVAssetVariant : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @property peakBitRate
- @abstract If it is not declared, the value will be negative.
- */
+/// If it is not declared, the value will be negative.
@property (nonatomic, readonly) double peakBitRate;
-/*!
- @property averageBitRate
- @abstract If it is not declared, the value will be negative.
- */
+/// If it is not declared, the value will be negative.
@property (nonatomic, readonly) double averageBitRate;
-/*!
- @property videoAttributes
- @abstract Provides variant's video rendition attributes. If no video attributes are declared, it will be nil.
- */
+/// Provides variant's video rendition attributes. If no video attributes are declared, it will be nil.
@property (nonatomic, readonly, nullable) AVAssetVariantVideoAttributes *videoAttributes;
-/*!
- @property audioAttributes
- @abstract Provides variant's audio rendition attributes. If no audio attributes are declared, it will be nil.
- */
+/// Provides variant's audio rendition attributes. If no audio attributes are declared, it will be nil.
@property (nonatomic, readonly, nullable) AVAssetVariantAudioAttributes *audioAttributes;
+/// Provides the URL of the media playlist corresponding to the variant.
+@property (nonatomic, readonly) NSURL *URL API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), watchos(26.0), visionos(26.0));
+
@end
-/*!
- @class AVAssetVariantVideoAttributes
- @abstract Video attributes for an asset variant.
- @discussion Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// Video attributes for an asset variant.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0))
@interface AVAssetVariantVideoAttributes : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @property videoRange
- @abstract Provides the video range of the variant. If it is not declared, it will be AVVideoRangeSDR.
- */
+/// Provides the video range of the variant. If it is not declared, it will be AVVideoRangeSDR.
@property (nonatomic, readonly) AVVideoRange videoRange;
-/*!
- @property codecTypes
- @abstract Provides an array of video sample codec types present in the variant's renditions if any are declared. Each value in the array is a NSNumber representation of CMVideoCodecType.
- */
+/// Provides an array of video sample codec types present in the variant's renditions if any are declared. Each value in the array is a NSNumber representation of CMVideoCodecType.
@property (nonatomic, readonly) NSArray <NSNumber *> *codecTypes;
-/*!
- @property presentationSize
- @abstract If it is not declared, it will be CGSizeZero.
- */
+/// If it is not declared, it will be CGSizeZero.
@property (nonatomic, readonly) CGSize presentationSize;
-/*!
- @property nominalFrameRate
- @abstract If it is not declared, the value will be negative.
- */
+/// If it is not declared, the value will be negative.
@property (nonatomic, readonly) double nominalFrameRate;
-/*!
- @property videoLayoutAttributes
- @abstract Describes the video layout attributes.
- @discussion videoLayoutAttributes' count may be greater than one if this variant contains a collection of differing video layout media attributes over time.
-*/
+/// Describes the video layout attributes.
+///
+/// videoLayoutAttributes' count may be greater than one if this variant contains a collection of differing video layout media attributes over time.
@property (nonatomic, readonly) NSArray <AVAssetVariantVideoLayoutAttributes *> *videoLayoutAttributes API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0));
@end
-/*!
- @class AVAssetVariantVideoLayoutAttributes
- @discussion Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
-
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0))
@interface AVAssetVariantVideoLayoutAttributes : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @property stereoViewComponents
- @abstract Describes the stereo components. If not declared, the value will be `kCMStereoViewComponent_None`.
- In case of monoscopic content, the value will be `kCMStereoViewComponent_None` and incase of stereoscopic content, the value will be `(kCMStereoViewComponent_LeftEye | kCMStereoViewComponent_RightEye)`.
-*/
+/// Describes the stereo components. If not declared, the value will be `kCMStereoViewComponent_None`. In case of monoscopic content, the value will be `kCMStereoViewComponent_None`, and in case of stereoscopic content, the value will be `(kCMStereoViewComponent_LeftEye | kCMStereoViewComponent_RightEye)`.
@property (nonatomic, readonly) CMStereoViewComponents stereoViewComponents;
+/// Describes the video projection.
+@property (nonatomic, readonly) CMProjectionType projectionType;
+
@end
-/*!
- @class AVAssetVariantAudioAttributes
- @abstract Audio attributes for an asset variant.
- @discussion Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// Audio attributes for an asset variant.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0))
@interface AVAssetVariantAudioAttributes : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @property formatIDs
- @abstract Provides an array of audio formats present in the variant's renditions if any are declared. Each value in the array is a NSNumber representation of AudioFormatID.
- */
+/// Provides an array of audio formats present in the variant's renditions if any are declared. Each value in the array is a NSNumber representation of AudioFormatID.
@property (nonatomic, readonly) NSArray <NSNumber *> *formatIDs;
-/*!
- @method renditionSpecificAttributesForMediaOption:
- @abstract Provides attributes for a specific audio media selection option. If no rendition specific attributes are declared, it will be nil.
- @param mediaSelectionOption
- The option to return rendition specific information for.
- */
+/// Provides attributes for a specific audio media selection option. If no rendition specific attributes are declared, it will be nil.
+///
+/// - Parameter mediaSelectionOption: The option to return rendition specific information for.
- (nullable AVAssetVariantAudioRenditionSpecificAttributes *)renditionSpecificAttributesForMediaOption:(AVMediaSelectionOption *)mediaSelectionOption;
@end
-/*!
- @class AVAssetVariantAudioRenditionSpecificAttributes
- @abstract Audio rendition attributes for an asset variant.
- @discussion Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// Audio rendition attributes for an asset variant.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(8.0), visionos(1.0))
@interface AVAssetVariantAudioRenditionSpecificAttributes : NSObject
-/*!
- @property channelCount
- @abstract If it is not declared, the value will be negative.
- @discussion A channel count greater than two indicates that the variant offers a rich multichannel authoring.
- */
+/// If it is not declared, the value will be negative.
+///
+/// A channel count greater than two indicates that the variant offers a rich multichannel authoring.
@property (nonatomic, readonly) NSInteger channelCount;
-/*!
- @property binaural
- @abstract Indicates that the variant is best suited for delivery to headphones.
- @discussion A binaural variant may originate from a direct binaural recording or from the processing of a multichannel audio source.
-*/
+/// Indicates that the variant is best suited for delivery to headphones.
+///
+/// A binaural variant may originate from a direct binaural recording or from the processing of a multichannel audio source.
@property (nonatomic, readonly, getter=isBinaural) BOOL binaural API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
-/*!
- @property immersive
- @abstract Indicates that this variant contains virtualized or otherwise pre-processed audio content that is suitable for a variety of purposes.
- @discussion If a variant audio redition is immersive it is eligible for rendering either to headphones or speakers.
-*/
+/// Indicates that this variant contains virtualized or otherwise pre-processed audio content that is suitable for a variety of purposes.
+///
+/// If a variant audio rendition is immersive, it is eligible for rendering either to headphones or speakers.
@property (nonatomic, readonly, getter=isImmersive) BOOL immersive API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0));
-/*!
- @property downmix
- @abstract Indicates that this variant is declared as a downmix derivative of other media of greater channel count.
- @discussion If one or more multichannel variants are also provided, the dowmix is assumed to be compatible in its internal timing and other attributes with those variants. Typically this is because it has been derived from the same source. A downmix can be used as a suitable substitute for a multichannel variant under some conditions.
-*/
+/// Indicates that this variant is declared as a downmix derivative of other media of greater channel count.
+///
+/// If one or more multichannel variants are also provided, the downmix is assumed to be compatible in its internal timing and other attributes with those variants. Typically this is because it has been derived from the same source. A downmix can be used as a suitable substitute for a multichannel variant under some conditions.
@property (nonatomic, readonly, getter=isDownmix) BOOL downmix API_AVAILABLE(macos(13.0), ios(16.0), tvos(16.0), watchos(9.0), visionos(1.0));
@end
-/*!
- @class AVAssetVariantQualifier
- @abstract The qualifier of an asset variant.
- @discussion Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
-*/
+/// The qualifier of an asset variant.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0), watchos(10.0), visionos(1.0))
@interface AVAssetVariantQualifier : NSObject <NSCopying>
AV_INIT_UNAVAILABLE
-/*!
- @method assetVariantQualifierWithPredicate:
- @abstract Returns a qualifer for a predicate.
- @param predicate
- The variant predicate. Must be a valid, non-nil NSPredicate.
- */
+/// Returns a qualifier for a predicate.
+///
+/// - Parameter predicate: The variant predicate. Must be a valid, non-nil NSPredicate.
+ (instancetype)assetVariantQualifierWithPredicate:(nonnull NSPredicate *)predicate;
-/*!
- @method assetVariantQualifierWithVariant:
- @abstract Returns a qualifer for a particular asset variant.
- @param variant
- A variant obtained from the -[AVAsset variants] or -[AVAssetDownloadConfiguration playableVariants]. Must be a valid, non-nil AVAssetVariant.
- */
+/// Returns a qualifier for a particular asset variant.
+///
+/// - Parameter variant: A variant obtained from the -[AVAsset variants] or -[AVAssetDownloadConfiguration playableVariants]. Must be a valid, non-nil AVAssetVariant.
+ (instancetype)assetVariantQualifierWithVariant:(nonnull AVAssetVariant *)variant;
-/*!
- @method assetVariantQualifierForMinimumValueInKeyPath:
- @abstract Returns a qualifer for finding variant with minimum value in the input key path.
- @param keyPath
- AVAssetVariant keyPath. Allowed keyPath values are peakBitRate, averageBitRate, videoAttributes.presentationSize. Must be a valid, non-nil NSString.
- */
-
+/// Returns a qualifier for finding the variant with the minimum value in the input key path.
+///
+/// - Parameter keyPath: AVAssetVariant keyPath. Allowed keyPath values are peakBitRate, averageBitRate, videoAttributes.presentationSize. Must be a valid, non-nil NSString.
+ (instancetype)assetVariantQualifierForMinimumValueInKeyPath:(nonnull NSString *)keyPath API_UNAVAILABLE(macos, ios, tvos, watchos, visionos);
-/*!
- @method assetVariantQualifierForMaximumValueInKeyPath:
- @abstract Returns a qualifer for finding variant with maximum value in the input key path
- @param keyPath
- AVAssetVariant keyPath. Allowed keyPath values are peakBitRate, averageBitRate, videoAttributes.presentationSize. Must be a valid, non-nil NSString.
- */
+/// Returns a qualifier for finding the variant with the maximum value in the input key path.
+///
+/// - Parameter keyPath: AVAssetVariant keyPath. Allowed keyPath values are peakBitRate, averageBitRate, videoAttributes.presentationSize. Must be a valid, non-nil NSString.
+ (instancetype)assetVariantQualifierForMaximumValueInKeyPath:(nonnull NSString *)keyPath API_UNAVAILABLE(macos, ios, tvos, watchos, visionos);
-/*!
- @method predicateForChannelCount:mediaSelectionOption:operatorType:
- @abstract Creates a NSPredicate for audio channel count which can be used with other NSPredicates to express variant preferences.
- @param channelCount
- The RHS value for the channel count in the predicate equation.
- @param mediaSelectionOption
- The audio media selection option under consideration.
- @param operatorType
- The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
- */
+/// Creates a NSPredicate for audio channel count which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter channelCount: The RHS value for the channel count in the predicate equation.
+/// - Parameter mediaSelectionOption: The audio media selection option under consideration.
+/// - Parameter operatorType: The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
+ (NSPredicate *)predicateForChannelCount:(NSInteger)channelCount mediaSelectionOption:(nullable AVMediaSelectionOption *)mediaSelectionOption operatorType:(NSPredicateOperatorType)operatorType;
-/*!
- @method predicateForBinauralAudio:mediaSelectionOption:
- @abstract Creates a NSPredicate for binaural which can be used with other NSPredicates to express variant preferences.
- @param isBinaural
- The RHS value for the value of isBinauralAudio in the predicate equation.
- @param mediaSelectionOption
- The audio media selection option under consideration.
- */
+/// Creates a NSPredicate for binaural which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter isBinauralAudio: The RHS value for the value of isBinauralAudio in the predicate equation.
+/// - Parameter mediaSelectionOption: The audio media selection option under consideration.
+ (NSPredicate *)predicateForBinauralAudio:(BOOL)isBinauralAudio mediaSelectionOption:(nullable AVMediaSelectionOption *)mediaSelectionOption API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0));
-/*!
- @method predicateForImmersiveAudio:mediaSelectionOption:
- @abstract Creates a NSPredicate for immersive audio which can be used with other NSPredicates to express variant preferences.
- @param isImmersiveAudio
- The RHS value for the value of isImmersiveAudio in the predicate equation.
- @param mediaSelectionOption
- The audio media selection option under consideration.
- */
+/// Creates a NSPredicate for immersive audio which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter isImmersiveAudio: The RHS value for the value of isImmersiveAudio in the predicate equation.
+/// - Parameter mediaSelectionOption: The audio media selection option under consideration.
+ (NSPredicate *)predicateForImmersiveAudio:(BOOL)isImmersiveAudio mediaSelectionOption:(nullable AVMediaSelectionOption *)mediaSelectionOption API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0));
-/*!
- @method predicateForDownmixAudio:mediaSelectionOption:
- @abstract Creates a NSPredicate for immersive audio which can be used with other NSPredicates to express variant preferences.
- @param isDownmixAudio
- The RHS value for the value of isDownmixAudio in the predicate equation.
- @param mediaSelectionOption
- The audio media selection option under consideration.
- */
+/// Creates a NSPredicate for downmix audio which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter isDownmixAudio: The RHS value for the value of isDownmixAudio in the predicate equation.
+/// - Parameter mediaSelectionOption: The audio media selection option under consideration.
+ (NSPredicate *)predicateForDownmixAudio:(BOOL)isDownmixAudio mediaSelectionOption:(nullable AVMediaSelectionOption *)mediaSelectionOption API_AVAILABLE(macos(14.0), ios(17.0), tvos(17.0), watchos(10.0), visionos(1.0));
-/*!
- @method predicateForPresentationWidth:operatorType:
- @abstract Creates a NSPredicate for presentation size width which can be used with other NSPredicates to express variant preferences.
- @param width
- The RHS value for the presentation size width in the predicate equation.
- @param operatorType
- The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
- */
+/// Creates a NSPredicate for presentation size width which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter width: The RHS value for the presentation size width in the predicate equation.
+/// - Parameter operatorType: The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
+ (NSPredicate *)predicateForPresentationWidth:(CGFloat)width operatorType:(NSPredicateOperatorType)operatorType;
-/*!
- @method predicateForPresentationHeight:operatorType:
- @abstract Creates a NSPredicate for presentation size height which can be used with other NSPredicates to express variant preferences.
- @param height
- The RHS value for the presentation size height in the predicate equation.
- @param operatorType
- The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
- */
+/// Creates a NSPredicate for presentation size height which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter height: The RHS value for the presentation size height in the predicate equation.
+/// - Parameter operatorType: The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
+ (NSPredicate *)predicateForPresentationHeight:(CGFloat)height operatorType:(NSPredicateOperatorType)operatorType;
-/*!
- @method predicateForAudioSampleRate:mediaSelectionOption:operatorType:
- @abstract Creates a NSPredicate for audio sample rate which can be used with other NSPredicates to express variant preferences.
- @param sampleRate
- The RHS value for the sample rate in the predicate equation.
- @param mediaSelectionOption
- The audio media selection option under consideration.
- @param operatorType
- The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
- */
+/// Creates a NSPredicate for audio sample rate which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter sampleRate: The RHS value for the sample rate in the predicate equation.
+/// - Parameter mediaSelectionOption: The audio media selection option under consideration.
+/// - Parameter operatorType: The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
+ (NSPredicate *)predicateForAudioSampleRate:(double)sampleRate mediaSelectionOption:(nullable AVMediaSelectionOption *)mediaSelectionOption operatorType:(NSPredicateOperatorType)operatorType API_AVAILABLE(macos(15.0), ios(18.0), tvos(18.0), watchos(11.0), visionos(2.0));
-
-/*!
- @method predicateForChannelCount:operatorType:
- @abstract Creates a NSPredicate for audio channel count which can be used with other NSPredicates to express variant preferences.
- @param channelCount
- The RHS value for the channel count in the predicate equation.
- @param operatorType
- The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
- @discussion Predicate will be evaluated on the media selection option selected for the asset.
- Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
- Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
- */
+/// Creates a NSPredicate for audio channel count which can be used with other NSPredicates to express variant preferences.
+///
+/// The predicate will be evaluated on the media selection option selected for the asset.
+/// Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
+/// Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
+///
+/// - Parameter channelCount: The RHS value for the channel count in the predicate equation.
+/// - Parameter operatorType: The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
+ (NSPredicate *)predicateForChannelCount:(NSInteger)channelCount operatorType:(NSPredicateOperatorType)operatorType API_AVAILABLE(macos(15.5), ios(18.5), tvos(18.5), watchos(11.5), visionos(2.5));
-/*!
- @method predicateForBinauralAudio:
- @abstract Creates a NSPredicate for binaural which can be used with other NSPredicates to express variant preferences.
- @param isBinaural
- The RHS value for the value of isBinauralAudio in the predicate equation.
- */
+/// Creates a NSPredicate for binaural which can be used with other NSPredicates to express variant preferences.
+///
+/// - Parameter isBinauralAudio: The RHS value for the value of isBinauralAudio in the predicate equation.
+ (NSPredicate *)predicateForBinauralAudio:(BOOL)isBinauralAudio API_AVAILABLE(macos(15.5), ios(18.5), tvos(18.5), watchos(11.5), visionos(2.5));
-/*!
- @method predicateForImmersiveAudio
- @abstract Creates a NSPredicate for immersive audio which can be used with other NSPredicates to express variant preferences.
- @param isImmersiveAudio
- The RHS value for the value of isImmersiveAudio in the predicate equation.
- @discussion Predicate will be evaluated on the media selection option selected for the asset.
- Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
- Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
- */
+/// Creates a NSPredicate for immersive audio which can be used with other NSPredicates to express variant preferences.
+///
+/// The predicate will be evaluated on the media selection option selected for the asset.
+/// Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
+/// Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
+///
+/// - Parameter isImmersiveAudio: The RHS value for the value of isImmersiveAudio in the predicate equation.
+ (NSPredicate *)predicateForImmersiveAudio:(BOOL)isImmersiveAudio API_AVAILABLE(macos(15.5), ios(18.5), tvos(18.5), watchos(11.5), visionos(2.5));
-/*!
- @method predicateForDownmixAudio:mediaSelectionOption:
- @abstract Creates a NSPredicate for immersive audio which can be used with other NSPredicates to express variant preferences.
- @param isDownmixAudio
- The RHS value for the value of isDownmixAudio in the predicate equation.
- @discussion Predicate will be evaluated on the media selection option selected for the asset.
- Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
- Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
- */
+/// Creates a NSPredicate for downmix audio which can be used with other NSPredicates to express variant preferences.
+///
+/// The predicate will be evaluated on the media selection option selected for the asset.
+/// Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
+/// Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
+///
+/// - Parameter isDownmixAudio: The RHS value for the value of isDownmixAudio in the predicate equation.
+ (NSPredicate *)predicateForDownmixAudio:(BOOL)isDownmixAudio API_AVAILABLE(macos(15.5), ios(18.5), tvos(18.5), watchos(11.5), visionos(2.5));
-/*!
- @method predicateForAudioSampleRate:operatorType:
- @abstract Creates a NSPredicate for audio sample rate which can be used with other NSPredicates to express variant preferences.
- @param sampleRate
- The RHS value for the sample rate in the predicate equation.
- @param operatorType
- The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
- @discussion Predicate will be evaluated on the media selection option selected for the asset.
- Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
- Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
- */
+/// Creates a NSPredicate for audio sample rate which can be used with other NSPredicates to express variant preferences.
+///
+/// The predicate will be evaluated on the media selection option selected for the asset.
+/// Media selection options for primary assets may be specified in the AVAssetDownloadConfiguration mediaSelections property.
+/// Media selection options for interstitial assets may be circumscribed by -[AVAssetDownloadConfiguration setInterstitialMediaSelectionCriteria: forMediaCharacteristic:].
+///
+/// - Parameter sampleRate: The RHS value for the sample rate in the predicate equation.
+/// - Parameter operatorType: The valid values are NSLessThanPredicateOperatorType, NSLessThanOrEqualToPredicateOperatorType, NSGreaterThanPredicateOperatorType, NSGreaterThanOrEqualToPredicateOperatorType, NSEqualToPredicateOperatorType and NSNotEqualToPredicateOperatorType.
+ (NSPredicate *)predicateForAudioSampleRate:(double)sampleRate operatorType:(NSPredicateOperatorType)operatorType API_AVAILABLE(macos(15.5), ios(18.5), tvos(18.5), watchos(11.5), visionos(2.5));
@end
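
A sketch that combines the predicate helpers above into a single variant qualifier; the width and channel-count thresholds are arbitrary examples, and the resulting qualifier would typically be handed to a download configuration:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: build a qualifier that prefers variants at least 1280 points wide
// whose audio rendition for the given option has 6 or more channels.
// The thresholds are arbitrary examples, not recommendations.
static AVAssetVariantQualifier *ExampleQualifier(AVMediaSelectionOption *audioOption) {
    NSPredicate *wideEnough =
        [AVAssetVariantQualifier predicateForPresentationWidth:1280
                                                  operatorType:NSGreaterThanOrEqualToPredicateOperatorType];
    NSPredicate *multichannel =
        [AVAssetVariantQualifier predicateForChannelCount:6
                                     mediaSelectionOption:audioOption
                                             operatorType:NSGreaterThanOrEqualToPredicateOperatorType];
    NSPredicate *both = [NSCompoundPredicate andPredicateWithSubpredicates:@[ wideEnough, multichannel ]];
    return [AVAssetVariantQualifier assetVariantQualifierWithPredicate:both];
}
```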
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriter.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriter.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriter.h 2025-04-19 03:33:10
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriter.h 2025-05-31 21:42:12
@@ -278,7 +278,14 @@
*Passthrough is indicated when the input's output settings are nil.
*/
-- (void)addInput:(AVAssetWriterInput *)input;
+- (void)addInput:(AVAssetWriterInput *)input
+#if __swift__
+API_DEPRECATED("Use the appropriate AVAssetWriter.inputReceiver(for:...) overload for your input and optional adaptor instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
/*!
@method startWriting
@@ -295,7 +302,14 @@
On iOS, if the status of an AVAssetWriter is AVAssetWriterStatusWriting when the client app goes into the background, its status will change to AVAssetWriterStatusFailed and appending to any of its inputs will fail. You may want to use -[UIApplication beginBackgroundTaskWithExpirationHandler:] to avoid being interrupted in the middle of a writing session and to finish writing the data that has already been appended. For more information about executing code in the background, see the iOS Application Programming Guide.
*/
-- (BOOL)startWriting;
+- (BOOL)startWriting
+#if __swift__
+API_DEPRECATED("Use start() instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
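For orientation, a minimal sketch of the bring-up sequence these declarations participate in: add inputs, start writing, then open a session before appending. The output URL and input are assumed to be prepared by the caller.

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch: outputURL and videoInput are placeholders supplied by the caller.
static AVAssetWriter *StartWriter(NSURL *outputURL, AVAssetWriterInput *videoInput, NSError **outError)
{
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:outError];
    if (writer == nil) {
        return nil;
    }
    if (![writer canAddInput:videoInput]) {
        return nil;
    }
    [writer addInput:videoInput];             // must happen before -startWriting
    if (![writer startWriting]) {
        if (outError) { *outError = writer.error; }
        return nil;
    }
    [writer startSessionAtSourceTime:kCMTimeZero];
    return writer;
}
```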
/*!
@method startSessionAtSourceTime:
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h 2025-04-19 05:01:49
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h 2025-05-29 01:46:33
@@ -27,18 +27,13 @@
NS_ASSUME_NONNULL_BEGIN
-/*!
- @class AVAssetWriterInput
- @abstract
- AVAssetWriterInput defines an interface for appending either new media samples or references to existing media samples packaged as CMSampleBuffer objects to a single track of the output file of an AVAssetWriter.
-
- @discussion
- Clients that need to write multiple concurrent tracks of media data should use one AVAssetWriterInput instance per track. In order to write multiple concurrent tracks with ideal interleaving of media data, clients should observe the value returned by the readyForMoreMediaData property of each AVAssetWriterInput instance.
-
- AVAssetWriterInput also supports writing per-track metadata collections to the output file.
-
- As of macOS 10.10 and iOS 8.0 AVAssetWriterInput can also be used to create tracks that are not self-contained. Such tracks reference sample data that is located in another file. This is currently supported only for instances of AVAssetWriterInput attached to an instance of AVAssetWriter that writes files of type AVFileTypeQuickTimeMovie.
- */
+/// AVAssetWriterInput defines an interface for appending either new media samples or references to existing media samples packaged as CMSampleBuffer objects to a single track of the output file of an AVAssetWriter.
+///
+/// Clients that need to write multiple concurrent tracks of media data should use one AVAssetWriterInput instance per track. In order to write multiple concurrent tracks with ideal interleaving of media data, clients should observe the value returned by the readyForMoreMediaData property of each AVAssetWriterInput instance.
+///
+/// AVAssetWriterInput also supports writing per-track metadata collections to the output file.
+///
+/// As of macOS 10.10 and iOS 8.0 AVAssetWriterInput can also be used to create tracks that are not self-contained. Such tracks reference sample data that is located in another file. This is currently supported only for instances of AVAssetWriterInput attached to an instance of AVAssetWriter that writes files of type AVFileTypeQuickTimeMovie.
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput : NSObject
{
@@ -47,285 +42,235 @@
}
AV_INIT_UNAVAILABLE
-/*!
- @method assetWriterInputWithMediaType:outputSettings:
- @abstract
- Creates a new input of the specified media type to receive sample buffers for writing to the output file.
-
- @param mediaType
- The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
- @param outputSettings
- The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
- @result
- An instance of AVAssetWriterInput.
-
- @discussion
- Each new input accepts data for a new track of the AVAssetWriter's output file. Inputs are added to an asset writer using -[AVAssetWriter addInput:].
-
- Passing nil for output settings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable compressed format. However, if not writing to a QuickTime Movie file (i.e. the AVAssetWriter was initialized with a file type other than AVFileTypeQuickTimeMovie), AVAssetWriter only supports passing through a restricted set of media types and subtypes. In order to pass through media data to files other than AVFileTypeQuickTimeMovie, a non-NULL format hint must be provided using +assetWriterInputWithMediaType:outputSettings:sourceFormatHint: instead of this method.
-
- For AVMediaTypeAudio the following keys are not currently supported in the outputSettings dictionary: AVSampleRateConverterAudioQualityKey. When using this method to construct a new instance, an audio settings dictionary must be fully specified, meaning that it must contain AVFormatIDKey, AVSampleRateKey, and AVNumberOfChannelsKey. If no other channel layout information is available, a value of 1 for AVNumberOfChannelsKey will result in mono output and a value of 2 will result in stereo output. If AVNumberOfChannelsKey specifies a channel count greater than 2, the dictionary must also specify a value for AVChannelLayoutKey. For kAudioFormatLinearPCM, all relevant AVLinearPCM*Key keys must be included, and for kAudioFormatAppleLossless, AVEncoderBitDepthHintKey keys must be included. See +assetWriterInputWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys.
-
- For AVMediaTypeVideo, any output settings dictionary must request a compressed video format. This means that the value passed in for outputSettings must follow the rules for compressed video output, as laid out in AVVideoSettings.h. When using this method to construct a new instance, a video settings dictionary must be fully specified, meaning that it must contain AVVideoCodecKey, AVVideoWidthKey, and AVVideoHeightKey. See +assetWriterInputWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys. On iOS, the only values currently supported for AVVideoCodecKey are AVVideoCodecTypeH264 and AVVideoCodecTypeJPEG. AVVideoCodecTypeH264 is not supported on iPhone 3G. For AVVideoScalingModeKey, the value AVVideoScalingModeFit is not supported.
- */
+/// Creates a new input of the specified media type to receive sample buffers for writing to the output file.
+///
+/// Each new input accepts data for a new track of the AVAssetWriter's output file. Inputs are added to an asset writer using -[AVAssetWriter addInput:].
+///
+/// Passing nil for output settings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable compressed format. However, if not writing to a QuickTime Movie file (i.e. the AVAssetWriter was initialized with a file type other than AVFileTypeQuickTimeMovie), AVAssetWriter only supports passing through a restricted set of media types and subtypes. In order to pass through media data to files other than AVFileTypeQuickTimeMovie, a non-NULL format hint must be provided using +assetWriterInputWithMediaType:outputSettings:sourceFormatHint: instead of this method.
+///
+/// For AVMediaTypeAudio the following keys are not currently supported in the outputSettings dictionary: AVSampleRateConverterAudioQualityKey. When using this method to construct a new instance, an audio settings dictionary must be fully specified, meaning that it must contain AVFormatIDKey, AVSampleRateKey, and AVNumberOfChannelsKey. If no other channel layout information is available, a value of 1 for AVNumberOfChannelsKey will result in mono output and a value of 2 will result in stereo output. If AVNumberOfChannelsKey specifies a channel count greater than 2, the dictionary must also specify a value for AVChannelLayoutKey. For kAudioFormatLinearPCM, all relevant AVLinearPCM*Key keys must be included, and for kAudioFormatAppleLossless, AVEncoderBitDepthHintKey keys must be included. See +assetWriterInputWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys.
+///
+/// For AVMediaTypeVideo, any output settings dictionary must request a compressed video format. This means that the value passed in for outputSettings must follow the rules for compressed video output, as laid out in AVVideoSettings.h. When using this method to construct a new instance, a video settings dictionary must be fully specified, meaning that it must contain AVVideoCodecKey, AVVideoWidthKey, and AVVideoHeightKey. See +assetWriterInputWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys. On iOS, the only values currently supported for AVVideoCodecKey are AVVideoCodecTypeH264 and AVVideoCodecTypeJPEG. AVVideoCodecTypeH264 is not supported on iPhone 3G. For AVVideoScalingModeKey, the value AVVideoScalingModeFit is not supported.
+///
+/// - Parameter mediaType: The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
+/// - Parameter outputSettings: The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
+///
+/// - Returns: An instance of AVAssetWriterInput.
+ (instancetype)assetWriterInputWithMediaType:(AVMediaType)mediaType outputSettings:(nullable NSDictionary<NSString *, id> *)outputSettings;
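A minimal sketch of fully specified output settings dictionaries satisfying the key requirements listed above; the codec, sample rate, and dimensions are illustrative choices, not requirements:

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch of fully specified settings when no source format hint is given.
static NSArray<AVAssetWriterInput *> *MakeWriterInputs(void)
{
    NSDictionary *audioSettings = @{
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVSampleRateKey: @44100.0,
        AVNumberOfChannelsKey: @2,   // 1 or 2 channels need no AVChannelLayoutKey
    };
    NSDictionary *videoSettings = @{
        AVVideoCodecKey: AVVideoCodecTypeH264,
        AVVideoWidthKey: @1920,
        AVVideoHeightKey: @1080,
    };
    return @[
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioSettings],
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings],
    ];
}
```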
-/*!
- @method assetWriterInputWithMediaType:outputSettings:sourceFormatHint:
- @abstract
- Creates a new input of the specified media type to receive sample buffers for writing to the output file.
-
- @param mediaType
- The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
- @param outputSettings
- The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
- @param sourceFormatHint
- A hint about the format of media data that will be appended to the new input.
- @result
- An instance of AVAssetWriterInput.
-
- @discussion
- A version of +assetWriterInputWithMediaType:outputSettings: that includes the ability to hint at the format of media data that will be appended to the new instance of AVAssetWriterInput. When a source format hint is provided, the outputSettings dictionary is not required to be fully specified. For AVMediaTypeAudio, this means that AVFormatIDKey is the only required key. For AVMediaTypeVideo, this means that AVVideoCodecKey is the only required key. Values for the remaining keys will be chosen by the asset writer input, with consideration given to the attributes of the source format. To guarantee successful file writing, clients who specify a format hint should ensure that subsequently-appended buffers are of the specified format.
-
- This method throws an exception for any of the following reasons:
- - the media type of the format description does not match the media type passed into this method
- - the width and height of video format hint are not positive
- - the output settings do not match the supplied media type
- - for video inputs, the output settings do not contain a required key (AVVideoCodecKey, AVVideoWidthKey, AVVideoHeightKey)
- - the output scaling mode is AVVideoScalingModeFit
- - the output settings contain AVSampleRateConverterAudioQualityKey or AVVideoDecompressionPropertiesKey
- */
+/// Creates a new input of the specified media type to receive sample buffers for writing to the output file.
+///
+/// A version of +assetWriterInputWithMediaType:outputSettings: that includes the ability to hint at the format of media data that will be appended to the new instance of AVAssetWriterInput. When a source format hint is provided, the outputSettings dictionary is not required to be fully specified. For AVMediaTypeAudio, this means that AVFormatIDKey is the only required key. For AVMediaTypeVideo, this means that AVVideoCodecKey is the only required key. Values for the remaining keys will be chosen by the asset writer input, with consideration given to the attributes of the source format. To guarantee successful file writing, clients who specify a format hint should ensure that subsequently-appended buffers are of the specified format.
+///
+/// This method throws an exception for any of the following reasons:
+/// - the media type of the format description does not match the media type passed into this method
+/// - the width and height of video format hint are not positive
+/// - the output settings do not match the supplied media type
+/// - for video inputs, the output settings do not contain a required key (AVVideoCodecKey, AVVideoWidthKey, AVVideoHeightKey)
+/// - the output scaling mode is AVVideoScalingModeFit
+/// - the output settings contain AVSampleRateConverterAudioQualityKey or AVVideoDecompressionPropertiesKey
+///
+/// - Parameter mediaType: The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
+/// - Parameter outputSettings: The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
+/// - Parameter sourceFormatHint: A hint about the format of media data that will be appended to the new input.
+///
+/// - Returns: An instance of AVAssetWriterInput.
+ (instancetype)assetWriterInputWithMediaType:(AVMediaType)mediaType outputSettings:(nullable NSDictionary<NSString *, id> *)outputSettings sourceFormatHint:(nullable CMFormatDescriptionRef)sourceFormatHint API_AVAILABLE(macos(10.8), ios(6.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
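A minimal sketch of the format-hint variant, assuming the hint is taken from the first sample buffer the caller intends to append; per the relaxed requirements above, only AVVideoCodecKey is supplied:

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch: firstSampleBuffer is a placeholder for the first buffer the caller will append.
static AVAssetWriterInput *MakeVideoInputWithHint(CMSampleBufferRef firstSampleBuffer)
{
    CMFormatDescriptionRef format = CMSampleBufferGetFormatDescription(firstSampleBuffer);
    return [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                              outputSettings:@{ AVVideoCodecKey: AVVideoCodecTypeH264 }
                                            sourceFormatHint:format];
}
```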
-/*!
- @method initWithMediaType:outputSettings:
- @abstract
- Creates a new input of the specified media type to receive sample buffers for writing to the output file.
-
- @param mediaType
- The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
- @param outputSettings
- The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
- @result
- An instance of AVAssetWriterInput.
-
- @discussion
- Each new input accepts data for a new track of the AVAssetWriter's output file. Inputs are added to an asset writer using -[AVAssetWriter addInput:].
-
- Passing nil for output settings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable compressed format. However, if not writing to a QuickTime Movie file (i.e. the AVAssetWriter was initialized with a file type other than AVFileTypeQuickTimeMovie), AVAssetWriter only supports passing through a restricted set of media types and subtypes. In order to pass through media data to files other than AVFileTypeQuickTimeMovie, a non-NULL format hint must be provided using -initWithMediaType:outputSettings:sourceFormatHint: instead of this method.
-
- For AVMediaTypeAudio the following keys are not currently supported in the outputSettings dictionary: AVSampleRateConverterAudioQualityKey. When using this initializer, an audio settings dictionary must be fully specified, meaning that it must contain AVFormatIDKey, AVSampleRateKey, and AVNumberOfChannelsKey. If no other channel layout information is available, a value of 1 for AVNumberOfChannelsKey will result in mono output and a value of 2 will result in stereo output. If AVNumberOfChannelsKey specifies a channel count greater than 2, the dictionary must also specify a value for AVChannelLayoutKey. For kAudioFormatLinearPCM, all relevant AVLinearPCM*Key keys must be included, and for kAudioFormatAppleLossless, AVEncoderBitDepthHintKey keys must be included. See -initWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys.
-
- For AVMediaTypeVideo, any output settings dictionary must request a compressed video format. This means that the value passed in for outputSettings must follow the rules for compressed video output, as laid out in AVVideoSettings.h. When using this initializer, a video settings dictionary must be fully specified, meaning that it must contain AVVideoCodecKey, AVVideoWidthKey, and AVVideoHeightKey. See -initWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys. On iOS, the only values currently supported for AVVideoCodecKey are AVVideoCodecTypeH264 and AVVideoCodecTypeJPEG. AVVideoCodecTypeH264 is not supported on iPhone 3G. For AVVideoScalingModeKey, the value AVVideoScalingModeFit is not supported.
-
- This method throws an exception for any of the following reasons:
- - the media type of the format description does not match the media type passed into this method
- - the output settings do not match the supplied media type
- - for video inputs, the output settings do not contain a required key (AVVideoCodecKey, AVVideoWidthKey, AVVideoHeightKey)
- - the output scaling mode is AVVideoScalingModeFit
- - the output settings contain AVSampleRateConverterAudioQualityKey or AVVideoDecompressionPropertiesKey
- */
+/// Creates a new input of the specified media type to receive sample buffers for writing to the output file.
+///
+/// Each new input accepts data for a new track of the AVAssetWriter's output file. Inputs are added to an asset writer using -[AVAssetWriter addInput:].
+///
+/// Passing nil for output settings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable compressed format. However, if not writing to a QuickTime Movie file (i.e. the AVAssetWriter was initialized with a file type other than AVFileTypeQuickTimeMovie), AVAssetWriter only supports passing through a restricted set of media types and subtypes. In order to pass through media data to files other than AVFileTypeQuickTimeMovie, a non-NULL format hint must be provided using -initWithMediaType:outputSettings:sourceFormatHint: instead of this method.
+///
+/// For AVMediaTypeAudio the following keys are not currently supported in the outputSettings dictionary: AVSampleRateConverterAudioQualityKey. When using this initializer, an audio settings dictionary must be fully specified, meaning that it must contain AVFormatIDKey, AVSampleRateKey, and AVNumberOfChannelsKey. If no other channel layout information is available, a value of 1 for AVNumberOfChannelsKey will result in mono output and a value of 2 will result in stereo output. If AVNumberOfChannelsKey specifies a channel count greater than 2, the dictionary must also specify a value for AVChannelLayoutKey. For kAudioFormatLinearPCM, all relevant AVLinearPCM*Key keys must be included, and for kAudioFormatAppleLossless, AVEncoderBitDepthHintKey keys must be included. See -initWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys.
+///
+/// For AVMediaTypeVideo, any output settings dictionary must request a compressed video format. This means that the value passed in for outputSettings must follow the rules for compressed video output, as laid out in AVVideoSettings.h. When using this initializer, a video settings dictionary must be fully specified, meaning that it must contain AVVideoCodecKey, AVVideoWidthKey, and AVVideoHeightKey. See -initWithMediaType:outputSettings:sourceFormatHint: for a way to avoid having to specify a value for each of those keys. On iOS, the only values currently supported for AVVideoCodecKey are AVVideoCodecTypeH264 and AVVideoCodecTypeJPEG. AVVideoCodecTypeH264 is not supported on iPhone 3G. For AVVideoScalingModeKey, the value AVVideoScalingModeFit is not supported.
+///
+/// This method throws an exception for any of the following reasons:
+/// - the media type of the format description does not match the media type passed into this method
+/// - the output settings do not match the supplied media type
+/// - for video inputs, the output settings do not contain a required key (AVVideoCodecKey, AVVideoWidthKey, AVVideoHeightKey)
+/// - the output scaling mode is AVVideoScalingModeFit
+/// - the output settings contain AVSampleRateConverterAudioQualityKey or AVVideoDecompressionPropertiesKey
+///
+/// - Parameter mediaType: The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
+/// - Parameter outputSettings: The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
+///
+/// - Returns: An instance of AVAssetWriterInput.
- (instancetype)initWithMediaType:(AVMediaType)mediaType outputSettings:(nullable NSDictionary<NSString *, id> *)outputSettings;
-/*!
- @method initWithMediaType:outputSettings:sourceFormatHint:
- @abstract
- Creates a new input of the specified media type to receive sample buffers for writing to the output file. This is the designated initializer of AVAssetWriterInput.
-
- @param mediaType
- The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
- @param outputSettings
- The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
- @param sourceFormatHint
- A hint about the format of media data that will be appended to the new input.
- @result
- An instance of AVAssetWriterInput.
-
- @discussion
- A version of -initWithMediaType:outputSettings: that includes the ability to hint at the format of media data that will be appended to the new instance of AVAssetWriterInput. When a source format hint is provided, the outputSettings dictionary is not required to be fully specified. For AVMediaTypeAudio, this means that AVFormatIDKey is the only required key. For AVMediaTypeVideo, this means that AVVideoCodecKey is the only required key. Values for the remaining keys will be chosen by the asset writer input, with consideration given to the attributes of the source format. To guarantee successful file writing, clients who specify a format hint should ensure that subsequently-appended buffers are of the specified format.
-
- This method throws an exception for any of the following reasons:
- - the media type of the format description does not match the media type passed into this method
- - the width and height of video format hint are not positive
- - the output settings do not match the supplied media type
- - for video inputs, the output settings do not contain a required key (AVVideoCodecKey, AVVideoWidthKey, AVVideoHeightKey)
- - the output scaling mode is AVVideoScalingModeFit
- - the output settings contain AVSampleRateConverterAudioQualityKey or AVVideoDecompressionPropertiesKey
- */
+/// Creates a new input of the specified media type to receive sample buffers for writing to the output file. This is the designated initializer of AVAssetWriterInput.
+///
+/// A version of -initWithMediaType:outputSettings: that includes the ability to hint at the format of media data that will be appended to the new instance of AVAssetWriterInput. When a source format hint is provided, the outputSettings dictionary is not required to be fully specified. For AVMediaTypeAudio, this means that AVFormatIDKey is the only required key. For AVMediaTypeVideo, this means that AVVideoCodecKey is the only required key. Values for the remaining keys will be chosen by the asset writer input, with consideration given to the attributes of the source format. To guarantee successful file writing, clients who specify a format hint should ensure that subsequently-appended buffers are of the specified format.
+///
+/// This method throws an exception for any of the following reasons:
+/// - the media type of the format description does not match the media type passed into this method
+/// - the width and height of video format hint are not positive
+/// - the output settings do not match the supplied media type
+/// - for video inputs, the output settings do not contain a required key (AVVideoCodecKey, AVVideoWidthKey, AVVideoHeightKey)
+/// - the output scaling mode is AVVideoScalingModeFit
+/// - the output settings contain AVSampleRateConverterAudioQualityKey or AVVideoDecompressionPropertiesKey
+///
+/// - Parameter mediaType: The media type of samples that will be accepted by the input. Media types are defined in AVMediaFormat.h.
+/// - Parameter outputSettings: The settings used for encoding the media appended to the output. See AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo and for more information on how to construct an output settings dictionary. If you only require simple preset-based output settings, see AVOutputSettingsAssistant.
+/// - Parameter sourceFormatHint: A hint about the format of media data that will be appended to the new input.
+///
+/// - Returns: An instance of AVAssetWriterInput.
- (instancetype)initWithMediaType:(AVMediaType)mediaType outputSettings:(nullable NSDictionary<NSString *, id> *)outputSettings sourceFormatHint:(nullable CMFormatDescriptionRef)sourceFormatHint API_AVAILABLE(macos(10.8), ios(6.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos) NS_DESIGNATED_INITIALIZER;
-/*!
- @property mediaType
- @abstract
- The media type of the samples that can be appended to the receiver.
-
- @discussion
- The value of this property is one of the media types defined in AVMediaFormat.h.
- */
+/// The media type of the samples that can be appended to the receiver.
+///
+/// The value of this property is one of the media types defined in AVMediaFormat.h.
@property (nonatomic, readonly) AVMediaType mediaType;
-/*!
- @property outputSettings
- @abstract
- The settings used for encoding the media appended to the output.
-
- @discussion
- The value of this property is an NSDictionary that contains values for keys as specified by either AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo. A value of nil indicates that the receiver will pass through appended samples, doing no processing before they are written to the output file.
-*/
+/// The settings used for encoding the media appended to the output.
+///
+/// The value of this property is an NSDictionary that contains values for keys as specified by either AVAudioSettings.h for AVMediaTypeAudio or AVVideoSettings.h for AVMediaTypeVideo. A value of nil indicates that the receiver will pass through appended samples, doing no processing before they are written to the output file.
@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> *outputSettings;
-/*!
- @property sourceFormatHint
- @abstract
- The hint given at initialization time about the format of incoming media data.
-
- @discussion
- AVAssetWriterInput may be able to use this hint to fill in missing output settings or perform more upfront validation. To guarantee successful file writing, clients who specify a format hint should ensure that subsequently-appended media data are of the specified format.
- */
+/// The hint given at initialization time about the format of incoming media data.
+///
+/// AVAssetWriterInput may be able to use this hint to fill in missing output settings or perform more upfront validation. To guarantee successful file writing, clients who specify a format hint should ensure that subsequently-appended media data are of the specified format.
@property (nonatomic, readonly, nullable) __attribute__((NSObject)) CMFormatDescriptionRef sourceFormatHint API_AVAILABLE(macos(10.8), ios(6.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property metadata
- @abstract
- A collection of metadata to be written to the track corresponding to the receiver.
-
- @discussion
- The value of this property is an array of AVMetadataItem objects representing the collection of track-level metadata to be written in the output file.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
- */
+/// A collection of metadata to be written to the track corresponding to the receiver.
+///
+/// The value of this property is an array of AVMetadataItem objects representing the collection of track-level metadata to be written in the output file.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
@property (nonatomic, copy) NSArray<AVMetadataItem *> *metadata;
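A minimal sketch of setting track-level metadata before writing starts; the identifier and title string are illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch: attach a track-level title; must be done before -startWriting.
static void AttachTrackTitle(AVAssetWriterInput *input)
{
    AVMutableMetadataItem *title = [AVMutableMetadataItem metadataItem];
    title.identifier = AVMetadataCommonIdentifierTitle;
    title.value = @"Director's commentary";
    input.metadata = @[ title ];
}
```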
-/*!
- @property readyForMoreMediaData
- @abstract
- Indicates the readiness of the input to accept more media data.
-
- @discussion
- When there are multiple inputs, AVAssetWriter tries to write media data in an ideal interleaving pattern for efficiency in storage and playback. Each of its inputs signals its readiness to receive media data for writing according to that pattern via the value of readyForMoreMediaData. You can append media data to an input only while its readyForMoreMediaData property is YES.
-
- Clients writing media data from a non-real-time source, such as an instance of AVAssetReader, should hold off on generating or obtaining more media data to append to an input when the value of readyForMoreMediaData is NO. To help with control of the supply of non-real-time media data, such clients can use -requestMediaDataWhenReadyOnQueue:usingBlock in order to specify a block that the input should invoke whenever it's ready for input to be appended.
+/// Indicates the readiness of the input to accept more media data.
+///
+/// When there are multiple inputs, AVAssetWriter tries to write media data in an ideal interleaving pattern for efficiency in storage and playback. Each of its inputs signals its readiness to receive media data for writing according to that pattern via the value of readyForMoreMediaData. You can append media data to an input only while its readyForMoreMediaData property is YES.
+///
+/// Clients writing media data from a non-real-time source, such as an instance of AVAssetReader, should hold off on generating or obtaining more media data to append to an input when the value of readyForMoreMediaData is NO. To help with control of the supply of non-real-time media data, such clients can use -requestMediaDataWhenReadyOnQueue:usingBlock in order to specify a block that the input should invoke whenever it's ready for input to be appended.
+///
+/// Clients writing media data from a real-time source, such as an instance of AVCaptureOutput, should set the input's expectsMediaDataInRealTime property to YES to ensure that the value of readyForMoreMediaData is calculated appropriately. When expectsMediaDataInRealTime is YES, readyForMoreMediaData will become NO only when the input cannot process media samples as quickly as they are being provided by the client. If readyForMoreMediaData becomes NO for a real-time source, the client may need to drop samples or consider reducing the data rate of appended samples.
+///
+/// When the value of canPerformMultiplePasses is YES for any input attached to this input's asset writer, the value for this property may start as NO and/or be NO for long periods of time.
+///
+/// The value of readyForMoreMediaData will often change from NO to YES asynchronously, as previously supplied media data is processed and written to the output. It is possible for all of an AVAssetWriter's AVAssetWriterInputs temporarily to return NO for readyForMoreMediaData.
+///
+/// This property is key value observable. Observers should not assume that they will be notified of changes on a specific thread.
+@property (nonatomic, readonly, getter=isReadyForMoreMediaData) BOOL readyForMoreMediaData
+#if __swift__
+API_DEPRECATED("Use the input receiver's async append(...) method instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
- Clients writing media data from a real-time source, such as an instance of AVCaptureOutput, should set the input's expectsMediaDataInRealTime property to YES to ensure that the value of readyForMoreMediaData is calculated appropriately. When expectsMediaDataInRealTime is YES, readyForMoreMediaData will become NO only when the input cannot process media samples as quickly as they are being provided by the client. If readyForMoreMediaData becomes NO for a real-time source, the client may need to drop samples or consider reducing the data rate of appended samples.
-
- When the value of canPerformMultiplePasses is YES for any input attached to this input's asset writer, the value for this property may start as NO and/or be NO for long periods of time.
-
- The value of readyForMoreMediaData will often change from NO to YES asynchronously, as previously supplied media data is processed and written to the output. It is possible for all of an AVAssetWriter's AVAssetWriterInputs temporarily to return NO for readyForMoreMediaData.
-
- This property is key value observable. Observers should not assume that they will be notified of changes on a specific thread.
- */
-@property (nonatomic, readonly, getter=isReadyForMoreMediaData) BOOL readyForMoreMediaData;
+/// Indicates whether the input should tailor its processing of media data for real-time sources.
+///
+/// Clients appending media data to an input from a real-time source, such as an AVCaptureOutput, should set expectsMediaDataInRealTime to YES. This will ensure that readyForMoreMediaData is calculated appropriately for real-time usage.
+///
+/// For best results, do not set both this property and performsMultiPassEncodingIfSupported to YES.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
+@property (nonatomic) BOOL expectsMediaDataInRealTime
+#if __swift__
+API_DEPRECATED("Use the input receiver's appendImmediately(...) method instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
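A minimal sketch of a real-time, push-style source honoring this property's contract, assuming a capture delegate class; the class name and videoInput property are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch: real-time appends should never block; drop frames when the input is busy.
@interface CaptureRecorder : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVAssetWriterInput *videoInput;
- (void)attachVideoInput:(AVAssetWriterInput *)input;
@end

@implementation CaptureRecorder

- (void)attachVideoInput:(AVAssetWriterInput *)input
{
    input.expectsMediaDataInRealTime = YES; // real-time pacing for readyForMoreMediaData
    self.videoInput = input;
}

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (self.videoInput.isReadyForMoreMediaData) {
        [self.videoInput appendSampleBuffer:sampleBuffer];
    }
    // Otherwise drop the frame rather than blocking the capture callback.
}

@end
```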
-/*!
- @property expectsMediaDataInRealTime
- @abstract
- Indicates whether the input should tailor its processing of media data for real-time sources.
+/// Instructs the receiver to invoke a client-supplied block repeatedly, at its convenience, in order to gather media data for writing to the output file.
+///
+/// The block should append media data to the input either until the input's readyForMoreMediaData property becomes NO or until there is no more media data to supply (at which point it may choose to mark the input as finished via -markAsFinished). The block should then exit. After the block exits, if the input has not been marked as finished, once the input has processed the media data it has received and becomes ready for more media data again, it will invoke the block again in order to obtain more.
+///
+/// A typical use of this method, with a block that supplies media data to an input while respecting the input's readyForMoreMediaData property, might look like this:
+/// ```objc
+/// [myAVAssetWriterInput requestMediaDataWhenReadyOnQueue:myInputSerialQueue usingBlock:^{
+/// while ([myAVAssetWriterInput isReadyForMoreMediaData])
+/// {
+/// CMSampleBufferRef nextSampleBuffer = [self copyNextSampleBufferToWrite];
+/// if (nextSampleBuffer)
+/// {
+/// [myAVAssetWriterInput appendSampleBuffer:nextSampleBuffer];
+/// CFRelease(nextSampleBuffer);
+/// }
+/// else
+/// {
+/// [myAVAssetWriterInput markAsFinished];
+/// break;
+/// }
+/// }
+/// }];
+/// ```
+/// This method is not recommended for use with a push-style buffer source, such as AVCaptureAudioDataOutput or AVCaptureVideoDataOutput, because such a combination will likely require intermediate queueing of buffers. Instead, this method is better suited to a pull-style buffer source such as AVAssetReaderOutput, as illustrated in the above example.
+///
+/// When using a push-style buffer source, it is generally better to immediately append each buffer to the AVAssetWriterInput, directly via -[AVAssetWriterInput appendSampleBuffer:], as it is received. Using this strategy, it is often possible to avoid having to queue up buffers in between the buffer source and the AVAssetWriterInput. Note that many of these push-style buffer sources also produce buffers in real-time, in which case the client should set expectsMediaDataInRealTime to YES.

+///
+/// Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
+///
+/// This method throws an exception if this method is called more than once.
+///
+/// - Parameter queue: The queue on which the block should be invoked.
+/// - Parameter block: The block the input should invoke to obtain media data.
+- (void)requestMediaDataWhenReadyOnQueue:(dispatch_queue_t)queue usingBlock:(void (^ NS_SWIFT_SENDABLE)(void))block
+#if __swift__
+API_DEPRECATED("Use the input receiver's async append(...) method on its own task instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
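A minimal sketch of the pull-style pattern the example above recommends, wiring an AVAssetReaderTrackOutput to the writer input; the track, input, and serial queue are assumed to be prepared by the caller, and the writer must already be writing with an open session:

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch: pull samples from an asset reader output on the serial queue.
static BOOL TransferTrack(AVAssetTrack *track,
                          AVAssetWriterInput *writerInput,
                          dispatch_queue_t queue,
                          NSError **outError)
{
    if (track.asset == nil) {
        return NO;
    }
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:track.asset error:outError];
    if (reader == nil) {
        return NO;
    }
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:nil];
    [reader addOutput:output];
    if (![reader startReading]) {
        if (outError) { *outError = reader.error; }
        return NO;
    }
    [writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while (writerInput.isReadyForMoreMediaData) {
            CMSampleBufferRef buffer = [output copyNextSampleBuffer];
            if (buffer) {
                [writerInput appendSampleBuffer:buffer];
                CFRelease(buffer);
            } else {
                [writerInput markAsFinished]; // no more samples for this track
                break;
            }
        }
    }];
    return YES;
}
```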
- @discussion
- Clients appending media data to an input from a real-time source, such as an AVCaptureOutput, should set expectsMediaDataInRealTime to YES. This will ensure that readyForMoreMediaData is calculated appropriately for real-time usage.
-
- For best results, do not set both this property and performsMultiPassEncodingIfSupported to YES.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
- */
-@property (nonatomic) BOOL expectsMediaDataInRealTime;
+/// Appends samples to the receiver.
+///
+/// The timing information in the sample buffer, considered relative to the time passed to -[AVAssetWriter startSessionAtSourceTime:], will be used to determine the timing of those samples in the output file.
+///
+/// For track types other than audio tracks, to determine the duration of all samples in the output file other than the very last sample that's appended, the difference between the sample buffer's output DTS and the following sample buffer's output DTS will be used. The duration of the last sample is determined as follows:
+/// 1. If a marker sample buffer with kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration is appended following the last media-bearing sample, the difference between the output DTS of the marker sample buffer and the output DTS of the last media-bearing sample will be used.
+/// 2. If the marker sample buffer is not provided and if the output duration of the last media-bearing sample is valid, it will be used.
+/// 3. If the output duration of the last media-bearing sample is not valid, the duration of the second-to-last sample will be used.
+///
+/// For audio tracks, the properties of each appended sample buffer are used to determine corresponding output durations.
+///
+/// The receiver will retain the CMSampleBuffer until it is done with it, and then release it. Do not modify a CMSampleBuffer or its contents after you have passed it to this method.
+///
+/// If the sample buffer contains audio data and the AVAssetWriterInput was initialized with an outputSettings dictionary then the format must be linear PCM. If the outputSettings dictionary was nil then audio data can be provided in a compressed format, and it will be passed through to the output without any re-compression. Note that advanced formats like AAC will have encoder delay present in their bitstreams. This data is inserted by the encoder and is necessary for proper decoding, but it is not meant to be played back. Clients who provide compressed audio bitstreams must use kCMSampleBufferAttachmentKey_TrimDurationAtStart to mark the encoder delay (generally restricted to the first sample buffer). Packetization can cause there to be extra audio frames in the last packet which are not meant to be played back. These remainder frames should be marked with kCMSampleBufferAttachmentKey_TrimDurationAtEnd. CMSampleBuffers obtained from AVAssetReader will already have the necessary trim attachments. Please see http://developer.apple.com/mac/library/technotes/tn2009/tn2258.html for more information about encoder delay. When attaching trims make sure that the output PTS of the sample buffer is what you expect. For example if you called -[AVAssetWriter startSessionAtSourceTime:kCMTimeZero] and you want your audio to start at time zero in the output file then make sure that the output PTS of the first non-fully trimmed audio sample buffer is kCMTimeZero.
+///
+/// If the sample buffer contains a CVPixelBuffer then the choice of pixel format will affect the performance and quality of the encode. For optimal performance the format of the pixel buffer should match one of the native formats supported by the selected video encoder. Below are some recommendations:
+///
+/// The H.264 and HEVC encoders natively support kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, which should be used with 8-bit 4:2:0 video and full range input respectively; other related pixel formats in CoreVideo/CVPixelBuffer.h are ideal for 4:2:2 and 4:4:4 (and for HEVC, 10-bit). The JPEG encoder on iOS and Apple Silicon macOS natively supports kCVPixelFormatType_422YpCbCr8FullRange. If you need to work in the RGB domain then kCVPixelFormatType_32BGRA is recommended on iOS and macOS.
+///
+/// Pixel buffers not in a natively supported format will be converted internally prior to encoding when possible. Pixel format conversions within the same range (video or full) are generally faster than conversions between different ranges.
+///
+/// The ProRes encoders can preserve high bit depth sources, supporting up to 12bits/ch. ProRes 4444 can contain a mathematically lossless alpha channel and it doesn't do any chroma subsampling. This makes ProRes 4444 ideal for quality critical applications. If you are working with 8bit sources ProRes is also a good format to use due to its high image quality. Use either of the recommended pixel formats above. Note that RGB pixel formats by definition have 4:4:4 chroma sampling.
+///
+/// If you are working with high bit depth sources the following yuv pixel formats are recommended when encoding to ProRes: kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, and kCVPixelFormatType_422YpCbCr10. When working in the RGB domain kCVPixelFormatType_64ARGB is recommended. Scaling and color matching are not currently supported when using AVAssetWriter with any of these high bit depth pixel formats. Please make sure that your track's output settings dictionary specifies the same width and height as the buffers you will be appending. Do not include AVVideoScalingModeKey or AVVideoColorPropertiesKey.
+///
+/// As of macOS 10.10 and iOS 8.0, this method can be used to add sample buffers that reference existing data in a file instead of containing media data to be appended to the file. This can be used to generate tracks that are not self-contained. In order to append such a sample reference to the track create a CMSampleBufferRef with a NULL dataBuffer and dataReady set to true and set the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments on the sample buffer. Further documentation on how to create such a "sample reference" sample buffer can be found in the description of the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachment keys in the CMSampleBuffer documentation.
+///
+/// Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
+///
+/// This method throws an exception if the sample buffer's media type does not match the asset writer input's media type.
+///
+/// - Parameter sampleBuffer: The CMSampleBuffer to be appended.
+///
+/// - Returns: A BOOL value indicating success of appending the sample buffer. If a result of NO is returned, clients can check the value of AVAssetWriter.status to determine whether the writing operation completed, failed, or was cancelled. If the status is AVAssetWriterStatusFailed, AVAssetWriter.error will contain an instance of NSError that describes the failure.
+- (BOOL)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer
+#if __swift__
+API_DEPRECATED("Use SampleBufferReceiver.append(_:) instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
+API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
+;
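A minimal sketch of the trim attachment described above for pass-through AAC; the 2112-frame priming value is a common default and should be replaced with the delay the encoder actually reports:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Hedged sketch: mark encoder delay on the first compressed audio buffer before appending it.
static void MarkEncoderDelay(CMSampleBufferRef firstAudioBuffer, double sampleRate)
{
    CMTime trim = CMTimeMake(2112, (int32_t)sampleRate);              // typical AAC priming
    CFDictionaryRef trimDict = CMTimeCopyAsDictionary(trim, kCFAllocatorDefault);
    CMSetAttachment(firstAudioBuffer,
                    kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                    trimDict,
                    kCMAttachmentMode_ShouldNotPropagate);             // trim applies to this buffer only
    CFRelease(trimDict);
}
```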
-/*!
- @method requestMediaDataWhenReadyOnQueue:usingBlock:
- @abstract
- Instructs the receiver to invoke a client-supplied block repeatedly, at its convenience, in order to gather media data for writing to the output file.
-
- @param queue
- The queue on which the block should be invoked.
- @param block
- The block the input should invoke to obtain media data.
-
- @discussion
- The block should append media data to the input either until the input's readyForMoreMediaData property becomes NO or until there is no more media data to supply (at which point it may choose to mark the input as finished via -markAsFinished). The block should then exit. After the block exits, if the input has not been marked as finished, once the input has processed the media data it has received and becomes ready for more media data again, it will invoke the block again in order to obtain more.
-
- A typical use of this method, with a block that supplies media data to an input while respecting the input's readyForMoreMediaData property, might look like this:
-
- [myAVAssetWriterInput requestMediaDataWhenReadyOnQueue:myInputSerialQueue usingBlock:^{
- while ([myAVAssetWriterInput isReadyForMoreMediaData])
- {
- CMSampleBufferRef nextSampleBuffer = [self copyNextSampleBufferToWrite];
- if (nextSampleBuffer)
- {
- [myAVAssetWriterInput appendSampleBuffer:nextSampleBuffer];
- CFRelease(nextSampleBuffer);
- }
- else
- {
- [myAVAssetWriterInput markAsFinished];
- break;
- }
- }
- }];
-
- This method is not recommended for use with a push-style buffer source, such as AVCaptureAudioDataOutput or AVCaptureVideoDataOutput, because such a combination will likely require intermediate queueing of buffers. Instead, this method is better suited to a pull-style buffer source such as AVAssetReaderOutput, as illustrated in the above example.
-
- When using a push-style buffer source, it is generally better to immediately append each buffer to the AVAssetWriterInput, directly via -[AVAssetWriter appendSampleBuffer:], as it is received. Using this strategy, it is often possible to avoid having to queue up buffers in between the buffer source and the AVAssetWriterInput. Note that many of these push-style buffer sources also produce buffers in real-time, in which case the client should set expectsMediaDataInRealTime to YES.
-
- Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
-
- This method throws an exception if this method is called more than once.
- */
-- (void)requestMediaDataWhenReadyOnQueue:(dispatch_queue_t)queue usingBlock:(void (^ NS_SWIFT_SENDABLE)(void))block;
-
-/*!
- @method appendSampleBuffer:
- @abstract
- Appends samples to the receiver.
-
- @param sampleBuffer
- The CMSampleBuffer to be appended.
- @result
- A BOOL value indicating success of appending the sample buffer. If a result of NO is returned, clients can check the value of AVAssetWriter.status to determine whether the writing operation completed, failed, or was cancelled. If the status is AVAssetWriterStatusFailed, AVAsset.error will contain an instance of NSError that describes the failure.
-
- @discussion
- The timing information in the sample buffer, considered relative to the time passed to -[AVAssetWriter startSessionAtSourceTime:], will be used to determine the timing of those samples in the output file.
-
- For track types other than audio tracks, to determine the duration of all samples in the output file other than the very last sample that's appended, the difference between the sample buffer's output DTS and the following sample buffer's output DTS will be used. The duration of the last sample is determined as follows:
- 1. If a marker sample buffer with kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration is appended following the last media-bearing sample, the difference between the output DTS of the marker sample buffer and the output DTS of the last media-bearing sample will be used.
- 2. If the marker sample buffer is not provided and if the output duration of the last media-bearing sample is valid, it will be used.
- 3. if the output duration of the last media-bearing sample is not valid, the duration of the second-to-last sample will be used.
-
- For audio tracks, the properties of each appended sample buffer are used to determine corresponding output durations.
-
- The receiver will retain the CMSampleBuffer until it is done with it, and then release it. Do not modify a CMSampleBuffer or its contents after you have passed it to this method.
-
- If the sample buffer contains audio data and the AVAssetWriterInput was intialized with an outputSettings dictionary then the format must be linear PCM. If the outputSettings dictionary was nil then audio data can be provided in a compressed format, and it will be passed through to the output without any re-compression. Note that advanced formats like AAC will have encoder delay present in their bitstreams. This data is inserted by the encoder and is necessary for proper decoding, but it is not meant to be played back. Clients who provide compressed audio bitstreams must use kCMSampleBufferAttachmentKey_TrimDurationAtStart to mark the encoder delay (generally restricted to the first sample buffer). Packetization can cause there to be extra audio frames in the last packet which are not meant to be played back. These remainder frames should be marked with kCMSampleBufferAttachmentKey_TrimDurationAtEnd. CMSampleBuffers obtained from AVAssetReader will already have the necessary trim attachments. Please see http://developer.apple.com/mac/library/technotes/tn2009/tn2258.html for more information about encoder delay. When attaching trims make sure that the output PTS of the sample buffer is what you expect. For example if you called -[AVAssetWriter startSessionAtSourceTime:kCMTimeZero] and you want your audio to start at time zero in the output file then make sure that the output PTS of the first non-fully trimmed audio sample buffer is kCMTimeZero.
-
- If the sample buffer contains a CVPixelBuffer then the choice of pixel format will affect the performance and quality of the encode. For optimal performance the format of the pixel buffer should match one of the native formats supported by the selected video encoder. Below are some recommendations:
-
- The H.264 and HEVC encoders natively support kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, which should be used with 8-bit 4:2:0 video and full range input respectively; other related pixel formats in CoreVideo/CVPixelBuffer.h are ideal for 4:2:2 and 4:4:4 (and for HEVC, 10-bit). The JPEG encoder on iOS and Apple Silicon macOS natively supports kCVPixelFormatType_422YpCbCr8FullRange. If you need to work in the RGB domain then kCVPixelFormatType_32BGRA is recommended on iOS and macOS.
-
- Pixel buffers not in a natively supported format will be converted internally prior to encoding when possible. Pixel format conversions within the same range (video or full) are generally faster than conversions between different ranges.
-
- The ProRes encoders can preserve high bit depth sources, supporting up to 12bits/ch. ProRes 4444 can contain a mathematically lossless alpha channel and it doesn't do any chroma subsampling. This makes ProRes 4444 ideal for quality critical applications. If you are working with 8bit sources ProRes is also a good format to use due to its high image quality. Use either of the recommended pixel formats above. Note that RGB pixel formats by definition have 4:4:4 chroma sampling.
-
- If you are working with high bit depth sources the following yuv pixel formats are recommended when encoding to ProRes: kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, and kCVPixelFormatType_422YpCbCr10. When working in the RGB domain kCVPixelFormatType_64ARGB is recommended. Scaling and color matching are not currently supported when using AVAssetWriter with any of these high bit depth pixel formats. Please make sure that your track's output settings dictionary specifies the same width and height as the buffers you will be appending. Do not include AVVideoScalingModeKey or AVVideoColorPropertiesKey.
-
- As of macOS 10.10 and iOS 8.0, this method can be used to add sample buffers that reference existing data in a file instead of containing media data to be appended to the file. This can be used to generate tracks that are not self-contained. In order to append such a sample reference to the track create a CMSampleBufferRef with a NULL dataBuffer and dataReady set to true and set the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments on the sample buffer. Further documentation on how to create such a "sample reference" sample buffer can be found in the description of the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachment keys in the CMSampleBuffer documentation.
-
- Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
-
- This method throws an exception if the sample buffer's media type does not match the asset writer input's media type.
- */
-- (BOOL)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer;
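The trim-attachment requirement described above can be illustrated with a minimal Swift sketch. This is not part of the header; `appendFirstCompressedBuffer`, `primingDuration`, and `audioInput` are names assumed for the example, and the input is assumed to have been created with nil outputSettings (compressed passthrough).

```swift
import AVFoundation
import CoreMedia

/// Marks AAC encoder delay on the first compressed audio buffer before appending it.
/// Assumes `audioInput` was created with nil outputSettings (passthrough).
func appendFirstCompressedBuffer(_ buffer: CMSampleBuffer,
                                 primingDuration: CMTime,
                                 to audioInput: AVAssetWriterInput) -> Bool {
    // The trimmed frames are needed for decoding but must not be played back.
    if let trim = CMTimeCopyAsDictionary(primingDuration, allocator: kCFAllocatorDefault) {
        CMSetAttachment(buffer,
                        key: kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                        value: trim,
                        attachmentMode: kCMAttachmentMode_ShouldPropagate)
    }
    guard audioInput.isReadyForMoreMediaData else { return false }
    return audioInput.append(buffer)
}
```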
-
-/*!
- @method markAsFinished
- @abstract
- Indicates to the AVAssetWriter that no more buffers will be appended to this input.
-
- @discussion
- Clients that are monitoring each input's readyForMoreMediaData value must call markAsFinished on an input when they are done appending buffers to it. This is necessary to prevent other inputs from stalling, as they may otherwise wait forever for that input's media data, attempting to complete the ideal interleaving pattern.
-
- After invoking this method from the serial queue passed to -requestMediaDataWhenReadyOnQueue:usingBlock:, the receiver is guaranteed to issue no more invocations of the block passed to that method. The same is true of -respondToEachPassDescriptionOnQueue:usingBlock:.
-
- Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
- */
+/// Indicates to the AVAssetWriter that no more buffers will be appended to this input.
+///
+/// Clients that are monitoring each input's readyForMoreMediaData value must call markAsFinished on an input when they are done appending buffers to it. This is necessary to prevent other inputs from stalling, as they may otherwise wait forever for that input's media data, attempting to complete the ideal interleaving pattern.
+///
+/// After invoking this method from the serial queue passed to -requestMediaDataWhenReadyOnQueue:usingBlock:, the receiver is guaranteed to issue no more invocations of the block passed to that method. The same is true of -respondToEachPassDescriptionOnQueue:usingBlock:.
+///
+/// Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
- (void)markAsFinished;
@end
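As a hedged illustration of the contract described above, the usual pull-model loop appends buffers while the input is ready and calls markAsFinished when the source runs dry. This sketch is not part of the header; `nextSampleBuffer` is a hypothetical source of sample buffers.

```swift
import AVFoundation
import CoreMedia

/// Drives one input with the pull model and finishes it when the source runs dry.
func drive(_ input: AVAssetWriterInput,
           on queue: DispatchQueue,
           nextSampleBuffer: @escaping () -> CMSampleBuffer?) {
    input.requestMediaDataWhenReady(on: queue) {
        while input.isReadyForMoreMediaData {
            guard let buffer = nextSampleBuffer() else {
                input.markAsFinished()   // no more buffers; lets other inputs proceed
                return
            }
            if !input.append(buffer) {
                return                   // inspect AVAssetWriter.status/error elsewhere
            }
        }
    }
}
```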
@@ -333,32 +278,22 @@
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput (AVAssetWriterInputLanguageProperties)
-/*!
- @property languageCode
- @abstract
- Indicates the language to associate with the track corresponding to the receiver, as an ISO 639-2/T language code; can be nil.
-
- @discussion
- Also see extendedLanguageTag below.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
-
- This property throws an exception if a language code is set which does not conform to the ISO 639-2/T language codes.
- */
+/// Indicates the language to associate with the track corresponding to the receiver, as an ISO 639-2/T language code; can be nil.
+///
+/// Also see extendedLanguageTag below.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
+///
+/// This property throws an exception if a language code is set which does not conform to the ISO 639-2/T language codes.
@property (nonatomic, copy, nullable) NSString *languageCode API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property extendedLanguageTag
- @abstract
- Indicates the language tag to associate with the track corresponding to the receiver, as an IETF BCP 47 (RFC 4646) language identifier; can be nil.
-
- @discussion
- Extended language tags are normally set only when an ISO 639-2/T language code by itself is ambiguous, as in cases in which media data should be distinguished not only by language but also by the regional dialect in use or the writing system employed.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
-
- This property throws an exception if an extended language tag is set which does not conform to the IETF BCP 47 (RFC 4646) language identifiers.
- */
+/// Indicates the language tag to associate with the track corresponding to the receiver, as an IETF BCP 47 (RFC 4646) language identifier; can be nil.
+///
+/// Extended language tags are normally set only when an ISO 639-2/T language code by itself is ambiguous, as in cases in which media data should be distinguished not only by language but also by the regional dialect in use or the writing system employed.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
+///
+/// This property throws an exception if an extended language tag is set which does not conform to the IETF BCP 47 (RFC 4646) language identifiers.
@property (nonatomic, copy, nullable) NSString *extendedLanguageTag API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
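A minimal Swift sketch of these two properties, assuming a hypothetical `subtitleInput`; both values must be set before the writer's startWriting().

```swift
import AVFoundation

let subtitleInput = AVAssetWriterInput(mediaType: .text, outputSettings: nil)
subtitleInput.languageCode = "zho"            // ISO 639-2/T code for Chinese
subtitleInput.extendedLanguageTag = "zh-Hant" // BCP 47 tag, disambiguates the writing system
```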
@@ -366,28 +301,18 @@
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput (AVAssetWriterInputPropertiesForVisualCharacteristic)
-/*!
- @property naturalSize
- @abstract
- The size specified in the output file as the natural dimensions of the visual media data for display purposes.
-
- @discussion
- If the default value, CGSizeZero, is specified, the naturalSize of the track corresponding to the receiver is set according to dimensions indicated by the format descriptions that are ultimately written to the output track.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
-*/
+/// The size specified in the output file as the natural dimensions of the visual media data for display purposes.
+///
+/// If the default value, CGSizeZero, is specified, the naturalSize of the track corresponding to the receiver is set according to dimensions indicated by the format descriptions that are ultimately written to the output track.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
@property (nonatomic) CGSize naturalSize API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property transform
- @abstract
- The transform specified in the output file as the preferred transformation of the visual media data for display purposes.
-
- @discussion
- If no value is specified, the identity transform is used.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
-*/
+/// The transform specified in the output file as the preferred transformation of the visual media data for display purposes.
+///
+/// If no value is specified, the identity transform is used.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
@property (nonatomic) CGAffineTransform transform;
@end
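For example, a hedged Swift sketch (the `videoInput` and its dimensions are assumptions) that marks buffers for rotated display without re-rendering the pixels; both properties must be set before startWriting().

```swift
import AVFoundation
import CoreGraphics

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
videoInput.transform = CGAffineTransform(rotationAngle: .pi / 2)  // rotate 90° for display
videoInput.naturalSize = CGSize(width: 1920, height: 1080)        // matches the appended buffers
```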
@@ -395,16 +320,11 @@
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput (AVAssetWriterInputPropertiesForAudibleCharacteristic)
-/*!
- @property preferredVolume
- @abstract
- The preferred volume level to be stored in the output file.
-
- @discussion
- The value for this property should typically be in the range of 0.0 to 1.0. The default value is 1.0, which is equivalent to a "normal" volume level for audio media type. For all other media types the default value is 0.0.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
- */
+/// The preferred volume level to be stored in the output file.
+///
+/// The value for this property should typically be in the range of 0.0 to 1.0. The default value is 1.0, which is equivalent to a "normal" volume level for audio media type. For all other media types the default value is 0.0.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
@property (nonatomic) float preferredVolume API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
@@ -412,115 +332,82 @@
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput (AVAssetWriterInputFileTypeSpecificProperties)
-/*!
- @property marksOutputTrackAsEnabled
- @abstract
- For file types that support enabled and disabled tracks, such as QuickTime Movie files, specifies whether the track corresponding to the receiver should be enabled by default for playback and processing. The default value is YES.
-
- @discussion
- When an input group is added to an AVAssetWriter (see -[AVAssetWriter addInputGroup:]), the value of marksOutputTrackAsEnabled will automatically be set to YES for the default input and set to NO for all of the other inputs in the group. In this case, if a new value is set on this property then an exception will be raised.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
-
- This property throws an exception if a value is set on an asset writer input that is contained in an input group.
- */
+/// For file types that support enabled and disabled tracks, such as QuickTime Movie files, specifies whether the track corresponding to the receiver should be enabled by default for playback and processing. The default value is YES.
+///
+/// When an input group is added to an AVAssetWriter (see -[AVAssetWriter addInputGroup:]), the value of marksOutputTrackAsEnabled will automatically be set to YES for the default input and set to NO for all of the other inputs in the group. In this case, if a new value is set on this property then an exception will be raised.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
+///
+/// This property throws an exception if a value is set on an asset writer input that is contained in an input group.
@property (nonatomic) BOOL marksOutputTrackAsEnabled API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property mediaTimeScale
- @abstract
- For file types that support media time scales, such as QuickTime Movie files, specifies the media time scale to be used.
-
- @discussion
- The default value is 0, which indicates that the receiver should choose a convenient value, if applicable. It is an error to set a value other than 0 if the receiver has media type AVMediaTypeAudio.
-
- This property cannot be set after writing has started.
-
- This property throws an exception if a value is set on an asset writer input with media type AVMediaTypeAudio.
- */
+/// For file types that support media time scales, such as QuickTime Movie files, specifies the media time scale to be used.
+///
+/// The default value is 0, which indicates that the receiver should choose a convenient value, if applicable. It is an error to set a value other than 0 if the receiver has media type AVMediaTypeAudio.
+///
+/// This property cannot be set after writing has started.
+///
+/// This property throws an exception if a value is set on an asset writer input with media type AVMediaTypeAudio.
@property (nonatomic) CMTimeScale mediaTimeScale API_AVAILABLE(macos(10.7), ios(4.3), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property preferredMediaChunkDuration
- @abstract
- For file types that support media chunk duration, such as QuickTime Movie files, specifies the duration to be used for each chunk of sample data in the output file.
-
- @discussion
- Chunk duration can influence the granularity of the I/O performed when reading a media file, e.g. during playback. A larger chunk duration can result in fewer reads from disk, at the potential expense of a higher memory footprint.
-
- A "chunk" contains one or more samples. The total duration of the samples in a chunk is no greater than this preferred chunk duration, or the duration of a single sample if the sample's duration is greater than this preferred chunk duration.
-
- The default value is kCMTimeInvalid, which means that the receiver will choose an appropriate default value.
-
- This property cannot be set after -startWriting has been called on the receiver.
-
- This property throws an exception if a duration is set which is non-numeric or non-positive (see CMTIME_IS_NUMERIC).
- */
+/// For file types that support media chunk duration, such as QuickTime Movie files, specifies the duration to be used for each chunk of sample data in the output file.
+///
+/// Chunk duration can influence the granularity of the I/O performed when reading a media file, e.g. during playback. A larger chunk duration can result in fewer reads from disk, at the potential expense of a higher memory footprint.
+///
+/// A "chunk" contains one or more samples. The total duration of the samples in a chunk is no greater than this preferred chunk duration, or the duration of a single sample if the sample's duration is greater than this preferred chunk duration.
+///
+/// The default value is kCMTimeInvalid, which means that the receiver will choose an appropriate default value.
+///
+/// This property cannot be set after -startWriting has been called on the receiver.
+///
+/// This property throws an exception if a duration is set which is non-numeric or non-positive (see CMTIME_IS_NUMERIC).
@property (nonatomic) CMTime preferredMediaChunkDuration API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property preferredMediaChunkAlignment
- @abstract
- For file types that support media chunk alignment, such as QuickTime Movie files, specifies the boundary for media chunk alignment in bytes (e.g. 512).
-
- @discussion
- The default value is 0, which means that the receiver will choose an appropriate default value. A value of 1 implies that no padding should be used to achieve a particular chunk alignment. It is an error to set a negative value for chunk alignment.
-
- This property cannot be set after -startWriting has been called on the receiver.
- */
+/// For file types that support media chunk alignment, such as QuickTime Movie files, specifies the boundary for media chunk alignment in bytes (e.g. 512).
+///
+/// The default value is 0, which means that the receiver will choose an appropriate default value. A value of 1 implies that no padding should be used to achieve a particular chunk alignment. It is an error to set a negative value for chunk alignment.
+///
+/// This property cannot be set after -startWriting has been called on the receiver.
@property (nonatomic) NSInteger preferredMediaChunkAlignment API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property sampleReferenceBaseURL
- @abstract
- For file types that support writing sample references, such as QuickTime Movie files, specifies the base URL sample references are relative to.
-
- @discussion
- If the value of this property can be resolved as an absolute URL, the sample locations written to the file when appending sample references will be relative to this URL. The URL must point to a location that is in a directory that is a parent of the sample reference location.
-
- Usage example:
-
- Setting the sampleReferenceBaseURL property to "file:///User/johnappleseed/Movies/" and appending sample buffers with the kCMSampleBufferAttachmentKey_SampleReferenceURL attachment set to "file:///User/johnappleseed/Movies/data/movie1.mov" will cause the sample reference "data/movie1.mov" to be written to the movie.
-
- If the value of the property cannot be resolved as an absolute URL or if it points to a location that is not in a parent directory of the sample reference location, the location referenced in the sample buffer will be written unmodified.
-
- The default value is nil, which means that the location referenced in the sample buffer will be written unmodified.
-
- This property cannot be set after -startWriting has been called on the receiver.
- */
+/// For file types that support writing sample references, such as QuickTime Movie files, specifies the base URL sample references are relative to.
+///
+/// If the value of this property can be resolved as an absolute URL, the sample locations written to the file when appending sample references will be relative to this URL. The URL must point to a location that is in a directory that is a parent of the sample reference location.
+///
+/// Usage example:
+///
+/// Setting the sampleReferenceBaseURL property to "file:///User/johnappleseed/Movies/" and appending sample buffers with the kCMSampleBufferAttachmentKey_SampleReferenceURL attachment set to "file:///User/johnappleseed/Movies/data/movie1.mov" will cause the sample reference "data/movie1.mov" to be written to the movie.
+///
+/// If the value of the property cannot be resolved as an absolute URL or if it points to a location that is not in a parent directory of the sample reference location, the location referenced in the sample buffer will be written unmodified.
+///
+/// The default value is nil, which means that the location referenced in the sample buffer will be written unmodified.
+///
+/// This property cannot be set after -startWriting has been called on the receiver.
@property (nonatomic, copy, nullable) NSURL *sampleReferenceBaseURL API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
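The base-URL behavior can be sketched in Swift as follows. This is not part of the header; `prepareSampleReference`, the file paths, and `referenceBuffer` are hypothetical, and `referenceBuffer` is assumed to be a CMSampleBuffer created with a NULL dataBuffer and dataReady set to true, as described for -appendSampleBuffer: above.

```swift
import AVFoundation
import CoreMedia

/// Attaches sample-reference metadata so "data/clip.mov" is written relative to the base URL.
/// The base URL itself must be set before the writer's startWriting().
func prepareSampleReference(_ referenceBuffer: CMSampleBuffer, for input: AVAssetWriterInput) {
    input.sampleReferenceBaseURL = URL(fileURLWithPath: "/Users/me/Movies/")      // hypothetical
    let referencedFile = URL(fileURLWithPath: "/Users/me/Movies/data/clip.mov")   // hypothetical
    CMSetAttachment(referenceBuffer,
                    key: kCMSampleBufferAttachmentKey_SampleReferenceURL,
                    value: referencedFile as CFURL,
                    attachmentMode: kCMAttachmentMode_ShouldPropagate)
    CMSetAttachment(referenceBuffer,
                    key: kCMSampleBufferAttachmentKey_SampleReferenceByteOffset,
                    value: NSNumber(value: 0),  // offset of the media data in the referenced file
                    attachmentMode: kCMAttachmentMode_ShouldPropagate)
}
```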
typedef NSString *AVAssetWriterInputMediaDataLocation NS_STRING_ENUM API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @constant AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData
- Indicates that the media data should be interleaved with all other media data with this constant.
- */
+/// Indicates that the media data should be interleaved with all other media data with this constant.
AVF_EXPORT AVAssetWriterInputMediaDataLocation const AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @constant AVAssetWriterInputMediaDataLocationBeforeMainMediaDataNotInterleaved
- Indicates that the media data should be laid out before all the media data with AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData and not be interleaved.
- */
+/// Indicates that the media data should be laid out before all the media data with AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData and not be interleaved.
AVF_EXPORT AVAssetWriterInputMediaDataLocation const AVAssetWriterInputMediaDataLocationBeforeMainMediaDataNotInterleaved API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property mediaDataLocation
- @abstract
- Specifies where the media data will be laid out and whether the media data will be interleaved as the main media data.
+/// Indicates that there may be large segments of time without any media data from this track. When mediaDataLocation is set to this value, AVAssetWriter will interleave the media data, but will not wait for media data from this track to achieve tight interleaving with other tracks.
+AVF_EXPORT AVAssetWriterInputMediaDataLocation const AVAssetWriterInputMediaDataLocationSparselyInterleavedWithMainMediaData API_AVAILABLE(macos(26.0), ios(26.0), tvos(26.0), visionos(26.0)) API_UNAVAILABLE(watchos);
- @discussion
- If this value is set to AVAssetWriterInputMediaDataLocationBeforeMainMediaDataNotInterleaved, AVAssetWriter tries to write the media data for this track before all the media data for AVAssetWriterInputs with this property set to AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData.
-
- Use of this property is recommended for optimizing tracks that contain a small amount of data that is needed all at once, independent of playback time, such as chapter name tracks and chapter image tracks.
- Keep it set to AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData for tracks whose media data is needed only as its presentation time is approaching and, when multiple inputs are present that supply media data that will be played concurrently, should be interleaved for optimal access.
-
- For file types that support preloading media data, such as QuickTime movie files, if this value is set to AVAssetWriterInputMediaDataLocationBeforeMainMediaDataNotInterleaved, AVAssetWriter will write an indication, such as a 'load' atom, that the whole media data should be preloaded.
-
- The default value is AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData, which means that the receiver will not write the indication and that the media data will be interleaved.
-
- This property cannot be set after -startWriting has been called on the receiver.
- */
+/// Specifies where the media data will be laid out and whether the media data will be interleaved as the main media data.
+///
+/// If this value is set to AVAssetWriterInputMediaDataLocationBeforeMainMediaDataNotInterleaved, AVAssetWriter tries to write the media data for this track before all the media data for AVAssetWriterInputs with this property set to AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData.
+///
+/// Use of this property is recommended for optimizing tracks that contain a small amount of data that is needed all at once, independent of playback time, such as chapter name tracks and chapter image tracks.
+/// Keep it set to AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData for tracks whose media data is needed only as its presentation time is approaching and, when multiple inputs are present that supply media data that will be played concurrently, should be interleaved for optimal access.
+///
+/// For file types that support preloading media data, such as QuickTime movie files, if this value is set to AVAssetWriterInputMediaDataLocationBeforeMainMediaDataNotInterleaved, AVAssetWriter will write an indication, such as a 'load' atom, that the whole media data should be preloaded.
+///
+/// The default value is AVAssetWriterInputMediaDataLocationInterleavedWithMainMediaData, which means that the receiver will not write the indication and that the media data will be interleaved.
+///
+/// This property cannot be set after -startWriting has been called on the receiver.
@property (nonatomic, copy) AVAssetWriterInputMediaDataLocation mediaDataLocation API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
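A short Swift sketch of the chapter-track case described above; `chapterImageInput` is a hypothetical input, and the property must be set before startWriting().

```swift
import AVFoundation

// A small chapter-image track is needed all at once, independent of playback time,
// so lay it out before the interleaved main media data.
let chapterImageInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
chapterImageInput.mediaDataLocation = .beforeMainMediaDataNotInterleaved
```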
@@ -528,38 +415,24 @@
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput (AVAssetWriterInputTrackAssociations)
-/*!
- @method canAddTrackAssociationWithTrackOfInput:type:
- @abstract
- Tests whether an association between the tracks corresponding to a pair of inputs is valid.
-
- @param input
- The instance of AVAssetWriterInput with a corresponding track to associate with track corresponding with the receiver.
- @param trackAssociationType
- The type of track association to test. Common track association types, such as AVTrackAssociationTypeTimecode, are defined in AVAssetTrack.h.
-
- @discussion
- If the type of association requires tracks of specific media types that don't match the media types of the inputs, or if the output file type does not support track associations, -canAddTrackAssociationWithTrackOfInput:type: will return NO.
- */
+/// Tests whether an association between the tracks corresponding to a pair of inputs is valid.
+///
+/// If the type of association requires tracks of specific media types that don't match the media types of the inputs, or if the output file type does not support track associations, -canAddTrackAssociationWithTrackOfInput:type: will return NO.
+///
+/// - Parameter input: The instance of AVAssetWriterInput with a corresponding track to associate with track corresponding with the receiver.
+/// - Parameter trackAssociationType: The type of track association to test. Common track association types, such as AVTrackAssociationTypeTimecode, are defined in AVAssetTrack.h.
- (BOOL)canAddTrackAssociationWithTrackOfInput:(AVAssetWriterInput *)input type:(NSString *)trackAssociationType API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @method addTrackAssociationWithTrackOfInput:type:
- @abstract
- Associates the track corresponding to the specified input with the track corresponding with the receiver.
-
- @param input
- The instance of AVAssetWriterInput with a corresponding track to associate with track corresponding to the receiver.
- @param trackAssociationType
- The type of track association to add. Common track association types, such as AVTrackAssociationTypeTimecode, are defined in AVAssetTrack.h.
-
- @discussion
- If the type of association requires tracks of specific media types that don't match the media types of the inputs, or if the output file type does not support track associations, an NSInvalidArgumentException is raised.
-
- Track associations cannot be added after writing on the receiver's AVAssetWriter has started.
-
- This method throws an exception if the input and track association type cannot be added (see -canAddTrackAssociationWithTrackOfInput:type:).
- */
+/// Associates the track corresponding to the specified input with the track corresponding with the receiver.
+///
+/// If the type of association requires tracks of specific media types that don't match the media types of the inputs, or if the output file type does not support track associations, an NSInvalidArgumentException is raised.
+///
+/// Track associations cannot be added after writing on the receiver's AVAssetWriter has started.
+///
+/// This method throws an exception if the input and track association type cannot be added (see -canAddTrackAssociationWithTrackOfInput:type:).
+///
+/// - Parameter input: The instance of AVAssetWriterInput with a corresponding track to associate with track corresponding to the receiver.
+/// - Parameter trackAssociationType: The type of track association to add. Common track association types, such as AVTrackAssociationTypeTimecode, are defined in AVAssetTrack.h.
- (void)addTrackAssociationWithTrackOfInput:(AVAssetWriterInput *)input type:(NSString *)trackAssociationType API_AVAILABLE(macos(10.9), ios(7.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
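For instance, associating a timecode track with a video track might look like the following Swift sketch (not part of the header; `associateTimecode` and the two inputs are assumptions). The video input is the receiver, and the association must be added before writing starts.

```swift
import AVFoundation

/// Associates a timecode input with a video input, guarding with the can-add check first.
func associateTimecode(_ timecodeInput: AVAssetWriterInput, with videoInput: AVAssetWriterInput) {
    let type = AVAssetTrack.AssociationType.timecode.rawValue
    if videoInput.canAddTrackAssociation(withTrackOf: timecodeInput, type: type) {
        videoInput.addTrackAssociation(withTrackOf: timecodeInput, type: type)
    }
}
```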
@@ -570,101 +443,74 @@
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput (AVAssetWriterInputMultiPass)
-/*!
- @property performsMultiPassEncodingIfSupported
- @abstract
- Indicates whether the input should attempt to encode the source media data using multiple passes.
-
- @discussion
- The input may be able to achieve higher quality and/or lower data rate by performing multiple passes over the source media. It does this by analyzing the media data that has been appended and re-encoding certain segments with different parameters. In order to do this re-encoding, the media data for these segments must be appended again. See -markCurrentPassAsFinished and the property currentPassDescription for the mechanism by which the input nominates segments for re-appending.
-
- When the value of this property is YES, the value of readyForMoreMediaData for other inputs attached to the same AVAssetWriter may be NO more often and/or for longer periods of time. In particular, the value of readyForMoreMediaData for inputs that do not (or cannot) perform multiple passes may start out as NO after -[AVAssetWriter startWriting] has been called and may not change to YES until after all multi-pass inputs have completed their final pass.
-
- When the value of this property is YES, the input may store data in one or more temporary files before writing compressed samples to the output file. Use the AVAssetWriter property directoryForTemporaryFiles if you need to control the location of temporary file writing.
-
- The default value is NO, meaning that no additional analysis will occur and no segments will be re-encoded. Not all asset writer input configurations (for example, inputs configured with certain media types or to use certain encoders) can benefit from performing multiple passes over the source media. To determine whether the selected encoder can perform multiple passes, query the value of canPerformMultiplePasses after calling -startWriting.
-
- For best results, do not set both this property and expectsMediaDataInRealTime to YES.
-
- This property cannot be set after writing on the receiver's AVAssetWriter has started.
- */
+/// Indicates whether the input should attempt to encode the source media data using multiple passes.
+///
+/// The input may be able to achieve higher quality and/or lower data rate by performing multiple passes over the source media. It does this by analyzing the media data that has been appended and re-encoding certain segments with different parameters. In order to do this re-encoding, the media data for these segments must be appended again. See -markCurrentPassAsFinished and the property currentPassDescription for the mechanism by which the input nominates segments for re-appending.
+///
+/// When the value of this property is YES, the value of readyForMoreMediaData for other inputs attached to the same AVAssetWriter may be NO more often and/or for longer periods of time. In particular, the value of readyForMoreMediaData for inputs that do not (or cannot) perform multiple passes may start out as NO after -[AVAssetWriter startWriting] has been called and may not change to YES until after all multi-pass inputs have completed their final pass.
+///
+/// When the value of this property is YES, the input may store data in one or more temporary files before writing compressed samples to the output file. Use the AVAssetWriter property directoryForTemporaryFiles if you need to control the location of temporary file writing.
+///
+/// The default value is NO, meaning that no additional analysis will occur and no segments will be re-encoded. Not all asset writer input configurations (for example, inputs configured with certain media types or to use certain encoders) can benefit from performing multiple passes over the source media. To determine whether the selected encoder can perform multiple passes, query the value of canPerformMultiplePasses after calling -startWriting.
+///
+/// For best results, do not set both this property and expectsMediaDataInRealTime to YES.
+///
+/// This property cannot be set after writing on the receiver's AVAssetWriter has started.
@property (nonatomic) BOOL performsMultiPassEncodingIfSupported API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property canPerformMultiplePasses
- @abstract
- Indicates whether the input might perform multiple passes over appended media data.
-
- @discussion
- When the value for this property is YES, your source for media data should be configured for random access. After appending all of the media data for the current pass (as specified by the currentPassDescription property), call -markCurrentPassAsFinished to start the process of determining whether additional passes are needed. Note that it is still possible in this case for the input to perform only the initial pass, if it determines that there will be no benefit to performing multiple passes.
-
- When the value for this property is NO, your source for media data only needs to support sequential access. In this case, append all of the source media once and call -markAsFinished.
-
- In the default configuration of AVAssetWriterInput, the value for this property will be NO. Currently, the only way for this property to become YES is when performsMultiPassEncodingIfSupported has been set to YES. The final value will be available after -startWriting is called, when a specific encoder has been chosen.
-
- This property is key-value observable.
- */
+/// Indicates whether the input might perform multiple passes over appended media data.
+///
+/// When the value for this property is YES, your source for media data should be configured for random access. After appending all of the media data for the current pass (as specified by the currentPassDescription property), call -markCurrentPassAsFinished to start the process of determining whether additional passes are needed. Note that it is still possible in this case for the input to perform only the initial pass, if it determines that there will be no benefit to performing multiple passes.
+///
+/// When the value for this property is NO, your source for media data only needs to support sequential access. In this case, append all of the source media once and call -markAsFinished.
+///
+/// In the default configuration of AVAssetWriterInput, the value for this property will be NO. Currently, the only way for this property to become YES is when performsMultiPassEncodingIfSupported has been set to YES. The final value will be available after -startWriting is called, when a specific encoder has been chosen.
+///
+/// This property is key-value observable.
@property (nonatomic, readonly) BOOL canPerformMultiplePasses API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @property currentPassDescription
- @abstract
- Provides an object that describes the requirements, such as source time ranges to append or re-append, for the current pass.
-
- @discussion
- If the value of this property is nil, it means there is no request to be fulfilled and -markAsFinished should be called on the asset writer input.
-
- During the first pass, the request will contain a single time range from zero to positive infinity, indicating that all media from the source should be appended. This will also be true when canPerformMultiplePasses is NO, in which case only one pass will be performed.
-
- The value of this property will be nil before -startWriting is called on the attached asset writer. It will transition to an initial non-nil value during the call to -startWriting. After that, the value of this property will change only after a call to -markCurrentPassAsFinished. For an easy way to be notified at the beginning of each pass, see -respondToEachPassDescriptionOnQueue:usingBlock:.
-
- This property is key-value observable. Observers should not assume that they will be notified of changes on a specific thread.
- */
+/// Provides an object that describes the requirements, such as source time ranges to append or re-append, for the current pass.
+///
+/// If the value of this property is nil, it means there is no request to be fulfilled and -markAsFinished should be called on the asset writer input.
+///
+/// During the first pass, the request will contain a single time range from zero to positive infinity, indicating that all media from the source should be appended. This will also be true when canPerformMultiplePasses is NO, in which case only one pass will be performed.
+///
+/// The value of this property will be nil before -startWriting is called on the attached asset writer. It will transition to an initial non-nil value during the call to -startWriting. After that, the value of this property will change only after a call to -markCurrentPassAsFinished. For an easy way to be notified at the beginning of each pass, see -respondToEachPassDescriptionOnQueue:usingBlock:.
+///
+/// This property is key-value observable. Observers should not assume that they will be notified of changes on a specific thread.
@property (readonly, nullable) AVAssetWriterInputPassDescription *currentPassDescription API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @method respondToEachPassDescriptionOnQueue:usingBlock:
- @abstract
- Instructs the receiver to invoke a client-supplied block whenever a new pass has begun.
-
- @param queue
- The queue on which the block should be invoked.
- @param block
- A block the receiver should invoke whenever a new pass has begun.
-
- @discussion
- A typical block passed to this method will perform the following steps:
-
- 1. Query the value of the receiver's currentPassDescription property and reconfigure the source of media data (e.g. AVAssetReader) accordingly
- 2. Call -requestMediaDataWhenReadyOnQueue:usingBlock: to begin appending data for the current pass
- 3. Exit
-
- When all media data has been appended for the current request, call -markCurrentPassAsFinished to begin the process of determining whether an additional pass is warranted. If an additional pass is warranted, the block passed to this method will be invoked to begin the next pass. If no additional passes are needed, the block passed to this method will be invoked one final time so the client can invoke -markAsFinished in response to the value of currentPassDescription becoming nil.
-
- Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
-
- This method throws an exception if called more than once.
- */
+/// Instructs the receiver to invoke a client-supplied block whenever a new pass has begun.
+///
+/// A typical block passed to this method will perform the following steps:
+///
+/// 1. Query the value of the receiver's currentPassDescription property and reconfigure the source of media data (e.g. AVAssetReader) accordingly
+/// 2. Call -requestMediaDataWhenReadyOnQueue:usingBlock: to begin appending data for the current pass
+/// 3. Exit
+///
+/// When all media data has been appended for the current request, call -markCurrentPassAsFinished to begin the process of determining whether an additional pass is warranted. If an additional pass is warranted, the block passed to this method will be invoked to begin the next pass. If no additional passes are needed, the block passed to this method will be invoked one final time so the client can invoke -markAsFinished in response to the value of currentPassDescription becoming nil.
+///
+/// Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
+///
+/// This method throws an exception if called more than once.
+///
+/// - Parameter queue: The queue on which the block should be invoked.
+/// - Parameter block: A block the receiver should invoke whenever a new pass has begun.
- (void)respondToEachPassDescriptionOnQueue:(dispatch_queue_t)queue usingBlock:(dispatch_block_t NS_SWIFT_SENDABLE)block API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-/*!
- @method markCurrentPassAsFinished
- @abstract
- Instructs the receiver to analyze the media data that has been appended and determine whether the results could be improved by re-encoding certain segments.
-
- @discussion
- When the value of canPerformMultiplePasses is YES, call this method after you have appended all of your media data. After the receiver analyzes whether an additional pass is warranted, the value of currentPassDescription will change (usually asynchronously) to describe how to set up for the next pass. Although it is possible to use key-value observing to determine when the value of currentPassDescription has changed, it is typically more convenient to invoke -respondToEachPassDescriptionOnQueue:usingBlock: in order to start the work for each pass.
-
- After re-appending the media data for all of the time ranges of the new pass, call this method again to determine whether additional segments should be re-appended in another pass.
-
- Calling this method effectively cancels any previous invocation of -requestMediaDataWhenReadyOnQueue:usingBlock:, meaning that -requestMediaDataWhenReadyOnQueue:usingBlock: can be invoked again for each new pass. -respondToEachPassDescriptionOnQueue:usingBlock: provides a convenient way to consolidate these invocations in your code.
-
- After each pass, you have the option of keeping the most recent results by calling -markAsFinished instead of this method. If the value of currentPassDescription is nil at the beginning of a pass, call -markAsFinished to tell the receiver to not expect any further media data.
-
- If the value of canPerformMultiplePasses is NO, the value of currentPassDescription will immediately become nil after calling this method.
-
- Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
- */
+/// Instructs the receiver to analyze the media data that has been appended and determine whether the results could be improved by re-encoding certain segments.
+///
+/// When the value of canPerformMultiplePasses is YES, call this method after you have appended all of your media data. After the receiver analyzes whether an additional pass is warranted, the value of currentPassDescription will change (usually asynchronously) to describe how to set up for the next pass. Although it is possible to use key-value observing to determine when the value of currentPassDescription has changed, it is typically more convenient to invoke -respondToEachPassDescriptionOnQueue:usingBlock: in order to start the work for each pass.
+///
+/// After re-appending the media data for all of the time ranges of the new pass, call this method again to determine whether additional segments should be re-appended in another pass.
+///
+/// Calling this method effectively cancels any previous invocation of -requestMediaDataWhenReadyOnQueue:usingBlock:, meaning that -requestMediaDataWhenReadyOnQueue:usingBlock: can be invoked again for each new pass. -respondToEachPassDescriptionOnQueue:usingBlock: provides a convenient way to consolidate these invocations in your code.
+///
+/// After each pass, you have the option of keeping the most recent results by calling -markAsFinished instead of this method. If the value of currentPassDescription is nil at the beginning of a pass, call -markAsFinished to tell the receiver to not expect any further media data.
+///
+/// If the value of canPerformMultiplePasses is NO, the value of currentPassDescription will immediately become nil after calling this method.
+///
+/// Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer.
- (void)markCurrentPassAsFinished API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
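The per-pass flow documented above can be sketched in Swift as follows. This is not part of the header; `runPasses`, `reconfigureSource`, and `appendMedia` are hypothetical helpers, where `appendMedia` is assumed to drive requestMediaDataWhenReady(on:using:) and call its completion once all ranges for the pass have been appended. It also assumes performsMultiPassEncodingIfSupported was set to true before startWriting() and that startWriting() has already been called.

```swift
import AVFoundation

/// Responds to each pass: reconfigure the source, append the requested ranges,
/// then either finish the pass or finish the input.
func runPasses(for input: AVAssetWriterInput,
               on queue: DispatchQueue,
               reconfigureSource: @escaping ([NSValue]) -> Void,
               appendMedia: @escaping (AVAssetWriterInputPassDescription, @escaping () -> Void) -> Void) {
    input.respondToEachPassDescription(on: queue) {
        guard let pass = input.currentPassDescription else {
            input.markAsFinished()                 // nil description: no further passes
            return
        }
        reconfigureSource(pass.sourceTimeRanges)   // e.g. reset an AVAssetReaderOutput
        appendMedia(pass) {
            input.markCurrentPassAsFinished()      // ask whether another pass is warranted
        }
    }
}
```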
@@ -672,13 +518,9 @@
@class AVAssetWriterInputPassDescriptionInternal;
-/*!
- @class AVAssetWriterInputPassDescription
- @abstract
- Defines an interface for querying information about the requirements of the current pass, such as the time ranges of media data to append.
- @discussion
- Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
- */
+/// Defines an interface for querying information about the requirements of the current pass, such as the time ranges of media data to append.
+///
+/// Subclasses of this type that are used from Swift must fulfill the requirements of a Sendable type.
NS_SWIFT_SENDABLE
API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInputPassDescription : NSObject
@@ -688,14 +530,9 @@
}
AV_INIT_UNAVAILABLE
-/*!
- @property sourceTimeRanges
- @abstract
- An NSArray of NSValue objects wrapping CMTimeRange structures, each representing one source time range.
-
- @discussion
- The value of this property is suitable for using as a parameter for -[AVAssetReaderOutput resetForReadingTimeRanges:].
- */
+/// An NSArray of NSValue objects wrapping CMTimeRange structures, each representing one source time range.
+///
+/// The value of this property is suitable for using as a parameter for -[AVAssetReaderOutput resetForReadingTimeRanges:].
@property (nonatomic, readonly) NSArray<NSValue *> *sourceTimeRanges;
@end
@@ -703,15 +540,15 @@
@class AVAssetWriterInputPixelBufferAdaptorInternal;
-/*!
- @class AVAssetWriterInputPixelBufferAdaptor
- @abstract
- Defines an interface for appending video samples packaged as CVPixelBuffer objects to a single AVAssetWriterInput object.
-
- @discussion
- Instances of AVAssetWriterInputPixelBufferAdaptor provide a CVPixelBufferPool that can be used to allocate pixel buffers for writing to the output file. Using the provided pixel buffer pool for buffer allocation is typically more efficient than appending pixel buffers allocated using a separate pool.
- */
+/// Defines an interface for appending video samples packaged as CVPixelBuffer objects to a single AVAssetWriterInput object.
+///
+/// Instances of AVAssetWriterInputPixelBufferAdaptor provide a CVPixelBufferPool that can be used to allocate pixel buffers for writing to the output file. Using the provided pixel buffer pool for buffer allocation is typically more efficient than appending pixel buffers allocated using a separate pool.
+#if __swift__
+API_DEPRECATED("Use AVAssetWriter.inputPixelBufferReceiver(for:pixelBufferAttributes:) instead", macos(10.7, API_TO_BE_DEPRECATED), ios(4.1, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
@interface AVAssetWriterInputPixelBufferAdaptor : NSObject
{
@private
@@ -719,344 +556,256 @@
}
AV_INIT_UNAVAILABLE
-/*!
- @method assetWriterInputPixelBufferAdaptorWithAssetWriterInput:sourcePixelBufferAttributes:
- @abstract
- Creates a new pixel buffer adaptor to receive pixel buffers for writing to the output file.
-
- @param input
- An instance of AVAssetWriterInput to which the receiver should append pixel buffers. Currently, only asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a pixel buffer adaptor.
- @param sourcePixelBufferAttributes
- Specifies the attributes of pixel buffers that will be vended by the input's CVPixelBufferPool.
- @result
- An instance of AVAssetWriterInputPixelBufferAdaptor.
-
- @discussion
- In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames being appended.
-
- Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: for advice on choosing a pixel format.
-
- Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
-
- This method throws an exception if the input is already attached to another asset writer input pixel buffer adaptor or if the input has already started writing (the asset writer has progressed beyond AVAssetWriterStatusUnknown).
- */
+/// Creates a new pixel buffer adaptor to receive pixel buffers for writing to the output file.
+///
+/// In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames being appended.
+///
+/// Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: for advice on choosing a pixel format.
+///
+/// Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
+///
+/// This method throws an exception if the input is already attached to another asset writer input pixel buffer adaptor or if the input has already started writing (the asset writer has progressed beyond AVAssetWriterStatusUnknown).
+///
+/// - Parameter input: An instance of AVAssetWriterInput to which the receiver should append pixel buffers. Currently, only asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a pixel buffer adaptor.
+/// - Parameter sourcePixelBufferAttributes: Specifies the attributes of pixel buffers that will be vended by the input's CVPixelBufferPool.
+///
+/// - Returns: An instance of AVAssetWriterInputPixelBufferAdaptor.
+ (instancetype)assetWriterInputPixelBufferAdaptorWithAssetWriterInput:(AVAssetWriterInput *)input sourcePixelBufferAttributes:(nullable NSDictionary<NSString *, id> *)sourcePixelBufferAttributes;
-/*!
- @method initWithAssetWriterInput:sourcePixelBufferAttributes:
- @abstract
- Creates a new pixel buffer adaptor to receive pixel buffers for writing to the output file.
-
- @param input
- An instance of AVAssetWriterInput to which the receiver should append pixel buffers. Currently, only asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a pixel buffer adaptor.
- @param sourcePixelBufferAttributes
- Specifies the attributes of pixel buffers that will be vended by the input's CVPixelBufferPool.
- @result
- An instance of AVAssetWriterInputPixelBufferAdaptor.
-
- @discussion
- In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames being appended.
-
- Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: for advice on choosing a pixel format.
-
- Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
-
- This method throws an exception if the input is already attached to another asset writer input pixel buffer adaptor or if the input has already started writing (the asset writer has progressed beyond AVAssetWriterStatusUnknown).
- */
+/// Creates a new pixel buffer adaptor to receive pixel buffers for writing to the output file.
+///
+/// In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames being appended.
+///
+/// Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: for advice on choosing a pixel format.
+///
+/// Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
+///
+/// This method throws an exception if the input is already attached to another asset writer input pixel buffer adaptor or if the input has already started writing (the asset writer has progressed beyond AVAssetWriterStatusUnknown).
+///
+/// - Parameter input: An instance of AVAssetWriterInput to which the receiver should append pixel buffers. Currently, only asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a pixel buffer adaptor.
+/// - Parameter sourcePixelBufferAttributes: Specifies the attributes of pixel buffers that will be vended by the input's CVPixelBufferPool.
+///
+/// - Returns: An instance of AVAssetWriterInputPixelBufferAdaptor.
- (instancetype)initWithAssetWriterInput:(AVAssetWriterInput *)input sourcePixelBufferAttributes:(nullable NSDictionary<NSString *, id> *)sourcePixelBufferAttributes NS_DESIGNATED_INITIALIZER;
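A hedged Swift sketch of creating an adaptor whose pool vends buffers matching the source frames; the codec, dimensions, and `videoInput` are assumptions for the example.

```swift
import AVFoundation
import CoreVideo

let videoInput = AVAssetWriterInput(mediaType: .video,
                                    outputSettings: [AVVideoCodecKey: AVVideoCodecType.hevc,
                                                     AVVideoWidthKey: 1920,
                                                     AVVideoHeightKey: 1080])
// Pool attributes chosen to match the appended frames (8-bit 4:2:0, video range).
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: videoInput,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
        kCVPixelBufferWidthKey as String: 1920,
        kCVPixelBufferHeightKey as String: 1080
    ])
```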
-/*!
- @property assetWriterInput
- @abstract
- The asset writer input to which the receiver should append pixel buffers.
- */
+/// The asset writer input to which the receiver should append pixel buffers.
@property (nonatomic, readonly) AVAssetWriterInput *assetWriterInput;
-/*!
- @property sourcePixelBufferAttributes
- @abstract
- The pixel buffer attributes of pixel buffers that will be vended by the receiver's CVPixelBufferPool.
+/// The pixel buffer attributes of pixel buffers that will be vended by the receiver's CVPixelBufferPool.
+///
+/// The value of this property is a dictionary containing pixel buffer attributes keys defined in <CoreVideo/CVPixelBuffer.h>.
+@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> * NS_SWIFT_SENDABLE sourcePixelBufferAttributes;
- @discussion
- The value of this property is a dictionary containing pixel buffer attributes keys defined in <CoreVideo/CVPixelBuffer.h>.
- */
-@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> *sourcePixelBufferAttributes;
-
-/*!
- @property pixelBufferPool
- @abstract
- A pixel buffer pool that will vend and efficiently recycle CVPixelBuffer objects that can be appended to the receiver.
-
- @discussion
- For maximum efficiency, clients should create CVPixelBuffer objects for appendPixelBuffer:withPresentationTime: by using this pool with the CVPixelBufferPoolCreatePixelBuffer() function.
-
- The value of this property will be NULL before -[AVAssetWriter startWriting] is called on the associated AVAssetWriter object.
-
- This property is key value observable.
-
- This property throws an exception if a pixel buffer pool cannot be created with this asset writer input pixel buffer adaptor's source pixel buffer attributes (must specify width, height, and either pixel format or pixel format description).
- */
+/// A pixel buffer pool that will vend and efficiently recycle CVPixelBuffer objects that can be appended to the receiver.
+///
+/// For maximum efficiency, clients should create CVPixelBuffer objects for appendPixelBuffer:withPresentationTime: by using this pool with the CVPixelBufferPoolCreatePixelBuffer() function.
+///
+/// The value of this property will be NULL before -[AVAssetWriter startWriting] is called on the associated AVAssetWriter object.
+///
+/// This property is key value observable.
+///
+/// This property throws an exception if a pixel buffer pool cannot be created with this asset writer input pixel buffer adaptor's source pixel buffer attributes (must specify width, height, and either pixel format or pixel format description).
@property (nonatomic, readonly, nullable) CVPixelBufferPoolRef pixelBufferPool;
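A short Swift sketch of the pool-based path described above (not part of the header; `appendFrame` and the `render` closure are assumptions): allocate from the adaptor's pool, draw into the buffer, then append it with a presentation time.

```swift
import AVFoundation
import CoreMedia
import CoreVideo

/// Allocates a pixel buffer from the adaptor's pool (nil before startWriting()) and appends it.
func appendFrame(at time: CMTime,
                 using adaptor: AVAssetWriterInputPixelBufferAdaptor,
                 render: (CVPixelBuffer) -> Void) -> Bool {
    guard adaptor.assetWriterInput.isReadyForMoreMediaData,
          let pool = adaptor.pixelBufferPool else { return false }
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return false }
    render(buffer)  // draw the frame's contents into the pooled buffer
    return adaptor.append(buffer, withPresentationTime: time)
}
```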
-/*!
- @method appendPixelBuffer:withPresentationTime:
- @abstract
- Appends a pixel buffer to the receiver.
-
- @param pixelBuffer
- The CVPixelBuffer to be appended.
- @param presentationTime
- The presentation time for the pixel buffer to be appended. This time will be considered relative to the time passed to -[AVAssetWriter startSessionAtSourceTime:] to determine the timing of the frame in the output file.
- @result
- A BOOL value indicating success of appending the pixel buffer. If a result of NO is returned, clients can check the value of AVAssetWriter.status to determine whether the writing operation completed, failed, or was cancelled. If the status is AVAssetWriterStatusFailed, AVAsset.error will contain an instance of NSError that describes the failure.
-
- @discussion
- The receiver will retain the CVPixelBuffer until it is done with it, and then release it. Do not modify a CVPixelBuffer or its contents after you have passed it to this method.
-
- For optimal performance the format of the pixel buffer should match one of the native formats supported by the selected video encoder. Below are some recommendations:
-
- The H.264 and HEVC encoders natively support kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, which should be used with 8-bit 4:2:0 video and full range input respectively; other related pixel formats in CoreVideo/CVPixelBuffer.h are ideal for 4:2:2 and 4:4:4 (and for HEVC, 10-bit). The JPEG encoder on iOS and Apple Silicon macOS natively supports kCVPixelFormatType_422YpCbCr8FullRange. If you need to work in the RGB domain then kCVPixelFormatType_32BGRA is recommended on iOS and macOS.
-
- Pixel buffers not in a natively supported format will be converted internally prior to encoding when possible. Pixel format conversions within the same range (video or full) are generally faster than conversions between different ranges.
-
- The ProRes encoders can preserve high bit depth sources, supporting up to 12bits/ch. ProRes 4444 can contain a mathematically lossless alpha channel and it doesn't do any chroma subsampling. This makes ProRes 4444 ideal for quality critical applications. If you are working with 8bit sources ProRes is also a good format to use due to its high image quality. Use either of the recommended pixel formats above. Note that RGB pixel formats by definition have 4:4:4 chroma sampling.
-
- If you are working with high bit depth sources the following yuv pixel formats are recommended when encoding to ProRes: kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, and kCVPixelFormatType_422YpCbCr10. When working in the RGB domain kCVPixelFormatType_64ARGB is recommended. Scaling and color matching are not currently supported when using AVAssetWriter with any of these high bit depth pixel formats. Please make sure that your track's output settings dictionary specifies the same width and height as the buffers you will be appending. Do not include AVVideoScalingModeKey or AVVideoColorPropertiesKey.
-
- Before calling this method, you must ensure that the input that underlies the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
-
- This method throws an exception if the presentation time is is non-numeric (see CMTIME_IS_NUMERIC) or if "readyForMoreMediaData" is NO.
- */
+/// Appends a pixel buffer to the receiver.
+///
+/// The receiver will retain the CVPixelBuffer until it is done with it, and then release it. Do not modify a CVPixelBuffer or its contents after you have passed it to this method.
+///
+/// For optimal performance the format of the pixel buffer should match one of the native formats supported by the selected video encoder. Below are some recommendations:
+///
+/// The H.264 and HEVC encoders natively support kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, which should be used with 8-bit 4:2:0 video and full range input respectively; other related pixel formats in CoreVideo/CVPixelBuffer.h are ideal for 4:2:2 and 4:4:4 (and for HEVC, 10-bit). The JPEG encoder on iOS and Apple Silicon macOS natively supports kCVPixelFormatType_422YpCbCr8FullRange. If you need to work in the RGB domain then kCVPixelFormatType_32BGRA is recommended on iOS and macOS.
+///
+/// Pixel buffers not in a natively supported format will be converted internally prior to encoding when possible. Pixel format conversions within the same range (video or full) are generally faster than conversions between different ranges.
+///
+/// The ProRes encoders can preserve high bit depth sources, supporting up to 12bits/ch. ProRes 4444 can contain a mathematically lossless alpha channel and it doesn't do any chroma subsampling. This makes ProRes 4444 ideal for quality critical applications. If you are working with 8bit sources ProRes is also a good format to use due to its high image quality. Use either of the recommended pixel formats above. Note that RGB pixel formats by definition have 4:4:4 chroma sampling.
+///
+/// If you are working with high bit depth sources the following yuv pixel formats are recommended when encoding to ProRes: kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, and kCVPixelFormatType_422YpCbCr10. When working in the RGB domain kCVPixelFormatType_64ARGB is recommended. Scaling and color matching are not currently supported when using AVAssetWriter with any of these high bit depth pixel formats. Please make sure that your track's output settings dictionary specifies the same width and height as the buffers you will be appending. Do not include AVVideoScalingModeKey or AVVideoColorPropertiesKey.
+///
+/// Before calling this method, you must ensure that the input that underlies the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
+///
+/// This method throws an exception if the presentation time is non-numeric (see CMTIME_IS_NUMERIC) or if "readyForMoreMediaData" is NO.
+///
+/// - Parameter pixelBuffer: The CVPixelBuffer to be appended.
+/// - Parameter presentationTime: The presentation time for the pixel buffer to be appended. This time will be considered relative to the time passed to -[AVAssetWriter startSessionAtSourceTime:] to determine the timing of the frame in the output file.
+///
+/// - Returns: A BOOL value indicating success of appending the pixel buffer. If a result of NO is returned, clients can check the value of AVAssetWriter.status to determine whether the writing operation completed, failed, or was cancelled. If the status is AVAssetWriterStatusFailed, AVAsset.error will contain an instance of NSError that describes the failure.
- (BOOL)appendPixelBuffer:(CVPixelBufferRef)pixelBuffer withPresentationTime:(CMTime)presentationTime;
@end
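
A minimal usage sketch of the flow the comments above describe, not taken from the header: pull a CVPixelBuffer from the adaptor's pool after -[AVAssetWriter startWriting] and append it with a presentation time. The `writer`, `input`, and `adaptor` variables are assumed to have been configured elsewhere.

```objc
// Minimal sketch, assuming `writer` (AVAssetWriter), `input` (a video AVAssetWriterInput already
// added to `writer`), and `adaptor` (an AVAssetWriterInputPixelBufferAdaptor wrapping `input`) exist.
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

CVPixelBufferRef pixelBuffer = NULL;
// adaptor.pixelBufferPool is non-NULL only after -startWriting has been called.
CVReturn result = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pixelBuffer);
if (result == kCVReturnSuccess && input.readyForMoreMediaData) {
    // Fill pixelBuffer with frame data here, then append it at the desired presentation time.
    if (![adaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(0, 30)]) {
        NSLog(@"Append failed: %@ (status %ld)", writer.error, (long)writer.status);
    }
    CVPixelBufferRelease(pixelBuffer);
}
```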
-/*!
- @class AVAssetWriterInputTaggedPixelBufferGroupAdaptor
- @abstract
- Defines an interface for appending tagged buffer groups packaged as CMTaggedBufferGroupRef objects to a single AVAssetWriterInput object.
-
- @discussion
- Instances of AVAssetWriterInputTaggedPixelBufferGroupAdaptor provide a CVPixelBufferPool that can be used to allocate the pixel buffers of tagged buffer groups for writing to the output file. Using the provided pixel buffer pool for buffer allocation is typically more efficient than appending pixel buffers allocated using a separate pool.
- */
+/// Defines an interface for appending tagged buffer groups packaged as CMTaggedBufferGroupRef objects to a single AVAssetWriterInput object.
+///
+/// Instances of AVAssetWriterInputTaggedPixelBufferGroupAdaptor provide a CVPixelBufferPool that can be used to allocate the pixel buffers of tagged buffer groups for writing to the output file. Using the provided pixel buffer pool for buffer allocation is typically more efficient than appending pixel buffers allocated using a separate pool.
+#if __swift__
+API_DEPRECATED("Use AVAssetWriter.inputTaggedPixelBufferGroupReceiver(for:pixelBufferAttributes:) instead", macos(14.0, API_TO_BE_DEPRECATED), ios(17.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(tvos, watchos)
+#else
API_AVAILABLE(macos(14.0), ios(17.0), visionos(1.0)) API_UNAVAILABLE(tvos, watchos)
+#endif
@interface AVAssetWriterInputTaggedPixelBufferGroupAdaptor : NSObject
AV_INIT_UNAVAILABLE
-/*!
- @method assetWriterInputTaggedPixelBufferGroupAdaptorWithAssetWriterInput:sourcePixelBufferAttributes:
- @abstract
- Creates a new tagged buffer adaptor to receive tagged buffer groups for writing to the output file.
-
- @param input
- An instance of AVAssetWriterInput to which the receiver should append tagged buffer groups. Currently, only asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a tagged buffer adaptor.
- @param sourcePixelBufferAttributes
- Specifies the attributes of pixel buffers of tagged buffer groups that will be vended by the input's CVPixelBufferPool.
- @result
- An instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor.
-
- @discussion
- In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames being appended.
-
- Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: for advice on choosing a pixel format.
-
- Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
-
- This method throws an exception if the input is already attached to another asset writer input tagged buffer group adaptor or if the input has already started writing (the asset writer has progressed beyond AVAssetWriterStatusUnknown).
- */
+/// Creates a new tagged buffer adaptor to receive tagged buffer groups for writing to the output file.
+///
+/// In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames being appended.
+///
+/// Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: for advice on choosing a pixel format.
+///
+/// Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
+///
+/// This method throws an exception if the input is already attached to another asset writer input tagged buffer group adaptor or if the input has already started writing (the asset writer has progressed beyond AVAssetWriterStatusUnknown).
+///
+/// - Parameter input: An instance of AVAssetWriterInput to which the receiver should append tagged buffer groups. Currently, only asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a tagged buffer adaptor.
+/// - Parameter sourcePixelBufferAttributes: Specifies the attributes of pixel buffers of tagged buffer groups that will be vended by the input's CVPixelBufferPool.
+///
+/// - Returns: An instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor.
+ (instancetype)assetWriterInputTaggedPixelBufferGroupAdaptorWithAssetWriterInput:(AVAssetWriterInput *)input sourcePixelBufferAttributes:(nullable NSDictionary<NSString *, id> *)sourcePixelBufferAttributes;
-/*!
- @method initWithAssetWriterInput:sourcePixelBufferAttributes:
- @abstract
- Creates a new tagged buffer group adaptor to receive tagged buffer groups for writing to the output file.
-
- @param input
- An instance of AVAssetWriterInput to which the receiver should append tagged buffer groups. In addition to the pixel buffer adaptor, asset writer inputs with media data of type AVMediaTypeVideo can be used to initialize a tagged buffer group adaptor.
- @param sourcePixelBufferAttributes
- Specifies the attributes of pixel buffers of tagged buffer groups that will be vended by the input's CVPixelBufferPool.
- @result
- An instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor.
-
- @discussion
- In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames of tagged buffer groups being appended.
-
- Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: in AVAssetWriterInputPixelBufferAdaptor for advice on choosing a pixel format.
-
- Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
-
- It is an error to initialize an instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor with an asset writer input that is already attached to another instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor. It is also an error to initialize an instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor with an asset writer input whose asset writer has progressed beyond AVAssetWriterStatusUnknown.
- */
+/// Creates a new tagged buffer group adaptor to receive tagged buffer groups for writing to the output file.
+///
+/// In order to take advantage of the improved efficiency of appending buffers created from the adaptor's pixel buffer pool, clients should specify pixel buffer attributes that most closely accommodate the source format of the video frames of tagged buffer groups being appended.
+///
+/// Pixel buffer attributes keys for the pixel buffer pool are defined in <CoreVideo/CVPixelBuffer.h>. To specify the pixel format type, the pixelBufferAttributes dictionary should contain a value for kCVPixelBufferPixelFormatTypeKey. For example, use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] for 8-bit-per-channel BGRA. See the discussion under appendPixelBuffer:withPresentationTime: in AVAssetWriterInputPixelBufferAdaptor for advice on choosing a pixel format.
+///
+/// Clients that do not need a pixel buffer pool for allocating buffers should set sourcePixelBufferAttributes to nil.
+///
+/// It is an error to initialize an instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor with an asset writer input that is already attached to another instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor. It is also an error to initialize an instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor with an asset writer input whose asset writer has progressed beyond AVAssetWriterStatusUnknown.
+///
+/// - Parameter input: An instance of AVAssetWriterInput to which the receiver should append tagged buffer groups. As with the pixel buffer adaptor, asset writer inputs that accept media data of type AVMediaTypeVideo can be used to initialize a tagged buffer group adaptor.
+/// - Parameter sourcePixelBufferAttributes: Specifies the attributes of pixel buffers of tagged buffer groups that will be vended by the input's CVPixelBufferPool.
+///
+/// - Returns: An instance of AVAssetWriterInputTaggedPixelBufferGroupAdaptor.
- (instancetype)initWithAssetWriterInput:(AVAssetWriterInput *)input sourcePixelBufferAttributes:(nullable NSDictionary<NSString *, id> *)sourcePixelBufferAttributes NS_DESIGNATED_INITIALIZER;
-/*!
- @property assetWriterInput
- @abstract
- The asset writer input to which the receiver should append tagged buffer groups.
- */
+/// The asset writer input to which the receiver should append tagged buffer groups.
@property (nonatomic, readonly) AVAssetWriterInput *assetWriterInput;
-/*!
- @property sourcePixelBufferAttributes
- @abstract
- The pixel buffer attributes of pixel buffers that will be vended by the receiver's CVPixelBufferPool.
+/// The pixel buffer attributes of pixel buffers that will be vended by the receiver's CVPixelBufferPool.
+///
+/// The value of this property is a dictionary containing pixel buffer attributes keys defined in <CoreVideo/CVPixelBuffer.h>.
+@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> * NS_SWIFT_SENDABLE sourcePixelBufferAttributes;
- @discussion
- The value of this property is a dictionary containing pixel buffer attributes keys defined in <CoreVideo/CVPixelBuffer.h>.
- */
-@property (nonatomic, readonly, nullable) NSDictionary<NSString *, id> *sourcePixelBufferAttributes;
-
-/*!
- @property pixelBufferPool
- @abstract
- A pixel buffer pool that will vend and efficiently recycle CVPixelBuffer objects of tagged buffer groups that can be appended to the receiver.
-
- @discussion
- For maximum efficiency, clients should create CVPixelBuffer objects of tagged buffer groups for appendTaggedPixelBufferGroup:withPresentationTime: by using this pool with the CVPixelBufferPoolCreatePixelBuffer() function.
-
- The value of this property will be NULL before -[AVAssetWriter startWriting] is called on the associated AVAssetWriter object. Clients should read this property after -[AVAssetWriter startWriting] calling to get a non-NULL value.
-
- This property is not key value observable.
- */
+/// A pixel buffer pool that will vend and efficiently recycle CVPixelBuffer objects of tagged buffer groups that can be appended to the receiver.
+///
+/// For maximum efficiency, clients should create CVPixelBuffer objects of tagged buffer groups for appendTaggedPixelBufferGroup:withPresentationTime: by using this pool with the CVPixelBufferPoolCreatePixelBuffer() function.
+///
+/// The value of this property will be NULL before -[AVAssetWriter startWriting] is called on the associated AVAssetWriter object. Clients should read this property after calling -[AVAssetWriter startWriting] to get a non-NULL value.
+///
+/// This property is not key value observable.
@property (nonatomic, readonly, nullable) CVPixelBufferPoolRef pixelBufferPool;
-/*!
- @method appendTaggedPixelBufferGroup:withPresentationTime:
- @abstract
- Appends a tagged buffer group to the receiver.
-
- @param taggedPixelBufferGroup
- The CMTaggedBufferGroup to be appended. All of the buffers in taggedPixelBufferGroup should be CVPixelBuffers, and they should correspond to tag collections that contain kCMTagCategory_VideoLayerID values matching the list set using kVTCompressionPropertyKey_MVHEVCVideoLayerIDs. The pixel buffers should be IOSurface-backed.
- @param presentationTime
- The presentation time for the tagged buffer group to be appended. This time will be considered relative to the time passed to -[AVAssetWriter startSessionAtSourceTime:] to determine the timing of the frame in the output file.
- @result
- A BOOL value indicating success of appending the tagged buffer group. If a result of NO is returned, clients can check the value of AVAssetWriter.status to determine whether the writing operation completed, failed, or was cancelled. If the status is AVAssetWriterStatusFailed, AVAssetWriter.error will contain an instance of NSError that describes the failure.
-
- @discussion
- The receiver will retain the CMTaggedBufferGroup until it is done with it, and then release it. Do not modify a CMTaggedBufferGroup or its contents after you have passed it to this method.
-
- Before calling this method, you must ensure that the input that underlies the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
-
- In an AVAssetWriterInput instance creation with AVMediaTypeVideo, kVTCompressionPropertyKey_MVHEVCVideoLayerIDs key must be specified as part of the dictionary given for AVVideoCompressionPropertiesKey. It sets video layer IDs to a target multi-image video encoder. This method checks the values for kCMTagCategory_VideoLayerID tag in tag collections of taggedPixelBufferGroup over the array values for kVTCompressionPropertyKey_MVHEVCVideoLayerIDs key. An NSInvalidArgumentException will be raised if the video layer IDs mismatch between the value of kVTCompressionPropertyKey_MVHEVCVideoLayerIDs in the AVVideoCompressionPropertiesKey sub-dictionary of the input's outputSettings property and tag collections of taggedPixelBufferGroup.
-
- Below is a sample code sketch focusing on data flow that illustrates how you might append a taggedPixelBufferGroup instance.
- // Set up an AVAssetWriterInput and AVAssetWriterInputTaggedPixelBufferGroupAdaptor instance
- AVAssetWriterInput *assetWriterInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:@{
- ..,
- AVVideoCompressionPropertiesKey: @{ (NSString *)kVTCompressionPropertyKey_MVHEVCVideoLayerIDs : .. }}];
-
- AVAssetWriterInputTaggedPixelBufferGroupAdaptor *assetWriterInputAdaptor = [[AVAssetWriterInputTaggedPixelBufferGroupAdaptor alloc] initWithAssetWriterInput:assetWriterInput ..];
-
- Later, when the writer input is ready for more media data, create and append a tagged buffer group containing one or more pixel buffers and the exact tag values associated with kCMTagCategory_VideoLayerID being specified via kVTCompressionPropertyKey_MVHEVCVideoLayerIDs.
- // Set up tag collection buffers
- CMTag tags[] = CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, ..);
- CMTagCollectionCreate(.., tags, FigCountOf(tags), &tagCollection);
- CFArrayAppendValue(tagCollectionArray, tagCollection);
-
- // Set up pixel buffers
- CVPixelBufferPoolCreatePixelBuffer(.., &pixelBuffer);
- CFArrayAppendValue(pixelBufferArray, pixelBuffer);
-
- // Append a CMTaggedBufferGroupRef instance to asset writer input
- CMTaggedBufferGroupCreate(.., tagCollectionArray, pixelBufferArray, &taggedBufferGroup);
- [assetWriterInputAdaptor appendTaggedPixelBufferGroup:taggedBufferGroup ..];
-*/
+/// Appends a tagged buffer group to the receiver.
+///
+/// The receiver will retain the CMTaggedBufferGroup until it is done with it, and then release it. Do not modify a CMTaggedBufferGroup or its contents after you have passed it to this method.
+///
+/// Before calling this method, you must ensure that the input that underlies the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
+///
+/// When creating the AVAssetWriterInput instance with AVMediaTypeVideo, the kVTCompressionPropertyKey_MVHEVCVideoLayerIDs key must be specified as part of the dictionary given for AVVideoCompressionPropertiesKey; it supplies the video layer IDs to the target multi-image video encoder. This method checks the values of the kCMTagCategory_VideoLayerID tag in the tag collections of taggedPixelBufferGroup against the array of values for the kVTCompressionPropertyKey_MVHEVCVideoLayerIDs key. An NSInvalidArgumentException will be raised if the video layer IDs in the tag collections of taggedPixelBufferGroup do not match the value of kVTCompressionPropertyKey_MVHEVCVideoLayerIDs in the AVVideoCompressionPropertiesKey sub-dictionary of the input's outputSettings property.
+///
+/// Below is a sample code sketch focusing on data flow that illustrates how you might append a taggedPixelBufferGroup instance.
+/// ```objc
+/// // Set up an AVAssetWriterInput and AVAssetWriterInputTaggedPixelBufferGroupAdaptor instance
+/// AVAssetWriterInput *assetWriterInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:@{
+/// ..,
+/// AVVideoCompressionPropertiesKey: @{ (NSString *)kVTCompressionPropertyKey_MVHEVCVideoLayerIDs : .. }}];
+///
+/// AVAssetWriterInputTaggedPixelBufferGroupAdaptor *assetWriterInputAdaptor = [[AVAssetWriterInputTaggedPixelBufferGroupAdaptor alloc] initWithAssetWriterInput:assetWriterInput ..];
+/// ```
+/// Later, when the writer input is ready for more media data, create and append a tagged buffer group containing one or more pixel buffers and the exact tag values associated with kCMTagCategory_VideoLayerID being specified via kVTCompressionPropertyKey_MVHEVCVideoLayerIDs.
+/// ```objc
+/// // Set up tag collection buffers
+/// CMTag tags[] = { CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, ..) };
+/// CMItemCount tagCount = sizeof(tags) / sizeof(tags[0]);
+/// CMTagCollectionCreate(.., tags, tagCount, &tagCollection);
+/// CFArrayAppendValue(tagCollectionArray, tagCollection);
+///
+/// // Set up pixel buffers
+/// CVPixelBufferPoolCreatePixelBuffer(.., &pixelBuffer);
+/// CFArrayAppendValue(pixelBufferArray, pixelBuffer);
+///
+/// // Append a CMTaggedBufferGroupRef instance to asset writer input
+/// CMTaggedBufferGroupCreate(.., tagCollectionArray, pixelBufferArray, &taggedBufferGroup);
+/// [assetWriterInputAdaptor appendTaggedPixelBufferGroup:taggedBufferGroup ..];
+/// ```
+/// - Parameter taggedPixelBufferGroup: The CMTaggedBufferGroup to be appended. All of the buffers in taggedPixelBufferGroup should be CVPixelBuffers, and they should correspond to tag collections that contain kCMTagCategory_VideoLayerID values matching the list set using kVTCompressionPropertyKey_MVHEVCVideoLayerIDs. The pixel buffers should be IOSurface-backed.
+/// - Parameter presentationTime: The presentation time for the tagged buffer group to be appended. This time will be considered relative to the time passed to -[AVAssetWriter startSessionAtSourceTime:] to determine the timing of the frame in the output file.
+///
+/// - Returns: A BOOL value indicating success of appending the tagged buffer group. If a result of NO is returned, clients can check the value of AVAssetWriter.status to determine whether the writing operation completed, failed, or was cancelled. If the status is AVAssetWriterStatusFailed, AVAssetWriter.error will contain an instance of NSError that describes the failure.
- (BOOL)appendTaggedPixelBufferGroup:(CMTaggedBufferGroupRef)taggedPixelBufferGroup withPresentationTime:(CMTime)presentationTime;
@end
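
To complement the data-flow sketch embedded in the comment above, the following hedged example shows one way the adaptor itself might be created with source pixel buffer attributes, following the guidance on kCVPixelBufferPixelFormatTypeKey. The codec settings, dimensions, and layer IDs are placeholders rather than values taken from the header, and MV-HEVC encoding generally requires additional configuration not shown here.

```objc
// Illustrative sketch; requires <AVFoundation/AVFoundation.h> and <VideoToolbox/VideoToolbox.h>.
NSDictionary *outputSettings = @{
    AVVideoCodecKey : AVVideoCodecTypeHEVC,
    AVVideoWidthKey : @1920,
    AVVideoHeightKey : @1080,
    AVVideoCompressionPropertiesKey : @{
        (NSString *)kVTCompressionPropertyKey_MVHEVCVideoLayerIDs : @[ @0, @1 ] // placeholder layer IDs
    }
};
AVAssetWriterInput *videoInput =
    [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];

NSDictionary *sourceAttributes = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange),
    (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{} // tagged groups expect IOSurface-backed buffers
};
AVAssetWriterInputTaggedPixelBufferGroupAdaptor *groupAdaptor =
    [AVAssetWriterInputTaggedPixelBufferGroupAdaptor
        assetWriterInputTaggedPixelBufferGroupAdaptorWithAssetWriterInput:videoInput
                                              sourcePixelBufferAttributes:sourceAttributes];
```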
@class AVTimedMetadataGroup;
@class AVAssetWriterInputMetadataAdaptorInternal;
-/*!
- @class AVAssetWriterInputMetadataAdaptor
- @abstract
- Defines an interface for writing metadata, packaged as instances of AVTimedMetadataGroup, to a single AVAssetWriterInput object.
- */
-
+/// Defines an interface for writing metadata, packaged as instances of AVTimedMetadataGroup, to a single AVAssetWriterInput object.
+#if __swift__
+API_DEPRECATED("Use AVAssetWriter.inputMetadataReceiver(for:) instead", macos(10.10, API_TO_BE_DEPRECATED), ios(8.0, API_TO_BE_DEPRECATED), tvos(9.0, API_TO_BE_DEPRECATED), visionos(1.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(watchos)
+#else
API_AVAILABLE(macos(10.10), ios(8.0), tvos(9.0), visionos(1.0)) API_UNAVAILABLE(watchos)
+#endif
@interface AVAssetWriterInputMetadataAdaptor : NSObject {
AVAssetWriterInputMetadataAdaptorInternal *_internal;
}
AV_INIT_UNAVAILABLE
-/*!
- @method assetWriterInputMetadataAdaptorWithAssetWriterInput:
- @abstract
- Creates a new timed metadata group adaptor to receive instances of AVTimedMetadataGroup for writing to the output file.
-
- @param input
- An instance of AVAssetWriterInput to which the receiver should append groups of timed metadata. Only asset writer inputs that accept media data of type AVMediaTypeMetadata can be used to initialize a timed metadata group adaptor.
- @result
- An instance of AVAssetWriterInputMetadataAdaptor.
-
- @discussion
- The instance of AVAssetWriterInput passed in to this method must have been created with a format hint indicating all possible combinations of identifier (or, alternatively, key and keySpace), dataType, and extendedLanguageTag that will be appended to the metadata adaptor. It is an error to append metadata items not represented in the input's format hint.
-
- This method throws an exception for any of the following reasons:
- - input is already attached to another instance of AVAssetWriterInputMetadataAdaptor
- - input's asset writer has already started writing (progressed beyond AVAssetWriterStatusUnknown)
- - input's asset writer does not carry a source format hint
- - input's source format hint media subtype is not kCMMetadataFormatType_Boxed
- */
+/// Creates a new timed metadata group adaptor to receive instances of AVTimedMetadataGroup for writing to the output file.
+///
+/// The instance of AVAssetWriterInput passed in to this method must have been created with a format hint indicating all possible combinations of identifier (or, alternatively, key and keySpace), dataType, and extendedLanguageTag that will be appended to the metadata adaptor. It is an error to append metadata items not represented in the input's format hint.
+///
+/// This method throws an exception for any of the following reasons:
+/// - input is already attached to another instance of AVAssetWriterInputMetadataAdaptor
+/// - input's asset writer has already started writing (progressed beyond AVAssetWriterStatusUnknown)
+/// - input's asset writer does not carry a source format hint
+/// - input's source format hint media subtype is not kCMMetadataFormatType_Boxed
+///
+/// - Parameter input: An instance of AVAssetWriterInput to which the receiver should append groups of timed metadata. Only asset writer inputs that accept media data of type AVMediaTypeMetadata can be used to initialize a timed metadata group adaptor.
+///
+/// - Returns: An instance of AVAssetWriterInputMetadataAdaptor.
+ (instancetype)assetWriterInputMetadataAdaptorWithAssetWriterInput:(AVAssetWriterInput *)input;
-/*!
- @method initWithAssetWriterInput:
- @abstract
- Creates a new timed metadator group adaptor to receive instances of AVTimedMetadataGroup for writing to the output file.
-
- @param input
- An instance of AVAssetWriterInput to which the receiver should append groups of timed metadata. Only asset writer inputs that accept media data of type AVMediaTypeMetadata can be used to initialize a timed metadata group adaptor.
- @result
- An instance of AVAssetWriterInputMetadataAdaptor.
-
- @discussion
- The instance of AVAssetWriterInput passed in to this method must have been created with a format hint indicating all possible combinations of identifier (or, alternatively, key and keySpace), dataType, and extendedLanguageTag that will be appended to the metadata adaptor. It is an error to append metadata items not represented in the input's format hint. For help creating a suitable format hint, see -[AVTimedMetadataGroup copyFormatDescription].
-
- This method throws an exception for any of the following reasons:
- - input is already attached to another instance of AVAssetWriterInputMetadataAdaptor
- - input's asset writer has already started writing (progressed beyond AVAssetWriterStatusUnknown)
- - input's asset writer does not carry a source format hint
- - input's source format hint media subtype is not kCMMetadataFormatType_Boxed
- */
+/// Creates a new timed metadata group adaptor to receive instances of AVTimedMetadataGroup for writing to the output file.
+///
+/// The instance of AVAssetWriterInput passed in to this method must have been created with a format hint indicating all possible combinations of identifier (or, alternatively, key and keySpace), dataType, and extendedLanguageTag that will be appended to the metadata adaptor. It is an error to append metadata items not represented in the input's format hint. For help creating a suitable format hint, see -[AVTimedMetadataGroup copyFormatDescription].
+///
+/// This method throws an exception for any of the following reasons:
+/// - input is already attached to another instance of AVAssetWriterInputMetadataAdaptor
+/// - input's asset writer has already started writing (progressed beyond AVAssetWriterStatusUnknown)
+/// - input's asset writer does not carry a source format hint
+/// - input's source format hint media subtype is not kCMMetadataFormatType_Boxed
+///
+/// - Parameter input: An instance of AVAssetWriterInput to which the receiver should append groups of timed metadata. Only asset writer inputs that accept media data of type AVMediaTypeMetadata can be used to initialize a timed metadata group adaptor.
+///
+/// - Returns: An instance of AVAssetWriterInputMetadataAdaptor.
- (instancetype)initWithAssetWriterInput:(AVAssetWriterInput *)input NS_DESIGNATED_INITIALIZER;
-/*!
- @property assetWriterInput
- @abstract
- The asset writer input to which the receiver should append timed metadata groups.
- */
+/// The asset writer input to which the receiver should append timed metadata groups.
@property (nonatomic, readonly) AVAssetWriterInput *assetWriterInput;
-/*!
- @method appendTimedMetadataGroup:
- @abstract
- Appends a timed metadata group to the receiver.
-
- @param timedMetadataGroup
- The AVTimedMetadataGroup to be appended.
- @result
- A BOOL value indicating success of appending the timed metadata group. If a result of NO is returned, AVAssetWriter.error will contain more information about why apending the timed metadata group failed.
-
- @discussion
- The receiver will retain the AVTimedMetadataGroup until it is done with it, and then release it.
-
- The timing of the metadata items in the output asset will correspond to the timeRange of the AVTimedMetadataGroup, regardless of the values of the time and duration properties of the individual items.
-
- Before calling this method, you must ensure that the input that underlies the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
-
- This method throws an exception if the attached asset writer input has not been added to an asset writer or -startWriting has not been called on that asset writer.
- */
+/// Appends a timed metadata group to the receiver.
+///
+/// The receiver will retain the AVTimedMetadataGroup until it is done with it, and then release it.
+///
+/// The timing of the metadata items in the output asset will correspond to the timeRange of the AVTimedMetadataGroup, regardless of the values of the time and duration properties of the individual items.
+///
+/// Before calling this method, you must ensure that the input that underlies the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
+///
+/// This method throws an exception if the attached asset writer input has not been added to an asset writer or -startWriting has not been called on that asset writer.
+///
+/// - Parameter timedMetadataGroup: The AVTimedMetadataGroup to be appended.
+///
+/// - Returns: A BOOL value indicating success of appending the timed metadata group. If a result of NO is returned, AVAssetWriter.error will contain more information about why appending the timed metadata group failed.
- (BOOL)appendTimedMetadataGroup:(AVTimedMetadataGroup *)timedMetadataGroup;
@end
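
The source format hint requirement described above is the easiest part to get wrong, so here is a hedged sketch of deriving a hint from a representative AVTimedMetadataGroup and then appending a group. The identifier, value, and time range are illustrative placeholders.

```objc
// Sketch only; the identifier key, value, and timing below are made up for illustration.
AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
item.identifier = [AVMetadataItem identifierForKey:@"com.example.rating"
                                          keySpace:AVMetadataKeySpaceQuickTimeMetadata];
item.dataType = (NSString *)kCMMetadataBaseDataType_UTF8;
item.value = @"five-stars";
item.extendedLanguageTag = @"und";

AVTimedMetadataGroup *group =
    [[AVTimedMetadataGroup alloc] initWithItems:@[ item ]
                                      timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(1, 1))];

// The hint must cover every identifier/dataType/extendedLanguageTag combination that will be appended.
CMMetadataFormatDescriptionRef formatHint = [group copyFormatDescription];
AVAssetWriterInput *metadataInput =
    [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeMetadata
                                   outputSettings:nil
                                 sourceFormatHint:formatHint];
if (formatHint) CFRelease(formatHint); // the input retains the hint; release our copy

AVAssetWriterInputMetadataAdaptor *metadataAdaptor =
    [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:metadataInput];
// After -addInput:, -startWriting, and -startSessionAtSourceTime: on the owning asset writer:
BOOL appended = [metadataAdaptor appendTimedMetadataGroup:group];
```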
-/*!
- @class AVAssetWriterInputCaptionAdaptor
- @abstract
- An adaptor class for appending instances of AVCaption to an asset writer input. -[AVAssetWriterInput -appendSampleBuffer:] will throw an exception if used when this adaptor is attached.
- */
+/// An adaptor class for appending instances of AVCaption to an asset writer input. -[AVAssetWriterInput -appendSampleBuffer:] will throw an exception if used when this adaptor is attached.
+#if __swift__
+API_DEPRECATED("Use AVAssetWriter.inputCaptionReceiver(for:) instead", macos(12.0, API_TO_BE_DEPRECATED), ios(18.0, API_TO_BE_DEPRECATED), macCatalyst(15.0, API_TO_BE_DEPRECATED))
+API_UNAVAILABLE(tvos, watchos, visionos)
+#else
API_AVAILABLE(macos(12.0), ios(18.0), macCatalyst(15.0)) API_UNAVAILABLE(tvos, watchos, visionos)
+#endif
@interface AVAssetWriterInputCaptionAdaptor : NSObject
{
@private
@@ -1064,64 +813,44 @@
}
AV_INIT_UNAVAILABLE
-/*!
- @method assetWriterInputCaptionAdaptorWithAssetWriterInput:
- @abstract
- Creates a new caption adaptor for writing to the specified asset writer input.
- */
+/// Creates a new caption adaptor for writing to the specified asset writer input.
+ (instancetype)assetWriterInputCaptionAdaptorWithAssetWriterInput:(AVAssetWriterInput *)input;
-/*!
- @method initWithAssetWriterInput:
- @abstract
- Creates a new caption adaptor for writing to the specified asset writer input.
-
- @discussion
- This method thows an exception for any of the following reasons:
- - input is nil
- - the input's media type is not supported (should use text or closed caption)
- - the input is already attached to an asset writer caption adaptor
- - the input has already started writing
- */
+/// Creates a new caption adaptor for writing to the specified asset writer input.
+///
+/// This method throws an exception for any of the following reasons:
+/// - input is nil
+/// - the input's media type is not supported (should use text or closed caption)
+/// - the input is already attached to an asset writer caption adaptor
+/// - the input has already started writing
- (instancetype)initWithAssetWriterInput:(AVAssetWriterInput *)input;
-/*!
- @property assetWriterInput
- @abstract
- The asset writer input that was used to initialize the receiver.
- */
+/// The asset writer input that was used to initialize the receiver.
@property (nonatomic, readonly) AVAssetWriterInput *assetWriterInput;
-/*!
- @method appendCaption:
- @abstract
- Append a single caption to be written.
- @param caption
- The caption to append.
- @result
- Returns YES if the operation succeeded, NO if it failed.
- @discussion
- If this method returns NO, check the value of AVAssetWriter.status on the attached asset writer to determine why appending failed.
-
- The start time of each caption's timeRange property must be numeric (see CMTIME_IS_NUMERIC) and must be at least as large as the start time of any previous caption (including any captions present in a group appended via -appendCaptionGroup:). In other words, the sequence of captions appended using this method must have monotonically increasing start times.
-
- The duration of each caption's timeRange property must be numeric.
- */
+/// Append a single caption to be written.
+///
+/// If this method returns NO, check the value of AVAssetWriter.status on the attached asset writer to determine why appending failed.
+///
+/// The start time of each caption's timeRange property must be numeric (see CMTIME_IS_NUMERIC) and must be at least as large as the start time of any previous caption (including any captions present in a group appended via -appendCaptionGroup:). In other words, the sequence of captions appended using this method must have monotonically increasing start times.
+///
+/// The duration of each caption's timeRange property must be numeric.
+///
+/// - Parameter caption: The caption to append.
+///
+/// - Returns: Returns YES if the operation succeeded, NO if it failed.
- (BOOL)appendCaption:(AVCaption *)caption;
-/*!
- @method appendCaptionGroup:
- @abstract
- Append a group of captions to be written.
- @param captionGroup
- @result
- Returns YES if the operation succeeded, NO if it failed.
- @discussion
- If this method returns NO, check the value of AVAssetWriter.status on the attached asset writer to determine why appending failed.
- When appending a sequence of captions groups, the start time of each group must be equal to or greater than the end time of any previous group. The easiest way to achieve this is to create the group using a caption whose duration is kCMTimeInvalid, in which case the duration will be determined by subtracting the start time of the group from the start time of the next appended group.
- When mixing calls to -appendCaptionGroup: and -appendCaption:, the start time of each group must be equal to or greater than the end time of any previous captions.
- To mark a time range containing no captions, append a group containing an empty caption array.
- */
+/// Append a group of captions to be written.
+///
+/// If this method returns NO, check the value of AVAssetWriter.status on the attached asset writer to determine why appending failed.
+/// When appending a sequence of caption groups, the start time of each group must be equal to or greater than the end time of any previous group. The easiest way to achieve this is to create the group using a caption whose duration is kCMTimeInvalid, in which case the duration will be determined by subtracting the start time of the group from the start time of the next appended group.
+/// When mixing calls to -appendCaptionGroup: and -appendCaption:, the start time of each group must be equal to or greater than the end time of any previous captions.
+/// To mark a time range containing no captions, append a group containing an empty caption array.
+///
+/// - Parameter captionGroup: The group of captions to append.
+///
+/// - Returns: Returns YES if the operation succeeded, NO if it failed.
- (BOOL)appendCaptionGroup:(AVCaptionGroup *)captionGroup;
@end
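
Since the constraints above are stated only in prose, a short hedged sketch of appending a single caption follows; the text and timing are arbitrary, and `captionInput` is assumed to be a text or closed-caption AVAssetWriterInput attached to an asset writer whose session has already started.

```objc
// Sketch; -[AVCaption initWithText:timeRange:] is declared in AVCaption.h.
AVAssetWriterInputCaptionAdaptor *captionAdaptor =
    [AVAssetWriterInputCaptionAdaptor assetWriterInputCaptionAdaptorWithAssetWriterInput:captionInput];

CMTimeRange captionRange = CMTimeRangeMake(CMTimeMake(0, 600), CMTimeMake(1200, 600)); // 0 s to 2 s
AVCaption *caption = [[AVCaption alloc] initWithText:@"Hello, world." timeRange:captionRange];

if (![captionAdaptor appendCaption:caption]) {
    // Check AVAssetWriter.status / AVAssetWriter.error on the attached writer to see why this failed.
    NSLog(@"Appending caption failed.");
}
```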
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVBase.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVBase.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVBase.h 2025-04-19 03:33:11
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVBase.h 2025-05-31 21:50:42
@@ -242,6 +242,23 @@
#endif
#endif // AVF_DEPLOYING_TO_2024_RELEASES_AND_LATER
+// Support for sendable AVURLAssetTracks
+#ifndef AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER
+ #if TARGET_OS_TV
+ #define AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER (__TV_OS_VERSION_MIN_REQUIRED >= 190000)
+ #elif TARGET_OS_WATCH
+ #define AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER (__WATCH_OS_VERSION_MIN_REQUIRED >= 120000)
+ #elif TARGET_OS_VISION
+ #define AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER (__VISION_OS_VERSION_MIN_REQUIRED >= 30000)
+ #elif TARGET_OS_IOS
+ #define AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER (__IPHONE_OS_VERSION_MIN_REQUIRED >= 190000)
+ #elif TARGET_OS_OSX
+ #define AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER (__MAC_OS_X_VERSION_MIN_REQUIRED >= 160000)
+ #else
+ #define AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER 0
+ #endif
+#endif // AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER
+
#else
#import <AVFCore/AVBase.h>
#endif
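
The new gate mirrors the existing AVF_DEPLOYING_TO_2024_RELEASES_AND_LATER pattern. A hedged sketch of how such a deployment check is typically consumed (the guarded macro name below is invented for illustration, not an AVFoundation symbol):

```objc
// Illustration only; AV_EXAMPLE_SENDABLE_TRACKS is a made-up placeholder macro.
#if AVF_DEPLOYING_TO_2025_RELEASES_AND_LATER
    // Declarations that rely on sendable AVURLAssetTracks compile only when the
    // minimum deployment target is a 2025 OS release or later.
    #define AV_EXAMPLE_SENDABLE_TRACKS 1
#else
    #define AV_EXAMPLE_SENDABLE_TRACKS 0
#endif
```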
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureAudioDataOutput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureAudioDataOutput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureAudioDataOutput.h 2025-04-19 03:47:15
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureAudioDataOutput.h 2025-05-28 07:58:25
@@ -87,6 +87,24 @@
@property(nonatomic, copy, null_resettable) NSDictionary<NSString *, id> *audioSettings API_UNAVAILABLE(ios, macCatalyst, watchos, tvos, visionos);
/*!
+ @property spatialAudioChannelLayoutTag
+ @abstract
+ Specifies the audio channel layout tag that describes the audio channel layout to be output by the AVCaptureAudioDataOutput.
+
+ @discussion
+ The value of this property is from the AudioChannelLayoutTag enumeration defined in CoreAudioBaseTypes.h. Currently, the only two supported values are kAudioChannelLayoutTag_Stereo and ( kAudioChannelLayoutTag_HOA_ACN_SN3D | 4 ), which provide a Stereo channel pair and four channels of First Order Ambisonic audio data output, respectively. The default value is kAudioChannelLayoutTag_Unknown, which results in an AudioChannelLayout determined by the AVCaptureDeviceInput's configuration.
+
+ The rules for allowed values in a given AVCaptureSession are as follows:
+
+ When the associated AVCaptureDeviceInput's multichannelAudioMode property is set to AVCaptureMultichannelAudioModeFirstOrderAmbisonics, the AVCaptureSession can support up to two AVCaptureAudioDataOutput instances. If a single AVCaptureAudioDataOutput is present it can produce either four channels of First Order Ambisonic audio or two channels of Stereo audio. If two AVCaptureAudioDataOutputs are present, one of them must output four channels of First Order Ambisonic audio and the other must output two channels of Stereo audio.
+
+ When the associated AVCaptureDeviceInput's multichannelAudioMode property is set to anything other than AVCaptureMultichannelAudioModeFirstOrderAmbisonics, there must be only one AVCaptureAudioDataOutput present in the AVCaptureSession with its spatialAudioChannelLayoutTag property set to kAudioChannelLayoutTag_Unknown or left at the default value.
+
+ These rules are validated when a client calls -[AVCaptureSession startRunning] or -[AVCaptureSession commitConfiguration]. If the validation fails, an exception will be thrown indicating the invalid setting and the session will not start running.
+ */
+@property(nonatomic) AudioChannelLayoutTag spatialAudioChannelLayoutTag API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
@method recommendedAudioSettingsForAssetWriterWithOutputFileType:
@abstract
Specifies the recommended settings for use with an AVAssetWriterInput.
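
A hedged configuration sketch for the new spatialAudioChannelLayoutTag property, assuming `session` is an AVCaptureSession that already contains an AVCaptureDeviceInput whose multichannelAudioMode is AVCaptureMultichannelAudioModeFirstOrderAmbisonics (all other setup elided):

```objc
// Sketch; error handling and the rest of the session configuration are omitted.
AVCaptureAudioDataOutput *ambisonicOutput = [[AVCaptureAudioDataOutput alloc] init];
AVCaptureAudioDataOutput *stereoOutput = [[AVCaptureAudioDataOutput alloc] init];

[session beginConfiguration];
if ([session canAddOutput:ambisonicOutput]) {
    [session addOutput:ambisonicOutput];
    // Four channels of First Order Ambisonic audio (ACN ordering, SN3D normalization).
    ambisonicOutput.spatialAudioChannelLayoutTag = kAudioChannelLayoutTag_HOA_ACN_SN3D | 4;
}
if ([session canAddOutput:stereoOutput]) {
    [session addOutput:stereoOutput];
    stereoOutput.spatialAudioChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
}
[session commitConfiguration]; // the layout rules above are validated here or at -startRunning
```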
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h 2025-04-19 03:21:27
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h 2025-05-30 07:11:06
@@ -372,6 +372,7 @@
*/
@property(nonatomic) CMTime activeVideoMaxFrameDuration API_AVAILABLE(macos(10.9), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
+
/*!
@property autoVideoFrameRateEnabled
@abstract
@@ -462,7 +463,7 @@
<true/>
Otherwise, external cameras on Mac Catalyst report that their device type is AVCaptureDeviceTypeBuiltInWideAngleCamera.
- On visionOS, your app must have the `com.apple.developer.avfoundation.uvc-device-access` entitlement in order to discover and use devices of type `AVCaptureDeviceTypeExternal`.
+ Prior to visionOS 3.0, your app must have the `com.apple.developer.avfoundation.uvc-device-access` entitlement in order to discover and use devices of type `AVCaptureDeviceTypeExternal` on visionOS.
*/
AVF_EXPORT AVCaptureDeviceType const AVCaptureDeviceTypeExternal API_AVAILABLE(macos(14.0), ios(17.0), macCatalyst(17.0), tvos(17.0), visionos(2.1)) API_UNAVAILABLE(watchos);
@@ -817,7 +818,7 @@
The active constituent device restricted switching behavior.
@discussion
- For virtual devices with multiple constituent devices, this property returns the active restricted switching behavior conditions. This is equal to primaryConstituentDeviceRestrictedSwitchingBehaviorConditions except while recording using an AVCaptureMovieFileOutput configured with different retricted switching behavior conditions (see -[AVCaptureMovieFileOutput setPrimaryConstituentDeviceSwitchingBehaviorForRecording:restrictedSwitchingBehaviorConditions]). Devices that do not support constituent device switching return AVCapturePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditionNone. This property is key-value observable.
+ For virtual devices with multiple constituent devices, this property returns the active restricted switching behavior conditions. This is equal to primaryConstituentDeviceRestrictedSwitchingBehaviorConditions except while recording using an AVCaptureMovieFileOutput configured with different restricted switching behavior conditions (see -[AVCaptureMovieFileOutput setPrimaryConstituentDeviceSwitchingBehaviorForRecording:restrictedSwitchingBehaviorConditions]). Devices that do not support constituent device switching return AVCapturePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditionNone. This property is key-value observable.
*/
@property(nonatomic, readonly) AVCapturePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditions activePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditions API_AVAILABLE(macos(12.0), ios(15.0), macCatalyst(15.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
@@ -1140,6 +1141,49 @@
@property(nonatomic) CGPoint focusPointOfInterest;
/*!
+ @property focusRectOfInterestSupported
+ @abstract
+ Indicates whether the receiver supports focus rectangles of interest.
+
+ @discussion
+ The receiver's focusRectOfInterest property can only be set if this property returns YES.
+ */
+@property(nonatomic, readonly, getter=isFocusRectOfInterestSupported) BOOL focusRectOfInterestSupported API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property minFocusRectOfInterestSize
+ @abstract
+ Returns the minimum size that can be used when specifying a rectangle of interest.
+
+ @discussion
+ The size returned is in normalized coordinates, and will depend on the current active format. If isFocusRectOfInterestSupported returns NO, this property will return { 0, 0 }.
+ */
+@property(nonatomic, readonly) CGSize minFocusRectOfInterestSize API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property focusRectOfInterest
+ @abstract
+ Indicates current focus rectangle of interest of the receiver, if it has one.
+
+ @discussion
+ The value of this property is a CGRect that determines the receiver's focus rectangle of interest, if it has one. It is used as an alternative to -setFocusPointOfInterest:, as it allows for both a location and size to be specified. A value of CGRectMake(0, 0, 1, 1) indicates that the receiver should use the entire field of view when determining the focus, while CGRectMake(0, 0, 0.25, 0.25) would indicate the top left sixteenth, and CGRectMake(0.75, 0.75, 0.25, 0.25) would indicate the bottom right sixteenth. -setFocusRectOfInterest: throws an NSInvalidArgumentException if isFocusRectOfInterestSupported returns NO. -setFocusRectOfInterest: throws an NSInvalidArgumentException if the size of the provided rectangle is smaller than that returned by minFocusRectOfInterestSize. -setFocusRectOfInterest: throws an NSGenericException if called without first obtaining exclusive access to the receiver using lockForConfiguration:. -setFocusRectOfInterest: will update the receiver's focusPointOfInterest to be the center of the rectangle of interest. If the client later sets the receiver's focusPointOfInterest, the focusRectOfInterest will reset to the default rectangle of interest for the new focus point of interest. If the client changes the activeFormat, the point of interest and rectangle of interest will revert to their default values. Clients can observe automatic changes to the receiver's focusRectOfInterest by key value observing this property. Note that setting focusRectOfInterest alone does not initiate a focus operation. After setting focusRectOfInterest, call -setFocusMode: to apply the new rectangle of interest.
+ */
+@property(nonatomic) CGRect focusRectOfInterest API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @method defaultRectForFocusPointOfInterest:
+ @abstract
+ Returns the default rectangle of interest that is used for a given focus point of interest.
+
+ @param pointOfInterest
+ Point of interest for which we are returning the default rectangle of interest.
+
+ @discussion
+ Pass (0.5, 0.5) to get the focus rectangle of interest used for the default focus point of interest at (0.5, 0.5); note that the particular default rectangle returned will depend on the current focus mode. This method returns CGRectNull if isFocusRectOfInterestSupported returns NO.
+ */
+- (CGRect)defaultRectForFocusPointOfInterest:(CGPoint)pointOfInterest API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
@property adjustingFocus
@abstract
Indicates whether the receiver is currently performing a focus scan to adjust focus.
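
The discussion above notes that setting focusRectOfInterest alone does not start a focus operation; a hedged sketch of the full sequence follows (the rectangle is arbitrary, and `device` is assumed to report isFocusRectOfInterestSupported == YES).

```objc
// Sketch; the analogous exposureRectOfInterest API added below follows the same pattern.
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    CGRect rect = CGRectMake(0.25, 0.25, 0.5, 0.5); // centered quarter of the field of view
    if (rect.size.width >= device.minFocusRectOfInterestSize.width &&
        rect.size.height >= device.minFocusRectOfInterestSize.height) {
        device.focusRectOfInterest = rect;
        if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            device.focusMode = AVCaptureFocusModeAutoFocus; // applies the new rectangle of interest
        }
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}
```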
@@ -1252,6 +1296,68 @@
*/
@property(nonatomic, readonly) NSInteger minimumFocusDistance API_AVAILABLE(macos(12.0), ios(15.0), macCatalyst(15.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+/*!
+ @enum AVCaptureCinematicVideoFocusMode
+ @abstract
+ Constants indicating the focus behavior when recording a Cinematic Video.
+
+ @constant AVCaptureCinematicVideoFocusModeNone
+ Indicates that no focus mode is specified, in which case weak focus is used as default.
+ @constant AVCaptureCinematicVideoFocusModeStrong
+ Indicates that the subject should remain in focus until it exits the scene.
+ @constant AVCaptureCinematicVideoFocusModeWeak
+ Indicates that the Cinematic Video algorithm should automatically adjust focus according to the prominence of the subjects in the scene.
+ */
+typedef NS_ENUM(NSInteger, AVCaptureCinematicVideoFocusMode) {
+ AVCaptureCinematicVideoFocusModeNone = 0,
+ AVCaptureCinematicVideoFocusModeStrong = 1,
+ AVCaptureCinematicVideoFocusModeWeak = 2,
+} NS_SWIFT_NAME(AVCaptureDevice.CinematicVideoFocusMode) API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @method setCinematicVideoTrackingFocusWithDetectedObjectID:focusMode:
+ @abstract
+ Focus on and start tracking a detected object.
+
+ @param detectedObjectID
+ ID of the detected object.
+
+ @param focusMode
+ Specify whether to focus strongly or weakly.
+
+ */
+- (void)setCinematicVideoTrackingFocusWithDetectedObjectID:(NSInteger)detectedObjectID focusMode:(AVCaptureCinematicVideoFocusMode)focusMode NS_SWIFT_NAME(setCinematicVideoTrackingFocus(detectedObjectID:focusMode:)) API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @method setCinematicVideoTrackingFocusAtPoint:focusMode:
+ @abstract
+ Focus on and start tracking an object if it can be detected at the region specified by the point.
+
+ @param point
+ A normalized point of interest (i.e., [0,1]) in the coordinate space of the device.
+
+ @param focusMode
+ Specify whether to focus strongly or weakly.
+
+ */
+- (void)setCinematicVideoTrackingFocusAtPoint:(CGPoint)point focusMode:(AVCaptureCinematicVideoFocusMode)focusMode NS_SWIFT_NAME(setCinematicVideoTrackingFocus(at:focusMode:)) API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @method setCinematicVideoFixedFocusAtPoint:focusMode:
+ @abstract
+ Fix focus at a distance.
+
+ @param point
+ A normalized point of interest (i.e., [0,1]) in the coordinate space of the device.
+
+ @param focusMode
+ Specify whether to focus strongly or weakly.
+
+ @discussion
+ The distance at which focus is set is determined internally using signals such as depth data.
+ */
+- (void)setCinematicVideoFixedFocusAtPoint:(CGPoint)point focusMode:(AVCaptureCinematicVideoFocusMode)focusMode NS_SWIFT_NAME(setCinematicVideoFixedFocus(at:focusMode:)) API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
@end
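
A short hedged sketch of the new Cinematic Video focus calls; the tap point is arbitrary, the detected-object ID is a placeholder that would come from detection metadata elsewhere in the capture pipeline, and locking the device for configuration is assumed here by analogy with the other focus APIs.

```objc
// Sketch; `device` is an AVCaptureDevice currently recording Cinematic Video.
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Weakly focus on whatever subject can be detected around the tapped point.
    [device setCinematicVideoTrackingFocusAtPoint:CGPointMake(0.5, 0.5)
                                        focusMode:AVCaptureCinematicVideoFocusModeWeak];

    // Or keep a specific detected subject in focus until it leaves the scene.
    NSInteger subjectID = 42; // placeholder; obtained from detection metadata elsewhere
    [device setCinematicVideoTrackingFocusWithDetectedObjectID:subjectID
                                                     focusMode:AVCaptureCinematicVideoFocusModeStrong];
    [device unlockForConfiguration];
}
```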
@@ -1326,6 +1432,49 @@
@property(nonatomic) CGPoint exposurePointOfInterest;
/*!
+ @property exposureRectOfInterestSupported
+ @abstract
+ Indicates whether the receiver supports exposure rectangles of interest.
+
+ @discussion
+ The receiver's exposureRectOfInterest property can only be set if this property returns YES.
+ */
+@property(nonatomic, readonly, getter=isExposureRectOfInterestSupported) BOOL exposureRectOfInterestSupported API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property minExposureRectOfInterestSize
+ @abstract
+ Returns the minimum size that can be used when specifying a rectangle of interest.
+
+ @discussion
+ The size returned is in normalized coordinates, and will depend on the current active format. If isExposureRectOfInterestSupported returns NO, this property will return { 0, 0 }.
+ */
+@property(nonatomic, readonly) CGSize minExposureRectOfInterestSize API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property exposureRectOfInterest
+ @abstract
+ Indicates the current exposure rectangle of interest of the receiver, if it has one.
+
+ @discussion
+ The value of this property is a CGRect that determines the receiver's exposure rectangle of interest, if it has one. It is used as an alternative to -setExposurePointOfInterest:, as it allows for both a location and size to be specified. A value of CGRectMake(0, 0, 1, 1) indicates that the receiver should use the entire field of view when determining the exposure, while CGRectMake(0, 0, 0.25, 0.25) would indicate the top left sixteenth, and CGRectMake(0.75, 0.75, 0.25, 0.25) would indicate the bottom right sixteenth. -setExposureRectOfInterest: throws an NSInvalidArgumentException if isExposureRectOfInterestSupported returns NO. -setExposureRectOfInterest: throws an NSInvalidArgumentException if the size of the provided rectangle is smaller than that returned by minExposureRectOfInterestSize. -setExposureRectOfInterest: throws an NSGenericException if called without first obtaining exclusive access to the receiver using lockForConfiguration:. -setExposureRectOfInterest: will update the receiver's exposurePointOfInterest to be the center of the rectangle of interest. If the client later sets the receiver's exposurePointOfInterest, the exposureRectOfInterest will reset to the default rectangle of interest for the new exposure point of interest. If the client changes the activeFormat, the point of interest and rectangle of interest will revert to their default values. Clients can observe automatic changes to the receiver's exposureRectOfInterest by key value observing this property. Note that setting exposureRectOfInterest alone does not initiate an exposure operation. After setting exposureRectOfInterest, call -setExposureMode: to apply the new rectangle of interest.
+ */
+@property(nonatomic) CGRect exposureRectOfInterest API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @method defaultRectForExposurePointOfInterest:
+ @abstract
+ Returns the default rectangle of interest that is used for a given exposure point of interest.
+
+ @param pointOfInterest
+ Point of interest for which we are returning the default rectangle of interest.
+
+ @discussion
+ Pass (0.5, 0.5) to get the exposure rectangle of interest used for the default exposure point of interest at (0.5, 0.5). This method returns CGRectNull if isExposureRectOfInterestSupported returns NO.
+ */
+- (CGRect)defaultRectForExposurePointOfInterest:(CGPoint)pointOfInterest API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
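Editorial sketch of the exposure rectangle API above, assuming `device` is an `AVCaptureDevice` already attached to a session (hypothetical variable names):

```objc
// Meter exposure over the upper-left quarter of the field of view.
if (device.isExposureRectOfInterestSupported) {
    CGRect rect = CGRectMake(0.0, 0.0, 0.5, 0.5);
    CGSize minSize = device.minExposureRectOfInterestSize;
    NSError *error = nil;
    if (rect.size.width >= minSize.width &&
        rect.size.height >= minSize.height &&
        [device lockForConfiguration:&error]) {
        device.exposureRectOfInterest = rect;
        // Setting the rectangle alone does not start an exposure operation;
        // applying an exposure mode does.
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        }
        [device unlockForConfiguration];
    }
}
```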
+/*!
@property automaticallyAdjustsFaceDrivenAutoExposureEnabled
@abstract
Indicates whether the receiver should automatically adjust face-driven auto exposure.
@@ -1656,6 +1805,7 @@
*/
AVF_EXPORT const AVCaptureWhiteBalanceGains AVCaptureWhiteBalanceGainsCurrent API_AVAILABLE(ios(8.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+
/*!
@method setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler:
@abstract
@@ -1667,7 +1817,7 @@
A block to be called when white balance gains have been set to the values specified and whiteBalanceMode is set to AVCaptureWhiteBalanceModeLocked. If setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler: is called multiple times, the completion handlers will be called in FIFO order. The block receives a timestamp which matches that of the first buffer to which all settings have been applied. Note that the timestamp is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers delivered via an AVCaptureVideoDataOutput. This parameter may be nil if synchronization is not required.
@discussion
- For each channel in the whiteBalanceGains struct, only values between 1.0 and -maxWhiteBalanceGain are supported. Gain values are normalized to the minimum channel value to avoid brightness changes (e.g. R:2 G:2 B:4 will be normalized to R:1 G:1 B:2). This method throws an NSRangeException if any of the whiteBalanceGains are set to an unsupported level. This method throws an NSGenericException if called without first obtaining exclusive access to the receiver using lockForConfiguration:.
+ Gain values are normalized to the minimum channel value to avoid brightness changes (e.g. R:2 G:2 B:4 will be normalized to R:1 G:1 B:2). For each channel in the whiteBalanceGains struct, only values between 1.0 and maxWhiteBalanceGain after normalization are supported. This method throws an NSRangeException if any of the whiteBalanceGains are set to an unsupported level. This method throws an NSGenericException if called without first obtaining exclusive access to the receiver using lockForConfiguration:.
*/
- (void)setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:(AVCaptureWhiteBalanceGains)whiteBalanceGains completionHandler:(nullable void (^)(CMTime syncTime))handler API_AVAILABLE(ios(8.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos);
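Editorial sketch of the normalization described in the discussion above (hypothetical `device`); the receiver performs the same normalization internally, this just makes the arithmetic explicit:

```objc
// R:2 G:2 B:4 normalizes to R:1 G:1 B:2; each channel must then fall within
// [1.0, device.maxWhiteBalanceGain].
AVCaptureWhiteBalanceGains gains = { .redGain = 2.0f, .greenGain = 2.0f, .blueGain = 4.0f };
float minChannel = MIN(gains.redGain, MIN(gains.greenGain, gains.blueGain));
AVCaptureWhiteBalanceGains normalized = {
    .redGain   = gains.redGain / minChannel,    // 1.0
    .greenGain = gains.greenGain / minChannel,  // 1.0
    .blueGain  = gains.blueGain / minChannel    // 2.0
};

NSError *error = nil;
if (normalized.blueGain <= device.maxWhiteBalanceGain && [device lockForConfiguration:&error]) {
    [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalized completionHandler:nil];
    [device unlockForConfiguration];
}
```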
@@ -2060,7 +2210,6 @@
AVCaptureColorSpace_AppleLog API_AVAILABLE(ios(17.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) = 3,
} API_AVAILABLE(macos(10.15), ios(10.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
-
API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
@interface AVCaptureDevice (AVCaptureDeviceColorSpaceSupport)
@@ -2145,9 +2294,9 @@
@property geometricDistortionCorrectionSupported
@abstract
Indicates that geometric distortion correction is supported by the receiver.
-
+
@discussion
- Some AVCaptureDevices benefit from geometric distortion correction (GDC), such as devices with a very wide field of view. GDC lessens the fisheye effect at the outer edge of the frame at the cost of losing a small amount of vertical and horizontal field of view. When GDC is enabled on the AVCaptureDevice (see geometricDistortionEnabled), the corrected image is upscaled to the original image size when needed. With respect to the AVCaptureDevice.videoZoomFactor API, the full viewable field of view is always represented with a videoZoomFactor of 1.0. Thus, when GDC is enabled, the AVCaptureDevice.activeFormat's field of view at videoZoomFactor = 1.0 will be different than when GDC is disabled. The smaller field of view is reported through the activeFormat's geometricDistortionCorrectedVideoFieldOfView property. Beware though that RAW photo captures never have GDC applied, regardless of the value of AVCaptureDevice.geometricDistortionCorrectionEnabled.
+ Some AVCaptureDevices benefit from geometric distortion correction (GDC), such as devices with a very wide field of view. GDC lessens the fisheye effect at the outer edge of the frame at the cost of losing a small amount of vertical and horizontal field of view. When GDC is enabled on the AVCaptureDevice (see geometricDistortionCorrectionEnabled), the corrected image is upscaled to the original image size when needed. With respect to the AVCaptureDevice.videoZoomFactor API, the full viewable field of view is always represented with a videoZoomFactor of 1.0. Thus, when GDC is enabled, the AVCaptureDevice.activeFormat's field of view at videoZoomFactor = 1.0 will be different than when GDC is disabled. The smaller field of view is reported through the activeFormat's geometricDistortionCorrectedVideoFieldOfView property. Beware though that RAW photo captures never have GDC applied, regardless of the value of AVCaptureDevice.geometricDistortionCorrectionEnabled.
*/
@property(nonatomic, readonly, getter=isGeometricDistortionCorrectionSupported) BOOL geometricDistortionCorrectionSupported API_AVAILABLE(ios(13.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
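Editorial sketch of opting in to GDC (hypothetical `device`):

```objc
if (device.isGeometricDistortionCorrectionSupported) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.geometricDistortionCorrectionEnabled = YES;
        [device unlockForConfiguration];
        // The reduced field of view at videoZoomFactor 1.0 is reported by the format.
        float correctedFOV = device.activeFormat.geometricDistortionCorrectedVideoFieldOfView;
        NSLog(@"GDC field of view: %.1f degrees", correctedFOV);
    }
}
```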
@@ -2548,12 +2697,51 @@
Indicates whether or not the current environmental conditions are amenable to a spatial capture that is comfortable to view.
@discussion
- This property can be monitored in order to determine the presentation of U/I elements to inform the user that they should reframe their scene for a more pleasing spatial capture ("subject is too close", "scene is too dark").
+ This property can be monitored in order to determine the presentation of UI elements to inform the user that they should reframe their scene for a more pleasing spatial capture ("subject is too close", "scene is too dark").
*/
@property(nonatomic, readonly) NSSet<AVSpatialCaptureDiscomfortReason> *spatialCaptureDiscomfortReasons API_AVAILABLE(macos(15.0), ios(18.0), macCatalyst(18.0), tvos(18.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
@end
+/*!
+ @group AVCaptureSceneMonitoringStatus string constants
+ @discussion
+ Some features have certain requirements on the scene (lighting conditions for Cinematic Video, for example) to produce optimal results; these AVCaptureSceneMonitoringStatus string constants are used to represent such scene statuses for a given feature.
+ */
+typedef NSString *AVCaptureSceneMonitoringStatus NS_TYPED_ENUM API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @constant AVCaptureSceneMonitoringStatusNotEnoughLight
+ The lighting of the current scene is not bright enough.
+ */
+AVF_EXPORT AVCaptureSceneMonitoringStatus const AVCaptureSceneMonitoringStatusNotEnoughLight API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+@interface AVCaptureDevice (AVCaptureDeviceCinematicVideoCapture)
+
+/*!
+ @property cinematicVideoCaptureSceneMonitoringStatuses
+ @abstract
+ Indicates the current scene monitoring statuses related to Cinematic Video capture.
+
+ @discussion
+ This property can be monitored in order to determine the presentation of UI elements to inform the user that they should reframe their scene for a better Cinematic Video experience ("scene is too dark").
+ */
+@property(nonatomic, readonly) NSSet<AVCaptureSceneMonitoringStatus> *cinematicVideoCaptureSceneMonitoringStatuses;
+
+@end
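The header says this set "can be monitored"; an editorial sketch using key-value observing (the KVO mechanism and the `device`/`self` names are assumptions):

```objc
[device addObserver:self
         forKeyPath:@"cinematicVideoCaptureSceneMonitoringStatuses"
            options:NSKeyValueObservingOptionNew
            context:NULL];

// Later, in -observeValueForKeyPath:ofObject:change:context:
if ([device.cinematicVideoCaptureSceneMonitoringStatuses
        containsObject:AVCaptureSceneMonitoringStatusNotEnoughLight]) {
    // Surface a "scene is too dark" hint in the capture UI.
}
```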
+
+@interface AVCaptureDevice (AVCaptureDeviceNominalFocalLengthIn35mmFilm)
+
+/// The nominal 35mm equivalent focal length of the capture device's lens.
+///
+/// This value represents a nominal measurement of the device's field of view, expressed as a 35mm equivalent focal length, measured diagonally. The value is similar to the `FocalLengthIn35mmFormat` EXIF entry (see <doc://com.apple.documentation/documentation/imageio/kcgimagepropertyexiffocallenin35mmfilm>) for a photo captured using the device's format where ``AVCaptureDeviceFormat/highestPhotoQualitySupported`` is `true` or when you've configured the session with the ``AVCaptureSessionPresetPhoto`` preset.
+///
+/// This property value is `0` for virtual devices and external cameras.
+@property(nonatomic, readonly) float nominalFocalLengthIn35mmFilm API_AVAILABLE(ios(26.0)) API_UNAVAILABLE(macos, macCatalyst, tvos, visionos) API_UNAVAILABLE(watchos);
+@end
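Editorial one-liner sketch (hypothetical `device`):

```objc
// A value of 0 means the focal length is unavailable (virtual devices and external cameras).
float focalLength = device.nominalFocalLengthIn35mmFilm;
if (focalLength > 0) {
    NSLog(@"Roughly a %.0f mm lens (35 mm equivalent)", focalLength);
}
```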
+
+
#pragma mark - AVCaptureDeviceDiscoverySession
/*!
@@ -2686,7 +2874,6 @@
@end
-
#pragma mark - AVExposureBiasRange
/*!
@@ -2733,6 +2920,7 @@
@end
+
#pragma mark - AVFrameRateRange
@class AVFrameRateRangeInternal;
@@ -2843,7 +3031,6 @@
@end
-
/*!
@enum AVCaptureVideoStabilizationMode
@abstract
@@ -3326,7 +3513,7 @@
*/
@property(nonatomic, readonly, getter=isCenterStageSupported) BOOL centerStageSupported API_AVAILABLE(macos(12.3), ios(14.5), macCatalyst(14.5), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
-/*
+/*!
@property videoMinZoomFactorForCenterStage
@abstract
Indicates the minimum zoom factor available for the AVCaptureDevice's videoZoomFactor property when centerStageActive is YES.
@@ -3489,6 +3676,166 @@
@end
+API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+@interface AVCaptureDeviceFormat (AVCaptureDeviceFormatCinematicVideoSupport)
+
+/*!
+ @property cinematicVideoCaptureSupported
+ @abstract
+ Indicates whether the format supports Cinematic Video capture.
+
+ @discussion
+ This property returns YES if the format supports Cinematic Video, which produces a controllable, simulated depth of field and adds beautiful focus transitions for a cinema-grade look.
+ */
+@property(nonatomic, readonly, getter=isCinematicVideoCaptureSupported) BOOL cinematicVideoCaptureSupported API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property defaultSimulatedAperture
+ @abstract
+ Default shallow depth of field simulated aperture.
+
+ @discussion
+ This will return a non-zero value on devices that support the shallow depth of field effect.
+ */
+@property(nonatomic, readonly) float defaultSimulatedAperture API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property minSimulatedAperture
+ @abstract
+ Minimum supported shallow depth of field simulated aperture.
+
+ @discussion
+ On devices that do not support changing the simulated aperture value, this will return a value of 0.
+ */
+@property(nonatomic, readonly) float minSimulatedAperture API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property maxSimulatedAperture
+ @abstract
+ Maximum supported shallow depth of field simulated aperture.
+
+ @discussion
+ On devices that do not support changing the simulated aperture value, this will return a value of 0.
+ */
+@property(nonatomic, readonly) float maxSimulatedAperture API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property videoMinZoomFactorForCinematicVideo
+ @abstract
+ Indicates the minimum zoom factor available for the AVCaptureDevice's videoZoomFactor property when Cinematic Video capture is enabled on the device input.
+
+ @discussion
+ Devices support a limited zoom range when Cinematic Video capture is active. If this device format does not support Cinematic Video capture, this property returns 1.0.
+ */
+@property(nonatomic, readonly) CGFloat videoMinZoomFactorForCinematicVideo API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property videoMaxZoomFactorForCinematicVideo
+ @abstract
+ Indicates the maximum zoom factor available for the AVCaptureDevice's videoZoomFactor property when Cinematic Video capture is enabled on the device input.
+
+ @discussion
+ Devices support a limited zoom range when Cinematic Video capture is active. If this device format does not support Cinematic Video capture, this property returns 1.0.
+ */
+@property(nonatomic, readonly) CGFloat videoMaxZoomFactorForCinematicVideo API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property videoFrameRateRangeForCinematicVideo
+ @abstract
+ Indicates the minimum / maximum frame rates available when Cinematic Video capture is enabled on the device input.
+
+ @discussion
+ Devices may support a limited frame rate range when Cinematic Video capture is active. If this device format does not support Cinematic Video capture, this property returns nil.
+ */
+@property(nonatomic, readonly, nullable) AVFrameRateRange *videoFrameRateRangeForCinematicVideo API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+@end
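Editorial sketch that finds a Cinematic-Video-capable format and reads its constraints (hypothetical `device`):

```objc
for (AVCaptureDeviceFormat *format in device.formats) {
    if (!format.isCinematicVideoCaptureSupported) {
        continue;
    }
    NSLog(@"aperture default/min/max: %.1f / %.1f / %.1f, zoom: %.2f-%.2f",
          format.defaultSimulatedAperture,
          format.minSimulatedAperture,
          format.maxSimulatedAperture,
          (double)format.videoMinZoomFactorForCinematicVideo,
          (double)format.videoMaxZoomFactorForCinematicVideo);

    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.activeFormat = format;   // select it before enabling Cinematic Video capture
        [device unlockForConfiguration];
    }
    break;
}
```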
+
+API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+@interface AVCaptureDeviceFormat (CameraLensSmudgeDetection)
+
+/*!
+ @property cameraLensSmudgeDetectionSupported
+ @abstract
+ A BOOL value specifying whether camera lens smudge detection is supported.
+
+ @discussion
+ This property returns YES if the session's current configuration supports lens smudge detection. When switching cameras or formats this property may change. When this property changes from YES to NO, cameraLensSmudgeDetectionEnabled also reverts to NO. If you've previously opted in for lens smudge detection and then change configurations, you may need to set cameraLensSmudgeDetectionEnabled = YES again.
+ */
+@property(nonatomic, readonly, getter=isCameraLensSmudgeDetectionSupported) BOOL cameraLensSmudgeDetectionSupported API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+@end
+
+API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+@interface AVCaptureDevice (CameraLensSmudgeDetection)
+/*!
+ @method setCameraLensSmudgeDetectionEnabled:detectionInterval:
+ @abstract
+ Specify whether to enable camera lens smudge detection, and the interval between detection runs.
+
+ @param cameraLensSmudgeDetectionEnabled
+ Specify whether camera lens smudge detection should be enabled.
+ @param detectionInterval
+ The interval between detection runs when detection is enabled.
+ @discussion
+ Each run of detection processes frames over a short period, and produces one detection result. Use `detectionInterval` to specify the interval between detection runs. For example, when `cameraLensSmudgeDetectionEnabled` is set to YES and `detectionInterval` is set to 1 minute, detection runs once per minute, and updates `AVCaptureCameraLensSmudgeDetectionStatus`. If `detectionInterval` is set to `kCMTimeInvalid`, detection will only run once after the session starts. If `detectionInterval` is set to `kCMTimeZero`, detection will run continuously.
+
+ AVCaptureDevice throws an NSInvalidArgumentException if the `cameraLensSmudgeDetectionSupported` property of the current active format returns NO. Transitioning from disabled (or stopped) to enabled requires a lengthy reconfiguration of the capture render pipeline, so if you intend to enable this feature, you should enable this detection before calling -[AVCaptureSession startRunning] or within -[AVCaptureSession beginConfiguration] and -[AVCaptureSession commitConfiguration] while running.
+ */
+- (void)setCameraLensSmudgeDetectionEnabled:(BOOL)cameraLensSmudgeDetectionEnabled detectionInterval:(CMTime)detectionInterval API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property cameraLensSmudgeDetectionEnabled
+ @abstract
+ The cameraLensSmudgeDetectionEnabled as set by -[AVCaptureDevice setCameraLensSmudgeDetectionEnabled:detectionInterval:].
+
+ @discussion
+ By default, this property is set to NO.
+ */
+@property(nonatomic, readonly, getter=isCameraLensSmudgeDetectionEnabled) BOOL cameraLensSmudgeDetectionEnabled API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property cameraLensSmudgeDetectionInterval
+ @abstract
+ The cameraLensSmudgeDetectionInterval as set by -[AVCaptureDevice setCameraLensSmudgeDetectionEnabled:detectionInterval:].
+
+ @discussion
+ By default, this property is set to kCMTimeInvalid.
+ */
+@property(nonatomic, readonly) CMTime cameraLensSmudgeDetectionInterval API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @enum AVCaptureCameraLensSmudgeDetectionStatus
+ @abstract
+ Constants indicating the current camera lens smudge detection status.
+
+ @constant AVCaptureCameraLensSmudgeDetectionStatusDisabled
+ Indicates that the detection is not enabled.
+ @constant AVCaptureCameraLensSmudgeDetectionStatusSmudgeNotDetected
+ Indicates that the most recent detection did not find a smudge on the camera lens.
+ @constant AVCaptureCameraLensSmudgeDetectionStatusSmudged
+ Indicates that the most recent detection found the camera lens to be smudged.
+ @constant AVCaptureCameraLensSmudgeDetectionStatusUnknown
+ Indicates that the detection result hasn't settled, commonly caused by excessive camera movement or the image content.
+ */
+typedef NS_ENUM(NSInteger, AVCaptureCameraLensSmudgeDetectionStatus) {
+ AVCaptureCameraLensSmudgeDetectionStatusDisabled = 0,
+ AVCaptureCameraLensSmudgeDetectionStatusSmudgeNotDetected = 1,
+ AVCaptureCameraLensSmudgeDetectionStatusSmudged = 2,
+ AVCaptureCameraLensSmudgeDetectionStatusUnknown = 3,
+} API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property cameraLensSmudgeDetectionStatus
+ @abstract
+ A value specifying the status of camera lens smudge detection.
+
+ @discussion
+ During the initial detection run, `cameraLensSmudgeDetectionStatus` is `AVCaptureCameraLensSmudgeDetectionStatusUnknown` until the detection result settles. Once a detection result is produced, `cameraLensSmudgeDetectionStatus` reflects the most recent detection result. This property can be key-value observed.
+ */
+@property(nonatomic, readonly) AVCaptureCameraLensSmudgeDetectionStatus cameraLensSmudgeDetectionStatus API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+@end
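Editorial sketch of enabling smudge detection once per minute and watching the status (hypothetical `session`, `device`, and `self`):

```objc
if (device.activeFormat.isCameraLensSmudgeDetectionSupported) {
    // Enable before startRunning, or inside begin/commitConfiguration while running.
    [session beginConfiguration];
    [device setCameraLensSmudgeDetectionEnabled:YES
                              detectionInterval:CMTimeMake(60, 1)];   // one run per minute
    [session commitConfiguration];

    // cameraLensSmudgeDetectionStatus is documented as key-value observable.
    [device addObserver:self
             forKeyPath:@"cameraLensSmudgeDetectionStatus"
                options:NSKeyValueObservingOptionNew
                context:NULL];
}
```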
#pragma mark - AVCaptureDeviceInputSource
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureFileOutput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureFileOutput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureFileOutput.h 2025-04-19 03:47:14
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureFileOutput.h 2025-05-28 07:58:45
@@ -545,7 +545,7 @@
When primaryConstituentDeviceSwitchingBehaviorForRecordingEnabled is set to YES, this method controls the switching behavior and conditions, while a movie file is being recorded.
@discussion
- This controls the camera selection behavior used while recording a movie, when enabled through primaryConstituentDeviceSwitchingBehaviorForRecordingEnabled. Setting the switching behavior to anything other than AVCapturePrimaryConstituentDeviceSwitchingBehaviorUnsupported when connected to an AVCaptureDevice that does not suport constituent device selection throws an NSInvalidArgumentException. Setting restrictedSwitchingBehaviorConditions to something other than AVCapturePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditionNone while setting switchingBehavior to something other than AVCapturePrimaryConstituentDeviceSwitchingBehaviorRestricted throws an NSInvalidArgumentException exception.
+ This controls the camera selection behavior used while recording a movie, when enabled through primaryConstituentDeviceSwitchingBehaviorForRecordingEnabled. Setting the switching behavior to anything other than AVCapturePrimaryConstituentDeviceSwitchingBehaviorUnsupported when connected to an AVCaptureDevice that does not support constituent device selection throws an NSInvalidArgumentException. Setting restrictedSwitchingBehaviorConditions to something other than AVCapturePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditionNone while setting switchingBehavior to something other than AVCapturePrimaryConstituentDeviceSwitchingBehaviorRestricted throws an NSInvalidArgumentException exception.
*/
- (void)setPrimaryConstituentDeviceSwitchingBehaviorForRecording:(AVCapturePrimaryConstituentDeviceSwitchingBehavior)switchingBehavior restrictedSwitchingBehaviorConditions:(AVCapturePrimaryConstituentDeviceRestrictedSwitchingBehaviorConditions)restrictedSwitchingBehaviorConditions API_AVAILABLE(macos(12.0), ios(15.0), macCatalyst(15.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h 2025-04-19 03:21:26
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h 2025-05-28 07:58:24
@@ -281,6 +281,8 @@
*/
@property(nonatomic) CMTime videoMinFrameDurationOverride API_AVAILABLE(ios(13.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+
+
/*!
@enum AVCaptureMultichannelAudioMode
@abstract
@@ -347,6 +349,57 @@
Wind noise removal is available when the AVCaptureDeviceInput multichannelAudioMode property is set to any value other than AVCaptureMultichannelAudioModeNone.
*/
@property(nonatomic, getter=isWindNoiseRemovalEnabled) BOOL windNoiseRemovalEnabled API_AVAILABLE(macos(15.0), ios(18.0), macCatalyst(18.0), tvos(18.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property cinematicVideoCaptureSupported
+ @abstract
+ A BOOL value specifying whether Cinematic Video capture is supported.
+
+ @discussion
+ With Cinematic Video capture, you get a simulated depth-of-field effect that keeps your subjects—people, pets, and more—in sharp focus while applying a pleasing blur to the background (or foreground). Depending on the focus mode (see `AVCaptureCinematicVideoFocusMode` for detail), the camera either uses machine learning to automatically detect and focus on subjects in the scene, or it fixes focus on a subject until it exits the scene. Cinematic Videos can be played back and edited using the Cinematic framework.
+
+ The simulated aperture can be adjusted before the recording starts using the simulatedAperture property. Focus transitions can be dynamically controlled using the Cinematic Video specific focus methods on AVCaptureDevice.
+
+ The resulting movie file can be played back and edited with the Cinematic framework.
+
+ This property returns YES if the session's current configuration allows Cinematic Video capture. When switching cameras or formats this property may change. When this property changes from YES to NO, cinematicVideoCaptureEnabled also reverts to NO. If you've previously opted in for Cinematic Video capture and then change configurations, you may need to set cinematicVideoCaptureEnabled = YES again. This property is key-value observable.
+
+ AVCaptureDepthDataOutput is not supported when cinematicVideoCaptureEnabled is set to true. Running an AVCaptureSession with both of these features throws an NSInvalidArgumentException.
+ */
+@property(nonatomic, readonly, getter=isCinematicVideoCaptureSupported) BOOL cinematicVideoCaptureSupported API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property cinematicVideoCaptureEnabled
+ @abstract
+ A BOOL value specifying whether the Cinematic Video effect is applied to any AVCaptureMovieFileOutput, AVCaptureVideoDataOutput, AVCaptureMetadataOutput, and AVCaptureVideoPreviewLayer added to the same capture session.
+
+ @discussion
+ Default is NO. Set to YES to enable support for Cinematic Video capture.
+
+ When set to YES, the corresponding AVCaptureDevice's focusMode will be updated to AVCaptureFocusModeContinuousAutoFocus. While this property is YES, any attempt to change the focus mode will result in an exception.
+
+ This property may only be set to YES if cinematicVideoCaptureSupported is YES. Enabling Cinematic Video capture requires a lengthy reconfiguration of the capture render pipeline, so if you intend to capture Cinematic Video, you should set this property to YES before calling -[AVCaptureSession startRunning] or within -[AVCaptureSession beginConfiguration] and -[AVCaptureSession commitConfiguration] while running.
+
+ */
+@property(nonatomic, getter=isCinematicVideoCaptureEnabled) BOOL cinematicVideoCaptureEnabled API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/*!
+ @property simulatedAperture
+ @abstract
+ Shallow depth of field simulated aperture.
+
+ @discussion
+ When capturing a Cinematic Video, use this property to control the amount of blur in the simulated depth of field effect.
+
+ This property only takes effect when cinematicVideoCaptureEnabled is set to YES.
+
+ Setting this property throws an NSRangeException if simulatedAperture is set to a value less than the AVCaptureDevice's activeFormat.minSimulatedAperture or greater than the AVCaptureDevice's activeFormat.maxSimulatedAperture. This property may only be set if AVCaptureDevice's activeFormat.minSimulatedAperture returns a non-zero value, otherwise an NSInvalidArgumentException is thrown. This property can only be set before a Cinematic Video capture starts. An NSInvalidArgumentException is thrown if simulatedAperture is set when a Cinematic Video is being captured.
+
+ This property is initialized to the device.activeFormat's defaultSimulatedAperture.
+
+ This property is key-value observable.
+ */
+@property(nonatomic) float simulatedAperture API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
@end
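Editorial sketch of opting a device input in to Cinematic Video capture before the session starts (hypothetical `session`, `input`, and `device`):

```objc
[session beginConfiguration];
if (input.isCinematicVideoCaptureSupported) {
    input.cinematicVideoCaptureEnabled = YES;

    // Adjust the simulated aperture only if the format allows it (min > 0),
    // and only before the recording starts.
    AVCaptureDeviceFormat *format = device.activeFormat;
    if (format.minSimulatedAperture > 0) {
        input.simulatedAperture = MAX(format.minSimulatedAperture,
                                      format.defaultSimulatedAperture - 1.0f);
    }
}
[session commitConfiguration];
```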
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureMetadataOutput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureMetadataOutput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureMetadataOutput.h 2025-04-19 01:55:54
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureMetadataOutput.h 2025-05-28 07:58:25
@@ -92,6 +92,8 @@
@discussion
AVCaptureMetadataOutput may detect and emit multiple metadata object types. For apps linked before iOS 7.0, the receiver defaults to capturing face metadata objects if supported (see -availableMetadataObjectTypes). For apps linked on or after iOS 7.0, the receiver captures no metadata objects by default. -setMetadataObjectTypes: throws an NSInvalidArgumentException if any elements in the array are not present in the -availableMetadataObjectTypes array.
+
+ If you've set your AVCaptureMetadataOutput's connected input's `cinematicVideoCaptureEnabled` property to YES, you must set your `metadataObjectTypes` property to `requiredMetadataObjectTypesForCinematicVideoCapture` or an NSInvalidArgumentException is thrown.
*/
@property(nonatomic, copy, null_resettable) NSArray<AVMetadataObjectType> *metadataObjectTypes;
@@ -106,6 +108,16 @@
As of iOS 13, this property can be set without requiring a lengthy rebuild of the session in which video preview is disrupted.
*/
@property(nonatomic) CGRect rectOfInterest API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+
+/*!
+ @property requiredMetadataObjectTypesForCinematicVideoCapture
+ @abstract
+ Indicates the required metadata object types when Cinematic Video capture is enabled.
+
+ @discussion
+ Since the Cinematic Video algorithm requires a particular set of metadata objects to function optimally, you must set your `metadataObjectTypes` property to this property's returned value if you've set `cinematicVideoCaptureEnabled` to YES on the connected device input.
+ */
+@property(nonatomic, readonly) NSArray<AVMetadataObjectType> *requiredMetadataObjectTypesForCinematicVideoCapture API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
@end
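Editorial sketch tying the requirement above together (hypothetical `metadataOutput` and `input`):

```objc
// With Cinematic Video enabled on the connected input, the metadata output must use
// exactly the required object types.
if (input.isCinematicVideoCaptureEnabled) {
    metadataOutput.metadataObjectTypes =
        metadataOutput.requiredMetadataObjectTypesForCinematicVideoCapture;
}
```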
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h 2025-04-19 04:24:32
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h 2025-05-28 07:29:10
@@ -114,9 +114,26 @@
*/
- (CGRect)rectForMetadataOutputRectOfInterest:(CGRect)rectInMetadataOutputCoordinates API_AVAILABLE(macos(10.15), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+/// A Boolean value that indicates whether the output supports deferred start.
+///
+/// You can only set the ``deferredStartEnabled`` property value to `true` if the output supports deferred start.
+@property(nonatomic, readonly, getter=isDeferredStartSupported) BOOL deferredStartSupported API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
+/// A Boolean value that indicates whether to defer starting this capture output.
+///
+/// When this value is `true`, the session doesn't prepare the output's resources until some time after ``startRunning`` returns. You can start the visual parts of your user interface (e.g. preview) prior to other parts (e.g. photo/movie capture, metadata output, etc.) to improve startup performance. Set this value to `false` for outputs that your app needs for startup, and `true` for the ones that it doesn't need to start immediately. For example, an ``AVCaptureVideoDataOutput`` that you intend to use for displaying preview should set this value to `false`, so that the frames are available as soon as possible.
+///
+/// By default, for apps that are linked on or after iOS 19, this property value is `true` for ``AVCapturePhotoOutput`` and ``AVCaptureFileOutput`` subclasses if supported, and `false` otherwise. When set to `true` for ``AVCapturePhotoOutput``, if you want to support multiple capture requests before running deferred start, set ``responsiveCaptureEnabled`` to `true` on that output.
+///
+/// If ``deferredStartSupported`` is `false`, setting this property value to `true` results in the system throwing an invalid argument exception.
+///
+/// - Note: Set this value before committing the configuration.
+@property(nonatomic, getter=isDeferredStartEnabled) BOOL deferredStartEnabled API_AVAILABLE(macos(26.0), ios(26.0), macCatalyst(26.0), tvos(26.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
@end
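Editorial sketch of splitting outputs between immediate and deferred start (hypothetical `session`, `videoDataOutput`, and `photoOutput`):

```objc
[session beginConfiguration];

// Preview frames are needed right away, so do not defer this output.
videoDataOutput.deferredStartEnabled = NO;

// Photo capture can come up after the preview is on screen.
if (photoOutput.isDeferredStartSupported) {
    photoOutput.deferredStartEnabled = YES;
}

[session commitConfiguration];   // set deferredStartEnabled before committing
[session startRunning];
```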
+
/*!
@enum AVCaptureOutputDataDroppedReason
@abstract
@@ -137,6 +154,7 @@
AVCaptureOutputDataDroppedReasonOutOfBuffers = 2,
AVCaptureOutputDataDroppedReasonDiscontinuity = 3,
} API_AVAILABLE(macos(10.15), ios(11.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
+
NS_ASSUME_NONNULL_END
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h 2025-04-19 06:03:25
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h 2025-05-28 07:29:40
@@ -68,7 +68,7 @@
An object conforming to the AVCapturePhotoCaptureDelegate protocol. This object's delegate methods are called back as the photo advances from capture to processing to finished delivery. May not be nil.
@discussion
- This method initiates a photo capture. The receiver copies your provided settings to prevent unintentional mutation. It is illegal to re-use settings. The receiver throws a NSInvalidArgumentException if your settings.uniqueID matches that of any previously used settings. This method is used to initiate all flavors of photo capture: single photo, RAW capture with or without a processed image (such as a JPEG), bracketed capture, and Live Photo.
+ This method initiates a photo capture. The receiver copies your provided settings to prevent unintentional mutation. It is illegal to re-use settings. The receiver throws an NSInvalidArgumentException if your settings.uniqueID matches that of any previously used settings. This method is used to initiate all flavors of photo capture: single photo, RAW capture with or without a processed image (such as a JPEG), bracketed capture, and Live Photo.
Clients need not wait for a capture photo request to complete before issuing another request. This is true for single photo captures as well as Live Photos, where movie complements of adjacent photo captures are allowed to overlap.
@@ -91,9 +91,9 @@
- portraitEffectsMatteDeliveryEnabled will automatically be disabled in AVCapturePhotoSettings
- enabledSemanticSegmentationMatteTypes will automatically be cleared in AVCapturePhotoSettings
Processed Format rules:
- - If format is non-nil, a kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be present, and both may not be present.
+ - If format is non-nil, a kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be present. You cannot specify both.
- If format has a kCVPixelBufferPixelFormatTypeKey, its value must be present in the receiver's -availablePhotoPixelFormatTypes array.
- - If format has a AVVideoCodecKey, its value must be present in the receiver's -availablePhotoCodecTypes array.
+ - If format has an AVVideoCodecKey, its value must be present in the receiver's -availablePhotoCodecTypes array.
- If format is non-nil, your delegate must respond to -captureOutput:didFinishProcessingPhotoSampleBuffer:previewPhotoSampleBuffer:resolvedSettings:bracketSettings:error:.
- If processedFileType is specified, it must be present in -availablePhotoFileTypes and must support the format's specified kCVPixelBufferPixelFormatTypeKey (using -supportedPhotoPixelFormatTypesForFileType:) or AVVideoCodecKey (using -supportedPhotoCodecTypesForFileType:).
- The photoQualityPrioritization you specify may not be a greater number than the photo output's maxPhotoQualityPrioritization. You must set your AVCapturePhotoOutput maxPhotoQualityPrioritization up front.
@@ -178,6 +178,7 @@
Not all codecs can be used for all rawPixelFormatType values and this call will show all of the possible codecs available. To check if a codec is available for a specific rawPixelFormatType and rawFileType, one should use supportedRawPhotoCodecTypesForRawPhotoPixelFormatType:fileType:.
*/
@property(nonatomic, readonly) NSArray<AVVideoCodecType> *availableRawPhotoCodecTypes API_AVAILABLE(ios(18.0), macCatalyst(18.0), tvos(18.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+
/*!
@property appleProRAWSupported
@abstract
@@ -298,6 +299,7 @@
If you wish to capture a raw photo for storage using a Bayer RAW or Apple ProRAW pixel format and to be stored in a file container, such as DNG, you must ensure that the codec type you request is valid for that file and pixel format type. If no RAW codec types are supported for a given file type and/or pixel format type, an empty array is returned. If you have not yet added your receiver to an AVCaptureSession with a video source, an empty array is returned.
*/
- (NSArray<AVVideoCodecType> *)supportedRawPhotoCodecTypesForRawPhotoPixelFormatType:(OSType)pixelFormatType fileType:(AVFileType)fileType API_AVAILABLE(ios(18.0), macCatalyst(18.0), tvos(18.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+
/*!
@method supportedRawPhotoPixelFormatTypesForFileType:
@abstract
@@ -313,6 +315,7 @@
*/
- (NSArray<NSNumber *> *)supportedRawPhotoPixelFormatTypesForFileType:(AVFileType)fileType API_AVAILABLE(ios(11.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos);
+
/*!
@enum AVCapturePhotoQualityPrioritization
@abstract
@@ -331,6 +334,7 @@
AVCapturePhotoQualityPrioritizationQuality = 3,
} API_AVAILABLE(macos(13.0), ios(13.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
/*!
@property maxPhotoQualityPrioritization
@abstract
@@ -716,6 +720,7 @@
*/
@property(nonatomic, getter=isResponsiveCaptureEnabled) BOOL responsiveCaptureEnabled API_AVAILABLE(ios(17.0), macos(14.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
/*!
@enum AVCapturePhotoOutputCaptureReadiness
@abstract
@@ -740,6 +745,7 @@
AVCapturePhotoOutputCaptureReadinessNotReadyWaitingForProcessing = 4,
} NS_SWIFT_NAME(AVCapturePhotoOutput.CaptureReadiness) API_AVAILABLE(ios(17.0), macos(14.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+
/*!
@property captureReadiness
@abstract
@@ -851,7 +857,7 @@
The AVCapturePhotoSettings.uniqueID of the settings passed to -startTrackingCaptureRequestUsingPhotoSettings:.
@discussion
- Tracking automatically stops when -[AVCapturePhotoOutput capturePhotoWithSettings:delegate] is called with a photo settings objects with the same or a newer uniqueID, but in cases where an error or other condition prevents calling -capturePhotoWithSettings:delegate tracking should be explictly stopped to ensure the captureReadiness value is up to date. When called on the main queue the delegate callback is invoked synchronously before returning to ensure shutter availability is updated immediately.
+ Tracking automatically stops when -[AVCapturePhotoOutput capturePhotoWithSettings:delegate] is called with a photo settings objects with the same or a newer uniqueID, but in cases where an error or other condition prevents calling -capturePhotoWithSettings:delegate tracking should be explicitly stopped to ensure the captureReadiness value is up to date. When called on the main queue the delegate callback is invoked synchronously before returning to ensure shutter availability is updated immediately.
*/
- (void)stopTrackingCaptureRequestUsingPhotoSettingsUniqueID:(int64_t)settingsUniqueID NS_SWIFT_NAME(stopTrackingCaptureRequest(using:));
@@ -1296,6 +1302,7 @@
One can specify desired format properties of the RAW file that will be created. Currently only the key AVVideoAppleProRAWBitDepthKey is allowed and the value to which it can be set should be from 8-16. The AVVideoCodecKey must be present in the receiver's -availableRawPhotoCodecTypes array as well as in -supportedRawPhotoCodecTypesForRawPhotoPixelFormatType:fileType:. AVVideoQualityKey (NSNumber in range [0.0,1.0]) can be optionally set and a value between [0.0,1.0] will use lossy compression with lower values being more lossy resulting in smaller file sizes but lower image quality, while a value of 1.0 will use lossless compression resulting in the largest file size but also the best quality.
*/
@property(nonatomic, copy, nullable) NSDictionary<NSString *, id> *rawFileFormat API_AVAILABLE(ios(18.0), macCatalyst(18.0), tvos(18.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+
/*!
@property processedFileType
@abstract
@@ -2201,6 +2208,7 @@
@end
+
/*!
@enum AVCaptureLensStabilizationStatus
@abstract
@@ -2224,6 +2232,7 @@
AVCaptureLensStabilizationStatusOutOfRange = 3,
AVCaptureLensStabilizationStatusUnavailable = 4,
} API_AVAILABLE(ios(11.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+
API_AVAILABLE(macos(10.15), ios(11.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
@interface AVCapturePhoto (AVCapturePhotoBracketedCapture)
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureReactions.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureReactions.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureReactions.h 2025-04-19 03:21:26
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureReactions.h 2025-05-28 07:58:24
@@ -35,7 +35,6 @@
*/
AVF_EXPORT AVCaptureReactionType AVCaptureReactionTypeThumbsDown API_AVAILABLE(macos(14.0), ios(17.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
-
/*!
@constant AVCaptureReactionTypeBalloons
@abstract
@@ -91,7 +90,7 @@
Reports the state of a reaction performed on an AVCaptureDevice.
@discussion
- AVCaptureReactionEffectState may be obtained by calling -[AVCaptureDevice reactionEffectsInProgress]. When -[AVCaptureDevice canPerformReactionEffects] returns YES, new entries are added either by calling -[AVCaptureDevice performReactionEffect:], or by gesture detection in the capture stream when AVCaptureDevice.reactionEffectGesturesEnabled. The effect rendering is done before frames are given to the capture client, and these status objects let you know when these effects are performed.
+ AVCaptureReactionEffectState may be obtained by calling -[AVCaptureDevice reactionEffectsInProgress]. When -[AVCaptureDevice canPerformReactionEffects] returns YES, new entries are added either by calling -[AVCaptureDevice performEffectForReaction:], or by gesture detection in the capture stream when AVCaptureDevice.reactionEffectGesturesEnabled. The effect rendering is done before frames are given to the capture client, and these status objects let you know when these effects are performed.
*/
API_AVAILABLE(macos(14.0), ios(17.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
@interface AVCaptureReactionEffectState : NSObject
@@ -124,7 +123,6 @@
@property(nonatomic, readonly) CMTime endTime API_AVAILABLE(macos(14.0), ios(17.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
@end
-
NS_ASSUME_NONNULL_END
diff -ruN /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h
--- /Applications/Xcode_16.4.0.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h 2025-04-19 03:21:26
+++ /Applications/Xcode_26.0.0-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h 2025-05-30 23:05:04
@@ -83,6 +83,8 @@
An interruption caused when the app is running in a multi-app layout, causing resource contention and degraded recording quality of service. Given your present AVCaptureSession configuration, the session may only be run if your app occupies the full screen.
@constant AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableDueToSystemPressure
An interruption caused by the video device temporarily being made unavailable due to system pressure, such as thermal duress. See AVCaptureDevice's AVCaptureSystemPressure category for more information.
+ @constant AVCaptureSessionInterruptionReasonSensitiveContentMitigationActivated
+ An interruption caused by an SCVideoStreamAnalyzer when it detects sensitive content on an associated AVCaptureDeviceInput. To resume your capture session, call your analyzer's `continueStream` method.
*/
typedef NS_ENUM(NSInteger, AVCaptureSessionInterruptionReason) {