Support HDR video playback, editing, and export in your app


  • Apple
  • February 12, 2021



You can help people create more vivid and true-to-life videos when you support high dynamic range (HDR) in your app. And when you support HDR with Dolby Vision, people with iPhone 12 or iPhone 12 Pro can go even further, recording, editing, and playing movie-quality videos directly on their device. Dolby Vision tone mapping is applied dynamically to each frame, preserving the original look of the footage.

Here’s how to provide the best HDR video playback, editing, and exporting experience.

Start with HDR video

Your app must run on iOS 14.1 or later to take advantage of HDR video. To get started, we recommend watching a few WWDC sessions that provide a good overview of the process.

Export HDR media in your app with AVFoundation

Edit and play back HDR video with AVFoundation


Note: The iPhone 12 and iPhone 12 Pro record HDR video in Dolby Vision Profile 8.4, cross-compatibility ID 4 (HLG), using the HEVC (10-bit) codec. This format is designed to be HLG-compatible, allowing existing HEVC decoders to decode it as HLG. The Camera app records video as a QuickTime File Format (QTFF) movie (.mov extension). Signaling Dolby Vision in a QTFF movie is similar to signaling Dolby Vision streams within the ISO base media file format.


Learn more about Dolby Vision profiles

Learn more about Dolby Vision levels

Learn more about Dolby Vision Streams

Support HDR video playback in your app

Both iOS and macOS support HDR video playback on all eligible devices. Use the eligibleForHDRPlayback class property on AVPlayer to check for HDR playback support on the current device. In general, the AVPlayer, AVPlayerLayer, or AVSampleBufferDisplayLayer classes can be used to play Dolby Vision video. If your app uses AVPlayer, you don't need to add anything to your code: the AVFoundation framework automatically sets up an HDR playback pipeline to handle Dolby Vision Profile 8.4 when it detects a Dolby Vision asset and the device supports HDR playback.
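A minimal sketch of this check might look like the following; the file path is a placeholder for your own HDR movie:

```swift
import AVFoundation

// Check whether the current device supports HDR playback.
if AVPlayer.eligibleForHDRPlayback {
    // Placeholder path; substitute your own Dolby Vision asset.
    let asset = AVAsset(url: URL(fileURLWithPath: "movie.mov"))
    let item = AVPlayerItem(asset: asset)
    // AVPlayer configures the HDR pipeline automatically for
    // Dolby Vision Profile 8.4 assets; no extra setup is needed.
    let player = AVPlayer(playerItem: item)
    player.play()
} else {
    // Fall back to SDR playback on devices without HDR support.
}
```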

If your app uses AVSampleBufferDisplayLayer to render video, make sure that the sample buffers passed to the display layer have formats appropriate for HDR and that they carry the per-frame Dolby Vision Profile 8.4 metadata. These sample buffers must have a bit depth of 10 bits or more. A commonly used 10-bit format is the Y'CbCr 4:2:0 video range, represented by kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange. The associated OSType for this pixel format is 'x420'.

If your sample buffers are decoded using VTDecompressionSession, you can propagate Dolby Vision Profile 8.4 per-frame metadata to the buffers using kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata. This property is true by default.
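As a sketch, the property can be set on an existing decompression session with VTSessionSetProperty (session creation is omitted here for brevity):

```swift
import VideoToolbox

// Explicitly enable propagation of per-frame HDR display metadata
// on a decompression session. This is already the default; pass
// kCFBooleanFalse instead to disable propagation.
func configureMetadataPropagation(for session: VTDecompressionSession) {
    VTSessionSetProperty(
        session,
        key: kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata,
        value: kCFBooleanTrue)
}
```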

Asset inspection
AVMediaCharacteristic offers options for specifying media type characteristics, such as whether a video includes HDR metadata. In Swift, you can use the containsHDRVideo media characteristic to identify whether a segment of a track contains HDR so that your app can render it correctly. In Objective-C, you can use AVMediaCharacteristicContainsHDRVideo, defined in AVMediaFormat.h.

After loading the track property using the Swift method loadValuesAsynchronously(forKeys:completionHandler:), you can get the HDR tracks using tracks(withMediaCharacteristic:). Here's how to get all the HDR tracks:

let hdrTracks = asset.tracks(withMediaCharacteristic: .containsHDRVideo)

Similarly, you can use the Objective-C method loadValuesAsynchronouslyForKeys:completionHandler: to load the track property and get the HDR tracks with the tracksWithMediaCharacteristic: method, like this:

NSArray<AVAssetTrack *> *hdrTracks 
= [asset tracksWithMediaCharacteristic:AVMediaCharacteristicContainsHDRVideo];
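Putting the loading and querying steps together, a minimal Swift sketch might look like this (the file path is a placeholder):

```swift
import AVFoundation

// Placeholder path; substitute your own HDR movie file.
let asset = AVAsset(url: URL(fileURLWithPath: "movie.mov"))

// Load the "tracks" property before querying it.
asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
    var error: NSError?
    guard asset.statusOfValue(forKey: "tracks", error: &error) == .loaded else {
        return // Handle the load failure.
    }
    let hdrTracks = asset.tracks(withMediaCharacteristic: .containsHDRVideo)
    print("Found \(hdrTracks.count) HDR track(s)")
}
```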

The hasMediaCharacteristic(_:) method can be used to check a track's media characteristics, such as HDR, whether derived from format descriptions or from explicit tagging. For example:

if track.hasMediaCharacteristic(.containsHDRVideo) {

}

In Objective-C, you can use the same hasMediaCharacteristic: method, as shown here:

if ([track hasMediaCharacteristic:AVMediaCharacteristicContainsHDRVideo]) {

}

Support HDR video editing and preview in your app

To add HDR content editing to your app, use AVVideoComposition. If you use the built-in compositor, you can also use the Swift initializer init(asset:applyingCIFiltersWithHandler:) or the Objective-C initializer videoCompositionWithAsset:applyingCIFiltersWithHandler: with built-in CIFilters to easily incorporate an HDR editing pipeline into your app.
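As a sketch, a filtering composition for an HDR asset might be built like this; the file path and the exposure value are purely illustrative:

```swift
import AVFoundation
import CoreImage

// Placeholder path; substitute your own HDR asset.
let asset = AVAsset(url: URL(fileURLWithPath: "movie.mov"))

let composition = AVMutableVideoComposition(asset: asset) { request in
    // Apply a built-in Core Image filter to each source frame.
    let filter = CIFilter(name: "CIExposureAdjust")!
    filter.setValue(request.sourceImage, forKey: kCIInputImageKey)
    filter.setValue(0.5, forKey: kCIInputEVKey)
    if let output = filter.outputImage {
        request.finish(with: output, context: nil)
    } else {
        request.finish(with: NSError(domain: AVFoundationErrorDomain,
                                     code: AVError.unknown.rawValue))
    }
}
// Assign `composition` to an AVPlayerItem for preview,
// or to an AVAssetExportSession for export.
```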

Custom compositors can also support HDR content: use the supportsHDRSourceFrames property to indicate HDR capability. In Objective-C, the supportsHDRSourceFrames property is part of the AVVideoCompositing protocol defined in AVVideoCompositing.h.

If your custom compositor works with 10-bit HDR pixel formats, you'll need to declare the pixel buffer attributes it can accept as input using the sourcePixelBufferAttributes property. In Objective-C, this property is also in AVVideoCompositing.h. The value of this property is a dictionary containing pixel buffer attribute keys defined in the CoreVideo header file CVPixelBuffer.h.

In addition, to create new buffers to process, the render context needs the pixel buffer attributes required by the video compositor. For this purpose, use the requiredPixelBufferAttributesForRenderContext property.
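A skeleton of an HDR-capable custom compositor, under the assumptions above, might look like this; the rendering in startRequest(_:) is left as a pass-through for brevity:

```swift
import AVFoundation
import CoreVideo

final class HDRCompositor: NSObject, AVVideoCompositing {
    // Advertise HDR capability to AVFoundation.
    var supportsHDRSourceFrames: Bool { true }

    // Accept 10-bit 4:2:0 ('x420') source buffers as input.
    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange]
    }

    // Request 10-bit buffers from the render context as well.
    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Pass the first source frame through unchanged.
        if let trackID = request.sourceTrackIDs.first,
           let frame = request.sourceFrame(byTrackID: trackID.int32Value) {
            request.finish(withComposedVideoFrame: frame)
        } else {
            request.finish(with: NSError(domain: AVFoundationErrorDomain,
                                         code: AVError.unknown.rawValue))
        }
    }
}
```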

If your app provides video preview during editing, changing pixel values may invalidate the existing dynamic video metadata. Because Dolby Vision Profile 8.4 metadata is fully transparent, you can use AVPlayerItem to remove the invalidated metadata during preview-only scenarios, as well as update the dynamic metadata during export to reflect changes in the video content.

You can use the appliesPerFrameHDRDisplayMetadata property of AVPlayerItem, which is true by default. In Objective-C, the property defaults to YES and is declared in AVPlayerItem.h.

By default, AVFoundation will try to use a video's Dolby Vision metadata, but you can tell your app to ignore it by setting the appliesPerFrameHDRDisplayMetadata property of AVPlayerItem to false in Swift, or NO in Objective-C. If your app uses the VTDecompressionSession API from VideoToolbox, you can turn off Dolby Vision tone mapping with kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata. To use this property in C or Objective-C, be sure to include the VideoToolbox framework header VTDecompressionProperties.h.
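For a preview-only scenario, a minimal sketch in Swift (with a placeholder path) might be:

```swift
import AVFoundation

// Ignore the (now invalid) per-frame Dolby Vision metadata
// while the user edits pixel values.
let item = AVPlayerItem(url: URL(fileURLWithPath: "movie.mov"))
item.appliesPerFrameHDRDisplayMetadata = false  // default is true
let player = AVPlayer(playerItem: item)
```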

Support HDR export in your app

You can support HDR video export in your app when working with AVAssetExportSession and HEVC presets, or with AVAssetWriter.

Explore presets and AVAssetExportSession
All HEVC presets have been updated to support HDR. The output format matches the source format, so if the source file is Dolby Vision Profile 8.4, the exported movie will also be Dolby Vision Profile 8.4. If you need to change the output format, you can use AVAssetWriter instead.


Note: H.264 presets will convert HDR to standard dynamic range (SDR).
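
Putting the preset workflow together, a minimal export sketch might look like this; the source and destination URLs are placeholders:

```swift
import AVFoundation

// Placeholder paths; substitute your own source and destination.
let asset = AVAsset(url: URL(fileURLWithPath: "source.mov"))
guard let session = AVAssetExportSession(
    asset: asset,
    presetName: AVAssetExportPresetHEVCHighestQuality) else {
    fatalError("HEVC preset unavailable on this device")
}
session.outputURL = URL(fileURLWithPath: "export.mov")
session.outputFileType = .mov
session.exportAsynchronously {
    if session.status == .completed {
        // A Dolby Vision Profile 8.4 source yields a
        // Dolby Vision Profile 8.4 export.
    }
}
```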


To preserve Dolby Vision Profile 8.4 during export with AVAssetWriter, you must choose an appropriate output format, Dolby Vision-compatible color properties, and a 10-bit profile level.

To get started, consider querying supportedOutputSettingsKeys(for:) in Swift or supportedOutputSettingsKeysForConnection: in Objective-C, which provides a list of output settings keys compatible with your current device.

For Dolby Vision export, the video output settings dictionary key AVVideoCompressionPropertiesKey lets you control bit rate, B-frame delivery, I-frame interval, and codec quality. The value associated with this key is an instance of NSDictionary. In Objective-C, this key is declared in AVVideoSettings.h.

For example, a video output settings dictionary for Dolby Vision in Swift would contain these key/value pairs:

let videoOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    AVVideoCompressionPropertiesKey: [
        kVTCompressionPropertyKey_HDRMetadataInsertionMode: kVTHDRMetadataInsertionMode_Auto
    ]
]

In Objective-C, your video output settings dictionary would contain the same key/value pairs:

NSDictionary<NSString*, id>* videoOutputSettings = @{
    AVVideoCodecKey: AVVideoCodecTypeHEVC,
    AVVideoProfileLevelKey: (__bridge NSString*)kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: @{
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    },
    AVVideoCompressionPropertiesKey: @{
        (__bridge NSString*)kVTCompressionPropertyKey_HDRMetadataInsertionMode: (__bridge NSString*)kVTHDRMetadataInsertionMode_Auto
    }
};

In Objective-C, the key kVTCompressionPropertyKey_HDRMetadataInsertionMode and the value kVTHDRMetadataInsertionMode_Auto are declared in VTCompressionProperties.h.

In addition to defining the key/value pairs, make sure that the pixel buffers appended to AVAssetWriterInput use the 10-bit Y'CbCr 4:2:0 video-range format represented by the OSType 'x420'.

You can also choose an AVAssetReader and AVAssetWriter export model. In this case, you can use the VideoToolbox property kVTCompressionPropertyKey_PreserveDynamicHDRMetadata, setting it to kCFBooleanFalse in C/Objective-C or false in Swift. When you configure this property, AVAssetWriter will automatically recalculate the Dolby Vision Profile 8.4 metadata for the exported file. Do this when your app modifies the output frames from AVAssetReader.
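As a sketch, the earlier Swift output settings could be extended for a reader/writer pipeline that modifies frame content:

```swift
import AVFoundation
import VideoToolbox

// Output settings that ask AVAssetWriter to recalculate the
// Dolby Vision Profile 8.4 metadata instead of preserving the
// original, since the pipeline modifies frame content.
let videoOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    AVVideoCompressionPropertiesKey: [
        kVTCompressionPropertyKey_HDRMetadataInsertionMode: kVTHDRMetadataInsertionMode_Auto,
        // Recalculate rather than preserve per-frame metadata.
        kVTCompressionPropertyKey_PreserveDynamicHDRMetadata: false
    ]
]
let writerInput = AVAssetWriterInput(mediaType: .video,
                                     outputSettings: videoOutputSettings)
```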

Resources

Learn more about AVFoundation

AVFoundation

VideoToolbox

Learn more about Dolby Vision profiles
