Streaming

Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Posts under Streaming subtopic

Playback Issues for DRM content when sending CMCD
Since iOS and tvOS 18, CMCD can now be automatically sent by AVPlayer (https://developer.apple.com/streaming/Whats-new-HLS.pdf). However, after enabling CMCD, our streams occasionally fail with the following error:

CoreMediaErrorDomain Error -17383

This issue appears to affect only DRM-protected (FairPlay) streams so far. We activate CMCD via the resource loader of an AVURLAsset, before assigning the item to an AVPlayer. Unfortunately, we haven’t found a reliable way to reproduce the issue, and we’ve been unable to gather any useful diagnostic information. Has anyone else observed this behavior when enabling CMCD on FairPlay streams?
3 replies · 0 boosts · 441 views · Oct ’25

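For context, a minimal Swift sketch of the activation step the post describes: enabling CMCD on the AVURLAsset's resource loader before the item goes to AVPlayer. The property name is an assumption inferred from the post, not a confirmed spelling, so verify it against the current AVAssetResourceLoader documentation; the URL is a placeholder.

import AVFoundation

// Hedged sketch: enable automatic CMCD on the asset's resource loader before the
// item is handed to AVPlayer. The property name below is an assumption based on
// the post's description; check the current AVAssetResourceLoader headers.
let asset = AVURLAsset(url: URL(string: "https://example.com/master.m3u8")!)   // placeholder URL
asset.resourceLoader.sendsCommonMediaClientDataAsHTTPHeaders = true            // assumed API name

let playerItem = AVPlayerItem(asset: asset)   // FairPlay key handling attaches elsewhere (AVContentKeySession)
let player = AVPlayer(playerItem: playerItem)
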
Support for encrypted MP4 on Apple devices.
Hello, hope everybody is doing well. I have some reels content (aspect ratio 9:16) which I want to play back on iOS phones. My question is: does AVPlayer support out-of-the-box playback of encrypted MP4? Please note, this is not HLS fMP4, but rather unfragmented MP4 content. If it is supported, what encryption algorithm should be used? Please let me know.
1 reply · 0 boosts · 110 views · Mar ’25

AVAssetWriterInputTaggedPixelBufferGroupAdaptor Hanging With Tagged Buffers
We've successfully implemented an AVAssetWriter to produce HLS streams (all code is Objective-C++ for interop with an existing codebase) but are struggling to extend the operations to use tagged buffers. We're starting to wonder if the tagged buffers required for an MV-HEVC signal are fully supported when producing HLS segments in a live-stream setting.

We generate a live stream of data using something like:

UTType *t = [UTType typeWithIdentifier:AVFileTypeMPEG4];
m_writter = [[AVAssetWriter alloc] initWithContentType:t];

// - videoHint describes HEVC and width/height
// - m_videoConfig includes compression settings and, when using MV-HEVC,
//   the correct keys are added (i.e. kVTCompressionPropertyKey_MVHEVCVideoLayerIDs).
//   The app was throwing an exception without these, which was useful to know
//   when we got the configuration right.
m_video = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                         outputSettings:m_videoConfig
                                       sourceFormatHint:videoHint];

For either path we're producing CVPixelBufferRefs that contain the raw pixel information (i.e. 32BGRA), so we use an adaptor to make that as simple as possible. If we use a single view and an AVAssetWriterInputPixelBufferAdaptor, things work out very well: we produce segments and the delegate is called. However, if we use the AVAssetWriterInputTaggedPixelBufferGroupAdaptor as shown in the SideBySideToMVHEVC demo project, things go poorly.

We create the tagged buffers with something like:

CMTagCollectionRef collections[2];

CMTag leftTags[] = {
    CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)0),
    CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_LeftEye)
};
CMTagCollectionCreate(kCFAllocatorDefault, leftTags, 2, &(collections[0]));

CMTag rightTags[] = {
    CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)1),
    CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_RightEye)
};
CMTagCollectionCreate(kCFAllocatorDefault, rightTags, 2, &(collections[1]));

CFArrayRef tagCollections = CFArrayCreate(kCFAllocatorDefault, (const void **)collections, 2, &kCFTypeArrayCallBacks);

CVPixelBufferRef buffers[] = {*b, *alt};
CFArrayRef b = CFArrayCreate(kCFAllocatorDefault, (const void **)buffers, 2, &kCFTypeArrayCallBacks);

CMTaggedBufferGroupRef bufferGroup;
OSStatus res = CMTaggedBufferGroupCreate(kCFAllocatorDefault, tagCollections, b, &bufferGroup);

Perhaps there's something about this Obj-C code that I've buggered up? Hopefully! Anyway, when I submit this tagged buffer group to the adaptor:

if (![mvVideoAdapter appendTaggedPixelBufferGroup:bufferGroup withPresentationTime:pts]) {
    // report error...
}

appending does not raise any errors - eventually it just hangs on us and we never return from it.

Real issue: so either
1. the delegate assigned to the AVAssetWriter doesn't fire its assetWriter callback, which should produce the segments, or
2. the adaptor hangs on appendTaggedPixelBufferGroup before a segment is ready to be completed (but succeeds for a number of buffer groups before this happens).

This is the same delegate class that's assigned to the non-multi-view code path when MV-HEVC is turned off, which works perfectly.
1 reply · 0 boosts · 110 views · Apr ’25

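Not a diagnosis of the hang above, but for comparison, a Swift sketch of the readiness-driven append loop that avoids pushing buffer groups while the input is not ready. The queue label and both closures are placeholders; appendGroup is expected to wrap the adaptor's appendTaggedPixelBufferGroup:withPresentationTime: call quoted in the post.

import AVFoundation
import CoreMedia

// Hedged sketch: append tagged buffer groups only while the input reports it is
// ready. nextTaggedBufferGroup and appendGroup are hypothetical hooks supplied
// by the caller; the queue label is arbitrary.
func startFeeding(writer: AVAssetWriter,
                  videoInput: AVAssetWriterInput,
                  nextTaggedBufferGroup: @escaping () -> (CMTaggedBufferGroup, CMTime)?,
                  appendGroup: @escaping (CMTaggedBufferGroup, CMTime) -> Bool) {
    let queue = DispatchQueue(label: "mvhevc.feed")
    videoInput.requestMediaDataWhenReady(on: queue) {
        while videoInput.isReadyForMoreMediaData {
            guard writer.status == .writing else { return }   // stop on .failed or .cancelled
            guard let (group, pts) = nextTaggedBufferGroup() else {
                videoInput.markAsFinished()                    // source exhausted
                return
            }
            if !appendGroup(group, pts) {
                // Append failed; writer.error usually explains why.
                return
            }
        }
    }
}

Checking writer.status and writer.error around each append is a cheap first step when an append never returns.
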
FPS Certificate Re-issuance and Validity of Existing Certificate
Keywords: FairPlay, FPS Certificate, DRM, FairPlay Streaming, license server

Hi all, we are currently using FairPlay Streaming in production and already have an FPS certificate in place. However, the passphrase for the existing FPS certificate has unfortunately been lost. We are now considering reissuing a new FPS certificate, and I would like to confirm a few points before proceeding:

1. If we reissue a new FPS certificate, will the existing certificate be automatically revoked? Or will it remain valid until its original expiration date?
2. Is it possible to have both the newly issued and the existing certificates valid at the same time? In other words, can we serve DRM licenses using either certificate depending on the packaging or client?
3. Are there any caveats or best practices we should be aware of when reissuing an FPS certificate? For example, would existing packaged content become unplayable, or would CDN/packaging server configurations need to be updated carefully?

Since this affects our production environment, we would like to minimize any service disruption or compatibility issues. Unfortunately, when we contacted Apple support directly, we were advised to post this question here in the Forums for additional guidance. Any advice or experiences would be greatly appreciated! Thank you in advance.
1 reply · 0 boosts · 187 views · Jun ’25

Disconnect from AirPlay device programmatically
Hello there, I'm trying to implement a feature that uses AirPlay with Apple TV. I want to disconnect from the device programmatically when something happens. By "something" I mean a situation where the user wants to stop broadcasting (for example, closing the PiP window on their phone). I use this snippet:

try audioSession.setCategory(.playAndRecord, options: .defaultToSpeaker)
try audioSession.setActive(true, options: .notifyOthersOnDeactivation)

It works fine sometimes, but not always (it works on iOS 18 but it doesn't on iOS 17). So I thought it was a bug and created a ticket in Feedback Assistant (FB21220013). Support told me to write a post on the forum.
2 replies · 0 boosts · 320 views · Dec ’25

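For reference, a sketch of the audio-session reset the snippet above is attempting, with error handling and an explicit speaker override added. This only affects audio routing and, as the post notes, behaves differently across iOS versions, so treat it as an experiment rather than a confirmed way to tear down an AirPlay route.

import AVFoundation

// Hedged sketch: try to pull playback back to the local device after the user
// stops broadcasting. Mirrors the snippet in the post; behavior varies between
// iOS releases.
func routeAudioBackToDevice() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
        try session.setActive(true, options: [.notifyOthersOnDeactivation])
        try session.overrideOutputAudioPort(.speaker)   // force the built-in speaker for the audio route
    } catch {
        print("Failed to reset the audio route: \(error)")
    }
}
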
Facing issue with FairPlay Streaming Server SDK 26.0.0
I am trying to build the server for testing on Linux (AlmaLinux 9 VM):

NAME="AlmaLinux"
VERSION="9.7 (Moss Jungle Cat)"
ID="almalinux"
ID_LIKE="rhel centos fedora"
VERSION_ID="9.7"
PLATFORM_ID="platform:el9"
PRETTY_NAME="AlmaLinux 9.7 (Moss Jungle Cat)"
ANSI_COLOR="0;34"

[azuki@AlmaDevVM ~]$ uname -m
x86_64

I have tried the following steps.

Before starting, ensured that Swift 6 is installed (referred to https://www.swift.org/install/ for instructions).

Build the library. In Terminal, used the following commands to compile the Swift library:

cd Development/Key_Server_Module/Swift
swift build -Xbuild-tools-swiftc -DTEST_CREDENTIALS

After building the library, ran test cases to ensure the library behaves as expected. All unit tests are passing with the development credentials. Since I was using an x86_64 machine:

export LD_LIBRARY_PATH=./Sources/prebuilt/x86_64-unknown-linux-gnu/

Run all tests:

swift test -Xbuild-tools-swiftc -DTEST_CREDENTIALS --disable-swift-testing

Build the server (Apache). Before starting, ensured the following:

a. Installed Apache HTTPD and the dev tools, using the following command:

yum install httpd httpd-devel redhat-rpm-config

b. Integrated the Swift library built above into the Apache server environment. Used the following command to build the server using apxs (x86_64 machine):

apxs -i -a -c -Wl,-L${PWD}/.build/x86_64-unknown-linux-gnu/debug/ -Wl,-lswift_fpssdk -Wl,-L${PWD}/Sources/prebuilt/x86_64-unknown-linux-gnu -lfpscrypto -Wl,-R${PWD}/.build/x86_64-unknown-linux-gnu/debug server_setup/mod_fps.c

c. Copied the dependent libraries to the Apache modules folder (x86_64 machine):

cp Sources/prebuilt/x86_64-unknown-linux-gnu/libfpscrypto.so /usr/lib64/httpd/modules/libfpscrypto.so
cp .build/x86_64-unknown-linux-gnu/debug/libswift_fpssdk.so /usr/lib64/httpd/modules/libswift_fpssdk.so

d. Configured Apache HTTPD by adding the module and handler to the Apache HTTPD configuration (/etc/httpd/conf/httpd.conf). Note that the apxs command may automatically add the LoadModule line in the previous step.

Listen 8080
LoadFile /usr/lib64/httpd/modules/libfpscrypto.so
LoadFile /usr/lib64/httpd/modules/libswift_fpssdk.so
LoadModule fps_module /usr/lib64/httpd/modules/mod_fps.so
<Location "/fps">
SetHandler fps_handler

Copied the credentials to the Apache modules folder:

cp -r ../credentials /usr/lib64/httpd/modules/
export FPS_CERT_PATH= /usr/lib64/httpd/modules/credentials/test_certificates.json

e. Ran the server. You can run the Apache HTTPD server with the configured module by using the following command:

httpd -D FOREGROUND

No issues seen up to this step. Getting the SDK version works:

[azuki@AlmaDevVM Key_Server_Module]$ curl localhost:8080/fps/v
26.0.0

But when I try to generate a license:

[azuki@AlmaDevVM Key_Server_Module]$ curl -d ../Test_Inputs/iOS/spc_ios_hd_lease_2048.json localhost:8080/fps
{"fairplay-streaming-response":{"create-ckc":[{"id":1,"status":-42601}]}}

Can you please suggest what I might be missing here?
8 replies · 0 boosts · 941 views · 2d ago

How to consume video from an RTSP service?
Hi,

It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wifi network where the only other thing on the network is the device that is serving the RTSP stream).

Having read everything I can get my hands on and exploring third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.

3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.

4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.

What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, advice would be greatly appreciated.

Thanks,
Frank
9 replies · 1 boost · 16k views · Oct ’25

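On idea (4) in the post above: the hardware decoder is exposed through VideoToolbox rather than CoreVideo, and it can decode the H.264 access units that an RTSP/RTP layer (ffmpeg or otherwise) extracts. A sketch is below, assuming you can already produce a CMVideoFormatDescription from the stream's SPS/PPS (e.g. with CMVideoFormatDescriptionCreateFromH264ParameterSets) and wrap each AVCC-framed access unit in a CMSampleBuffer; it replaces the software decode only and does not make the stream an AVAsset.

import VideoToolbox
import CoreMedia
import CoreVideo

// Hedged sketch: hardware-decode H.264 access units with VideoToolbox.
// formatDescription and sampleBuffer are assumed to come from your RTSP layer.
final class HardwareDecoder {
    private var session: VTDecompressionSession?

    init?(formatDescription: CMVideoFormatDescription) {
        let status = VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: formatDescription,
            decoderSpecification: nil,
            imageBufferAttributes: nil,
            outputCallback: nil,                  // nil because the block-based decode call below is used
            decompressionSessionOut: &session)
        guard status == noErr, session != nil else { return nil }
    }

    // Decodes one AVCC-framed access unit wrapped in a CMSampleBuffer.
    func decode(_ sampleBuffer: CMSampleBuffer,
                onFrame: @escaping (CVImageBuffer, CMTime) -> Void) {
        guard let session = session else { return }
        let decodeStatus = VTDecompressionSessionDecodeFrame(
            session,
            sampleBuffer: sampleBuffer,
            flags: [],                            // synchronous decode keeps the sketch simple
            infoFlagsOut: nil) { status, _, imageBuffer, presentationTime, _ in
                guard status == noErr, let imageBuffer = imageBuffer else { return }
                onFrame(imageBuffer, presentationTime)
        }
        if decodeStatus != noErr {
            print("Decode failed: \(decodeStatus)")
        }
    }

    deinit {
        if let session = session { VTDecompressionSessionInvalidate(session) }
    }
}

The decoded CVImageBuffers can be rendered with Core Image or Metal; alternatively, compressed CMSampleBuffers can be enqueued directly on an AVSampleBufferDisplayLayer, which decodes and displays them without an explicit decompression session.
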
macOS Sonoma 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me:

Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’
UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’},
NSLocalizedFailureReason=The decoder required for this media cannot be found.,
AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}

Thanks!
2 replies · 1 boost · 659 views · Mar ’25

Regarding FPS SDK Upgrade
Hi Apple Team, We have had FairPlay Streaming Server SDK v3 integrated into our MDRM platform since 2017; the system has been stable and has stayed untouched. As you know, both Widevine and PlayReady require their server SDKs to be upgraded regularly. We want to know whether Apple imposes similar requirements for upgrading the FPS SDK, or whether we may continue using the old one without any updates. Thanks for your support!
0 replies · 1 boost · 379 views · Feb ’25

How to delete FPS Certificate from Apple developer account
Hello All, I am looking for assistance with our FairPlay Streaming (FPS) certificates. We are in the process of migrating to a new video streaming vendor and need to create a new FPS certificate using SDK 4. However, we have reached the limit of allowed FPS certificates in our account and cannot create a new one.

Issue details:
• We currently have two FPS certificates active in our developer account.
• One of these was created using SDK 5, but our new vendor (Mux) requires an FPS certificate based on SDK 4.
• Since Apple does not allow deleting FPS certificates from the developer portal, we are unable to create a new SDK 4 certificate.
• We kindly request that Apple revoke one of our existing FPS certificates to allow us to generate a new SDK 4 certificate.

Request: We would greatly appreciate it if you could advise us on how to delete one of our existing FPS certificates so that we can proceed with creating a new SDK 4 certificate for our vendor integration. Thank you for your support.
1 reply · 1 boost · 640 views · Oct ’25

CoreMediaErrorDomain -42709 error
Hello, I am developing a video streaming service that uses FairPlay. Since around February 20th, we have been receiving reports of CoreMediaErrorDomain -42709 errors. Unfortunately, there is no documentation from Apple that explains what this error means, so we are not sure how to address or fix the issue. Most of the users who reported this error are using iOS 18.2.1 and iOS 18.3.1. Could you please advise on what we should check or how we might resolve this error?
1 reply · 1 boost · 562 views · Mar ’25

Assistance Needed: CoreMediaErrorDomain Error -12971
Hello Apple Developer Community, I am trying to play an HLS stream using the React Native Video player (underneath it's using AVPlayer). I am able to play the stream smoothly, but in some cases the player cannot play the stream properly.

Behaviour with react-native-video: I am getting the below error.

Error details from the react-native-video player:
Error Code: -12971
Domain: CoreMediaErrorDomain
Localised Description: The operation couldn’t be completed. (CoreMediaErrorDomain error -12971.)
Target: 2457

The error does not provide a specific failure reason or recovery suggestion, which makes troubleshooting challenging.

Behaviour with AVPlayer in a native iOS project: video playback stopped after playing a few seconds.

AVPlayer configuration:

player.currentItem?.preferredForwardBufferDuration = 1
player.automaticallyWaitsToMinimizeStalling = true

N.B.: The same buffer duration is working perfectly for others.

Stream properties: video resolution 1280 x 720.

I have attached an overview report generated from MediaStreamValidator. I would appreciate any insights or suggestions on how to address this error. Has anyone in the community experienced a similar issue or have any advice on potential solutions? Thank you for your help!
0 replies · 1 boost · 169 views · Apr ’25

Issues with "AVMetricEventStreamPublisher Discover Media Performance Metrics in AVFoundation" Example Code
Hi everyone! I’ve been working with AVFoundation and trying to use AVMetricEventStreamPublisher to discover media performance metrics, as described in the Apple documentation: https://developer.apple.com/cn/videos/play/wwdc2024/10113/?time=508

However, when following the example code, I’m not getting the expected results. The performance metrics for both audio and video don’t seem to be captured properly. Has anyone successfully used this example code? If so, could you share your experience or any solutions you’ve found? Any tips or insights would be greatly appreciated. Thanks in advance!

P.S. The example code:

AVPlayerItem *item = ...
AVMetricEventStream *eventStream = [AVMetricEventStream eventStream];
id subscriber = [[MyMetricSubscriber alloc] init];
[eventStream setSubscriber:subscriber queue:mySerialQueue];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemLikelyToKeepUpEvent class]];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemPlaybackSummaryEvent class]];
[eventStream addPublisher:item];
2 replies · 0 boosts · 176 views · Nov ’25

CoreMediaErrorDomain -12888: Bandwidth down-stepping when using 2sec segment duration
The problem: when using an HLS live stream with AVPlayer on iOS/tvOS, the player first chooses the highest bandwidth, then slowly steps down to the lowest (within 1-3 min), eventually steps up again, and then repeats stepping down.

The AVPlayer error log sends events:

errorStatusCode: -12888, errorDomain: Optional("CoreMediaErrorDomain"), errorComment: Optional("The operation couldn't be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration

We use standard segments in CMAF format, 2 sec duration:

#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:147065903
#EXT-X-MAP:URI="video_1_4660000_init.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2"
#EXT-X-PROGRAM-DATE-TIME:2025-04-30T12:51:07
#EXTINF:2.000,
video_1_4660000_t17460174670001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174690001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174710001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2

When using 6 sec segments, the player stays stable at the highest bandwidth. Is there a way to avoid this error in the AVPlayer or HLS configuration?
2 replies · 0 boosts · 322 views · May ’25

AVAssetResourceLoaderDelegate and CoreMediaErrorDomain -12881 When Playing HLS Audio
I am developing an app that plays HLS audio. When using AVPlayerItem with AVURLAsset, can AVAssetResourceLoaderDelegate correctly handle HLS segments? My goal is to use AVAssetResourceLoaderDelegate to add authentication HTTP headers when accessing HLS .m3u8 and .ts files. I can successfully download the files, but playback fails with errors. Specifically, I am observing the following cases:

A. AVAssetResourceLoaderDelegate is canceled, and CoreMediaErrorDomain -12881 occurs
1. In NSURLConnectionDataDelegate's didReceiveResponse method, set contentInformationRequest
2. In didReceiveData, call dataRequest respondWithData
3. resourceLoader didCancelLoadingRequest is called
4. CoreMediaErrorDomain -12881 occurs

B. CoreMediaErrorDomain -12881 occurs
1. In NSURLConnectionDataDelegate's didReceiveResponse method, set contentInformationRequest
2. In connection didReceiveData, buffer all received data until the end
3. In connectionDidFinishLoading, pass the buffered data to respondWithData
4. Call loadingRequest finishLoading
5. CoreMediaErrorDomain -12881 occurs

In both cases, dataRequest.requestsAllDataToEndOfResource is YES. For this use case, I am not using AVURLAssetHTTPHeaderFieldsKey because I need to apply the most up-to-date authentication data at the moment each file is accessed. I would appreciate any advice or suggestions you might have. Thank you in advance!
0 replies · 1 boost · 160 views · Aug ’25

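For reference, a minimal Swift sketch of the header-injection pattern the post describes, with the loading request completed from a URLSession response. It assumes the playlist and segment URLs are rewritten to a custom scheme (here the placeholder auth-hls) so the delegate is consulted at all, and the header value is a placeholder; it is not the poster's code.

import AVFoundation
import UniformTypeIdentifiers

// Hedged sketch: serve .m3u8/.ts requests through the resource loader so an
// Authorization header can be attached per request. The delegate is not
// consulted for plain https URLs, hence the custom scheme.
final class AuthenticatingLoader: NSObject, AVAssetResourceLoaderDelegate {
    private let urlSession = URLSession(configuration: .default)

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else {
            return false
        }
        components.scheme = "https"                                   // map back to the real scheme
        guard let realURL = components.url else { return false }

        var request = URLRequest(url: realURL)
        request.setValue("Bearer <token>", forHTTPHeaderField: "Authorization")   // placeholder credential

        urlSession.dataTask(with: request) { data, response, error in
            if let error = error {
                loadingRequest.finishLoading(with: error)
                return
            }
            if let response = response {
                if let mime = response.mimeType, let uti = UTType(mimeType: mime) {
                    loadingRequest.contentInformationRequest?.contentType = uti.identifier
                }
                loadingRequest.contentInformationRequest?.contentLength = response.expectedContentLength
                loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = true
            }
            if let data = data {
                loadingRequest.dataRequest?.respond(with: data)
            }
            loadingRequest.finishLoading()
        }.resume()

        return true   // the request will be satisfied asynchronously
    }
}

// Usage sketch: rewrite https:// to auth-hls:// before creating the asset, then
// asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "hls.loader")).

Cancellations of in-flight requests (resourceLoader:didCancelLoadingRequest:) are routine for AVPlayer; the matching network task should be aborted rather than treated as a failure.
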
New FairPlay Keys
Hello, My company has an in-store app with FPS SDK 4.x (1024) keys. We've handed those keys over to a trusted third-party and we do not have them. We've been in-store for several years. The person that created the keys in our organization mistakenly stored them encrypted to our third-party's PGP keys, so we cannot decrypt them, and the third party also has no mechanism to provide us with the keys even though it is in their runtime environment. They only have secure mechanisms for us to upload keys onto their servers. We are trying to migrate to a different third-party DRM provider, and would like to obtain new keys. Unfortunately, the developer portal won't let me create new keys, saying that we have exceeded the number of keys allowed, which I assume is one. Additionally, the new DRM provider can only support SDK 4.x keys, and it appears that we can only request SDK 5.x keys on the Apple Developer portal, as the SDK 4.0 option is grayed out. Regardless, it seems that we are not able to request any keys. We've submitted a request to the support e-mail address and received an automated e-mail that the response should take a few days, but may take longer on occasion. It's now been a month. The e-mail says that the reply address is not monitored. Is there any way we can accelerate this? Thank you, Carlos
0 replies · 1 boost · 277 views · Aug ’25

-46250 error when calling `makeSecureTokenForExpirationDateOfPersistableContentKey`
Hi there, We're working on offline playback of DRM tracks. The persistent keys (also known as track licenses) for offline playback are stored locally on the device and are served from cache when a user initiates playback of a downloaded track. Our persistent keys have a limited validity time and need to be refreshed when they expire. To prevent a situation where a persistent key expires while the user is offline, we've decided to eagerly refresh these keys one week before their expiration date. To make that happen we need to be able to obtain the expiration date of the given track license.

We've been attempting to use the makeSecureTokenForExpirationDateOfPersistableContentKey API to facilitate this process. The documentation states that this API returns a secret token representing the persistent key, which we can then exchange with our license server for the expiration date:
https://developer.apple.com/documentation/avfoundation/avcontentkeysession/makesecuretokenforexpirationdate(ofpersistablecontentkey:completionhandler:)?language=objc

However, every time we call makeSecureTokenForExpirationDateOfPersistableContentKey, we receive an error with code -46250. We haven't been able to find any public references or documentation for this specific error code, which is preventing us from troubleshooting the issue. We are conducting our tests on a physical device, as the simulator does not support FairPlay playback. We don't use a dual expiry approach.

Is our understanding of how to obtain the expiration timestamp correct? Are we using the makeSecureTokenForExpirationDateOfPersistableContentKey API as it was intended? What does the -46250 error code mean, and what steps should we take to fix our FairPlay implementation to make this work? Thanks in advance for your assistance.
2 replies · 1 boost · 301 views · 1w ago

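For orientation, a minimal sketch of the call flow the post describes, using the Swift spelling from the documentation URL quoted above. The session, the stored key data, and the sendToLicenseServer hook are assumptions, and the token still has to be decoded by the key server to yield an expiration date.

import AVFoundation
import Foundation

// Hedged sketch: request the secure expiration token for a persistable (offline)
// content key and hand it to the license server. `session` is assumed to be the
// AVContentKeySession(keySystem: .fairPlayStreaming) used for offline keys, and
// `persistableKeyData` the stored track license.
func fetchExpirationToken(session: AVContentKeySession,
                          persistableKeyData: Data,
                          sendToLicenseServer: @escaping (Data) -> Void) {
    session.makeSecureTokenForExpirationDate(ofPersistableContentKey: persistableKeyData) { token, error in
        if let error = error {
            // The -46250 described in the post surfaces here; capture the full
            // NSError (domain, userInfo) when filing feedback.
            print("makeSecureTokenForExpirationDate failed: \(error)")
            return
        }
        guard let token = token else { return }
        sendToLicenseServer(token)   // the server translates the token into an expiration date
    }
}
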
AVPlayer HLS High Bitrate Problem on Apple TV HD (A1625) tvOS 26
Hello, We have a video streaming app with HLS VOD content. We supply 1080p and 4K content to users. Users were watching 1080p content before tvOS 26, but they can no longer watch 1080p content after updating to tvOS 26. We have not changed anything on the HLS playlist side or in the application version. This problem only occurs on the Apple TV 4th Gen (A1625) running tvOS 26; there is no problem with newer Apple TV devices. Would you help us resolve this problem? Thanks in advance.
2 replies · 1 boost · 505 views · Oct ’25