Streaming

Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Posts under the Streaming subtopic


Inquiry regarding CoreMediaErrorDomain Code=-12880 in LL-HLS playback
Hello, I am developing a custom player SDK based on AVPlayer. While testing LL-HLS streams, I intermittently encounter the following error: Error Domain=CoreMediaErrorDomain Code=-12880 Since I cannot find documentation for this specific code, could you please clarify its meaning? Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored. Any insights would be appreciated. Thank you.
Replies: 1 · Boosts: 0 · Views: 164 · Created: Feb ’26
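A minimal sketch of how such error-log entries can be surfaced during LL-HLS playback, assuming an AVPlayerItem is already playing (the notification and error-log APIs below are standard AVFoundation; the helper function itself is illustrative):

```swift
import AVFoundation

// Illustrative only: print every new error-log entry for a player item so that
// codes such as CoreMediaErrorDomain -12880 can be captured with their domain,
// status code, and comment. The returned observer token must be retained.
func observeErrorLog(on item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let events = item.errorLog()?.events else { return }
        for event in events {
            print("domain: \(event.errorDomain), code: \(event.errorStatusCode), " +
                  "comment: \(event.errorComment ?? "none"), uri: \(event.uri ?? "none")")
        }
    }
}
```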
Clarification on SPC Version 3 Availability and Requirements (SDK 26 Certificate Bundle)
Hello, I’m using a valid certificate bundle generated with SDK 26 (combined RSA‑1024 + RSA‑2048). However, all my devices currently still generate SPC v2 during playback, including my iPhone 16 under iOS 26.2. Apple staff mentioned that future iOS versions will send SPC v3 when using an SDK 26 certificate bundle. Could you please clarify: Which iOS/macOS versions will first support SPC v3? Are there any additional client‑side requirements (Safari version, playback APIs, headers, etc.) to trigger SPC v3? Is there any way to test SPC v3 today, e.g., using beta builds? Thank you!
Replies: 1 · Boosts: 1 · Views: 443 · Created: Feb ’26
A mistake in FairPlay Streaming SDK 26 sample code on comparing ProtocolVersionUsed value?
Hello, I am reviewing the sample code in FairPlay Streaming SDK 26 and found a place that I think contains a mistake. The code in question is server-side, in both the Swift and the Rust samples. There is an if statement that compares "ProtocolVersionUsed" (spcData.versionUsed) against the SPCVersion1 constant, but "ProtocolVersionUsed" and the SPC version are different things, so shouldn't it be using a different constant value? [createContentKeyPayload.swift] // Fallback to version 1 if content can have encrypted slice headers, which need to be decrypted separately. Slice headers are not encrypted when using CBCS. if serverCtx.spcContainer.spcData.versionUsed == base_constants.SPCVersion.v1.rawValue && [createContentKeyPayload.rs] // Fallback to version 1 if content can have encrypted slice headers, which need to be decrypted separately. Slice headers are not encrypted when using CBCS. if (serverCtx.spcContainer.spcData.versionUsed == SPCVersion::v1 as u32) && Thank you.
Replies: 0 · Boosts: 0 · Views: 90 · Created: Feb ’26
The audio of FairPlay protected content can be captured - Safari on iOS
Hi, Has anyone been able to protect the audio part of FairPlay protected content from being captured as part of screen recording on Safari/iOS (PWA and/or online web app)? We have tried many things but could not prevent the audio from being recorded. Same app and content on Safari/Mac does not allow audio to be recorded. Any tips?
Replies: 0 · Boosts: 0 · Views: 139 · Created: 4w
Inquiry regarding CoreMediaErrorDomain Code=-15517 during LL-HLS Live Playback
Hello, I am currently developing a live streaming application using AVPlayer to play LL-HLS (Low-Latency HLS) content. During our testing phase, we consistently encountered the following error in the logs: CoreMediaErrorDomain Code=-15517 The challenge we are facing is that the error description is quite vague. It only provides cryptic messages such as "Key not found" or "No value information," which makes it extremely difficult to identify the root cause or perform a deep-dive analysis. I have searched through the official Apple Developer documentation and technical notes, but I couldn’t find any specific reference to what Code -15517 signifies in the context of LL-HLS or CoreMedia. Regarding this issue, I have the following questions: What is the specific meaning of this error code (-15517)? Does it relate to missing tags in the HLS manifest, or is it an internal state issue within the AVPlayer stack? Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored. Is there any additional logging or debugging tool you would recommend to further investigate "Key not found" issues in LL-HLS? Any insights or guidance from the community or Apple engineers would be greatly appreciated. Thank you in advance for your help.
Replies: 1 · Boosts: 0 · Views: 174 · Created: 3w
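On whether a code like this is fatal, a sketch of the signals that indicate playback has actually stopped, since error-log entries alone are often informational (the APIs are standard AVFoundation; the monitor class itself is illustrative):

```swift
import AVFoundation
import Combine

// Illustrative monitor: a .failed item status or a failed-to-play-to-end
// notification means playback actually stopped; error-log entries on their
// own may be informational and playback can continue.
final class PlaybackFailureMonitor {
    private var cancellables = Set<AnyCancellable>()

    func attach(to item: AVPlayerItem) {
        item.publisher(for: \.status)
            .sink { status in
                if status == .failed {
                    print("Fatal: player item failed:",
                          item.error?.localizedDescription ?? "unknown error")
                }
            }
            .store(in: &cancellables)

        NotificationCenter.default
            .publisher(for: .AVPlayerItemFailedToPlayToEndTime, object: item)
            .sink { note in
                let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
                print("Fatal: failed to play to end:",
                      error?.localizedDescription ?? "unknown error")
            }
            .store(in: &cancellables)
    }
}
```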
Swift Array Out of Bounds Crash in VTFrameProcessor when using VTLowLatencyFrameInterpolationParameters
Hi everyone, Our team is encountering a reproducible crash when using VTLowLatencyFrameInterpolation on iOS 26.3 while processing a live LL-HLS input stream. 🤖 Environment Device: iPhone 16 OS: iOS 26.3 Xcode: Xcode 26.3 Framework: VideoToolbox 💥 Crash Details The application crashes with the following fatal error: Fatal error: Swift/ContiguousArrayBuffer.swift:184: Array index out of range The stack trace highlights the following: VTLowLatencyFrameInterpolationImplementation processWithParameters:frameOutputHandler: Called from VTFrameProcessor.process(parameters:) Here is the simplified implementation block where the crash occurs. (Note: PrismSampleBuffer and PrismLLFIError are our internal custom wrapper types). // Create `VTFrameProcessorFrame` for the source (previous) frame. let sourcePTS = sourceSampleBuffer.presentationTimeStamp var sourceFrame: VTFrameProcessorFrame? if let pixelBuffer = sourceSampleBuffer.imageBuffer { sourceFrame = VTFrameProcessorFrame(buffer: pixelBuffer, presentationTimeStamp: sourcePTS) } // Validate the source VTFrameProcessorFrame. guard let sourceFrame else { throw PrismLLFIError.missingImageBuffer } // Create `VTFrameProcessorFrame` for the next frame. let nextPTS = nextSampleBuffer.presentationTimeStamp var nextFrame: VTFrameProcessorFrame? if let pixelBuffer = nextSampleBuffer.imageBuffer { nextFrame = VTFrameProcessorFrame(buffer: pixelBuffer, presentationTimeStamp: nextPTS) } // Validate the next VTFrameProcessorFrame. guard let nextFrame else { throw PrismLLFIError.missingImageBuffer } // Calculate interpolation intervals and allocate destination frame buffers. let intervals = interpolationIntervals() let destinationFrames = try framesBetween(firstPTS: sourcePTS, lastPTS: nextPTS, interpolationIntervals: intervals) let interpolationPhase: [Float] = intervals.map { Float($0) } // Create VTLowLatencyFrameInterpolationParameters. // This sets up the configuration required for temporal frame interpolation between the previous and current source frames. guard let parameters = VTLowLatencyFrameInterpolationParameters( sourceFrame: nextFrame, previousFrame: sourceFrame, interpolationPhase: interpolationPhase, destinationFrames: destinationFrames ) else { throw PrismLLFIError.failedToCreateParameters } try await send(sourceSampleBuffer) // Process the frames. // Using progressive callback here to get the next processed frame as soon as it's ready, // preventing the system from waiting for the entire batch to finish. for try await readOnlyFrame in self.frameProcessor.process(parameters: parameters) { // Create an interpolated sample buffer based on the output frame. let newSampleBuffer: PrismSampleBuffer = try readOnlyFrame.frame.withUnsafeBuffer { pixelBuffer in try PrismLowLatencyFrameInterpolation.createSampleBuffer(from: pixelBuffer, readOnlyFrame.timeStamp) } // Pass the newly generated frame to the output stream. try await send(newSampleBuffer) } 🙋 Questions Are there any known limitations or bugs regarding VTLowLatencyFrameInterpolation when handling live 60fps streams? Are there any undocumented constraints we should be aware of regarding source/previous frame timing, pixel buffer attributes, or how destinationFrames and interpolationPhase arrays must be allocated? Is a "warm-up" sequence recommended after startSession() before making the first process(parameters:) call?
Replies: 1 · Boosts: 0 · Views: 444 · Created: 2w
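A defensive check that may help narrow down the crash above, sketched under the unverified assumption that the framework expects exactly one interpolation phase per destination frame, each strictly between 0 and 1 (the helper below is hypothetical):

```swift
import VideoToolbox

// Hypothetical pre-flight validation for the arrays handed to
// VTLowLatencyFrameInterpolationParameters. ASSUMPTION (not confirmed by the
// documentation): one phase value per destination frame, each in (0, 1).
func interpolationInputsLookConsistent(
    phases: [Float],
    destinationFrames: [VTFrameProcessorFrame]
) -> Bool {
    guard !phases.isEmpty, phases.count == destinationFrames.count else { return false }
    return phases.allSatisfy { $0 > 0 && $0 < 1 }
}
```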
Unity iOS (Metal) → WebRTC (Unity WebRTC) video stream to remote Unity client: PeerConnection connects but receiver renders black frames
Hello! My name is Mason Prather. I'm a graduate student at Kennesaw State University and a Research Engineer working in XR environments through my Graduate Research Assistant role. I’m currently building a research prototype that connects a mobile companion application to a VR headset. The mobile application is built in Unity and deployed on iOS, and it streams video frames to a remote Unity client using WebRTC. Environment Device: iPhone 15 OS: iOS 26.3 (tested on physical device, not Simulator) Engine: Unity 2022.3.57f1 Graphics API: Metal Streaming Technology: WebRTC (Unity WebRTC package) Architecture: Mobile Unity app streaming video frames to a remote Unity client Receiver Device: Meta Quest Pro headset (Unity application) Networking: LAN (UDP discovery + TCP signaling) Video Source: Unity RenderTexture Goal The goal of the system is to allow a VR user to view media stored on their phone inside a VR environment. The iOS app: renders or captures media content converts frames into a WebRTC video track streams the video to the headset Current Status Connection setup works correctly. Observed behavior: Signaling connection successful ICE candidate exchange successful PeerConnection state becomes Connected Video track created successfully However, the receiving application displays black frames. iOS App Details The video source originates from a Unity RenderTexture. Inside the phone application: RenderTexture displays correctly Frames appear correct locally But the receiving peer does not display the frames. Relevant Components Unity WebRTC package iOS Metal rendering pipeline Custom TCP signaling LAN discovery via UDP Expected Behavior Rendered frames should transmit via WebRTC and appear on the remote device. Actual Behavior The remote video track is active, but the rendered frames appear black on the receiving client. Questions Are there known issues involving Unity WebRTC + iOS Metal texture capture? Are there specific pixel format requirements when streaming textures from Unity on iOS? Could the issue relate to texture readback limitations or GPU synchronization? I am more than happy to provide screenshots and console logs upon request. If anyone has experience streaming Unity video frames via WebRTC on iOS, I would greatly appreciate any guidance.
Replies: 0 · Boosts: 0 · Views: 203 · Created: 2w
On iOS 26, HLS alternate audio track selection behaves inconsistently
Summary On iOS 26, HLS alternate audio track selection behaves inconsistently on both VOD and live streams: the French track falls back to the DEFAULT=YES (English) track after manual selection, and in some cases switching to a non-default track appears to work but it is then impossible to switch back to English. Environment iOS version: 26 Players affected: native Safari on iOS 26 and THEOplayer (issue also reproducible on THEOplayer's own demo page) Stream type: HLS/CMAF with demuxed alternate audio renditions (CMFC container) Affected stream types: both VOD and live streaming Issue NOT present on iOS 17/18 Manifest #EXTM3U #EXT-X-VERSION:4 #EXT-X-INDEPENDENT-SEGMENTS #EXT-X-STREAM-INF:BANDWIDTH=8987973,AVERAGE-BANDWIDTH=8987973,VIDEO-RANGE=SDR,CODECS="avc1.640028",RESOLUTION=1920x1080,FRAME-RATE=29.970,AUDIO="program_audio" video_1080p.m3u8 #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="program_audio",LANGUAGE="en",ASSOC-LANGUAGE="en",NAME="English",AUTOSELECT=YES,DEFAULT=YES,CHANNELS="2",URI="audio_ENG.m3u8" #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="program_audio",LANGUAGE="de",ASSOC-LANGUAGE="de",NAME="Deutsch",AUTOSELECT=YES,DEFAULT=NO,CHANNELS="2",URI="audio_DEU.m3u8" #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="program_audio",LANGUAGE="fr",ASSOC-LANGUAGE="fr",NAME="Francais",AUTOSELECT=YES,DEFAULT=NO,CHANNELS="2",URI="audio_FRA.m3u8" Steps to Reproduce Load the HLS manifest (VOD or live) in Safari on iOS 26, or in any AVFoundation-backed player Start playback — English plays correctly as DEFAULT Manually select "Francais" from the audio track selector Observe that English audio continues playing (French does not play) In a separate scenario: manually select "Deutsch" — German plays correctly Attempt to switch back to English — English does not resume; audio remains on the previously selected track Expected behavior Selecting any track should immediately switch to that language Switching back to English (DEFAULT=YES) should work at any time Behavior should be consistent across VOD and live streams Actual behavior Two distinct anomalies observed, reproducible on both VOD and live streams, in both native Safari and THEOplayer: French-specific fallback: selecting the French track causes playback to fall back to English. This does not happen with German. Cannot return to English: in cases where a non-default track plays correctly, attempting to switch back to the DEFAULT=YES track (English) fails — the previous non-default track continues playing. The fact that the issue reproduces in native Safari confirms this is an AVFoundation/WebKit-level regression, not a third-party player bug. What we have already verified and ruled out LANGUAGE codes are BCP-47 compliant (en, fr, de) ✓ EXT-X-VERSION:4 is present ✓ Audio codec removed from STREAM-INF CODECS (video-only) ✓ ASSOC-LANGUAGE attribute added matching LANGUAGE value ✓ Container metadata verified via ffprobe: mdhd box correctly contains language tags (e.g. "fra") ✓ Audio segment content verified via ffplay: correct audio in each language file ✓ French audio source file contains correct French audio content ✓ Issue reproduces in native Safari on iOS 26, confirming it is not a THEOplayer-specific bug Issue does NOT reproduce on iOS 17/18 with the same manifest and segments Additional notes The VOD stream is packaged with AWS MediaConvert, CMAF output group, SEGMENTED_FILES, AAC-LC codec (mp4a.40.2), 128kbps, 48kHz stereo. English uses AudioTrackType ALTERNATE_AUDIO_AUTO_SELECT_DEFAULT; French and German use ALTERNATE_AUDIO_AUTO_SELECT. 
The live stream uses AWS MediaPackage with a similar CMAF/HLS output configuration. Please advise whether this is a known regression in AVFoundation on iOS 26 and whether a fix is planned.
Replies: 1 · Boosts: 0 · Views: 90 · Created: 1w
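A minimal sketch for reproducing the track-selection behavior above in a native AVFoundation player (the media-selection APIs are standard; the helper function itself is illustrative):

```swift
import AVFoundation

// Illustrative reproduction harness: load the audible media-selection group
// and switch the current item to the French rendition.
func selectFrenchAudio(on player: AVPlayer) async throws {
    guard let item = player.currentItem else { return }
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .audible) else { return }

    let frenchOptions = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: Locale(identifier: "fr")
    )
    if let french = frenchOptions.first {
        item.select(french, in: group)
    }
}
```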
Generating a new FPS certificate (SDK 26) alongside an existing SDK 4 certificate
Hi, Our client currently has an FPS deployment certificate generated with SDK version 4 that is still actively used in production. They would like to generate an additional certificate using SDK version 26. Before doing so, they just want to confirm: Will the existing SDK 4 certificate remain unaffected and still visible in the Apple Developer portal? Any considerations they should keep in mind? Thanks! :)
Replies: 0 · Boosts: 0 · Views: 98 · Created: 4d
Generating a new FPS certificate (SDK 26) alongside an existing SDK 4 certificate
Hi, Our client currently has an FPS deployment certificate generated with SDK version 4 that is still actively used in production. They would like to generate an additional certificate using SDK version 26. Before doing so, they just want to confirm: Will the existing SDK 4 certificate remain unaffected and still visible in the Apple Developer portal? Any considerations they should keep in mind? Thanks!
Replies: 0 · Boosts: 0 · Views: 117 · Created: 4d
offline with FairPlay
I am working on offline license support for FairPlay. I am trying to identify offline vs. live streaming requests. How do I do that? Is there any way to identify the request type (offline or streaming) from SPC?
Replies: 0 · Boosts: 1 · Views: 59 · Created: 1w