Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic. Each post below is followed by its reply, boost, and view counts, and its creation date.

AVAudioEngine startAndReturnError is now failing
I have a keyboard in my iOS Morse Code app that has always been able to play audio via AVAudioEngine. Recently it has been failing to produce audio. I see that startAndReturnError: is now failing with this error:

Error Domain=com.apple.coreaudio.avfaudio Code=268435459 "(null)" UserInfo={failed call=err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)}

What's going on? Have keyboards lost the ability to play audio? Here's how I set things up:

_engine = [AVAudioEngine new];
_prefs = [[NSUserDefaults alloc] initWithSuiteName:kSharedAppGroupID];
AVAudioMixerNode* mainMixerNode = _engine.mainMixerNode;
AVAudioOutputNode* outputNode = _engine.outputNode;
AVAudioFormat* format = [outputNode inputFormatForBus:0];
AVAudioFormat* inputFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32 sampleRate:44100 channels:1 interleaved:NO];
self.srcNode = [[AVAudioSourceNode alloc] initWithRenderBlock:^OSStatus(BOOL* _Nonnull isSilence, const AudioTimeStamp* _Nonnull timestamp, AVAudioFrameCount frameCount, AudioBufferList* _Nonnull outputData) {
    // This block builds the data, but is never called, so it is not the culprit.
}];
[_engine attachNode:self.srcNode];
[_engine connect:self.srcNode to:mainMixerNode format:inputFormat];
[_engine connect:mainMixerNode to:_engine.outputNode format:nil];
[_engine prepare];
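For anyone hitting the same kAUInitialize failure: it can help to separate audio session activation from engine startup, so the error clearly points at one or the other. A minimal sketch of that check, using standard AVFoundation APIs; whether a keyboard extension is still permitted to activate a playback session on current iOS is exactly the open question here:

import AVFoundation

func startEngine(_ engine: AVAudioEngine) throws {
    let session = AVAudioSession.sharedInstance()
    // If either of these throws, the failure is at the session level,
    // not in the engine graph set up above.
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
    // Swift spelling of -startAndReturnError:.
    try engine.start()
}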
Replies: 0 · Boosts: 0 · Views: 136 · Created: 2w
macOS 26 – NSSound/CoreAudio causes SIGILL crash in caulk allocator
Hi everyone, We are the engineering team behind an enterprise communications application for macOS. We are experiencing a critical crash on macOS 26 that did not occur on any previous macOS version. We are seeking clarification from Apple engineers or anyone who may have insight into this behaviour.

Environment
Architecture: x86_64
macOS: 26.4.1 (25E253)
Hardware: Mac15,13 (MacBook Pro)
Exception: SIGILL / ILL_ILLOPC
Crashed Thread: Thread 0 (Main Thread)
Trigger: Playing a notification sound via NSSound during an incoming call

Crash Stack
0 caulk consolidating_free_map::maybe_create_free_node + 119 ← SIGILL
1 caulk tiered_allocator + 1469
2 caulk exported_resource::do_allocate + 15
3 AudioToolboxCore EABLImpl::create + 204
4 CoreAudio AUNotQuiteSoSimpleTimeFactory + 33267
8 AudioToolboxCore AudioUnitInitialize + 189
9 AudioToolbox XAudioUnit::Initialize + 19
10 AudioToolbox MESubmixGraph::initialize + 125
11 AudioToolbox MESubmixGraph::connectInputChannel + 1172
12 AudioToolbox MEDeviceStreamClient::AddRunningClient + 509
15 AudioToolbox AudioQueueObject::StartRunning + 194
16 AudioToolbox AudioQueueObject::Start + 1447
22 AudioToolbox AQ::API::V2Impl::AudioQueueStartWithFlags + 805
23 AVFAudio AVAudioPlayerCpp::playQueue + 354
24 AVFAudio AVAudioPlayerCpp::DoAction + 134
25 AVFAudio -[AVAudioPlayer play] + 26
26 AppKit -[NSSound play] + 100
27 Our App -[AudioHelper tryToStartSound:ofType:] + 569
28 Our App block_invoke + 59

Behaviour Difference Between macOS Versions
The exact same code path that triggers this crash on macOS 26 works without any issue on macOS 14 and macOS 15 — no crash, no warning, no log output of any kind. The crash occurs inside Apple's private caulk memory allocator during CoreAudio audio engine initialisation, triggered by a call to [NSSound play]. The SIGILL / ILL_ILLOPC at maybe_create_free_node + 119 suggests a hard ud2 trap — an intentional abort guard inserted at compile time. This strongly suggests that something changed in macOS 26 within NSSound / CoreAudio / caulk that causes this code path to fail in a way it previously did not.

Questions
We have the following specific questions:
Was there a deliberate threading policy change in NSSound / CoreAudio in macOS 26?
Is the SIGILL in caulk::consolidating_free_map::maybe_create_free_node an intentional thread-affinity assertion introduced in macOS 26?
Are there any other NSSound / AVAudioPlayer / AudioQueue APIs that have similarly tightened their requirements in macOS 26 that we should be aware of?
Is there a migration guide, release note, or WWDC session that covers CoreAudio changes in macOS 26 that we may have missed?
Has anyone else in the developer community encountered a similar SIGILL crash in caulk on macOS 26 during audio playback?
Replies: 7 · Boosts: 0 · Views: 1.1k · Created: 2w
PHPhotosErrorDomain Code: 3302 started affecting my users recently.
Recently I received multiple support emails from users because the app cannot save video to Photos; they are all on iOS 26.x. The code in my app around recording video and saving to Photos hasn't changed in years. I'm not able to reproduce it locally; I tried on all my available devices. In a recently published build I added additional logging, and it appears that all of the cases that fail with 3302 have Photos access set to "Limited Access". It never happens to users with "Full Access". In that build I also added a fallback: when saving to Photos fails, the app saves to Documents instead, and that seems to work (two of my affected users confirmed it), but it's very unfortunate. I think it kind of proves that the videos themselves aren't broken, given that users are able to play them just fine. One user says that for him, saving to Photos works 2-3 times after he reinstalls the app and then it stops working. Did anything change recently in how we should save videos to Photos? I'm using the following code. I can see in git blame that I haven't changed it since 2020, and I never encountered those errors in development or heard about these issues until around a month ago. Thank you.

PHPhotoLibrary.shared().performChanges {
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
} completionHandler: { success, error in
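One hedged variant that may be worth testing against the Limited Access failures: request add-only authorization explicitly before the change block, since a pure write does not need the limited selection at all. A sketch; whether this sidesteps error 3302 on iOS 26 is an assumption:

import Photos

func saveVideoToPhotos(at fileURL: URL) {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        // .limited should not occur for add-only, but accept it defensively.
        guard status == .authorized || status == .limited else { return }
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
        }) { success, error in
            // Surface the exact PHPhotosError code for affected users.
            print("saved:", success, "error:", String(describing: error))
        }
    }
}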
Replies: 2 · Boosts: 0 · Views: 165 · Created: 2w
FairPlay SPC v3 documentation mismatch: payload length field size vs sample code
Hi, I’ve identified a discrepancy between the FairPlay Streaming SPC v3 documentation and the provided Swift reference implementation regarding the SPC payload length field.

Documentation states: The SPC V3 structure defines: SPC payload length: 4 bytes

However, in Apple’s Swift sample implementation:

// Move local offset by 12 to adjust for padding
localOffset += 12
spcContainer.spcDataSize = Int(try readBigEndianU32(spc, localOffset))

This indicates a 16-byte field (12 bytes padding + 4-byte length). This behavior also matches the SPC sample provided in the FairPlay Streaming SDK (sample_spc_v3.b64):

00000003 // spc version
00000000 // reserved
....
00000000000000000000000000000f40 // spc payload length (16 bytes)

Could you please confirm the correct implementation? Thanks
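To make the two readings concrete, here is a self-contained sketch of the offset arithmetic; readBigEndianU32 is a hypothetical reimplementation of the sample's helper, and which layout is authoritative is exactly the question above:

import Foundation

// Hypothetical stand-in for the sample code's helper.
func readBigEndianU32(_ data: Data, _ offset: Int) -> UInt32 {
    data[offset..<offset + 4].reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
}

func payloadLength(spc: Data, localOffset: Int) -> (perDocs: UInt32, perSample: UInt32) {
    // Docs reading: a bare 4-byte big-endian length at localOffset.
    let perDocs = readBigEndianU32(spc, localOffset)
    // Sample-code reading: skip 12 bytes of padding, then read the 4-byte length,
    // i.e. the length lives in the last 4 bytes of a 16-byte field.
    let perSample = readBigEndianU32(spc, localOffset + 12)
    return (perDocs, perSample)
}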
Replies: 1 · Boosts: 0 · Views: 130 · Created: 2w
FairPlay SPC with an invalid device type
Hi, I received an SPC without a Device Identity TLLV and with an invalid (i.e. a value that is not specified in the FairPlay programming guide) device type in the Device Info TLLV. The info I got is the following: Apple Device Type: Type:0x555ea482e2ef0a7c, OS version:189.121.178. Does anyone know what device type it is, and why it does not conform to the Apple spec? Also, should I accept such an SPC, or is it not valid? Thanks.
Replies: 1 · Boosts: 0 · Views: 235 · Created: 2w
AirPods Gestures
Hello everyone, is there an API or a way to react to AirPods gestures for a recording that was started from an Intent, or even when the app is open? Scenario: I am walking, riding my bike, or doing other mainly hands-free activities, or I can't reach my phone, but I have my AirPods in my ears. Goal: Via Siri, I am able to start an AudioRecordingIntent and it runs smoothly. I'd like to pause/resume the recording by single-tapping the AirPods, or end the recording by simply double-tapping, pretty much as if I were muting/unmuting or hanging up on a call. MPRemoteCommandCenter doesn't seem to be the solution for this. Not sure if this is because the recording is started through an AudioRecordingIntent.
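For completeness, the documented route for AirPods taps is the remote-command path the post already mentions; a single tap is typically routed as play/pause, and the system only delivers these while the app holds an active audio session and is the now-playing app. A minimal sketch of that wiring, with the caveat that whether events reach a recording started from an App Intent is precisely the open question:

import MediaPlayer

func wireRemoteCommands(pause: @escaping () -> Void,
                        resume: @escaping () -> Void,
                        finish: @escaping () -> Void) {
    let center = MPRemoteCommandCenter.shared()
    center.pauseCommand.addTarget { _ in pause(); return .success }
    center.playCommand.addTarget { _ in resume(); return .success }
    // Double-tap is commonly routed as next-track; mapping it to
    // "end recording" here is an app-specific assumption.
    center.nextTrackCommand.addTarget { _ in finish(); return .success }
}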
Replies: 0 · Boosts: 0 · Views: 322 · Created: 2w
Entitlement "com.apple.developer.carplay-driving-task" not allowing audio playback for voice controlled interaction
According to https://developer.apple.com/download/files/CarPlay-Developer-Guide.pdf , apps with the com.apple.developer.carplay-driving-task entitlement are allowed to use voice control. In my current implementation, voice recording works fine, but the voice response (an AVPlayer with its category set to "playback") does not output any audio. I suspect that it is an entitlement limitation, because if I quickly tap to play some music while the voice assistant's AVPlayer is "playing", then I can hear the response; without this trick, it keeps playing but stays mute. In parallel I have now requested the com.apple.developer.carplay-voice-based-conversation entitlement, but I don't even know whether, once approved, I will be able to use two entitlements for the same CarPlay app. Long story short: 1 - Should an app be able to play audio responses when its CarPlay entitlement is com.apple.developer.carplay-driving-task? 2 - If not, can I combine the entitlements com.apple.developer.carplay-driving-task and com.apple.developer.carplay-voice-based-conversation?
Replies: 1 · Boosts: 0 · Views: 474 · Created: 2w
Manual FairPlay License Renewal: AVContentKeySessionDelegate not triggering via addContentKeyRecipient
Hi everyone, I am working on an app that supports offline playback with FairPlay Streaming (FPS). I have successfully implemented the logic to download and persist the content keys (TLLV), and offline playback is working correctly using the stored persistent keys. However, I am now trying to implement a manual renewal process for these licenses, and I’ve run into an issue where the delegate methods are not being fired as expected. The Issue: I am calling contentKeySession.addContentKeyRecipient(asset) to force a renewal or re-fetch of the content key for a specific asset. Even though the asset is correctly initialized and the session is active, the AVContentKeySessionDelegate methods (specifically contentKeySession(_:didProvide:)) are not being triggered at all. My Questions: Why is the delegate not firing when adding the recipient? Is there a specific state or property the AVURLAsset needs to have (or a specific way it should be initialized) to trigger a new key request via addContentKeyRecipient? Is it possible to perform a manual license renewal triggered by a UI action (e.g., a button tap) without actually initiating playback of the asset? The goal is to allow users to refresh their licenses manually while online, ensuring the content remains playable offline before the previous license expires, all without forcing the user to start the video. Any insights or best practices for this manual renewal flow would be greatly appreciated.
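One hedged avenue for the renewal-without-playback goal: addContentKeyRecipient only triggers requests for keys the asset actually needs, so for a UI-driven refresh it may be more reliable to ask the session for the key directly. A sketch, assuming the key identifier (the skd:// URI) is stored alongside the download:

import AVFoundation

func renewLicense(session: AVContentKeySession, keyIdentifier: String) {
    // Explicitly requests the key with no playback involved; the delegate's
    // contentKeySession(_:didProvide:) should fire for this identifier.
    session.processContentKeyRequest(withIdentifier: keyIdentifier,
                                     initializationData: nil,
                                     options: nil)
}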
Replies: 3 · Boosts: 0 · Views: 692 · Created: 3w
MusicKit developer token returns 401 on all catalog endpoints
My MusicKit developer token returns 401 (empty body) on every Apple Music API catalog endpoint. I've tried two different keys — both fail identically. Setup: Team ID: K79RSBVM9G Key ID: URNQV5UDGB (MusicKit enabled, associated with Media ID media.audio.explore.musickit) Apple Developer Program License Agreement accepted April 14, 2026 Token format (matches docs exactly): Header: {"alg":"ES256","kid":"URNQV5UDGB"} Payload: {"iss":"K79RSBVM9G","iat":,"exp":<now+15777000>} What works: /v1/storefronts/us returns 200 What fails: Every catalog endpoint returns 401 with empty body: /v1/catalog/us/search?types=artists&term=test /v1/catalog/us/artists/5920832 /v1/catalog/us/genres /v1/test The token self-verifies (signature is valid). I've tried with and without typ:"JWT", with the origin claim, and with a manually signed JWT bypassing the jsonwebtoken library. Same 401 every time. What am I missing?
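Since the storefront call succeeds with the same token, a minimal probe that hits both endpoint classes back to back can at least confirm the client side is constant; a sketch (endpoint paths from the Apple Music API docs, token generation out of scope):

import Foundation

func probe(developerToken: String) async throws {
    for path in ["/v1/storefronts/us", "/v1/catalog/us/genres"] {
        var request = URLRequest(url: URL(string: "https://api.music.apple.com" + path)!)
        request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
        let (_, response) = try await URLSession.shared.data(for: request)
        // A 200/401 split here with one token points at authorization scope
        // (e.g. the key's Media ID association), not at JWT formatting or transport.
        print(path, (response as? HTTPURLResponse)?.statusCode ?? -1)
    }
}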
Replies: 0 · Boosts: 1 · Views: 195 · Created: 3w
Is Push to Talk appropriate for a voice-based interactive assistant (not a walkie-talkie app)?
Hello, Looking for guidance from Apple engineers or developers who have used Push to Talk in production I am developing an iOS application called Companion AI / Theo Voice, designed for elderly users. The goal of the app is to provide a simple, voice-first interactive assistant that enables: natural voice interaction (no typing required) daily assistance (reminders, well-being, conversation) bidirectional voice communication (the user can immediately respond by voice) ⸻ How it works The app operates in two main modes: Conversation mode the user opens the app the assistant speaks the user replies naturally by voice Proactive mode in specific useful situations (e.g. medication reminders, check-ins) the app initiates a voice interaction the user can respond immediately ⸻ Important constraints there is no continuous listening the microphone is only active during interactions users can disable proactive interactions frequency is limited and user-controlled ⸻ Question We are considering using the Push to Talk framework in order to: allow the app to be awakened in the background initiate a voice interaction enable immediate voice response from the user Would this usage be considered aligned with the intended use of Push to Talk? Are there any specific recommendations to ensure compliance with App Store Review Guidelines? Thank you very much for your guidance.
Replies: 0 · Boosts: 0 · Views: 175 · Created: 3w
Trying to load image & identifier from photo library with PhotosPicker
I'm updating an older Mac app written in Objective-C and OpenGL to be a multiplatform app in SwiftUI and Metal. The app loads images and creates kaleidoscope animations from them. It is a document-based application, and saves info about the kaleidoscope into the document. On macOS, it creates a security-scoped bookmark to remember the user's chosen image. On iOS, I use a PhotosPicker to have the user choose an image from their photo library. I would like to get the itemIdentifier from the image they choose and save it into my document, so I can use it to fetch the image when the user reloads the kaleidoscope document in the future. However, the call to loadTransferable is returning nil for the itemIdentifier. Here is my iOS/iPadOS code:

#if os(macOS)
// Mac code
#else
PhotosPicker("Choose image", selection: $selectedItem, matching: .images)
    .onChange(of: selectedItem) {
        Task {
            if let newValue = selectedItem {
                scopeState.isHEIC = newValue.supportedContentTypes.contains(UTType.heic)
                let data = try? await newValue.loadTransferable(type: Data.self)
                print("newValue = \(newValue)")
                print("newValue.supportedContentTypes = \(newValue.supportedContentTypes)")
                scopeState.selectedImageID = newValue.itemIdentifier
                scopeState.selectedImageData = data
            }
        }
    }
#endif

The debug print statements show:

newValue = PhotosPickerItem(_itemIdentifier: "9386762B-C241-4EE2-9942-BC04017E35C1/L0/001", _shouldExposeItemIdentifier: false, _supportedContentTypes: [<_UTCoreType 0x20098cd40> public.png (not dynamic, declared), <UTType 0x11e4ec060> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x11e4ec150> com.apple.private.photos.thumbnail.low (not dynamic, declared)], _content: _PhotosUI_SwiftUI.PhotosPickerItem.(unknown context at $1e75ee3bc).Content.result(PhotosUI.PHPickerResult(itemProvider: <PUPhotosFileProviderItemProvider: 0x11d2bd680> {types = ( "public.png", "com.apple.private.photos.thumbnail.standard", "com.apple.private.photos.thumbnail.low" )}, _objcResult: <PHPickerResult: 0x11b18cff0>)))
newValue.supportedContentTypes = [<_UTCoreType 0x20098cd40> public.png (not dynamic, declared), <UTType 0x11e4ec060> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x11e4ec150> com.apple.private.photos.thumbnail.low (not dynamic, declared)]

And the returned item has a nil itemIdentifier (note the _shouldExposeItemIdentifier=false in the log of the selected item). How do I get the itemIdentifier for the user's chosen image? And is it valid to use it to fetch the asset when the user reloads their document? Is it like a security-scoped bookmark on macOS, where the itemIdentifier is a key that gives me permission to reload the image? If not, what do I need to do in order to reload the image the next time the user opens a saved kaleidoscope document?
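If it helps: PhotosPickerItem.itemIdentifier is documented to be non-nil only when the picker is constructed against an explicit photo library, which matches the _shouldExposeItemIdentifier: false in the log above. A sketch of that variant:

import SwiftUI
import PhotosUI

struct ImageChooser: View {
    @State private var selectedItem: PhotosPickerItem?

    var body: some View {
        // Passing photoLibrary: .shared() opts in to identifier exposure;
        // without it the item's identifier stays nil.
        PhotosPicker("Choose image",
                     selection: $selectedItem,
                     matching: .images,
                     photoLibrary: .shared())
    }
}

Note that the identifier is not a permission token the way a security-scoped bookmark is: re-fetching the asset later via PHAsset.fetchAssets(withLocalIdentifiers:options:) requires its own photo-library read authorization.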
Replies: 1 · Boosts: 0 · Views: 478 · Created: 3w
Bug: Channels erroneously populated when sending audio from an iPhone to a Linux gadget audio device.
I have a device which uses Linux gadget audio to receive audio input via USB, exposing 24 capture channels. This device works well with Mac, Windows, and Android phones. However, when sending audio from an iPhone (both USB-C iPhones and Lightning iPhones using an official Apple Lightning-to-USB adaptor) I am seeing strange behaviour. Audio which is sent from the iPhone to any one of inputs 12, 19, 20, 21, or 22 appears in all of those channels, rather than only the channel to which the audio is routed. I have confirmed on my Linux device that these channels are not being erroneously populated by the software running on that device; the issue is visible in audio recorded directly from the gadget using arecord, meaning it is present in the audio being sent from the iPhone. I have confirmed that the gadget channel mask is correct for 24-channel audio (0xFFFFFF). As said above, audio routed to this device from any non-iPhone device (Mac, Windows, Android) works fine. The only sensible conclusion seems to be that the iPhone is populating the additional channels erroneously due to some bug in CoreAudio's handling of gadget audio devices. I would appreciate any insight on this from Apple developers, or from anyone else who has come across this issue and found a workaround.
Replies: 0 · Boosts: 0 · Views: 271 · Created: 3w
iTunes Search API returning 404 for /search endpoint - April 16, 2026
Is anyone else seeing a sudden outage with the iTunes Search API (https://itunes.apple.com/search) today? As of this morning (April 16), all my requests to the /search endpoint are returning HTTP 404 Not Found. I've tested across multiple countries (us, gb, fr) and entities (software, iPadSoftware), but they all fail with the same error. Interestingly, the /lookup endpoint (e.g., https://itunes.apple.com/lookup?id=[APP_ID]) is still working perfectly fine. What I've checked so far: Apple System Status page is "All Green" (as usual). Tried different IP addresses/regions to rule out local blocking. Tested simple queries like term=car to rule out specific keyword issues. Questions: Are you guys seeing 404s as well, or is it just me? Has anyone heard of a sudden migration or deprecation notice for this legacy endpoint?
Replies: 0 · Boosts: 0 · Views: 305 · Created: 3w
AVMetricMediaResourceRequestEvent returns error but no URLSession metrics for failed HLS playlist/segment requests
Hello, I am using AVMetrics to monitor HLS playback requests from AVPlayer, specifically AVMetricHLSPlaylistRequestEvent and AVMetricHLSMediaSegmentRequestEvent. These events provide an AVMetricMediaResourceRequestEvent. For successful requests, I can read URLSession metrics. However, when a request fails, the event contains an error but no URLSession metrics. I reproduced this by intercepting HLS playlist and segment requests with Charles Proxy and forcing failures on both the simulator and a physical device. Is this expected behavior? If so, is there any supported way to get timing details for failed HLS requests? I am using code like this:

for try await event in playerItem.metrics(forType: AVMetricHLSPlaylistRequestEvent.self) { // ... }
for try await event in playerItem.metrics(forType: AVMetricHLSMediaSegmentRequestEvent.self) { // ... }

Also, the example shown in the WWDC session does not compile for me (Xcode 26.2). I get the following error: Pack expansion requires that '' and 'AVMetricEvent' have the same shape

let playerItem: AVPlayerItem = ...
let ltkuMetrics = item.metrics(forType: AVMetricPlayerItemLikelyToKeepUpEvent.self)
let summaryMetrics = item.metrics(forType: AVMetricPlayerItemPlaybackSummaryEvent.self)
for await (metricEvent, publisher) in ltkuMetrics.chronologicalMerge(with: summaryMetrics) {
    // send metricEvent to server
}
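A hedged workaround for the pack-expansion compile error in the merged-sequence sample: consume each metric sequence in its own child task instead of merging. A sketch reusing only the metrics(forType:) calls from the post:

import AVFoundation

func observeMetrics(for item: AVPlayerItem) async {
    await withTaskGroup(of: Void.self) { group in
        group.addTask {
            do {
                for try await event in item.metrics(forType: AVMetricPlayerItemLikelyToKeepUpEvent.self) {
                    print("LTKU:", event)
                }
            } catch { print("LTKU stream error:", error) }
        }
        group.addTask {
            do {
                for try await event in item.metrics(forType: AVMetricPlayerItemPlaybackSummaryEvent.self) {
                    print("summary:", event)
                }
            } catch { print("summary stream error:", error) }
        }
    }
}

This loses the global chronological ordering that chronologicalMerge provides, so event timestamps have to be compared downstream.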
Replies: 2 · Boosts: 1 · Views: 249 · Created: 4w
`LockedCameraCaptureManager` practically unusable since iOS 26
At some point since iOS 26, the LockedCameraCapture framework gets into an unpredictable state after opening the main app from the LockedCamera extension using LockedCameraCaptureSession.openApplication(for userActivity:). (Feedback with sample code to reproduce: FB21966835) Opening the extension from the lock screen again doesn't open the extension, but puts the lock screen in a state as if it had. Content updates from LockedCameraCaptureManager.shared.sessionContentUpdates come in inconsistently; usually the app needs to be opened again, or the extension reopened. This makes the extension impossible for me to use, as I rely on it to record video files that must be manually imported when the app is launched (so not through PhotoKit). Does anybody have a suggestion to circumvent this issue, or how to get this fixed?
Replies: 0 · Boosts: 0 · Views: 270 · Created: 4w
Setting up video and image capture pipeline creates internal errors in AVFoundation.
I have created code for iOS that allows me to start and stop video acquisition from a proprietary USB camera using AVFoundation's AVCaptureSession and AVCaptureDevice APIs. There is a start and a stop method. The start method takes an argument to specify one of two formats that I use for my custom camera application. I can start the session and switch between formats all day without any errors. However, if I start and then stop the camera three times in a row, on the third invocation of start I get errors in the console output and the CMSampleBuffers stop flowing to my callback. Additionally, once I get AVFoundation into this state, stopping the camera doesn't help. I have to kill the app and start over. Here are the errors, and below them, the code. I'm hoping someone who has experience with these errors, or an engineer from Apple who knows the AVFoundation image capture pipeline code, can respond and tell me what I'm doing wrong. Thanks.

<<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0)
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:558) - (err=-16453)
<<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0)
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:253) - (err=-16453)
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:269) - (err=-16453)
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:511) - (err=-16453)
Capture session error: The operation could not be completed
Capture session error: The operation could not be completed

func start(for deviceFormat: String) async throws -> AnyPublisher<CMSampleBuffer, Swift.Error> {
    func configureCaptureDevice(with deviceFormat: String) throws {
        guard let format = formatDict[deviceFormat] else { throw Error.captureFormatNotFound }
        captureSession.beginConfiguration()
        defer { captureSession.commitConfiguration() }
        try captureDevice.lockForConfiguration()
        captureDeviceFormat = deviceFormat
        captureDevice.activeFormat = format
        captureDevice.unlockForConfiguration()
    }

    return try await withCheckedThrowingContinuation { continuation in
        sessionQueue.async { [unowned self] in
            logger.debug("Start capture session for \(deviceFormat): \(String(describing: captureSession))")
            // If we were already streaming camera images from a different mode, terminate that stream.
            bufferPublisher?.send(completion: .finished)
            bufferPublisher = nil
            captureDeviceFormat = ""
            do {
                // Re-configure with the new format; should be harmless if called with the currently configured format.
                try configureCaptureDevice(with: deviceFormat)
                // Return a new stream publisher for this invocation.
                bufferPublisher = PassthroughSubject<CMSampleBuffer, Swift.Error>()
                // If we are not currently running, start the image capture pipeline.
                if captureSession.isRunning == false { captureSession.startRunning() }
                continuation.resume(returning: bufferPublisher!.eraseToAnyPublisher())
            } catch {
                logger.fault("Failed to start camera: \(error.localizedDescription)")
                continuation.resume(throwing: error)
            }
        }
    }
}

func stop() async throws {
    try await withCheckedThrowingContinuation { continuation in
        sessionQueue.async { [unowned self] in
            logger.debug("Stop capture session: \(String(describing: captureSession))")
            // The following invocation is synchronous and takes time to execute;
            // looks like a stall but you can ignore it as the MainActor is not blocked.
            captureSession.stopRunning()
            // Terminate the stream and reset our state.
            bufferPublisher?.send(completion: .finished)
            bufferPublisher = nil
            captureDeviceFormat = ""
            // Signal the caller that we are done here.
            continuation.resume()
        }
    }
}
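A diagnostic sketch that may help narrow this down: observe the session's lifecycle and runtime-error notifications, to see whether the third start is racing a still-tearing-down session or whether the capture service itself died. The notification names are standard AVFoundation; the race hypothesis is only an assumption:

import AVFoundation

func observeSession(_ session: AVCaptureSession) {
    let center = NotificationCenter.default
    center.addObserver(forName: .AVCaptureSessionRuntimeError, object: session, queue: nil) { note in
        print("runtime error:", note.userInfo?[AVCaptureSessionErrorKey] ?? "unknown")
    }
    center.addObserver(forName: .AVCaptureSessionDidStartRunning, object: session, queue: nil) { _ in
        print("session started")
    }
    center.addObserver(forName: .AVCaptureSessionDidStopRunning, object: session, queue: nil) { _ in
        print("session stopped")
    }
}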
Replies: 0 · Boosts: 0 · Views: 236 · Created: Apr ’26
How to Validate Now Playing Events on Apple Devices (iOS/tvOS)?
Hi Support Team, I need some guidance regarding Now Playing metadata integration on Apple platforms (iOS/tvOS). We are currently implementing Now Playing events in our application and would like to understand: How can we enable or configure logging for Now Playing metadata updates? Is there any recommended way or tool to verify that Now Playing events are correctly sent and received by the system (e.g., Control Center / external devices)? Are there any debugging techniques or best practices to validate metadata updates during development? Our app is currently in the development phase, and we are working towards meeting Video Partner Program (VPP) requirements. Any documentation, tools, or suggestions would be greatly appreciated. Thanks in advance for your support.
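For the metadata-publishing side, the system reads whatever is in MPNowPlayingInfoCenter, and the quickest validation is visual: publish, then check Control Center (or a paired device such as an Apple Watch). A minimal sketch with placeholder values:

import MediaPlayer

func publishNowPlaying(title: String, duration: TimeInterval, position: TimeInterval) {
    let info: [String: Any] = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: position,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0,
    ]
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}

During development, streaming the device log in Console.app while updating this dictionary may also show whether the updates reach the system, though exactly which subsystems log them is not documented.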
Replies: 1 · Boosts: 0 · Views: 179 · Created: Apr ’26
iOS 26.4 regression: The `.pauses` audiovisual background playback policy does not pause video playback anymore when backgrounding the app
Starting with iOS 26.4 and the iOS 26.4 SDK, the .pauses audiovisual background playback policy is not correctly applied anymore to an AVPlayer having an attached video layer displayed on screen. This means that, when backgrounding a video-playing app (without Picture in Picture support) or locking the device, playback is not paused automatically by the system anymore. This issue affects the Apple TV application as well. We have filed FB22488151 with more information.
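For anyone trying to reproduce, the affected configuration is small; a sketch of the setup the report describes (the URL is a placeholder):

import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
// Should cause the system to pause playback on backgrounding or lock;
// per the report, this no longer happens on iOS 26.4 when a layer is attached.
player.audiovisualBackgroundPlaybackPolicy = .pauses
let layer = AVPlayerLayer(player: player)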
Replies: 0 · Boosts: 0 · Views: 234 · Created: Apr ’26
iPad Pro M4 giving wrong value for layerPointConverted for ultra wide angle
I am using an iPad Pro M4 device to apply an exposure point to the camera. When converting the layerPointConverted result from the 0–1 range to a device-size point, it gives the wrong value. But if the same code is used on another iPad, such as a 2nd-generation model, it gives the proper value. In both cases the video gravity used is resizeAspectFill. I tried using the TrueDepth camera on the M4 device, but it does not work either.
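In case the manual scaling is the variable here: the preview layer can do the layer-to-device conversion itself, accounting for videoGravity and the active format's crop, both of which differ on the ultra-wide module. A sketch, assuming the goal is setting the exposure point of interest:

import AVFoundation
import CoreGraphics

func setExposure(at layerPoint: CGPoint,
                 previewLayer: AVCaptureVideoPreviewLayer,
                 device: AVCaptureDevice) throws {
    // Converts a point in layer coordinates directly to 0-1 device space.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    if device.isExposurePointOfInterestSupported {
        device.exposurePointOfInterest = devicePoint
        device.exposureMode = .autoExpose
    }
}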
Replies: 0 · Boosts: 0 · Views: 204 · Created: Apr ’26
tvOS: Background audio + local caching works on Simulator but stops on real Apple TV device
Description: I’m developing a tvOS app using SwiftUI where we play background audio (music) in the Welcome screen, with support for offline playback via local caching. Feature Overview: App fetches audio metadata from API Starts streaming audio (HLS .m3u8) immediately In parallel, downloads the raw audio file (.mp3) Once download completes: Switches playback from streaming → local file On next launch (offline mode), app plays audio from local storage Issue: This flow works perfectly on the Simulator, but on a real Apple TV device: Audio plays for a few seconds (2–5 sec) and then stops Especially after switching from streaming → local file No explicit AVPlayer error is logged Playback sometimes stops after UI updates or periodic API refresh Implementation Details: Using AVPlayer with AVPlayerItem Background audio controlled via a shared manager (singleton) Files stored locally using FileManager (currently using .cachesDirectory) Switching playback using: player.replaceCurrentItem(with: AVPlayerItem(url: localURL)) player.play() Observations: Works reliably on Simulator On device: -- Playback stops silently -- Seems related to lifecycle, buffering, or file access No issues when continuously streaming (without switching to local) Questions: Is there any limitation or known issue with AVPlayer when switching from streaming (HLS) to local file playback on tvOS? Are there specific requirements for playing locally cached media files on tvOS (e.g., file location, permissions, or sandbox behavior)? What is the recommended storage location and size limit for cached media files on tvOS? We understand tvOS has limited persistent storage Is .cachesDirectory the correct approach for this use case? Are there known differences in AVPlayer behavior between Simulator and real Apple TV devices (especially regarding buffering or lifecycle)? What is the recommended approach for implementing offline background audio on tvOS apps? Goal: We want to implement a reliable system where: Audio streams initially Seamlessly switches to local file after download Continues playing without interruption Supports offline playback on subsequent launches Any guidance or best practices would be greatly appreciated. Thank you!
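On the switching step specifically, one detail that can cause an audible stop is replacing the item without carrying the playback position over; a sketch of a position-preserving switch (localURL is assumed to be the finished download in Caches):

import AVFoundation

func switchToLocal(_ player: AVPlayer, localURL: URL) {
    let position = player.currentTime()
    let localItem = AVPlayerItem(url: localURL)
    player.replaceCurrentItem(with: localItem)
    // Seek the fresh item to where the stream left off before resuming.
    localItem.seek(to: position) { _ in
        player.play()
    }
}

Separately, on tvOS the Caches directory is purgeable at the system's discretion, so an offline-first launch should verify the file still exists and fall back to streaming if it does not.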
Replies: 0 · Boosts: 0 · Views: 175 · Created: Apr ’26