Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Post · Replies · Boosts · Views · Activity
macOS 26 – NSSound/CoreAudio causes SIGILL crash in caulk allocator
Hi everyone,

We are the engineering team behind an enterprise communications application for macOS. We are experiencing a critical crash on macOS 26 that did not occur on any previous macOS version. We are seeking clarification from Apple engineers or anyone who may have insight into this behaviour.

Environment:
  Architecture: x86_64
  macOS: 26.4.1 (25E253)
  Hardware: Mac15,13 (MacBook Pro)
  Exception: SIGILL / ILL_ILLOPC
  Crashed Thread: Thread 0 (Main Thread)
  Trigger: Playing a notification sound via NSSound during an incoming call

Crash Stack:
  0  caulk             consolidating_free_map::maybe_create_free_node + 119 ← SIGILL
  1  caulk             tiered_allocator + 1469
  2  caulk             exported_resource::do_allocate + 15
  3  AudioToolboxCore  EABLImpl::create + 204
  4  CoreAudio         AUNotQuiteSoSimpleTimeFactory + 33267
  8  AudioToolboxCore  AudioUnitInitialize + 189
  9  AudioToolbox      XAudioUnit::Initialize + 19
  10 AudioToolbox      MESubmixGraph::initialize + 125
  11 AudioToolbox      MESubmixGraph::connectInputChannel + 1172
  12 AudioToolbox      MEDeviceStreamClient::AddRunningClient + 509
  15 AudioToolbox      AudioQueueObject::StartRunning + 194
  16 AudioToolbox      AudioQueueObject::Start + 1447
  22 AudioToolbox      AQ::API::V2Impl::AudioQueueStartWithFlags + 805
  23 AVFAudio          AVAudioPlayerCpp::playQueue + 354
  24 AVFAudio          AVAudioPlayerCpp::DoAction + 134
  25 AVFAudio          -[AVAudioPlayer play] + 26
  26 AppKit            -[NSSound play] + 100
  27 Our App           -[AudioHelper tryToStartSound:ofType:] + 569
  28 Our App           block_invoke + 59

Behaviour Difference Between macOS Versions:
The exact same code path that triggers this crash on macOS 26 works without any issue on macOS 14 and macOS 15 — no crash, no warning, no log output of any kind. The crash occurs inside Apple's private caulk memory allocator during CoreAudio audio engine initialisation, triggered by a call to [NSSound play]. The SIGILL / ILL_ILLOPC at maybe_create_free_node + 119 suggests a hard ud2 trap — an intentional abort guard inserted at compile time. This strongly suggests that something changed in macOS 26 within NSSound / CoreAudio / caulk that causes this code path to fail in a way it previously did not.

Questions:
1. Was there a deliberate threading policy change in NSSound / CoreAudio in macOS 26?
2. Is the SIGILL in caulk::consolidating_free_map::maybe_create_free_node an intentional thread-affinity assertion introduced in macOS 26?
3. Are there any other NSSound / AVAudioPlayer / AudioQueue APIs that have similarly tightened their requirements in macOS 26 that we should be aware of?
4. Is there a migration guide, release note, or WWDC session that covers CoreAudio changes in macOS 26 that we may have missed?
5. Has anyone else in the developer community encountered a similar SIGILL crash in caulk on macOS 26 during audio playback?
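For anyone needing a stopgap while this is clarified: a sketch of a workaround worth trying, assuming (unverified) that the trap fires during the per-play engine bring-up that [NSSound play] triggers. NotificationSoundPlayer is a hypothetical helper name; the idea is to prepare one AVAudioPlayer up front so the audio queue is initialized once at startup rather than on every notification.

```swift
import AVFoundation

// Hypothetical stopgap: initialize the audio queue once at startup and
// reuse the player, instead of letting [NSSound play] bring the
// CoreAudio engine up on every incoming-call notification.
final class NotificationSoundPlayer {
    private var player: AVAudioPlayer?

    func prepare(contentsOf url: URL) throws {
        let p = try AVAudioPlayer(contentsOf: url)
        p.prepareToPlay()          // brings up the audio queue now
        player = p
    }

    func play() {
        player?.currentTime = 0    // restart from the beginning
        player?.play()
    }
}
```

Whether this avoids the caulk code path is untested; it only changes when the engine initialization happens, not whether it happens.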
Replies: 7 · Boosts: 0 · Views: 1k · Activity: 1h
Camera launched via Camera Control is terminated with “AVCaptureEventInteraction not installed” when viewing/editing photos
I’m seeing a reproducible system-level Camera crash/termination on iPhone Air running iOS 26.4.2.

Steps to reproduce:
1. Press Camera Control to launch the Camera app.
2. Tap the lower-left thumbnail to enter the recent photo view.
3. Browse photos, or tap Edit and start cropping a photo.
4. The Camera/Photos flow unexpectedly exits and returns to the Home Screen or widget view.

Additional detail: The issue can happen whether or not a new photo is taken after launching Camera with Camera Control. In other words, using Camera Control as a shortcut into Camera, then tapping the lower-left thumbnail to browse photos, can trigger the issue. Sometimes it happens while only browsing photos, without entering Edit.

Expected result: The photo viewer/editor should stay open and allow normal browsing or cropping.
Actual result: The flow exits unexpectedly.

Mac Console evidence: Around 2026-05-12 21:53:59–21:54:00, Console showed SpringBoard/RunningBoard terminating com.apple.camera. Relevant log excerpt:
  Capture Application Requirements Unmet: "AVCaptureEventInteraction not installed"
  reportType: CrashLog
  ReportCrash Parsing corpse data for pid 94087
  com.apple.camera: Foreground: false

Storage is sufficient. Restart/reset-style support steps have already been tried and did not resolve the issue. This appears specific to the Camera Control launch path, not normal Photos app browsing.

Has anyone else seen this on iOS 26.x, or is this a known Camera Control / AVCaptureEventInteraction regression? Already filed as FB22766094.
Replies: 0 · Boosts: 0 · Views: 52 · Activity: 8h
AVPlayerView: internal constraint conflicts
I’m getting Auto Layout constraint conflict warnings related to AVPlayerView in my project. I’ve reproduced the issue on macOS Tahoe 26.2. The conflict appears to originate inside AVPlayerView itself, between its internal subviews, rather than in my own layout code.

The issue can be easily reproduced in an empty project by simply adding an AVPlayerView as a subview using the code below.

class ViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let playerView = AVPlayerView()
        view.addSubview(playerView)
    }
}

After presenting that view controller, the following Auto Layout constraint conflict warnings appear in the console:

Conflicting constraints detected: <decode: bad range for [%@] got [offs:346 len:1057 within:0]>. Will attempt to recover by breaking <decode: bad range for [%@] got [offs:1403 len:81 within:0]>.
Unable to simultaneously satisfy constraints: (
    "<NSLayoutConstraint:0xb33c29950 H:|-(0)-[AVDesktopPlayerViewContentView:0x10164dce0](LTR) (active, names: '|':AVPlayerView:0xb32ecc000 )>",
    "<NSLayoutConstraint:0xb33c299a0 AVDesktopPlayerViewContentView:0x10164dce0.right == AVPlayerView:0xb32ecc000.right (active)>",
    "<NSAutoresizingMaskLayoutConstraint:0xb33c62850 h=--& v=--& AVPlayerView:0xb32ecc000.width == 0 (active)>",
    "<NSLayoutConstraint:0xb33d46df0 H:|-(0)-[AVEventPassthroughView:0xb33cfb480] (active, names: '|':AVDesktopPlayerViewContentView:0x10164dce0 )>",
    "<NSLayoutConstraint:0xb33d46e40 AVEventPassthroughView:0xb33cfb480.trailing == AVDesktopPlayerViewContentView:0x10164dce0.trailing (active)>",
    "<NSLayoutConstraint:0xb33ef8320 NSGlassView:0xb33ed8c00.trailing == AVEventPassthroughView:0xb33cfb480.trailing - 6 (active)>",
    "<NSLayoutConstraint:0xb33ef8460 NSGlassView:0xb33ed8c00.width == 180 (active)>",
    "<NSLayoutConstraint:0xb33ef84b0 NSGlassView:0xb33ed8c00.leading >= AVEventPassthroughView:0xb33cfb480.leading + 6 (active)>"
)
Will attempt to recover by breaking constraint <NSLayoutConstraint:0xb33ef8460 NSGlassView:0xb33ed8c00.width == 180 (active)>
Set the NSUserDefault NSConstraintBasedLayoutVisualizeMutuallyExclusiveConstraints to YES to have -[NSWindow visualizeConstraints:] automatically called when this happens. And/or, set a symbolic breakpoint on LAYOUT_CONSTRAINTS_NOT_SATISFIABLE to catch this in the debugger.

Is this a system bug, or does anyone know how to fix it? Thank you.
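One detail in the log worth noting: the unsatisfiable set includes an NSAutoresizingMaskLayoutConstraint pinning AVPlayerView's width to 0, which comes from adding the view with a zero frame and translated autoresizing constraints. A sketch of a variant reproducer that sizes the view explicitly instead (this may silence the conflict if the internal NSGlassView simply needs nonzero width to lay out; untested against AVPlayerView's internal constraints):

```swift
import AppKit
import AVKit

class ViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let playerView = AVPlayerView()
        // Opt out of the autoresizing-mask constraints (the width == 0
        // constraint in the log) and give the view a real size.
        playerView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(playerView)
        NSLayoutConstraint.activate([
            playerView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            playerView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            playerView.topAnchor.constraint(equalTo: view.topAnchor),
            playerView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
        ])
    }
}
```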
Replies: 4 · Boosts: 0 · Views: 961 · Activity: 13h
AVCaptureSession runtime error -11800 / 'what' on startRunning() with audio input — what's holding the HAL?
AVCaptureSession.startRunning() triggers AVCaptureSessionRuntimeErrorNotification with AVError.unknown (-11800), underlying OSStatus 2003329396 → fourCC 'what', on every cold launch, but only when an audio AVCaptureDeviceInput is attached. Removing only the audio input makes the error disappear. The same code in a fresh project records audio fine — the bug only appears in this app's binary. AVAudioApplication.shared.recordPermission == .granted. Info.plist has NSMicrophoneUsageDescription. No interruption notifications fire. Test device: iPhone 16 Pro, iOS 26.4.2. iOS deployment target 17.1.

Minimal reproducer:

import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()
let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)!
session.addInput(try AVCaptureDeviceInput(device: camera))
// Removing ONLY this line makes the error disappear:
let mic = AVCaptureDevice.default(for: .audio)!
session.addInput(try AVCaptureDeviceInput(device: mic))
session.addOutput(AVCaptureMovieFileOutput())
session.addOutput(AVCapturePhotoOutput())
session.commitConfiguration()
NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionRuntimeError, object: session, queue: nil
) { print($0.userInfo ?? [:]) }
session.startRunning() // -11800 / 'what' fires within ~2 sec

Observed state at error time:
- AVError.unknown (-11800)
- underlyingError = NSError(NSOSStatusErrorDomain, 2003329396)
- userInfo[AVErrorFourCharCode] = 'what'
- captureSession.isRunning = false ← never came up
- captureSession.isInterrupted = false
- captureSession.preset = .high
- captureSession.inputs = [Back Triple Camera, iPhone Microphone]
- AVAudioSession.sharedInstance(): category = .playAndRecord, mode = .videoRecording, sampleRate = 48000.0, isInputAvailable = true, isOtherAudioPlaying = false
- availableInputs = [MicrophoneBuiltIn] (no BT/Continuity/AirPods)
- currentRoute.inputs = [] ← EMPTY
- currentRoute.outputs = [Speaker|Speaker]

2003329396 = 0x77686174 = 'what'. From a few SO threads this maps to AURemoteIO::StartIO returning a HAL bring-up failure. The smoking gun: currentRoute.inputs is empty even though availableInputs contains the built-in mic, isInputAvailable is true, the category is .playAndRecord, and isOtherAudioPlaying is false. The HAL never routes the mic into the session, then 'what' follows. Nothing observable from AVAudioSession indicates a competing client.

Environment / SDKs linked: Firebase (SPM: Crashlytics, Performance, Messaging, Analytics, AppCheck, RemoteConfig, DynamicLinks), FBSDK, Kingfisher, MetalPetal. Multiple Google ad mediation pods present, but their audio session takeover is already disabled (audioVideoManager.isAudioSessionApplicationManaged = true, IMSdk.shouldAutoManageAVAudioSession(false)).

What I've ruled out (all still produce 'what'):
- Audio session config: .playAndRecord/.videoRecording, .playAndRecord/.default, .record/.measurement, .record/.default. With/without .defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP, .mixWithOthers. setActive(true) before vs. after attaching audio input. setPreferredInput(builtInMic) (verified accepted). 200 ms Thread.sleep between setActive(true) and startRunning(). Setting usesApplicationAudioSession = false swaps the fourCC to '!rec' but produces the same outcome.
- Topology: sessionPreset = .high / .hd1920x1080 / .hd1280x720 / .medium. Camera = .builtInTripleCamera / .builtInDualWideCamera / .builtInWideAngleCamera. AVCam-style always-attached graph. Setting sessionPreset before vs. after adding inputs.
- Threading: All session mutations on a single dedicated DispatchQueue (vs. Swift actor). 1× and 2× full stopRunning()+startRunning() recovery cycles ("do it twice" pattern) — both re-fail with 'what'.
- SDK takeover prevention: GoogleMobileAdsMediation pods (Vungle, Mintegral, Pangle, Unity, InMobi), Google-Mobile-Ads-SDK, MediaPipeTasksVision removed via full pod uninstall + clean build — 'what' persists.

Notifications during the failure window: 3 × AVAudioSession.routeChangeNotification reason categoryChange before the error fires, even though the category stays .playAndRecord/.videoRecording. Disabling automaticallyConfiguresApplicationAudioSession drops this to 1, but the runtime error still fires. No AVAudioSession.interruptionNotification. No AVCaptureSessionWasInterruptedNotification.

Symbol audit: otool -L and nm of the bundle confirm none of the linked frameworks reference AVAudioRecorder, AudioComponentInstanceNew, AURemoteIO, or AudioUnitInitialize in their symbol tables. Only the app's own files reference any audio API. Yet adding AVCaptureDeviceInput(.audio) reproduces 100% in this binary and 0% in a fresh project.

My questions:
1. Who is most likely holding the audio HAL in a process where no linked framework references the AudioUnit / HAL APIs directly?
2. Are there framework load-time audio initializations that don't show up in symbol tables (e.g., dynamic dlopen, CFBundleLoadExecutable) that could grab the HAL?
3. Is there an os_log subsystem / category that surfaces the underlying AURemoteIO::StartIO failure reason at runtime? com.apple.coreaudio shows 'what' but not the originating cause.
4. currentRoute.inputs is empty at error time even though availableInputs = [MicrophoneBuiltIn], isInputAvailable = true, and the category is .playAndRecord. What does an empty input route under those conditions imply, and what other system-level holders could be preventing the HAL from routing the mic in?
5. Has anyone seen 'what' resolve with a device reboot, an iOS update, or by removing a specific framework?

Happy to share a sysdiagnose. Thanks!
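As a small triage aid for anyone hitting opaque OSStatus values like this, the status-to-fourCC mapping quoted above (2003329396 → 'what') can be computed directly; a minimal sketch:

```swift
// Decode an OSStatus into its four-character code for log triage,
// e.g. 2003329396 (0x77686174) decodes to "what".
func fourCharCode(of status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = stride(from: 24, through: 0, by: -8).map { UInt8((n >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
}

// fourCharCode(of: 2003329396) → "what"
```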
Replies: 1 · Boosts: 0 · Views: 256 · Activity: 22h
PHPickerConfiguration.preselectedAssetIdentifiers does not work
let authStatus = PHPhotoLibrary.authorizationStatus(for: .readWrite)
let fetchResult = PHAsset.fetchAssets(withLocalIdentifiers: selectedAssetIDs, options: nil)
print("[AlbumCreation] authStatus=\(authStatus.rawValue) IDs=\(selectedAssetIDs.count) matched PHAssets=\(fetchResult.count)")
// result is: [AlbumCreation] authStatus=3 IDs=3 matched PHAssets=3

var config = PHPickerConfiguration(photoLibrary: .shared())
config.preselectedAssetIdentifiers = selectedAssetIDs
config.selectionLimit = 0
let picker = PHPickerViewController(configuration: config)
picker.delegate = self
present(picker, animated: true)
Replies: 1 · Boosts: 0 · Views: 168 · Activity: 1d
RotationCoordinator returns angles 90 degrees lower on iPhone 17 Pro front camera — clarification on contract with AVSampleBufferDisplayLayer
Hi AVFoundation team,

I'm seeing a uniform 90° offset in AVCaptureDevice.RotationCoordinator's reported angles between iPhone 17 Pro and iPhone 14 Pro using the front-facing .builtInWideAngleCamera (Center Stage on 17 Pro), and I'd like to confirm whether this is by design and what the recommended consumption pattern is when the rendering surface is an AVSampleBufferDisplayLayer rather than an AVCaptureVideoPreviewLayer. Here is the GitHub repo of the sample project.

Setup:
- Devices: iPhone 14 Pro (iOS 26.5) and iPhone 17 Pro (iOS 26.4.2)
- Camera: front, AVCaptureDeviceTypeBuiltInWideAngleCamera
- Active format: 1920×1080
- Three RotationCoordinator instances are created on the same AVCaptureDevice, varying only the previewLayer: argument:
  - previewLayer: nil
  - previewLayer: AVSampleBufferDisplayLayer (the surface receiving frames from AVCaptureVideoDataOutput)
  - previewLayer: AVCaptureVideoPreviewLayer (with .session = captureSession, not displayed)
- Each instance is KVO-observed for videoRotationAngleForHorizonLevelPreview and videoRotationAngleForHorizonLevelCapture.

Observed angles (preview / capture):

14 Pro · Portrait (interface=1)
  RC[nil]: 0° / 90°
  RC[AVSampleBufferLayer]: 90° / 90°
  RC[AVCaptureVideoPreviewLayer]: 0° / 90°
14 Pro · LandscapeRight (interface=3)
  RC[nil]: 0° / 180°
  RC[AVSampleBufferLayer]: 180° / 180°
  RC[AVCaptureVideoPreviewLayer]: 0° / 180°
17 Pro · Portrait (interface=1)
  RC[nil]: 0° / 0°
  RC[AVSampleBufferLayer]: 0° / 0°
  RC[AVCaptureVideoPreviewLayer]: 0° / 0°
17 Pro · LandscapeRight (interface=3)
  RC[nil]: 0° / 90°
  RC[AVSampleBufferLayer]: 90° / 90°
  RC[AVCaptureVideoPreviewLayer]: 0° / 90°

The −90° offset on 17 Pro is uniform: it appears in every RC variant, in both the preview-angle and the capture-angle properties, at every orientation tested. It is not specific to the previewLayer: argument.
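For context on the consumption-pattern half of the question: with an AVSampleBufferDisplayLayer the app has to apply the coordinator's preview angle itself (there is no capture connection doing it, unlike with AVCaptureVideoPreviewLayer). A sketch of the pattern assumed in the tests above, called from the KVO observation on the main thread:

```swift
import AVFoundation
import QuartzCore

// Applies the coordinator's current preview angle to the display layer.
// Assumes `coordinator` was created for the device delivering frames to
// `displayLayer`, as in the setup described above.
func applyPreviewRotation(_ coordinator: AVCaptureDevice.RotationCoordinator,
                          to displayLayer: AVSampleBufferDisplayLayer) {
    let degrees = coordinator.videoRotationAngleForHorizonLevelPreview
    let radians = degrees * .pi / 180
    displayLayer.setAffineTransform(CGAffineTransform(rotationAngle: radians))
}
```

If the 17 Pro offset is by design (e.g. a different sensor mounting), this pattern should still produce an upright preview on both devices, which is part of what needs confirming.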
Replies: 1 · Boosts: 0 · Views: 243 · Activity: 1d
Mac (Designed for iPad) cannot access microphone
I have a VoIP-style application that needs access to the microphone. I am using Mac (Designed for iPad) support to avoid huge amounts of conditional building for the many iOS-specific things my app includes. I never get prompted to allow microphone permissions, and my app name never appears under Privacy & Security → Microphone. So is Mac simply a dead end for any application that needs the microphone and runs under Mac (Designed for iPad) compatibility mode? Why doesn't TCC have some mechanism to notice mic use and grant access?
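One thing worth double-checking before concluding it's a platform dead end (it may well already be in place): a Mac (Designed for iPad) build takes its TCC prompt text from the iOS target's Info.plist, so the microphone usage string must be present there for the prompt to ever appear. A minimal fragment (the string value is just an example):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone for voice calls.</string>
```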
Replies: 3 · Boosts: 0 · Views: 430 · Activity: 2d
Apple Music playlist create works but DELETE returns 401 — and MusicKit write APIs are macOS‑unavailable. How to build a playlist editor on macOS?
I’m trying to build a playlist editor on macOS. I can create playlists via the Apple Music HTTP API, but DELETE always returns 401, even immediately after creation with the same tokens.

Minimal repro:

#!/usr/bin/env bash
set -euo pipefail

BASE_URL="https://api.music.apple.com/v1"
PLAYLIST_NAME="${PLAYLIST_NAME:-blah}"
: "${APPLE_MUSIC_DEV_TOKEN:?}"
: "${APPLE_MUSIC_USER_TOKEN:?}"

create_body="$(mktemp)"
delete_body="$(mktemp)"
trap 'rm -f "$create_body" "$delete_body"' EXIT

curl -sS --compressed -o "$create_body" -w "Create status: %{http_code}\n" \
  -X POST "${BASE_URL}/me/library/playlists" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{\"attributes\":{\"name\":\"${PLAYLIST_NAME}\"}}"

playlist_id="$(python3 - "$create_body" <<'PY'
import json, sys
with open(sys.argv[1], "r", encoding="utf-8") as f:
    data = json.load(f)
print(data["data"][0]["id"])
PY
)"

curl -sS --compressed -o "$delete_body" -w "Delete status: %{http_code}\n" \
  -X DELETE "${BASE_URL}/me/library/playlists/${playlist_id}" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json"

I capture the response bodies like this:

cat "$create_body"
cat "$delete_body"

Result: Create: 201, Delete: 401.

I also checked the latest macOS SDK’s MusicKit interfaces, and MusicLibrary.createPlaylist/edit/add(to:) are marked @available(macOS, unavailable), so I can’t create/delete via MusicKit on macOS either.

Question: How can I implement a playlist editor on macOS (create/delete/modify) if:
- MusicKit write APIs are unavailable on macOS, and
- the HTTP API can create but DELETE returns 401?

Any guidance or official workaround would be hugely appreciated.
Replies: 1 · Boosts: 2 · Views: 525 · Activity: 2d
Electron app + Apple Music playback: queue works, playback does not start. Looking for guidance.
Hi everyone. I’m building a macOS-first desktop app where music drives the app's behavior loop. The app is currently an Electron prototype.

The blocker: we’re testing Apple Music inside an Electron app. MusicKit JS authorization works, catalog search works, and setting the queue works, but playback does not actually start in Electron.

What we tried:
- Created Apple Developer / MusicKit credentials.
- Generated Apple Music developer tokens successfully.
- Retrieved a Music User Token through MusicKit JS.
- Confirmed Apple Music API calls work.
- Confirmed /v1/test and /me/storefront return 200 OK.
- Built a local HTTP auth/playback window inside Electron instead of using file://.
- Tested music.setQueue() with both { song: songId } and { url: catalogUrl }.

In Electron, the queue loads correctly: queueEmpty=false, queueLength=1, volume=1, playbackRate=1. But after music.play(), playbackTime stays at 0 and no audio plays.

Then we ran the same MusicKit playback test in normal Chrome using the same token, same local origin, same catalog track, and same queue descriptor. Chrome played successfully and playbackTime advanced.

We also checked Electron directly and found navigator.requestMediaKeySystemAccess is missing, so our current theory is that stock Electron lacks the protected media / EME support Apple Music web playback needs.

Important: we are not trying to bypass DRM or extract audio. We just want a legitimate way for a user-authorized macOS app to control Apple Music playback or observe playback state.

What we’re considering next:
- Use the native macOS Music app as the playback engine and control it from our app.
- Test AppleScript / Automation permissions for play, pause, next, current track, player state, etc.
- Later, possibly build a native Swift helper using Apple Music / MediaPlayer APIs and communicate with Electron over IPC.
- Avoid relying on Electron MusicKit JS playback if this is a known dead end.

Questions:
1. Has anyone successfully made Apple Music / MusicKit JS playback work inside Electron?
2. Is the missing EME/protected-media layer the expected blocker here?
3. Is controlling the native macOS Music app the more realistic path?
4. Any gotchas with AppleScript, MusicKit native APIs, or Electron + native helper architecture for this use case?

Any pointers from people who have dealt with Electron + Apple Music / protected media would be appreciated.
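For anyone wanting to reproduce the EME check described above in their own renderer, here is a tiny probe written as a pure function (so it can be exercised outside a browser by passing a navigator stand-in). It only tests for the presence of the requestMediaKeySystemAccess entry point that MusicKit JS playback relies on, which is what we found missing in stock Electron:

```javascript
// Probe for the protected-media (EME) entry point that web playback needs.
// Pass the real `navigator` in a renderer, or a stand-in object in tests.
function hasProtectedMediaSupport(nav) {
  return typeof nav.requestMediaKeySystemAccess === "function";
}

// In an Electron renderer or Chrome devtools:
//   hasProtectedMediaSupport(navigator)
```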
Replies: 0 · Boosts: 0 · Views: 37 · Activity: 2d
Radiometric interpretation of Apple ProRAW and Bayer RAW access via AVFoundation
I am working on a computational photography research project involving multi-exposure HDR reconstruction using Bayer RAW and Apple ProRAW captures. I would like to clarify the radiometric interpretation of Apple ProRAW and the availability of Bayer RAW capture through AVFoundation. My questions are:

1. On current iPhone Pro devices, is it possible for third-party apps to capture and export true Bayer-pattern RAW DNG files through AVFoundation, rather than Apple ProRAW linear DNG files? If so, which availableRawPhotoPixelFormatTypes correspond to Bayer RAW, and what device or format restrictions apply?
2. Apple ProRAW appears to be demosaiced and computationally processed, and may include multi-frame fusion. Is the decoded ProRAW image intended to be radiometrically linear and scene-referred?
3. For a bracketed ProRAW sequence captured with fixed ISO, white balance, lens, and focus, but different exposure times, can one assume that the decoded linear pixel values Y_i(p) satisfy an exposure-proportional model in non-saturated regions, such as Y_i(p) ≈ t_i R(p), across brackets?

This question is about radiometric consistency for algorithmic use, not about visual editing or tone mapping. Thank you for your help.
Replies: 0 · Boosts: 0 · Views: 138 · Activity: 3d
How to Monitor Any USB Audio or Video Device on macOS
USB cameras, microphones, HDMI capture cards, and audio interfaces are supposed to "just work" on macOS. In reality, it's often difficult to quickly access or monitor them without opening large and complicated software. Sometimes you simply want to see whether a USB camera is active. Sometimes you want to check an HDMI source connected through a capture card. And in other cases, you may want to use a Mac mini without a dedicated monitor by viewing its HDMI output through a USB capture device directly on another Mac.

macOS supports many modern USB AV devices out of the box, but it surprisingly lacks a simple built-in utility for live monitoring and recording. Most users end up using oversized streaming or editing applications just to preview a video signal or monitor audio input. That becomes especially noticeable with:
- USB webcams
- HDMI capture adapters
- USB microphones
- audio interfaces
- secondary computers
- headless Mac mini setups

A lightweight monitor utility is often much more practical when you only need real-time access to a device, want to record a stream, or want to quickly switch between multiple AV inputs. That's one of the reasons I built AV Monitor Pro, a native macOS app designed for monitoring and recording connected audio/video devices in real time. It can preview USB cameras, capture cards, microphones, and HDMI sources with minimal setup, and it's especially useful for workflows like running a Mac mini without a monitor, monitoring external devices, or recording live AV input directly on macOS.
Replies: 0 · Boosts: 0 · Views: 164 · Activity: 4d
AudioHardwareCreateProcessTap delivers all-zero buffers while system audio is audible
Summary: Using AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice for system audio capture. During long sessions, the AudioDeviceIOProc callback continues firing normally but every PCM sample is exactly 0.0f — while the system is producing audible output.

Environment:
  macOS: 26.5 Beta
  Hardware: MacBook Air (M2)
  API: AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice
  Tap: CATapDescription, processes = [], .unmuted, private
  Format: 48,000 Hz, Float32, interleaved stereo
  Aggregate anchor: kAudioAggregateDeviceMainSubDeviceKey = current default output UID

Observed behavior: After running normally for several minutes, the stream transitions into an all-zero state:
- AudioDeviceIOProc continues to fire at expected cadence
- Frame count, timestamps (mHostTime, mSampleTime), and mDataByteSize all look normal
- AudioBufferList pointers are valid
- Every sample in every buffer is exactly 0.0f
- Other apps are still producing audible output through the same output device
- The condition may self-recover or persist until the session is stopped

Confirmed via RMS logging both inside the IOProc and after the ring buffer consumer — data is zero on delivery, not introduced downstream.

Example: 51-minute session on MacBook Air M2.
- Segment 1 (~7 min): Three all-zero periods: 60 s, 53 s, 141 s. Real PCM briefly returned between them.
- Segment 2 (~44 min): Two all-zero periods: 16 min 3 s, 3 min 8 s.
IOProc cadence, timestamp deltas, default output UID, and kAudioDevicePropertyDeviceIsRunningSomewhere all remained normal throughout.

What I have ruled out:
- Actual silence: User was in an active video call and could hear participants through the output device.
- Default output device change: Monitored kAudioHardwarePropertyDefaultOutputDevice — no change during affected periods.
- IOProc stall: Heartbeat and kAudioDevicePropertyDeviceIsRunningSomewhere remained normal.
- Aggregate device destroyed: AudioObjectGetPropertyData on the aggregate UID continued returning the expected device.
- Tap descriptor misconfiguration: The same tap produces valid PCM earlier in the same session, so this is not a startup-time issue.

Why detection is hard: All-zero buffers from a broken tap are indistinguishable from legitimate silence (muted participant, waiting room, paused media). kAudioProcessPropertyIsRunningOutput reports whether a process has active output IO, not whether it is contributing non-zero samples — a muted Zoom call still reports true.

Possible correlations:
- Sample-rate renegotiation on the output device (44.1 kHz ↔ 48 kHz) when another app changes output
- Bluetooth device state changes (AirPods sleep/wake) where the UID stays the same
- MacBook Air more frequently affected than MacBook Pro
- Always occurs after extended uptime — the first few minutes are consistently clean

Current workaround: Full teardown and rebuild restores real PCM. Restarting the IOProc alone or recreating only the aggregate device is not reliable — both the Process Tap and Aggregate Device must be destroyed and recreated:
1. AudioDeviceStop
2. AudioDeviceDestroyIOProcID
3. AudioHardwareDestroyAggregateDevice
4. AudioHardwareDestroyProcessTap
5. AudioHardwareCreateProcessTap
6. AudioHardwareCreateAggregateDevice
7. Create + start new IOProc
Applying this automatically is risky because the condition cannot be reliably distinguished from legitimate silence.

Questions:
1. Expected failure mode? Can a Process Tap continue delivering zero-filled buffers while the system output is audible? Is this expected under certain device or routing conditions?
2. Detection signal? Is there any HAL property, notification, or diagnostic counter that distinguishes "sources are genuinely silent" from "the tap data path has stopped receiving the real mix"?
3. Targeted recovery? Is there a supported way to re-anchor or reset the tap data path without destroying and recreating both objects?
4. Full rebuild as intended workaround? If so, it would help to confirm this so developers can converge on a consistent approach.
5. Mixer activity signal? kAudioProcessPropertyIsRunningOutput reflects IO registration, not sample contribution. Is there any AudioProcess property that indicates a process is currently delivering non-zero audio to the system mixer?
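For reference, the zero-run check described above (RMS logging inside the IOProc) can be sketched as follows. This is a heuristic only, for exactly the reason the post states: sustained all-zero delivery cannot be distinguished with certainty from true silence. ZeroRunDetector and zeroRunThreshold are made-up names.

```swift
import CoreAudio

// Heuristic zero-run detector for use inside an AudioDeviceIOProc.
// Counts consecutive callbacks whose buffers contain only 0.0f samples
// (assumes the Float32 format described above).
final class ZeroRunDetector {
    private var consecutiveZeroCallbacks = 0
    let zeroRunThreshold: Int   // e.g. a few seconds' worth of callbacks

    init(zeroRunThreshold: Int) { self.zeroRunThreshold = zeroRunThreshold }

    // Returns true once the all-zero run exceeds the threshold.
    func process(_ list: UnsafeMutablePointer<AudioBufferList>) -> Bool {
        var allZero = true
        for buffer in UnsafeMutableAudioBufferListPointer(list) {
            guard let data = buffer.mData else { continue }
            let count = Int(buffer.mDataByteSize) / MemoryLayout<Float32>.size
            let samples = data.bindMemory(to: Float32.self, capacity: count)
            if (0..<count).contains(where: { samples[$0] != 0 }) {
                allZero = false
                break
            }
        }
        consecutiveZeroCallbacks = allZero ? consecutiveZeroCallbacks + 1 : 0
        return consecutiveZeroCallbacks >= zeroRunThreshold
    }
}
```

Whether to trigger the full teardown/rebuild automatically on this signal is the judgment call the post describes as risky.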
Replies: 0 · Boosts: 0 · Views: 222 · Activity: 5d
MusicKit playback completely broken after Apple Music “What’s New?” update screen until native app is opened
I’m developing a third-party Apple Music streaming app using MusicKit (ApplicationMusicPlayer + catalog requests).

Issue: Whenever Apple releases an Apple Music update that shows the “What’s New?” onboarding/modal screen in the native Apple Music app, MusicKit in our app completely breaks for all users. Attempts to play anything (queue, prepareToPlay, etc.) fail silently or with service-related errors. Playback and most MusicKit operations remain broken until the user opens the native Apple Music app, dismisses the “What’s New?” screen, and returns to our app. After that single native interaction (we deliberately stopped users from going any further within Apple Music to verify this), everything works perfectly again.

Reproduction steps:
1. Apple Music receives an update with the “What’s New?” screen.
2. User launches our third-party app and attempts playback. MusicKit fails.
3. User opens Apple Music → dismisses modal → returns to our app.
4. MusicKit works again.

Expected behavior: Third-party MusicKit apps should not become non-functional because the native Apple Music app has a pending onboarding screen. Shared backend services (account readiness, tokens, subscription state, etc.) should initialize independently.

Environment: iOS 26.4.2. Devices verified to be affected: iPhone 13 Pro, iPhone XR, iPhone 15.

Workarounds attempted:
- Re-requesting MusicAuthorization
- Recreating ApplicationMusicPlayer
- Stopping/re-queuing
- Backgrounding/foregrounding the app
None resolve it without the native Apple Music interaction.

This appears to be a recurring integration fragility with shared Apple Music services. Has anyone else seen this? Any recommended recovery path or API to force service initialization? Thanks!
Replies: 1 · Boosts: 0 · Views: 212 · Activity: 5d
CarPlay HID transport buttons remap to call-control during continuous mic capture (no opt-out API)
Hello, I am developing Uniq Intercom, a voice-only group communication app for motorcyclists (always-on intercom over WebRTC, used continuously for multi-hour rides). I am seeking guidance on an iOS audio session and CarPlay HID interaction I have not been able to resolve through documented APIs.

Problem: As soon as my app activates the microphone (yellow recording indicator visible), iOS appears to classify the app as a real-time communication participant, and CarPlay re-routes the steering-wheel / handlebar HID transport buttons (left / right / ok) from the media-control role to the call-control role (answer/decline). Because I do not register a CallKit / LiveCommunicationKit call (the session is a continuous group voice channel, not a discrete telephony call), there is no call object for those buttons to act upon — they effectively become inert.

Why this matters: Motorcyclists rely on the intercom for 4–6 hour rides. CarPlay is now built into a growing number of modern motorcycles (and, with aftermarket display units, virtually any bike), and any rider who uses any voice communication platform alongside it — Uniq Intercom, WhatsApp Call — currently runs into this same handlebar button remap. With the buttons inert, the rider's only remaining option is to reach for the motorcycle's touchscreen to skip a track or change navigation — this is unsafe. The exact same remap behavior occurs during a real WhatsApp or Phone call — but for those the remap is correct (answer/decline maps to a real call). For continuous voice apps without a CallKit-style discrete call, no equivalent path exists today. As both an iOS developer and a motorcyclist, I would very much like to see this resolved — solving it would meaningfully improve safety for every rider using an iPhone with CarPlay.
Configurations I have tested (all reproduce the symptom on iOS 18+ / 26 with wireless CarPlay):

AVAudioSession.Category.playAndRecord + .voiceChat mode + various option combinations (duckOthers, mixWithOthers, allowBluetoothHFP, allowBluetoothA2DP, defaultToSpeaker)
Same category with .videoChat mode (which @livekit/react-native defaults to)
Same category with .default mode (re-applied after setAudioModeAsync to defeat any framework override) — confirmed Mode = Default for a ~2 s window in the audiomxd log before WebRTC's setActive cycle returned the mode to .voiceChat. Buttons remained remapped during the .default window.
Disabling MPRemoteCommandCenter and clearing MPNowPlayingInfoCenter.default().nowPlayingInfo
JS-side override of WebRTC's global RTCAudioSessionConfiguration via @livekit/react-native's AudioSession.setAppleAudioConfiguration({audioMode: 'default'}) bridge, applied both before connect and after setAudioModeAsync to defeat library overrides

In every case the audiomxd system log confirms our session goes active (Mode = VoiceChat or Default, Recording = YES), and the CarPlay HID buttons are immediately remapped to call-control. The middle "OK" button remains functional because it is not part of the call-control mapping — confirming the buttons are not blocked, only re-purposed. The remap occurs the instant the iOS recording indicator appears, regardless of audio session mode. This led me to conclude the trigger is not the audio session mode but the combination of microphone permission + active session + (likely) the AUVoiceIO unit instantiated by WebRTC. I cannot find a public API path to suppress this classification while maintaining the always-on continuous voice channel.

My questions: Is there an entitlement or API that allows an app with active microphone capture to declare itself as a non-call media participant, keeping CarPlay HID transport buttons in the media role?
Is AVAudioSession.setPrefersEchoCancelledInput(_:) (iOS 18+) the intended path for retaining platform AEC under .default mode without the focus-engine "communication priority" marking? Documentation is sparse on its CarPlay arbitration implications.
Does the PushToTalk framework affect HID arbitration differently from playAndRecord + voiceChat? Our continuous-channel UX does not fit the PTT transmit-on-press model, but understanding the contrast would help.
If no current API exists, is this something the iOS Audio team would consider for future SDKs? Solving this would meaningfully improve safety for motorcycle and adventure-sport users on iOS.

Thank you for your time and any guidance you can offer. — Emre Erkaya / Uniq Intercom
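For reference, a minimal sketch of the configuration the setPrefersEchoCancelledInput question refers to: .playAndRecord with .default mode plus the recent echo-cancellation preference. Whether this keeps CarPlay's transport buttons in the media role is exactly the open question, and the availability version reflects my reading of the SDK, so treat the whole block as an assumption:

```swift
import AVFoundation

// Sketch only: configures a .default-mode session and opts into platform AEC
// via the iOS 18+ preference, instead of relying on .voiceChat mode.
// Whether this avoids the CarPlay call-control remap is unverified.
func configureNonCallCaptureSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetoothA2DP, .mixWithOthers])
    if #available(iOS 18.2, *) {
        // Ask for echo-cancelled input without the voice-chat mode marking.
        if session.isEchoCancelledInputAvailable {
            try session.setPrefersEchoCancelledInput(true)
        }
    }
    try session.setActive(true)
}
```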
Replies
1
Boosts
0
Views
123
Activity
5d
Issues with monitoring and changing WebRTC audio output device in WKWebView
I am developing a VoIP app that uses WebRTC inside a WKWebView.

Question 1: How can I monitor which audio output device WebRTC is currently using? I want to display this information in the UI for the user.

Question 2: How can I change the current audio output device for WebRTC? I am using a JS bridge to Objective-C code, attempting to change the audio device with the following code:

void set_speaker(int n) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *err = nil;
    if (n == 1) {
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&err];
    } else {
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&err];
    }
}

However, this approach does not work. I am testing on an iPhone with iOS 16.7. Is a higher iOS version required?
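For Question 1, one possible approach (assuming WebRTC inside WKWebView shares the app-wide AVAudioSession, which it normally does) is to observe route changes on the shared session and surface the current output port names in the UI. This is a sketch, not verified against every WKWebView configuration:

```swift
import AVFoundation

// Sketch: observe the app-wide audio route. WKWebView's WebRTC audio is assumed
// to flow through the shared AVAudioSession, so the current output port of that
// session is what the user hears.
final class OutputRouteMonitor {
    var onRouteChange: ((String) -> Void)?

    init() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.reportCurrentOutput()
        }
        reportCurrentOutput()
    }

    private func reportCurrentOutput() {
        // e.g. "Speaker", "Receiver", "AirPods Pro"
        let names = AVAudioSession.sharedInstance().currentRoute.outputs
            .map { $0.portName }
            .joined(separator: ", ")
        onRouteChange?(names)
    }
}
```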
Replies
2
Boosts
0
Views
355
Activity
6d
AVMutableComposition audio silently drops on iOS 26 when streaming over HTTP/2 (FB22696516)
We've discovered a regression in iOS 26 where AVMutableComposition silently drops audio when the source asset is streamed over HTTP/2. The same file served over HTTP/1.1 plays audio correctly through the same composition code. Direct AVPlayer playback (without composition) works fine on HTTP/2. This did not occur on iOS 18.x. It happens on physical devices only. It does not reproduce on a simulator or on macOS.

Tested conditions (same MP4 file, different CDNs):
CloudFront (HTTP/2) + Composition → ❌ Audio silent
Cloudflare (HTTP/2) + Composition → ❌ Audio silent
Akamai (HTTP/1.1) + Composition → ✅ Audio works
Apple TS (HTTP/1.1) + Composition → ✅ Audio works
Downloaded locally, then composed → ✅ Audio works
Direct playback, no composition (HTTP/2) → ✅ Audio works

The CloudFront and Akamai URLs serve the identical file — same S3 object, different CDN edge. CDN vendor doesn't matter; any HTTP/2 source triggers it.

Minimal reproduction:

let asset = AVURLAsset(url: http2URL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first!
let audioTrack = try await asset.loadTracks(withMediaType: .audio).first!
let duration = try await asset.load(.duration)

let composition = AVMutableComposition()
let fullRange = CMTimeRange(start: .zero, end: duration)

let compVideo = composition.addMutableTrack(withMediaType: .video,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)!
try compVideo.insertTimeRange(fullRange, of: videoTrack, at: .zero)

let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)!
try compAudio.insertTimeRange(fullRange, of: audioTrack, at: .zero)

let item = AVPlayerItem(asset: composition.copy() as! AVComposition)
player.replaceCurrentItem(with: item)
player.play() // Video plays, audio goes silent after a while

Playing the same asset directly works fine:

player.replaceCurrentItem(with: AVPlayerItem(asset: asset))
player.play() // Both video and audio work

Filed as FB22696516. Sample project: https://github.com/karlingen/AVCompositionBug
Replies
1
Boosts
9
Views
212
Activity
1w
AVAudioEngineConfigurationChangeNotification received while engine is running
The documentation for AVAudioEngineConfigurationChangeNotification states:

When the audio engine’s I/O unit observes a change to the audio input or output hardware’s channel count or sample rate, the audio engine stops, uninitializes itself, and issues this notification.

A user of my framework has reported a crash during notification processing on iOS 26.4 when the main mixer node is disconnected from the output node in order to reestablish the connection with a different format. The failing precondition is:

com.apple.coreaudio.avfaudio: required condition is false: !IsRunning()

The report was observed on iPhone 16 / iOS 26.4.2, ARM64, TestFlight build. The backtrace contains:

[Last Exception Backtrace]
3 AVFAudio AVAudioEngineGraph::_DisconnectInput AVAudioEngineGraph.mm:2728
4 AVFAudio -[AVAudioEngine disconnectNodeInput:bus:] AVAudioEngine.mm:155
5 SFB sfb::AudioPlayer::handleAudioEngineConfigurationChange AudioPlayer.mm:2247

[Thread 18 Crashed]
9 SFB sfb::AudioPlayer::handleAudioEngineConfigurationChange AudioPlayer.mm:2212
…
14 AVFAudio IOUnitConfigurationChanged

Has the behavior for AVAudioEngineConfigurationChangeNotification changed in iOS 26.4? It's simple enough to call [engine_ stop] in the notification handler, but the documentation states this shouldn't be necessary. I've not observed a similar crash on previous iOS versions.
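A defensive sketch of the workaround mentioned above: stopping the engine before touching the graph in the handler. Whether this should be necessary is the open question; the handler shape is standard AVFoundation, the explicit stop() guard is the assumption:

```swift
import AVFoundation

// Sketch: defensively stop the engine before reconfiguring the graph in the
// configuration-change handler, even though the documentation says the engine
// has already stopped by the time the notification fires.
func installConfigurationChangeHandler(for engine: AVAudioEngine) {
    NotificationCenter.default.addObserver(
        forName: .AVAudioEngineConfigurationChange,
        object: engine,
        queue: nil
    ) { _ in
        // Guard against the engine still reporting as running (the !IsRunning()
        // precondition seen in the crash above).
        if engine.isRunning {
            engine.stop()
        }
        engine.disconnectNodeInput(engine.mainMixerNode)
        let format = engine.outputNode.outputFormat(forBus: 0)
        engine.connect(engine.mainMixerNode, to: engine.outputNode, format: format)
        do {
            try engine.start()
        } catch {
            // Handle restart failure (route gone, session inactive, etc.)
        }
    }
}
```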
Replies
0
Boosts
1
Views
143
Activity
1w
macOS 26 – NSSound/CoreAudio causes SIGILL crash in caulk allocator
Hi everyone,

We are the engineering team behind an enterprise communications application for macOS. We are experiencing a critical crash on macOS 26 that did not occur on any previous macOS version. We are seeking clarification from Apple engineers or anyone who may have insight into this behaviour.

Environment:
Architecture: x86_64
macOS: 26.4.1 (25E253)
Hardware: Mac15,13 (MacBook Pro)
Exception: SIGILL / ILL_ILLOPC
Crashed Thread: Thread 0 (Main Thread)
Trigger: Playing a notification sound via NSSound during an incoming call

Crash Stack:
0 caulk consolidating_free_map::maybe_create_free_node + 119 ← SIGILL
1 caulk tiered_allocator + 1469
2 caulk exported_resource::do_allocate + 15
3 AudioToolboxCore EABLImpl::create + 204
4 CoreAudio AUNotQuiteSoSimpleTimeFactory + 33267
8 AudioToolboxCore AudioUnitInitialize + 189
9 AudioToolbox XAudioUnit::Initialize + 19
10 AudioToolbox MESubmixGraph::initialize + 125
11 AudioToolbox MESubmixGraph::connectInputChannel + 1172
12 AudioToolbox MEDeviceStreamClient::AddRunningClient + 509
15 AudioToolbox AudioQueueObject::StartRunning + 194
16 AudioToolbox AudioQueueObject::Start + 1447
22 AudioToolbox AQ::API::V2Impl::AudioQueueStartWithFlags + 805
23 AVFAudio AVAudioPlayerCpp::playQueue + 354
24 AVFAudio AVAudioPlayerCpp::DoAction + 134
25 AVFAudio -[AVAudioPlayer play] + 26
26 AppKit -[NSSound play] + 100
27 Our App -[AudioHelper tryToStartSound:ofType:] + 569
28 Our App block_invoke + 59

Behaviour Difference Between macOS Versions

The exact same code path that triggers this crash on macOS 26 works without any issue on macOS 14 and macOS 15 — no crash, no warning, no log output of any kind. The crash occurs inside Apple's private caulk memory allocator during CoreAudio audio engine initialisation, triggered by a call to [NSSound play]. The SIGILL / ILL_ILLOPC at maybe_create_free_node + 119 suggests a hard ud2 trap — an intentional abort guard inserted at compile time.
This strongly suggests that something changed in macOS 26 within NSSound / CoreAudio / caulk that causes this code path to fail in a way it previously did not.

Questions

We have the following specific questions:
Was there a deliberate threading policy change in NSSound / CoreAudio in macOS 26?
Is the SIGILL in caulk::consolidating_free_map::maybe_create_free_node an intentional thread-affinity assertion introduced in macOS 26?
Are there any other NSSound / AVAudioPlayer / AudioQueue APIs that have similarly tightened their requirements in macOS 26 that we should be aware of?
Is there a migration guide, release note, or WWDC session that covers CoreAudio changes in macOS 26 that we may have missed?
Has anyone else in the developer community encountered a similar SIGILL crash in caulk on macOS 26 during audio playback?
Replies
7
Boosts
0
Views
1k
Activity
1h
Camera launched via Camera Control is terminated with “AVCaptureEventInteraction not installed” when viewing/editing photos
I’m seeing a reproducible system-level Camera crash/termination on iPhone Air running iOS 26.4.2.

Steps to reproduce:
Press Camera Control to launch the Camera app.
Tap the lower-left thumbnail to enter the recent photo view.
Browse photos, or tap Edit and start cropping a photo.
The Camera/Photos flow unexpectedly exits and returns to the Home Screen or widget view.

Additional detail: The issue can happen whether or not a new photo is taken after launching Camera with Camera Control. In other words, using Camera Control as a shortcut into Camera, then tapping the lower-left thumbnail to browse photos, can trigger the issue. Sometimes it happens while only browsing photos, without entering Edit.

Expected result: The photo viewer/editor should stay open and allow normal browsing or cropping.
Actual result: The flow exits unexpectedly.

Mac Console evidence: Around 2026-05-12 21:53:59–21:54:00, Console showed SpringBoard/RunningBoard terminating com.apple.camera. Relevant log excerpt:

Capture Application Requirements Unmet: "AVCaptureEventInteraction not installed"
reportType: CrashLog
ReportCrash Parsing corpse data for pid 94087
com.apple.camera: Foreground: false

Storage is sufficient. Restart/reset-style support steps have already been tried and did not resolve the issue. This appears specific to the Camera Control launch path, not normal Photos app browsing. Has anyone else seen this on iOS 26.x, or is this a known Camera Control / AVCaptureEventInteraction regression? Already filed as FB22766094.
Replies
0
Boosts
0
Views
52
Activity
8h
AVPlayerView: internal constraint conflicts
I’m getting Auto Layout constraint conflict warnings related to AVPlayerView in my project. I’ve reproduced the issue on macOS Tahoe 26.2. The conflict appears to originate inside AVPlayerView itself, between its internal subviews, rather than in my own layout code. This issue can be easily reproduced in an empty project by simply adding an AVPlayerView as a subview using the code below.

class ViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let playerView = AVPlayerView()
        view.addSubview(playerView)
    }
}

After presenting that view controller, the following Auto Layout constraint conflict warnings appear in the console:

Conflicting constraints detected: <decode: bad range for [%@] got [offs:346 len:1057 within:0]>. Will attempt to recover by breaking <decode: bad range for [%@] got [offs:1403 len:81 within:0]>.
Unable to simultaneously satisfy constraints: (
  "<NSLayoutConstraint:0xb33c29950 H:|-(0)-[AVDesktopPlayerViewContentView:0x10164dce0](LTR) (active, names: '|':AVPlayerView:0xb32ecc000 )>",
  "<NSLayoutConstraint:0xb33c299a0 AVDesktopPlayerViewContentView:0x10164dce0.right == AVPlayerView:0xb32ecc000.right (active)>",
  "<NSAutoresizingMaskLayoutConstraint:0xb33c62850 h=--& v=--& AVPlayerView:0xb32ecc000.width == 0 (active)>",
  "<NSLayoutConstraint:0xb33d46df0 H:|-(0)-[AVEventPassthroughView:0xb33cfb480] (active, names: '|':AVDesktopPlayerViewContentView:0x10164dce0 )>",
  "<NSLayoutConstraint:0xb33d46e40 AVEventPassthroughView:0xb33cfb480.trailing == AVDesktopPlayerViewContentView:0x10164dce0.trailing (active)>",
  "<NSLayoutConstraint:0xb33ef8320 NSGlassView:0xb33ed8c00.trailing == AVEventPassthroughView:0xb33cfb480.trailing - 6 (active)>",
  "<NSLayoutConstraint:0xb33ef8460 NSGlassView:0xb33ed8c00.width == 180 (active)>",
  "<NSLayoutConstraint:0xb33ef84b0 NSGlassView:0xb33ed8c00.leading >= AVEventPassthroughView:0xb33cfb480.leading + 6 (active)>"
)
Will attempt to recover by breaking constraint <NSLayoutConstraint:0xb33ef8460 NSGlassView:0xb33ed8c00.width == 180 (active)>
Set the NSUserDefault NSConstraintBasedLayoutVisualizeMutuallyExclusiveConstraints to YES to have -[NSWindow visualizeConstraints:] automatically called when this happens. And/or, set a symbolic breakpoint on LAYOUT_CONSTRAINTS_NOT_SATISFIABLE to catch this in the debugger.

Is this a system bug, or does someone know how to fix it? Thank you.
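One thing visible in the log is the NSAutoresizingMaskLayoutConstraint pinning AVPlayerView.width == 0: the view was added with a zero-size frame, so AVPlayerView's internal subviews (which have minimum widths, e.g. the 180 pt NSGlassView) cannot fit. A hedged sketch of sizing the view properly, which may or may not silence the internal conflict on macOS 26:

```swift
import AppKit
import AVKit

// Sketch: give AVPlayerView a real size instead of the default zero frame.
// The conflicting "width == 0" autoresizing constraint in the log comes from
// adding the view with an empty frame; whether constraining it fully silences
// the internal warnings on macOS 26 is unverified.
class ViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let playerView = AVPlayerView()
        playerView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(playerView)
        NSLayoutConstraint.activate([
            playerView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            playerView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            playerView.topAnchor.constraint(equalTo: view.topAnchor),
            playerView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
        ])
    }
}
```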
Replies
4
Boosts
0
Views
961
Activity
13h
AVCaptureSession runtime error -11800 / 'what' on startRunning() with audio input — what's holding the HAL?
AVCaptureSession.startRunning() triggers AVCaptureSessionRuntimeErrorNotification with AVError.unknown (-11800), underlying OSStatus 2003329396 → fourCC 'what', every cold launch, but only when an audio AVCaptureDeviceInput is attached. Removing only the audio input makes the error disappear. Same code in a fresh project records audio fine — the bug only appears in this app's binary. AVAudioApplication.shared.recordPermission == .granted. Info.plist has NSMicrophoneUsageDescription. No interruption notifications fire. Test device: iPhone 16 Pro, iOS 26.4.2. iOS deployment target 17.1.

Minimal reproducer:

import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()
let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)!
session.addInput(try AVCaptureDeviceInput(device: camera))
// Removing ONLY this line makes the error disappear:
let mic = AVCaptureDevice.default(for: .audio)!
session.addInput(try AVCaptureDeviceInput(device: mic))
session.addOutput(AVCaptureMovieFileOutput())
session.addOutput(AVCapturePhotoOutput())
session.commitConfiguration()
NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionRuntimeError, object: session, queue: nil
) { print($0.userInfo ?? [:]) }
session.startRunning() // -11800 / 'what' fires within ~2 sec

Observed state at error time:
AVError.unknown (-11800)
underlyingError = NSError(NSOSStatusErrorDomain, 2003329396)
userInfo[AVErrorFourCharCode] = 'what'
captureSession.isRunning = false ← never came up
captureSession.isInterrupted = false
captureSession.preset = .high
captureSession.inputs = [Back Triple Camera, iPhone Microphone]

AVAudioSession.sharedInstance():
category = .playAndRecord
mode = .videoRecording
sampleRate = 48000.0
isInputAvailable = true
isOtherAudioPlaying = false
availableInputs = [MicrophoneBuiltIn] (no BT/Continuity/AirPods)
currentRoute.inputs = [] ← EMPTY
currentRoute.outputs = [Speaker|Speaker]

2003329396 = 0x77686174 = 'what'.
From a few SO threads this maps to AURemoteIO::StartIO returning a HAL-bring-up failure. The smoking gun: currentRoute.inputs is empty even though availableInputs contains the built-in mic, isInputAvailable is true, the category is .playAndRecord, and isOtherAudioPlaying is false. The HAL never routes the mic into the session, then 'what' follows. Nothing observable from AVAudioSession indicates a competing client.

Environment / SDKs linked: Firebase (SPM: Crashlytics, Performance, Messaging, Analytics, AppCheck, RemoteConfig, DynamicLinks), FBSDK, Kingfisher, MetalPetal. Multiple Google ad mediation pods present, but their audio session takeover is already disabled (audioVideoManager.isAudioSessionApplicationManaged = true, IMSdk.shouldAutoManageAVAudioSession(false)).

What I've ruled out (all still produce 'what'):

Audio session config: .playAndRecord/.videoRecording, .playAndRecord/.default, .record/.measurement, .record/.default. With/without .defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP, .mixWithOthers. setActive(true) before vs. after attaching the audio input. setPreferredInput(builtInMic) (verified accepted). 200 ms Thread.sleep between setActive(true) and startRunning(). Setting usesApplicationAudioSession = false swaps the fourCC to '!rec' but produces the same outcome.

Topology: sessionPreset = .high / .hd1920x1080 / .hd1280x720 / .medium. Camera = .builtInTripleCamera / .builtInDualWideCamera / .builtInWideAngleCamera. AVCam-style always-attached graph. Setting sessionPreset before vs. after adding inputs.

Threading: All session mutations on a single dedicated DispatchQueue (vs. Swift actor). 1× and 2× full stopRunning()+startRunning() recovery cycles ("do it twice" pattern) — both re-fail with 'what'.

SDK takeover prevention: GoogleMobileAdsMediation pods (Vungle, Mintegral, Pangle, Unity, InMobi), Google-Mobile-Ads-SDK, MediaPipeTasksVision removed via full pod uninstall + clean build — 'what' persists.
Notifications during the failure window: 3 × AVAudioSession.routeChangeNotification with reason categoryChange before the error fires, even though the category stays .playAndRecord/.videoRecording. Disabling automaticallyConfiguresApplicationAudioSession drops this to 1, but the runtime error still fires. No AVAudioSession.interruptionNotification. No AVCaptureSessionWasInterruptedNotification.

Symbol audit: otool -L and nm of the bundle confirm none of the linked frameworks reference AVAudioRecorder, AudioComponentInstanceNew, AURemoteIO, or AudioUnitInitialize in their symbol tables. Only the app's own files reference any audio API. Yet adding AVCaptureDeviceInput(.audio) reproduces 100% in this binary and 0% in a fresh project.

My questions:
Who is most likely holding the audio HAL in a process where no linked framework references the AudioUnit / HAL APIs directly?
Are there framework load-time audio initializations that don't show up in symbol tables (e.g., dynamic dlopen, CFBundleLoadExecutable) that could grab the HAL?
Is there an os_log subsystem / category that surfaces the underlying AURemoteIO::StartIO failure reason at runtime? com.apple.coreaudio shows 'what' but not the originating cause.
currentRoute.inputs is empty at error time even though availableInputs = [MicrophoneBuiltIn], isInputAvailable = true, and the category is .playAndRecord. What does an empty input route under those conditions imply, and what other system-level holders could be preventing the HAL from routing the mic in?
Has anyone seen 'what' resolve with a device reboot, an iOS update, or by removing a specific framework?

Happy to share a sysdiagnose. Thanks!
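For anyone reproducing this, a small sketch of pulling the underlying OSStatus out of the runtime-error notification and rendering it as a fourCC (this is how 2003329396 decodes to 'what'). The notification keys are standard AVFoundation; the decoding helper is mine:

```swift
import AVFoundation

// Decode a CoreAudio OSStatus into its four-character-code form,
// e.g. 2003329396 -> "what".
func fourCC(_ status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = [24, 16, 8, 0].map { UInt8((n >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? String(status)
}

// Sketch: extract the underlying error from AVCaptureSessionRuntimeError.
func handleRuntimeError(_ note: Notification) {
    guard let error = note.userInfo?[AVCaptureSessionErrorKey] as? NSError else { return }
    if let underlying = error.userInfo[NSUnderlyingErrorKey] as? NSError,
       underlying.domain == NSOSStatusErrorDomain {
        print("OSStatus \(underlying.code) = '\(fourCC(OSStatus(underlying.code)))'")
    }
}
```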
Replies
1
Boosts
0
Views
256
Activity
22h
PHPickerConfiguration.preselectedAssetIdentifiers does not work
let authStatus = PHPhotoLibrary.authorizationStatus(for: .readWrite)
let fetchResult = PHAsset.fetchAssets(withLocalIdentifiers: selectedAssetIDs, options: nil)
print("[AlbumCreation] authStatus=\(authStatus.rawValue) IDs=\(selectedAssetIDs.count) PHAsset matches=\(fetchResult.count)")
// result is: [AlbumCreation] authStatus=3 IDs=3 PHAsset matches=3

var config = PHPickerConfiguration(photoLibrary: .shared())
config.preselectedAssetIdentifiers = selectedAssetIDs
config.selectionLimit = 0
let picker = PHPickerViewController(configuration: config)
picker.delegate = self
present(picker, animated: true)
Replies
1
Boosts
0
Views
168
Activity
1d
RotationCoordinator returns angles 90 degrees lower on iPhone 17 Pro front camera — clarification on contract with AVSampleBufferDisplayLayer
Hi AVFoundation team, I'm seeing a uniform 90° offset in AVCaptureDevice.RotationCoordinator's reported angles between iPhone 17 Pro and iPhone 14 Pro using the front-facing .builtInWideAngleCamera (Center Stage on 17 Pro), and I'd like to confirm whether this is by design and what the recommended consumption pattern is when the rendering surface is an AVSampleBufferDisplayLayer rather than an AVCaptureVideoPreviewLayer. Here is the github repo of sample project. Setup Devices: iPhone 14 Pro (iOS 26.5) and iPhone 17 Pro (iOS 26.4.2) Camera: front, AVCaptureDeviceTypeBuiltInWideAngleCamera Active format: 1920×1080 Three RotationCoordinator instances are created on the same AVCaptureDevice, varying only the previewLayer: argument: - previewLayer: nil - previewLayer: AVSampleBufferDisplayLayer (the surface receiving frames from AVCaptureVideoDataOutput) - previewLayer: AVCaptureVideoPreviewLayer (with .session = captureSession, not displayed) Each instance is KVO-observed for videoRotationAngleForHorizonLevelPreview and videoRotationAngleForHorizonLevelCapture. 
Observed angles Device / Orientation: 14 Pro · Portrait (interface=1) RC[nil] prev / cap: 0° / 90° RC[AVSampleBufferLayer] prev / cap: 90° / 90° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 90° ──────────────────────────────────────── Device / Orientation: 14 Pro · LandscapeRight (interface=3) RC[nil] prev / cap: 0° / 180° RC[AVSampleBufferLayer] prev / cap: 180° / 180° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 180° ──────────────────────────────────────── Device / Orientation: 17 Pro · Portrait (interface=1) RC[nil] prev / cap: 0° / 0° RC[AVSampleBufferLayer] prev / cap: 0° / 0° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 0° ──────────────────────────────────────── Device / Orientation: 17 Pro · LandscapeRight (interface=3) RC[nil] prev / cap: 0° / 90° RC[AVSampleBufferLayer] prev / cap: 90° / 90° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 90° The −90° offset on 17 Pro is uniform: it appears in every RC variant, in both the preview-angle and the capture-angle properties, at every orientation tested. It is not specific to the previewLayer: argument.
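For context on the AVSampleBufferDisplayLayer consumption pattern the question asks about: unlike AVCaptureVideoPreviewLayer, a display layer does not rotate itself, so the preview angle has to be applied manually. A sketch of that pattern, assuming the coordinator is created with the display layer as its previewLayer: argument (as in the second RC variant above):

```swift
import AVFoundation
import QuartzCore

// Sketch: apply the coordinator's preview rotation to an
// AVSampleBufferDisplayLayer, which does not auto-rotate the way
// AVCaptureVideoPreviewLayer does. KVO drives updates.
final class DisplayLayerRotator: NSObject {
    private let coordinator: AVCaptureDevice.RotationCoordinator
    private let layer: AVSampleBufferDisplayLayer
    private var observation: NSKeyValueObservation?

    init(device: AVCaptureDevice, layer: AVSampleBufferDisplayLayer) {
        self.coordinator = AVCaptureDevice.RotationCoordinator(device: device,
                                                               previewLayer: layer)
        self.layer = layer
        super.init()
        observation = coordinator.observe(
            \.videoRotationAngleForHorizonLevelPreview,
            options: [.initial, .new]
        ) { [weak self] coordinator, _ in
            // Angles are reported in degrees; CALayer transforms take radians.
            let radians = coordinator.videoRotationAngleForHorizonLevelPreview * .pi / 180
            self?.layer.setAffineTransform(CGAffineTransform(rotationAngle: radians))
        }
    }
}
```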
Replies
1
Boosts
0
Views
243
Activity
1d
Mac (Designed for iPad) cannot access microphone
I have an application that is a VOIP application of sorts that needs access to the microphone. I am using the Mac (Designed for iPad) support to not have to do huge amounts of conditional building and support for all the many iOS specific things my app includes. I never get prompted to allow microphone permissions and I never see my app name appear in Privacy & Security -> Microphone permissions setup. So is it that Mac is just a dead end for any form of an application that needs a microphone and is running under Mac (Designed for iPad) compatibility mode? Why doesn't TCC have some mechanism to notice and grant access to mic use?
Replies
3
Boosts
0
Views
430
Activity
2d
Apple Music playlist create/delete works but DELETE returns 401 — and MusicKit write APIs are macOS‑unavailable. How to build a playlist editor on macOS?
I’m trying to build a playlist editor on macOS. I can create playlists via the Apple Music HTTP API, but DELETE always returns 401, even immediately after creation with the same tokens.

Minimal repro:

#!/usr/bin/env bash
set -euo pipefail

BASE_URL="https://api.music.apple.com/v1"
PLAYLIST_NAME="${PLAYLIST_NAME:-blah}"
: "${APPLE_MUSIC_DEV_TOKEN:?}"
: "${APPLE_MUSIC_USER_TOKEN:?}"

create_body="$(mktemp)"
delete_body="$(mktemp)"
trap 'rm -f "$create_body" "$delete_body"' EXIT

curl -sS --compressed -o "$create_body" -w "Create status: %{http_code}\n" \
  -X POST "${BASE_URL}/me/library/playlists" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{\"attributes\":{\"name\":\"${PLAYLIST_NAME}\"}}"

playlist_id="$(python3 - "$create_body" <<'PY'
import json, sys
with open(sys.argv[1], "r", encoding="utf-8") as f:
    data = json.load(f)
print(data["data"][0]["id"])
PY
)"

curl -sS --compressed -o "$delete_body" -w "Delete status: %{http_code}\n" \
  -X DELETE "${BASE_URL}/me/library/playlists/${playlist_id}" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json"

I capture the response bodies like this:

cat "$create_body"
cat "$delete_body"

Result: Create: 201, Delete: 401.

I also checked the latest macOS SDK’s MusicKit interfaces, and MusicLibrary.createPlaylist/edit/add(to:) are marked @available(macOS, unavailable), so I can’t create/delete via MusicKit on macOS either.

Question: How can I implement a playlist editor on macOS (create/delete/modify) if:
MusicKit write APIs are unavailable on macOS, and
The HTTP API can create but DELETE returns 401?

Any guidance or official workaround would be hugely appreciated.
Replies
1
Boosts
2
Views
525
Activity
2d
Electron app + Apple Music playback: queue works, playback does not start. Looking for guidance.
Hi everyone. I’m building a macOS-first desktop app where music drives the app's behavior loop. The app is currently an Electron prototype.

The blocker: we’re testing Apple Music inside an Electron app. MusicKit JS authorization works, catalog search works, and setting the queue works, but playback does not actually start in Electron.

What we tried:
Created Apple Developer / MusicKit credentials.
Generated Apple Music developer tokens successfully.
Retrieved a Music User Token through MusicKit JS.
Confirmed Apple Music API calls work.
Confirmed /v1/test and /me/storefront return 200 OK.
Built a local HTTP auth/playback window inside Electron instead of using file://.
Tested music.setQueue() with both { song: songId } and { url: catalogUrl }.

In Electron, the queue loads correctly: queueEmpty=false, queueLength=1, volume=1, playbackRate=1. But after music.play(), playbackTime stays at 0 and no audio plays.

Then we ran the same MusicKit playback test in normal Chrome using the same token, same local origin, same catalog track, and same queue descriptor. Chrome played successfully and playbackTime advanced. We also checked Electron directly and found navigator.requestMediaKeySystemAccess is missing, so our current theory is that stock Electron lacks the protected media / EME support Apple Music web playback needs.

Important: we are not trying to bypass DRM or extract audio. We just want a legitimate way for a user-authorized macOS app to control Apple Music playback or observe playback state.

What we’re considering next:
Use the native macOS Music app as the playback engine and control it from our app.
Test AppleScript / Automation permissions for play, pause, next, current track, player state, etc.
Later, possibly build a native Swift helper using Apple Music / MediaPlayer APIs and communicate with Electron over IPC.
Avoid relying on Electron MusicKit JS playback if this is a known dead end.
Questions:
Has anyone successfully made Apple Music / MusicKit JS playback work inside Electron?
Is the missing EME/protected-media layer the expected blocker here?
Is controlling the native macOS Music app the more realistic path?
Any gotchas with AppleScript, MusicKit native APIs, or Electron + native helper architecture for this use case?

Any pointers from people who have dealt with Electron + Apple Music / protected media would be appreciated.
Replies
0
Boosts
0
Views
37
Activity
2d
Radiometric interpretation of Apple ProRAW and Bayer RAW access via AVFoundation
I am working on a computational photography research project involving multi-exposure HDR reconstruction using Bayer RAW and Apple ProRAW captures. I would like to clarify the radiometric interpretation of Apple ProRAW and the availability of Bayer RAW capture through AVFoundation. My questions are:

1. On current iPhone Pro devices, is it possible for third-party apps to capture and export true Bayer-pattern RAW DNG files through AVFoundation, rather than Apple ProRAW linear DNG files? If so, which availableRawPhotoPixelFormatTypes correspond to Bayer RAW, and what device or format restrictions apply?

2. Apple ProRAW appears to be demosaiced and computationally processed, and may include multi-frame fusion. Is the decoded ProRAW image intended to be radiometrically linear and scene-referred?

3. For a bracketed ProRAW sequence captured with fixed ISO, white balance, lens, and focus, but different exposure times, can one assume that the decoded linear pixel values Y_i(p) satisfy an exposure-proportional model in non-saturated regions, such as Y_i(p) ≈ t_i R(p), across brackets?

This question is about radiometric consistency for algorithmic use, not about visual editing or tone mapping. Thank you for your help.
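On the first question, AVCapturePhotoOutput can at least enumerate its RAW formats and classify each one. A sketch (the classification helpers are real AVCapturePhotoOutput class methods to my knowledge, but which formats actually appear depends on the device, the active format, and session configuration, so treat the output as device-dependent):

```swift
import AVFoundation

// Sketch: list the RAW pixel formats a configured photo output offers and
// classify each as Bayer RAW or Apple ProRAW. Which formats appear depends on
// the device, the active format, and whether isAppleProRAWEnabled is set.
func dumpRawFormats(for output: AVCapturePhotoOutput) {
    for format in output.availableRawPhotoPixelFormatTypes {
        let isBayer = AVCapturePhotoOutput.isBayerRAWPixelFormat(format)
        let isProRAW = AVCapturePhotoOutput.isAppleProRAWPixelFormat(format)
        let kind = isBayer ? "Bayer RAW" : (isProRAW ? "Apple ProRAW" : "other")
        // Render the OSType as a fourCC for readability.
        let n = format.bigEndian
        let cc = withUnsafeBytes(of: n) { String(bytes: $0, encoding: .ascii) ?? "?" }
        print("\(cc) (\(format)) -> \(kind)")
    }
}
```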
Replies
0
Boosts
0
Views
138
Activity
3d
How to Monitor Any USB Audio or Video Device on macOS
USB cameras, microphones, HDMI capture cards, and audio interfaces are supposed to "just work" on macOS. In reality, it's often difficult to quickly access or monitor them without opening large and complicated software. Sometimes you simply want to see whether a USB camera is active. Sometimes you want to check an HDMI source connected through a capture card. And in other cases, you may want to use a Mac mini without a dedicated monitor by viewing its HDMI output through a USB capture device directly on another Mac.

macOS supports many modern USB AV devices out of the box, but it surprisingly lacks a simple built-in utility for live monitoring and recording. Most users end up using oversized streaming or editing applications just to preview a video signal or monitor audio input. That becomes especially noticeable with:
USB webcams
HDMI capture adapters
USB microphones
audio interfaces
secondary computers
headless Mac mini setups

A lightweight monitor utility is often much more practical when you only need real-time access to a device, want to record a stream, or quickly switch between multiple AV inputs. That's one of the reasons I built AV Monitor Pro, a native macOS app designed for monitoring and recording connected audio/video devices in real time. It can preview USB cameras, capture cards, microphones, and HDMI sources with minimal setup, and it's especially useful for workflows like running a Mac mini without a monitor, monitoring external devices, or recording live AV input directly on macOS.
Replies: 0 · Boosts: 0 · Views: 164 · Activity: 4d
AudioHardwareCreateProcessTap delivers all-zero buffers while system audio is audible
Summary: Using AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice for system audio capture. During long sessions, the AudioDeviceIOProc callback continues firing normally but every PCM sample is exactly 0.0f, while the system is producing audible output.

Environment:
- macOS: 26.5 Beta
- Hardware: MacBook Air (M2)
- API: AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice
- Tap: CATapDescription, processes = [], .unmuted, private
- Format: 48,000 Hz, Float32, interleaved stereo
- Aggregate anchor: kAudioAggregateDeviceMainSubDeviceKey = current default output UID

Observed behavior:
After running normally for several minutes, the stream transitions into an all-zero state:
- AudioDeviceIOProc continues to fire at the expected cadence
- Frame count, timestamps (mHostTime, mSampleTime), and mDataByteSize all look normal
- AudioBufferList pointers are valid
- Every sample in every buffer is exactly 0.0f
- Other apps are still producing audible output through the same output device
- The condition may self-recover or persist until the session is stopped

Confirmed via RMS logging both inside the IOProc and after the ring-buffer consumer: data is zero on delivery, not introduced downstream.

Example: 51-minute session on MacBook Air M2
- Segment 1 (~7 min): three all-zero periods (60 s, 53 s, 141 s); real PCM briefly returned between them.
- Segment 2 (~44 min): two all-zero periods (16 min 3 s, 3 min 8 s).
IOProc cadence, timestamp deltas, default output UID, and kAudioDevicePropertyDeviceIsRunningSomewhere all remained normal throughout.

What I have ruled out:
- Actual silence: the user was in an active video call and could hear participants through the output device.
- Default output device change: monitored kAudioHardwarePropertyDefaultOutputDevice; no change during affected periods.
- IOProc stall: heartbeat and kAudioDevicePropertyDeviceIsRunningSomewhere remained normal.
- Aggregate device destroyed: AudioObjectGetPropertyData on the aggregate UID continued returning the expected device.
- Tap descriptor misconfiguration: the same tap produces valid PCM earlier in the same session, so this is not a startup-time issue.

Why detection is hard:
All-zero buffers from a broken tap are indistinguishable from legitimate silence (muted participant, waiting room, paused media). kAudioProcessPropertyIsRunningOutput reports whether a process has active output IO, not whether it is contributing non-zero samples; a muted Zoom call still reports true.

Possible correlations:
- Sample-rate renegotiation on the output device (44.1 kHz ↔ 48 kHz) when another app changes output
- Bluetooth device state changes (AirPods sleep/wake) where the UID stays the same
- MacBook Air more frequently affected than MacBook Pro
- Always occurs after extended uptime; the first few minutes are consistently clean

Current workaround:
A full teardown and rebuild restores real PCM. Restarting the IOProc alone or recreating only the aggregate device is not reliable; both the Process Tap and the Aggregate Device must be destroyed and recreated:
1. AudioDeviceStop
2. AudioDeviceDestroyIOProcID
3. AudioHardwareDestroyAggregateDevice
4. AudioHardwareDestroyProcessTap
5. AudioHardwareCreateProcessTap
6. AudioHardwareCreateAggregateDevice
7. Create + start new IOProc
Applying this automatically is risky because the condition cannot be reliably distinguished from legitimate silence.

Questions:
1. Expected failure mode? Can a Process Tap continue delivering zero-filled buffers while the system output is audible? Is this expected under certain device or routing conditions?
2. Detection signal? Is there any HAL property, notification, or diagnostic counter that distinguishes "sources are genuinely silent" from "the tap data path has stopped receiving the real mix"?
3. Targeted recovery? Is there a supported way to re-anchor or reset the tap data path without destroying and recreating both objects?
4. Full rebuild as intended workaround? If so, it would help to confirm this so developers can converge on a consistent approach.
5. Mixer activity signal? kAudioProcessPropertyIsRunningOutput reflects IO registration, not sample contribution. Is there any AudioProcess property that indicates a process is currently delivering non-zero audio to the system mixer?
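For what it's worth, one imperfect detection heuristic is to track runs of bit-exact zero buffers: captured silence from a live mix usually carries some noise floor, so a long run of exact 0.0f samples is a hint (not proof) that the data path has stalled. The type name `ZeroRunWatchdog` and the thresholds below are my own illustration under that assumption, not a HAL facility:

```swift
import Foundation

/// Hypothetical watchdog: counts consecutive IOProc buffers whose samples
/// are all bit-exact 0.0f. A long run of exact zeros is suspicious because
/// legitimate ambient silence from a live mix rarely produces them.
struct ZeroRunWatchdog {
    let buffersPerSecond: Double
    let thresholdSeconds: Double
    private var consecutiveZeroBuffers = 0

    init(buffersPerSecond: Double, thresholdSeconds: Double) {
        self.buffersPerSecond = buffersPerSecond
        self.thresholdSeconds = thresholdSeconds
    }

    /// Feed one buffer of samples; returns true once the zero run exceeds
    /// the threshold and a teardown/rebuild should be considered.
    mutating func observe(_ samples: [Float]) -> Bool {
        if samples.allSatisfy({ $0 == 0 }) {
            consecutiveZeroBuffers += 1
        } else {
            consecutiveZeroBuffers = 0   // real PCM resets the run
        }
        let seconds = Double(consecutiveZeroBuffers) / buffersPerSecond
        return seconds >= thresholdSeconds
    }
}

// Example: 48 kHz output with 512-frame buffers ≈ 93.75 callbacks/s;
// flag only after 30 s of bit-exact zeros to limit false positives.
var watchdog = ZeroRunWatchdog(buffersPerSecond: 93.75, thresholdSeconds: 30)
_ = watchdog.observe([0.0, 0.0])   // run just started, not yet flagged
```

This still cannot distinguish a stalled tap from a genuinely silent mix (the core problem the post describes), which is why a definitive HAL-level signal would be preferable.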
Replies: 0 · Boosts: 0 · Views: 222 · Activity: 5d
MusicKit playback completely broken after Apple Music “What’s New?” update screen until native app is opened
I’m developing a third-party Apple Music streaming app using MusicKit (ApplicationMusicPlayer + catalog requests).

Issue: Whenever Apple releases an Apple Music update that shows the “What’s New?” onboarding modal in the native Apple Music app, MusicKit in our app completely breaks for all users. Attempts to play anything (queue, prepareToPlay, etc.) fail silently or with service-related errors. Playback and most MusicKit operations remain broken until the user opens the native Apple Music app, dismisses the “What’s New?” screen, and returns to our app. After that single native interaction (we deliberately stopped users from going any further within Apple Music to verify this), everything works perfectly again.

Reproduction steps:
1. Apple Music receives an update with a “What’s New?” screen.
2. User launches our third-party app and attempts playback. MusicKit fails.
3. User opens Apple Music, dismisses the modal, and returns to our app.
4. MusicKit works again.

Expected behavior: Third-party MusicKit apps should not become non-functional because the native Apple Music app has a pending onboarding screen. Shared backend services (account readiness, tokens, subscription state, etc.) should initialize independently.

Environment: iOS 26.4.2. Devices verified to be affected: iPhone 13 Pro, iPhone XR, iPhone 15.

Workarounds attempted (none resolve it without the native Apple Music interaction):
- Re-requesting MusicAuthorization
- Recreating ApplicationMusicPlayer
- Stopping and re-queuing
- Backgrounding and foregrounding the app

This appears to be a recurring integration fragility with shared Apple Music services. Has anyone else seen this? Any recommended recovery path or API to force service initialization? Thanks!
Replies: 1 · Boosts: 2 · Views: 212 · Activity: 5d
CarPlay HID transport buttons remap to call-control during continuous mic capture (no opt-out API)
Hello, I am developing Uniq Intercom, a voice-only group communication app for motorcyclists (an always-on intercom over WebRTC, used continuously for multi-hour rides). I am seeking guidance on an iOS audio session and CarPlay HID interaction that I have not been able to resolve through documented APIs.

Problem: As soon as my app activates the microphone (yellow recording indicator visible), iOS appears to classify the app as a real-time communication participant, and CarPlay re-routes the steering-wheel / handlebar HID transport buttons (left / right / ok) from the media-control role to the call-control role (answer/decline). Because I do not register a CallKit / LiveCommunicationKit call (the session is a continuous group voice channel, not a discrete telephony call), there is no call object for those buttons to act upon; they effectively become inert.

Why this matters: Motorcyclists rely on the intercom for 4–6 hour rides. CarPlay is now built into a growing number of modern motorcycles (and, with aftermarket display units, virtually any bike), and any rider who uses any voice communication platform alongside it (Uniq Intercom, WhatsApp calls) currently runs into this same handlebar button remap. With the buttons inert, the rider's only remaining option is to reach for the motorcycle's touchscreen to skip a track or change navigation, which is unsafe. The exact same remap occurs during a real WhatsApp or Phone call, but for those the remap is correct (answer/decline maps to a real call). For continuous voice apps without a CallKit-style discrete call, no equivalent path exists today. As both an iOS developer and a motorcyclist, I would very much like to see this resolved; solving it would meaningfully improve safety for every rider using an iPhone with CarPlay.
Configurations I have tested (all reproduce the symptom on iOS 18+ / 26 with wireless CarPlay):
- AVAudioSession.Category.playAndRecord + .voiceChat mode + various option combinations (duckOthers, mixWithOthers, allowBluetoothHFP, allowBluetoothA2DP, defaultToSpeaker)
- The same category with .videoChat mode (which @livekit/react-native defaults to)
- The same category with .default mode (re-applied after setAudioModeAsync to defeat any framework override); the audiomxd log confirmed Mode = Default for a ~2 s window before WebRTC's setActive cycle returned the mode to .voiceChat, and the buttons remained remapped during the .default window
- Disabling MPRemoteCommandCenter and clearing MPNowPlayingInfoCenter.default().nowPlayingInfo
- A JS-side override of WebRTC's global RTCAudioSessionConfiguration via @livekit/react-native's AudioSession.setAppleAudioConfiguration({audioMode: 'default'}) bridge, applied both before connect and after setAudioModeAsync to defeat library overrides

In every case the audiomxd system log confirms our session goes active (Mode = VoiceChat or Default, Recording = YES), and the CarPlay HID buttons are immediately remapped to call-control. The middle "OK" button remains functional because it is not part of the call-control mapping, confirming the buttons are not blocked, only re-purposed. The remap occurs the instant the iOS recording indicator appears, regardless of audio session mode. This led me to conclude the trigger is not the audio session mode but the combination of microphone permission + active session + (likely) the AUVoiceIO unit instantiated by WebRTC. I cannot find a public API path to suppress this classification while maintaining the always-on continuous voice channel.

My questions:
1. Is there an entitlement or API that allows an app with active microphone capture to declare itself a non-call media participant, keeping the CarPlay HID transport buttons in the media role?
2. Is AVAudioSession.setPrefersEchoCancelledInput(_:) (iOS 18+) the intended path for retaining platform AEC under .default mode without the focus-engine "communication priority" marking? Documentation is sparse on its CarPlay arbitration implications.
3. Does the PushToTalk framework affect HID arbitration differently from playAndRecord + voiceChat? Our continuous-channel UX does not fit the PTT transmit-on-press model, but understanding the contrast would help.
4. If no current API exists, is this something the iOS audio team would consider for a future SDK? Solving this would meaningfully improve safety for motorcycle and adventure-sport users on iOS.

Thank you for your time and any guidance you can offer.
— Emre Erkaya / Uniq Intercom
Replies: 1 · Boosts: 0 · Views: 123 · Activity: 5d
Apple Vision Pro streaming spatial video transmission
I want to develop an app for real-time spatial video streaming from one Apple Vision Pro to another, with playback on the receiving device (for example using MV-HEVC). Is this possible? If so, how can it be done?
Replies: 1 · Boosts: 0 · Views: 210 · Activity: 5d
I have the same issue on iOS 26.3.0. See open feedback FB22712056.
Replies: 1 · Boosts: 0 · Views: 139 · Activity: 6d
Issues with monitoring and changing WebRTC audio output device in WKWebView
I am developing a VoIP app that uses WebRTC inside a WKWebView.

Question 1: How can I monitor which audio output device WebRTC is currently using? I want to display this information in the UI for the user.

Question 2: How can I change the current audio output device for WebRTC? I am using a JS bridge to Objective-C code, attempting to change the audio device with the following (the original snippet left `session` undeclared; declaring it here):

void set_speaker(int n) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *err = nil;
    if (n == 1) {
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&err];
    } else {
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&err];
    }
}

However, this approach does not work. I am testing on an iPhone with iOS 16.7. Is a higher iOS version required?
Replies: 2 · Boosts: 0 · Views: 355 · Activity: 6d
AVMutableComposition audio silently drops on iOS 26 when streaming over HTTP/2 (FB22696516)
We've discovered a regression in iOS 26 where AVMutableComposition silently drops audio when the source asset is streamed over HTTP/2. The same file served over HTTP/1.1 plays audio correctly through the same composition code. Direct AVPlayer playback (without composition) works fine on HTTP/2. This did not occur on iOS 18.x. It happens on physical devices only; it does not reproduce in the Simulator or on macOS.

Tested conditions (same MP4 file, different CDNs):
- CloudFront (HTTP/2) + Composition → ❌ Audio silent
- Cloudflare (HTTP/2) + Composition → ❌ Audio silent
- Akamai (HTTP/1.1) + Composition → ✅ Audio works
- Apple TS (HTTP/1.1) + Composition → ✅ Audio works
- Downloaded locally, then composed → ✅ Audio works
- Direct playback, no composition (HTTP/2) → ✅ Audio works

The CloudFront and Akamai URLs serve the identical file (same S3 object, different CDN edge). The CDN vendor doesn't matter; any HTTP/2 source triggers it.

Minimal reproduction:

let asset = AVURLAsset(url: http2URL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first!
let audioTrack = try await asset.loadTracks(withMediaType: .audio).first!
let duration = try await asset.load(.duration)

let composition = AVMutableComposition()
let fullRange = CMTimeRange(start: .zero, end: duration)

let compVideo = composition.addMutableTrack(withMediaType: .video,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)!
try compVideo.insertTimeRange(fullRange, of: videoTrack, at: .zero)

let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)!
try compAudio.insertTimeRange(fullRange, of: audioTrack, at: .zero)

let item = AVPlayerItem(asset: composition.copy() as! AVComposition)
player.replaceCurrentItem(with: item)
player.play() // Video plays, audio goes silent after a while

Playing the same asset directly works fine:

player.replaceCurrentItem(with: AVPlayerItem(asset: asset))
player.play() // Both video and audio work

Filed as FB22696516. Sample project: https://github.com/karlingen/AVCompositionBug
Replies: 1 · Boosts: 9 · Views: 212 · Activity: 1w
AVAudioEngineConfigurationChangeNotification received while engine is running
The documentation for AVAudioEngineConfigurationChangeNotification states:

"When the audio engine’s I/O unit observes a change to the audio input or output hardware’s channel count or sample rate, the audio engine stops, uninitializes itself, and issues this notification."

A user of my framework has reported a crash during notification processing on iOS 26.4 when the main mixer node is disconnected from the output node in order to reestablish the connection with a different format. The failing precondition is:

com.apple.coreaudio.avfaudio: required condition is false: !IsRunning()

The report was observed on an iPhone 16 / iOS 26.4.2, ARM64, TestFlight build. The backtrace contains:

[Last Exception Backtrace]
3 AVFAudio AVAudioEngineGraph::_DisconnectInput AVAudioEngineGraph.mm:2728
4 AVFAudio -[AVAudioEngine disconnectNodeInput:bus:] AVAudioEngine.mm:155
5 SFB sfb::AudioPlayer::handleAudioEngineConfigurationChange AudioPlayer.mm:2247

[Thread 18 Crashed]
9 SFB sfb::AudioPlayer::handleAudioEngineConfigurationChange AudioPlayer.mm:2212
…
14 AVFAudio IOUnitConfigurationChanged

Has the behavior of AVAudioEngineConfigurationChangeNotification changed in iOS 26.4? It's simple enough to call [engine_ stop] in the notification handler, but the documentation states this shouldn't be necessary. I've not observed a similar crash on previous iOS versions.
Replies: 0 · Boosts: 1 · Views: 143 · Activity: 1w