Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics

Posts under Media Technologies topic

SCStreamDelegate stream:didStopWithError receiving null pointer for error
I have an SCStreamDelegate for capturing frames from applications. On recent point releases of macOS Sonoma, I've noticed that the stream is being cancelled without any user action. I started trying to debug it, and when my error callback is called, the error parameter being passed is null:

```swift
func stream(_ stream: SCStream, didStopWithError error: Error) {
    /* The debugger shows this, and segfaults if I try to print "\(error)":
       error (Error) > error = (Builtin.RawPointer) 0x0 */
```

From what I can tell, error should be a valid NSError so I can check the error code, based on similar code I've seen in, for example, OBS (https://github.com/obsproject/obs-studio/blob/265239d4174f8d291b0de437088c5b78f8e27687/plugins/mac-capture/mac-sck-common.m#L29). Usually when this happens, the menu bar icon for screen sharing (where I would click to change the shared window, etc.) stays there even after my app has closed and no apps are doing any capture. Has anyone come across this before? Am I misinterpreting what the debugger is saying about the error parameter? I'm running macOS 14.7.3, but I just updated from 14.7.2 earlier and had basically the same issue on both macOS versions.
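For comparison, here is a minimal Swift sketch (an editor's addition, not from the post) of the NSError domain/code check that the linked OBS code performs in Objective-C. It does not address the null pointer itself, since a genuinely null Error is invalid to touch at all:

```swift
import ScreenCaptureKit

final class CaptureDelegate: NSObject, SCStreamDelegate {
    func stream(_ stream: SCStream, didStopWithError error: Error) {
        // Inspect the bridged NSError's domain and code, as the OBS
        // handler does, to distinguish user-initiated stops from failures.
        let nsError = error as NSError
        print("stream stopped: domain=\(nsError.domain), code=\(nsError.code)")
    }
}
```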
3 replies · 0 boosts · 569 views · Mar ’25
Why does CADisplayLink of an external UIScreen drift in time?
I am using Apple's original Lightning Digital AV adapter (Lightning-to-HDMI dongle) to connect my iPhone to an external display via an HDMI cable. I need to synchronize rendering with the external display's refresh rate, so I create a new CADisplayLink tied to the external display's UIScreen: UIScreen.screens[externalDisplayIdx].displayLink(withTarget:selector:). The callback is called regularly, but with an increasing delay relative to CADisplayLink.timestamp, so each time the callback fires I have less and less time to draw the next frame (see the snippet below). Assuming 60 FPS, the value of secondsTillDeadline starts at an arbitrary value in the range of approximately -0.0001 to 0.0166667, then slowly decreases toward zero (and for a brief period goes into small negative numbers). Once it reaches zero, it flips back to 0.0166667 and continues to decrease again. This cycle repeats indefinitely. Changing the external display's resolution (UIScreen's mode) or the CADisplayLink's preferredFrameRateRange to a lower FPS does not seem to have any effect on the temporal drift (even the rate of change seems to be the same). When I create a new CADisplayLink for the iPhone's main screen, the value of secondsTillDeadline is stable; it does not drift and stays very close to 0.0166667, as expected. Is this drift caused by the external monitor or by Apple's Lightning-to-HDMI dongle, or is the problem somewhere else? Can the drifting be stopped?

```swift
func onDisplayLinkUpdate(displayLink: CADisplayLink) {
    // Gradually decreases from 0.01667 to -0.0001, then flips back to 0.01667 and continues to decrease
    let secondsTillDeadline = displayLink.targetTimestamp - CACurrentMediaTime()
}
```
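A small diagnostic sketch (an editor's addition, not from the post) that logs per-callback drift against the link's nominal duration can help confirm whether the external link's timestamps advance at a slightly different rate than CACurrentMediaTime():

```swift
import QuartzCore

final class DriftLogger: NSObject {
    private var lastTimestamp: CFTimeInterval = 0

    @objc func onDisplayLinkUpdate(displayLink: CADisplayLink) {
        if lastTimestamp != 0 {
            // Compare the measured callback interval with the link's nominal
            // frame duration; a consistent nonzero drift points at mismatched
            // clocks rather than scheduling jitter.
            let interval = displayLink.timestamp - lastTimestamp
            let drift = interval - displayLink.duration
            print(String(format: "interval %.6f s, drift %+.6f s", interval, drift))
        }
        lastTimestamp = displayLink.timestamp
    }
}
```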
3 replies · 0 boosts · 337 views · Aug ’25
Configuring CaptureVideoDelegate to avoid gamma/transfer function
I'm working on an application that uses the iPhone camera for scientific purposes and, as a result, would like to receive video in as unprocessed a format as possible. In particular, I'm interested in getting pixel buffers that contain essentially the Bayer data as the sensor sees it, with minimal color processing. Currently we configure the AVCaptureDevice to fix the focus and exposure, use a low ISO with no gain, and set the white balance gains to 1. AVCaptureVideoDataOutput is using 32BGRA. What I'd like to do is remove any additional color and brightness processing such that the data is effectively processed with a linear transfer function (i.e., a gamma of 1). I thought this might come down to the AVCaptureDevice activeColorSpace; we currently use P3_D65 for this. But there only seem to be a few choices (e.g., sRGB, HLG_BT2020), all of which I think affect the gamma. So:

- Is it possible to control or specify the gamma / transfer function when using CaptureVideoDelegate?
- If not, does one of the color space settings have a defined gamma function so that I can effectively reverse it from the pixel data without losing too much information?
- Or is there a better way to capture video-ish speed images (15-30 fps) from the camera sensor that skips processing like this?

Many thanks for any suggestions.
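As a starting point for investigation (an editor's sketch, not from the post), AVCaptureVideoDataOutput can report which uncompressed pixel formats it supports; the YCbCr biplanar formats skip the RGB conversion stage that 32BGRA implies, though none of them expose raw Bayer data:

```swift
import AVFoundation

// List the pixel formats the video data output can deliver, printed as
// FourCC codes (e.g. "420v", "420f", "BGRA").
let output = AVCaptureVideoDataOutput()
for format in output.availableVideoPixelFormatTypes {
    let bytes = withUnsafeBytes(of: format.bigEndian) { Data($0) }
    let fourCC = String(data: bytes, encoding: .ascii) ?? String(format)
    print(fourCC)
}
```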
3 replies · 0 boosts · 133 views · Mar ’25
Audio Unit v3 host: listing v2 third-party plugins
Hi, I have just implemented an Audio Unit v3 host.

```objc
AgsAudioUnitPlugin *audio_unit_plugin;

AVAudioUnitComponentManager *audio_unit_component_manager;
NSArray<AVAudioUnitComponent *> *av_component_arr;

AudioComponentDescription description;

guint i, i_stop;

if(!AGS_AUDIO_UNIT_MANAGER(audio_unit_manager)){
  return;
}

audio_unit_component_manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];

/* effects */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_Effect;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager,
                                        (gpointer) av_component_arr[i]);
}

/* instruments */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_MusicDevice;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager,
                                        (gpointer) av_component_arr[i]);
}
```

But this doesn't show me Audio Unit v2 plugins. Why?

regards, Joël
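For what it's worth, here is a Swift sketch (an editor's addition; the flag check is an assumption about how v2 and v3 registrations differ) that lists matching effect components and whether each one registered as a v3 unit. AVAudioUnitComponentManager is generally expected to return v2 components as well, so this may show whether they are being filtered out or are simply absent:

```swift
import AVFAudio
import AudioToolbox

// List effect components and whether each registered as a v3 audio unit;
// v2 units should appear here without the isV3AudioUnit flag set.
var description = AudioComponentDescription()
description.componentType = kAudioUnitType_Effect

for component in AVAudioUnitComponentManager.shared().components(matching: description) {
    let flags = AudioComponentFlags(rawValue: component.audioComponentDescription.componentFlags)
    print("\(component.manufacturerName) \(component.name): v3 = \(flags.contains(.isV3AudioUnit))")
}
```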
3 replies · 0 boosts · 507 views · Aug ’25
AVAudioUnit host - PCM buffer output silent
Hi, I just started to develop audio unit hosting support in my application. Offline rendering seems to work, except that I hear no output. Why? I suspect something goes wrong with the player. I connect to CoreAudio in a different location in the code. Here are some error messages I have faced so far:

```
2025-08-14 19:42:04.132930+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:42:04.151171+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:43:08.344530+0200 com.gsequencer.GSequencer[34358:18614927] AUAudioUnit.mm:1417 Cannot set maximumFramesToRender while render resources allocated.
2025-08-14 19:43:08.346583+0200 com.gsequencer.GSequencer[34358:18614927] [avae] AVAEInternal.h:104 [AVAudioSequencer.mm:121:-[AVAudioSequencer(AVAudioSequencer_Player) startAndReturnError:]: (impl->Start()): error -10852

** (<unknown>:34358): WARNING **: 19:43:08.346: error during audio sequencer start - -10852
```

I have implemented an AVAudioEngine based AudioUnit host. Here I instantiate player and effect:

```objc
/* audio engine */
audio_engine = [[AVAudioEngine alloc] init];

fx_audio_unit_audio->audio_engine = (gpointer) audio_engine;

av_format = (AVAudioFormat *) fx_audio_unit_audio->av_format;

/* av audio player node */
av_audio_player_node = [[AVAudioPlayerNode alloc] init];

/* av audio unit */
av_audio_unit_effect = [[AVAudioUnitEffect alloc] initWithAudioComponentDescription:[((AVAudioUnitComponent *) AGS_AUDIO_UNIT_PLUGIN(base_plugin)->component) audioComponentDescription]];

av_audio_unit = (AVAudioUnit *) av_audio_unit_effect;

fx_audio_unit_audio->av_audio_unit = av_audio_unit;

/* audio sequencer */
av_audio_sequencer = [[AVAudioSequencer alloc] initWithAudioEngine:audio_engine];

fx_audio_unit_audio->av_audio_sequencer = (gpointer) av_audio_sequencer;

/* output node */
[[AVAudioOutputNode alloc] init];

/* audio player and audio unit */
[audio_engine attachNode:av_audio_player_node];
[audio_engine attachNode:av_audio_unit];

[audio_engine connect:av_audio_player_node to:av_audio_unit format:av_format];
[audio_engine connect:av_audio_unit to:[audio_engine outputNode] format:av_format];

ns_error = NULL;

[audio_engine enableManualRenderingMode:AVAudioEngineManualRenderingModeOffline
                                 format:av_format
                      maximumFrameCount:buffer_size
                                  error:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("enable manual rendering mode error - %d", [ns_error code]);
}

ns_error = NULL;

[[av_audio_unit AUAudioUnit] allocateRenderResourcesAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("Audio Unit allocate render resources returned error - ErrorCode %d", [ns_error code]);
}
```

Then I render in a dedicated thread.
```objc
ns_error = NULL;

[audio_engine startAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("error during audio engine start - %d", [ns_error code]);
}

[av_audio_sequencer prepareToPlay];

ns_error = NULL;

[av_audio_sequencer startAndReturnError:&ns_error];

if(ns_error != NULL && [ns_error code] != noErr){
  g_warning("error during audio sequencer start - %d", [ns_error code]);
}

[av_audio_player_node play];

while(is_running){
  /* pre sync */

  /* IO buffers */
  av_output_buffer = (AVAudioPCMBuffer *) scope_data->av_output_buffer;
  av_input_buffer = (AVAudioPCMBuffer *) scope_data->av_input_buffer;

  /* fill input buffer */

  /* schedule av input buffer */
  frame_position = 0; // (gint64) ((note_offset * absolute_delay) + delay_counter) * buffer_size;

  av_audio_player_node = (AVAudioPlayerNode *) fx_audio_unit_audio->av_audio_player_node;

  AVAudioTime *av_audio_time = [[AVAudioTime alloc] initWithHostTime:frame_position
                                                          sampleTime:frame_position
                                                              atRate:((double) samplerate)];

  [av_audio_player_node scheduleBuffer:av_input_buffer
                                atTime:av_audio_time
                               options:0
                     completionHandler:nil];

  /* render */
  ns_error = NULL;

  status = [audio_engine renderOffline:AGS_FX_AUDIO_UNIT_AUDIO_FIXED_BUFFER_SIZE
                              toBuffer:av_output_buffer
                                 error:&ns_error];

  if(ns_error != NULL && [ns_error code] != noErr){
    g_warning("render offline error - %d", [ns_error code]);
  }
}
```

regards, Joël
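For comparison, here is a minimal self-contained offline-rendering skeleton in Swift (an editor's sketch; the node graph and buffer sizes are assumptions, not the poster's setup). In manual rendering mode the engine renders into the buffer passed to renderOffline, and the player's sources must be scheduled before frames are pulled:

```swift
import AVFAudio

// Minimal manual offline rendering skeleton: player -> outputNode.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

engine.attach(player)
engine.connect(player, to: engine.outputNode, format: format)

try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 1024)
try engine.start()
player.play()
// ... schedule AVAudioPCMBuffers on `player` here, before pulling frames ...

let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!
while engine.manualRenderingSampleTime < 44_100 {
    // Each call renders up to maximumFrameCount frames into `buffer`.
    let status = try engine.renderOffline(buffer.frameCapacity, to: buffer)
    guard status == .success else { break }
    // Write `buffer` to a file or hand it to the rest of the pipeline.
}
```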
3 replies · 0 boosts · 465 views · Aug ’25
Looking for solutions to build a video chat app (Omegle/Chatroulette style) on Vision Pro
Hi everyone, We are working on a prototype app for Apple Vision Pro that is similar in functionality to Omegle or Chatroulette, but exclusively for Vision Pro owners. The core idea is:

- a matching system where one user connects to another through a virtual persona;
- real-time video and audio transmission;
- time limits for sessions, with the ability to extend them;
- users can skip a match and move on to the next one.

We have explored WebRTC and Twilio, but unfortunately they don't fit our use case. Question: what alternative services or SDKs are available for implementing real-time video/audio communication on Vision Pro that would work with this scenario? Has anyone encountered a similar challenge and can recommend which technologies or tools to use? Thanks in advance!
2 replies · 0 boosts · 269 views · Aug ’25
AVPlayerViewController `customInfoViewControllers` crash/workaround on tvOS 26
One thing I've noticed on tvOS 26 is that if you try to set the AVPlayerViewController customInfoViewControllers property while the Content Tabs are on screen, your app will crash.

```
*** Terminating app due to uncaught exception 'UIViewControllerHierarchyInconsistency', reason: 'trying to add child view controller that is already presented: <AVInfoPanelViewController: 0x1030cdc00>'
*** First throw call stack:
(0x18a7167bc 0x189a77510 0x18a7166a8 0x1ab425658 0x1b2ee9d54 0x1b2efcd60 0x1b2eaf3f0 0x1080f744c 0x107e021a8 0x107e01b3c 0x18de41c14 0x18de41ba8 0x18de48d28 0x18ad9e358 0x101fac5f0 0x101fc6228 0x101fe7278 0x101fbc6fc 0x101fbc63c 0x18a67a2e0 0x18a679418 0x18a673b34 0x1937e4d5c 0x1abb36588 0x1abb3ae80 0x1aae9dec4 0x108610174 0x1086100e4 0x108615140 0x189abd4d0)
```

I've logged a feedback (FB19554461), but it's getting awfully late in the dev cycle, so I've been trying to think of a workaround. The problem is that customInfoViewControllers is pretty declarative in nature. There are no properties or delegate methods I am aware of that let me know when they are displaying or not. One trick I came up with was checking whether my custom info view controller's view was "visible" or not. I put that in quotes because it turns out it can be visible even when I think it's not: when the transport bar is scrolled to the top, my custom VC still has its top pixels showing, so it gets a viewDidAppear call. So instead I check whether my view controller's view is completely visible, based on the result of the CGRect contains method. And that works! But the problem is it only accounts for my own custom info view controllers, and not the standard one that Apple provides. I can't think of a way at all to know whether that is showing. Any ideas?
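A sketch of the visibility heuristic described above (an editor's reconstruction with hypothetical names, not the poster's code), which treats an update as safe only when the custom info view sits entirely outside the player's bounds:

```swift
import AVKit
import UIKit

// Hypothetical helper: returns true when it looks safe to reassign
// customInfoViewControllers, i.e. our custom info view is not on screen.
func canSafelyUpdateInfoViewControllers(player: AVPlayerViewController,
                                        customInfoVC: UIViewController) -> Bool {
    guard let playerView = player.viewIfLoaded,
          let infoView = customInfoVC.viewIfLoaded,
          infoView.window != nil else {
        return true // the info view was never loaded or is not in a window
    }
    // Convert the info view's bounds into the player's coordinate space
    // and require that no part of it intersects the visible area.
    let frameInPlayer = infoView.convert(infoView.bounds, to: playerView)
    return !playerView.bounds.intersects(frameInPlayer)
}
```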
1 reply · 0 boosts · 163 views · Aug ’25
Play Audio for a Metronome
Hi, I am looking for a good way to play sounds at a high frequency. At the moment I am using AVAudioEngine: I create a couple of AVAudioPlayerNodes, and for each sound I need to play I create an AVAudioPCMBuffer. When the app needs to play a sound, I get the correct AVAudioPCMBuffer for the sound, take the first available AVAudioPlayerNode, and feed the buffer to it. The timing for a metronome app has to be very precise, because if it's off by about 16 ms the user can hear that it is not playing at the right interval. For low speeds this works without any problems, but at high speeds it gets worse. Maybe anyone has an idea how I can improve my method. It's a plugin for Flutter.

```swift
import AVFoundation

class FastSoundPlayer {
    private var audioPlayers: [SoundPlayer?] = []
    private var sounds: [String: Sound] = [:]
    private var engine = AVAudioEngine()
    let session = AVAudioSession.sharedInstance()

    init() {
        do {
            try session.setCategory(AVAudioSession.Category.playback, mode: AVAudioSession.Mode.default, options: [AVAudioSession.CategoryOptions.mixWithOthers])
            try session.setActive(true)
            createSoundPlayers(count: 20)
            try engine.start()
        } catch {
            print("Error starting audio engine: \(error.localizedDescription)")
        }
    }

    // Selector method to handle applicationDidBecomeActiveNotification
    func applicationDidBecomeActive() {
        // Reinitialize AVAudioEngine and reattach all nodes
        do {
            engine.reset()
            objc_sync_enter(audioPlayers)
            audioPlayers.removeAll()
            createSoundPlayers(count: 20)
            objc_sync_exit(audioPlayers)
            try engine.start()
        } catch {
            print("Error starting audio engine: \(error.localizedDescription)")
        }
    }

    func createSoundPlayers(count: Int) {
        for _ in 0..<count {
            let player = SoundPlayer()
            engine.attach(player.player)
            engine.connect(player.player, to: engine.mainMixerNode, format: nil)
            audioPlayers.append(player)
        }
    }

    func load(sound: Data, name: String) {
        let sound = Sound(soundData: sound)
        sounds[name] = sound
    }

    func play(name: String) {
        if !engine.isRunning {
            applicationDidBecomeActive()
        }
        guard let sound = sounds[name] else {
            print("Sound not found")
            return
        }
        if let player = getAvailablePlayer() {
            player.play(sound: sound)
        }
    }

    func getAvailablePlayer() -> SoundPlayer? {
        for player in audioPlayers {
            if !player!.isPlaying {
                return player
            }
        }
        return nil
    }
}

class SoundPlayer {
    let player = AVAudioPlayerNode()
    var isPlaying = false

    init() {
        player.volume = 1.0
    }

    func play(sound: Sound) {
        player.scheduleBuffer(sound.sound!, at: nil, options: .interrupts, completionCallbackType: .dataPlayedBack) { _ in
            self.complete()
        }
        if (player.engine != nil && player.engine!.isRunning) {
            player.play()
            isPlaying = true
        }
    }

    func complete() {
        isPlaying = false
    }
}

class Sound {
    var sound: AVAudioPCMBuffer?
```
```swift
    init(soundData: Data) {
        do {
            let temporaryURL = FileManager.default.temporaryDirectory.appendingPathComponent("tempSound.wav")
            try soundData.write(to: temporaryURL)

            // Create AVAudioFile from the temporary file URL
            let audioFile = try AVAudioFile(forReading: temporaryURL)

            // Define the format for the PCM buffer (44100Hz, stereo)
            let format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44100, channels: 2, interleaved: false)

            // Create AVAudioPCMBuffer
            guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format!, frameCapacity: AVAudioFrameCount(audioFile.length)) else {
                // Failed to create PCM buffer
                self.sound = nil
                return
            }

            // Read audio file into PCM buffer
            try audioFile.read(into: pcmBuffer)

            // Assign the created AVAudioPCMBuffer to the sound property
            self.sound = pcmBuffer
        } catch {
            print("Error loading sound file: \(error.localizedDescription)")
            self.sound = nil
        }
    }
}
```

Thanks!
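One common approach that may help here (an editor's sketch with hypothetical parameters, not part of the original post): schedule every click at a sample-accurate AVAudioTime computed from the tempo, rather than passing at: nil, so timing does not depend on when completion handlers fire:

```swift
import AVFAudio

// Schedule `beats` clicks at exact sample positions on the player's
// timeline; assumes `player` is attached, connected, and playing, and
// `click` holds one click sound at the same sample rate.
func scheduleClicks(player: AVAudioPlayerNode,
                    click: AVAudioPCMBuffer,
                    bpm: Double,
                    beats: Int) {
    let sampleRate = click.format.sampleRate
    let samplesPerBeat = AVAudioFramePosition((60.0 / bpm) * sampleRate)
    for beat in 0..<beats {
        let when = AVAudioTime(sampleTime: AVAudioFramePosition(beat) * samplesPerBeat,
                               atRate: sampleRate)
        player.scheduleBuffer(click, at: when, options: [], completionHandler: nil)
    }
}
```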
1 reply · 0 boosts · 183 views · Mar ’25
Detecting if a phone call is being recorded by another app on iOS
Hello, I'm new here. I'm developing an iOS app and I'd like to know whether it is possible to detect if a phone call is being recorded by another app running in the background. I've already reviewed the documentation for CallKit and AVAudioSession, but I couldn't find anything related. My expectation was that iOS might provide some callback or API to indicate if a call is being recorded (by third-party apps), but so far I haven't found a way. My questions are:

- Does iOS expose any API to detect if a call is being recorded?
- If not, is there any indirect method that complies with Apple's policies (e.g., microphone usage events) that can be relied upon?
- Or is this something that iOS explicitly prevents for privacy reasons?

I'm looking for solutions that align with Apple's policies and would be accepted under the App Store Review Guidelines. Thanks in advance for any guidance.
1 reply · 0 boosts · 255 views · Aug ’25
AVAudioSessionCategoryOptionAllowBluetooth incorrectly marked as deprecated in iOS 8 in iOS 26 beta 5
AVAudioSessionCategoryOptionAllowBluetooth is marked as deprecated in iOS 8 by iOS 26 beta 5, even though this option was not deprecated in iOS 18.6. I think this is a mistake and the deprecation actually happens in iOS 26. Am I right? It seems that the substitute for this option is AVAudioSessionCategoryOptionAllowBluetoothHFP. The documentation does not make clear whether the behaviour is exactly the same or whether any difference should be expected. Has anyone used this option in iOS 26? Should I expect any difference from the current behaviour of AVAudioSessionCategoryOptionAllowBluetooth? Thank you.
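A sketch of how one might adopt the renamed option (an editor's assumption: the Swift projection of the new constant is .allowBluetoothHFP, and it appears to denote the same HFP routing behaviour formerly spelled .allowBluetooth):

```swift
import AVFAudio

// Editor's sketch; assumes building with the iOS 26 SDK, where the
// HFP spelling of the category option is visible.
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default,
                            options: [.allowBluetoothHFP]) // formerly .allowBluetooth
} catch {
    print("setCategory failed: \(error)")
}
```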
2 replies · 0 boosts · 288 views · Aug ’25
Is AVPlayerViewController not supported on macCatalyst?
When building an application that can be built on iOS using macCatalyst, a link error like the one below occurs.

```
Undefined symbol: _OBJC_CLASS_$_AVPlayerViewController
```

The [AVPlayerViewController](https://developer.apple.com/documentation/avkit/avplayerviewcontroller?language=objc) documentation seems to say macCatalyst is supported, but what is the reality? Each version of the environment is as follows:

- Xcode 16.2
- macOS deployment target: macOS 10.15
- iOS deployment target: iOS 13.0

Thank you for your support.
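As a stopgap while the linker issue is investigated, the reference can be gated behind a Catalyst check (an editor's sketch; whether AVPlayerViewController should link on Catalyst is exactly the open question here):

```swift
import AVKit
import UIKit

func makePlayerViewController() -> UIViewController? {
    #if targetEnvironment(macCatalyst)
    // Avoid referencing the class on Catalyst until linking is resolved;
    // fall back to a custom player UI here.
    return nil
    #else
    return AVPlayerViewController()
    #endif
}
```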
4 replies · 0 boosts · 433 views · Mar ’25
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths. Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures. Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads, or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files?
2 replies · 0 boosts · 207 views · Mar ’25
[MusicKit] Check for availability of songs
Songs can be unavailable (greyed out) in Apple Music. How can I check if a song is unavailable via the MusicKit framework? Obviously the playback will fail with MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play", but how can I know that in advance? I need to check the availability of hundreds of albums, so initiating a playback for each of them is not an option. Things I have tried:

- Checking if the release date property is set to a future date. This filters out all future releases but doesn't solve the problem for already released songs.
- Checking if the duration is 0. This does not work, since the duration of unavailable songs does not have to be 0.
- Initiating a playback and checking for the "Failed to prepare to play" error. This is not suitable for a huge number of albums.

I couldn't find a solution yet, but somehow other third-party apps are able to ignore/not show these albums. I believe the Apple Music app only displays albums where at least one song is available. I am using this function to fetch all albums of an artist:

```swift
private func fetchAlbumsFor(_ artist: Artist) async throws -> [Album] {
    let artistWithAlbums = try await artist.with(.albums)
    var allAlbums = [Album]()
    guard var currentBadge = artistWithAlbums.albums else {
        return []
    }
    allAlbums.append(contentsOf: currentBadge)
    while currentBadge.hasNextBatch {
        if let nextBatch = try await currentBadge.nextBatch() {
            currentBadge = nextBatch
            allAlbums.append(contentsOf: nextBatch)
        } else {
            break
        }
    }
    return allAlbums
}
```

Here is an example album where I am unable to detect its unavailability (at least in Germany): https://music.apple.com/de/album/die-haferhorde-immer-den-n%C3%BCstern-nach-h%C3%B6rspiel-zu-band-3/1755774804. Furthermore, I was unable to navigate to this album via the Apple Music app directly. Thanks for any help!

Edit: Apparently this album is not included in an Apple Music subscription but can be bought separately. The question remains: how can I check that?
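One heuristic worth trying (an editor's sketch, not a confirmed answer): tracks that cannot be played with the active subscription generally come back with playParameters == nil, so an album whose tracks all lack playParameters may correspond to the greyed-out case:

```swift
import MusicKit

// Returns true when at least one track reports play parameters, which
// this sketch treats as "playable with the current subscription".
func albumLooksPlayable(_ album: Album) async throws -> Bool {
    let detailed = try await album.with(.tracks)
    guard let tracks = detailed.tracks else { return false }
    return tracks.contains { $0.playParameters != nil }
}
```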
1 reply · 0 boosts · 596 views · Feb ’25
MusicKit - Authorization failed: AUTHORIZATION_ERROR: Unauthorized
Hello, I am having difficulties configuring MusicKit correctly for the web app I am building, and I'm seeking assistance with the issues I am having. I would greatly appreciate any help! After allowing access to the following, "Access Request: media.mydomain.com would like to access Apple Music, media library, and listening activity for myemail@icloud.com.", I get a popup error that states, "Authorization failed. Please try again.". The following information is given in the developer console:

```
[Error] Failed to load resource: the server responded with a status of 403 () (webPlayerLogout, line 0)
[Error] Authorization failed: AUTHORIZATION_ERROR: Unauthorized (anonymous function) (media.mydomain.com:398)
```
8 replies · 0 boosts · 971 views · Mar ’25
What changes were made to the VideoToolbox HEVC encoder in iOS 26?
Because I want to control the grid size and the number of grid tiles in my HEIC images myself, I decided to perform HEVC encoding manually and then generate the HEIC image. Previously I used VTCompressionSession to accomplish this task, and the results were satisfactory. It worked perfectly on iOS 16 through iOS 18; in other words, it was able to generate correct HEVC encoding, and its CMFormatDescription should also have been correct, since I relied on it to generate the decoderConfig; otherwise the final image would have had decoding issues. However, it can no longer generate a valid HEIC image on a physical device running iOS 26. Interestingly, it still works fine on the iOS 26 simulator; it only fails on real hardware. The abnormal result is that the image becomes completely black, although the image dimensions are still correct. After troubleshooting, I suspect that the encoding behavior of VTCompressionSession has been modified on iOS 26, which causes the final hvc1 encoding I pass in to be incorrect. I created a VTCompressionSession using the following configuration:

```swift
var newSession: VTCompressionSession!
var status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: Int32(frameSize.width),
    height: Int32(frameSize.height),
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &newSession
)
try check(status, VideoToolboxErrorDomain)

let properties: [CFString: Any] = [
    kVTCompressionPropertyKey_AllowFrameReordering: false,
    kVTCompressionPropertyKey_AllowTemporalCompression: false,
    kVTCompressionPropertyKey_RealTime: false,
    kVTCompressionPropertyKey_MaximizePowerEfficiency: false,
    kVTCompressionPropertyKey_ProfileLevel: profileLevel,
    kVTCompressionPropertyKey_Quality: quality.rawValue,
]
status = VTSessionSetProperties(newSession, propertyDictionary: properties as CFDictionary)
try check(status, VideoToolboxErrorDomain) {
    VTCompressionSessionInvalidate(newSession)
}
```

Then I use the following code to encode each grid tile of the image:

```swift
let status = VTCompressionSessionEncodeFrame(
    session,
    imageBuffer: buffer,
    presentationTimeStamp: presentationTimeStamp,
    duration: frameDuration,
    frameProperties: nil,
    infoFlagsOut: nil) { [weak self] status, _, sampleBuffer in
        try check(status, VideoToolboxErrorDomain)
        if let sampleBuffer {
            let encodedImage = try self.encodedImage(from: sampleBuffer)
            // handle encodedImage
        }
    }
try check(status, VideoToolboxErrorDomain)
```

If I try to display this abnormal image in the app, my console outputs the following errors, so it can be inferred that the issue probably occurs during decoding:

```
createImageBlock:3029: *** ERROR: CGImageBlockCreate {0, 0, 2316, 6176} - data is NULL
callDecodeImage:2411: *** ERROR: decodeImageImp failed - NULL _blockArray
createImageBlock:3029: *** ERROR: CGImageBlockCreate {0, 0, 2316, 6176} - data is NULL
callDecodeImage:2411: *** ERROR: decodeImageImp failed - NULL _blockArray
createImageBlock:3029: *** ERROR: CGImageBlockCreate {0, 0, 2316, 6176} - data is NULL
callDecodeImage:2411: *** ERROR: decodeImageImp failed - NULL _blockArray
```

It needs to be emphasized again that this code used to work fine, and the issue only occurs on iOS 26 physical devices. I noticed that iOS 26 has introduced many new properties, but I'm not sure whether some of them must be set on the new system, and there's no information about this in the official documentation.
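One way to narrow this down (an editor's sketch, not from the post) is to dump the HEVC parameter sets from the encoder's CMFormatDescription on iOS 18 and iOS 26 and diff them, since the decoderConfig is derived from exactly this data:

```swift
import Foundation
import CoreMedia

// Print every HEVC parameter set (VPS/SPS/PPS) attached to an encoded
// sample buffer's format description, as hex, for cross-OS comparison.
func dumpHEVCParameterSets(of sampleBuffer: CMSampleBuffer) {
    guard let desc = CMSampleBufferGetFormatDescription(sampleBuffer) else { return }

    // First call with index 0 and nil outputs to retrieve the count.
    var count = 0
    CMVideoFormatDescriptionGetHEVCParameterSetAtIndex(
        desc, parameterSetIndex: 0,
        parameterSetPointerOut: nil, parameterSetSizeOut: nil,
        parameterSetCountOut: &count, nalUnitHeaderLengthOut: nil)

    for index in 0..<count {
        var pointer: UnsafePointer<UInt8>?
        var size = 0
        CMVideoFormatDescriptionGetHEVCParameterSetAtIndex(
            desc, parameterSetIndex: index,
            parameterSetPointerOut: &pointer, parameterSetSizeOut: &size,
            parameterSetCountOut: nil, nalUnitHeaderLengthOut: nil)
        if let pointer {
            let hex = Data(bytes: pointer, count: size)
                .map { String(format: "%02x", $0) }.joined()
            print("parameter set \(index): \(hex)")
        }
    }
}
```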
3 replies · 0 boosts · 525 views · Sep ’25
AVCaptureDevice rotationCoordinator modifying CALayer on switching devices
I am trying to use the AVCaptureDevice.RotationCoordinator API to observe angles for preview and capture, and it seems there is an issue with the API when used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) and switching cameras. Here is my setup. The function below is defined in an actor class called CameraManager that performs setup of the rotation coordinator.

```swift
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device,
          let displayLayer = displayLayer else { return }

    cancellables.removeAll()

    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)

    guard let coordinator = rotationCoordinator else { return }

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}
```

This works the very first time, but when I switch cameras and call this function again, it throws a runtime error saying the view's layer was modified from a non-main thread. This happens on the very line where the rotation coordinator is recreated. It's not clear why initialising the rotation coordinator should modify CALayer properties right in its init method.

```
Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace: (
0  UIKitCore                  0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1  UIKitCore                  0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2  QuartzCore                 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3  QuartzCore                 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4  QuartzCore                 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5  QuartzCore                 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6  AVFCapture                 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7  MyApp.debug.dylib          0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8  MyApp.debug.dylib          0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9  MyApp.debug.dylib          0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10 MyApp.debug.dylib          0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11 MyApp.debug.dylib          0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12 MyApp.debug.dylib          0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
```
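If the initializer really does touch the layer, one workaround sketch (an editor's assumption based on the backtrace, not documented behavior) is to hop to the main actor before recreating the coordinator, then hand the result back to the actor-isolated manager:

```swift
import AVFoundation

// Editor's sketch: recreate the coordinator on the main actor, since the
// backtrace suggests its initializer mutates the supplied CALayer.
@MainActor
func makeRotationCoordinator(device: AVCaptureDevice,
                             layer: CALayer) -> AVCaptureDevice.RotationCoordinator {
    AVCaptureDevice.RotationCoordinator(device: device, previewLayer: layer)
}
```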
2 replies · 0 boosts · 597 views · Feb ’25
AVSpeechSynthesizer reads Mandarin as Cantonese (iOS 26 beta 3)
In iOS 26, AVSpeechSynthesizer reads Mandarin with Cantonese pronunciation. No matter how I set the language, or change my phone's system settings, it doesn't work.

```swift
let utterance = AVSpeechUtterance(string: "你好啊")
//let voice = AVSpeechSynthesisVoice(language: "zh-CN") // does not work
let voice = AVSpeechSynthesisVoice(language: "zh-Hans") // does not work either
utterance.voice = voice
let synth = AVSpeechSynthesizer()
synth.speak(utterance)
```
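To see which Chinese voices the system will actually resolve on a given device (an editor's sketch, not from the post), one can enumerate the installed voices and pick one explicitly by identifier instead of by language code:

```swift
import AVFAudio

// Print every installed Chinese voice; passing one of these identifiers
// to AVSpeechSynthesisVoice(identifier:) sidesteps the language-code
// lookup that appears to misresolve here.
for voice in AVSpeechSynthesisVoice.speechVoices() where voice.language.hasPrefix("zh") {
    print(voice.language, voice.identifier, voice.name)
}
```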
1 reply · 0 boosts · 302 views · Aug ’25