Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic (all subtopics). Each post below is followed by its Replies · Boosts · Views · last Activity.

Inquiry about Low-Latency Frame Interpolation using VTFrameProcessor
Hello, I have implemented Low-Latency Frame Interpolation using the VTFrameProcessor framework, based on the sample code from https://developer.apple.com/kr/videos/play/wwdc2025/300. It is currently working well for both LIVE and VOD streams. However, I have a few questions regarding the lifecycle management and synchronization of this feature:

1. Dynamic toggling: Do you recommend enabling/disabling Low-Latency Frame Interpolation dynamically during playback, or is it better practice to configure this only during the initial setup/preparation phase?
2. Synchronization method: I am currently using CADisplayLink to fetch frames and perform interpolation (see the sketch after this post). However, I am concerned that CADisplayLink might not be suitable if I need to toggle the feature on or off during active playback. If CADisplayLink is not ideal for this dynamic scenario, could you recommend an alternative approach or best practice for synchronization?
3. Supported quality/resolution: What are the minimum and maximum video resolutions (or quality levels) supported for frame interpolation? Additionally, is there a recommended resolution for optimal performance?

The documentation on this specific topic seems limited, so I would appreciate any insights or advice. Thank you.
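For concreteness, here is a minimal sketch of the toggle pattern in question. FrameInterpolator is a hypothetical stand-in for the poster's VTFrameProcessor session; only the CADisplayLink pause/resume mechanics are meant to be illustrative:

```swift
import UIKit

// Hypothetical stand-in for the VTFrameProcessor-based session.
struct FrameInterpolator {
    func interpolateNextFrame(hostTime: CFTimeInterval) { /* VTFrameProcessor work */ }
}

final class InterpolationDriver {
    private var displayLink: CADisplayLink?
    private let interpolator = FrameInterpolator()

    // Pausing the link is the usual way to toggle per-frame work
    // without tearing down the processing session.
    func setInterpolationEnabled(_ enabled: Bool) {
        if displayLink == nil, enabled {
            let link = CADisplayLink(target: self, selector: #selector(step(_:)))
            link.add(to: .main, forMode: .common)
            displayLink = link
        }
        displayLink?.isPaused = !enabled
    }

    @objc private func step(_ link: CADisplayLink) {
        // link.targetTimestamp gives the display time the next frame should hit.
        interpolator.interpolateNextFrame(hostTime: link.targetTimestamp)
    }
}
```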
Replies: 0 · Boosts: 0 · Views: 16 · Activity: 17m
_MPRemoteCommandEventDispatch crashes on iOS 26.x devices.
I'm seeing crashes in _MPRemoteCommandEventDispatch on iOS 26.x devices in 3 apps. According to Bugsnag logs they are:

NSInternalInconsistencyException: event dispatch <_MPRemoteCommandEventDispatch: <MPRemoteCommandEvent: 0x11c049500 commandID=THV0 command=<MPRemoteCommand: 0x109ad1ea0 type=Play (0) enabled=YES handlers=[0x109b6a310]> sourceID=(null) ([HostedRoutingSessionDataSource] handleControlSendingCommand<2W5E>)> state:201> deallocated without calling continuation

I attached a log from the Xcode Organizer matching the Bugsnag crash: mpr_remote_command_event.crash

When I set a breakpoint on -[_MPRemoteCommandEventDispatch dealloc], I can see it's hit every time I tap play or pause on the Lock Screen.

```
Thread 0 Crashed:
0   libsystem_kernel.dylib   0x00000002370420cc __pthread_kill + 8 (:-1)
1   libsystem_pthread.dylib  0x00000001e975c810 pthread_kill + 268 (pthread.c:1721)
2   libsystem_c.dylib        0x0000000198f8ff64 abort + 124 (abort.c:122)
3   libc++abi.dylib          0x000000018a7cf808 __abort_message + 132 (abort_message.cpp:66)
4   libc++abi.dylib          0x000000018a7be484 demangling_terminate_handler() + 304 (cxa_default_handlers.cpp:76)
5   libobjc.A.dylib          0x000000018a6cff78 _objc_terminate() + 156 (objc-exception.mm:496)
6   xxxxxxxxxxxxxx           0x00000001003a7db8 CPPExceptionTerminate() + 416 (BSG_KSCrashSentry_CPPException.mm:156)
7   libc++abi.dylib          0x000000018a7cebdc std::__terminate(void (*)()) + 16 (cxa_handlers.cpp:59)
8   libc++abi.dylib          0x000000018a7ceb80 std::terminate() + 108 (cxa_handlers.cpp:88)
9   CoreFoundation           0x000000018d7341c4 __CFRunLoopPerCalloutARPEnd + 256 (CFRunLoop.c:769)
10  CoreFoundation           0x000000018d70bb5c __CFRunLoopRun + 1976 (CFRunLoop.c:3179)
11  CoreFoundation           0x000000018d70aa6c _CFRunLoopRunSpecificWithOptions + 532 (CFRunLoop.c:3462)
12  GraphicsServices         0x000000022e31c498 GSEventRunModal + 120 (GSEvent.c:2049)
13  UIKitCore                0x00000001930ceba4 -[UIApplication _run] + 792 (UIApplication.m:3902)
14  UIKitCore                0x0000000193077a78 UIApplicationMain + 336 (UIApplication.m:5577)
15  xxxxxxxxxxxxxx           0x00000001000c0134 main + 308 (main.swift:15)
16  dyld                     0x000000018a722e28 start + 7116 (dyldMain.cpp:1477)
```

Is the crash happening when the app is being terminated? Thank you!
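The exception message ("deallocated without calling continuation") suggests the dispatch object is released before a handler status reaches the system. For comparison, this is the standard registration pattern, in which every remote command event synchronously returns an MPRemoteCommandHandlerStatus; it is a reference sketch, not a confirmed workaround:

```swift
import AVFoundation
import MediaPlayer

// Standard remote-command registration: every event handler returns a
// status, so the system's event dispatch is always completed.
func registerRemoteCommands(for player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()
    _ = center.playCommand.addTarget { _ in
        player.play()
        return .success
    }
    _ = center.pauseCommand.addTarget { _ in
        player.pause()
        return .success
    }
}
```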
Replies: 2 · Boosts: 2 · Views: 406 · Activity: 5h
FairPlay: SDK4 vs SDK26 credentials/certificate for iOS/tvOS client apps
Hi,

We're updating our KSM to support SPC v2/v3 and currently operate with both legacy SDK4 credentials (ASK + 1024-bit cert) and SDK26 credentials (certificate bundle + provisioning data + 1024/2048-bit keys). Our client apps run across a wide range of iOS/tvOS versions, so we want to follow Apple's recommended client strategy for certificate selection. The docs describe SHA-1 vs SHA-256 in the SPC header, but do not specify which OS versions should use SDK4 vs SDK26 credentials.

Could you clarify:

- Is there an official minimum iOS/tvOS version where you recommend SDK26 credentials for client apps?
- For older OS versions (e.g. iOS 15), is SDK4 still the recommended choice for client apps?
- Are there any official migration guidelines for client apps moving from SDK4 to SDK26 credentials?

Thanks in advance.
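Absent official guidance, one hypothetical interim approach on the client is to gate certificate selection on OS version at the point where the app supplies the application certificate. The iOS/tvOS 18 cutoff and the resource names below are placeholder assumptions, not Apple recommendations:

```swift
import Foundation

// Hypothetical certificate selection: newer OS versions get the SDK26
// certificate bundle, older ones fall back to the SDK4 certificate.
// The version threshold and file names are placeholders.
func applicationCertificate() -> Data? {
    let name: String
    if #available(iOS 18.0, tvOS 18.0, *) {
        name = "fps_certificate_sdk26"
    } else {
        name = "fps_certificate_sdk4"
    }
    guard let url = Bundle.main.url(forResource: name, withExtension: "bin") else {
        return nil
    }
    return try? Data(contentsOf: url)
}
```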
Replies: 1 · Boosts: 0 · Views: 110 · Activity: 8h
Background Upload Extension
Hello,

We are trying to use the new Background Upload Extension to improve uploads of assets (photos, Live Photos, videos) in the background in our application.

1. We implemented the code to upload still images. We would like to do the same for Live Photos, but we are not sure how to proceed, since a Live Photo is composed of two resources that we would like to upload in the same job so that we keep the association between them. Steps: a Live Photo is captured on the device; our application is notified of new content in the Photo Library and queues the asset for upload; the system calls the background upload extension and the Live Photo is prepared for upload, but we can schedule a job for only one resource (still photo or paired video), so we cannot upload both resources in a single job. (A sketch of how the two resources are enumerated follows this post.)

2. Is there a way to synchronize the application and the extension so that the application does not process the same data or execute the same requests as the extension? For example, our application works with tokens, and we would like to prevent those tokens from being consumed by the application and the extension at the same time.

3. Is there a command in Xcode or Terminal to start or stop the extension, similar to what exists for processing tasks (https://developer.apple.com/documentation/backgroundtasks/starting-and-terminating-tasks-during-development)?
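For context, this is how the two Live Photo resources can be enumerated with PhotosKit; it shows why a one-resource-per-job API splits the pair (the extension's scheduling call itself is the subject of the question and is not shown):

```swift
import Photos

// A Live Photo carries two resources: the still photo and the paired
// video. Both are needed to reassemble the Live Photo server-side.
func livePhotoResources(for asset: PHAsset) -> (photo: PHAssetResource?, pairedVideo: PHAssetResource?) {
    let resources = PHAssetResource.assetResources(for: asset)
    return (resources.first { $0.type == .photo },
            resources.first { $0.type == .pairedVideo })
}
```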
Replies: 3 · Boosts: 1 · Views: 533 · Activity: 9h
Repeated NUIdentifier crash
Some of our app's users are repeatedly running into a crash in NeutrinoCore at -[NUIdentifier initWithNamespace:name:version:] + 2352. From the stack trace it looks like multiple threads are performing PHFetchRequests, but that shouldn't cause a crash. It's isolated to a small number of users, which makes me think it's something related to their specific Photos databases (e.g., data corruption). Do you have any suggestions for how I might resolve this?

```
2  libsystem_c.dylib          abort + 124
3  NeutrinoCore               -[NUAssertionPolicyCrashReport notifyAssertion:] + 66
4  NeutrinoCore               -[NUAssertionPolicyComposite notifyAssertion:] + 160
5  NeutrinoCore               -[NUAssertionPolicyUnique notifyAssertion:] + 176
6  NeutrinoCore               -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7  NeutrinoCore               _NUAssertFailHandler + 176
8  NeutrinoCore               -[NUIdentifier initWithNamespace:name:version:] + 2352
9  NeutrinoCore               -[NUIdentifier initWithName:version:] + 84
10 NeutrinoCore               -[NUIdentifier initWithName:] + 68
11 PhotoImaging               +[PISchema identifier] + 36
12 PhotoImaging               +[PISchema registeredPhotosSchemaIdentifier] + 32
13 PhotoImaging               +[PIPhotoEditHelper newComposition] + 28
14 PhotoImaging               +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15 PhotoImaging               +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16 PhotoLibraryServices       -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1848
17 PhotoLibraryServices       +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18 Photos                     __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19 Photos                     -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20 Photos                     -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21 Photos                     -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 356
22 Photos                     -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23 Photos                     -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24 Photos                     -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 92
25 Photos                     -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26 Photos                     __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27 PhotoLibraryServicesCore   __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]block_invoke.84 + 880
28 CoreFoundation             invoking + 148
29 CoreFoundation             -[NSInvocation invoke] + 424
30 Foundation                 <deduplicated_symbol> + 16
31 Foundation                 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 528
32 Foundation                 __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33 libxpc.dylib               _xpc_connection_reply_callout + 124
42 libsystem_pthread.dylib    start_wqthread + 8
```
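Since the crash sits under PHContentEditingInput's adjustment-data path, one thing worth trying (an assumption, not a confirmed fix) is declining adjustment data, so Photos returns the rendered asset instead of deserializing the possibly corrupt saved composition:

```swift
import Photos

// Decline all adjustment data so Photos skips deserializing saved edits,
// which is where the reported assert fires.
func requestRenderedInput(for asset: PHAsset,
                          completion: @escaping (PHContentEditingInput?) -> Void) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    options.canHandleAdjustmentData = { _ in false }
    _ = asset.requestContentEditingInput(with: options) { input, _ in
        completion(input)
    }
}
```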
Replies: 0 · Boosts: 0 · Views: 44 · Activity: 10h
USDZ model files turn black when opened in Preview
On macOS 26.2 (Tahoe), the Preview app fails to render many USDZ models correctly. The issue appears inconsistent:

• Some USDZ files open normally in Preview
• Some USDZ files open but the viewport turns completely black (no geometry, no material, no lighting)
• All of the same files open correctly via "Open With → Xcode"

This was not the behavior on macOS 15.7.2, where the exact same USDZ files rendered consistently in Preview without failures.
Replies: 1 · Boosts: 0 · Views: 93 · Activity: 14h
Cannot generate 2048-bit FairPlay Streaming certificate
Hello, I have a problem generating a 2048-bit FairPlay Streaming certificate. I tried generating an SDK v26.x certificate in two ways:

(1) Use an existing certificate
(2) Create a new certificate

In both cases, Apple gives me a certificate bundle containing a 1024-bit certificate (fps_certificate.bin), even though I uploaded a 2048-bit CSR when creating the certificate. Just to note, I created an SDK v4.x certificate a few years ago. Has anyone bumped into the same issue, or am I missing something?
Replies: 2 · Boosts: 0 · Views: 233 · Activity: 20h
AudioOutputUnitStart takes ~500 ms when using Push-to-Talk framework after beginTransmission
I’m working with the Push-to-Talk (PTT) framework and observing a consistent delay when starting audio capture.

Scenario:
- A PTT call is already active
- The AVAudioSession is fully configured
- I request beginTransmission on the PTT channel
- I start my Audio Unit for recording (AudioOutputUnitStart)

Observed behavior: AudioOutputUnitStart takes ~500 ms. This happens whether I start the Audio Unit after didBeginTransmission or after AVAudioSession didActivate.

Comparison: using the same Audio Unit, same format, and same configuration, but without the PTT framework, AudioOutputUnitStart takes ~200 ms.

Additional notes: I am not modifying or reconfiguring AVAudioSession when requesting beginTransmission; the audio session is already set up when the PTT call starts; there are no interruptions or route changes at the time of starting the Audio Unit.

Impact: this extra latency is significant for push-to-talk use cases, where fast transmit start is critical.
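For anyone reproducing this, a small sketch of measuring the start latency around the call itself; audioUnit is assumed to be an I/O unit that has already been configured and initialized elsewhere:

```swift
import AudioToolbox
import Foundation
import QuartzCore

// Measure how long AudioOutputUnitStart blocks before returning.
func startAndMeasure(_ audioUnit: AudioUnit) {
    let t0 = CACurrentMediaTime()
    let status = AudioOutputUnitStart(audioUnit)
    let ms = (CACurrentMediaTime() - t0) * 1000
    print("AudioOutputUnitStart -> \(status), took \(String(format: "%.1f", ms)) ms")
}
```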
Replies: 1 · Boosts: 0 · Views: 197 · Activity: 1d
Facing an issue with FairPlay Streaming Server SDK 26.0.0
I am trying to build the server for testing on Linux (AlmaLinux 9 VM):

```
NAME="AlmaLinux"
VERSION="9.7 (Moss Jungle Cat)"
ID="almalinux"
ID_LIKE="rhel centos fedora"
VERSION_ID="9.7"
PLATFORM_ID="platform:el9"
PRETTY_NAME="AlmaLinux 9.7 (Moss Jungle Cat)"
ANSI_COLOR="0;34"

[azuki@AlmaDevVM ~]$ uname -m
x86_64
```

I have tried the following steps.

Build the library. Before starting, I ensured Swift 6 is installed (see https://www.swift.org/install/ for instructions). In Terminal, I used the following commands to compile the Swift library:

```
cd Development/Key_Server_Module/Swift
swift build -Xbuild-tools-swiftc -DTEST_CREDENTIALS
```

After building the library, I ran the test cases to ensure the library behaves as expected; all unit tests pass with the development credentials. Since I am using an x86_64 machine:

```
export LD_LIBRARY_PATH=./Sources/prebuilt/x86_64-unknown-linux-gnu/
swift test -Xbuild-tools-swiftc -DTEST_CREDENTIALS --disable-swift-testing
```

Build the server (Apache).

a. Before starting, I installed Apache HTTPD and the dev tools:

```
yum install httpd httpd-devel redhat-rpm-config
```

b. I then integrated the Swift library built above into the Apache server environment, building the module with apxs (x86_64 machine):

```
apxs -i -a -c -Wl,-L${PWD}/.build/x86_64-unknown-linux-gnu/debug/ -Wl,-lswift_fpssdk -Wl,-L${PWD}/Sources/prebuilt/x86_64-unknown-linux-gnu -lfpscrypto -Wl,-R${PWD}/.build/x86_64-unknown-linux-gnu/debug server_setup/mod_fps.c
```

c. Next, I copied the dependent libraries to the Apache modules folder:

```
cp Sources/prebuilt/x86_64-unknown-linux-gnu/libfpscrypto.so /usr/lib64/httpd/modules/libfpscrypto.so
cp .build/x86_64-unknown-linux-gnu/debug/libswift_fpssdk.so /usr/lib64/httpd/modules/libswift_fpssdk.so
```

d. I configured Apache HTTPD by adding the module and handler to the configuration (/etc/httpd/conf/httpd.conf). Note that the apxs command may automatically add the LoadModule line in the previous step.

```
Listen 8080
LoadFile /usr/lib64/httpd/modules/libfpscrypto.so
LoadFile /usr/lib64/httpd/modules/libswift_fpssdk.so
LoadModule fps_module /usr/lib64/httpd/modules/mod_fps.so
<Location "/fps">
    SetHandler fps_handler
</Location>
```

Then I copied the credentials to the Apache modules folder:

```
cp -r ../credentials /usr/lib64/httpd/modules/
export FPS_CERT_PATH=/usr/lib64/httpd/modules/credentials/test_certificates.json
```

e. Run the server with the configured module:

```
httpd -D FOREGROUND
```

No issues seen up to this step. Getting the SDK version works:

```
[azuki@AlmaDevVM Key_Server_Module]$ curl localhost:8080/fps/v
26.0.0
```

But when I try to generate a license:

```
[azuki@AlmaDevVM Key_Server_Module]$ curl -d ../Test_Inputs/iOS/spc_ios_hd_lease_2048.json localhost:8080/fps
{"fairplay-streaming-response":{"create-ckc":[{"id":1,"status":-42601}]}}
```

Can you please suggest what I might be missing here?
Replies: 8 · Boosts: 0 · Views: 932 · Activity: 1d
AVAudioEngine fails to start during FaceTime call (error 2003329396)
Is it possible to perform speech-to-text using AVAudioEngine to capture microphone input while on a FaceTime call at the same time? I tried implementing this, but whenever I attempt to start the AVAudioEngine while a FaceTime call is active, I get the following error:

"The operation couldn’t be completed. (OSStatus error 2003329396)"

I assume this might be due to microphone resource restrictions during FaceTime, but I’d like to confirm whether this limitation is at the system level, or if there’s any possible workaround or entitlement that allows concurrent microphone access. Has anyone encountered this issue or found a solution?
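As an aside, OSStatus values like this are often four-character codes, and decoding makes them searchable. 2003329396 is 0x77686174, i.e. 'what', which I read as the unspecified AVAudioSession error; that only confirms the session refused to start, not why:

```swift
import Foundation

// Render an OSStatus as its four-character code when it is one.
func fourCC(_ status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = [24, 16, 8, 0].map { UInt8((n >> $0) & 0xFF) }
    guard bytes.allSatisfy({ (0x20...0x7E).contains($0) }) else { return "\(status)" }
    return "'" + String(bytes: bytes, encoding: .ascii)! + "'"
}

print(fourCC(2003329396)) // prints 'what'
```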
Replies: 1 · Boosts: 1 · Views: 427 · Activity: 3d
Audio DSP Processing Issue / Metallic Ringing Artifacts when recording acoustic instruments on iPhone 17 Pro Max
Description: I have identified a specific issue when recording acoustic guitar and other instruments on the iPhone 17 Pro Max using native applications (Voice Memos, Camera). The recordings contain an unnatural metallic resonance (ringing artifacts) that should not be present.

Testing and methodology:
- Hardware verification: initially, I suspected a hardware defect in the audio chip or microphone. However, extensive testing with third-party software suggests this is likely a software-level issue.
- AudioShare test: I conducted a test using the AudioShare app in "Measurement Mode" (which bypasses standard iOS system-wide audio processing). In this mode, the audio remains perfectly clean, and the metallic ringing disappears entirely.

Conclusion: the issue is rooted in the DSP (digital signal processing) algorithms that iOS applies for noise suppression or voice enhancement. These algorithms appear to misinterpret the high-frequency overtones of acoustic instruments as background noise and attempt to "filter" them, resulting in audible digital artifacts.

Comparison results: this issue has not been observed on devices from other brands or on older iPhone models (preliminary tests suggest older models handle this better). Notably, the problem persists even in GarageBand, as the app still utilizes certain system-level processing layers.

Proposed solution: I suggest adding a "Raw Audio" or "Instrument Mode" toggle within the Microphone/Audio settings for native apps. This mode should disable aggressive DSP processing, similar to how AVAudioSession.Mode.measurement works in specialized apps.

Attachments: I am attaching 4 archives, including a final "Measurement Mode" folder with comparative samples (Measurement Mode vs. Standard Mode). The artifacts are most prominent when monitored through headphones.
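For third-party apps hitting the same processing, the mode the post references is set with standard AVAudioSession API; it opts the session out of the voice-processing chain (native apps such as Voice Memos cannot be configured this way):

```swift
import AVFoundation

// Use measurement mode, which applies minimal system processing to input.
func configureRawRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.record, mode: .measurement)
    try session.setActive(true)
}
```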
Replies: 0 · Boosts: 0 · Views: 54 · Activity: 4d
USB microphone input : Mac "Designed for iPad"
My app - natively iOS but built with the "Designed for iPad" option to run on Mac - does not recognise an attached USB microphone when running on a Mac. This line:

```objc
int32_t items = (int32_t)[[[AVAudioSession sharedInstance] availableInputs] count];
```

returns 1, which is the Mac internal mic. On iPad and iPhone it sees both the internal mic and the USB mic. Is this an inherent "Designed for iPad" restriction, and is there some trick I can pull to get the USB microphone recognised by the system?
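For debugging, a sketch that dumps every input port the session reports, which makes the iPad vs. Mac difference easy to compare (same availableInputs API as the line above, in Swift):

```swift
import AVFoundation

// List every input the audio session currently offers, with its port type.
func dumpAvailableInputs() {
    let session = AVAudioSession.sharedInstance()
    for port in session.availableInputs ?? [] {
        print("\(port.portName) [\(port.portType.rawValue)] uid=\(port.uid)")
    }
}
```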
Replies: 1 · Boosts: 0 · Views: 209 · Activity: 4d
AVCam Sample Code - Undesired "Jump" in Video Recording Image
On iPhone 16 Pro Max (not tested on other devices) there's a noticeable jump in the framing of the preview video when you record in the iOS AVCam sample app. The same jump in camera framing can be observed by switching to the front-facing camera and then back to the rear one. It looks roughly consistent with switching between the 0.5x and 1x cameras (but not quite a match for the same viewable area in the Camera app). It only happens when the app is initially loaded; once recording is started, it retains the 'closer' image no matter how many times recording is stopped/started thereafter. I'm relatively new to Swift and haven't done anything with the camera before, so odd 'buggy' behaviour in the sample code isn't helping me understand it! :-) Is there any way to fix this?
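One hedged guess: on Pro devices the sample may select a virtual multi-camera device, and the initial switch-over between its constituent cameras can look like a framing jump. A way to test that theory is to pin capture to the single wide-angle camera; this is a diagnostic sketch, not a confirmed fix for the sample:

```swift
import AVFoundation

// Prefer the plain wide-angle camera over virtual multi-camera devices,
// so no constituent-camera switch-over can change the framing.
func fixedWideAngleCamera() -> AVCaptureDevice? {
    AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
}
```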
Replies: 0 · Boosts: 0 · Views: 144 · Activity: 5d
SpeechAnalyzer > AnalysisContext lack of documentation
I'm using the new SpeechAnalyzer framework to detect certain commands and want to improve accuracy by giving context. AnalysisContext seems to be the solution for this, but I couldn't find any usage example, so I want to make sure whether I'm doing it right:

```swift
let context = AnalysisContext()
context.contextualStrings = [
    AnalysisContext.ContextualStringsTag("commands"): [
        "set speed level", "set jump level", "increase speed", "decrease speed", ...
    ],
    AnalysisContext.ContextualStringsTag("vocabulary"): [
        "speed", "jump", ...
    ]
]
try await analyzer.setContext(context)
```

With this implementation, it still gives outputs like "Set some speed level", "It's speed level", etc. Also, is it possible to make it expect a number after those commands, in order to eliminate results like "set some speed level to" (transcribing "to" instead of "two")?
Replies: 1 · Boosts: 0 · Views: 396 · Activity: 5d
ApplicationMusicPlayer fails to play in macCatalyst 26.3 due to RemotePlayerService crash
I've filed this as FB21446798 but figured I'd post here too. In the first build of macOS 26.3, playback via ApplicationMusicPlayer is completely broken. When starting playback of anything at all, the console shows the following errors:

```
applicationController: xpc service connection interrupted
Failed to obtain remoteObject: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated from this process." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated from this process.}
Failed to prepareToPlay with error: Error Domain=MPMusicPlayerControllerErrorDomain Code=10 "(null)" UserInfo={NSUnderlyingError=0xc92910ff0 {Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated from this process." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated from this process.}}}
```

In addition, several crash logs for RemotePlayerService are generated, showing my app as the parent process. This issue is 100% repeatable. No matter how I load the queue, whether it's catalog or library content, every variation I can think of fails like this. I really hope this can be fixed before 26.3 comes out, otherwise my app will be totally unusable. 😅
Replies: 2 · Boosts: 0 · Views: 529 · Activity: 5d
Are White Balance gains applied before or after ADC?
At which point in the image processing pipeline does iOS apply the white balance gains which can be set via AVCaptureDevice.setWhiteBalanceModeLocked(with:completionHandler:)? Are those gains applied in the analog part of the camera pipeline, before the pixel voltage gets converted via the ADC to digital values? Or does the camera first convert the pixel voltages to digital values and then the gains are applied to the digital values? Is this consistent across devices or can the behavior vary from device to device?
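For reference, the call in question with the gains clamped to the device-reported range; this is standard AVCaptureDevice API, and the analog-vs-digital question stands regardless of where the pipeline applies the gains:

```swift
import AVFoundation

// Lock white balance to specific per-channel gains, clamping each gain
// to the valid range [1.0, maxWhiteBalanceGain].
func lockWhiteBalance(_ device: AVCaptureDevice,
                      gains: AVCaptureDevice.WhiteBalanceGains) throws {
    var g = gains
    let maxGain = device.maxWhiteBalanceGain
    g.redGain   = min(max(g.redGain, 1.0), maxGain)
    g.greenGain = min(max(g.greenGain, 1.0), maxGain)
    g.blueGain  = min(max(g.blueGain, 1.0), maxGain)

    try device.lockForConfiguration()
    device.setWhiteBalanceModeLocked(with: g) { _ in }
    device.unlockForConfiguration()
}
```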
Replies: 1 · Boosts: 0 · Views: 262 · Activity: 5d
How to mark Audio Unit as dirty (needing to be saved)
I'm working on a v2 Audio Unit that has some complicated internal state (audio, MIDI, other settings). When the internal state changes, I want to inform the host (e.g. Logic Pro) that my plugin state has changed and that the main window should show the 'project changed' status through the window close button. This was easy to achieve for the VST version of the plugin, but I can't figure out a way to do it for the Audio Unit. I've tried:

- Notifying a change of the kAudioUnitProperty_ClassInfo property that stores the plugin state: unit->PropertyChanged(kAudioUnitProperty_ClassInfo, kAudioUnitScope_Global, 0);
- Setting the kAudioUnitProperty_ClassInfo property value each time the plugin state changes.
- Adding a new parameter called 'dirtystate' and toggling it, notifying the change each time the plugin state changes.

But nothing really makes Logic take notice. This should be an easy task, but I can't put my finger on it. How do I flag my AUv2 as needing its state saved (i.e. the host project needs saving)?
Replies: 0 · Boosts: 0 · Views: 120 · Activity: 6d
AppleAVBAudio assertion information
Hi, I'm currently developing an AVB hardware device, and I'm stuck because the Apple AVB stack is throwing errors at me without much information. Is there any way to get more information about these assertions and why they are happening? Furthermore, is there any documentation on the AppleAVBAudio module? It would be very handy.

Here are the logs shown in the Console, filtering the log data using process == "coreaudiod":

```
Timestamp                       Thread    Type     Activity  PID    TTL
2025-12-05 15:44:27.087043+0100 0x15ae74  Default  0x0       12965  0  coreaudiod: (AppleAVBAudio) Assert: <private> (value 0x0 0), <private> file: <private>, line: 1533
2025-12-05 15:44:27.087545+0100 0x15ae74  Default  0x0       12965  0  coreaudiod: (AppleAVBAudio) Assert: <private> (value 0x0 0), <private> file: <private>, line: 1533
(the same assert repeats 14 more times, roughly every 0.5 ms, through 15:44:27.094543+0100)
```
Replies: 3 · Boosts: 0 · Views: 218 · Activity: 6d
Is the output frame rate of a CMIOExtension rounded or capped?
I made a CMIOExtension (a virtual camera) which generates its own output, for use in our in-house software testing. I wanted to make a video source with 29.97, 30, 59.94, and 60 fps output. To this end, I created a CMIOExtensionDeviceSource which creates a CMIOExtensionDevice with one CMIOExtensionStreamSource carrying various stream formats in its [CMIOExtensionStreamFormat], including one with both maxFrameDuration and minFrameDuration = CMTimeMake(value: 1000, timescale: 30000) and another with both maxFrameDuration and minFrameDuration = CMTimeMake(value: 1001, timescale: 30000). I've held off on creating the 59.94/60 fps source until this problem is resolved.

My virtual camera works and produces a signal, but when I examine its associated AVCaptureDevice in the debugger, I find:

```
(lldb) po self.captureDevice?.formats[0].videoSupportedFrameRateRanges[0].maxFrameDuration
▿ Optional<CMTime>
  ▿ some : CMTime
    - value : 1000000
    - timescale : 30000000
    ▿ flags : CMTimeFlags
      - rawValue : 1
    - epoch : 0
```

I get the same value, 1000000/30000000 (exactly 30 fps), for all the formats of my AVCaptureDevice. Is there something I'm doing wrong, or do CMIOExtensionDevices always round the frame rates? I can't force CoreMediaIO to produce frames at exactly my desired frame interval, but I'd like to ensure that the average frame rate is my desired rate. How can I do that? Frame emission is governed by a repeating DispatchSourceTimer with a repeat time specified in nanoseconds and the TimerFlags set to .strict.
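On the average-rate point: a repeating interval rounded to whole nanoseconds (1001/30000 s ≈ 33,366,666.7 ns) drifts slightly over time, whereas scheduling each frame from an absolute start time keeps the long-run average exact. A sketch of that pattern, with the emit path left as a closure:

```swift
import Dispatch

final class FramePacer {
    private let timer: DispatchSourceTimer
    private var frameIndex: Int64 = 0
    private let start = DispatchTime.now()
    // Frame duration 1001/30000 s, expressed as 1_001_000_000 ns / 30
    // and kept rational to avoid per-tick rounding drift.
    private let frameNumNs: Int64 = 1_001_000_000
    private let frameDen: Int64 = 30

    init(queue: DispatchQueue, emitFrame: @escaping () -> Void) {
        timer = DispatchSource.makeTimerSource(flags: .strict, queue: queue)
        timer.setEventHandler { [weak self] in
            guard let self else { return }
            emitFrame()
            self.frameIndex += 1
            // The next deadline derives from the frame count and the start
            // time, so rounding error cannot accumulate across frames.
            let offsetNs = (self.frameIndex * self.frameNumNs) / self.frameDen
            self.timer.schedule(deadline: self.start + .nanoseconds(Int(offsetNs)))
        }
        timer.schedule(deadline: start)
        timer.activate()
    }
}
```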
Replies: 2 · Boosts: 0 · Views: 543 · Activity: 1w