Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic. Each post below is followed by its reply count, boost count, view count, and date of most recent activity.
Can we say that HLS uses UDP?
Hello, Apple video engineers. According to the official documentation, HLS is built on HTTP and traditionally ran on top of TCP. However, with the introduction of HTTP/3, which uses QUIC (running over UDP), I would like to clarify the following: Has the official HLS specification changed in a way that allows it to be considered UDP-based when using HTTP/3? Is it fair to say that HLS supports UDP, since the transport can go over HTTP/3 and QUIC? Or would it be more accurate to say that HLS remains HTTP-dependent, and that the transport protocol (TCP or QUIC) only determines how the HTTP requests are delivered? My thoughts: since HTTP/3 uses QUIC running over UDP, we still can't say that HLS supports UDP in the classical sense, the way RTP, RTSP, or SRT do.
Replies: 0 · Boosts: 0 · Views: 434 · Last activity: Feb ’25
AVMIDIPlayer.play() function crashes on iOS 18
It only occurs on iOS 18 and later. Backtrace attached below.

Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: NoteKeys [24384]
Triggered by Thread: 0

Last Exception Backtrace:
0   CoreFoundation      0x1a2d4c7cc __exceptionPreprocess + 164 (NSException.m:249)
1   libobjc.A.dylib     0x1a001f2e4 objc_exception_throw + 88 (objc-exception.mm:356)
2   CoreFoundation      0x1a2e47748 +[NSException raise:format:] + 128 (NSException.m:0)
3   AVFAudio            0x1bd41f4c8 -[AVMIDIPlayer play:] + 300 (AVMIDIPlayer.mm:145)
4   NoteKeys            0x1023c0670 SoundGenerator.playData() + 20 (SoundGenerator.swift:170)
5   NoteKeys            0x1023c0670 EditViewController.playBtnTapped(startIndex:) + 940 (EditViewController.swift:2034)
6   NoteKeys            0x1024497fc specialized Keyboard.playBtnTapped(sender:) + 1904 (Keyboard.swift:1249)
7   NoteKeys            0x10244631c Keyboard.playBtnTapped(sender:) + 4 (<compiler-generated>:0)
8   NoteKeys            0x10244631c @objc Keyboard.playBtnTapped(sender:) + 48
9   UIKitCore           0x1a58739cc -[UIApplication sendAction:to:from:forEvent:] + 100 (UIApplication.m:5816)
10  UIKitCore           0x1a58738a4 -[UIControl sendAction:to:forEvent:] + 112 (UIControl.m:942)
11  UIKitCore           0x1a58736f4 -[UIControl _sendActionsForEvents:withEvent:] + 324 (UIControl.m:1013)
12  UIKitCore           0x1a5fe8d8c -[UIButton _sendActionsForEvents:withEvent:] + 124 (UIButton.m:4198)
13  UIKitCore           0x1a5fea5a0 -[UIControl touchesEnded:withEvent:] + 400 (UIControl.m:692)
14  UIKitCore           0x1a57bb9ac -[UIWindow _sendTouchesForEvent:] + 852 (UIWindow.m:3318)
15  UIKitCore           0x1a57bb3d8 -[UIWindow sendEvent:] + 2964 (UIWindow.m:3641)
16  UIKitCore           0x1a564fb70 -[UIApplication sendEvent:] + 376 (UIApplication.m:12972)
17  UIKitCore           0x1a565009c __dispatchPreprocessedEventFromEventQueue + 1048 (UIEventDispatcher.m:2686)
18  UIKitCore           0x1a5659f3c __processEventQueue + 5696 (UIEventDispatcher.m:3044)
19  UIKitCore           0x1a5552c60 updateCycleEntry + 160 (UIEventDispatcher.m:133)
20  UIKitCore           0x1a55509d8 _UIUpdateSequenceRun + 84 (_UIUpdateSequence.mm:136)
21  UIKitCore           0x1a5550628 schedulerStepScheduledMainSection + 172 (_UIUpdateScheduler.m:1171)
22  UIKitCore           0x1a555159c runloopSourceCallback + 92 (_UIUpdateScheduler.m:1334)
23  CoreFoundation      0x1a2d20328 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28 (CFRunLoop.c:1970)
24  CoreFoundation      0x1a2d202bc __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2014)
25  CoreFoundation      0x1a2d1ddc0 __CFRunLoopDoSources0 + 244 (CFRunLoop.c:2051)
26  CoreFoundation      0x1a2d1cfbc __CFRunLoopRun + 840 (CFRunLoop.c:2969)
27  CoreFoundation      0x1a2d1c830 CFRunLoopRunSpecific + 588 (CFRunLoop.c:3434)
28  GraphicsServices    0x1eecfc1c4 GSEventRunModal + 164 (GSEvent.c:2196)
29  UIKitCore           0x1a5882eb0 -[UIApplication _run] + 816 (UIApplication.m:3844)
30  UIKitCore           0x1a59315b4 UIApplicationMain + 340 (UIApplication.m:5496)
31  NoteKeys            0x10254bc10 main + 68 (AppDelegate.swift:15)
32  dyld                0x1c870aec8 start + 2724 (dyldMain.cpp:1334)

Thanks very much for any help :)
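A minimal sketch for reference (assumed data and sound-bank URLs, not the poster's code and not a confirmed fix for the iOS 18 exception): construct the AVMIDIPlayer for the data being played, call prepareToPlay() before play(), and keep a strong reference so the player is not deallocated mid-playback.

import AVFoundation

func makeAndPlayMIDI(midiData: Data, soundBankURL: URL) throws -> AVMIDIPlayer {
    // Create a fresh player for this data; the throwing initializer surfaces bad input early.
    let player = try AVMIDIPlayer(data: midiData, soundBankURL: soundBankURL)
    player.prepareToPlay()                 // pre-roll before calling play()
    player.play {
        print("Finished at \(player.currentPosition) of \(player.duration) seconds")
    }
    return player                          // caller must retain this reference while playing
}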
Replies: 1 · Boosts: 0 · Views: 306 · Last activity: Feb ’25
AVPlayer error: Too many open files
For some users in production, there's a high probability that after launching the app, using AVPlayer to play any local audio resource results in the following error, and restarting the app doesn't help:

error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (24), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x30311f270 {Error Domain=NSPOSIXErrorDomain Code=24 "Too many open files"}}

I've checked the code, and there aren't actually multiple AVPlayers playing simultaneously. What could be causing this?
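As a diagnostic sketch (not from the original post; the fcntl-based probe is an assumption about what is available to the process), counting the descriptors currently open can confirm whether the process is really approaching the limit behind errno 24 (EMFILE) before AVPlayer fails:

import Darwin

// Count file descriptors that are currently open in this process.
func openFileDescriptorCount() -> Int {
    let maxFD = getdtablesize()            // per-process descriptor table size
    var count = 0
    for fd in 0..<maxFD where fcntl(fd, F_GETFD) != -1 {
        count += 1                         // fcntl succeeds only for open descriptors
    }
    return count
}

// Log before each playback attempt to see whether descriptors leak over time.
print("Open descriptors: \(openFileDescriptorCount()) / \(getdtablesize())")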
Replies: 0 · Boosts: 0 · Views: 434 · Last activity: Feb ’25
Best way to stream audio from file system
I am trying to stream audio from the local filesystem. For that, I am using an AVAssetResourceLoaderDelegate with an AVURLAsset. However, the Content-Length is not known at the start. To work around this, I tried several approaches: setting the content length to nil in the AVAssetResourceLoadingContentInformationRequest, and setting it to -1. Both of these cause the AVPlayerItem to fail with an error. I also tried setting the Content-Length to INT_MAX together with a renewalDate = Date(timeIntervalSinceNow: 5). However, that seems to be buggy: even after updating the Content-Length to the correct value (e.g. X bytes) and finishing that loading request, the resource loader keeps getting requests with requestedOffset = X and dataRequest.requestsAllDataToEndOfResource = true. These requests keep coming indefinitely, and as a result the next item in the queue does not get played, and the .AVPlayerItemDidPlayToEndTime notification is never posted. Is this expected behavior, or is there a bug in this implementation? Also, what is the recommended way to stream audio of initially unknown length from the local file system? Thanks!
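For comparison, when the total length is knowable by the time the first loading request arrives, a delegate can satisfy AVFoundation like this. This is a minimal sketch with assumed names (LocalFileLoader, an .m4a content type), not an answer for the genuinely unknown-length case:

import AVFoundation
import UniformTypeIdentifiers

final class LocalFileLoader: NSObject, AVAssetResourceLoaderDelegate {
    let fileURL: URL
    init(fileURL: URL) { self.fileURL = fileURL }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Report concrete content info; AVPlayerItem reacts badly to nil or -1 lengths.
        if let info = loadingRequest.contentInformationRequest {
            let attrs = try? FileManager.default.attributesOfItem(atPath: fileURL.path)
            info.contentLength = (attrs?[.size] as? NSNumber)?.int64Value ?? 0
            info.contentType = UTType.mpeg4Audio.identifier   // assumption: an .m4a source
            info.isByteRangeAccessSupported = true
        }
        // Serve the requested byte range from disk.
        if let dataRequest = loadingRequest.dataRequest,
           let handle = try? FileHandle(forReadingFrom: fileURL) {
            try? handle.seek(toOffset: UInt64(dataRequest.requestedOffset))
            dataRequest.respond(with: handle.readData(ofLength: dataRequest.requestedLength))
            try? handle.close()
        }
        loadingRequest.finishLoading()
        return true
    }
}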
Replies: 1 · Boosts: 0 · Views: 172 · Last activity: Mar ’25
MusicKit Web Playback States
In MusicKit on the web, the playback states are provided as numbers. For example, the playbackStateDidChange event listener returns {oldState: 2, state: 3, item: ...} when the state changes from playing (2) to paused (3). Those two are pretty easy to guess, but I'm having a hard time with the others: completed, ended, loading, none, paused, playing, seeking, stalled, stopped, waiting. I cannot find a mapping of states to numbers documented anywhere. I got the above state names from an enum in a d.ts file that is often incorrect or incomplete. Can someone point me to the docs or provide a mapping? Thanks.
Replies: 2 · Boosts: 0 · Views: 440 · Last activity: Feb ’25
H.265 Decoding with VideoToolBox
I am creating an app that decodes H.265 elementary streams on iOS. I use VideoToolBox to decode from H.265 to NV12. The decoded data is enqueued in the CMSampleBufferDisplayLayer as a CMSampleBuffer. However, nothing is displayed in the VideoPlayerView. It remains black. The decoding in VideoToolBox is successful. I confirmed this by saving the NV12 data in the CMSampleBuffer to a file and displaying it using a tool. Why is nothing displayed in the VideoPlayerView? I can provide other source code as well.

//
//  VideoPlayerView.swift
//  H265Decoder
//
//  Created by Kohshin Tokunaga on 2025/02/15.
//

import SwiftUI
import AVFoundation

struct VideoPlayerView: UIViewRepresentable {
    // Return an H265Player as the coordinator, and start playback there.
    func makeCoordinator() -> H265Player {
        H265Player()
    }

    func makeUIView(context: Context) -> UIView {
        let uiView = UIView(frame: .zero)

        // Base layer for attaching sublayers
        uiView.backgroundColor = .black  // Screen background color (for iOS)

        // Create the display layer and add it to uiView.layer
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.bounds
        displayLayer.backgroundColor = UIColor.clear.cgColor
        uiView.layer.addSublayer(displayLayer)

        // Start playback
        context.coordinator.startPlayback()

        return uiView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Reset the frame of the AVSampleBufferDisplayLayer when the view's size changes.
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.layer.bounds

        // Optionally update the layer's background color, etc.
        uiView.backgroundColor = .black
        displayLayer.backgroundColor = UIColor.clear.cgColor

        // Flush transactions if necessary
        CATransaction.flush()
    }
}

//
//  H265Player.swift
//  H265Decoder
//
//  Created by Kohshin Tokunaga on 2025/02/15.
//

import Foundation
import AVFoundation
import CoreMedia

class H265Player: NSObject, VideoDecoderDelegate {

    let displayLayer = AVSampleBufferDisplayLayer()
    private var decoder: H265Decoder?

    override init() {
        super.init()
        // Initial configuration for the display layer
        displayLayer.videoGravity = .resizeAspect

        // Initialize the decoder (delegate = self)
        decoder = H265Decoder(delegate: self)

        // For simple playback, set isBaseline to true
        decoder?.isBaseline = true
    }

    func startPlayback() {
        // Load the file "cars_320x240.h265"
        guard let url = Bundle.main.url(forResource: "temp2", withExtension: "h265") else {
            print("File not found")
            return
        }
        do {
            let data = try Data(contentsOf: url)

            // Set FPS and video size as needed
            let packet = VideoPacket(data: data,
                                     type: .h265,
                                     fps: 30,
                                     videoSize: CGSize(width: 1080, height: 1920))

            // Decode as a single packet
            decoder?.decodeOnePacket(packet)
        } catch {
            print("Failed to load file: \(error)")
        }
    }

    // MARK: - VideoDecoderDelegate
    func decodeOutput(video: CMSampleBuffer) {
        // When decoding is complete, send the output to AVSampleBufferDisplayLayer
        displayLayer.enqueue(video)
    }

    func decodeOutput(error: DecodeError) {
        print("Decoding error: \(error)")
    }
}
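Not a confirmed cause, but two checks that often explain a silently black AVSampleBufferDisplayLayer are worth adding; this is a sketch of a variant of decodeOutput(video:) using the same names as the post: inspect the layer's status/error after enqueueing, and mark decoded buffers for immediate display if their presentation timestamps were never tied to the layer's timebase.

func decodeOutput(video: CMSampleBuffer) {
    // If timestamps are not driven by a timebase, ask the layer to show frames immediately.
    if let attachments = CMSampleBufferGetSampleAttachmentsArray(video, createIfNecessary: true),
       CFArrayGetCount(attachments) > 0 {
        let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0), to: CFMutableDictionary.self)
        CFDictionarySetValue(dict,
                             Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                             Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
    }

    displayLayer.enqueue(video)

    // A failed layer silently drops everything until it is flushed.
    if displayLayer.status == .failed {
        print("Display layer failed: \(String(describing: displayLayer.error))")
        displayLayer.flush()
    }
}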
Replies: 0 · Boosts: 0 · Views: 384 · Last activity: Feb ’25
Unable to match music with ShazamKit for Android
Hello, I can successfully match music using ShazamKit on Apple with SwiftUI - a simple app that lets the user load an audio file and extracts the corresponding match - but I am unable to match music using ShazamKit on Android. I am trying to build the same simple app, but I cannot match music, as I get MATCH_ATTEMPT_FAILED every time I try. I don't know what I am doing wrong, but the Shazam part of the Kotlin Android code is in this method:

suspend fun processAudioFileInBackground(
    filePath: String,
    developerTokenProvider: DeveloperTokenProvider
) = withContext(Dispatchers.IO) {
    val bufferSize = 1024 * 1024
    val audioFile = FileInputStream(filePath)
    val byteBuffer = ByteBuffer.allocate(bufferSize)
    byteBuffer.order(ByteOrder.LITTLE_ENDIAN)
    var bytesRead: Int
    while (audioFile.read(byteBuffer.array()).also { bytesRead = it } != -1) {
        val signatureGenerator = (ShazamKit.createSignatureGenerator(AudioSampleRateInHz.SAMPLE_RATE_44100) as ShazamKitResult.Success).data
        signatureGenerator.append(byteBuffer.array(), bytesRead, System.currentTimeMillis())
        val signature = signatureGenerator.generateSignature()
        println("Signature: ${signature.durationInMs}")
        val catalog = ShazamKit.createShazamCatalog(developerTokenProvider, Locale.ENGLISH)
        val session = (ShazamKit.createSession(catalog) as ShazamKitResult.Success).data
        val matchResult = session.match(signature)
        println("MatchResult : $matchResult")
        setMatchResult(matchResult)
        byteBuffer.clear()
    }
    audioFile.close()
}

I noticed that changing the Locale in the catalog creation gives different results, as I then get NoMatch without an exception. Can you please help me with this? Do I need to create a custom catalog?
Replies: 0 · Boosts: 0 · Views: 117 · Last activity: May ’25
Making sense of AVAudioSession interruption notifications
I have an app under development - demo here: https://youtu.be/VbAfUk_eYl0?si=s6EDBx-4G6P_QbZO - which is essentially an audio player for airdropped files, something useful to musicians who dump work in progress to their phone, make notes, revise and update. I've been testing my handling of audio session interruption notifications, but there seems to be a lot of inconsistency in how, when, and why iOS delivers them, and I'm wondering if there is some rhyme or reason to it that I'm just not detecting.

For example, I am playing a song in my app, then switch to Apple Music and start playing a song there. My app gets an interruption-began notification - this is consistent. When I switch back to my app, about half the time I get an interruption-ended notification (often coupled with a blast of the tail of whatever audio buffer was partially played when the interruption started, even though the engine was stopped, followed by a call to my AVAudioPlayerNodeCompletionCallback - is there some way to avoid this?). The other half of the time I don't get an interruption-ended notification; my app can (as expected) end the interruption by activating the AVAudioSession and playing something.

I have not been able to determine any pattern to this behavior, other than that if my app started playing using AVAudioPlayerNode.scheduleSegment rather than scheduleFile, the notification seems to be consistently delivered on app activation rather than when I activate the session programmatically. I would like my app to behave deterministically, and would appreciate any help in deciphering what causes the inconsistent delivery of these notifications.
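For reference, a minimal observer sketch of the documented contract (nothing here is specific to the poster's app): the .ended case carries an options value whose .shouldResume flag is the only hint iOS gives, and since .ended is not guaranteed to arrive, resuming on foreground activation still has to be handled separately.

import AVFoundation

NotificationCenter.default.addObserver(forName: AVAudioSession.interruptionNotification,
                                       object: AVAudioSession.sharedInstance(),
                                       queue: .main) { note in
    guard let info = note.userInfo,
          let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }
    switch type {
    case .began:
        // Pause playback and note the position; the engine may already be stopped.
        break
    case .ended:
        let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        if AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
            // Reactivate the session and resume playback here.
        }
    @unknown default:
        break
    }
}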
Replies: 2 · Boosts: 0 · Views: 390 · Last activity: Mar ’25
Does PHAsset localIdentifier change after device transfer?
Hi, I'm working on a photo backup app. I track each PHAsset's localIdentifier to determine which photos have been backed up and which haven't. Recently, I've noticed that two users seem to have experienced localIdentifier changes after transferring data to a new iPhone using Quick Start. Additionally, others on Stack Overflow have mentioned that the localIdentifier sometimes changes after an iOS update: https://stackoverflow.com/questions/40094728/phobject-localidentifier-reliability. I'd like to confirm how reliable the localIdentifier is across an iOS version upgrade or a device transfer. Can I continue using these locally stored localIdentifiers, or is there another recommended approach, such as using PHCloudIdentifier?
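A minimal sketch of the PHCloudIdentifier route (iOS 15+; backedUpLocalIDs is a placeholder for whatever identifiers the app has stored): convert local identifiers to cloud identifiers when backing up, then map them back to the current device's local identifiers later.

import Photos

let library = PHPhotoLibrary.shared()

// At backup time: store cloud identifiers alongside the local ones.
let cloudMappings = library.cloudIdentifierMappings(forLocalIdentifiers: backedUpLocalIDs)
let cloudIDs = cloudMappings.values.compactMap { try? $0.get() }

// On a new device or after an update: resolve them back to current local identifiers.
let localMappings = library.localIdentifierMappings(for: cloudIDs)
for (cloudID, result) in localMappings {
    switch result {
    case .success(let localID):
        print("\(cloudID.stringValue) -> \(localID)")
    case .failure(let error):
        print("Could not resolve \(cloudID.stringValue): \(error)")
    }
}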
Replies: 2 · Boosts: 0 · Views: 398 · Last activity: Feb ’25
Feature Request: Long-Lived Access to Personal Apple Music Data
Use Case Summary
I'm developing a personal portfolio website (using Nuxt) and want to display information from my own Apple Music library - showcasing personal playlists, recently played tracks, or a read-only "now playing" widget. This is purely for personal use on my website and doesn't require other users to log in. With Spotify's API, implementing this was straightforward thanks to automatic token refresh. I want a similarly seamless integration with Apple Music.

Challenge with MusicKit and Music User Tokens
Apple Music API requirements: Apple's Music API requires a valid Music User Token (MUT) for requests involving personal library data. Beyond the Apple Developer Token, you must obtain a user-specific token via MusicKit authentication to access your own library playlists, play history, or current playback status.
Token expiration and manual renewal: Music User Tokens expire after approximately 6 months without any mechanism to automatically refresh or renew them - unlike typical OAuth flows that provide refresh tokens. Apple's guidance suggests the device (e.g., iPhone) is responsible for obtaining new user tokens when old ones expire. This works for interactive apps on Apple devices but fails in server-side or long-lived web contexts like a personal website widget.
Impact on personal projects: Displaying Apple Music data on a public-facing site becomes difficult. I would need to periodically re-authenticate through the MusicKit JS flow every few months just to keep a widget alive. Embedding credentials in a public site is insecure, and manual token refreshing is cumbersome and easy to forget.

Comparison to Spotify's Token Model
Spotify's API offers a developer-friendly authentication model. Their OAuth flow provides a refresh token that applications can use to obtain new access tokens automatically without requiring user re-authorization. This means a personal app can maintain continuous access to a user's Spotify data for extended periods until access is revoked. When building a similar feature with Spotify, this automatic token renewal was crucial. I could safely store the refresh token on my server and have my app periodically update the access token. Many developers have created public-facing widgets showing currently playing tracks on blogs or GitHub profiles using this model. Unfortunately, Apple Music's API lacks an equivalent capability, putting it at a disadvantage for personal projects.

Proposed Solutions
I request Apple's consideration for one of these enhancements:
1. Provide a mechanism to refresh or extend a Music User Token programmatically for server-side applications. This could be an OAuth-style refresh token issued alongside the MUT, or a dedicated endpoint to exchange an expired MUT for a new one. This would enable renewal without a full user re-auth/login each time.
2. Allow developers to access their own Apple Music library data with just the long-lived Developer Token. Apple could permit GET requests to personal library endpoints using the Developer Token alone, or a special token tied to the developer's Apple ID. This access would be read-only - no ability to modify the library, purely for retrieving data. It could be an opt-in feature in the Apple Developer account settings.
Either solution would significantly improve the developer experience with the Apple Music API in personal projects.

Security and Privacy Considerations
This request is not about accessing others' data or creating privacy loopholes - it's about empowering an Apple Music subscriber to access their own information more conveniently. The proposed options respect privacy principles:
- The data accessed is only what the user already has access to - their own playlists, library items, or playback status.
- An automatic token refresh can be designed securely (revocable tokens bound to a single account with no increase in permissions).
- Read-only developer token access could be restricted to non-sensitive data and require explicit opt-in.

Conclusion
I request an improvement to Apple Music's developer experience through either (1) an automatic Music User Token refresh mechanism, or (2) a provision for read-only personal library access using a Developer Token. This would bring Apple Music integration capabilities closer to parity with services like Spotify for personal projects. I ask Apple's Developer Relations and the Apple Music API team to consider this feature request. If there are existing best practices or workarounds with current APIs, I would appreciate guidance. I invite feedback from Apple or other developers: are there known patterns for maintaining an Apple Music user token for server-side applications, or any plans to support non-interactive use cases? Any advice is welcome. Thank you for your consideration. I look forward to integrating Apple Music into my personal site as smoothly as with other services, and I believe many developers would benefit from this added flexibility.

Sources:
- User Authentication for MusicKit - requirements for Music User Tokens
- StackOverflow: Do Apple Music User Tokens expire? - confirmation of the 6-month expiration
- MetaBrainz GSoC blog - documentation of MusicKit authentication limitations
- Apple Developer Forums - information on token renewal behavior
- Spotify for Developers - documentation on the refresh token mechanism
Replies: 1 · Boosts: 0 · Views: 256 · Last activity: Mar ’25
Seeking Expert Clarification on MusicKit Usage and Compliance for an App in Saudi Arabia
Hello Apple Developer Community,

We are developing a music management platform for restaurants and cafes in Saudi Arabia. Our app enables businesses to schedule playlists and allows visitors to request songs via barcodes. Music playback is powered by Apple Music, and users must have their own Apple Music subscriptions to access the music. Our service charges a monthly subscription fee for these management features, not for music access itself.

Project Overview and MusicKit Role
Our app integrates MusicKit to leverage Apple Music's catalog and playback capabilities. Users log in with their Apple Music accounts, ensuring they have an active subscription for music playback. Our platform's value lies in its tools - playlist scheduling and song requests - which are built on top of MusicKit's APIs. We offer these features exclusively in Saudi Arabia.

Legal Context in Saudi Arabia
In Saudi Arabia, to our understanding, no special licenses are required for playing music in commercial venues like restaurants and cafes. This means our clients can use Apple Music subscriptions for playback without additional performance rights licenses. While this aligns with local laws, we recognize that Apple's global policies may impose stricter requirements, prompting our need for clarification.

Subscription Model and Monetization Concerns
We charge a monthly subscription fee for access to our app's features (e.g., scheduling playlists and managing song requests). This fee is separate from the Apple Music subscription, which users must maintain for playback. However, Apple's MusicKit terms state: "You agree not to require payment for or indirectly monetize access to the Apple Music service." We're concerned whether our subscription model might be interpreted as indirectly monetizing Apple Music access, given its reliance on MusicKit for functionality.

Scheduling Feature and Synchronization Rights
Our app allows businesses to schedule playlists for general time slots (e.g., "play this playlist from 6 PM to 8 PM"). It does not support precise scheduling, such as playing a specific song at an exact moment (e.g., "play this song at 7:30 PM"). Apple's guidelines mention that "deeper or more complex music integration" may require additional licenses, like synchronization rights. We're unsure if our general scheduling feature crosses this threshold or remains within MusicKit's standard usage.

Questions for Clarification
We'd greatly appreciate expert input on the following:
1. Monetization: Does our subscription fee for management features (scheduling and song requests) violate Apple's policy against indirectly monetizing Apple Music access?
2. Local context: Given that Saudi Arabia requires no additional licenses for commercial music playback, does this affect our compliance with Apple's global terms?
3. Scheduling: Does our playlist scheduling for general time slots (not exact moments) fall within MusicKit's permitted scope, or does it require further licensing?

Thank you in advance for any insights or guidance to ensure our app aligns with Apple's policies!
Replies: 0 · Boosts: 0 · Views: 333 · Last activity: Mar ’25
Converting a 2vuy CVPixelBuffer to a Metal Texture
I am working on a project for macOS where I take an AVCaptureSession's CVPixelBuffer and need to convert it into an MTLTexture for rendering. On macOS the pixel format is 2vuy, and there does not seem to be an obvious pixel-format mapping when creating the Metal texture. I have been able to convert it to a texture, but the color space seems to be off: it renders distorted colors with a double image. I believe 2vuy is a single-plane format and I have tried to account for that, but I am unsure what is wrong. I have attached the CVPixelBuffer, the distorted MTLTexture, and a laundry list of errors. On iOS my conversions are fine; it is only the macOS 2vuy pixel format that has issues. My code for the conversion is also attached. Any suggestions or guidance on how to properly convert a 2vuy CVPixelBuffer to an MTLTexture would be greatly appreciated. Many thanks. Conversion_Logs.txt ConversionCode.swift
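For comparison, a sketch of the CVMetalTextureCache path, under two assumptions worth double-checking: that '2vuy' (kCVPixelFormatType_422YpCbCr8, interleaved Cb Y'0 Cr Y'1) corresponds to MTLPixelFormat.bgrg422, and that the fragment shader still applies a BT.601/BT.709 Y'CbCr-to-RGB conversion; sampling the texture as if it were already RGB is one way to get this kind of color distortion, and a doubled image can point at a width mismatch (the 4:2:2 texture is full pixel width, not half). The `device` variable below stands in for your MTLDevice.

import CoreVideo
import Metal

var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)

func makeTexture(from pixelBuffer: CVPixelBuffer, cache: CVMetalTextureCache) -> MTLTexture? {
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                           cache,
                                                           pixelBuffer,
                                                           nil,
                                                           .bgrg422,   // assumed match for '2vuy'
                                                           width,
                                                           height,
                                                           0,          // single-plane format
                                                           &cvTexture)
    guard status == kCVReturnSuccess, let cvTexture = cvTexture else { return nil }
    return CVMetalTextureGetTexture(cvTexture)   // shader must still convert Y'CbCr to RGB
}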
Replies: 0 · Boosts: 0 · Views: 110 · Last activity: Mar ’25
Crash in Photos framework
Hi, this is one of our top crashes. It does not contain any of our code in the stack trace, and we can't reproduce it, which makes it very hard to understand and fix. We know that most of the crashes happen on iPhone 13 with iOS 18.x.x. We also see that a lot of cases occur when the app goes into the background (the stack trace contains -[UIApplication _applicationDidEnterBackground]). 2025-03-04_16-06-00.3670_-0500-6a273c7d5da97f098b5cc24898bb9761dc45208e.crash 2025-03-04_20-21-08.6609_-0500-2c08f640900f8a62c4f8a4f6f2a61feb052e66dd.crash 2025-03-04_20-46-27.7138_+0000-4d7ea89b1b564eda22ca63e708f7ad3909c7b768.crash
Replies: 2 · Boosts: 0 · Views: 483 · Last activity: Mar ’25
AVPictureInPictureController (PiP) window goes black when the project is built with Xcode 16 or later
I found that when my app is built with Xcode 16 or later, if I enable the Picture in Picture (floating window) feature and then open the system camera, the content in the floating window is not displayed and the window shows a black screen. This does not happen when building with Xcode 15.4. It is the same code, and I do not know why.
Replies: 6 · Boosts: 0 · Views: 689 · Last activity: Apr ’25
Missing Depth Frames When Recording with AVCaptureVideoDataOutputSampleBufferDelegate/AVCaptureDataOutputSynchronizerDelegate and AVAssetWriter
I’ve tried both AVCaptureVideoDataOutputSampleBufferDelegate (captureOutput) and AVCaptureDataOutputSynchronizerDelegate (dataOutputSynchronizer), but the number of depth frames and saved timestamps is significantly lower than the number of frames in the .mp4 file written by AVAssetWriter. In my code, I save:
- Timestamps for each frame to a metadata file
- Depth frames to a binary file
- Video to an .mp4 file
If I record a 4-second video at 30fps, the .mp4 file correctly plays for 4 seconds, but the number of stored timestamps and depth frames is much lower—around 70 frames instead of the expected 120. Does anyone know why this mismatch happens?

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {

    // Read all outputs
    guard let syncedDepthData: AVCaptureSynchronizedDepthData = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData,
          let syncedVideoData: AVCaptureSynchronizedSampleBufferData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
        // only work on synced pairs
        return
    }

    if syncedDepthData.depthDataWasDropped || syncedVideoData.sampleBufferWasDropped {
        return
    }

    let depthData = syncedDepthData.depthData
    let depthPixelBuffer = depthData.depthDataMap
    let sampleBuffer = syncedVideoData.sampleBuffer

    guard let videoPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
          let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        return
    }

    addToPreviewStream?(CIImage(cvPixelBuffer: videoPixelBuffer))

    if !canWrite() {
        return
    }

    // Extract the presentation timestamp (PTS) from the sample buffer
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // sessionAtSourceTime is the first buffer we will write to the file
    if self.sessionAtSourceTime == nil {
        // Make sure we don't start recording until the buffer reaches the correct time
        // (the buffer is always behind; this will fix the difference in time)
        guard sampleBuffer.presentationTimeStamp >= self.recordFromTime! else { return }
        self.sessionAtSourceTime = sampleBuffer.presentationTimeStamp
        self.videoWriter!.startSession(atSourceTime: sampleBuffer.presentationTimeStamp)
    }

    if self.videoWriterInput!.isReadyForMoreMediaData {
        self.videoWriterInput!.append(sampleBuffer)
        self.videoTimestamps.append(
            Timestamp(
                frame: videoTimestamps.count,
                value: timestamp.value,
                timescale: timestamp.timescale
            )
        )
        let ddm = depthData.depthDataMap
        depthCapture.addDepthData(pixelBuffer: ddm, timestamp: timestamp)
    }
}
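Not a diagnosis, but a few configuration points that commonly account for "fewer depth frames than video frames" and are worth ruling out before suspecting the writer (sketch; videoDevice stands in for the AVCaptureDevice in use, videoDataOutput and depthDataOutput are the outputs from the post):

// Late frames and late depth data are silently discarded before the synchronizer sees them.
videoDataOutput.alwaysDiscardsLateVideoFrames = false
depthDataOutput.alwaysDiscardsLateDepthData = false
depthDataOutput.isFilteringEnabled = false          // temporal filtering adds per-frame cost

// Depth often runs at a lower native rate than video; check what the active format delivers.
if let depthFormat = videoDevice.activeDepthDataFormat {
    print("Depth frame rate ranges:", depthFormat.videoSupportedFrameRateRanges)
}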
Replies: 3 · Boosts: 0 · Views: 475 · Last activity: Feb ’25
Setting the capture device color space to Apple Log does not work
I set the device format and color space to Apple Log and turned off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

session.sessionPreset = .inputPriority

// get the back camera
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                              mediaType: .video,
                                                              position: .back)
backCamera = deviceDiscoverySession.devices.first!
try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// add output
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080, "AVVideoCodecKey": apch, <----- ProRes has been enabled.
 "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
*/
previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
Replies: 1 · Boosts: 0 · Views: 454 · Last activity: Mar ’25
Build a vision capture system for Apple Vision Pro
Hi, I am a newbie here. We have been given a task to build a robotic vision system to capture immersive video in a hazy environment, which will later be played on Apple Vision Pro. I am thinking of starting with 2 or 4 basic CMOS camera sensors, such as the IMX378, AR0144, or VD66GY, and designing an FPGA-based circuit to synchronously capture and store raw frame-by-frame data. Some initial frame processing, such as demosaicing and filtering, could also be done by the FPGA. Then I would use software for post-processing to convert the data into a video format compatible with Apple Vision Pro. Will this idea work? I can handle the raw data capture, but I'm unsure whether this approach is feasible and what post-processing software I should use. Thanks a lot for your suggestions! Charlie
Replies: 0 · Boosts: 0 · Views: 334 · Last activity: Feb ’25