Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under the Graphics & Games topic

Post | Replies | Boosts | Views | Activity

How to play Vorbis/OGG files with Swift?
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. I then had to drop into Objective-C++ to fill the PCM, because that part of the API only seems to be available in C++, and that step hangs my app for a good 40 seconds to several minutes depending on the audio file; it then plays for about 2 seconds and crashes. I can't get the examples on the Vorbis site to work in Objective-C, and I tried every example on GitHub I could find (most of which are for iOS; I want to play the files on the Mac). I also tried the Cricket Audio framework: https://github.com/sjmerel/ck It has a Swift example and can play its proprietary soundbank format, and it is also supposed to play OGG, but it simply does nothing when asked to, as you can see in the posted issue https://github.com/sjmerel/ck/issues/3 Right now I believe every player that can play OGGs on the Mac is written in Objective-C or C++. Any help or advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really want to stay in Swift.
2
0
4.4k
Jul ’25
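A minimal sketch of one possible approach for the question above, assuming the OGG is decoded to interleaved Float32 PCM elsewhere (for example via a wrapped libvorbisfile ov_read loop, ideally off the main thread); the sketch only covers scheduling that PCM on AVAudioEngine, which works the same on macOS and iOS:

import AVFoundation

/// Schedules already-decoded, interleaved Float32 PCM on AVAudioEngine.
/// Producing `samples` (for example via a wrapped libvorbisfile ov_read loop)
/// is assumed to happen elsewhere, ideally off the main thread.
func play(samples: [Float], sampleRate: Double, channels: AVAudioChannelCount) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    guard let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: channels),
          let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(samples.count) / channels) else {
        throw NSError(domain: "OggPlayback", code: -1)
    }
    buffer.frameLength = buffer.frameCapacity

    // De-interleave into the buffer's planar float channel data.
    for ch in 0..<Int(channels) {
        let dst = buffer.floatChannelData![ch]
        for frame in 0..<Int(buffer.frameLength) {
            dst[frame] = samples[frame * Int(channels) + ch]
        }
    }

    engine.connect(player, to: engine.mainMixerNode, format: format)
    try engine.start()
    player.scheduleBuffer(buffer, completionHandler: nil)
    player.play()
    return engine   // keep the engine alive for as long as playback should run
}

Scheduling smaller buffers as they are decoded, rather than decoding the whole file before playback starts, should also avoid the long startup hang described in the post.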
Metal IR reference
Hello! I'm developing a GPU (shader) language where I aim to target multiple backends from a common frontend. I'd like to avoid round-tripping through Metal Shading Language and emit the IR directly, just as I do with SPIR-V, to keep compilation fast and efficient. I've been looking for a reference page where I can read about Metal's IR; as far as I'm aware one exists, but I can't seem to find it anywhere. Furthermore, if such a reference is available, is there also a toolkit for validating the output IR, and perhaps running optimizations, much like SPIRV-Tools for SPIR-V? Any help would be appreciated! Thanks, Gustav
2
0
324
Jul ’25
Xcode Metal Trace
The code is downloaded from Apple's official Metal 4 sample [https://developer.apple.com/documentation/metal/drawing-a-triangle-with-metal-4?language=objc]. Enable Metal GPU trace in the macOS scheme and trace a frame in Xcode. Xcode may report a segmentation fault in the app from a 'GTTrace' function when the trace button is clicked. When replaying a .gputrace file, Xcode may crash, throw an internal error, or report an XPC error. The example code using the old Metal renderer can be traced without any problems and everything works fine. Test environment: Xcode Version 26.2 (17C52), macOS 26.2 (25C56), M1 Pro 16GB A2442.
2
0
507
Jan ’26
Will warning code 00000006 affect background survival?
Our app integrates a 3D feature. To reduce the app's memory footprint in the background, we unload the 3D content after the app enters the background. However, the unloading itself causes a problem: when it runs in the background, the app briefly triggers a background GPU rendering error with the warning code OGPUMetalError: Insufficient Permission (to submit GPU work from background) (00000006:kIOGPUCommandBufferCallbackErrorBackgroundExecutionNotPermitted). Execution of the command buffer was aborted due to an error during execution. Will this warning cause the system to tighten the app's background permissions, for example by limiting or shortening the app's background survival time? In addition, will background refresh stop working, causing Bluetooth iBeacon activation to fail?
2
0
276
Aug ’25
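One possible mitigation for the report above, sketched under the assumption of an MTKView-based renderer (the class and its wiring are illustrative, not an Apple-recommended API): pause the draw loop as soon as the app enters the background, so that any later resource teardown never commits command buffers while backgrounded.

import MetalKit
import UIKit

/// Pauses GPU submission while the app is backgrounded so that resource
/// teardown never commits command buffers from the background.
final class BackgroundAwareRenderer {
    private let view: MTKView
    private var observers: [NSObjectProtocol] = []

    init(view: MTKView) {
        self.view = view
        let center = NotificationCenter.default
        observers.append(center.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                                            object: nil, queue: .main) { [weak self] _ in
            // Stop the draw loop before unloading anything, so no new
            // command buffers reach the GPU while the app is backgrounded.
            self?.view.isPaused = true
        })
        observers.append(center.addObserver(forName: UIApplication.willEnterForegroundNotification,
                                            object: nil, queue: .main) { [weak self] _ in
            self?.view.isPaused = false
        })
    }

    deinit {
        observers.forEach { NotificationCenter.default.removeObserver($0) }
    }
}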
Float64 (Double Precision) Support on MPS with PyTorch on Apple Silicon?
Hi everyone, This project uses PyTorch on an Apple Silicon Mac (M1/M2/etc.), and the goal is to use the MPS backend for GPU acceleration. However, the workflow depends on Float64 (double-precision) floating-point numbers for certain computations. The error "Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead" has been encountered; it seems the MPS backend doesn't currently support Float64 for direct GPU computation. Questions for the community: Are there any known workarounds or best practices for handling Float64-dependent operations when using the MPS backend with PyTorch? For those working with high-precision tasks on Apple Silicon, what strategies are being used to balance performance with the need for Float64? Offloading to the CPU is an option; are there any specific techniques or libraries within the Apple ecosystem that could streamline this process while keeping performance acceptable? Any insights, tips, or experiences would be appreciated. Thanks in advance, Jonaid. MacBook Pro M3 Max
2
1
478
Oct ’25
iPhone limited to 60 Hz frame rate
Just wondering if anyone knows what it takes to go above 60 Hz when targeting iPhone. If I set the preferredFramesPerSecond of an MTKView to 120, it works on the iPad, but on iPhone it never goes over 60 Hz, even with a simple hello-triangle sample app. Is this a limitation of targeting iPhone?
2
0
220
Sep ’25
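For reference on the 60 Hz cap: Apple's ProMotion documentation requires iPhone apps to opt in via the CADisableMinimumFrameDurationOnPhone Info.plist key before frame rates above 60 Hz are delivered, and only ProMotion (Pro-model) iPhones can exceed 60 Hz at all. A minimal sketch:

import MetalKit

// Info.plist (required on iPhone before rates above 60 Hz are delivered):
//   <key>CADisableMinimumFrameDurationOnPhone</key>
//   <true/>
func configureHighRefresh(_ view: MTKView) {
    // Request 120 Hz; the system still caps this at the display's maximum
    // (60 Hz on non-ProMotion iPhones) and may lower it for thermal or power reasons.
    view.preferredFramesPerSecond = 120
}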
Metal HUD Display Value Range
I can't seem to get the Metal HUD to display value ranges (pre-Tahoe, i.e. before macOS 26). The documented environment variable MTL_HUD_SHOW_VALUE_RANGE doesn't seem to work: https://developer.apple.com/documentation/xcode/monitoring-your-metal-apps-graphics-performance#Display-the-value-range-of-metrics Is anyone having any luck?
2
0
348
Sep ’25
Pink screen on MTLCommandBuffer.presentDrawable.
I rewrote my graphics pipeline to make better use of load/store actions for the clear and don't-care cases. All my tests pass, and in the Metal debugger all the draw calls succeed. But when I present drawables (before [commandBuffer commit]) I only get a pink screen. I've tried everything I can think of, such as making sure the pixel formats of the back buffer and my render targets match, but it's still pink. Could you point me in the right direction so I can fix this, or help explain why it's pink? That would be really helpful. Thank you, Brian Hapgood
2
0
411
Sep ’25
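A hedged sketch related to the pink-screen question above: one common cause is a color attachment loaded with .dontCare without a full-screen draw, or a final pass that does not store its results. The configuration below is illustrative, not a diagnosis of this particular renderer:

import Metal
import QuartzCore

func encodeFinalPass(commandBuffer: MTLCommandBuffer,
                     drawable: CAMetalDrawable,
                     pipeline: MTLRenderPipelineState) {
    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = drawable.texture
    // .dontCare on load is only safe if every pixel is overwritten by the draw;
    // otherwise the drawable contains undefined (often magenta/pink) data.
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
    // The final pass feeding the drawable must store its results.
    pass.colorAttachments[0].storeAction = .store

    guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) else { return }
    encoder.setRenderPipelineState(pipeline)
    // ... draw calls; the pipeline's colorAttachments[0].pixelFormat must match
    // drawable.texture.pixelFormat ...
    encoder.endEncoding()

    commandBuffer.present(drawable)
    commandBuffer.commit()
}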
receivedTurnEventForMatch giving stale data
In my turn-based game, I receive GKListener event receivedTurnEventForMatch and decode the match.matchData. On occasion, the matchData is clearly stale and is from the previous turn. If I call the MatchMaker ViewController up and select that same match, the data is not stale, so it's not a matter of not calling endTurn. I have tried both loadMatchWithID and loadMatchesWithCompletionHandler after receiving the receivedTurnEventForMatch, but the data is still stale. Advice?
2
0
747
Jan ’26
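For reference on the stale-data question above, a sketch of the usual handling (names such as applyState are placeholders): decode from the match object delivered with the event, and re-fetch by ID as a fallback. The poster reports that even the re-fetch returns stale data, so this is shown only to make the setup concrete:

import GameKit

final class MatchEventHandler: NSObject, GKLocalPlayerListener {
    // Placeholder for the game's own state decoding.
    func applyState(_ data: Data) { }

    func player(_ player: GKPlayer, receivedTurnEventFor match: GKTurnBasedMatch, didBecomeActive: Bool) {
        // Prefer the match instance delivered with the event.
        if let data = match.matchData {
            applyState(data)
        }
        // Fallback: re-fetch the match by ID in case the pushed object is stale.
        GKTurnBasedMatch.load(withID: match.matchID) { fresh, error in
            guard error == nil, let freshData = fresh?.matchData else { return }
            self.applyState(freshData)
        }
    }
}

The handler is registered once, after authentication, with GKLocalPlayer.local.register(_:).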
GameKit Leaderboards not working in iOS 26.2
Leaderboards working fine in iOS 26.1 but seem to be broken in 26.2 and also in the 26.3 developer beta. Players cannot submit scores and neither can they view scores on Apple's default leaderboards. Custom leaderboards that rely on pulling information using GameKit APIs also fail. Is there a workaround or patch for this?
2
0
492
Jan ’26
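For anyone comparing behavior across OS versions, the standard GameKit submission call (the class-method API introduced in iOS 14) is sketched below; the leaderboard ID is a placeholder:

import GameKit

func submitScore(_ score: Int) {
    // "com.example.highscores" is a placeholder leaderboard ID.
    GKLeaderboard.submitScore(score,
                              context: 0,
                              player: GKLocalPlayer.local,
                              leaderboardIDs: ["com.example.highscores"]) { error in
        if let error {
            // On the affected OS builds, this is reportedly where submission fails.
            print("Score submission failed: \(error)")
        }
    }
}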
Error this SDK is not supported by the compiler - Xcode version 16.2 (16C5032a) to 16.3(16E140)
In Xcode 16.2 (16C5032a) I created: one [ScenekitApp] [File/New/Project/iOS/Game/Game Technology: SceneKit] and three custom frameworks [File/New/Project/iOS/Framework] [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework]. In all four projects, [Minimum Deployments=iOS 15.0], [Swift version=6.0] and [BUILD_LIBRARY_FOR_DISTRIBUTION=Yes]. I installed [Zip.framework] directly in [ASCSource.framework] without [pod init/pod install], and in the [General tab] under [Frameworks, Libraries and Embedded Content] set [Zip.framework Embed & Sign]. I installed the three frameworks [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework] in [ScenekitApp], and everything works perfectly in Xcode 16.2 (16C5032a). I updated Xcode 16.2 (16C5032a) to Xcode 16.3 (16E140) and some SDK errors were reported: [Failed to build module 'ASCSource'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK]. [Failed to build module 'Zip'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK]. If I recompile all the frameworks in Xcode 16.3 (16E140) it works, but I don't think that is a good solution. I found some discussions at links like the following, without success: https://github.com/firebase/firebase-ios-sdk/issues/13727 https://stackoverflow.com/questions/70556401/swift-version-conflict-this-sdk-is-not-supported-by-the-compiler-please-select Any help is welcome. Thanks everyone!
2
0
514
Apr ’25
RealityKit .kinematic + collisions (visionOS)
Hi everyone, I'm new to visionOS development. I'm trying to create a physics-based scene (with gravity) where users can pick up and move objects on a workbench. I am struggling with physics interactions during the drag gesture. Kinematic mode: if I switch to .kinematic during the drag, the object moves smoothly but clips through other objects (no collisions). Dynamic mode: I tried keeping it .dynamic and applying a linear velocity toward the hand position, but the movement feels laggy and unresponsive. Hybrid approach: I tried switching to .kinematic during DragGesture.onChanged and back to .dynamic on collision, but this causes the entity to jitter/shake violently when touching other objects. Has anyone found a clean way to drag objects while maintaining solid collisions? Thanks for your help!
2
0
767
Feb ’26
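One commonly suggested pattern for the drag question above, sketched and not authoritative: keep the body .dynamic so collisions stay solid, and steer it each DragGesture.onChanged update with a spring-like force toward the gesture location instead of setting the transform directly. All names and tuning values are illustrative, and this assumes a ModelEntity with PhysicsBodyComponent and CollisionComponent already set up:

import RealityKit

// Call from DragGesture(...).targetedToAnyEntity().onChanged, with `target`
// already converted to world space.
func steer(_ body: ModelEntity, toward target: SIMD3<Float>, stiffness: Float = 30) {
    // Spring-like force proportional to the remaining distance: strong when far,
    // gentle when close, so the body tracks the hand without teleporting.
    let displacement = target - body.position(relativeTo: nil)
    body.addForce(displacement * stiffness, relativeTo: nil)
}

// Call from .onEnded so the body settles back under gravity.
func endDrag(_ body: ModelEntity) {
    body.clearForcesAndTorques()
}

Increasing the body's linear damping (or adding a velocity-dependent term to the force) reduces the laggy, rubber-band feel the post describes.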
TapGesture stops responding on ViewAttachmentComponent after disabling or removing and re-adding the Entity (visionOS 26)
Issue: When an Entity with a ViewAttachmentComponent is disabled using isEnabled = false, or removed using removeFromParent(), and then enabled or added back again, the attached SwiftUI view is rendered correctly, but tap interactions stop working. Specifically, Button actions inside the attached view do not fire, and TapGesture closures on child views do not respond.
Expected behavior: Tap interactions inside the attached view should continue to work after the Entity is re-enabled or re-added.
Actual behavior: After being disabled or removed once, all tap interactions stop responding.
Comparison: When displaying the same SwiftUI view using RealityViewAttachments, this issue does not occur. Removing and re-displaying the attachment still allows taps to work correctly.
Reproduction: The sample code below reproduces the issue. A RealityView contains an Entity that has a ViewAttachmentComponent; the attached SwiftUI view contains a Toggle; the toggle updates isEnabled on the Entity; after toggling off and on, tap interactions stop responding.
Environment: Xcode 26, visionOS 26.
Question: Is this expected behavior of ViewAttachmentComponent, or a bug? Is there a recommended way to temporarily hide or disable an Entity with ViewAttachmentComponent without breaking tap interactions?

import SwiftUI
import RealityKit

struct GestureTestView: View {
    @State var sampleEnabled = true
    @State var sampleEntity: Entity?

    var body: some View {
        RealityView { contents, attachments in
            // After deleting and re-displaying it, taps no longer respond.
            let sample = Entity(components: ViewAttachmentComponent(rootView: SampleView()))
            // Executed successfully
            //let sample = attachments.entity(for: "SampleView")!
            contents.add(sample)
            sample.position = [0, 1.2, -1]
            sampleEntity = sample

            let toggleButton = Entity(components: ViewAttachmentComponent(rootView: ToggleButtonView(isOn: $sampleEnabled)))
            contents.add(toggleButton)
            toggleButton.position = [0, 1, -1]
        } update: { _, _ in
            // run update closure
            print(sampleEnabled)
            // update sample entity enable
            sampleEntity?.isEnabled = sampleEnabled
        } attachments: {
            Attachment(id: "SampleView") {
                SampleView()
            }
        }
    }
}

struct ToggleButtonView: View {
    @Binding var isOn: Bool
    var body: some View {
        VStack {
            Toggle(isOn: $isOn) {
                Text("Toggle")
            }
        }
        .padding()
        .glassBackgroundEffect()
    }
}

struct SampleView: View {
    var body: some View {
        VStack {
            Button {
                print("Hello, World!")
            } label: {
                Text("Hello, World!")
                    .padding()
            }
        }
        .padding()
        .glassBackgroundEffect()
    }
}

#Preview(immersionStyle: .mixed) {
    GestureTestView()
}
2
0
526
Jan ’26
Xcode 26.x Metal 4 classes do not compile
Hi, I am using Xcode 26.x, but my Metal 4 classes are not compiling. I downloaded the sample code from Apple's website: https://developer.apple.com/documentation/Metal/processing-a-texture-in-a-compute-function. For example, I am getting errors like "Cannot find protocol declaration for 'MTL4CommandQueue'". I have hit a deadline, so any recommendations are very welcome. I have downloaded the Metal Toolchain. When I run the following commands in the terminal (xcodebuild -showComponent metalToolchain ; xcrun -f metal ; xcrun metal --version) I get the following response:
Asset Path: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/86fbaf7b114a899754307896c0bfd52ffbf4fded.asset/AssetData
Build Version: 17A321
Status: installed
Toolchain Identifier: com.apple.dt.toolchain.Metal.32023
Toolchain Search Path: /Users/private/Library/Developer/DVTDownloads/MetalToolchain/mounts/86fbaf7b114a899754307896c0bfd52ffbf4fded
/Users/private/Library/Developer/DVTDownloads/MetalToolchain/mounts/86fbaf7b114a899754307896c0bfd52ffbf4fded/Metal.xctoolchain/usr/bin/metal
Apple metal version 32023.830 (metalfe-32023.830.2)
Target: air64-apple-darwin24.6.0
Thread model: posix
InstalledDir: /Users/private/Library/Developer/DVTDownloads/MetalToolchain/mounts/86fbaf7b114a899754307896c0bfd52ffbf4fded/Metal.xctoolchain/usr/metal/current/bin
2
0
1.2k
Jan ’26
RealityKit, Attachments - not working
The simplest RealityView { content, attachments in ... } causes "Contextual closure expects 1 argument but 2 were used in closure body". I have checked every example and I cannot understand why I get this error regardless of the content. Note: when I add Attachment(id: "test") to the attachments closure, I get "Attachment not in scope". I have imported both RealityKit and SwiftUI.
2
0
361
Dec ’25
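For the closure-arity error above, the two-parameter make closure is only available on the RealityView initializer that also takes an attachments: trailing closure (available on visionOS; availability elsewhere should be checked against the SDK). A minimal sketch of that form:

import SwiftUI
import RealityKit

struct AttachmentExample: View {
    var body: some View {
        RealityView { content, attachments in
            // The (content, attachments) signature requires the attachments: closure
            // below; without it the initializer passes only `content`, hence the
            // "expects 1 argument but 2 were used" error.
            if let label = attachments.entity(for: "test") {
                label.position = [0, 1.5, -1]
                content.add(label)
            }
        } attachments: {
            Attachment(id: "test") {
                Text("Hello")
                    .padding()
            }
        }
    }
}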
RealityView postProcess effect depth texture
Hello, a question about the iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do: encoder.setTexture(context.sourceDepthTexture, index: 1) and then in the kernel: texture2d<float, access::read> depthIn [[texture(1)]] ... outTexture.write(depthIn.read(gid), gid); but I consistently see all black rendered to the view. The postProcess shader works, so that's not the issue; it just seems not to be receiving actual depth information. (If I set a breakpoint at the encoder setTexture step, I can preview the color texture of the scene, but the context's depthTexture looks all NaN / blank.) I've looked at all the WWDC samples, but they use ARView for all the depth sample code, which has a different set of configuration options than RealityView. So far I haven't seen anywhere to explicitly tell RealityView to include depth information, so I'm not sure if I'm missing something there. It appears that a depth texture is indeed being passed, but it looks blank. Is there a working example somewhere that we can reference?
2
0
649
Nov ’25
Metal 4: When is it ok to dealloc a MTLBuffer's memory
I have something like this drawing in an MTKView (see the code at the bottom). I am finding it difficult to figure out when the Swift-land resources used in making the MTLBuffer(s) can be released. Below, for example, is it OK if args goes out of scope (or is otherwise deallocated) at point 1, 2, or 3? Or perhaps even earlier, as soon as argsBuffer has been created? I have been reading through various articles such as "Setting resource storage modes", "Choosing a resource storage mode for Apple GPUs", and "Copying data to a private resource", but it's a lot to absorb and I haven't really been able to find an authoritative description of the required lifetime of the resources in CPU land. I should mention that this is Metal 4 code. In previous versions of Metal, the MTLCommandBuffer had the ability to add a completion handler to be called after the GPU has finished running the commands in the buffer, but in Metal 4 there is no such thing (if it were even needed for this purpose). Any advice and/or pointers to the definitive literature will be appreciated.

guard let argsBuffer = device.makeBuffer(bytes: &args,...
argumentTable.setAddress(argsBuffer.gpuAddress, ...
encoder.setArgumentTable(argumentTable, stages: .vertex)
// encode drawing
renderEncoder.draw...
...
encoder.endEncoding()  // 1
commandBuffer.endCommandBuffer()  // 2
commandQueue.waitForDrawable(drawable)
commandQueue.commit([commandBuffer])  // 3
commandQueue.signalDrawable(drawable)
drawable.present()
2
0
238
Jan ’26
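On the specific case above: makeBuffer(bytes:length:options:) is documented to allocate new storage and copy the supplied bytes into it, so the Swift-side args only needs to remain valid until that call returns; it is the bytesNoCopy variant that requires the caller's memory to outlive the buffer. A small illustrative sketch (the Args struct is hypothetical):

import Metal

struct Args {                       // hypothetical argument struct, for illustration only
    var scale: Float = 1
    var offset: SIMD2<Float> = .zero
}

func makeArgsBuffer(device: MTLDevice) -> MTLBuffer? {
    var args = Args()
    // makeBuffer(bytes:length:options:) copies `args` into storage owned by the
    // returned MTLBuffer, so `args` may go out of scope as soon as the call returns,
    // before any encoding or commit happens.
    let buffer = device.makeBuffer(bytes: &args,
                                   length: MemoryLayout<Args>.stride,
                                   options: .storageModeShared)
    // By contrast, makeBuffer(bytesNoCopy:length:options:deallocator:) wraps
    // caller-owned memory, which must then stay valid for the buffer's lifetime.
    return buffer
}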
Game Center Access Point does not appear on iOS 26 (Simulator)
Attempting to bring up the access point yields the following error log:
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Could not create endpoint for service name: com.apple.GameOverlayUI.dashboard-service
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Could not create endpoint for service name: com.apple.GameOverlayUI.dashboard-service
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
The same code (a single line setting 'active' to true) works on physical devices and on the simulator in iOS 18.6. I haven't been able to find any mention of this issue online. Any suggestions or help greatly appreciated.
2
1
808
Feb ’26
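For context on the report above, the activation in question is typically just the standard GKAccessPoint call after authentication, sketched below; the failure on the iOS 26 simulator appears to be in the system overlay service rather than in app code:

import GameKit
import UIKit

func enableAccessPoint() {
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        guard error == nil else { return }
        // Standard access-point activation; per the report it works on device
        // and on the iOS 18.6 simulator, but not on the iOS 26 simulator.
        GKAccessPoint.shared.location = .topLeading
        GKAccessPoint.shared.isActive = true
    }
}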
findNavigator
How can I paste a string into the findNavigator of a TextEditor?
2
0
790
Dec ’25
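A minimal sketch for the findNavigator question above: SwiftUI's findNavigator(isPresented:) modifier (iOS 16 and later) shows or hides the system find interface on a TextEditor, but as far as I can tell there is no public API to programmatically inject a search string into it, so a pasted string would have to come from the user or the pasteboard:

import SwiftUI

struct FindableEditor: View {
    @State private var text = "Some text to search."
    @State private var isFindPresented = false

    var body: some View {
        NavigationStack {
            TextEditor(text: $text)
                // Presents the system find/replace UI; its search field contents
                // are not programmatically settable through this modifier.
                .findNavigator(isPresented: $isFindPresented)
                .toolbar {
                    Button("Find") { isFindPresented = true }
                }
        }
    }
}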
Raytracing on the Vision Pro M5
Is there any support, or plans for support, for ray-traced reflections in RealityKit on the Vision Pro M5? I cannot find any documentation regarding this topic.
2
0
496
Nov ’25