Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Post

Replies

Boosts

Views

Activity

OS choosing performance state poorly for GPU use case
I am building a macOS desktop app (https://anukari.com) that uses Metal compute to do real-time audio/DSP processing, as I have a problem that is highly parallelizable and too computationally expensive for the CPU. However, it seems that with the way I am using the GPU, even when my app is fully compute-limited, the OS never increases the power/performance state. Because this is a real-time audio synthesis application, it's a huge problem not to be able to take advantage of the full clock speeds the GPU is capable of, because the app can't keep up with real-time.

I discovered this issue while profiling the app using Instruments' Metal tracing (and Game tracing) modes. In the profiling configuration under "Metal Application" there is a drop-down to select the "Performance State." If I run the application under Instruments with Performance State set to Maximum, it runs amazingly well, and all my problems go away. For comparison, when I run the app on its own, outside of Instruments, the expensive GPU computation takes around 2x as long to complete, meaning the app performs half as well.

I've done a ton of work to micro-optimize my Metal compute code, based on every scrap of information from the WWDC videos, etc. A problem I'm running into is that the more efficient I make my code, the less it signals to the OS that I want high GPU clock speeds! I think part of why the OS is confused is that in most use cases, my computation can be done using only a small number of Metal threadgroups. I'm guessing that the OS heuristics see that only a small fraction of the GPU is saturated and fail to scale up the power/clock state.

I'm not sure what to do here; I'm in a bit of a bind. One possibility is to intentionally schedule busy work -- spin threadgroups just to waste energy and signal to the OS that I need higher clock speeds. This is obviously a really bad idea, but it might work. Is there any other (better) way for my app to signal to the OS that it is doing real-time, latency-sensitive computation on the GPU and needs the clock speeds to be scaled up?

Note that Game Mode is not really an option, as my app also runs as an AU plugin inside hosts like GarageBand, so it can't be made fullscreen, etc.
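For anyone reproducing the same comparison outside Instruments, a minimal sketch of measuring per-command-buffer GPU time with MTLCommandBuffer's gpuStartTime/gpuEndTime; the queue setup and the printed label are illustrative, not from the original post:

import Metal

// Sketch: log how long the GPU actually spends on one compute command buffer.
// gpuEndTime - gpuStartTime is the on-GPU duration in seconds; in the scenario
// described above it would roughly double when the OS keeps the GPU at a low
// performance state.
let device = MTLCreateSystemDefaultDevice()!
let commandQueue = device.makeCommandQueue()!
let commandBuffer = commandQueue.makeCommandBuffer()!
// ... encode the real compute pass here ...
commandBuffer.addCompletedHandler { buffer in
    let gpuMilliseconds = (buffer.gpuEndTime - buffer.gpuStartTime) * 1000.0
    print("GPU time: \(gpuMilliseconds) ms")
}
commandBuffer.commit()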
6
0
1k
May ’25
Game Center SignIn alert appears on macOS 26 (Tahoe) without capability or entitlement
Hello, On macOS 26 (Tahoe), when building an OSX app that includes GameKit code, calling GKLocalPlayer.local.authenticateHandler shows the "Sign In to Game Center" alert (e.g. didShowFullscreenSignIn) — even if the app does not have the Game Center capability enabled or any related entitlement (com.apple.developer.game-center). This alert only appears when the user is not signed in to Game Center in System Settings. However, when testing the same code path in an iOS app built with macOS 26 (Tahoe), the alert does not appear unless the proper capability and entitlement are included. This behavior is different from macOS 15 (Sequoia) + Xcode 15.x: prior to the update, Game Center features did not work at all in the OSX app without the capability and entitlements.

Steps to Reproduce
Create a new OSX app target (App Sandbox enabled, no Game Center capability).
Add minimal GameKit code: GKLocalPlayer.local.authenticateHandler = { _, _, _ in }
Build the OSX app and run on macOS 26 (Tahoe).
Ensure Game Center is signed out in System Settings.
Observe: “Sign In to Game Center” alert appears automatically.

Expected Behavior
When the Game Center capability and entitlement are not present, authenticateHandler should fail silently, and no sign-in alert should appear.

Actual Behavior
In the OSX app, the Game Center sign-in UI appears even without any Game Center capability or entitlement. In the iOS app, this alert does not appear.

*Build Configuration: built under the same conditions (macOS 26 + Xcode 26).

Question
Could you please confirm whether this behavior is an intentional change in macOS 26 or a bug specific to OSX apps in the GameKit authentication flow? Thank you.
2
0
529
Oct ’25
Xcode compile gets stuck (stops) when adding new source files or functions to existing files
Hello, we are working on an iOS game project, and as it progresses the project grows larger and larger. Because we are using other game dependencies and libraries, "larger and larger" here refers to the whole project; our own source files integrated and compiled by Xcode are not many. Now it seems we have hit a bottleneck: when I add new files, or add functions to existing files to implement a new feature, the Xcode compile gets stuck (stops). It sits at "Indexing | Initializing datastore" forever and cannot produce a final build. macOS 15.1, Xcode 16.2. Can you provide any solutions to this problem? Also submitted Feedback ID #FB18432749
4
0
727
Aug ’25
Compute kernel fails to compile when calling texture.read()
If I compile a compute kernel with a call to texture.read(), it fails with the following error: "Error Domain=AGXMetalG13X Code=3 "Encountered unlowered function call to air.get_read_sampler" UserInfo={NSLocalizedDescription=Encountered unlowered function call to air.get_read_sampler}." This error occurs on both macOS and iOS 26 Beta 5, but not when running on a simulator or in a playground. It does not occur on a macOS Sequoia VM. It occurs whether I use the old Metal 3 or the new Metal 4 compilation method. A workaround would be to use a sampler, but according to the feature tables, all platforms support reading from textures of all formats. Below is a minimal example which produces the error:

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let computeFunction = library.makeFunction(name: "compute_test")!
do {
    let pipeline = try device.makeComputePipelineState(function: computeFunction)
    debugPrint(pipeline)
} catch {
    debugPrint("Metal 3 failed with error:\n\(error)")
}

#import <metal_stdlib>
using namespace metal;

kernel void compute_test(uint2 gid [[thread_position_in_grid]],
                         texture2d<float, access::read> in [[texture(0)]],
                         texture2d<float, access::write> out [[texture(1)]])
{
    out.write(in.read(gid), gid);
}

I filed feedback FB19530049.
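For reference, a sketch of the sampler-based workaround mentioned above: a pixel-coordinate sampler with nearest filtering behaves like read() for most formats. The kernel and sampler names are illustrative, and this is only an assumed fallback, not a confirmed fix for the AIR lowering error:

#include <metal_stdlib>
using namespace metal;

// Workaround sketch: replace read() with a nearest-neighbor sample at pixel coordinates.
constexpr sampler pixelSampler(coord::pixel, address::clamp_to_edge, filter::nearest);

kernel void compute_test_sampled(uint2 gid [[thread_position_in_grid]],
                                 texture2d<float, access::sample> in [[texture(0)]],
                                 texture2d<float, access::write> out [[texture(1)]])
{
    // Guard against grids larger than the output texture.
    if (gid.x >= out.get_width() || gid.y >= out.get_height()) return;
    out.write(in.sample(pixelSampler, float2(gid)), gid);
}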
1
0
234
Aug ’25
CGContext PDF/A intents
let dic : [AnyHashable:Any] = [
    kCGPDFXRegistryName: "http://www.color.org" as CFString,
    kCGPDFXOutputConditionIdentifier: "FOGRA43" as CFString,
    kCGPDFContextOutputIntent: "GTS_PDFX" as CFString,
    kCGPDFXOutputIntentSubtype: "GTS_PDFX" as CFString,
    kCGPDFContextCreateLinearizedPDF: "" as CFString,
    kCGPDFContextCreatePDFA: "" as CFString,
    kCGPDFContextAuthor: "Placeholder" as CFString,
    kCGPDFContextCreator: "Placeholder" as CFString
]

Hello, I would now like to export my PDFs as PDF/A. As far as I can tell, Core Graphics has the right option for this. Unfortunately, the documentation does not show what value is required for 'kCGPDFContextCreatePDFA' or 'kCGPDFContextCreateLinearizedPDF'. What I have already tried: GTS_PDFA1, PDF/A-1, true as CFString. (Above is my CFDictionary; ...Author etc. are working perfectly.) In the Finder you can see these two options, which I would also like to implement in my app. Thank you in advance!
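For context, a sketch of how that dictionary is normally handed to Core Graphics when the PDF context is created. The output URL, page size, and the choice to pass kCGPDFContextCreatePDFA as a boolean are assumptions on my part, since the documentation does not state the expected value type:

import CoreGraphics
import Foundation

// Sketch: create a PDF context with an auxiliary-info dictionary like the one above.
let url = URL(fileURLWithPath: "/tmp/output.pdf") as CFURL      // placeholder path
var mediaBox = CGRect(x: 0, y: 0, width: 595, height: 842)      // A4 in points

let auxInfo: [AnyHashable: Any] = [
    kCGPDFContextCreatePDFA: true,                // assumed boolean, not a string
    kCGPDFXRegistryName: "http://www.color.org",
    kCGPDFXOutputConditionIdentifier: "FOGRA43",
    kCGPDFXOutputIntentSubtype: "GTS_PDFX",
    kCGPDFContextAuthor: "Placeholder"
]

if let consumer = CGDataConsumer(url: url),
   let context = CGContext(consumer: consumer, mediaBox: &mediaBox, auxInfo as CFDictionary) {
    context.beginPDFPage(nil)
    // ... draw page content ...
    context.endPDFPage()
    context.closePDF()
}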
1
0
227
Jun ’25
RealityKit not behaving as expected
This week, I developed a small multiplatform RealityKit project. I also created a demo scene in Reality Composer Pro. Afterward, I imported the local package into the project. Running the project on macOS works perfectly. However, when I tried to run it on my iPhone, I encountered a permission error indicating that it couldn’t read the package. This seems unusual to me because I assumed that the dependency is bundled into the binary file. In an attempt to resolve the issue, I pushed the RCP package to GitHub, hoping it would work. Fortunately, everything compiles successfully now, but the loading time is significantly long, and the animations don’t play on tap gestures. Could someone please help me identify the root cause of this problem?
1
0
367
Jul ’25
Will OpenGL API and Drivers be removed after appleOS 26?
Hi, I am a multimedia and graphics researcher and I am wondering if the OpenGL API and drivers will be removed after appleOS 26 (macOS 26, iOS 26, iPadOS 26, visionOS 26). I am asking this because most of the libraries I use depend on OpenGL, like CGAL, libigl, immediate-mode UI, nanovg, nanogui, and Bullet physics. Transitioning to Vulkan or Metal while using and learning those libraries is just not viable. I am the sole developer, so I would like to ask. Regards.
1
0
255
Aug ’25
Request: API to temporarily defer Reachability for fullscreen game swipes (bottom edge)
In swipe-driven games, a first downward swipe starting near the home indicator can trigger Reachability, even when using preferredScreenEdgesDeferringSystemGestures = .bottom and prefersHomeIndicatorAutoHidden = true. This causes the app to slide down to half screen and breaks gameplay. Please consider an API to temporarily defer Reachability while a custom gesture is active (similar to the existing system-gesture deferral; see the sketch below), without disabling accessibility globally.

Environment: iPhones with a Home Indicator (Face ID).

Why this matters: bottom-origin swipes are core in many games (flick shots, slingshot, physics toss, bottom sheets). Current workarounds degrade UX and discoverability, and players still accidentally trigger Reachability.

Feedback Assistant Post
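For reference, a sketch of the deferral that is already available today; the view controller name is illustrative. This is the behavior the request above starts from, not the requested Reachability deferral, since Reachability is not covered by these overrides:

import UIKit

// Sketch: what a fullscreen game view controller can already do today.
// Defers the system's bottom-edge gesture and hides the home indicator,
// but does not prevent Reachability from triggering on a downward swipe.
final class GameViewController: UIViewController {
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge { .bottom }
    override var prefersHomeIndicatorAutoHidden: Bool { true }

    override func viewDidLoad() {
        super.viewDidLoad()
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
        setNeedsUpdateOfHomeIndicatorAutoHidden()
    }
}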
2
0
686
Aug ’25
How to detect which entity was tapped?
Hi, I'm rewriting my game from SceneKit to RealityKit, and I'm having trouble implementing the following scenario: I tap on the iPhone screen to select an Entity that I want to drag. If an Entity was tapped, it should then be possible to drag it left, right, etc.

SceneKit solution:

func CGPointToSCNVector3(_ view: SCNView, depth: Float, point: CGPoint) -> SCNVector3 {
    let projectedOrigin = view.projectPoint(SCNVector3Make(0, 0, Float(depth)))
    let locationWithz = SCNVector3Make(Float(point.x), Float(point.y), Float(projectedOrigin.z))
    return view.unprojectPoint(locationWithz)
}

and then I was calling:

SCNView().hitTest(location, options: [SCNHitTestOption.firstFoundOnly: true])

The code was called inside the UIPanGestureRecognizer in my UIViewController. Could I reuse that code, or should I go with the SwiftUI approach, something like:

var body: some View {
    RealityView { .... }
        .gesture(TapGesture().onEnded { })
}

? I already have this code:

@State private var location: CGPoint?

.onTapGesture { location in
    self.location = location
}

I'm trying to identify the entity that was tapped within the RealityView like this:

RealityView { content in
    let box: ModelEntity = createBox() // for now there is only one box, however there will be many boxes
    content.add(box)
    let anchor = AnchorEntity(world: [0, 0, 0])
    content.add(anchor)
    _ = content.subscribe(to: SceneEvents.Update.self) { event in
        // TODO: find tapped entity, so that it could be dragged inside of the DragGesture()
    }
}

Any help would be appreciated. I also noticed that if I create a TapGesture like this:

TapGesture(count: 1)
    .targetedToAnyEntity()

and add it to my view using .gesture(), then it is not triggered.
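A sketch of the SwiftUI-side approach, assuming the iOS 18-era RealityView gesture pipeline. A likely reason the targeted gesture above never fires is that the entity needs both a collision shape and an InputTargetComponent before targetedToAnyEntity() can hit-test it. createBox() comes from the post; the view name and selection state are illustrative:

import SwiftUI
import RealityKit

struct TapToSelectView: View {
    @State private var selected: Entity?

    var body: some View {
        RealityView { content in
            let box = createBox()                         // from the original post
            box.generateCollisionShapes(recursive: true)  // collision shape for hit-testing
            box.components.set(InputTargetComponent())    // required for RealityView gestures
            content.add(box)
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    selected = value.entity               // the tapped entity
                }
        )
    }
}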
2
0
350
Aug ’25
RealityView postProcess effect depth texture
Hello, Question re: iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do:

encoder.setTexture(context.sourceDepthTexture, index: 1)

and then in the kernel:

texture2d<float, access::read> depthIn [[texture(1)]]
...
outTexture.write(depthIn.read(gid), gid);

And I consistently see all black rendered to the view. The postProcess shader works, so that's not the issue. It just seems to not be receiving actual depth information. (If I set a breakpoint at the encoder setTexture step, I can preview the color texture of the scene, but the context's depthTexture looks all NaN / blank.)

I've looked at all the WWDC samples, but they use ARView for all the depth sample code, which has a different set of configuration options than RealityView. So far I haven't seen anywhere to explicitly tell RealityView to "include the depth information", so I'm not sure if I'm missing something there. It appears that there is indeed a depth texture being passed, but it looks blank. Is there a working example somewhere that we can reference?
2
0
690
Nov ’25
vImageConverter_CreateWithCGImageFormat Fails with kvImageInvalidImageFormat When Trying to Convert CMYK to RGB
So I get JPEG data in my app. Previously I was using the higher-level NSBitmapImageRep API and just feeding the JPEG data to it. But now I've noticed that on Sonoma, if I get a JPEG in the CMYK color space, the NSBitmapImageRep renders mostly black and is corrupted. So I'm trying to drop down to the lower-level APIs. Specifically, I grab a CGImageRef and try to use the Accelerate API to convert it to another format (to hopefully work around the issue):

CGImageRef sourceCGImage = CGImageCreateWithJPEGDataProvider(jpegDataProvider,
                                                             NULL,
                                                             shouldInterpolate,
                                                             kCGRenderingIntentDefault);

Now I use vImageConverter_CreateWithCGImageFormat with the following values for source and destination formats:

Source format (derived from sourceCGImage):
bitsPerComponent = 8
bitsPerPixel = 32
colorSpace = (kCGColorSpaceICCBased; kCGColorSpaceModelCMYK; Generic CMYK Profile)
bitmapInfo = kCGBitmapByteOrderDefault
version = 0
decode = 0x000060000147f780
renderingIntent = kCGRenderingIntentDefault

Destination format:
bitsPerComponent = 8
bitsPerPixel = 24
colorSpace = (DeviceRGB)
bitmapInfo = 8197
version = 0
decode = 0x0000000000000000
renderingIntent = kCGRenderingIntentDefault

But vImageConverter_CreateWithCGImageFormat fails with kvImageInvalidImageFormat. Now if I change the destination format to use 32 bitsPerPixel and use alpha in the bitmap info, vImageConverter_CreateWithCGImageFormat does not return an error, but I get a black image just like with NSBitmapImageRep.
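Not the vImage path being debugged above, but for comparison, a sketch of a plain Core Graphics fallback that converts a CMYK-backed CGImage to RGB by drawing it into an sRGB bitmap context, letting Core Graphics handle the color conversion. The function name is illustrative:

import CoreGraphics

// Sketch: convert a CMYK CGImage to 8-bit RGBA by drawing through an sRGB bitmap context.
func convertToRGB(_ source: CGImage) -> CGImage? {
    let width = source.width
    let height = source.height
    guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
          let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    context.draw(source, in: CGRect(x: 0, y: 0, width: width, height: height))
    return context.makeImage()
}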
14
0
1.6k
Aug ’25
Why Large-Scale Model Scenes Cause Real Device Crashes
Is there any limitation in Vision Pro when loading scenes with large-scale models?

Test Case:
Asset: composite USDA file containing 10 individual models (total triangle count: ~4.2M)
Simulator: loads and renders correctly
Real device: loads the asset successfully but fails during the rendering phase: the environment abruptly dims and the system spontaneously reboots

How can we resolve this issue? Below are excerpted logs preceding the crash:

<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
Attempted to add ornament: <MRUIPlatterOrnament: 0x10a658f00; _isInternal: YES; _displaceWindowChrome: NO; _canCaptureUI: NO; _isBeingRemoved: NO; contentAnchorPoint3D: "{0.5, 0.5, 0}"; position: <MRUIPlatterOrnamentRelativePosition: 0x105b68e70; anchorPoint: {0.5, 0.5, 1}>; rotation: "{{0, 0, 0}, 0}"; opacity: 1.000000; canFollowUser: YES; effectiveOffset: "{0, 0, 0}"; presentingViewController: 0x0; billboardingBehavior: 0x0; scalingBehavior: 0x0; relativeToParent: NO; nonHeritableDepthDisplacement: 0.000000; order: 0.000000; _window._determinedSize: {0, 0}; _window: (null)> to nil or non-supporting UIScene: <UIWindowScene: 0x10a8a0000; role: UISceneSessionRoleImmersiveSpaceApplication; persistentIdentifier: test.test:SFBSystemService-BA3A21A3-D1AB-42E2-8AF0-AE0AB83BE528; activationState: UISceneActivationStateUnattached>. No action taken.
Failed to set dependencies on asset 2823930584475958382 because NetworkAssetManager does not have an asset entity for that id.
apply fence tx failed (client=0x98490e18) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xa86516e2) [0x10000003 (ipc/send) invalid destination port]
1
0
282
Jul ’25
App not showing in Game Center “All Activity” after release
Hello — I shipped an App Store build that signs in to Game Center using the Apple Unity Plugins (GameKit). The login banner appears, but my app still doesn't show up in Game Center's "All Activity" ("You started playing XXX 2d ago").

What I've done:
Call await GKLocalPlayer.Authenticate();
"Game Center" is enabled for the current version in App Store Connect
Confirmed: other App Store games do appear under "All Activity" on the same device/account

Timeline: this is the first version that enables Game Center (not the app's first release), and it has been about 2 hours since this build went live.

Questions:
Is authentication alone sufficient for "Recently Played," or is at least one Game Center component (leaderboards, achievements, activities, multiplayer) required?
Is there a typical propagation delay before "Recently Played" starts showing a newly enabled app/version?
Is there anything else I should configure in App Store Connect or entitlements to make "Recently Played" visible?

Thanks for any help.
2
0
602
Aug ’25
Is there any future for screensavers on macOS?
I haven't been looking at screensavers for a long time because of Apple's lack of will (or resources?) to provide a public version of the private modern SDK that Apple itself has been using for a very long time now. I'm now looking at the Screen Saver pane in System Settings (the What-If version of System Preferences in an alternate universe where all screens are in portrait mode). In macOS Sequoia, it seems like 3rd party screensavers are not welcome, considering that they are relegated to the "Other" section at the bottom of the list and you have to click Show All to start seeing 3rd party screen savers. I also had a quick look at macOS Tahoe Beta 3, and it looks like all the real screensavers are gone (3rd party and the ones from Apple: Hello, Message, Flurry, etc.), or at least it requires a Nobel Prize to find them (and the Search field is not useful). I tried to install a 3rd party screen saver on macOS Tahoe Beta 3; it doesn't show up in the list.

To summarize:
No public access to modern APIs AFAIK.
UI that is hostile to 3rd party screen savers on macOS Sequoia.
Apparently only screensavers that are slideshows or movies curated by Apple in macOS Tahoe b3.

Hence the question: is there any future for screen savers on macOS? Because if there's none, I won't waste my time trying to update some old screen savers.
3
0
602
Aug ’25
Unable to package in UE5.6
I'm new to the Mac side of things, but certainly not to UE. On Windows, packaging is a long process but it can be done. The documentation from Epic and from the internet is basically non-existent on exactly how to package a project within UE on Mac. I have Xcode installed (which makes sense), agreed to the terms, and installed for macOS. I've been able to work on a project for several weeks now and want to package a test build for my friends to play on Windows. Now I just get this in the log:

UATHelper: Packaging (Mac): ERROR: Failed to finalize the .app with Xcode. Check the log for more information
UATHelper: Packaging (Mac): Trace written to file /Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.uba with size 12.6kb
UATHelper: Packaging (Mac): Total time in Unreal Build Accelerator local executor: 8.12 seconds
UATHelper: Packaging (Mac): Result: Failed (OtherCompilationError)
UATHelper: Packaging (Mac): Total execution time: 9.71 seconds
PackagingResults: Error: Failed to finalize the .app with Xcode. Check the log for more information
UATHelper: Packaging (Mac): Took 9.77s to run dotnet, ExitCode=6
UATHelper: Packaging (Mac): UnrealBuildTool failed. See log for more details. (/Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.txt)
UATHelper: Packaging (Mac): AutomationTool executed for 0h 0m 10s
UATHelper: Packaging (Mac): AutomationTool exiting with ExitCode=6 (6)
UATHelper: Packaging (Mac): RunUAT ERROR: AutomationTool was unable to run successfully. Exited with code: 6
PackagingResults: Error: AutomationTool was unable to run successfully. Exited with code: 6
PackagingResults: Error: Unknown Error

This absolutely makes no sense to me. Anyone have ideas?
2
0
323
Jul ’25
Can spatial scene function be used outside of the Photo App?
I'm new here, so I don't know which topic this function belongs to... sorry about that! I watched the WWDC stream and I am really interested in this function; I'm wondering if it could be used in my apps. I looked up the documentation, but I found that it only supports visionOS (I'm not sure about that, but the demo I saw was based on visionOS).
2
0
332
Jun ’25
Core Text incremental redraw glitch: overlapping glyphs during editing
During editing in Pages (or Word) I am getting these glitches (see attachment). This started after the last update to macOS 26.3 (beta). I also removed 2 recent installs (BlackHole audio driver and kDrive/Infomaniak), but the trouble is still there.

27" iMac 2020 (Intel) i7 3.8 GHz
AMD Radeon Pro 5500 XT 8 GB
24 GB RAM
macOS Tahoe 26.3 (beta)

I tried restarting in safe mode and checked fonts. Talked to the assistant to get a solution, but no luck. Thanks for any advice, Pieter (not a developer, so please keep it simple 🙏🏻)
3
0
851
Feb ’26
Customize the Metal Performance HUD on Apple TV
Hi there, Is it possible to customize the Metal Performance HUD on Apple TV, similar to how it can be done on iPhone and iPad? I would like to see things like compiled shaders for my apps on tvOS.
3
0
385
Aug ’25
The problem of suspending the device in game mode
https://developer.apple.com/forums/profile/mozheralqahtani
1
0
373
Nov ’25
SwiftCharts to add candlestick and OHLC types to their stable?
I didn't find a suggestion box on Swift's website so I'll post it here. SwiftCharts are great but limited. I need more data on a single chart. Candlestick and OHLC type charts would be an excellent addition. Hopefully, influencers from Apple can make that happen. Thanks.
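Until a native mark exists, here is a rough sketch of how a candlestick can be approximated today with RuleMark and RectangleMark; the OHLC type, the view name, and the fixed body width are made up for illustration:

import SwiftUI
import Charts

// Illustrative OHLC sample type.
struct OHLC: Identifiable {
    let id = UUID()
    let date: Date
    let open: Double, high: Double, low: Double, close: Double
}

struct CandlestickChart: View {
    let quotes: [OHLC]   // illustrative input data

    var body: some View {
        Chart(quotes) { q in
            // High-low wick
            RuleMark(x: .value("Date", q.date),
                     yStart: .value("Low", q.low),
                     yEnd: .value("High", q.high))
            // Open-close body, colored by direction
            RectangleMark(x: .value("Date", q.date),
                          yStart: .value("Open", q.open),
                          yEnd: .value("Close", q.close),
                          width: .fixed(6))
                .foregroundStyle(q.close >= q.open ? Color.green : Color.red)
        }
    }
}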
3
0
320
Jun ’25