Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Post

Replies

Boosts

Views

Activity

CGSetDisplayTransferByTable is broken on macOS Tahoe 26.4 RC (and 26.3.1) with MacBook M5 Pro, Max and Neo
CGSetDisplayTransferByTable() is not working on the latest round of Mac hardware, namely the MacBook Neo (external display), the MacBook M5 Pro (both built-in and external display), and possibly the M5 Max. All tested apps (BetterDisplay, MonitorControl, f.lux, Lunar) exhibit the same issue on both macOS Tahoe 26.3 and macOS Tahoe 26.4 RC. Tested on multiple Macs and installations on the MacBook Neo and MacBook M5 Pro. This issue breaks several display-related macOS apps.

To reproduce the issue using an affected app:
1. Install the app BetterDisplay (https://betterdisplay.pro)
2. Launch the app, open the app menu, choose Image Adjustments, and try to adjust colors.
3. The adjustments take no effect.

To reproduce the issue programmatically, attempt to use the affected macOS API: https://developer.apple.com/documentation/coregraphics/cgsetdisplaytransferbytable(::::_:)

Here are the FB numbers:
FB22273730 (filed as a developer on an unaffected MBP M3 Max)
FB22273782 (filed from an affected MBP M5 Pro running 26.4 RC, with debug info attached)
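[Editor's note] For anyone who wants to exercise the API without a third-party app, here is a minimal hedged sketch (my own, not from the report): it applies a slightly dimmed linear gamma ramp to the main display and checks the returned error code.

import CoreGraphics

// Sketch: build a linear ramp scaled to 80% brightness and apply it.
let display = CGMainDisplayID()
let tableSize: UInt32 = 256
let ramp: [CGGammaValue] = (0..<tableSize).map {
    CGGammaValue($0) / CGGammaValue(tableSize - 1) * 0.8
}

let err = CGSetDisplayTransferByTable(display, tableSize, ramp, ramp, ramp)
print("CGSetDisplayTransferByTable:", err == .success ? "success" : "error \(err.rawValue)")
// On affected hardware the call reportedly returns success while the display
// stays unchanged. CGDisplayRestoreColorSyncSettings() restores the original ramp.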
4
3
1.8k
Mar ’26
How to load and draw texture with opacity in Metal
The background: I'm finally working to convert my very old Mac kaleidoscope application, ScopeWorks, which was written in OpenGL and Objective-C, to a Multiplatform app in SwiftUI and Metal. I'm using the MetalKit MTKView class, wrapped for SwiftUI as an NSViewRepresentable or UIViewRepresentable. I then provide an MTKViewDelegate that provides a draw method. The draw method fetches the current render pass descriptor, creates a command buffer, sets up a render pipeline, and does its drawing.

My renderer's makePipeline method looks like this:

func makePipeline() {
    let library = device.makeDefaultLibrary()
    let pipelineDesc = MTLRenderPipelineDescriptor()
    pipelineDesc.vertexFunction = library?.makeFunction(name: "vertex_main")
    pipelineDesc.fragmentFunction = library?.makeFunction(name: "fragment_main")
    pipelineDesc.colorAttachments[0].pixelFormat = .bgra8Unorm
    pipeline = try! device.makeRenderPipelineState(descriptor: pipelineDesc)
}

And my shaders look like this:

struct VertexOut {
    float4 position [[position]];
    float2 texCoord;
};

vertex VertexOut vertex_main(const device float2* position [[buffer(0)]],
                             uint vid [[vertex_id]]) {
    VertexOut out;
    float2 pos = position[vid];
    out.position = float4(pos, 0, 1);
    out.texCoord = pos * 0.5 + 0.5; // basic mapping
    return out;
}

fragment float4 fragment_main(VertexOut in [[stage_in]],
                              texture2d<float> tex [[texture(0)]],
                              constant float4& color [[buffer(1)]]) {
    constexpr sampler s(address::repeat, filter::linear);
    // float4 texColor = tex.sample(s, in.texCoord);
    // return texColor * color;
    float4 textureColor = {1, 2, 3, 4};
    if (all(color == textureColor)) {
        return tex.sample(s, in.texCoord);
    } else {
        return color;
    }
    // Sample the texture directly — no color tint applied
    return tex.sample(s, in.texCoord);
}

The first part of my MTKViewDelegate's draw method looks like this:

func draw(in view: MTKView) {
    guard let drawable = view.currentDrawable,
          let descriptor = view.currentRenderPassDescriptor,
          let pipeline = pipeline,
          let texture = texture
    else { return }

    let commandBuffer = commandQueue.makeCommandBuffer()!
    let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)!
    encoder.setRenderPipelineState(pipeline)
    encoder.setFragmentTexture(texture, index: 0)
    descriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 0, blue: 0, alpha: 1.0)

    // Draw six equilateral triangles forming the hexagon
    let radius: Float = 0.6
    for i in 0..<6 {
        let angle = Float(i) * (.pi / 3)
        let cosA = cos(angle)
        let sinA = sin(angle)
        let nextA = Float(i + 1) * (.pi / 3)
        let cosB = cos(nextA)
        let sinB = sin(nextA)
        let verts: [simd_float2] = [
            simd_float2(0, 0),
            simd_float2(radius * cosA, radius * sinA),
            simd_float2(radius * cosB, radius * sinB)
        ]
        encoder.setVertexBytes(verts, length: MemoryLayout<simd_float2>.stride * 3, index: 0)
        // Tell the fragment shader to use the texture color.
        var textureColor: simd_float4 = simd_float4(1, 2, 3, 4)
        encoder.setFragmentBytes(&textureColor, length: MemoryLayout<SIMD4<Float>>.stride, index: 1)
        encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)

One of the things the existing app does is load PNG or TIFF images with an alpha channel, and then overlay parts of the image on top of themselves flipped, so you get interesting Moiré patterns in the lines in the resulting kaleidoscope. For now I'm working on a single sample image, loading it into a texture in Metal, and just rendering it as a hexagon and drawing lines for the triangles that make up the hexagon.
(For now I'm using the vertex coordinates as the texture coordinates, so I get a hexagonal part of my texture rather than a single triangular part tessellated into a hexagon. I'll fix that later.) In both iOS and macOS I set the clear color to black at the beginning of the draw function.

The issue: The source image is mostly transparent, but with a lot of partly transparent pixels. Here's what it looks like in Photoshop, where you can see the transparent parts as a checkerboard pattern. (I tried to crop the original image to show the approximate part that I'm rendering in a hexagon, but it's not exact. Look for the same shapes in the different images to compare them.) When I render my hexagon in the Metal view in the iOS version of the app, it looks like it's forcing each pixel to fully opaque or fully transparent. And in the macOS version of the app, it seems to force ALL the pixels to opaque. I haven't shown all the setup code, because it's a lot. Is there some rendering mode setup I'm missing in order to get it to draw the pixels into the output based on their opacity, including partial opacity?
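[Editor's note] This symptom is consistent with blending never being enabled on the render pipeline, in which case fragment alpha is written to the target but never used to composite against what's already there. A hedged sketch of a pipeline setup with classic source-over blending (whether this is the poster's missing piece is an assumption):

import Metal

// Sketch: enable source-over alpha blending on the color attachment.
func makeBlendingPipeline(device: MTLDevice, library: MTLLibrary) throws -> MTLRenderPipelineState {
    let desc = MTLRenderPipelineDescriptor()
    desc.vertexFunction = library.makeFunction(name: "vertex_main")
    desc.fragmentFunction = library.makeFunction(name: "fragment_main")
    desc.colorAttachments[0].pixelFormat = .bgra8Unorm

    desc.colorAttachments[0].isBlendingEnabled = true
    desc.colorAttachments[0].rgbBlendOperation = .add
    desc.colorAttachments[0].alphaBlendOperation = .add
    // If the texture loader premultiplies alpha (MTKTextureLoader often does
    // for PNGs), use .one as the source factors instead of .sourceAlpha.
    desc.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
    desc.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
    desc.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
    desc.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha

    return try device.makeRenderPipelineState(descriptor: desc)
}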
2
0
930
Mar ’26
Does visionOS have the equivalent of ARView.physicsOrigin?
I'm trying to scale the physics of a scene without changing the apparent size, to avoid the low-speed zeroing-out of motion that the physics simulation does. I found a technique in the docs for using separate simulation and physics roots, but it relies on ARView, which visionOS doesn't have. This seems more elegant than scaling absolutely everything with a shared root -- any chance I'm just failing in my searches to find the equivalent functionality?
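[Editor's note] One primitive worth checking: RealityKit's PhysicsSimulationComponent makes an entity the root of its own local simulation. Whether scaling such a root reproduces ARView.physicsOrigin's decoupling of simulation scale from visual scale is an assumption to verify, not a confirmed equivalent. A minimal sketch:

import RealityKit

// Sketch: make physicsRoot the root of a localized physics simulation,
// then scale that root. Dynamic entities go under it as children.
let physicsRoot = Entity()
physicsRoot.components.set(PhysicsSimulationComponent())
physicsRoot.scale = SIMD3<Float>(repeating: 10) // scale the simulation space
// ... add the dynamic entities as children of physicsRoot ...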
3
1
790
Mar ’26
RealityKit .kinematic + collisions (visionOS)
Hi everyone, I'm new to visionOS development. I'm trying to create a physics-based scene (with gravity) where users can pick up and move objects on a workbench. I am struggling with physics interactions during the drag gesture:

Kinematic mode: If I switch to .kinematic during the drag, the object moves smoothly but clips through other objects (no collisions).
Dynamic mode: I tried keeping it .dynamic and applying linear velocity toward the hand position, but the movement feels laggy and unresponsive.
Hybrid approach: I tried switching to .kinematic during DragGesture.onChanged and back to .dynamic on collision, but this causes the entity to jitter/shake violently when touching other objects.

Has anyone found a clean way to drag objects while maintaining solid collisions? Thanks for your help!
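[Editor's note] A hedged sketch of a variant of the dynamic approach: keep the body .dynamic and steer it toward the target each time the drag updates, rather than teleporting it kinematically. The gain is a tuning assumption; higher values track the hand faster but can overshoot.

import RealityKit

// Sketch: call this from DragGesture.onChanged with the gesture's
// world-space location. Collisions stay solid because the body stays dynamic.
func steer(_ entity: ModelEntity, toward target: SIMD3<Float>, gain: Float = 10) {
    let current = entity.position(relativeTo: nil)
    entity.components.set(PhysicsMotionComponent(linearVelocity: (target - current) * gain))
}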
3
0
889
Mar ’26
ScreenCaptureKit recording output is corrupted when captureMicrophone is true
Hello everyone, I'm working on a screen recording app using ScreenCaptureKit and I've hit a strange issue. My app records the screen to an .mp4 file, and everything works perfectly as long as captureMicrophone is false; in that case I get a valid, playable .mp4 file. However, as soon as I enable the microphone by setting streamConfig.captureMicrophone = true, the recording seems to work, but the final .mp4 file is corrupted and cannot be played by QuickTime or any other player. This happens whether capturesAudio (app audio) is on or off. I've already added the "Privacy - Microphone Usage Description" (NSMicrophoneUsageDescription) to my Info.plist, so I don't think it's a permissions problem. I have my logic split into a ScreenRecorder class that manages state and a CaptureEngine that handles the SCStream.

Here is how I'm configuring my SCStream:

ScreenRecorder.swift

// This is my main SCStreamConfiguration
private var streamConfiguration: SCStreamConfiguration {
    var streamConfig = SCStreamConfiguration()
    // ... other HDR/preset config ...

    // These are the problem properties
    streamConfig.capturesAudio = isAudioCaptureEnabled
    streamConfig.captureMicrophone = isMicCaptureEnabled // breaks it if true

    streamConfig.excludesCurrentProcessAudio = false
    streamConfig.showsCursor = false

    if let region = selectedRegion, let display = currentDisplay {
        // My region/frame logic (works fine)
        let regionWidth = Int(region.frame.width)
        let regionHeight = Int(region.frame.height)
        streamConfig.width = regionWidth * scaleFactor
        streamConfig.height = regionHeight * scaleFactor
        // ... (sourceRect logic) ...
    }

    streamConfig.pixelFormat = kCVPixelFormatType_32BGRA
    streamConfig.colorSpaceName = CGColorSpace.sRGB
    streamConfig.minimumFrameInterval = CMTime(value: 1, timescale: 60)
    return streamConfig
}

And here is how I'm setting up the SCRecordingOutput that writes the file:

ScreenRecorder.swift

private func initRecordingOutput(for region: ScreenPickerManager.SelectedRegion) throws {
    let screeRecordingOutputURL = try RecordingWorkspace.createScreenRecordingVideoFile(
        in: workspaceURL,
        sessionIndex: sessionIndex
    )
    let recordingConfiguration = SCRecordingOutputConfiguration()
    recordingConfiguration.outputURL = screeRecordingOutputURL
    recordingConfiguration.outputFileType = .mp4
    recordingConfiguration.videoCodecType = .hevc
    let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
    self.recordingOutput = recordingOutput
}

Finally, my CaptureEngine adds these to the SCStream:

CaptureEngine.swift

class CaptureEngine: NSObject, @unchecked Sendable {
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    // ... (dispatch queues) ...
    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

            // Add outputs for raw buffers (not used for file recording)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

            // Add the file recording output
            try stream?.addRecordingOutput(recordingOutput)

            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }
    // ... (stopCapture, etc.) ...
}

When captureMicrophone is false, I get a perfect .mp4 video that plays everywhere; when it's true, I get a corrupted video that doesn't play at all.
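[Editor's note] One diagnostic worth trying, as an assumption rather than a confirmed fix: only attach the raw-buffer outputs that the configuration actually enables, so an unconditional microphone tap isn't competing with the file-recording output.

// Hedged sketch, using the poster's own names: gate the raw stream outputs
// on the active configuration. addRecordingOutput already writes audio and
// mic into the file, so redundant raw outputs are a plausible (unconfirmed)
// source of the corrupted container.
try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
if configuration.capturesAudio {
    try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
}
if configuration.captureMicrophone {
    try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)
}
try stream?.addRecordingOutput(recordingOutput)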
2
0
786
Mar ’26
Leaderboard not available to edit points
When I go to the leaderboards section, I have 8 currently active leaderboards. When I select "manage scores and players" I see 9 leaderboards, of which 5 are legacy; my 4 new ones are not listed (and the page comes in a very old design). New players will score in the 4 new leaderboards, but I need to remove the top score, which was the developer score, to give them a chance, and the new leaderboards are not accessible.
0
0
273
Mar ’26
Implementation of Screen Recording permissions for background OCR utility
I am exploring the development of a utility app that provides real-time feedback to users based on their active screen content (e.g., providing text suggestions for various communication apps). To achieve this, I am looking at using ReplayKit and Broadcast Upload Extensions to process screen frames in the background via OCR. I have a few questions regarding the "Screen Recording" permission and App Store Review:

Permission clarity: Is it possible to trigger the Screen Recording permission request in a way that clearly communicates the "utility" nature of the app, without the system UI making it look like a standard video recording?
Background persistence: Can a Broadcast Extension reliably stay active in the background while the user switches between other third-party apps (like messaging or social apps), for the purpose of continuous OCR processing?
App Store guidelines: Are there specific "Privacy & Data Use" guidelines I should be aware of when an app requires persistent screen access to provide text-based suggestions?

I want to ensure the user experience is transparent and that the permission flow feels like a "helper utility" rather than a security risk. Any insights on the best APIs to use for this specific background processing would be appreciated.
4
0
808
Mar ’26
CGSetDisplayTransferByTable no longer working on macOS Tahoe
For an app of mine I use CGSetDisplayTransferByTable to adjust the gamma table of the display. Since macOS Tahoe, these modifications are silently ignored: the display's actual gamma curve remains unchanged despite the API reporting successful completion. I filed an FB for it a few weeks ago and would love to figure out what could be causing this. FB18559786
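[Editor's note] A hedged sketch for narrowing down "silently ignored": read the gamma table back after setting it, to pin down whether the write is rejected, accepted but ignored, or actually applied.

import CoreGraphics

// Sketch: fetch the currently active gamma table for the main display.
let display = CGMainDisplayID()
var red = [CGGammaValue](repeating: 0, count: 256)
var green = red
var blue = red
var sampleCount: UInt32 = 0
let err = CGGetDisplayTransferByTable(display, 256, &red, &green, &blue, &sampleCount)
if err == .success, sampleCount > 0 {
    print("top of red curve:", red[Int(sampleCount) - 1]) // compare with the value you set
}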
4
1
798
Mar ’26
RealityKit equivalent of ARGeoAnchor?
In ARKit there is ARGeoAnchor, which lets you anchor content using latitude and longitude so objects stay fixed to a real-world location. Is there an equivalent feature in RealityKit? I want to place points in the world and make sure they don't move or drift after placement. If RealityKit doesn't support this directly, what is the recommended approach?
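[Editor's note] A hedged sketch of the usual approach on iOS: RealityKit itself has no geo anchor, but it can piggyback on ARKit's ARGeoAnchor via AnchorEntity. This assumes the session is running ARGeoTrackingConfiguration, and geo tracking is only available in supported regions.

import ARKit
import RealityKit
import CoreLocation

// Sketch: add an ARGeoAnchor to the ARKit session, then pin an entity to it.
func place(_ entity: Entity, in arView: ARView, at coordinate: CLLocationCoordinate2D) {
    let geoAnchor = ARGeoAnchor(coordinate: coordinate)
    arView.session.add(anchor: geoAnchor)

    let anchorEntity = AnchorEntity(anchor: geoAnchor) // ties the entity to the ARAnchor
    anchorEntity.addChild(entity)
    arView.scene.addAnchor(anchorEntity)
}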
0
1
549
Mar ’26
Can I use a Metal shader if I use RealityKit to build a visionOS app?
I asked AI to build a realistic ocean shader for a visionOS project using RealityKit. It gave a bunch of instructions and asked me to connect a large number of nodes in the ShaderGraph in Reality Composer Pro. I am just wondering if AI can help generate the Metal shader code directly, or build the ShaderGraph nodes for me automatically.
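[Editor's note] For context: RealityKit on visionOS doesn't accept arbitrary Metal fragment shaders for materials (CustomMaterial isn't available there); materials come from ShaderGraph. A hedged sketch of loading a graph authored in Reality Composer Pro, where "OceanMaterial", "Scene.usda", and the bundle are placeholder assumptions based on the standard visionOS app template:

import RealityKit
import RealityKitContent

// Sketch: load a ShaderGraph material from the RealityKitContent package
// and apply it to a simple plane. Asset names are hypothetical.
let material = try await ShaderGraphMaterial(
    named: "/Root/OceanMaterial",
    from: "Scene.usda",
    in: realityKitContentBundle
)
let water = ModelEntity(mesh: .generatePlane(width: 2, depth: 2))
water.model?.materials = [material]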
0
0
754
Mar ’26
SpriteKit framerate drop on iOS 26.3
Hello, I have noticed a performance drop on SpriteKit-based projects running on iOS 26.3. The issue seems very similar to the one reported on iOS 26.0, which was later solved as of iOS 26.2 beta 3: https://developer.apple.com/forums/thread/800952?answerId=870617022#870617022 With 26.3, it seems a regression occurred. Below is the same SpriteKit scene used to test framerate on different devices:

import SpriteKit
import SwiftUI

class BareboneScene: SKScene {
    override func didMove(to view: SKView) {
        size = view.bounds.size
        anchorPoint = CGPoint(x: 0.5, y: 0.5)
        backgroundColor = .darkGray

        let roundedSquare = SKShapeNode(rectOf: CGSize(width: 150, height: 75), cornerRadius: 12)
        roundedSquare.fillColor = .systemRed
        roundedSquare.strokeColor = .black
        roundedSquare.lineWidth = 3
        addChild(roundedSquare)

        let action = SKAction.rotate(byAngle: .pi, duration: 1)
        roundedSquare.run(.repeatForever(action))
    }
}

struct BareboneSceneView: View {
    var body: some View {
        SpriteView(
            scene: BareboneScene(),
            debugOptions: [.showsFPS]
        )
        .ignoresSafeArea()
    }
}

#Preview {
    BareboneSceneView()
}

Running this very basic SpriteKit scene on my device, I cannot reach 60 FPS; instead it stalls around 40. I see the same in my App Store published SpriteKit games. As you can see in the discussion of the linked forum topic, the regression has been noticed by others as well. Can you please look into this, and let us know if there is already an outstanding action to resolve it? It's very problematic for published games to suddenly drop to 40 FPS on even the most modern iOS devices.
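[Editor's note] Not a fix for the regression itself, but a hedged variable worth pinning down while testing: SpriteView accepts an explicit preferred frame rate, which rules out the view negotiating a lower rate on its own (60 is also the default, so any behavior change here is itself diagnostic).

// Sketch: request the frame rate explicitly.
SpriteView(
    scene: BareboneScene(),
    preferredFramesPerSecond: 60,
    debugOptions: [.showsFPS]
)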
4
3
1.7k
Mar ’26
RealityKit - Full 3D experience
I have a question, I guess more for the Apple team. Why are there no totally 3D experiences for the Vision Pro lineup? I know they have given us tools to bring Unity 3D games to iPhone, and I guess you can also build them in RealityKit. But why, at this moment, are 3D games limited to just iPad and iPhone, and why can't you bring that to Vision Pro? Just to explain: when I say a totally 3D game, I mean games like Gorn. The Vision Pro is definitely powerful enough, but it just feels limited to tabletop games and AR games. Is this something Apple is thinking about implementing?
1
0
1.5k
Mar ’26
Highlight or select entity in RealityKit
I'm in the process of converting my SceneKit game to RealityKit. In SceneKit I used to be able to mark nodes as selected by setting SCNMaterial.emission with a custom color. I can do the same with PhysicallyBasedMaterial.emissiveColor, but I'd like to keep my entities unaffected by the scene lights by using UnlitMaterial. In SceneKit I can set a category mask to indicate which lights should affect which nodes, but there doesn't seem to be such a thing in RealityKit. So at the moment it seems like I have to choose between being able to mark an entity as selected, or having entities unaffected by lighting, but not both. Is there some effect or component I can use to mark entities as selected by applying some coloring regardless of the material used?
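[Editor's note] A hedged sketch of one workaround: since UnlitMaterial ignores scene lighting anyway, selection can be indicated by swapping the material's tint rather than via emission. This minimal version discards any base texture; a real implementation would preserve it.

import RealityKit

// Sketch: toggle a selection tint on an unlit entity.
func setSelected(_ entity: ModelEntity, _ selected: Bool) {
    var material = UnlitMaterial()
    material.color = .init(tint: selected ? .yellow : .white, texture: nil)
    entity.model?.materials = [material]
}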
2
0
269
Mar ’26
RealityKit animation with bindTarget: .opacity doesn't work
I want to fade objects in and out, and while setting an entity's OpacityComponent works, animating it doesn't seem to do anything. In the following code the second sphere should fade out, but it keeps its initial opacity. On the other hand, the animation that changes its transform works. What am I doing wrong?

class ViewController: NSViewController {
    override func loadView() {
        let arView = ARView(frame: NSScreen.main!.frame)
        let anchor = AnchorEntity(.world(transform: matrix_identity_float4x4))
        arView.scene.addAnchor(anchor)

        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.5))
        anchor.addChild(sphere)
        sphere.components.set(OpacityComponent(opacity: 0.1))

        let sphere2 = ModelEntity(mesh: .generateSphere(radius: 0.5))
        sphere2.position = .init(x: 0.2, y: 0, z: 0)
        anchor.addChild(sphere2)
        sphere2.components.set(OpacityComponent(opacity: 0.1))

        sphere.playAnimation(try! AnimationResource.makeActionAnimation(for: FromToByAction(to: 0, timing: .linear), duration: 1, bindTarget: .opacity))
        sphere.playAnimation(try! AnimationResource.makeActionAnimation(for: FromToByAction(to: Transform(translation: SIMD3(x: 0.1, y: 0, z: 0)), timing: .linear), duration: 1, bindTarget: .transform))

        view = arView
    }
}
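[Editor's note] An alternative worth trying, as a hedged sketch: drive the opacity with a FromToByAnimation bound to .opacity instead of the action-based API, which may not support that bind target. Using the poster's sphere2:

// Sketch: a from-to animation bound directly to the opacity bind target.
let fade = FromToByAnimation<Float>(from: 0.1,
                                    to: 0.0,
                                    duration: 1,
                                    timing: .linear,
                                    bindTarget: .opacity)
sphere2.playAnimation(try! AnimationResource.generate(with: fade))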
4
0
531
Mar ’26
Creating a new app
(Translated from Arabic) I want to create a game on the App Store. The first page would show the terms and conditions and an option to start the game. It would have 200 categories: from Saudi Arabia, from TV series, from games, for girls only, plus Qatar and the Emirates, anime, Turkish series, tourism, countries, global companies, and electronics companies.
3
0
425
Mar ’26
RealityKit game keeps using ~65% CPU even with empty scene
I'm trying to convert my game from SceneKit to RealityKit. I noticed that even when the scene is static (nothing moves), RealityKit keeps using CPU. In SceneKit, CPU goes down to 0% with a static scene. With this simplest of games, RealityKit keeps using about 65% CPU:

class ViewController: NSViewController {
    override func loadView() {
        view = ARView(frame: NSScreen.main!.frame)
    }
}

Is this expected or a bug? I created FB22125047.
0
1
201
Mar ’26
SpriteKit framerate drop on iOS 26.0
Hello, I have noticed a performance drop on SpriteKit-based projects running on iOS 26.0 (23A341). Below is a SpriteKit scene used to test framerate on different devices:

import SpriteKit
import SwiftUI

class BareboneScene: SKScene {
    override func didMove(to view: SKView) {
        size = view.bounds.size
        anchorPoint = CGPoint(x: 0.5, y: 0.5)
        backgroundColor = .darkGray

        let roundedSquare = SKShapeNode(rectOf: CGSize(width: 150, height: 75), cornerRadius: 12)
        roundedSquare.fillColor = .systemRed
        roundedSquare.strokeColor = .black
        roundedSquare.lineWidth = 3
        addChild(roundedSquare)

        let action = SKAction.rotate(byAngle: .pi, duration: 1)
        roundedSquare.run(.repeatForever(action))
    }
}

struct BareboneSceneView: View {
    var body: some View {
        SpriteView(
            scene: BareboneScene(),
            debugOptions: [.showsFPS]
        )
        .ignoresSafeArea()
    }
}

#Preview {
    BareboneSceneView()
}

The scene is very simple, yet framerate drops to ~40 fps as shown by the Metal HUD. Tested on:

iPhone 13, iOS 26.0: framerate drops to 40 fps. Sometimes it runs at near 60 fps, but if the screen is touched repeatedly, the framerate drops to 40-50 fps again.
iPhone 11 Pro, iOS 26.0: ~40 fps.
iPad 9th Gen, iOS 18.6.2: 60 fps, no issues.

See screenshots attached. These numbers were observed by me and members of our beloved SpriteKit Discord server. Thank you for your attention.
15
6
3.4k
Mar ’26
Metal Shader inside Swift Package not found?
Hello everyone! I am trying to wrap a ViewModifier inside a Swift Package that bundles a Metal shader file to be used in the modifier. Everything works as expected in the Preview, in the Simulator, and on a real device for iOS. It also works in Preview and in the Simulator for tvOS, but not on a real Apple TV. I have tried this on a 4th-generation Apple TV running tvOS 26.3 using Xcode 26.2.0. The metallib is processed and exists in the bundle, yet Xcode logs the following failure (repeated six times):

Compiler failed to build request
precondition failure: pipeline error: custom_effect-fg2a5cia7fmha4: error: unresolved visible function reference: custom_fn Reason: visible function not loaded

Contents of Package.swift:

import PackageDescription

let package = Package(
    name: "Test",
    platforms: [
        .iOS(.v17),
        .tvOS(.v17)
    ],
    products: [
        .library(
            name: "Test",
            targets: ["Test"]
        )
    ],
    targets: [
        .target(
            name: "Test",
            resources: [
                .process("Shaders")
            ]
        ),
        .testTarget(
            name: "TestTests",
            dependencies: ["Test"]
        )
    ]
)

Content of my metal file:

#include <metal_stdlib>
using namespace metal;

[[ stitchable ]] float2 complexWave(float2 position, float time, float2 size, float speed, float strength, float frequency) {
    float2 normalizedPosition = position / size;
    float moveAmount = time * speed;
    position.x += sin((normalizedPosition.x + moveAmount) * frequency) * strength;
    position.y += cos((normalizedPosition.y + moveAmount) * frequency) * strength;
    return position;
}

And my ViewModifier:

import MetalKit
import SwiftUI

extension ShaderFunction {
    static let complexWave: ShaderFunction = {
        ShaderFunction(
            library: .bundle(.module),
            name: "complexWave"
        )
    }()
}

extension Shader {
    static func complexWave(arguments: [Shader.Argument]) -> Shader {
        Shader(function: .complexWave, arguments: arguments)
    }
}

struct WaveModifier: ViewModifier {
    let start: Date = .now

    func body(content: Content) -> some View {
        TimelineView(.animation) { context in
            let delta = context.date.timeIntervalSince(start)
            content
                .visualEffect { view, proxy in
                    view.distortionEffect(
                        .complexWave(
                            arguments: [
                                .float(delta),
                                .float2(proxy.size),
                                .float(0.5),
                                .float(8),
                                .float(10)
                            ]
                        ),
                        maxSampleOffset: .zero
                    )
                }
        }
        .onAppear {
            let paths = Bundle.module.paths(forResourcesOfType: "metallib", inDirectory: nil)
            print(paths)
        }
    }
}

extension View {
    public func wave() -> some View {
        modifier(WaveModifier())
    }
}

#Preview {
    Image(systemName: "cart")
        .wave()
}

Any help is appreciated.
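[Editor's note] A hedged diagnostic sketch: load the packaged metallib directly with Metal and list the functions it actually contains on the Apple TV, to check whether complexWave made it into the tvOS build at all. The resource name "default" is an assumption; substitute whatever the onAppear path dump printed.

import Metal

// Sketch: open the package's metallib and print its function names on-device.
if let url = Bundle.module.url(forResource: "default", withExtension: "metallib"),
   let device = MTLCreateSystemDefaultDevice() {
    do {
        let library = try device.makeLibrary(URL: url)
        print("metallib functions:", library.functionNames)
    } catch {
        print("failed to load metallib:", error)
    }
}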
0
0
454
Mar ’26
vImageBuffer_InitWithCGImage fails with Xcode 26 but succeeds with Xcode 15
I am attempting to load a JPEG image into a vImage_Buffer. I am just trying to get the data in an ARGB format. This code works fine in the Xcode 15 build, but fails with a kvImageInvalidParameter error from vImageBuffer_InitWithCGImage in the Xcode 26 build. This code is written in Objective-C++. Here is a code fragment:

int CDib_ARGB::Load(LPCSTR pFilename)
{
    int rc = 0;
    if (NULL == m_pRaw_vImage_Buffer)
    {
        NSString *pNS_filename = [[NSString alloc] initWithUTF8String:pFilename];
        NSImage *pNSImage = [[NSImage alloc] initWithContentsOfFile:pNS_filename];
        if (nil == pNSImage)
            rc = -1;
        else
        {
            int width = pNSImage.size.width;
            int height = pNSImage.size.height;
            if (pNSImage.representations)
            {
                NSImageRep *imageRep;
                int jj;
                width = 0;
                height = 0;
                for (jj = 0; jj < pNSImage.representations.count; jj++)
                {
                    imageRep = pNSImage.representations[jj];
                    if (imageRep.pixelsWide > width) width = imageRep.pixelsWide;
                    if (imageRep.pixelsHigh > height) height = imageRep.pixelsHigh;
                }
            }
            NSSize imageSize = NSMakeSize(width, height);
            NSRect imageRect = NSMakeRect(0, 0, width, height);
            pNSImage.size = imageSize;
            CGImageRef cgImage = [pNSImage CGImageForProposedRect:&imageRect context:NULL hints:nil];
            if (nil == cgImage)
                rc = -1;
            else
            {
                // Alloc and load vImage_Buffer.
                vImage_Buffer *pvImage_Buffer = new vImage_Buffer;
                if (NULL == pvImage_Buffer)
                    rc = -1;
                else
                {
                    vImage_CGImageFormat format;
                    format.bitsPerComponent = 8;
                    format.bitsPerPixel = 32;
                    format.colorSpace = nil;
                    format.bitmapInfo = kCGImageAlphaFirst | kCGBitmapByteOrderDefault; // ARGB8888
                    format.version = 0;
                    format.decode = nil;
                    format.renderingIntent = kCGRenderingIntentDefault;
                    memset(pvImage_Buffer, 0, sizeof(vImage_Buffer));
                    long status = vImageBuffer_InitWithCGImage(pvImage_Buffer, &format, nil, cgImage, kvImagePrintDiagnosticsToConsole);
                    if (kvImageNoError != status)
                    {
                        // This is where Xcode 26 sends me.
                        delete pvImage_Buffer;
                        rc = -1;
                    }
13
0
585
Feb ’26
Equivalent in RealityKit of SceneKit's SCNAction.customAction to run custom animation
I'm currently converting my game from SceneKit to RealityKit. What's the easiest way to run an animation in which every frame I want to run custom code, similar to SCNAction.customAction which accepts a callback that is called repeatedly until the action completes?
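[Editor's note] A hedged sketch of the closest analogue I know of: subscribe to the scene's per-frame update event and run the custom work until the duration elapses. Here arView and entity are assumed to exist in the surrounding code.

import RealityKit
import Combine

// Sketch: a manual "custom action" driven by SceneEvents.Update.
var elapsed: TimeInterval = 0
let duration: TimeInterval = 2
var subscription: Cancellable?

subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { event in
    elapsed += event.deltaTime
    let t = Float(min(elapsed / duration, 1)) // normalized progress, like SCNAction's callback
    entity.position.y = t * 0.5               // per-frame custom work goes here
    if elapsed >= duration {
        subscription?.cancel()                // complete the "action"
    }
}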
0
0
433
Feb ’26