Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

Post · Replies · Boosts · Views · Activity

How to Receive Callbacks for UIAccessibilityAction Methods Like accessibilityPerformMagicTap()?
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks. I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap. How can I properly observe and handle the accessibilityPerformMagicTap() action?
Replies: 3 · Boosts: 0 · Views: 473 · Activity: Mar ’25
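For reference, a minimal sketch of how this override is usually wired up, assuming a UIKit app: VoiceOver delivers the magic tap to the element with VoiceOver focus first and then walks up the responder chain toward the application delegate, so the override has to live on an object in that path (a view controller that is currently on screen is the usual choice). This is a sketch, not a confirmed fix for the case above.

import UIKit

class PlayerViewController: UIViewController {
    // Hypothetical example: handle the two-finger double-tap (magic tap).
    override func accessibilityPerformMagicTap() -> Bool {
        // Perform the app's primary action here (e.g. toggle playback).
        print("Magic tap handled")
        return true // true marks the tap as handled and stops the responder-chain search
    }
}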
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case, what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services, including FaceTime, Apple Music, and News, stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction, perhaps between the neural engine and the new Wi-Fi chip, rather than a purely software problem.

I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter.

Sincerely,
Donald Spencer Kirby
Dayton, Ohio
Replies: 2 · Boosts: 0 · Views: 173 · Activity: Jun ’25
iOS 26 Full Keyboard Access (navigation) and WKWebView
We use an embedded WKWebView for several screens in our app. Recently, we have been testing keyboard navigation via Full Keyboard Access in our apps. On iOS 18, everything works pretty much as expected. On iOS 26, it does not: you can "tab" away from the web view and then never tab back to it for keyboard navigation. Is this a known issue? Is anyone aware of workarounds?
Replies: 2 · Boosts: 0 · Views: 363 · Activity: Nov ’25
VoiceOver cursor focus tracking
In some places in our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients, and we need to know which item has the VoiceOver focus so we can keep track of it. setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements; it is only called when accessibility clients focus view-based accessibility elements (i.e. when an NSView subclass gets focused). At the same time, we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject, so we can't make them first responder.

Is this the expected behavior? What are our options in terms of reacting to the VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?

Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking

If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message gets logged.
Replies: 4 · Boosts: 0 · Views: 485 · Activity: Mar ’25
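For the programmatic-move half of the question, one commonly suggested route on macOS is posting a layout-changed notification that nominates the element to focus. A minimal sketch, assuming the element passed in is the same NSAccessibilityElement instance vended to accessibility clients; this is untested against the linked sample:

import AppKit

// Ask VoiceOver to move its cursor to `element` after a layout change.
func moveVoiceOverCursor(to element: NSAccessibilityElement) {
    NSAccessibility.post(
        element: element,
        notification: .layoutChanged,
        userInfo: [.uiElements: [element]]
    )
}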
Attempting to go directly to a help book page results in the main help book page being displayed instead
There is an issue with Help Books that started with the release of macOS 14.4: when an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page rather than the specific page requested. As I investigated the issue, I found that the requested page was actually part of the help viewer's navigation history, and all I had to do was click the Back navigation arrow and the requested page would be displayed. So it seems the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.

Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade, and it has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help), but none of it seems to make a difference. As far as I can tell, the issue stems from some change made to the OS.

So my questions are: Is this a known bug, and if so, is there any ETA on a fix? Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
Replies: 3 · Boosts: 0 · Views: 1.9k · Activity: Oct ’25
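On the "different API" question: the only non-Carbon route I'm aware of is NSHelpManager's anchor-based lookup. A minimal sketch, with both the anchor name and the book name below being hypothetical placeholders (the real book name comes from the app's CFBundleHelpBookName). Whether this avoids the macOS 14.4 regression described above is unverified:

import AppKit

func openHelpPage() {
    // Jump to a specific anchored page in the app's Help Book.
    NSHelpManager.shared.openHelpAnchor(
        "gettingStarted",                // hypothetical anchor defined in a help page
        inBook: "com.example.MyApp.help" // hypothetical CFBundleHelpBookName value
    )
}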
Programmatically Modifying Per-Application or System-Wide Color Filters Using Cocoa/Swift on macOS?
I'm looking into how to programmatically control color filters in the Accessibility settings under "System Settings" -> "Accessibility" -> "Color Filters", in particular the "Intensity" and "Filter type" settings. As far as I have gathered (I've poked around GitHub, Stack Overflow, and queried some LLMs), changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs, but there doesn't seem to be a clear-cut example of doing this using public-facing APIs, without ripping off source code from another project wholesale or using private APIs.

My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to overlay this capability on an application-by-application basis as a third-party developer, that would be the ideal scenario: for example, modifying the look and feel/UX of Launchpad, Photos, etc. as a third-party developer, without accessing the source code of the application being modified (with appropriate user consent, of course).
Replies: 0 · Boosts: 0 · Views: 375 · Activity: Jul ’25
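For the overlay goal specifically: there is no public API I'm aware of for the system Color Filters intensity or filter type, but a third-party app can approximate a filter visually with a click-through tinted window. A rough sketch under that assumption; it does not read or change the actual Accessibility setting:

import AppKit

// Create a borderless, click-through window that tints everything beneath it.
func makeTintOverlay(on screen: NSScreen) -> NSWindow {
    let window = NSWindow(
        contentRect: screen.frame,
        styleMask: .borderless,
        backing: .buffered,
        defer: false
    )
    window.level = .screenSaver            // float above normal app windows
    window.ignoresMouseEvents = true       // pass clicks through to apps below
    window.isOpaque = false
    window.backgroundColor = NSColor.systemRed.withAlphaComponent(0.15)
    window.collectionBehavior = [.canJoinAllSpaces, .stationary]
    window.orderFrontRegardless()
    return window
}

This approach is screen-wide rather than truly per-application; restricting the tint to one app would require tracking that app's window geometry.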
Custom tab bar in SwiftUI
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons, which control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or, e.g., a segmented picker: specifically, VoiceOver does not say something like "tab one of two" when the first button is focused. According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup. In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact on VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain), but that also does not seem to have any impact.

So, is there any way in SwiftUI to make a button behave like a tab button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
Replies: 2 · Boosts: 0 · Views: 324 · Activity: Aug ’25
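One workaround pattern, assuming a manually composed announcement is acceptable: mark the selected button with the isSelected trait and supply the "tab X of Y" phrase yourself as an accessibility value. A sketch of that idea, not a reproduction of the system tab bar's behavior:

import SwiftUI

struct CustomTabBar: View {
    @Binding var selection: Int
    let titles = ["First", "Second"]

    var body: some View {
        HStack {
            ForEach(titles.indices, id: \.self) { index in
                Button(titles[index]) { selection = index }
                    // VoiceOver reads e.g. "First, tab 1 of 2".
                    .accessibilityValue("tab \(index + 1) of \(titles.count)")
                    .accessibilityAddTraits(selection == index ? .isSelected : [])
            }
        }
    }
}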
VoiceOver doesn't work for AVRoutePickerView wrapped in UIViewRepresentable
Hi, I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:

func makeUIView(context: Context) -> UIView {
    let routePickerView = AVRoutePickerView()

    // Configure the button's color.
    //routePickerView.delegate = context.coordinator
    //routePickerView.backgroundColor = .secondarySystemBackground
    routePickerView.tintColor = .accent
    routePickerView.activeTintColor = .accent

    // Indicate whether your app prefers video content.
    routePickerView.prioritizesVideoDevices = false

    return routePickerView
}

I commented out routePickerView.delegate = context.coordinator because it doesn't compile; context.coordinator is of type Void and I'm not sure how to fix that. I'm not sure if that has anything to do with the issue.

Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But in VoiceOver, if I select the button and double-tap, nothing happens… it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show in VoiceOver?
Replies: 3 · Boosts: 0 · Views: 385 · Activity: Aug ’25
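On the side issue in this post: context.coordinator is Void because the representable doesn't declare makeCoordinator(). A minimal sketch of the usual shape, with the wrapper name AirPlayButton being hypothetical; this fixes the delegate compile error, though it may not by itself explain the VoiceOver behavior:

import SwiftUI
import AVKit

struct AirPlayButton: UIViewRepresentable {
    // Declaring makeCoordinator() gives context.coordinator a real type.
    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> AVRoutePickerView {
        let picker = AVRoutePickerView()
        picker.delegate = context.coordinator // now compiles
        return picker
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {}

    final class Coordinator: NSObject, AVRoutePickerViewDelegate {
        func routePickerViewWillBeginPresentingRoutes(_ routePickerView: AVRoutePickerView) {
            // Called when the route picker UI is about to appear.
        }
    }
}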
AXIsProcessTrusted returns true, but AXUIElementCopyAttributeValue fails with .cannotComplete
This was working a few days ago, but it has since stopped and I can't figure out why. I've tried resetting TCC, double-checking my entitlements, restarting, deleting and rebuilding, and nothing works. My app is a sandboxed macOS SwiftUI LSUIElement app that, when invoked, checks to see if the frontmost process is Terminal, then tries to get the frontmost window's title.

func getFrontmostWindowTitle() throws -> String? {
    let trusted = AXIsProcessTrusted()
    print("getFrontmostWindowTitle AX trusted: \(trusted)")

    guard let app = NSWorkspace.shared.frontmostApplication else { return nil }
    let appElement = AXUIElementCreateApplication(app.processIdentifier)

    var focusedWindow: AnyObject?
    let status = AXUIElementCopyAttributeValue(appElement, kAXFocusedWindowAttribute as CFString, &focusedWindow)
    guard status == .success, let window = focusedWindow else {
        if status == .cannotComplete {
            throw Errors.needAccessibilityPermission
        }
        return nil
    }

    var title: AnyObject?
    let titleStatus = AXUIElementCopyAttributeValue(window as! AXUIElement, kAXTitleAttribute as CFString, &title)
    guard titleStatus == .success else { return nil }
    return title as? String
}

I recently renamed the app, but the Bundle ID has not yet changed. I have com.apple.security.accessibility set to YES in the Entitlements file (although I had to add it manually), and an NSAccessibilityUsageDescription string set in Info.plist.

The first time I ran this, macOS nicely prompted for permission. Now it won't do that, even when I use AXIsProcessTrustedWithOptions() to try to force it. If I use tccutil to reset accessibility and Apple events, it still doesn't prompt. If I drag my app from the build products folder to System Settings, it gets added to the system TCC DB (not the user DB). It shows an auth value of 2 for my app:

% sudo sqlite3 "/Library/Application Support/com.apple.TCC/TCC.db" "SELECT client,auth_value FROM access WHERE service='kTCCServiceAccessibility' OR service='kTCCServiceAppleEvents';"
com.latencyzero.<redacted>|2
<redacted>

I'm at a loss as to what went wrong. I proved out the concept earlier and it worked, and have since spent a lot of time enhancing and polishing the app, and now things aren't working and I'm starting to worry.
Replies: 4 · Boosts: 0 · Views: 1k · Activity: Jul ’25
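One detail worth isolating from the post above: the permission prompt can be requested explicitly with the options dictionary. A minimal sketch of that call; note that it generally has no effect for sandboxed apps, since the App Sandbox and Accessibility control are, as far as I know, mutually exclusive, which may be relevant here:

import ApplicationServices

// Returns true if the app is already trusted; otherwise asks TCC to show
// the Accessibility permission prompt.
func ensureAccessibilityTrust() -> Bool {
    let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String
    let options = [promptKey: true] as CFDictionary
    return AXIsProcessTrustedWithOptions(options)
}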
Speak Selection broken with SwiftUI text
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection. Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
Replies: 1 · Boosts: 0 · Views: 235 · Activity: Jul ’25
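Not an answer to whether Speak Selection itself is broken, but a possible in-app fallback is speaking the string directly with AVSpeechSynthesizer. A minimal sketch, with SpeakableText being a hypothetical wrapper:

import SwiftUI
import AVFoundation

struct SpeakableText: View {
    let content: String
    // Keep the synthesizer alive while it speaks.
    @State private var synthesizer = AVSpeechSynthesizer()

    var body: some View {
        Text(content)
            .contextMenu {
                Button("Speak") {
                    synthesizer.speak(AVSpeechUtterance(string: content))
                }
            }
    }
}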
Microphone Not Working When Running Unity Vision Pro App Normally
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            // Use 16,000 Hz instead of 44,100; fall back to the output rate if needed.
            microphoneInput = Microphone.Start(null, true, 10, 16000);
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity.
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            // Reset the silence timer while the user is speaking.
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can't seem to access the microphone; the debug logs indicate no microphone is detected.

Question: Is there any additional configuration I need to do for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or are there any common issues that could be causing the microphone access to fail in this scenario?

Thanks in advance for any insights!
Best, Siddharth
Replies: 2 · Boosts: 0 · Views: 437 · Activity: Feb ’25
VoiceOver doesn't work well for accessing PDFs/forms with tables
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table, using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On a Mac using VoiceOver, the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview: the reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data. The documents have essentially the same coding behind them as for the web. Why is it they perform so well on a PC with NVDA, but so poorly with Mac VoiceOver?

I am a Quality Assurance Specialist. I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system? As a Mac user, it frustrates me that I can't use my preferred system for checking documents to see if they are accessible, because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they need to use a PC with NVDA or JAWS for these documents to be able to get all the information. Unfortunately, most people aren't able to have, or maintain, both systems.

Overall, Mac products are very high quality. This, and other issues with VoiceOver, seems to be a large gap in Apple's offerings and functionality. I would appreciate a human response to the original email I sent about this on 7/30/2025.
Replies: 1 · Boosts: 0 · Views: 132 · Activity: Jul ’25
NSAlert button background/contrast
If I use NSAlert, the buttons look like this: the Cancel button has a gray background. We got complaints about the bad contrast, and people pointed out that in the alerts from System Settings the Cancel button has a white background. Unfortunately I did not find out how to make the buttons in my own alerts look like those in System Settings. Setting the button's bezel color to white did not work. Any help would be highly appreciated. Thanks.

Best regards, Marc
Replies: 4 · Boosts: 0 · Views: 540 · Activity: Feb ’25
Programmatically force VoiceOver to read parentheses for math expressions
How can I force VoiceOver to read parentheses for math expressions like this:

Text("(2+3)×4") // VoiceOver: Two plus three, times four

I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks. Using .spellOut breaks braille output and the Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose: for example, it reads “21” as “twenty hyphen one.”

Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
Replies: 2 · Boosts: 0 · Views: 371 · Activity: Jul ’25
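One selective workaround, assuming an explicit label is acceptable: keep the visible string intact and spell out only the parentheses in an accessibility label. The trade-off is that braille output follows the label too, so this is a sketch of one option rather than a full answer to the question above:

import SwiftUI

struct MathExpression: View {
    var body: some View {
        // Visible text keeps the symbols; VoiceOver reads the explicit label.
        Text("(2+3)×4")
            .accessibilityLabel("left paren, 2 plus 3, right paren, times 4")
    }
}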
Guided Access Mode From Background
My team is designing an app for retail associates that need to share managed iPads. We keep the app in Guided Access mode on our login app until an auth token is obtained; then the iPad is opened for general use. Upon signout we need to re-enter Guided Access mode, and we can do this easily via manual signout. But with idle signout, i.e. after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state), sign out the user, clear the pin code, and enter Single App Mode before restarting, so that once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token.

We are struggling to see if this is even possible. Our bosses will be displeased if we tell them it isn't, so anybody with any tips would be very appreciated.
Replies: 2 · Boosts: 0 · Views: 253 · Activity: Mar ’25
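For the re-entry step specifically, the relevant API is UIAccessibility.requestGuidedAccessSession, shown in a minimal sketch below. Hedges: it only succeeds on supervised devices where MDM has authorized the app for autonomous Single App Mode, and whether it can be called from a backgrounded or locked state is exactly the open question in this post:

import UIKit

func lockIntoSingleAppMode() {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        // succeeded is false if the device/MDM configuration disallows it.
        print("Single App Mode request succeeded: \(succeeded)")
    }
}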