This was working a few days ago, but it has since stopped and I can't figure out why. I've tried resetting TCC, double-checking my entitlements, restarting, deleting and rebuilding, and nothing works.
My app is a sandboxed macOS SwiftUI LSUIElement app that, when invoked, checks to see if the frontmost process is Terminal, then tries to get the frontmost window’s title.
func getFrontmostWindowTitle() throws -> String?
{
    let trusted = AXIsProcessTrusted()
    print("getFrontmostWindowTitle AX trusted: \(trusted)")

    guard let app = NSWorkspace.shared.frontmostApplication else { return nil }

    let appElement = AXUIElementCreateApplication(app.processIdentifier)
    var focusedWindow: AnyObject?
    let status = AXUIElementCopyAttributeValue(appElement, kAXFocusedWindowAttribute as CFString, &focusedWindow)
    guard status == .success,
          let window = focusedWindow
    else
    {
        if status == .cannotComplete
        {
            throw Errors.needAccessibilityPermission
        }
        return nil
    }

    var title: AnyObject?
    let titleStatus = AXUIElementCopyAttributeValue(window as! AXUIElement, kAXTitleAttribute as CFString, &title)
    guard titleStatus == .success else { return nil }
    return title as? String
}
I recently renamed the app, but the Bundle ID has not yet changed. I have com.apple.security.accessibility set to YES in the entitlements file (although I had to add it manually), and an NSAccessibilityUsageDescription string set in Info.plist.
The first time I ran this, macOS nicely prompted for permission. Now it won't do that, even when I use AXIsProcessTrustedWithOptions() to try to force it.
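For reference, this is roughly how I'm trying to force it (a sketch of the call; the real call site is in my invocation handler):

import ApplicationServices

// Should show the system prompt if the app isn't already trusted;
// for me it just returns false and no dialog appears.
let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String
let options = [promptKey: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("AXIsProcessTrustedWithOptions: \(trusted)")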
If I use tccutil to reset accessibility and apple events, it still doesn't prompt. If I drag my app from the build products folder to System Settings, it gets added to the system TCC DB (not the user DB). It shows an auth value of 2 for my app:
% sudo sqlite3 "/Library/Application Support/com.apple.TCC/TCC.db" "SELECT client,auth_value FROM access WHERE service='kTCCServiceAccessibility' OR service='kTCCServiceAppleEvents';"
com.latencyzero.<redacted>|2
<redacted>
I'm at a loss as to what went wrong. I proved out the concept earlier and it worked, and have since spent a lot of time enhancing and polishing the app, and now things aren't working and I'm starting to worry.
I have tried to join the developer programme, and it says it's still pending after three days.
Does anyone have any idea how long the procedure takes?
Topic: Accessibility & Inclusion, SubTopic: General
I’m seeing a layout issue in SwiftUI on iOS 26 that only reproduces with specific Accessibility Motion settings.
Steps to reproduce
1. Open Settings → Accessibility → Motion.
2. Enable Reduce Motion and Prefer Cross-Fade Transitions.
3. Launch an app with a SwiftUI TextField.
4. Tap the field to show the keyboard.
5. Dismiss the keyboard (tap outside, swipe down, etc.).
Expected:
After the keyboard is dismissed, the view’s bottom safe area / layout should return to normal.
Actual:
The view continues to reserve space equal to the keyboard height — as if the keyboard were still visible. UI anchored to the safe area remains shifted upward until the view is reloaded.
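A minimal sketch of the kind of view where I see it (names are hypothetical; nothing else is in the hierarchy):

import SwiftUI

struct ReproView: View {
    @State private var text = ""

    var body: some View {
        VStack {
            Spacer()
            // With Reduce Motion + Prefer Cross-Fade Transitions enabled,
            // this stays shifted up by the keyboard height after dismissal.
            TextField("Type here", text: $text)
                .textFieldStyle(.roundedBorder)
                .padding()
        }
    }
}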
In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when a NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver, you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message gets logged.
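Reduced to code, the vended element is essentially this (hypothetical subclass name; the override is the one that never fires):

import AppKit

final class ExtraElement: NSAccessibilityElement {
    override func setAccessibilityFocused(_ focused: Bool) {
        super.setAccessibilityFocused(focused)
        print("extra element focused: \(focused)") // never logged
    }
}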
Good day!
I have a long-term project ported all the way up from old Think C through many versions of Xcode. Its source files are encoded in "Western (Mac OS Roman)".
Some of my error messages have characters outside the straight ASCII character set (e.g., "å"). The editor correctly displays these, but I get plenty of Illegal Character warnings and the messages do not display properly.
I imagine there's a way to have separate files of localized text for internationalized applications, but I am the only end user of this application, and it used to just plain work in earlier Xcode versions. Furthermore, there must be developers throughout Europe who use such characters in string literals, just typing in their native languages, straight off their keyboards.
I was thinking that there must be a Clang setting or something, but have been unable to find it, and an internet search turns up no solution except to cumbersomely escape each individual character. I can't imagine that a French programmer does that every time they want to type "è", "é", or "à"!
Any help? (Disclaimer: I'm an English speaker and only use such characters whimsically, but want to keep them for legacy's sake.)
Thanks....
p.s. using Xcode 15.3, and under Settings->Text Editing->Editing, "Western (Mac OS Roman)" is already selected as the default text encoding with "Convert existing files on save" checked.
Environment: Xcode 16.2
WidgetKit:
Image(uiImage: UIImage(named: "jp_jump")!).resizable().scaledToFit().frame(width: 58, height: 16).padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))
"jp_jump" is a local image in the asset catalog; loading it in the widget crashes.
Crash info:
Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
I want to understand which component types are intended to have an associated hint text, haptic feedback, or earcon associated with it for VoiceOver screen reader users. Is there a list somewhere or a HIG guideline for which transition types should have a sound?
Some transitions in Apple apps generally include different beep sounds, such as
opening a new screen
screen dimming
swiping from the header/navbar into the body
a scraping sound when swiping up or down a page
reaching the beginning or end of the body section
moving from one row to the next in Calculator
opening a pop-up menu
I would also appreciate any direction on what code strings are associated with these sounds and how custom components can capture these sounds or haptics or hints where it is expected? On the other hand, I don't want to get that info and then dictate that every component needs a specific beep type since these sounds appear to be used for specific purposes.
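For what it's worth, the only hooks I've found so far for custom components are the UIAccessibility notification posts, which trigger some of these earcons (a sketch; the element arguments are hypothetical):

import UIKit

func announceChanges(headerView: UIView, updatedElement: UIView) {
    // Plays the screen-change earcon and moves the VoiceOver cursor to the argument.
    UIAccessibility.post(notification: .screenChanged, argument: headerView)

    // Plays the layout-change earcon for smaller in-place updates.
    UIAccessibility.post(notification: .layoutChanged, argument: updatedElement)

    // Speaks a string without moving the cursor.
    UIAccessibility.post(notification: .announcement, argument: "Loading complete")
}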
Hi,
I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:
func makeUIView(context: Context) -> UIView {
    let routePickerView = AVRoutePickerView()

    // Configure the button's color.
    //routePickerView.delegate = context.coordinator
    //routePickerView.backgroundColor = .secondarySystemBackground
    routePickerView.tintColor = .accent
    routePickerView.activeTintColor = .accent

    // Indicate whether your app prefers video content.
    routePickerView.prioritizesVideoDevices = false
    return routePickerView
}
I commented out routePickerView.delegate = context.coordinator because it doesn't compile; context.coordinator is of type Void, and I'm not sure how to fix that or whether it's related to the issue.
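From what I can tell, it would need a makeCoordinator() for that line to compile, something like this (a sketch; I haven't verified it):

import SwiftUI
import AVKit

struct RoutePicker: UIViewRepresentable {
    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> UIView {
        let routePickerView = AVRoutePickerView()
        routePickerView.delegate = context.coordinator // now a Coordinator, not Void
        routePickerView.prioritizesVideoDevices = false
        return routePickerView
    }

    func updateUIView(_ uiView: UIView, context: Context) {}

    final class Coordinator: NSObject, AVRoutePickerViewDelegate {
        func routePickerViewWillBeginPresentingRoutes(_ routePickerView: AVRoutePickerView) {
            // Called when the picker is about to present its routes UI.
        }
    }
}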
Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But in VoiceOver, if I select the button and double-tap, nothing happens… it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show in VoiceOver?
accessibilityUserInputLabels is working fine with any view I tried this on. Meaning that the control can be toggled with the provided alternative names when using Voice Control.
When setting this property on any UIBarButtonItem though, it seems Voice Control ignores the alternative names provided by setting accessibilityUserInputLabels. For comparison, accessibilityLabel works perfectly when set on UIBarButtonItem.
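A minimal comparison (hypothetical titles and labels):

import UIKit

func configure(_ item: UIBarButtonItem) {
    item.accessibilityLabel = "Save document"            // Voice Control honors this
    item.accessibilityUserInputLabels = ["Save", "Keep"] // seemingly ignored on UIBarButtonItem
}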
Is anyone facing the same issue?
Using Xcode 16.0 (16A242) on iOS 18
I’m currently exploring VoiceOver accessibility in iOS and looking for the best way to reduce the number of swipes required to navigate a UITableView. I’ve come across a couple of potential solutions but am unsure which is preferred.
Solution 1: Grouping Subviews in Each Cell
Combine all subviews inside a UITableViewCell into a single accessibility element.
Provide a concise and meaningful accessibilityLabel.
Use custom actions (UIAccessibilityCustomAction) or accessibilityActivationPoint to handle interactions on specific elements within the cell.
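Sketched on a hypothetical cell, Solution 1 would look something like:

import UIKit

final class ContactCell: UITableViewCell {
    func configureAccessibility(name: String, phone: String) {
        // One VoiceOver stop per cell instead of one per subview.
        isAccessibilityElement = true
        accessibilityLabel = "\(name), \(phone)"
        // Inner controls become custom actions rather than separate swipe stops.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Call") { _ in
                // hypothetical handler
                return true
            }
        ]
    }
}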
Solution 2: Using UIAccessibilityContainerDataTableCell & UIAccessibilityContainerDataTable
Implement UIAccessibilityContainerDataTable for structured table navigation.
Make each cell conform to UIAccessibilityContainerDataTableCell, defining its row and column positions.
However, I’m finding this approach a bit complex, and I need guidance on properly implementing these protocols.
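My rough reading of Solution 2, sketched under the assumption of a single fixed grid (which may be exactly where I'm going wrong):

import UIKit

final class GridElement: UIAccessibilityElement, UIAccessibilityContainerDataTableCell {
    var row = 0
    var column = 0
    func accessibilityRowRange() -> NSRange { NSRange(location: row, length: 1) }
    func accessibilityColumnRange() -> NSRange { NSRange(location: column, length: 1) }
}

final class GridView: UIView, UIAccessibilityContainerDataTable {
    var elements: [[GridElement]] = []

    override init(frame: CGRect) {
        super.init(frame: frame)
        accessibilityContainerType = .dataTable // opt the container into table navigation
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func accessibilityDataTableCellElement(forRow row: Int, column: Int) -> UIAccessibilityContainerDataTableCell? {
        guard row < elements.count, column < elements[row].count else { return nil }
        return elements[row][column]
    }
    func accessibilityRowCount() -> Int { elements.count }
    func accessibilityColumnCount() -> Int { elements.first?.count ?? 0 }
}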
Additionally, in my case, VoiceOver is not navigating to Section 2—I’m not sure why.
Questions:
Which of these approaches is generally preferred for better VoiceOver navigation?
How do I properly implement UIAccessibilityContainerDataTable so that all sections and rows are navigable?
Any best practices or alternative recommendations?
Would really appreciate any insights or guidance!
Hope it's okay to post here; I haven't gotten resolution anywhere else. Apple's iOS Live Captions is supposed to transcribe speech into written text, either on the phone (works like a charm!) or via the microphone (think meeting in a conference room). Microphone mode doesn't work anywhere, anytime on a new iPhone 14 purchased in November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
When I am doing a file search, in TextEdit, and on certain websites, the space bar stops functioning as soon as I start typing. If I hold down the Option key, the space bar works as normal. I have checked every setting I can think of and nothing has helped.
Topic: Accessibility & Inclusion, SubTopic: General
Hi everyone,
After installing the macOS beta (Tahoe 26.0) on my MacBook Pro M3, I’ve noticed two issues:
1. Significant increase in system temperature: the laptop feels hot even with light usage like Safari and Figma.
2. Rapid battery drain: the battery drops unusually fast compared to macOS Sonoma.
I've tried restarting the device.
I'm aware this is a beta, but I'm just wondering:
Is anyone else experiencing this?
Is this a known issue?
Would love to hear if others are facing similar problems or if it might be something specific to my setup.
Thanks in advance!
Topic: Accessibility & Inclusion, SubTopic: General
My game app is text-based interactive fiction, containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users.
Despite this, in the Accessibility Nutrition Label, I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would leave deaf players with the wrong impression that they can't enjoy our game. Leaving it checked would imply that we do have A/V content with captions included.
In the WWDC video on this (https://developer.apple.com/videos/play/wwdc2025/224/), the presenter says:
After we completed common tasks, we realized our app doesn’t have any video or audio only content. In this case, we aren’t going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app.
Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
I wrote this in the regular forums and they deleted it and told me to write it here because it was dealing with unreleased software. I read that Launchpad is disappearing in Tahoe and I have real concerns about that. For me, that is an accessibility issue. I have both memory problems and scanning problems. So having my apps organized into categories is extremely important to me. Just today I needed to find an app that I didn't remember the name of and I rarely use, but when I need it, it is important to me. Just to see if I could find it without launchpad, I scanned my applications folder and I couldn't find it. I went to launchpad and to the category I knew it was in and it was right there, easy for me to find. Please don't take away our organization options.
We are unable to programmatically enable AppleScript automation for VoiceOver on macOS 15 (Sequoia)
In macOS 15, Apple moved the VoiceOver configuration from:
~/Library/Preferences/com.apple.VoiceOver4/default.plist
to a sandboxed path:
~/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
Steps to Reproduce:
1. Use a macOS 15 (ARM64) machine (or a GitHub Actions runner image with macOS 15 ARM).
2. Open VoiceOver: open /System/Library/CoreServices/VoiceOver.app
3. Set the SCREnableAppleScript flag to true in the new sandboxed .plist: plutil -replace SCREnableAppleScript -bool true ~/Library/Group\ Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
4. Confirm csrutil status is either disabled or not enforced.
5. Attempt to control VoiceOver via AppleScript (e.g., using osascript voiceOverPerform.applescript).
6. Observe that the AppleScript command fails with no useful output (exit code 1), and VoiceOver does not respond to automation.
Topic: Accessibility & Inclusion, SubTopic: General. Tags: macOS, Accessibility, App Sandbox, AppleScript
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible.
Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
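Reduced to a sketch (hypothetical subviews):

import UIKit

func configureOrder(in parent: UIView, title: UILabel, detail: UILabel, action: UIButton) {
    // With this set, only these three are reachable; the other seven
    // subviews stop being focusable, which matches what I'm seeing.
    parent.accessibilityElements = [title, detail, action]
}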
When my macOS Cocoa app displays a modal alert with beginSheetModal(for:completionHandler:), VoiceOver sometimes seems to focus on an "illegal" upper level, where any attempts at navigation will give the unhelpful response "Alert, dialog", until you "drill down" with VO + shift + down or switch apps. After that, things will work as expected.
Is this a known bug? Does it happen to anybody else, or am I doing something wrong?
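The presentation itself is unremarkable, roughly:

import AppKit

func showAlert(in window: NSWindow) {
    let alert = NSAlert()
    alert.messageText = "Something happened"
    alert.addButton(withTitle: "OK")
    // VoiceOver sometimes lands on an "Alert, dialog" level here and won't
    // navigate until you drill down with VO+Shift+Down or switch apps.
    alert.beginSheetModal(for: window) { _ in }
}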
I watched the videos and blog post, downloaded their projects, and there Core Spotlight works as expected.
I copied the code into an empty project and did the same as they did, but it still isn't working.
OS: macOS and iOS.
In the Core Data model I set an attribute to be indexed for Spotlight, and on the entity I put the attribute name in "Display Name" for Spotlight.
final class PersistenceController {
    static let shared = PersistenceController()

    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType
                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
In my @main App:
@main
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}
The onContinueUserActivity(CSSearchableItemActionType) handler never gets triggered. So what am I missing that they don't explain in the blog post or videos?
After enabling Developer Mode on my iPhone and restarting it, the device asks me to press the Home button to confirm. Unfortunately, my Home button is broken, so I can’t access Developer Mode. The iPhone itself still works, but I can’t enable the mode. Is there any way to bypass this without the Home button?
Topic: Accessibility & Inclusion, SubTopic: General