Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post

Replies

Boosts

Views

Activity

SwiftUI tvOS Accessibility VoiceOver - prevent reading all items in ScrollView over and over
Hi, I'm trying to fix a tvOS view for the VoiceOver accessibility feature:

TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { 200 items }
    }
    ScrollView { // Continue watching
        LazyHStack { 500 items }
    }
}

When the view shows up, VoiceOver reads: "Home tab 1 of 5, Item 2". I'm not sure why it reads Item 2 of the first cell in the scroll view; maybe because it just got loaded by the LazyHStack. VoiceOver should only read "Home tab 1 of 5".

When moving focus to the scroll view it reads "Live, Item 1" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4". When moving focus to the second item it reads "Item 2" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4". The same happens on the third item. It should read only what is focused, ideally just "Live, Item 1, 1 of 200", and after moving focus to item 2 just "Item 2, 2 of 200", this time without the word "Live" because we are still in the same scroll view (the same horizontal list).

Currently the app is unusable. We have visually impaired testers, and this reading of everything on the screen is totally confusing, because users don't know where they are or what is actually focused. This is a video streaming app and we are streaming all the time, even on the home page in the background: binge play runs one item after another, there is usually a never-ending live stream playing, and the user can switch TV channels while we continue to play. VoiceOver should only read what's focused, after user interaction. The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting; it correctly reads only the focused item in scrolling lists.

How do I disable reading content that is not focused? I tried:

.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried at various levels of the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried at various levels of the view hierarchy
.accessibilityElement(children: .combine) - tried at various levels of the view hierarchy
.accessibilityAddTraits(.isHeader) - tried at various levels of the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried at various levels of the view hierarchy (the last two were basically an attempt to hack it)
.accessibilityRotor("", ranges: []) - another hack, tried on the ScrollView, the LazyHStack, and the top-level view

Plus 50+ other attempts at configuring accessibility modifiers attached to views. I have seen all the accessibility videos and tried all the sample code projects; I haven't found a solution anywhere, internet searches turned up nothing, and AI didn't help either. Any idea how to fix this? Thanks.
1
0
163
Apr ’25
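A compilable condensation of the hierarchy described in the post above, with one of the attempted mitigations applied (treating each row as an accessibility container and labeling items individually). The item data and titles are hypothetical; this is a sketch of the attempt, not a confirmed fix.

import SwiftUI

struct LiveRow: View {
    // Hypothetical stand-in for the 200 live items.
    let items = (1...200).map { "Item \($0)" }

    var body: some View {
        ScrollView(.horizontal) {
            LazyHStack {
                ForEach(items, id: \.self) { title in
                    LiveCard(title: title)
                }
            }
        }
        // Attempted mitigation: group the row as one container so
        // VoiceOver tracks children individually instead of merging them.
        .accessibilityElement(children: .contain)
        .accessibilityLabel("Live")
    }
}

struct LiveCard: View {
    let title: String
    @FocusState private var isFocused: Bool

    var body: some View {
        Button(title) { /* play */ }
            .focused($isFocused)
            .accessibilityLabel(title)
    }
}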
Using WebSocket for BCI Click Input in VisionOS - FocusState vs. System-Level Limitations
Hi everyone, My team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with locked-in syndrome using Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.

Problem 1: Simulating eye tracking in the Simulator. We are testing onHover with "Send pointer to the device" under I/O > Input in the simulator, and while it mostly works (a bit laggy), we found that onHover won't function on actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?

Problem 2: A WebSocket-triggered "click" doesn't work outside the app. We successfully use a WebSocket to send a custom signal (a "1" from the brain-signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:

Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
Access system-wide click/tap simulation APIs from within visionOS apps?
Integrate this with accessibility services (like Voice Control or AssistiveTouch)?

We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods in visionOS. Thanks in advance for any insight you can provide!
0
0
221
Apr ’25
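On the FocusState side of the question, a minimal sketch of the in-app pattern the post describes, assuming a hypothetical BCISignal bridge that publishes each "1" received over the WebSocket; only the gaze-focused control reacts. As the post itself suspects, nothing in this pattern can reach outside the app's own process, since visionOS apps have no public API for system-wide event injection.

import Combine
import SwiftUI

// Hypothetical stand-in for the app's WebSocket bridge.
final class BCISignal {
    static let shared = BCISignal()
    let clicks = PassthroughSubject<Void, Never>() // emits on each incoming "1"
}

struct TapTarget: View {
    @FocusState private var isGazeFocused: Bool
    @State private var taps = 0

    var body: some View {
        Button("Select (\(taps) taps)") { taps += 1 }
            .focused($isGazeFocused)
            .onReceive(BCISignal.shared.clicks) { _ in
                // Route the BCI "click" to the control that currently has focus.
                if isGazeFocused { taps += 1 }
            }
    }
}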
SwiftUI List Accessibility VoiceOver
I have been working on a feature with a SwiftUI List that loads both previous and next pages of data; the user can scroll up and down to load them. Recently, I hit an accessibility issue while testing with VoiceOver: when the user lands on the listing screen and swipes through the navigation elements, focus eventually reaches the list and should highlight the first visible item. But when the user swipes back: Should the list load the previous page and announce the previous item, or should focus return to the navigation items? If it loads the previous item, what if the user actually wants to reach the navigation to switch to other actions, and vice versa? Has anyone come across this kind of issue? What is the standard expected behavior when a list supports both previous and next page scrolling? I tried different gestures (https://support.apple.com/en-in/guide/iphone/iph3e2e2281/ios), but it isn't working.
2
0
136
Apr ’25
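One pattern worth trying, sketched under stated assumptions rather than offered as a documented answer: pin VoiceOver focus to the row the user was on when the previous page is prepended, using @AccessibilityFocusState, so focus neither jumps to the new rows nor falls back to the navigation. The data source here is hypothetical.

import SwiftUI

struct PagedList: View {
    @State private var items = (100..<120).map { "Row \($0)" }
    @State private var canLoadPrevious = true
    @AccessibilityFocusState private var focusedRow: String?

    var body: some View {
        List(items, id: \.self) { item in
            Text(item)
                .accessibilityFocused($focusedRow, equals: item)
                .onAppear {
                    if item == items.first, canLoadPrevious {
                        loadPreviousPage(anchor: item)
                    }
                }
        }
    }

    private func loadPreviousPage(anchor: String) {
        // Sketch only: real code would fetch a page and deduplicate.
        canLoadPrevious = false
        items.insert(contentsOf: (80..<100).map { "Row \($0)" }, at: 0)
        // Keep VoiceOver on the row the user was reading.
        focusedRow = anchor
    }
}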
Attaching procedural audio to an ARKit SCNNode
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages (see my Stack Overflow post).

Working implementation with a static audio file:

let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)

Attempted implementation with procedural audio:

// Audio generation code
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn’t work is creating procedural audio with an AVAudioNode, as documented in the Apple docs.

Additionally, I explored the WWDC18 AR game project, SwiftShot, which utilizes SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play. I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
0
0
234
Apr ’25
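For comparison, a self-contained version of the direct-to-engine setup the post reports working, with a 440 Hz sine wave as a stand-in for the real generation code. The open question remains why the same node goes silent when wrapped in SCNAudioPlayer(avAudioNode:).

import AVFoundation

// Stand-in procedural source: a 440 Hz sine wave.
var phase = 0.0
let sampleRate = 44_100.0
let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let value = Float(sin(2.0 * .pi * 440.0 * phase / sampleRate))
        phase += 1
        for buffer in buffers {
            let samples: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
            samples[frame] = value
        }
    }
    return noErr
}

let engine = AVAudioEngine()
engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: nil)
try? engine.start() // audible here; silent when handed to SCNAudioPlayer(avAudioNode:)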
SpaceBar Not functioning as expected
When I am doing a file search, in TextEdit, and on certain websites, the space bar quits functioning as soon as I start typing. If I hold down the Option key, the space bar works as normal. I have checked every setting I can think of and nothing has helped.
3
0
169
Apr ’25
Add VoiceOver touch gesture guidance for frame iframe in webView and Safari web
Please update the Accessibility OS settings for VoiceOver in iPhone iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad Apple User Guides on using VoiceOver for web navigation with touch gestures. Specifically: iframes.

There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS on accessing iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and internet searches, is that iframes in Safari or in a webView in an app are only reachable with explore by touch. If explore by touch is the only option for some interactions, that needs to be stated in the Apple User Guides. If not, the touch gestures equivalent to the keyboard interactions available on Mac need to be clearly documented for users.

VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac), and a user can add a Rotor option for web navigation by frames. VoiceOver for iPhone and iPad includes no default swipe gesture assigned to frames, and no such option is available for the Rotor. While the iPhone User Guide notes that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that the gesture to add, "Move to the next frame", is tucked into the advanced navigation commands in the VoiceOver accessibility settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen having a search bar.
1
0
155
Apr ’25
iPhone screen layers
I need to understand the different layers in the iPhone X and later OLED screens, as I am designing a hardware attachment. They seem to project letters and images from a different layer than the subpixel layer. Is this proprietary information, or is there a resource that explores it?
0
0
130
Apr ’25
What is the appropriate accessibility trait for selectable UITableViewCell?
I’m trying to understand the best practice for assigning accessibilityTraits to a UITableViewCell that users can select from a list of options. In Apple’s first-party apps like Settings, I’ve noticed an inconsistent approach—some cells use the Button trait, while others simply announce the label along with the Selected trait when applicable, without any additional role like Button or Adjustable. So my question is: What is the most appropriate accessibility trait to use for a selectable table view cell that updates a selection (like a settings option)? Is using .button the right approach, or should we rely solely on .selected? Is there any user experience guideline from Apple that recommends one over the other? Would love to hear how others handle this for clarity and consistency in VoiceOver behavior.
1
0
170
Apr ’25
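For what it's worth, a sketch of one common approach (an assumption based on the observed Settings behavior described above, not a documented Apple guideline): expose only the selection state and let the row's role stay implicit.

import UIKit

final class OptionCell: UITableViewCell {
    func configure(title: String, isChosen: Bool) {
        var content = defaultContentConfiguration()
        content.text = title
        contentConfiguration = content
        accessoryType = isChosen ? .checkmark : .none
        // Announce "selected" without adding a Button role.
        accessibilityTraits = isChosen ? [.selected] : []
    }
}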
MAS restrictions on file read-write for desktop electron apps
We have an Electron app developed for Mac. We would like to restore user data previously saved in Downloads when the user installs the app from the Mac App Store and launches it for the first time. But MAS has restrictions around the "com.apple.security.files.downloads.read-write" entitlement. We have enabled user access in the entitlements file and request user permission before access. What options can be used to automatically restore the data from Downloads?
0
0
100
Apr ’25
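In native Swift terms (an Electron app would reach this through its own file-dialog plumbing), persisting access the user has granted once usually means a security-scoped bookmark. A sketch, assuming the com.apple.security.files.bookmarks.app-scope entitlement is present and the user has already granted access, e.g. via an open panel:

import Foundation

func saveBookmark(for url: URL) throws -> Data {
    // Persist user-granted access across launches.
    try url.bookmarkData(options: .withSecurityScope,
                         includingResourceValuesForKeys: nil,
                         relativeTo: nil)
}

func restoreAccess(from bookmark: Data) throws -> URL {
    var isStale = false
    let url = try URL(resolvingBookmarkData: bookmark,
                      options: .withSecurityScope,
                      relativeTo: nil,
                      bookmarkDataIsStale: &isStale)
    // Pair with stopAccessingSecurityScopedResource() when done.
    _ = url.startAccessingSecurityScopedResource()
    return url
}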
Defining boundaries of inline dialogs for VO users
Hello, I had submitted a question asking which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182. That question stems from a more direct one about specific components: do tablists and disclosures natively include haptics, screen reader hints, or other state or properties that indicate to screen reader users where the component begins or ends? In some web experiences there is screen reader hint text stating "end of..." or "entering" as a way to define the boundaries of these inline dialogs. I asked about haptics in the prior thread because I do not recall a natively implemented version of this aside from some haptic cues, and I have not experienced those consistently, so I am not sure whether they are an intended native Swift implementation or something custom.
0
0
141
May ’25
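No native equivalent appears to be documented. The closest custom approximation of the web pattern quoted above, offered purely as an assumption rather than an Apple-endorsed convention, is to hint the first and last children of the group:

import SwiftUI

struct DisclosureBody: View {
    var body: some View {
        VStack(alignment: .leading) {
            Text("Shipping details")
                .accessibilityHint("Entering shipping details group")
            Text("Arrives Thursday")
            Text("Free returns")
                .accessibilityHint("End of shipping details group")
        }
        // Keep children individually focusable while grouping them.
        .accessibilityElement(children: .contain)
    }
}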
False 3.1.1 Rejection: Real-World Dues Payments App
Hello everyone,

Our community dues payment app only facilitates real-world maintenance-dues payments directly to property managers’ bank accounts. However, during testing it was likely flagged by the AI-driven review system on a metadata criterion and rejected under Guideline 3.1.1 (“Paid digital content must use IAP”). Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:

The app is completely free.
No digital content or subscriptions are sold.
Dues payments are made via bank transfer or credit card directly to the manager.

Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process? Thanks!
0
0
122
May ’25
AXChildren does not get all children
I'd like to add borders to all buttons in the iOS Simulator from my Mac app. First I get the simulator window. Then I access the children of all AXGroups, and if a child is a button or static text, I add a border. But for some buttons this does not work: in my example image, the navigation bar buttons are not found. I suspect the problem is that for some AXGroups the children array accessed via AXChildren is empty.

Here is the relevant code:

- (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
    NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

    NSLog(@">>> -----------------------------------------------------");
    NSString *role = [UIElementUtilities roleOfUIElement:element];
    NSRect frame = [UIElementUtilities frameOfUIElement:element];
    NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
    NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
    NSLog(@"lineage: %@", lineage);

    NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
    if (children.count < 1) {
        NSLog(@"NO CHILDREN");
    }
    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"----%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
    }
    NSLog(@"<<< -----------------------------------------------------");

    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);

        if ([role isEqualToString:@"AXButton"] || [role isEqualToString:@"AXTextField"] || [role isEqualToString:@"AXStaticText"]) {
            NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
            NSLog(@"tag: %@", tag);
            DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
            [tempOverlayElements addObject:overlayElement];
        } else if ([role isEqualToString:@"AXGroup"] || [role isEqualToString:@"AXToolbar"]) {
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
        } else if ([role isEqualToString:@"AXWindow"]) {
            [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
        }
    }
    return [tempOverlayElements copy];
}

For some AXGroups the children are found; for others the array is empty, and I cannot figure out why. Does anyone have an idea what I'm doing wrong?
2
0
163
May ’25
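One thing worth checking before concluding that the groups are truly empty: AXUIElementCopyAttributeValue reports failures through its return code, and an ignored error is indistinguishable from an empty children array. A sketch of the check, in Swift for brevity (the helper name is hypothetical; the same logic applies in Objective-C):

import ApplicationServices

func axChildren(of element: AXUIElement) -> [AXUIElement] {
    var value: CFTypeRef?
    let result = AXUIElementCopyAttributeValue(element,
                                               kAXChildrenAttribute as CFString,
                                               &value)
    guard result == .success, let children = value as? [AXUIElement] else {
        // Distinguish "no children" from a failed copy
        // (.noValue, .cannotComplete, .apiDisabled, ...).
        print("AXChildren lookup failed: \(result.rawValue)")
        return []
    }
    return children
}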
Making PhotoLibrary UIImagePickerController a11y compliant
I am invoking a UIImagePickerController with source type UIImagePickerControllerSourceTypePhotoLibrary from my view controller. I want to shift the keyboard focus to the Cancel button, which is the first interactive element on the gallery picker. When a user has Full Keyboard Access turned on, they should be able to press Tab and interact with the gallery picker modal. How do I achieve this?
1
0
156
May ’25
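One lever to try, as a sketch rather than a verified solution: post a screen-change accessibility notification after presenting, which moves assistive-technology focus to the supplied element. Whether Full Keyboard Access honors it is worth testing; the picker's root view is used as an assumed anchor since the Cancel button itself is system-owned.

import UIKit

func presentPhotoPicker(from host: UIViewController) {
    let picker = UIImagePickerController()
    picker.sourceType = .photoLibrary
    host.present(picker, animated: true) {
        // Nudge assistive-technology focus toward the picker UI.
        UIAccessibility.post(notification: .screenChanged,
                             argument: picker.view)
    }
}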
Apple greets Global Accessibility Awareness Day with severe accessibility violations on macOS
I'm reposting here my FB17602742, submitted yesterday:

The strong wording of this message comes from years of Apple ignoring the needs of users who can't tolerate UI animations and convulsions. At this point, it's clear that Apple is either intentionally harming users like me or simply doesn't care about meeting even the most basic accessibility standards on macOS.

Yes, many UI animations and convulsions can, fortunately, be disabled, but not through straightforward UI controls. Instead, users are forced to look for obscure Terminal commands scattered across the Internet. The "Reduce motion" checkbox in System Settings is simply a fake control that doesn't actually disable all UI animations and convulsions. What's worse, two of the most offensive UI animations cannot be disabled at all. Apple has consistently dismissed requests to let users disable the following UI animations:

1. Scroll bar rollover highlight effect (introduced in macOS 10.7.3). Every time the cursor passes over a scroll bar, it gets highlighted. This draws the user's attention to random scroll bars for no reason, just because the cursor happened to pass over them. It results in HUNDREDS of unnecessary, annoying events of distraction daily!

2. Expand/collapse animation of NSOutlineView (e.g., when opening/closing folders in the list view in the Finder, or in any other app using outline views). This animation is extremely distracting, irritating, and time-wasting.

Global Accessibility Awareness Day is approaching. Dear Apple: please adhere to the most basic accessibility standards. Stop the needless suffering of countless users like me. Let us disable the two aforementioned UI convulsions. Thank you for your attention to the issue.
0
0
166
May ’25
Clarification on Color Path Determination in Wallet Provisioning (Green,Yellow, Orange) Path recommendation
Hi, I’ve been reviewing the Apple Wallet provisioning documentation (Getting Started with Apple Pay In-App Provisioning: Verification, Security, Wallet Extensions) and had a few questions regarding the color path recommendation (Green, Yellow, Orange, Red) returned during the in-app provisioning flow:

Who determines the color path: Apple directly, the Payment Network Operator (PNO), or both?
What criteria are used to determine the color path (e.g., device info, Apple ID reputation, past provisioning attempts)?
At what point in the provisioning flow is the color path recommendation received? Is it included in the response after the PKAddPaymentPassRequest is submitted? Is it accessible through any specific property or callback in the delegate method?

Additionally, for the Orange Path with Reason Code 0G, I understand that in-app verification is not allowed and must be handled via tenured channels (e.g., SMS/email). Can you confirm whether this logic still applies for requests initiated from within the issuer's iOS app?

Would appreciate any clarification or pointers to related documentation.
0
0
162
May ’25
VoiceOver is not respecting lang in HTML option
I have an HTML select whose options contain Spanish text. When VoiceOver reads the selected option (unopened), it switches to Spanish as expected. However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text. I have tried adding lang to both the select tag and the option tag, but neither helps: https://codepen.io/grahamfowles/pen/VYYRxMK
0
0
142
May ’25
FamilyControls API access
I’m requesting access to the Family Controls API for an iOS app currently in development. I’ve submitted the request through the official form here: https://developer.apple.com/contact/request/family-controls-distribution However, after submitting, I receive no confirmation email or support ticket ID. The page only shows a “Thank you for requesting the API” message, and I’m left without a way to track or confirm the request. This entitlement is essential for my app’s functionality, and I need to move forward with development and testing. Can someone from the Apple team please confirm receipt of the request and provide guidance on the next steps or estimated timelines?
0
0
380
May ’25
UIAlertViewController and phone number read back during VoiceOver
The issue described in this Stack Overflow conversation still occurs today: for North American phone numbers, VoiceOver reads the hyphen before the last four digits as "minus". Is there a solution other than overriding the accessibilityLabel property?
2
0
158
Apr ’25
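A sketch of the accessibilityLabel-level workaround the post asks about alternatives to: provide a spoken form where the digits are spelled out, so the hyphen is not voiced as "minus". The phone number shown is a placeholder.

import UIKit

let label = UILabel()
label.text = "(555) 123-4567"

// Speak digits individually instead of reading "123-4567" as arithmetic.
let spoken = NSMutableAttributedString(string: "555 123 4567")
spoken.addAttribute(.accessibilitySpeechSpellOut,
                    value: true,
                    range: NSRange(location: 0, length: spoken.length))
label.accessibilityAttributedLabel = spoken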
VoiceOver TextField doesn't read out all punctuation
I have a TextField containing, for example, "sg?!". I set the speechAlwaysIncludesPunctuation() modifier on the TextField, but when VoiceOver reads the field's content, the special characters are not spoken. How can I fix this?
1
0
104
Apr ’25
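A minimal reproduction sketch of the setup described, assuming iOS 15 or later for the modifier; whether VoiceOver honors it for a text field's typed content is exactly what the post is asking.

import SwiftUI

struct PunctuationField: View {
    @State private var text = "sg?!"

    var body: some View {
        TextField("Input", text: $text)
            // Request that punctuation always be spoken.
            .speechAlwaysIncludesPunctuation()
    }
}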
VisionPro - Dwell control setting
I remember that Vision Pro's Dwell Control could previously be set to 0.1 seconds, but now it can't. Is there a way to adjust it?
1
0
175
Apr ’25