When using iOS VoiceOver to navigate a webpage, selecting a button element correctly activates the :focus-visible state. However, when VoiceOver moves to a non-button element, the previously focused button retains its :focus-visible state. The focus indicator only updates when VoiceOver moves to another button.
This behavior can be confusing for screen reader users, as it creates the appearance of multiple elements being focused simultaneously. It also differs from expected keyboard navigation behavior, where focus styles typically update as soon as the user moves to a new interactive element.
Is this an intentional VoiceOver behavior, or could this be a bug? If intentional, is there a recommended workaround to ensure correct focus indication when moving between different types of elements?
Steps to Reproduce:
Enable VoiceOver on an iOS device.
Navigate using swipe gestures or explore-by-touch to focus on a button.
Observe that the button correctly receives the :focus-visible styling.
Move to a non-button element (e.g., one with tabindex="0").
Notice that the button still retains its :focus-visible state, even though VoiceOver has moved to a new element.
Expected Behavior:
The previously focused button should lose its :focus-visible state when VoiceOver moves to a different interactive element, just as it does when using keyboard navigation.
Actual Behavior:
The :focus-visible state remains on the previously focused button unless VoiceOver moves to another button. This can create confusion by displaying multiple focus indicators at once.
Tested On:
iOS 17.7, 18.3.1
iOS Safari
iPhone 11 Pro, iPhone 14 Pro Max
I am encountering the following issue while working with app group preferences in my Safari web extension:
Couldn't read values in CFPrefsPlistSource<0x3034e7f80> (Domain: [MyAppGroup], User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd.
I am trying to read/write shared preferences using UserDefaults with an App Group but keep running into this error. Any guidance on how to resolve this would be greatly appreciated!
Has anyone encountered this before? How can I properly configure my app group preferences to avoid this issue?
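For reference, the standard way to read and write shared preferences through an App Group is UserDefaults(suiteName:); a minimal sketch (the group identifier and key below are placeholders, not my real ones):

import Foundation

// The suite name must exactly match an App Group enabled for both the app
// and the extension targets; "group.com.example.myapp" is a placeholder.
let groupID = "group.com.example.myapp"

if let sharedDefaults = UserDefaults(suiteName: groupID) {
    // Write from one side...
    sharedDefaults.set(true, forKey: "hasCompletedOnboarding")

    // ...and read from the other.
    print(sharedDefaults.bool(forKey: "hasCompletedOnboarding"))
} else {
    // A nil suite usually means the identifier doesn't match the entitlement.
    print("Could not open shared defaults for \(groupID)")
}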
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Swift Packages
Messages
Xcode
Group Activities
I have a Twitter account that I registered with my Apple ID. I still don't know the PIN, and I'm having a problem because I can't find out what it is. I need help.
privaterelay.appleid.com
Topic:
Accessibility & Inclusion
SubTopic:
General
I'd like to add borders to all buttons in the iOS simulator from my Mac app. First I get the simulator window. Then I access the children of every AXGroup, and if a child is a button or a static text, I add a border.
But for some buttons this does not work. In the example image, the navigation bar buttons are not found. I suspect the problem is that for some AXGroups, the children array accessed via AXChildren is empty.
Here is some relevant code:
- (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
  NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

  // Log the element itself, its lineage and its children for debugging.
  NSLog(@">>> -----------------------------------------------------");
  NSString *role = [UIElementUtilities roleOfUIElement:element];
  NSRect frame = [UIElementUtilities frameOfUIElement:element];
  NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
  NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
  NSLog(@"lineage: %@", lineage);
  NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
  if (children.count < 1) {
    NSLog(@"NO CHILDREN");
  }
  for (NSInteger i = 0; i < [children count]; i++) {
    NSValue *child = children[i];
    AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
    NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
    NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
    NSLog(@"----%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
  }
  NSLog(@"<<< -----------------------------------------------------");

  // Walk the children: collect buttons, text fields and static text,
  // and recurse into groups, toolbars and windows.
  for (NSInteger i = 0; i < [children count]; i++) {
    NSValue *child = children[i];
    AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
    NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
    NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
    NSLog(@"%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
    if ([role isEqualToString:@"AXButton"] ||
        [role isEqualToString:@"AXTextField"] ||
        [role isEqualToString:@"AXStaticText"]) {
      NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
      NSLog(@"tag: %@", tag);
      DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
      [tempOverlayElements addObject:overlayElement];
    } else if ([role isEqualToString:@"AXGroup"] ||
               [role isEqualToString:@"AXToolbar"]) {
      [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
    } else if ([role isEqualToString:@"AXWindow"]) {
      [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
      [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
    }
  }
  return [tempOverlayElements copy];
}
For some AXGroups the children are found; for others the array is empty. I cannot figure out why.
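One thing I can check directly is whether the AXChildren query itself fails rather than genuinely returning zero children; a small sketch of that check (written in Swift here just for illustration):

import ApplicationServices

// Ask the Accessibility API for the AXChildren count directly and log the
// AXError, to distinguish "really has no children" from "the query failed".
func logChildCount(of element: AXUIElement) {
    var count: CFIndex = 0
    let error = AXUIElementGetAttributeValueCount(element,
                                                  kAXChildrenAttribute as CFString,
                                                  &count)
    if error == .success {
        print("AXChildren count: \(count)")
    } else {
        print("AXChildren query failed with AXError \(error.rawValue)")
    }
}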
Does anyone have an idea what I'm doing wrong?
Topic:
Accessibility & Inclusion
SubTopic:
General
In our iOS app we use NEAppPushSession and, on receiving a notification through NEAppPushDelegate, we show the CallKit incoming-call screen. We are running into the following problem.
August 13:
The following error appeared repeatedly in the logs.
I ran the notification-reception test about 120 times, and this error kept appearing the whole time.
Error 2025-08-14 11:27:06.793073 +0900 nesessionmanager NESMAppPushSession[SimplePushDefaultConfiguration:7B7218F3-94B5-4AE5-9B9E-94E176694D02] failed to report incoming call to CallKit, error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process.}
After this error log had appeared repeatedly, the CallKit incoming-call screen stopped being shown.
However, notification monitoring still appears to have started.
15 hours later, on August 14:
Perhaps because some time had passed, notifications started arriving again.
But when I ran the reception test again, the same error log appeared.
After roughly another 120 test notifications, the error log again appeared repeatedly and the CallKit incoming-call screen stopped being shown.
Yet, once again, notification monitoring still appears to have started.
15 hours later, on August 15:
Today, even after waiting 15 hours, I could not receive any notifications.
But, as before, notification monitoring appears to have started.
Restarting the iPhone, cleaning the Xcode build, and uninstalling and reinstalling the app did not bring the notifications back.
Also, strangely, devices other than the one where the problem occurred can no longer receive the notifications either.
Other notifications are received fine; only our own notifications delivered through NEAppPushManager cannot be received.
My questions:
What should I do to get the notifications delivered again?
Is this a failure caused by producing the 4099 error too many times?
What is causing the 4099 error in the first place?
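For reference, the receive-and-report flow corresponds roughly to the standard Local Push Connectivity pattern below (a simplified sketch; the payload key "caller" and the provider configuration are placeholders, not our actual implementation):

import NetworkExtension
import CallKit

// Simplified sketch: the app receives the incoming-call payload from the
// NEAppPushProvider extension and reports it to CallKit.
final class PushCallHandler: NSObject, NEAppPushDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func appPushManager(_ manager: NEAppPushManager,
                        didReceiveIncomingCallWithUserInfo userInfo: [AnyHashable: Any]) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic,
                                       value: userInfo["caller"] as? String ?? "Unknown")
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error = error {
                print("Failed to report incoming call: \(error)")
            }
        }
    }
}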
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
User Notifications
PushKit
CallKit
Push To Talk
I'm reposting here my FB17602742, submitted yesterday:
The strong wording of this message comes from years of Apple ignoring the needs of users who can't tolerate UI animations and convulsions.
At this point, it's clear that Apple is either intentionally harming users like me or simply doesn't care about meeting even the most basic accessibility standards on macOS.
Yes, many UI animations and convulsions can, fortunately, be disabled - but not through straightforward UI controls. Instead, users are forced to look for obscure Terminal commands found scattered across the Internet.
The "Reduce motion" checkbox in System Settings is simply a fake control that doesn't do anything - instead of actually disabling all UI animations and convulsions.
What's worse, two of the most offensive UI animations cannot be disabled at all. Apple has consistently dismissed requests to let users disable the following UI animations:
Scroll bar rollover highlight effect (introduced on macOS 10.7.3). Every time the cursor passes over a scroll bar, it gets highlighted. This draws the user's attention to random scroll bars for no reason - just because the cursor happened to pass over them. It results in HUNDREDS of unnecessary, annoying events of distraction daily!
Expand/collapse animation of NSOutlineView (e.g., when opening/closing folders in the list view in the Finder, or any other app using outline views). This animation is extremely distracting, irritating, and time-wasting.
Global Accessibility Awareness Day is approaching.
Dear Apple,
Please adhere to the most basic accessibility standards. Stop the needless suffering of countless users like me. Let us disable the two aforementioned UI convulsions.
Thank you for your attention to the issue.
Hello,
When I listen to a title in my app with VoiceOver, it makes a strange sound.
The title's characters are a mix of Korean, numbers, and alphabetic characters.
Does this combination cause VoiceOver to produce a strange sound?
I would like to ask if Apple can fix this issue.
Thank you.
Topic:
Accessibility & Inclusion
SubTopic:
General
It appears iOS only comes with low-quality voices installed.
iOS requires the user to go into Settings to download higher-quality voices to be used with AVSpeechUtterance.
There doesn't seem to be any API that can be used to make this process easier for the app user.
Is there a way or API that would allow an app to download and use a higher-quality voice?
Will Apple ever install higher-quality voices by default?
We really want to use the text-to-speech API in iOS, but the amount of user friction involved in getting high-quality voices is stopping us. I would appreciate a response.
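For context, the closest workaround we have found is to enumerate the voices that are already installed and prefer an enhanced or premium one when present; a minimal sketch (nothing here can trigger a download, and .premium requires iOS 16 or later):

import AVFoundation

// Pick the best already-installed voice for a language; downloading better
// voices still has to be done by the user in Settings.
func bestInstalledVoice(for language: String = "en-US") -> AVSpeechSynthesisVoice? {
    let voices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == language }
    return voices.first { $0.quality == .premium }     // iOS 16+
        ?? voices.first { $0.quality == .enhanced }
        ?? voices.first
}

let synthesizer = AVSpeechSynthesizer()   // keep a strong reference in real code
let utterance = AVSpeechUtterance(string: "Hello, world.")
utterance.voice = bestInstalledVoice()
synthesizer.speak(utterance)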
Thanks
VoiceOver is not treating the navigation bar's left button as the first focused element.
If we navigate from A to B, focus goes to the first element inside the B view, not to the back button or B's navigation title.
If we post an accessibility notification in B's onAppear, focus does not shift; VoiceOver reads the back button first and then reads the B view's content item, but it never actually focuses the back button in SwiftUI.
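Roughly what we tried looks like this (a simplified sketch; the view and its content are placeholders):

import SwiftUI
import UIKit

struct BView: View {
    var body: some View {
        List(0..<10, id: \.self) { index in
            Text("Item \(index)")
        }
        .navigationTitle("B")
        .onAppear {
            // Attempt to move VoiceOver focus when B appears; in practice focus
            // still lands on the first list item rather than the navigation bar.
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }
    }
}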
What should I do if I want focus to land on the navigation bar's back button or title?
My understanding is that the system prioritizes the first focusable element in the view hierarchy, while the navigation bar (including the close button and title) is managed separately by the system. It is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If that is right, it seems a little strange.
Why did you design it this way? Can you tell me your thinking?
Thanks
I have an HTML select that has Spanish text in the options.
When VoiceOver reads the selected option (unopened), it switches to Spanish as expected.
However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text.
I have tried adding the lang attribute to the select tag and to the option tags, but neither helps.
https://codepen.io/grahamfowles/pen/VYYRxMK
Topic:
Accessibility & Inclusion
SubTopic:
General
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but it appears this feature is not enabled.
Topic:
Accessibility & Inclusion
SubTopic:
General
I’m requesting access to the Family Controls API for an iOS app currently in development. I’ve submitted the request through the official form here:
https://developer.apple.com/contact/request/family-controls-distribution
However, after submitting, I receive no confirmation email or support ticket ID. The page only shows a “Thank you for requesting the API” message, and I’m left without a way to track or confirm the request.
This entitlement is essential for my app’s functionality, and I need to move forward with development and testing. Can someone from the Apple team please confirm receipt of the request and provide guidance on the next steps or estimated timelines?
The Text view seems to automatically prevent orphaned words on screen, barring exceptional circumstances such as the font size being too large.
I couldn't find any documentation on this behaviour or how to configure it, and I would also be very interested in how it's implemented.
Thanks!
Topic:
Accessibility & Inclusion
SubTopic:
General
I'm developing a document editor for macOS using AppKit, which supports structured content such as titles and multiple heading levels—similar to what you see in the Pages app.
I'm looking for a way to programmatically mark a specific substring within an NSTextView as a heading, so that VoiceOver can recognize it and announce it appropriately (e.g., by saying “heading” before reading the text). This would be similar in spirit to how NSAccessibilityLinkTextAttribute works for links.
Is there an existing accessibility text attribute or recommended approach to achieve this behavior for headings? If not, I’d appreciate any guidance or suggestions on how best to implement this in a VoiceOver-friendly way.
Thank you in advance for your help!
Best regards,
Topic:
Accessibility & Inclusion
SubTopic:
General
After the 26.3 beta update, my mouse pointer has been having major problems with transparency. I have to keep going into Display settings to reset the colors, but it doesn't hold. Anyone else?
Topic:
Accessibility & Inclusion
SubTopic:
General
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:
ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
    Button {
        navigationCallback(paymentTool.segueID)
    } label: {
        PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
            .accessibilityElement()
            .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
    }
    if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
        Divider()
    }
}
So you can see the accessibility ID is there, and it shows up properly when I open up Accessibility Inspector with the simulator, but the testing script isn't picking up on it, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one.
If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
Is there any way to get history?
Topic:
Accessibility & Inclusion
SubTopic:
General
I would like to ask what the new trackpad shortcuts are in the latest iPadOS. The three-finger swipe to the right and left seems to have been removed. Thank you.
I need to understand the different layers that are there in the iPhone X and later OLED screens as I am designing a hardware attachment. They seem to be projecting letters and images from a different layer than the subpixel layer. Is this proprietary information, or is there a resource that explores them?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello,
In our app we provide a button that initiates a phone call using tel://.
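For reference, the dialing code itself is the standard URL-based call, roughly like this (a sketch; the number passed in is a placeholder):

import UIKit

// Simplified sketch of the dial action; for normal numbers the system then
// shows the Call / Cancel (and, with RTT enabled, RTT Call) sheet.
func dial(_ number: String) {
    guard let url = URL(string: "tel://\(number)") else { return }
    UIApplication.shared.open(url)
}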
For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel.
If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call.
However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT.
Is this the expected system behavior for emergency numbers on iOS?
And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option?
Thanks in advance for clarifying.