Hi!
I'm working on an application where I'd like VoiceOver to give each element of a tab bar the "Tab" trait. I'm testing this using the Accessibility Inspector. Essentially, I'd like to replicate the behavior of how Safari identifies each of its tabs as a "Tab" (I've attached a photo below).
How exactly is this accomplished? I've tried using the .isTabBar trait to designate the child objects as "Tabs", but this doesn't seem to be working and I've struggled to find documentation about this. For additional context, these child items are Buttons, and I would like to have the .isButton trait essentially replaced by something like an .isTab trait. Not sure if this is actually possible or not, but curious how the Accessibility Inspector recognizes this in Safari.
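For reference, this is roughly what I'm trying at the moment (a sketch; tabs, its title property, and select(_:) are placeholders for my real tab bar):
// The container gets .isTabBar, but Accessibility Inspector still reports the
// children as plain Buttons rather than as "Tab".
HStack {
    ForEach(tabs) { tab in
        Button(tab.title) { select(tab) }
    }
}
.accessibilityElement(children: .contain)
.accessibilityAddTraits(.isTabBar)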
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear a) how to get one's app in this list, and b) if the usual methods from iOS to react to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
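For context, here's a minimal sketch of the iOS-style approach I'd normally reach for; whether the macOS per-app text size setting feeds the same mechanism is exactly what I'm unsure about:
import SwiftUI

struct ContentView: View {
    // On iOS this tracks the user's Dynamic Type setting; I don't know
    // whether the macOS "text size" setting drives it too.
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize

    var body: some View {
        Text("Hello")
            .font(.body) // semantic fonts scale with Dynamic Type on iOS
    }
}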
When VoiceOver reads decimal numbers with six or more digits after the decimal, it stops announcing the decimal separator and also adds pauses between each digit.
Text("0.12345") // VoiceOver: "zero **point** one two three four five"
Text("0.123456") // VoiceOver: "zero one, two, three, four, five, six"
How can I force VoiceOver to announce the decimal separator ("point") and not insert pauses regardless of the number of decimal digits?
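One workaround I can think of (a sketch only, not locale-aware, and not a fix for the underlying behavior) is to supply an explicit accessibility label instead of letting VoiceOver interpret the raw string:
import SwiftUI

struct DecimalText: View {
    let value = "0.123456"

    var body: some View {
        Text(value)
            // Spoken as "0 point 1 2 3 4 5 6": "point" is forced, though the
            // exact prosody is still up to VoiceOver.
            .accessibilityLabel(
                value.map { $0 == "." ? "point" : String($0) }
                     .joined(separator: " ")
            )
    }
}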
Hi,
On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor).
In a VStack, you just have to add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work while the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.
There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following.
.accessibilityRotor(.headings) {
    ForEach(entries) { entry in
        // entry.id must be the same as the id of the SwiftUI view it is about
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}
It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you arrive on the screen. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears. You have to move the accessibility focus inside the screen before your headers show up.
I'm a beginner in regards to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor would mean some users would miss them.
So my question is: is there a way to get headers inside a LazyVStack (which are not necessarily visible at first) into the headings rotor as soon as the screen appears, be it using .accessibilityRotor(.headings) or anything else?
The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).
Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings), or to declare .accessibilityRotor(.headings) in several places? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.
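To make that concrete, the structure I have in mind looks roughly like this (MyHeader, EntryView, and the data arrays are placeholders):
ScrollView {
    VStack {
        MyHeader()
            .accessibilityAddTraits(.isHeader) // always visible, works as-is

        LazyVStack {
            ForEach(sectionOneEntries) { entry in
                EntryView(entry: entry)
            }
        }
        // Declared next to the LazyVStack it describes, rather than once at
        // the top of the screen.
        .accessibilityRotor(.headings) {
            ForEach(sectionOneHeadings) { heading in
                AccessibilityRotorEntry(heading.name, id: heading.id)
            }
        }

        LazyVStack {
            ForEach(sectionTwoEntries) { entry in
                EntryView(entry: entry)
            }
        }
        .accessibilityRotor(.headings) {
            ForEach(sectionTwoHeadings) { heading in
                AccessibilityRotorEntry(heading.name, id: heading.id)
            }
        }
    }
}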
Thanks
I have subscribed to the developer program, but it’s already been a day and it still shows “is not enrolled in the Apple Developer Program.”
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi,
I am setting an accessibilityLabel and accessibilityHint property of a UIAlertAction. However, VoiceOver is only reading the label out. Usually, the label is read out, followed by a short pause and then the hint. Is this a known issue, where hints do not work for this element? I can append the hint to the label, but interested to know if there's something I'm doing wrong.
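For reference, this is roughly what I'm doing (a sketch; the alert, titles, and strings are placeholders):
let alert = UIAlertController(title: "Delete item?", message: nil, preferredStyle: .alert)
let delete = UIAlertAction(title: "Delete", style: .destructive) { _ in /* ... */ }

// Both properties are set, but VoiceOver only ever reads the label; the hint
// never follows after the usual pause.
delete.accessibilityLabel = "Delete item"
delete.accessibilityHint = "Removes the item from your list"

alert.addAction(delete)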
Regards.
I’m trying to customize the keyboard focus appearance in SwiftUI.
In UIKit (see WWDC 2021 session Focus on iPad keyboard navigation), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not.
In SwiftUI I’ve tried the following:
.focusable() // .focusable(true, interactions: .activate)
.focusEffectDisabled()
.focused($isFocused)
However, I’m running into several issues:
.focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
.focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
Using @FocusState prevents Space from triggering the action when the view has keyboard focus
My main questions:
How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?
Any guidance or best practices would be greatly appreciated!
Here's my sample code:
import SwiftUI

struct KeyboardFocusExample: View {
    var body: some View {
        // The ScrollView is required, otherwise the custom focus value resets to false after a few seconds. I also need it for my actual use case
        ScrollView {
            VStack {
                Text("First button")
                    .keyboardFocus()
                    .button {
                        print("First button tapped")
                    }

                Text("Second button")
                    .keyboardFocus()
                    .button {
                        print("Second button tapped")
                    }
            }
        }
    }
}

// MARK: - Focus Modifier

struct KeyboardFocusModifier: ViewModifier {
    @FocusState private var isFocused: Bool

    func body(content: Content) -> some View {
        content
            .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
            // .focusable(true, interactions: .activate) // ⚠️ This causes an infinite loop, so keyboard navigation no longer responds
            .focusEffectDisabled() // ⚠️ Has no effect on iOS
            .focused($isFocused)
            // Custom Halo effect
            .padding(4)
            .overlay(
                RoundedRectangle(cornerRadius: 18)
                    .strokeBorder(
                        isFocused ? .red : .clear,
                        lineWidth: 2
                    )
            )
            .padding(-4)
    }
}

extension View {
    public func keyboardFocus() -> some View {
        modifier(KeyboardFocusModifier())
    }
}

// MARK: - Button Modifier

/// ⚠️ Using a Button view makes no difference
struct ButtonModifier: ViewModifier {
    let action: () -> Void

    func body(content: Content) -> some View {
        content
            .contentShape(Rectangle())
            .onTapGesture {
                action()
            }
            .accessibilityAction {
                action()
            }
            .accessibilityAddTraits(.isButton)
            .accessibilityElement(children: .combine)
            .accessibilityRespondsToUserInteraction()
    }
}

extension View {
    public func button(action: @escaping () -> Void) -> some View {
        modifier(ButtonModifier(action: action))
    }
}
I have a product for designing particle emitters, which I suspect may be of limited interest to people with limited vision.
I'd still like to ensure I'm doing a good job with VoiceOver mode.
There's a related, simplified sample online, if you want to look at the code
As you can see from the picture below, a large part of the interface mimics Xcode's particle editor, with many value entry controls that combine up/down buttons with a tappable label. Tapping the label goes into edit mode.
Apart from changing how labels are stepped through in VoiceOver in my app, how should I handle these stepper buttons? Is this a good place to use a Custom Rotor?
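One pattern I've been considering (sketched in SwiftUI for brevity; I'm not sure it's the right fit here) is to collapse each up/down-plus-label control into a single adjustable element, so VoiceOver users can swipe up or down to change the value. "Birth rate" and the step size are placeholders from my emitter model:
import SwiftUI

struct BirthRateControl: View {
    @State private var birthRate: Double = 100

    var body: some View {
        HStack {
            Button("-") { birthRate -= 1 }
            Text("Birth Rate: \(Int(birthRate))")
            Button("+") { birthRate += 1 }
        }
        // Merge the three children into one element that VoiceOver treats
        // like a slider/stepper.
        .accessibilityElement(children: .ignore)
        .accessibilityLabel("Birth rate")
        .accessibilityValue("\(Int(birthRate))")
        .accessibilityAdjustableAction { direction in
            switch direction {
            case .increment: birthRate += 1
            case .decrement: birthRate -= 1
            @unknown default: break
            }
        }
    }
}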
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi, I am not able to renew my Apple Developer Program membership. I do not see any renew button: not in the app, not on the website, not on my iPhone. :(
What should I do?
Topic:
Accessibility & Inclusion
SubTopic:
General
C:\Users\xjc>openssl s_client -connect gateway.push.apple.com:2195 -showcerts
Connecting to 17.188.183.32
CONNECTED(000000AC)
depth=1 C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
verify return:1
B0640000:error:0A000410:SSL routines:ssl3_read_bytes:ssl/tls alert handshake failure:ssl\record\rec_layer_s3.c:908:SSL alert number 40
Certificate chain
0 s:C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
i:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256
v:NotBefore: Aug 16 21:34:09 2024 GMT; NotAfter: Aug 15 21:34:07 2025 GMT
-----BEGIN CERTIFICATE-----
MIIGqDCCBZCgAwIBAgIQCUjuxVwL1mhSlrjSSk/+BzANBgkqhkiG9w0BAQsFADCB
WnKd+td/wZ6Ej6EB
mDF8JCSKz/ck+NnLfGM0jFdcTCl8dKuqM9XetP4ls1sVyUuLM7sJiQvMVDzluZ22
LA9EMc5ZcbdV96ZpKS3ETk5n7355fyVX+jZ24ZvfhtdyPvdUGuHzcrK/YfB0AsjY
hIhXgkxMfqJDjj7Af1CDPSAv9cylGI5b9v5QX93pM8uGxSRZTGS5m4qJG0Jj4UpV
QlzppFg+qE41yDrdy4rLxROW4bp/HPvEjo1YoAle3K208UMffVPBqGfZqbZ01+hP
gHCeamBb6QlV2Zq6q/VEKUO6p6oFQnI0phQiAQ==
-----END CERTIFICATE-----
1 s:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
i:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2009 Entrust, Inc. - for authorized use only, CN=Entrust Root Certification Authority - G2
a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256
v:NotBefore: Oct 5 19:13:56 2015 GMT; NotAfter: Dec 5 19:43:56 2030 GMT
-----BEGIN CERTIFICATE-----
MIIFDjCCA/agAwIBAgIMDulMwwAAAABR03eFMA0GCSqGSIb3DQEBCwUAMIG+MQsw
CQYDVQQGEwJVUzEWMBQGA1UEChMNRW50cnVzdCwgSW5jLjEoMCYGA1UECxMfU2Vl
IHd3dy5lbnRydXN0Lm5ldC9sZWdhbC10ZXJtczE5MDcGA1UECxMwKGMpIDIwMDkg
RW50cnVzdCwgSW5jLiAtIGZvciBhdXRob3JpemVkIHVzZSBvbmx5MTIwMAYDVQQD
EylFbnRydXN0IFJvb3QgQ2VydGlmaWNhdGlvbiBBdXRob3JpdHkgLSBHMjAeFw0x
NTEwMDUxOTEzNTZaFw0zMDEyMDUxOTQzNTZaMIG6MQswCQYDVQQGEwJVUzEWMBQG
A1UEChMNRW50cnVzdCwgSW5jLjEoMCYGA1UECxMfU2VlIHd3dy5lbnRydXN0Lm5l
dC9sZWdhbC10ZXJtczE5MDcGA1UECxMwKGMpIDIwMTIgRW50cnVzdCwgSW5jLiAt
IGZvciBhdXRob3JpemVkIHVzZSBvbmx5MS4wLAYDVQQDEyVFbnRydXN0IENlcnRp
ZmljYXRpb24gQXV0aG9yaXR5IC0gTDFLMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8A
MIIBCgKCAQEA2j+W0E25L0Tn2zlem1DuXKVh2kFnUwmqAJqOV38pa9vH4SEkqjrQ
jUcj0u1yFvCRIdJdt7hLqIOPt5EyaM/OJZMssn2XyP7BtBe6CZ4DkJN7fEmDImiK
m95HwzGYei59QAvS7z7Tsoyqj0ip/wDoKVgG97aTWpRzJiatWA7lQrjV6nN5ZGhT
JbiEz5R6rgZFDKNrTdDGvuoYpDbwkrK6HIiPOlJ/915tgxyd8B/lw9bdpXiSPbBt
LOrJz5RBGXFEaLpHPATpXbo+8DX3Fbae8i4VHj9HyMg4p3NFXU2wO7GOFyk36t0F
ASK7lDYqjVs1/lMZLwhGwSqzGmIdTivZGwIDAQABo4IBDDCCAQgwDgYDVR0PAQH/
BAQDAgEGMBIGA1UdEwEB/wQIMAYBAf8CAQAwMwYIKwYBBQUHAQEEJzAlMCMGCCsG
AQUFBzABhhdodHRwOi8vb2NzcC5lbnRydXN0Lm5ldDAwBgNVHR8EKTAnMCWgI6Ah
hh9odHRwOi8vY3JsLmVudHJ1c3QubmV0L2cyY2EuY3JsMDsGA1UdIAQ0MDIwMAYE
VR0gADAoMCYGCCsGAQUFBwIBFhpodHRwOi8vd3d3LmVudHJ1c3QubmV0L3JwYTAd
BgNVHQ4EFgQUgqJwdN28Uz/Pe9T3zX+nYMYKTL8wHwYDVR0jBBgwFoAUanImetAe
733nO2lR1GyNn5ASZqswDQYJKoZIhvcNAQELBQADggEBADnVjpiDYcgsY9NwHRkw
y/YJrMxp1cncN0HyMg/vdMNY9ngnCTQIlZIv19+4o/0OgemknNM/TWgrFTEKFcxS
BJPok1DD2bHi4Wi3Ogl08TRYCj93mEC45mj/XeTIRsXsgdfJghhcg85x2Ly/rJkC
k9uUmITSnKa1/ly78EqvIazCP0kkZ9Yujs+szGQVGHLlbHfTUqi53Y2sAEo1GdRv
c6N172tkw+CNgxKhiucOhk3YtCAbvmqljEtoZuMrx1gL+1YQ1JH7HdMxWBCMRON1
exCdtTix9qrKgWRs6PLigVWXUX/hwidQosk8WwBD9lu51aX8/wdQQGcHsFXwt35u
Lcw=
-----END CERTIFICATE-----
Server certificate
subject=C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
issuer=C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
Acceptable client certificate CA names
C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Apple Root CA
CN=Apple Worldwide Developer Relations Certification Authority, OU=G4, O=Apple Inc., C=US
CN=Apple Application Integration 2 Certification Authority, OU=Apple Certification Authority, O=Apple Inc., C=US
CN=Apple Corporate Authentication CA 1, OU=Certification Authority, O=Apple Inc., C=US
C=US, O=Apple Inc., OU=Apple Worldwide Developer Relations, CN=Apple Worldwide Developer Relations Certification Authority
CN=Apple Corporate Root CA, OU=Certification Authority, O=Apple Inc., C=US
C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Apple Application Integration Certification Authority
C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
Client Certificate Types: RSA sign, ECDSA sign
Requested Signature Algorithms: ECDSA+SHA256:RSA-PSS+SHA256:RSA+SHA256:ECDSA+SHA384:RSA-PSS+SHA384:RSA+SHA384:RSA-PSS+SHA512:RSA+SHA512:RSA+SHA1
Shared Requested Signature Algorithms: ECDSA+SHA256:RSA-PSS+SHA256:RSA+SHA256:ECDSA+SHA384:RSA-PSS+SHA384:RSA+SHA384:RSA-PSS+SHA512:RSA+SHA512
SSL handshake has read 4138 bytes and written 687 bytes
Verification error: unable to get local issuer certificate
New, SSLv3, Cipher is AES128-SHA
Protocol: TLSv1.2
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
Protocol : TLSv1.2
Cipher : AES128-SHA
Session-ID:
Session-ID-ctx:
Master-Key: D504C13BDBC59CDF3B883D1B626FA2B59000754DED57CD77A72F761A52AEED719DA06C100FBA1430BB9D8DECFC7C9307
PSK identity: None
PSK identity hint: None
SRP username: None
Start Time: 1741092949
Timeout : 7200 (sec)
Verify return code: 20 (unable to get local issuer certificate)
Extended master secret: yes
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it is not automatically focused by VoiceOver, and in some cases, VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
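For reference, a minimal sketch of the presentation call from step 1 (windowScene is assumed to be the active UIWindowScene):
import StoreKit
import UIKit

// Presents the system "Manage Subscriptions" sheet; once it is on screen,
// VoiceOver focus still stays on the presenting view controller's elements.
func presentManageSubscriptions(from windowScene: UIWindowScene) {
    Task {
        do {
            try await AppStore.showManageSubscriptions(in: windowScene)
        } catch {
            print("Failed to show manage subscriptions: \(error)")
        }
    }
}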
Topic:
Accessibility & Inclusion
SubTopic:
General
Amongst the two languages my app would have, let's say 10% and 90%.
I am launching an app for an unlisted Primary Language. I consider whatever is inside the app as the primary, and that won't be English. The secondary language
We have a requirement to manage the shortcuts and hotkeys in our application, and we want this to be intuitive and to fully support multiple languages. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters, and when a 'pure foreign language' physical or virtual keyboard is the input device, we are unclear how the user would invoke such a hotkey.
Now, considering that some keyboards have no Latin characters at all, managing shortcuts and hotkeys in these environments becomes a rather difficult task. Taking a very simple example, the shortcut for printing a page is Command/Control + 'P'. This can be an issue on non-English keyboards such as Arabic, where not only is there no letter P, there is also no equivalent phonetic character, since the language itself does not have one.
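For clarity, this is the kind of declaration we mean: the shortcut is bound to the Latin character "p" regardless of the user's keyboard layout (sketched in SwiftUI; UIKeyCommand and NSMenuItem key equivalents are likewise declared against a character):
import SwiftUI

struct PrintCommands: View {
    var body: some View {
        Button("Print") { /* trigger printing */ }
            // Declared against "p" even though an Arabic keyboard has no
            // such letter.
            .keyboardShortcut("p", modifiers: .command)
    }
}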
Also, when we want to let the user customize a hotkey, how would the user express which key combination should trigger a given action?
So, based on these conditions, in order to provide the most comprehensive and optimal experience for users in their own language, what does Apple recommend we do here for hotkey/shortcut support with these 'pure' non-Latin languages?
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
InputMethodKit
Internationalization
Shortcuts
Localization
Hi everyone — I’m implementing the new Hearing Device Support API described here:
https://developer.apple.com/documentation/accessibility/hearing-device-support
I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I've added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration).
com.apple.developer.hearing.aid.app
xxxxx
but the app won't even compile with this entitlement
Problem
NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids.
Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.
What I expected
According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens.
Questions
Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file?
Is there a way to verify that iOS is actually honoring this entitlement?
Has anyone successfully received this notification on a real device?
Any help or confirmation would be greatly appreciated.
I have an app that needs Input Monitoring permissions to get keyboard access in the background. I've attempted to use both IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) and IOHIDRequestAccess(kIOHIDRequestTypeListenEvent), but they always return denied, even though I have given the permission for Input Monitoring to the app in Settings.
Is there something I need to put in my Info.plist to enable this permission to work?
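For reference, this is roughly the sequence I'm calling (a sketch; exact Swift spellings of the IOKit constants may differ slightly between SDKs):
import IOKit.hid

// Check current Input Monitoring status, then prompt if needed. Both calls
// keep reporting "denied" even after enabling the app in System Settings.
let access = IOHIDCheckAccess(kIOHIDRequestTypeListenEvent)
if access != kIOHIDAccessTypeGranted {
    let granted = IOHIDRequestAccess(kIOHIDRequestTypeListenEvent)
    print("Input Monitoring granted after prompt: \(granted)")
}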
Topic:
Accessibility & Inclusion
SubTopic:
General
I tried this URL, App-prefs:General&path=Date&Time. It works on iOS 17 but not on iOS 18.
When turning VoiceOver ON, GCController does not send button press events for "Button A" and "Button Center".
This happens when using the Siri Remote (2nd generation), which has dedicated arrow buttons on the ring around the center button, and also when using the iOS remote. I didn't test it on the old Siri Remote (1st generation) with the touchpad and no arrow buttons.
Example:
gameController.microGamepad?.allButtons.forEach { button in
    button.valueChangedHandler = { [weak self] _, _, _ in
        self?.buttonHandler(gameController: gameController, button: button)
    }
}

private func buttonHandler(gameController: GCController, button: GCControllerButtonInput) {
    print("BUTTON: Pressed \(button.description) isPressed=\(button.isPressed) isTouched=\(button.isTouched)")
}
VoiceOver ON (incorrect behavior):
BUTTON: Pressed Direction Pad Left (value: 0.030, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Down (value: 0.079, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Down (value: 0.000, pressed: 0) isPressed=false isTouched=false
VoiceOver OFF (correct behavior):
BUTTON: Pressed Direction Pad Left (value: 0.137, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Up (value: 0.078, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button Center (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Button Center (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Up (value: 0.000, pressed: 0) isPressed=false isTouched=false
As a workaround, I could watch Direction Pad Left/Right/Up/Down, detect positions between -0.7 and +0.7, and handle that as a center button press. I already do this on the old Siri remote, where I need to distinguish the center button from the arrows (switching TV channels with Up/Down and skipping forward/back with Left/Right), but for the new Siri remote it would be an unnecessary workaround.
Does anybody know why the center/select button is not detected when VoiceOver is ON? Is there another way of detecting it using GCController?
I don't want to use SwiftUI onTapGesture for this one particular case.
Is it an unexpected bug in the tvOS APIs, or is there some specific reason why the center button is not handled by GCController when VoiceOver is ON?
Thanks.
I have a UITextField in my application for entering a state. If I tap on it, a UIPickerView pops up and lets the user select a state (but they can still type too).
The issue relates to Full Keyboard Access. If we select the UITextField using an external keyboard, the UIPickerView appears, but in order to reach it the user has to tab through the whole view controller, because the UIPickerView is at the end.
What would be nice is to either a) move focus directly to the UIPickerView (have it highlighted in blue and immediately scrollable with the keyboard) or b) make the UIPickerView the next view reached when tabbing or using the arrow keys.
I've tried using:
UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This ended up only announcing the view, but didn't help with full keyboard access.
Making the UIPickerView a first responder when it appears.
Attempting to change the accessibilityElements order (but with so many views and views within views, this isn't really a viable option either).
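Concretely, the notification and first-responder attempts above looked roughly like this (statePicker is the UIPickerView in question; the notification argument is illustrative):
import UIKit

// Called after the picker appears.
func moveFocusToPicker(_ statePicker: UIPickerView) {
    // Announces the picker, but Full Keyboard Access focus stays where it was.
    UIAccessibility.post(notification: .layoutChanged, argument: statePicker)

    // Also no visible effect on the keyboard focus highlight.
    _ = statePicker.becomeFirstResponder()
}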
Pressing tab + -> (tab and right arrow button) will quickly take the user to the end of the chain of accessibility elements, in other words, to the UIPickerView. But there has to be a cleaner way of just automatically setting the focus to the UIPickerView or making it the next element by pressing the arrow key.
Dear Apple Support,
I am reporting a critical issue affecting parental control apps like my app, Choreio, which is live on the App Store.
When Screen Time settings are configured to require a parent’s password for changes, parents must log in on their child’s device to make any adjustments. This restriction is expected to extend to apps using the Screen Time API, such as Choreio.
However, I’ve discovered a significant bug: children can bypass this restriction by simply toggling off Choreio in the Screen Time settings—without needing the parent’s password. This effectively disables the app and defeats its purpose as a parental control tool.
Please address this issue as soon as possible to ensure the intended functionality of parental controls. Let me know if you need any additional information to assist with resolving this.
Thank you for your attention to this matter.
Best regards,
Jeff Houston
STEPS TO REPRODUCE
Here are the steps to reproduce the issue clearly:
Install Choreio from the App Store on the child’s phone.
Enable parental controls in Screen Time and set it to require the parent’s password for any changes to Screen Time settings.
Go to the Screen Time settings on the child’s phone.
Observe that the child can simply toggle off Choreio, effectively deactivating the app, without needing the parent’s password.
Expected behavior: Toggling off Choreio should require the parent’s password, just like it does for other Screen Time settings.
Let me know if additional details are needed!
Topic:
Accessibility & Inclusion
SubTopic:
General
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?