Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under the Accessibility & Inclusion topic

Registering a macOS app for dynamic text sizing in macOS 15
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac However, it's not immediately clear (a) how to get one's app into this list, and (b) whether the usual methods from iOS for reacting to text-size changes even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day? For context, my app is a macOS-native SwiftUI app.
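For anyone experimenting: a minimal sketch, assuming SwiftUI's dynamicTypeSize environment value (available on macOS 12 and later) is what macOS 15's per-app text-size setting drives; whether that assumption holds is exactly the open question here:

import SwiftUI

struct ScalingContentView: View {
    // dynamicTypeSize reflects the user's preferred text size; the
    // environment value exists on macOS as well as iOS.
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            // Semantic text styles (.body, .headline, ...) scale with the
            // preference; hard-coded point sizes do not.
            Text("Hello, world!")
                .font(.body)
            Text("Current dynamic type size: \(String(describing: dynamicTypeSize))")
                .font(.caption)
        }
        .padding()
    }
}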
Replies: 1 · Boosts: 0 · Views: 603 · Activity: Jan ’25
iMessage and FaceTime error
Yesterday I installed the iOS 26 beta on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked. I tried to go through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message. Does anyone have any ideas what the problem could be?
Replies: 1 · Boosts: 1 · Views: 148 · Activity: Jun ’25
iPad 7th generation
I updated to the 18.3 beta but lost the video and audio options with that update. I tried to restore with iTunes on Windows 11 and it got stuck partway through. I forcefully turned the iPad off; after two tries it went off, like a blanked-out screen. I've since tried every trick to turn it on, but it won't power on. Please tell me a solution; I've used all the advice I could find on the internet. It was 90% charged and working superbly before this. Now there's no charging icon and no sign of life. How can I turn it on?
Replies: 1 · Boosts: 0 · Views: 452 · Activity: Feb ’25
Speak Screen gesture not working
I am testing the accessibility feature available in the Settings app called "Speak Screen". The help text in the Settings app states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this feature to work. Every time I try the two-finger swipe down, it behaves the same as the single-finger swipe down gesture; usually this manifests as making scroll views bounce. I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I access the Speak Screen feature from the "Speech Controller" button, the screen's content is spoken as expected, so I know the feature is enabled. It's just the gesture that doesn't work. Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
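For what it's worth, an app can at least confirm the setting's state in code before recommending it to users. A minimal sketch using UIKit's UIAccessibility status API (this reports whether Speak Screen is enabled, not whether the gesture fires):

import UIKit

final class SpeakScreenStatusChecker {
    init() {
        // True when Settings > Accessibility > Spoken Content > Speak Screen is on.
        print("Speak Screen enabled:", UIAccessibility.isSpeakScreenEnabled)

        // Posted whenever the user toggles the Speak Screen setting.
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.speakScreenStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("Speak Screen now enabled:", UIAccessibility.isSpeakScreenEnabled)
        }
    }
}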
Replies: 1 · Boosts: 0 · Views: 207 · Activity: Jul ’25
Size of stylus mesh tip
Hello community, we're designing an app that can optionally be controlled by a stylus with a mesh tip. The mesh tip we're using is 5 mm in diameter. Mesh-tip contact detection seems unstable at this size, although it works better with a larger diameter. Is it possible to access a setting in iOS that lets you define the minimum contact area needed to detect a touch on the screen? This would enable us to use the 5 mm stylus. Best regards, Edwin
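As far as I know there is no public iOS setting for a minimum contact area, but an app can observe the contact size the system reports for each touch via UITouch.majorRadius, which may help diagnose how the 5 mm tip registers. A minimal sketch:

import UIKit

final class StylusCanvasView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        for touch in touches {
            // majorRadius is the system's estimate of the contact radius in
            // points, with majorRadiusTolerance giving its margin of error.
            print("Contact radius: \(touch.majorRadius) ± \(touch.majorRadiusTolerance) pt")
        }
    }
}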
Replies: 1 · Boosts: 0 · Views: 377 · Activity: Feb ’25
Unable to set Chinese dialect of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it will always read Chinese content, e.g.

AVSpeechUtterance(string: "中文") // any Chinese content

in the dialect specified by Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect I specify in AVSpeechUtterance.voice:

AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

Setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" worked on iOS 17 and below. My app has a feature that reads sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time, so setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.

Further to the above, I've discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not happen. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug does happen.

This bug puzzles me because I need both Chinese dialects read aloud one after the other, but as many users report, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays) my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. This iOS 18 bug makes my app unable to perform its expected behavior.

Would Apple developers look into this and advise whether there is any possible workaround within an app's code, or please fix this bug in an iOS 18 update. Thank you.
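For reference, a minimal sketch of the pattern described above: the same text queued once per dialect, each utterance carrying its own voice (the behavior that works on iOS 17 and is reportedly overridden by the Spoken Language setting on upgraded iOS 18 devices):

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

func speakInBothDialects(_ text: String) {
    // Mandarin first, then Cantonese; AVSpeechSynthesizer queues utterances in order.
    for language in ["zh-TW", "zh-HK"] {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: language)
        synthesizer.speak(utterance)
    }
}

speakInBothDialects("中文")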
Replies: 1 · Boosts: 1 · Views: 97 · Activity: Jun ’25
Subpath for access to Silence Unknown Callers
Hi, our app has a section where we show users how to activate "Silence Unknown Callers", because it is a crucial feature for our app. However, we see that 30% of users drop out of the process here, because we can't open that settings pane directly from our app. We are using this URL scheme to open the Phone settings on iOS 18:

if let url = URL(string: "App-prefs:com.apple.mobilephone") {
    UIApplication.shared.open(url)
}

But we don't see any way to open the "Silence" path directly, as we could on iOS 17 with this URL scheme:

prefs:root=Phone&path=SILENCE_CALLS

Does anyone know if it is possible to open that option directly? We want to improve our accessibility. Thank you!
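One hedged pattern, sketched below under the assumption that the undocumented App-prefs scheme keeps working on some iOS versions: try it first, then fall back to the documented openSettingsURLString, which opens the app's own page in Settings rather than the Phone pane. Note that undocumented schemes can break between releases and may be flagged in App Review:

import UIKit

func openPhoneSettingsIfPossible() {
    // Undocumented scheme; querying it with canOpenURL would also require
    // declaring it in LSApplicationQueriesSchemes, so we just attempt to open it.
    if let phoneSettings = URL(string: "App-prefs:com.apple.mobilephone") {
        UIApplication.shared.open(phoneSettings) { success in
            if !success, let appSettings = URL(string: UIApplication.openSettingsURLString) {
                // Documented fallback: opens this app's own settings page.
                UIApplication.shared.open(appSettings)
            }
        }
    }
}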
Replies: 1 · Boosts: 1 · Views: 354 · Activity: Mar ’25
How to Ensure Data Privacy with VoiceOver Reading Sensitive Information?
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern—what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud? How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
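A couple of common techniques, sketched below with illustrative names: secure text fields are announced by VoiceOver without their contents, and for read-only sensitive values you can override what is spoken with a masked accessibilityValue:

import UIKit

func configureSensitiveViews(passwordField: UITextField, accountLabel: UILabel) {
    // VoiceOver announces this as a "secure text field" and does not
    // speak the characters the user has typed.
    passwordField.isSecureTextEntry = true

    // Keep the element reachable, but replace the spoken value with a mask
    // so the real digits are never read aloud.
    accountLabel.text = "1234 5678 9012"
    accountLabel.accessibilityLabel = "Account number"
    accountLabel.accessibilityValue = "hidden"
}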
Replies: 1 · Boosts: 0 · Views: 340 · Activity: Mar ’25
The brightness of the iPad Pro screen is gone after iOS 26
After the iOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device's high display quality before the update. Please advise if this is a known issue or if there's a fix.
Replies: 1 · Boosts: 1 · Views: 94 · Activity: Jun ’25
FocusState Issue in iOS 18 with Keyboard Navigation
I have implemented a SwiftUI view containing a grid of TextField elements, where focus moves automatically to the next field upon input. This behavior works well on iOS 16 and 17, maintaining proper focus highlighting when Full Keyboard Access is enabled. However, on iOS 18 and above, the Full Keyboard Access focus behaves differently: it always lags behind the actual focus state, causing a mismatch between the visually highlighted field and the active text input. This leads to usability issues, especially for users navigating with an external keyboard. Below is the SwiftUI code for reference:

struct AutoFocusGridTextFieldsView: View {
    private let fieldCount: Int
    private let columns: Int
    @State private var textFields: [String]
    @FocusState private var focusedField: Int?

    init(fieldCount: Int = 17, columns: Int = 5) {
        self.fieldCount = fieldCount
        self.columns = columns
        _textFields = State(initialValue: Array(repeating: "", count: fieldCount))
    }

    var body: some View {
        let rows = (fieldCount / columns) + (fieldCount % columns == 0 ? 0 : 1)

        VStack(spacing: 10) {
            ForEach(0..<rows, id: \.self) { row in
                HStack(spacing: 10) {
                    ForEach(0..<columns, id: \.self) { col in
                        let index = row * columns + col
                        if index < fieldCount {
                            TextField("", text: $textFields[index])
                                .frame(width: 40, height: 40)
                                .multilineTextAlignment(.center)
                                .textFieldStyle(RoundedBorderTextFieldStyle())
                                .focused($focusedField, equals: index)
                                .onChange(of: textFields[index]) { newValue in
                                    if newValue.count > 1 {
                                        textFields[index] = String(newValue.prefix(1))
                                    }
                                    if !textFields[index].isEmpty {
                                        moveToNextField(from: index)
                                    }
                                }
                        }
                    }
                }
            }
        }
        .padding()
        .onAppear {
            focusedField = 0
        }
    }

    private func moveToNextField(from index: Int) {
        if index + 1 < fieldCount {
            focusedField = index + 1
        }
    }
}

struct AutoFocusGridTextFieldsView_Previews: PreviewProvider {
    static var previews: some View {
        AutoFocusGridTextFieldsView(fieldCount: 10, columns: 5)
    }
}

Has anyone else encountered this issue with FocusState in iOS 18? I believe this is a bug strictly connected to keyboard navigation, since I experienced a similar problem with a UIKit equivalent of this view. Any insights or suggestions would be greatly appreciated!
Replies: 1 · Boosts: 0 · Views: 580 · Activity: Mar ’25
Add custom emoji tags
Allow the user to add their own tags to the default emoji tags. For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering the tag doesn’t exist, searching for other things it might be called, realising I don’t know what it is, and then having to scroll through all the hand emojis twice to find it. 🤌🏻🤞🏼👌
Replies: 1 · Boosts: 0 · Views: 505 · Activity: Jan ’25
VoiceOver Sound
Hello, when I listen to a title in my app with VoiceOver, it makes a strange sound. The title is made up of Korean characters plus numbers plus alphabet letters. Does this combination cause the strange sound with VoiceOver? I would like to ask if Apple can fix this issue. Thank you.
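One thing that may be worth trying (an assumption, not a confirmed fix): pinning the spoken language for the mixed Korean/number/alphabet string with the accessibilitySpeechLanguage attribute, so VoiceOver does not switch voices mid-string:

import UIKit

let title = "한글A1" // illustrative mixed Korean + alphabet + number string
let label = UILabel()
label.text = title

// Ask VoiceOver to speak the whole label with the Korean voice.
label.accessibilityAttributedLabel = NSAttributedString(
    string: title,
    attributes: [.accessibilitySpeechLanguage: "ko-KR"]
)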
Replies: 1 · Boosts: 0 · Views: 204 · Activity: Mar ’25
External Keyboard + VoiceOver focus not working with .searchable + List
While editing the search text with an external keyboard (and VoiceOver on), if I try to navigate to the List using the keyboard, focus jumps back to the search field immediately, preventing selection of list items. Notably, VoiceOver navigation alone, without a keyboard, works as expected. It's as if the List never gains focus: every attempt to move focus lands back on the search field. The code:

struct ContentView: View {
    @State var searchText = ""

    let items = ["Apple", "Banana", "Cherry", "Date", "Elderberry", "Fig", "Grape"]

    var filteredItems: [String] {
        if searchText.isEmpty {
            return items
        } else {
            return items.filter { $0.localizedCaseInsensitiveContains(searchText) }
        }
    }

    var body: some View {
        if #available(iOS 16.0, *) {
            NavigationStack {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        } else {
            NavigationView {
                List(filteredItems, id: \.self) { item in
                    Text(item)
                }
                .navigationTitle("Fruits")
                .searchable(text: $searchText)
            }
        }
    }
}
Replies: 1 · Boosts: 0 · Views: 89 · Activity: Jun ’25
Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion and activates only when you look at it, without touching your main battery?

The Core Idea

Imagine this:

- Kinetic energy harvesting: the iPhone would have a tiny, integrated kinetic energy generator that captures energy from everyday movements: walking, picking up the phone, putting it in your pocket.
- Independent power source: the harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from the iPhone's main battery.
- Accelerometer-activated display: instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
- On-demand, ultra-low-power clock: only when the accelerometer detects one of these gestures would the stored kinetic energy be used to illuminate just the pixels needed to display the time. The rest of the screen stays completely black (consuming no power on OLED).
- Automatic shut-off: as soon as the gesture ends or the phone is put down, the clock display turns off, conserving the limited harvested energy.

Why This Matters

This isn't just a cool gimmick; it offers significant benefits:

- True battery independence: get the time at a glance, anytime, without touching the main battery or even the power button, leaving more main-battery life for apps, calls, and everything else.
- Ultimate convenience: a "magical" interaction; just pick up your phone and the time instantly appears, with no taps or button presses.
- Sustainable and innovative: showcases practical energy harvesting in a consumer device, pushing boundaries for self-sufficient tech.
- Extreme energy efficiency: by using a low-power accelerometer as the trigger and lighting only a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
Replies: 1 · Boosts: 1 · Views: 113 · Activity: Jun ’25