Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Microphone Not Working When Running Unity Vision Pro App Normally
```csharp
// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            microphoneInput = Microphone.Start(null, true, 10, 16000); // Use 16,000 Hz instead of 44,100
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " Started listening...");
            debugText.text = Microphone.devices.Length + "- Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;

            if (Time.time - lastVoiceTime > silenceDuration)
            {
                Debug.Log("User is silent.");
                debugText.text = volume.ToString() + " - User is silent.";
            }

            slider.value = volume;
        }
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
```

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can't access the microphone; the debug logs indicate that no microphone is detected.

Question: Is there any additional configuration needed for the microphone to work in a normal (non-Xcode) run on Vision Pro, or are there common issues that could cause microphone access to fail in this scenario?

Thanks in advance for any insights! Best, Siddharth
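One common cause worth checking, offered as an assumption rather than a confirmed answer: microphone access can fail on a device build if the Xcode project that Unity generates is missing an NSMicrophoneUsageDescription entry in Info.plist, or if the user declined (or was never shown) the microphone permission prompt. In Unity the C# side can request access with Application.RequestUserAuthorization(UserAuthorization.Microphone); the Swift sketch below (matching the language used elsewhere in this topic) only illustrates inspecting the permission state from native code:

```swift
import AVFoundation

// Sketch only: inspect and, if needed, request microphone permission from native code.
// Assumes the app's Info.plist contains an NSMicrophoneUsageDescription entry; the
// system requires that key before it will prompt for microphone access.
func verifyMicrophoneAccess() {
    let session = AVAudioSession.sharedInstance()
    switch session.recordPermission {
    case .undetermined:
        session.requestRecordPermission { granted in
            print("Microphone permission granted: \(granted)")
        }
    case .denied:
        print("Microphone access denied; re-enable it in Settings > Privacy & Security > Microphone")
    case .granted:
        print("Microphone access granted")
    @unknown default:
        break
    }
}
```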
2 replies · 0 boosts · 493 views · Feb ’25
accessibilityActivationPoint Not Working When Set Directly on UITableViewCell
I’m trying to set the accessibilityActivationPoint directly on a UITableViewCell so that VoiceOver activates a specific button inside the cell. However, this approach doesn’t seem to work. When I instead override the accessibilityActivationPoint property inside the UITableViewCell subclass and return the desired point, it works as expected.

Why doesn’t setting accessibilityActivationPoint directly on the cell work, while overriding it inside the cell does? Is there a recommended approach for handling this scenario?

The following override works:

```swift
override var accessibilityActivationPoint: CGPoint {
    get {
        return convert(toggleSwitch.center, to: nil)
    }
    set {
        super.accessibilityActivationPoint = newValue
    }
}
```

but setting the activation point directly does not:

```swift
private func configureAccessibility() {
    isAccessibilityElement = true
    accessibilityLabel = titleLabel.text
    accessibilityTraits = .toggleButton
    accessibilityActivationPoint = self.convert(toggleSwitch.center, to: self)
    accessibilityValue = toggleSwitch.accessibilityValue
}
```
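A possible explanation, offered as an assumption rather than a confirmed answer: accessibilityActivationPoint is expressed in screen coordinates, and the override works because it is re-evaluated each time VoiceOver queries it, after layout. The direct assignment converts the switch’s center into the cell’s own coordinate space (to: self) and captures it once in configureAccessibility(), before the cell is laid out. A minimal sketch that keeps the stored-property approach but refreshes the point after each layout pass, in screen space (ToggleCell is a hypothetical stand-in for the cell in the post):

```swift
import UIKit

// Minimal sketch, assuming a cell like the one in the post; ToggleCell and its
// subview names are placeholders, and the switch is configured elsewhere.
final class ToggleCell: UITableViewCell {
    let toggleSwitch = UISwitch()

    override func layoutSubviews() {
        super.layoutSubviews()
        // accessibilityActivationPoint is in screen coordinates, so convert to the
        // window's space (to: nil, mirroring the override in the post) and refresh
        // after every layout pass rather than computing it once at configure time.
        accessibilityActivationPoint = convert(toggleSwitch.center, to: nil)
    }
}
```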
2 replies · 0 boosts · 151 views · Apr ’25
Programmatically force VoiceOver to read parentheses for math expressions
How can I force VoiceOver to read parentheses for math expressions like this?

```swift
Text("(2+3)×4") // VoiceOver: Two plus three, times four
```

I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks. Using .spellOut breaks braille output and the Rotor › Characters menu by turning numbers and symbols into words, and .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose; for example, it reads “21” as “twenty hyphen one.”

Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
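One idea worth testing, offered as a sketch rather than a confirmed fix: drop down to UIKit for this one view and supply an accessibilityAttributedLabel in which only the parenthesis characters carry NSAttributedString.Key.accessibilitySpeechPunctuation, so the digits and the × sign stay untouched for braille and the Rotor › Characters menu. MathExpressionLabel is a hypothetical wrapper, not something from the post:

```swift
import SwiftUI
import UIKit

// Sketch: a UIKit-backed label whose spoken accessibility label marks only "(" and ")"
// with the spoken-punctuation attribute, leaving the visible text unchanged.
struct MathExpressionLabel: UIViewRepresentable {
    let expression: String // e.g. "(2+3)×4"

    func makeUIView(context: Context) -> UILabel {
        let label = UILabel()
        label.text = expression
        label.isAccessibilityElement = true

        // Build a spoken label where parentheses are announced as punctuation.
        // Note: using the character offset as an NSRange location assumes every
        // character is a single UTF-16 unit, which holds for "(2+3)×4".
        let spoken = NSMutableAttributedString(string: expression)
        for (offset, character) in expression.enumerated() where character == "(" || character == ")" {
            spoken.addAttribute(.accessibilitySpeechPunctuation,
                                value: NSNumber(value: true),
                                range: NSRange(location: offset, length: 1))
        }
        label.accessibilityAttributedLabel = spoken
        return label
    }

    func updateUIView(_ uiView: UILabel, context: Context) {}
}
```

Whether this keeps braille and Rotor output exactly as desired would still need to be verified with VoiceOver on a device.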
2 replies · 0 boosts · 376 views · Jul ’25
Keyboard navigation not working in native iOS Wallet interface
Hi guys, I'm facing an issue with the native interface for adding a card to Wallet. Does anyone have ideas on how to fix or work around it?

STEPS TO REPRODUCE:
1. Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
2. Connect an external keyboard and confirm that you can navigate other iOS interfaces with it.
3. In any app, present a PKAddPassesViewController with a valid .pkpass file.
4. When the Wallet “Add Pass” sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
5. Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.

EXPECTED RESULT: All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.

ACTUAL RESULT: Keyboard navigation is not possible. No elements receive focus. Users cannot activate the Cancel or Add buttons using keyboard input. The only way to interact is by touch or by enabling VoiceOver, which does not satisfy keyboard accessibility requirements.

IMPACT:
- Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible).
- Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet.
- Affects users of external keyboards who rely on Tab/arrow navigation.
- Creates an inconsistent accessibility experience compared to other iOS system modals.
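For context, a minimal sketch of step 3 above, assuming a bundled sample.pkpass file and a hypothetical host view controller (the names are placeholders, not from the original report):

```swift
import PassKit
import UIKit

// Repro sketch: present the system "Add Pass" sheet for a bundled pass file.
final class PassPresentingViewController: UIViewController {
    func presentAddPassSheet() {
        guard
            let url = Bundle.main.url(forResource: "sample", withExtension: "pkpass"),
            let data = try? Data(contentsOf: url),
            let pass = try? PKPass(data: data),
            let addPassesVC = PKAddPassesViewController(pass: pass)
        else {
            return
        }
        // With a hardware keyboard attached (and VoiceOver off), Tab/arrow keys
        // reportedly do not move focus to the sheet's Cancel or Add buttons.
        present(addPassesVC, animated: true)
    }
}
```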
2 replies · 0 boosts · 1.4k views · Aug ’25
Assistive Access Bugs
Hi! I have noticed a few glitches, as well as some overall unfortunate cons, with Assistive Access mode:

- Alarms, timers, the stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
- Cannot make FaceTime calls with favorite contacts.
- Find My iPhone cannot jump to the Maps app.
- The Camera cannot zoom in or out.
- Photos cannot be deleted, edited, or shared in a shared album in the Photos app.
- Photos/videos cannot be sent in Messages.
- Spotify cannot be accessed from the lock screen.
- Apps do not stay open if you lock the phone screen or leave it untouched for too long (it auto-locks).
- There is no flashlight option. I downloaded an app to add this feature, but the screen locks when left untouched, which shuts off the flashlight in the app until I unlock the phone again.
1 reply · 0 boosts · 146 views · Mar ’25
Unable to set dialect of Chinese of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it always reads Chinese content such as

```swift
AVSpeechUtterance(string: "中文") // any Chinese content
```

in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect specified in AVSpeechUtterance.voice:

```swift
AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin
```

Setting the Chinese dialect of AVSpeechSynthesisVoice with "zh-HK" or "zh-TW" worked on iOS 17 and below. My app has a feature that reads sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time, so setting the dialect in Spoken Language in Settings is not a workaround that lets my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, nor restored from a backup after the fresh installation), the bug does not occur. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug appears.

This bug puzzles me because I need both Chinese dialects to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice (depending on the Spoken Language setting). This iOS 18 bug prevents my app from performing its expected behavior.

Would Apple developers look into this and advise whether there is any possible workaround within the app's code, or please fix this bug in an iOS 18 update. Thank you.
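A possible in-app workaround to experiment with, offered as an assumption rather than a confirmed fix for the behavior described above: resolve a concrete installed voice for each dialect via AVSpeechSynthesisVoice.speechVoices() and assign it by identifier, instead of constructing the voice from a language tag:

```swift
import AVFoundation

// Sketch: pick an installed voice for the requested dialect and assign it by
// identifier, in case the language-tag lookup is what iOS 18 ignores after an
// upgrade or restore. Untested against the reported bug.
func utterance(_ text: String, dialect: String) -> AVSpeechUtterance {
    let utterance = AVSpeechUtterance(string: text)
    if let voice = AVSpeechSynthesisVoice.speechVoices()
        .first(where: { $0.language == dialect }) {
        utterance.voice = AVSpeechSynthesisVoice(identifier: voice.identifier)
    } else {
        utterance.voice = AVSpeechSynthesisVoice(language: dialect)
    }
    return utterance
}

// Usage: read the same sentence in Mandarin, then Cantonese.
let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance("中文", dialect: "zh-TW")) // Mandarin
synthesizer.speak(utterance("中文", dialect: "zh-HK")) // Cantonese
```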
1 reply · 1 boost · 118 views · Jun ’25
Accessibility for detents behaves different in fullscreen cover
The only way I found to make accessibility focus work correctly in the detent in a fullscreen cover is to apply the focus manually. The issue is that in ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover while in ContentView I do not.

```swift
import SwiftUI

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")

                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)

            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
```
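For reference, a compact sketch of the manual-focus workaround the commented-out code above alludes to, assuming iOS 15 or later for AccessibilityFocusState; this is untested against the full-screen-cover scenario described in the post:

```swift
import SwiftUI

// Sketch of the manual-focus workaround: move VoiceOver focus into the detent
// once the sheet appears. Names mirror the post, but the view is a stand-in.
struct BottomSheetViewWithFocus: View {
    @Binding var bottomSheetShowing: Bool
    @AccessibilityFocusState private var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)
                .accessibilityFocused($isFocused)

            Button("Dismiss") { bottomSheetShowing = false }
        }
        .accessibilityAddTraits(.isModal)
        .onAppear {
            // Delay slightly so the sheet is fully presented before focusing.
            DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                isFocused = true
            }
        }
    }
}
```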
1 reply · 1 boost · 616 views · Feb ’25