Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post

Replies

Boosts

Views

Activity

Accessibility: VoiceOver is not treating the navigation bar left button as the first focused element
VoiceOver is not treating the navigation bar's left button as the first focused element. If we navigate from A to B, focus goes to the first element inside the B view, not to the back button or B's navigation title. If we post an accessibility notification in B's onAppear, focus does not shift: VoiceOver reads the back button first and then reads B's content items, but it never actually focuses the back button in SwiftUI. What should I do if I want to focus the navigation bar's back button or navigation title? My understanding is that the system prioritizes the first focusable element in the view hierarchy, but the navigation bar (including the back button and title) is managed separately by the system; it is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If my thinking is right, this seems a little strange. Why was it designed this way? Can you share your reasoning? Thanks
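A minimal sketch of one possible workaround, assuming a custom ToolbarItem in the .principal slot stands in for the system title (the system-managed back button cannot take an accessibility focus binding directly, so this is an illustration rather than a confirmed fix):

import SwiftUI

// Sketch: move VoiceOver focus to a custom title item when view B appears.
// DetailView and the 0.5 s delay are illustrative assumptions.
struct DetailView: View {
    @AccessibilityFocusState private var titleFocused: Bool

    var body: some View {
        List { Text("First row in B") }
            .toolbar {
                // Replaces the system title so it can carry a focus binding.
                ToolbarItem(placement: .principal) {
                    Text("Detail")
                        .accessibilityFocused($titleFocused)
                }
            }
            .onAppear {
                // Defer until the push transition settles, then request focus.
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                    titleFocused = true
                }
            }
    }
}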
0
0
426
Sep ’25
I have a problem
I want to open a developer account, not a personal one but a company account. I have an existing company, a DUNS number, a completed website, and an official email address; everything is ready. But when I apply to Apple, they send an email saying they want a public-facing website in the name of the organization, even though all of these matters have already been resolved. Why do they not respond to us?
1
0
664
Sep ’25
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, in our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all; the call is placed immediately, without giving the user the choice between voice and RTT. Is this the expected system behavior for emergency numbers on iOS? And if so, how does RTT get applied in the emergency-call flow? Is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
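For reference, a minimal sketch of the kind of call button the post describes; the confirmation sheet (Call / Cancel / RTT Call) is entirely system-owned, so nothing in app code selects voice versus RTT:

import UIKit

// Hypothetical helper: hand a phone number to the system dialer.
func dial(_ number: String) {
    guard let url = URL(string: "tel://\(number)"),
          UIApplication.shared.canOpenURL(url) else { return }
    // iOS presents its own confirmation sheet here (or, per the post,
    // places emergency calls immediately with no sheet at all).
    UIApplication.shared.open(url)
}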
2
0
696
Sep ’25
Custom Keyboard Extension Not Showing in Settings for Activation
Hi everyone, I’m developing a React Native iOS app that includes a custom keyboard extension for sending stickers across apps. The project builds successfully, and the main app installs fine on my test device. However, I’m not seeing the keyboard extension appear under Settings → General → Keyboard → Keyboards → Add New Keyboard, which means I can’t activate it or grant access. At this point, I’m not even sure if the extension is actually being installed on the device along with the main app. Here’s what I’ve done so far. I created a Keyboard Extension target in Xcode, set the correct bundle identifiers and provisioning profiles, and enabled “Requests Open Access” in the extension’s Info.plist. I built and installed the app on a physical device rather than the simulator to ensure proper testing. My main questions are: how can I confirm that the extension is being installed on the device, and if it isn’t, what might prevent it from installing even though the build completes successfully? Any insights, troubleshooting steps, or guidance would be greatly appreciated.
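One quick check worth trying (an assumption on my part, not an official diagnostic): list what actually ships inside the installed app's PlugIns folder at runtime. A correctly embedded keyboard extension should appear as a .appex bundle:

import Foundation

// Logs the extensions embedded in the installed app bundle.
func logEmbeddedExtensions() {
    guard let pluginsURL = Bundle.main.builtInPlugInsURL else {
        print("No PlugIns folder in the app bundle")
        return
    }
    let contents = (try? FileManager.default.contentsOfDirectory(
        at: pluginsURL, includingPropertiesForKeys: nil)) ?? []
    print("Embedded extensions:", contents.map(\.lastPathComponent))
}

If no .appex shows up, verifying that the extension target is included in the app target's embed-extensions build phase would be a reasonable next step.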
0
0
842
Nov ’25
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mainly contains documents/books that users can read. Issue: VoiceOver is skipping lines that contain leading spaces. We have observed this issue in different languages. It only happens with line granularity; other granularities seem to work as expected. Implementation: We use the APIs below to provide line content to VoiceOver: UIAccessibilityReadingContent (accessibilityPageContent, accessibilityFrameForLineNumber, accessibilityContentForLineNumber). We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use accessibilityNextTextNavigationElement and accessibilityPreviousTextNavigationElement to cross element boundaries for all granular navigation. We want to know whether skipping a line that has leading spaces is expected behavior or a bug in UIKit.
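A stripped-down sketch of the setup described above, for anyone trying to reproduce it; the lines array and zero frames are placeholder assumptions:

import UIKit

// Minimal UIAccessibilityReadingContent conformance with a leading-space line.
final class PageElement: UIAccessibilityElement, UIAccessibilityReadingContent {
    var lines: [String] = ["   Indented first line", "Second line"]

    func accessibilityLineNumber(for point: CGPoint) -> Int {
        0 // real code would hit-test the point against line rects
    }

    func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
        // Returns the raw line, leading spaces included; the question is
        // whether VoiceOver skips such lines during line navigation.
        lines.indices.contains(lineNumber) ? lines[lineNumber] : nil
    }

    func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
        .zero // placeholder; real code returns the on-screen line rect
    }

    func accessibilityPageContent() -> String? {
        lines.joined(separator: "\n")
    }
}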
1
0
446
Nov ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this. The Concept: Skeleton-based Normalization Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input. Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space. Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance. Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers). Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees"). Why this approach? Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body. Privacy: We are processing coordinates, not video streams. Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life. Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
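To make the comparison step concrete, here is a small sketch of the geometric-distance idea using ARKit body anchors; the joint list and the reference-pose storage are assumptions for illustration, not a worked-out design:

import ARKit
import simd

// Sum of positional distances between the user's joints and a reference pose.
// referencePose would be captured from a native signer's recording.
func poseDistance(_ anchor: ARBodyAnchor,
                  referencePose: [ARSkeleton.JointName: SIMD3<Float>]) -> Float {
    let joints: [ARSkeleton.JointName] = [.head, .leftHand, .rightHand]
    var total: Float = 0
    for joint in joints {
        guard let transform = anchor.skeleton.modelTransform(for: joint),
              let reference = referencePose[joint] else { continue }
        // Joint position is the translation column of the model transform.
        let position = SIMD3(transform.columns.3.x,
                             transform.columns.3.y,
                             transform.columns.3.z)
        total += simd_distance(position, reference)
    }
    return total
}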
1
0
781
Dec ’25
pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I'm implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support

I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I've added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration). My entitlements entry reads com.apple.developer.hearing.aid.app xxxxx, but the app won't even compile with this entitlement.

Problem: NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires; not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.

What I expected: According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices, but nothing happens.

Questions: Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file? Is there a way to verify that iOS is actually honoring this entitlement? Has anyone successfully received this notification on a real device? Any help or confirmation would be greatly appreciated.
1
0
683
Dec ’25
Icon labels missing
Since the last beta upgrade for iPad to 26.3, labels have disappeared. In Settings → Accessibility, the toggle setting for labels makes no difference whether on or off; labels are permanently missing.
1
0
1.1k
Jan ’26
Voice Control evaluation questions: "Stop Recording" command failure & Item numbers on non-interactive web elements
Hello everyone, I am currently evaluating my app's accessibility features to accurately display the "Accessibility" information on the App Store. I have encountered two specific issues regarding Voice Control testing and would appreciate any guidance.

1. Voice command for "Stop Recording". According to the evaluation criteria, if an app supports audio recording or dictation, users must be able to start and stop recording using only their voice. Behavior: I can successfully trigger the recording using the command "Start Recording". However, I cannot find a command to stop it; commands like "Stop Recording" or "Stop" are not recognized by the system. Question: Is there a specific standard voice command intended for stopping a recording?

2. Item number overlays on non-interactive web elements (WKWebView). I noticed an inconsistency between native views and web content regarding Voice Control item numbering. Behavior: When testing web content within the app (WKWebView) or in Safari, Voice Control displays item number overlays on non-interactive text elements (such as standard HTML text tags). In native views, static labels do not receive item numbers. Question: Is this expected behavior for web content? Since these elements are not interactive, I am unsure whether this should be considered a bug (fail) or an acceptable exception for the accessibility evaluation.

Has anyone experienced similar issues, or does anyone know the correct criteria for these cases? Thank you.
1
0
1.5k
Feb ’26
VoiceOver with Swift Charts summaries
I had a VoiceOver user point out an issue with my app that I've definitely known about but have never been able to fix. I thought that I had filed feedback for it, but it looks like I didn't. Before I do, I'm hoping someone has some insight. With Swift Charts, when I tap part of a chart, VoiceOver summarizes the three hours, and then you can swipe vertically to hear it read out details of each hour. For example, the Y-axis is the amount of precipitation for the hour and the X-axis is the hour of the day. The units aren't read in the summary, but they are for individual hours when you swipe vertically. The summary says something such as "varies between 0.012 and 0.082". In the AXChartDescriptor I've tried everything I can think of, including adding a label to the Y axis in the DataPoint, but nothing seems to work to get that summary to include units. With a vertical swipe it seems to just use my accessibility label and value (as I would expect).
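For context, a sketch of the descriptor setup one would expect to attach units, assuming inches and hourly data; per the post, VoiceOver still omits the units in the chart-level summary:

import Accessibility

// Y-axis descriptor whose valueDescriptionProvider appends units.
let yAxis = AXNumericDataAxisDescriptor(
    title: "Precipitation",
    range: 0...1,
    gridlinePositions: []
) { value in
    "\(value) inches" // read for individual points, but not in the summary
}

let xAxis = AXNumericDataAxisDescriptor(
    title: "Hour of day",
    range: 0...23,
    gridlinePositions: []
) { hour in
    "\(Int(hour)):00"
}

let series = AXDataSeriesDescriptor(
    name: "Hourly precipitation",
    isContinuous: false,
    dataPoints: [AXDataPoint(x: 9, y: 0.012), AXDataPoint(x: 12, y: 0.082)]
)

let chart = AXChartDescriptor(
    title: "Precipitation",
    summary: nil,
    xAxis: xAxis,
    yAxis: yAxis,
    additionalAxes: [],
    series: [series]
)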
0
0
236
Feb ’26
Apple Pay and third-party app installation not working
I'm writing this post to get more visibility. On March 6 I sent a feedback report (updated today, March 18) through the Feedback app installed on my iPhone. I ask anyone who works at Apple, especially the software engineers working on the iOS 26 accessibility features, to review this feedback in order to further expand the accessibility options for Apple users. The Feedback ID is below; thank you very much for the work you do. FB22142615
1
0
292
3d
“Desktop & Documents Folders” feature in iCloud Drive.
Dear Apple Support, I would like to raise a concern regarding the behavior of the “Desktop & Documents Folders” feature in iCloud Drive. From a business and development standpoint, the fact that folders may be automatically moved or created without clear and explicit user awareness is quite concerning. File system behavior is something users generally expect to remain predictable and fully under their control. In particular, when working in development environments, even small and unintended changes to folder structures can lead to issues such as broken file paths, build errors, or inconsistencies in project setups. The possibility that such changes may occur automatically introduces an element of uncertainty that is difficult to manage in professional workflows. Additionally, there are security considerations. For example, if sensitive files such as configuration data or API keys are temporarily stored on the Desktop, the possibility that they could be unintentionally synced to the cloud raises valid concerns. Even if safeguards exist, the lack of clear visibility and explicit confirmation makes it difficult to confidently assess and manage risk. Overall, the current behavior gives the impression that folder operations may occur without sufficient transparency. From a business perspective, this impacts trust, predictability, and operational reliability. I would appreciate consideration of the following improvements:
- Clear and explicit communication before any folder movement or creation occurs
- A strictly opt-in model with unambiguous user consent
- Greater visibility into when and how synchronization affects local files
- Options to ensure fully local control over specific directories
Thank you for your attention to this matter. I hope this feedback will contribute to improving the reliability and transparency of the feature. Sincerely,
1
0
263
1d
AXSpeech Thread Crash SEGV_ACCERR
Hi everyone, I've encountered a rare and strange crash in my app that I can't consistently reproduce. The crash seems to occur deep within Apple's internal frameworks, and I can't pinpoint which line of my own code is causing it. Here's the crash stack trace:

#44 AXSpeech SIGSEGV SEGV_ACCERR
0   CoreFoundation           ___CFCheckCFInfoPACSignature + 4
1   CoreFoundation           _CFRunLoopSourceSignal + 28
2   Foundation               _performQueueDequeue + 492
3   Foundation               ___NSThreadPerformPerform + 88
4   CoreFoundation           ___CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
5   CoreFoundation           ___CFRunLoopDoSource0 + 176
6   CoreFoundation           ___CFRunLoopDoSources0 + 340
7   CoreFoundation           ___CFRunLoopRun + 828
8   CoreFoundation           _CFRunLoopRunSpecific + 608
9   Foundation               -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
10  TextToSpeech             _TTSCFAttributedStringCreateStringByBracketingAttributeWithString + 776
11  Foundation               ___NSThread__start__ + 732
12  libsystem_pthread.dylib  __pthread_start + 136

Sometimes, instead of frame 10 referencing _TTSCFAttributedStringCreateStringByBracketingAttributeWithString, it shows:

10  TextToSpeech             LogWarning(char const*, ...) + 7288

Has anyone experienced a similar issue or know what might be triggering this crash? Any guidance on how to investigate or resolve this would be greatly appreciated. Thank you!
4
0
460
10h
Developer Mode Restart without HomeButton
After enabling Developer Mode on my iPhone and restarting it, the device asks me to press the Home button to confirm. Unfortunately, my Home button is broken, so I can’t access Developer Mode. The iPhone itself still works, but I can’t enable the mode. Is there any way to bypass this without the Home button?
3
0
122
Mar ’25
visionOS - Gamepad steals focus
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed, and custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is kept from running. This means that when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, focus interception by the system is stealing my A-press events and preventing my custom gamepad logic from running. Is there a way to disable the built-in gamepad interaction and only allow my custom gamepad mappings?
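A sketch of the custom mapping in question; registering a valueChangedHandler delivers raw button state, though on visionOS the system may still consume the press while a SwiftUI Button has focus, which is exactly the open problem here:

import GameController

// Hypothetical ROV command used for illustration.
func lockDepth() {
    print("Depth hold engaged")
}

func registerCustomMappings() {
    guard let gamepad = GCController.current?.extendedGamepad else { return }
    // Fires on every A-button state change, unless the system intercepts it.
    gamepad.buttonA.valueChangedHandler = { _, _, isPressed in
        if isPressed {
            lockDepth()
        }
    }
}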
1
0
273
Apr ’25
accessibilityActivationPoint Not Working When Set Directly on UITableViewCell
I'm trying to set the accessibilityActivationPoint directly on a UITableViewCell so that VoiceOver activates a specific button inside the cell. However, this approach doesn't seem to work. Instead, when I override the accessibilityActivationPoint property inside the UITableViewCell subclass and return the desired point, it works as expected. Why doesn't setting accessibilityActivationPoint directly on the cell work, while overriding it inside the cell does? Is there a recommended approach for handling this scenario? The following approach works:

override var accessibilityActivationPoint: CGPoint {
    get {
        // Recomputed each time VoiceOver asks; converts to screen coordinates.
        return convert(toggleSwitch.center, to: nil)
    }
    set {
        super.accessibilityActivationPoint = newValue
    }
}

but setting the activation point directly does not:

private func configureAccessibility() {
    isAccessibilityElement = true
    accessibilityLabel = titleLabel.text
    accessibilityTraits = .toggleButton
    // Assigned once at configuration time, and converted to the cell's own
    // coordinate space rather than screen coordinates.
    accessibilityActivationPoint = self.convert(toggleSwitch.center, to: self)
    accessibilityValue = toggleSwitch.accessibilityValue
}
2
0
209
Apr ’25
accessibilityElements excludes the unlisted subviews – How to Fix?
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible. Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
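For reference, a small sketch of the behavior being asked about: assigning accessibilityElements replaces the default element list, so any subview left out becomes unreachable to VoiceOver (the subview names below are placeholders):

import UIKit

final class ParentView: UIView {
    let titleLabel = UILabel()
    let actionButton = UIButton(type: .system)
    let detailLabel = UILabel()
    let footerLabel = UILabel()

    func configureAccessibilityOrder() {
        // Listing a subset hides the rest; include every accessible subview
        // in the exact order VoiceOver should visit them.
        accessibilityElements = [titleLabel, actionButton, detailLabel, footerLabel]
    }
}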
3
0
231
Apr ’25
The camera preview cannot be displayed in full screen
I downloaded the official camera sample code (https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview). It's a .swiftpm package, so I created a SwiftUI project, copied the official sample code into it, built it, and ran it on an iPhone 13 for testing. I found black empty areas at the top and bottom of the application interface, which means the camera preview does not fill the full screen. I have tried many methods but cannot get a full-screen preview. How should I modify the code?
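One hedged possibility, assuming the letterboxing comes from aspect-fit layout inside the safe area; CameraPreviewView below is a stand-in for the sample's preview view, not the sample's actual type:

import SwiftUI

// Placeholder for the sample's camera preview.
struct CameraPreviewView: View {
    var body: some View { Color.black }
}

struct FullScreenCameraView: View {
    var body: some View {
        CameraPreviewView()
            .aspectRatio(3.0 / 4.0, contentMode: .fill) // fill, cropping edges
            .ignoresSafeArea() // extend under the notch and home indicator
    }
}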
1
0
205
Apr ’25
Accessibility voice command recording does not start on Apple Vision Pro
Is the accessibility feature Voice Command recording available on the Apple Vision Pro? It does not start on my device; the Apple Vision Pro is on 26.1. Regular single voice commands work on the Apple Vision Pro, and recording commands worked on my other devices (iPad and iPhone).
2
0
781
Dec ’25
Square mouse and lack of transparency
After the 26.3 beta update, my mouse pointer has had major problems with transparency. I have to keep resetting the colors in Display settings, but the change doesn't hold. Is anyone else seeing this?
1
0
1k
Feb ’26