Please refer to Feedback report: FB19701007
A Personal Voice created in English changes to either Spanish or Chinese and then no longer works properly. This has been happening since beta 1 of iOS/iPadOS 26.
I have been unable to pinpoint what causes this to occur; possibly downloading foreign voices to the device, or using different voices via AVSpeechSynthesizer.
I run an app in Xcode on my device that prints information about the installed voices to the console. Here is the output immediately after creating the voice:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: en-US
Then I hear a Spanish accent and find this:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: es-MX
Currently the voice isn't working at all, and here is the output:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: zh-CN
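That output comes from a loop along these lines (a minimal sketch; on iOS 17+ Personal Voices are only visible after requesting authorization):
import AVFoundation
// Request access to Personal Voice, then print the identifier and
// language of every installed Personal Voice.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }
    for voice in AVSpeechSynthesisVoice.speechVoices()
        where voice.voiceTraits.contains(.isPersonalVoice) {
        print("Voice Identifier: \(voice.identifier)")
        print("Language: \(voice.language)")
    }
}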
Note that the voice identifier is identical in all three outputs above. No matter what the user does, a created voice file should never be able to change languages. I reset all of my test devices by erasing all content and settings and created a new English Personal Voice, and the issue persists.
A side issue: the Share Across Devices toggle doesn't stay off when turned off. I disabled sharing to see whether that could be the cause, but the toggle turns itself back on; it won't remain off.
I'm developing an ARKit application in which I attach procedurally generated audio to detected planes in the environment. Using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, but integrating procedural audio via AVAudioSourceNode produces no sound and no error messages: Stack Overflow Post
Working Implementation with Static Audio File:
let audioSource = SCNAudioSource(fileNamed: "sound.wav")! // placeholder asset name
let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)
Attempted Implementation with Procedural Audio:
let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    // Audio generation code
    return noErr
}
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)
In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when wrapped in an SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. In short, what doesn't work is procedural audio through an AVAudioNode, as documented here:
Apple docs
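For contrast, a source node of this kind does produce sound when wired straight into an AVAudioEngine (a sketch; the sine generator stands in for my real render block):
import AVFoundation
let engine = AVAudioEngine()
let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
var phase = 0.0
// A 440 Hz sine wave as a stand-in for the real audio generation code.
let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase))
        phase += 2.0 * .pi * 440.0 / sampleRate
        for buffer in buffers {
            buffer.mData!.assumingMemoryBound(to: Float.self)[frame] = sample
        }
    }
    return noErr
}
engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: nil)
try? engine.start() // audible here; the SCNAudioPlayer route is not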
Additionally, I explored the WWDC18 AR game project, SwiftShot, which utilizes SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play. I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating:
Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use.
However, this method does not appear to be available in Swift. Any insights or guidance on this matter would be greatly appreciated.
Hello
I tried implementing ASAM (Autonomous Single App Mode) for macOS per Apple's guidelines, with the configuration profile mentioned here, but didn't have any success.
Apple then suggested using requestGuidedAccessSession on macOS, but that API is only available through Mac Catalyst, and it didn't work either, even with valid configuration profiles.
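For reference, the call on the Catalyst side is essentially this (a minimal sketch):
import UIKit
// Attempt to enter Autonomous Single App Mode; this is only supposed
// to succeed on a device carrying a matching ASAM configuration profile.
UIAccessibility.requestGuidedAccessSession(enabled: true) { success in
    print("Guided Access session started: \(success)")
}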
Has anyone had success with ASAM mode without the assessment entitlement?
Topic: Accessibility & Inclusion
SubTopic: General
I have an HTML select that has Spanish text in the options.
When VoiceOver reads the selected option (unopened), it switches to Spanish as expected.
However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text.
I have tried adding a lang attribute to both the select tag and the option tags, but neither helps:
https://codepen.io/grahamfowles/pen/VYYRxMK
Topic: Accessibility & Inclusion
SubTopic: General
Since UITextView does not support zooming on its own, the UITextView is added as a subview of a UIScrollView to provide zooming.
However, when the text contains a link, the text that follows the link goes missing.
Ex) https://appstoreconnect.apple.com/login\nApple Developer Login
-> The text “Apple Developer Login” does not appear.
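The setup, reduced to a sketch (view names and layout are mine):
import UIKit
final class ZoomableTextViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let textView = UITextView()
    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        scrollView.delegate = self
        scrollView.maximumZoomScale = 4
        view.addSubview(scrollView)
        textView.frame = scrollView.bounds
        textView.isEditable = false
        textView.dataDetectorTypes = [.link]
        // On iOS 18 the text after the detected link no longer renders.
        textView.text = "https://appstoreconnect.apple.com/login\nApple Developer Login"
        scrollView.addSubview(textView)
    }
    // Zooming the scroll view zooms the embedded text view.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? { textView }
}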
If anyone has experienced the same problem as me or knows a solution, please leave a comment.
Note)
It works normally on iOS 16, but the text after the link disappears on iOS 18.
The text is not visible, but you can copy and paste it to retrieve the missing content.
Topic: Accessibility & Inclusion
SubTopic: General
Hey everyone!
I am developing a screen time limit app to help people spend less time in distracting apps.
It works this way: people pick the apps that are unhealthy for them and, on the other side, productivity apps. In the app you can exchange time spent on healthy habits for time to scroll or use the distracting apps.
The idea took off on social media, and the app already has 100k followers without even being launched yet.
So I am waiting just for one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago.
There are a lot of similar apps on the market, and this feature exists in other screen time limit apps.
Why is app blocking permission needed?
Time Exchange Functionality:
Users independently select which apps are productive and which are distracting for them.
The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones. This encourages healthy device usage.
Full User Control:
All apps to be blocked are manually selected by the user in the settings.
The extension does not impose any restrictions without explicit permission.
Transparency and Security:
Blocking happens locally, with no data collected about app usage.
We adhere to Apple’s privacy policy.
Compliance with App Store Guidelines:
We understand that app blocking is a sensitive feature, but in our case it:
Is used for the benefit of the user (digital detox, productivity improvement).
Does not interfere with system processes or other developers’ apps.
Does not misuse access to APIs.
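For clarity, the blocking itself would use the standard Screen Time APIs, roughly as follows (a sketch; selection stands for the user's FamilyActivityPicker choice, and the calls require the Family Controls entitlement we are waiting for):
import FamilyControls
import ManagedSettings
// Ask for Screen Time authorization, then shield only the apps the
// user picked themselves.
try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
let store = ManagedSettingsStore()
store.shield.applications = selection.applicationTokens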
My question to the forum is:
Did you have similar problems, and how did you resolve them?
Are there any ways to speed up the process or contact someone from the approval team directly?
Should I give up and release it on Android?
I am very disappointed and frustrated. Hope to get some useful tips.
Thank you very much!
Topic: Accessibility & Inclusion
SubTopic: General
I am launching an app for an unlisted primary language. I consider whatever is inside the app to be the primary language, and that won't be English; amongst the two languages, my app would have, let's say, a 10%/90% split. The secondary language
Hi everyone,
I’m developing a React Native iOS app that includes a custom keyboard extension for sending stickers across apps. The project builds successfully, and the main app installs fine on my test device.
However, I’m not seeing the keyboard extension appear under Settings → General → Keyboard → Keyboards → Add New Keyboard, which means I can’t activate it or grant access. At this point, I’m not even sure if the extension is actually being installed on the device along with the main app.
Here’s what I’ve done so far. I created a Keyboard Extension target in Xcode, set the correct bundle identifiers and provisioning profiles, and enabled “Requests Open Access” in the extension’s Info.plist. I built and installed the app on a physical device rather than the simulator to ensure proper testing.
My main questions are: how can I confirm that the extension is being installed on the device, and if it isn’t, what might prevent it from installing even though the build completes successfully?
Any insights, troubleshooting steps, or guidance would be greatly appreciated.
I've just received an email from Apple regarding the Global Accessibility Awareness Day and some forthcoming sessions to promote their accessibility features.
What a joke.
For many years, Apple refuses to provide the most basic accessibility requirement on macOS:
LET USERS DISABLE ALL NON-CONSENSUAL UNSOLICITED ANIMATIONS AND OTHER UI CONVULSIONS.
The scourge of animations started with macOS Lion.
Yes, fortunately many of them can be disabled through obscure Terminal commands (that is, if the user is lucky enough to discover them on some obscure internet resource).
The "Reduce motion" control in System Settings is a fake option that doesn't do anything.
And the two most glaring accessibility violations cannot be disabled at all:
Scroll bar rollover highlight effect introduced on macOS 10.7.3. Every time you move the cursor over a scroll bar, the bar gets highlighted. It results in bringing the user's attention to random scroll bars for no reason whatsoever just because the cursor happens to pass over the bar at some point. HUNDREDS of unnecessary, annoying events of distraction daily!
Expand/collapse animation of NSOutlineView (such as when we open/close a folder in the list view in the Finder, as well as any other app that's using outline views). It's extremely annoying, distracting, and time-wasting.
All feedback submitted about this through the years remains mostly ignored (except for a few cases where I received some ridiculous replies from employees who, apparently, are barely familiar with Macs in general).
Apple does NOT care about accessibility. Not only this, but it's obvious that Apple is, in fact, intentionally abusing those users who can't tolerate distracting, time-wasting animations and UI convulsions.
I have a UITextField in my application for entering a state. Tapping it pops up a UIPickerView that lets the user select a state (though they can still type, too).
The issue relates to Full Keyboard Access. If the UITextField is selected using an external keyboard, the UIPickerView appears, but to reach it the user has to tab through the whole view controller, because the picker sits at the end of the focus order.
What would be nice is to either a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable with the keyboard right away), or b) make the UIPickerView the next view reached when tabbing or pressing the arrow keys.
I've tried:
- UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This only announced the view; it didn't help with Full Keyboard Access.
- Making the UIPickerView a first responder when it appears.
- Changing the accessibilityElements order (but with so many views nested within views, this isn't really a viable option).
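For concreteness, the first two attempts looked roughly like this (a sketch; pickerView is the picker shown from the text field):
import UIKit
// Attempt 1: announce the picker; VoiceOver reacts, but Full Keyboard
// Access focus stays where it was.
UIAccessibility.post(notification: .screenChanged, argument: pickerView)
// Attempt 2: make the picker first responder once it is visible.
pickerView.becomeFirstResponder()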
Pressing Tab plus the right-arrow key quickly takes the user to the end of the chain of accessibility elements, in other words to the UIPickerView. But there has to be a cleaner way of automatically setting focus to the UIPickerView, or of making it the next element reached with the arrow key.
I created a desktop app for Mac using Xojo. The app has a controller in the main window and displays advertisements and notices on a connected external display.
I'm currently connecting my 24-inch iMac to a REGZA 55M550M via AirPlay to show video from the iMac on the REGZA, but the connection occasionally drops. Yesterday it dropped about 3.5 hours after connecting. I do have other apps running on the iMac, but nothing that would strain the network or memory.
Does AirPlay connection to non-Apple products become unstable over long periods of time?
The accessibility nutrition labels seem like a great feature, but I don't see keyboard access mentioned. Are there plans to add it to the accessibility nutrition labels? To emphasise the point: keyboard accessibility is not just for desktop computers; apps need it too.
Hello everyone,
Our community dues payment app only facilitates real-world maintenance-dues payments made directly to property managers' bank accounts. During review, however, it was likely flagged by the AI-driven review system on a metadata criterion and rejected under Guideline 3.1.1 ("Paid digital content must use IAP").
Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:
- The app is completely free
- No digital content or subscriptions are sold
- Dues payments are made via bank transfer or credit card directly to the manager
Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process?
Thanks!
Topic: Accessibility & Inclusion
SubTopic: General
I need to direct text-to-speech audio from my app to a Bluetooth speaker and the internal iPad speaker simultaneously. The app uses AVSpeechSynthesizer and several third-party speech engines. What is the best way to do this?
I noticed the outputChannels property on AVSpeechSynthesizer. Are there any examples of how to use it?
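I haven't found examples either. From the documentation, the idea appears to be handing the synthesizer channel descriptions pulled from the current route; an untested sketch:
import AVFoundation
let synthesizer = AVSpeechSynthesizer()
// Collect the hardware channels of every output in the current route
// (e.g. Bluetooth speaker plus built-in speaker) and target all of them.
let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
synthesizer.outputChannels = outputs.flatMap { $0.channels ?? [] }
Whether both ports are actually active at the same time still depends on the audio session configuration, so this may only be half the answer.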
Topic: Accessibility & Inclusion
SubTopic: General
I'm reposting here my FB17602742, submitted yesterday:
The strong wording of this message comes from years of Apple ignoring the needs of users who can't tolerate UI animations and convulsions.
At this point, it's clear that Apple is either intentionally harming users like me or simply doesn't care about meeting even the most basic accessibility standards on macOS.
Yes, many UI animations and convulsions can, fortunately, be disabled - but not through straightforward UI controls. Instead, users are forced to look for obscure Terminal commands found scattered across the Internet.
The "Reduce motion" checkbox in System Settings is simply a fake control that doesn't do anything - instead of actually disabling all UI animations and convulsions.
What's worse, two of the most offensive UI animations cannot be disabled at all. Apple has consistently dismissed requests to let users disable the following UI animations:
Scroll bar rollover highlight effect (introduced on macOS 10.7.3). Every time the cursor passes over a scroll bar, it gets highlighted. This draws the user's attention to random scroll bars for no reason - just because the cursor happened to pass over them. It results in HUNDREDS of unnecessary, annoying events of distraction daily!
Expand/collapse animation of NSOutlineView (e.g., when opening/closing folders in the list view in the Finder, or any other app using outline views). This animation is extremely distracting, irritating, and time-wasting.
Global Accessibility Awareness Day is approaching.
Dear Apple,
Please adhere to the most basic accessibility standards. Stop the needless suffering of countless users like me. Let us disable the two aforementioned UI convulsions.
Thank you for your attention to the issue.
System Settings => Accessibility => System Voice => the little (i) beside the pulldown => Voices => THIS SCREEN lets you download Premium voices.
Is there a way to bring up this screen programmatically, or at least a link to get my users there without having to dig through that swamp of screens?
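As far as I know there is no public deep link to that screen; the only supported jump is into your own app's page in Settings (a sketch):
import UIKit
// Opens this app's own settings page; there is no public URL scheme
// for the voice-download screen itself.
if let url = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(url)
}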
I'm developing a document editor for macOS using AppKit, which supports structured content such as titles and multiple heading levels—similar to what you see in the Pages app.
I'm looking for a way to programmatically mark a specific substring within an NSTextView as a heading, so that VoiceOver can recognize it and announce it appropriately (e.g., by saying “heading” before reading the text). This would be similar in spirit to how NSAccessibilityLinkTextAttribute works for links.
Is there an existing accessibility text attribute or recommended approach to achieve this behavior for headings? If not, I’d appreciate any guidance or suggestions on how best to implement this in a VoiceOver-friendly way.
Thank you in advance for your help!
Best regards,
Topic: Accessibility & Inclusion
SubTopic: General
Hello,
Whenever I put accessibility focus on an image that has text in it, VoiceOver reads that text along with the image's accessibility label.
Is there a way to programmatically turn off text recognition on images for accessibility?
I couldn't find any relevant accessibility APIs that could help here.
Thanks!
I’m trying to enroll in the Apple Developer Program as an individual. I’ve gone through the steps on the website and started the purchase process. However, after a couple of days when I return to the site, it doesn’t remember my progress — I have to start the enrollment from scratch every time.
Is this expected behavior? Am I missing a step to save my progress or complete the enrollment properly?
Any help or guidance would be appreciated. Thank you!
I have some doubts about how VoiceOver handles focus when the screen updates.
When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus on? Is there a way to control or customize this behavior?
In a UISplitViewController, when an item is selected in the primary view controller, the focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
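For framing: the two tools I know of are the focus notifications (a sketch; the argument views are placeholders for whatever should receive focus):
import UIKit
// A new screen was pushed or presented: VoiceOver moves focus to the
// argument, or to the first element when the argument is nil.
UIAccessibility.post(notification: .screenChanged, argument: firstHeading)
// Part of the current screen changed, e.g. new content in the secondary
// column of a UISplitViewController: use layoutChanged instead.
UIAccessibility.post(notification: .layoutChanged, argument: detailContentView)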