I am working on capturing 48MP images on the iPhone 16 Pro Max with the Ultra Wide camera. I've updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, I'm still not getting the expected results. My goal is to capture 48MP images, and I want to confirm whether the Ultra Wide camera supports this resolution or whether I'm missing some other configuration.
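For reference, here is a sketch of the check I'm adding to see what the active format actually reports. The assumption that 48 MP corresponds to 8064 × 6048 is mine, based on other 48 MP iPhone cameras, and note that the hard-coded 5712 × 4284 above only works out to about 24.5 MP:

if #available(iOS 16.0, *) {
    // Log every dimension the active format supports; the largest entry
    // is the one to request on both the output and the settings.
    for dims in device.activeFormat.supportedMaxPhotoDimensions {
        print("supported: \(dims.width) x \(dims.height)")
    }
    if let largest = device.activeFormat.supportedMaxPhotoDimensions.last {
        photoOutput.maxPhotoDimensions = largest
        photoSettings.maxPhotoDimensions = largest // rather than hard-coding dimensions
    }
}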
Any guidance would be appreciated!
AVPlayer has 3 visual accessibility issues with videos out of the box:
The contrast fails for the current time in the video
The contrast fails for the remaining time in the video
The hit area is too small for the time slider. The WCAG AA requirement is a minimum hit size of 24 x 24. The height of the hit area of the offending region is 8.
Is there a known fix for any of these?
This can be reproduced with this code in an app playground:
import SwiftUI
import AVKit
import UIKit

struct ContentView: View {
    private let video = URL(string: "https://server15700.contentdm.oclc.org/dmwebservices/index.php?q=dmGetStreamingFile/p15700coll2/15.mp4/byte/json")!
    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayerView(player: player)
                .frame(maxWidth: .infinity, maxHeight: 200)
        }
        .task {
            player = try? await loadPlayer(video: video)
        }
    }
}

private struct VideoPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.modalPresentationStyle = .overFullScreen
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}

private func loadPlayer(video: URL) async throws -> AVPlayer {
    let videoAsset = AVURLAsset(url: video)
    let videoPlusSubtitles = AVMutableComposition()
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .video)
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .audio)
    return await AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
}

private extension AVMutableComposition {
    func add(_ asset: AVAsset, withMediaType mediaType: AVMediaType) async throws {
        let duration = try await asset.load(.duration)
        try await asset.loadTracks(withMediaType: mediaType).first.map { track in
            let newTrack = self.addMutableTrack(withMediaType: mediaType, preferredTrackID: kCMPersistentTrackID_Invalid)
            let range = CMTimeRangeMake(start: .zero, duration: duration)
            try newTrack?.insertTimeRange(range, of: track, at: .zero)
        }
    }
}
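The only workaround I've found so far is to hide the system controls entirely (controller.showsPlaybackControls = false) and draw my own transport bar, where contrast and hit size are under my control. A rough sketch continuing the playground above; the 44-point height is my choice, comfortably above the 24 x 24 minimum:

// Sketch: custom transport controls replacing the system bar.
struct AccessibleTransportBar: View {
    let player: AVPlayer
    @State private var progress: Double = 0

    var body: some View {
        Slider(value: $progress, in: 0...1) { editing in
            // Seek only when the user finishes dragging.
            guard !editing,
                  let duration = player.currentItem?.duration,
                  duration.isNumeric else { return }
            player.seek(to: CMTime(seconds: duration.seconds * progress,
                                   preferredTimescale: 600))
        }
        .frame(height: 44)
        .accessibilityLabel("Playback position")
    }
}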
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but I see that this feature is not enabled.
I have configured the usage description (NSLocalNetworkUsageDescription) in Info.plist, and the app's Local Network permission is turned on, but when the app calls an endpoint on a LAN device it still returns an error:
Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={_kCFStreamErrorCodeKey=50, NSUnderlyingError=0x302d79d40 {Error Domain=kCFErrorDomainCFNetwork Code=-1009 "(null)" UserInfo={_NSURLErrorNWPathKey=unsatisfied (Local network prohibited), interface: en0[802.11], ipv4, uses wifi, _kCFStreamErrorCodeKey=50, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask .<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask .<1>"
), NSLocalizedDescription=The Internet connection appears to be offline., NSErrorFailingURLStringKey=http://192.168.1.1:80/xxx, NSErrorFailingURLKey=http://192.168.1.1:80/xxx, _kCFStreamErrorDomainKey=1}
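One debugging step that may help (a sketch, not a confirmed fix): touch the local network early so the system permission prompt appears before the real request, for example with an NWBrowser. Note that browsing for a Bonjour service also requires listing its type under NSBonjourServices in Info.plist; the "_http._tcp" type here is only an example:

import Network

let browser = NWBrowser(for: .bonjour(type: "_http._tcp", domain: nil),
                        using: NWParameters())
browser.stateUpdateHandler = { state in
    // A waiting/failed state here means the Local Network permission is
    // still being blocked despite the Settings toggle.
    print("browser state: \(state)")
}
browser.start(queue: .main)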
I'd like to add borders to all buttons in the iOS Simulator from my Mac app. First I get the Simulator window. Then I access the children of all AXGroups, and if a child is a button or a static text, I add a border.
But for some buttons this does not work. In the example image, the navigation bar buttons are not found. I guess the problem is that for some AXGroups the children array accessed with AXChildren is empty.
Here is some relevant code:
- (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
  NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

  NSLog(@">>> -----------------------------------------------------");
  NSString *role = [UIElementUtilities roleOfUIElement:element];
  NSRect frame = [UIElementUtilities frameOfUIElement:element];
  NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
  NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
  NSLog(@"lineage: %@", lineage);

  NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
  if (children.count < 1) {
    NSLog(@"NO CHILDREN");
  }
  for (NSInteger i = 0; i < [children count]; i++) {
    NSValue *child = children[i];
    AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
    NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
    NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
    NSLog(@"----%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
  }
  NSLog(@"<<< -----------------------------------------------------");

  for (NSInteger i = 0; i < [children count]; i++) {
    NSValue *child = children[i];
    AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
    NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
    NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
    NSLog(@"%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
    if ([role isEqualToString:@"AXButton"] ||
        [role isEqualToString:@"AXTextField"] ||
        [role isEqualToString:@"AXStaticText"]) {
      NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
      NSLog(@"tag: %@", tag);
      DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
      [tempOverlayElements addObject:overlayElement];
    } else if ([role isEqualToString:@"AXGroup"] ||
               [role isEqualToString:@"AXToolbar"]) {
      [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
    } else if ([role isEqualToString:@"AXWindow"]) {
      [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
      [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
    }
  }
  return [tempOverlayElements copy];
}
For some AXGroup the children are found. For some they are empty. I cannot figure out why.
Does anyone have an idea what I'm doing wrong?
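One thing I still need to rule out (an assumption on my part, based on how Accessibility Inspector attaches to the Simulator): iOS apps in the Simulator may only expose their full accessibility tree to external clients after the undocumented app-level attribute AXManualAccessibility is set. A sketch in Swift:

import ApplicationServices

// Sketch: enable the (undocumented) AXManualAccessibility attribute on the
// Simulator app so the embedded iOS accessibility hierarchy gets populated.
func enableManualAccessibility(pid: pid_t) {
    let app = AXUIElementCreateApplication(pid)
    let result = AXUIElementSetAttributeValue(app,
                                              "AXManualAccessibility" as CFString,
                                              kCFBooleanTrue)
    if result != .success {
        print("Failed to set AXManualAccessibility: \(result.rawValue)")
    }
}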
Could there be a haptic or sound cue, for the accessibility of the blind (sound) and deaf (haptic) populations, even just for knowing when Location Services and the camera were last used?
Also, the grey Location Services indicator, rather than the purple one, should appear for the full 24 hours after an application has used the service, if the description in Settings is accurate.
The green light that lets users know an application has switched to the camera, and the fading orange microphone light, could both have subtle, simple click sounds, like a shutter: a bigger haptic with a softer sound, but editable in Settings, of course.
I did the 18.3 update over the weekend, and every contact and their information (family names, addresses, photos, etc.) that was added to my phone over the last year is completely gone. I've spent hours on the phone with Apple and their "top" senior account employees with no resolution. I am told my case has been escalated to engineering and they will get back to me in one week. I have zero confidence my issue will be resolved. I've gone over and over every action taken over the weekend, and the only things I did were erase some emails and do the update. There has to be a way they can see every action made on my phone to find the issue.
I wrote an app for my own use and have been testing it on my own phone. After about a week it can no longer be opened and shows that it is no longer available.
P.S. I haven't paid for a developer account, and I don't plan to release the app.
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused.
According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup.
In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact.
So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
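The closest I've gotten so far is to stop relying on isTabBar and expose the semantics manually with .isSelected plus an explicit value. The "Tab X of Y" wording is my own, not something SwiftUI generates, and tabIndex/tabCount/selection are placeholders for however the custom tab bar tracks its state:

import SwiftUI

// Sketch: a hand-rolled tab button that VoiceOver announces as
// "Tab 1 of 2, selected".
struct TabBarButton: View {
    let title: String
    let tabIndex: Int
    let tabCount: Int
    @Binding var selection: Int

    var body: some View {
        Button(title) { selection = tabIndex }
            .accessibilityAddTraits(selection == tabIndex ? [.isSelected] : [])
            .accessibilityValue("Tab \(tabIndex + 1) of \(tabCount)")
    }
}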
I updated to developer beta 18.3 on my iPad (7th generation). After the update, sound isn't working. I tried restarting and completely resetting the device, but no luck. Also, the YouTube app isn't working after the update.
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
I have updated to iPadOS 26, and my pointer is still the circle one, not the arrow cursor.
Hello everyone,
I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on iPhone running iOS 16.5.
This has also been reported via Feedback Assistant with
Feedback ID: FB17806167
Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation.
Specifically:
The pointer cannot move past the center of the screen
Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape.
This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.
Steps to Reproduce:
Enable AssistiveTouch
Connect a Bluetooth mouse to the iPhone
Rotate the device to landscape orientation
Try moving the mouse pointer across the screen
→ Notice that:
Pointer cannot move past the center
Horizontal/vertical input is interpreted incorrectly (as if still in portrait)
Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.
Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned.
It behaves as if the device is still in portrait.
Horizontal mouse movement causes vertical pointer movement, and vice versa
User experience feels broken and unintuitive
Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS.
I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.
Thanks in advance for any insights or suggestions.
Best regards,
Jannis
I'm working on a BLE-connected device that uses ANCS and the system Clock app to receive alarm notification events for hearing-impaired people. It used to work up to the iPhone 13 with the latest iOS 18.x. Starting with the iPhone 14 (iOS 18.x), the system Clock alarm notification is not sent anymore.
Is there any reason for this to happen?
Is anyone aware of this behaviour?
Any suggestion would be really appreciated.
Cheers
The issue: I cannot automatically acquire Bluetooth keyboard focus in PHPickerViewController after enabling Full Keyboard Access on my iPhone 14 with iOS 18.3.1. The keyboard focus in PHPickerViewController does show, however, after I tap on blank space in the PHPickerViewController. How can I make it receive focus in the first place?
I'm using a UINavigationController and calling setNavigationBarHidden(true, animated: false). Then I use this controller to present a PHPickerViewController with the configuration setup below.
self.configuration = PHPickerConfiguration()
configuration.filter = .any(of: filters)
configuration.selectionLimit = selectionLimit
if #available(iOS 15.0, *), allowOrdering {
    configuration.selection = .ordered
}
configuration.preferredAssetRepresentationMode = .current
Finally, I set the delegate on the PHPickerViewController and call UINavigationController.present(PHPickerViewController, animated: true) to present it.
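One experiment I still want to try (a sketch only; PHPickerViewController runs out of process, so the focus system may not honor it) is to request a focus update in the presentation completion handler:

navigationController.present(picker, animated: true) {
    // Ask Full Keyboard Access to move focus to the picker once it is
    // on screen. navigationController/picker are the objects described above.
    if #available(iOS 15.0, *) {
        UIFocusSystem.focusSystem(for: picker)?.requestFocusUpdate(to: picker)
    }
}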
Also, I notice an animation appearing in the first video and then disappearing.
How can I force VoiceOver to read parentheses for math expressions like this:
Text("(2+3)×4") // VoiceOver: Two plus three, times four
I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks.
Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.”
Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
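One idea I'm considering (a sketch; the accessibilitySpeechIncludesPunctuation attribute name and its effect on braille and Rotor output need on-device verification) is to set the speech-punctuation attribute only on the parenthesis characters via AttributedString, leaving digits and operators untouched:

import SwiftUI

// Sketch: speak punctuation only for "(" and ")".
func expressionText(_ expression: String) -> Text {
    var attributed = AttributedString(expression)
    for index in attributed.characters.indices
    where "()".contains(attributed.characters[index]) {
        let next = attributed.characters.index(after: index)
        attributed[index..<next].accessibilitySpeechIncludesPunctuation = true
    }
    return Text(attributed)
}

// Usage: expressionText("(2+3)×4")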
At present, our in-house iOS app sometimes crashes on iOS 18.3 and later, while it works normally on other phones, and the certificate is not the problem.
We have found three affected machines in total, with no pattern between machine and system: different models and versions.
On a machine that crashes, the same app downloaded from the store does not crash; but if the app is packaged and installed directly from the development tool, it crashes. Is this related to compatibility with the new version of iOS?
Is there a solution? Have others run into a similar situation?
Hey there! Hope you are starting the year with great joy.
My situation
I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift.
My problem
I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.
My discoveries so far
I've tried OCR, but it's too slow for what I'm building, so the only approach I can think of is the Accessibility API.
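The direction I'm exploring now (a sketch, assuming the target element supports the parameterized attribute) is to ask the element for the screen bounds of a specific character range via kAXBoundsForRangeParameterizedAttribute, which should be far tighter than the element's overall frame:

import ApplicationServices

// Sketch: screen bounds for a character range of a text element.
func bounds(of range: CFRange, in element: AXUIElement) -> CGRect? {
    var cfRange = range
    guard let rangeValue = AXValueCreate(.cfRange, &cfRange) else { return nil }

    var result: CFTypeRef?
    let error = AXUIElementCopyParameterizedAttributeValue(
        element,
        kAXBoundsForRangeParameterizedAttribute as CFString,
        rangeValue,
        &result
    )
    guard error == .success,
          let value = result,
          CFGetTypeID(value) == AXValueGetTypeID() else { return nil }

    var rect = CGRect.zero
    AXValueGetValue(value as! AXValue, .cgRect, &rect)
    return rect
}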
Thank you in advance.
Hi, I applied for the CarPlay COMMUNICATION capability, but got a message that I already have the driving task app entitlement.
After that, I applied one more time, and there has been no reply since.
I do not have the com.apple.developer.carplay-communication capability. Does that mean I cannot apply for it?
What should I do next to get this capability?
Thanks
I have a couple of follow-up questions after the "Accessibility technologies group lab".
I know it was briefly mentioned that user feedback is an excellent way to grow inclusivity in an app's design, and that these forums, for example, are one avenue for that.
Is inviting folks here on the forum via TestFlight a reasonable approach to this for a solo developer?
Are there other strategies, avenues, or examples to promote user feedback?