In UIKit, certain events like a button tap can be simulated using:
button.sendActions(for: .touchUpInside)
This allows us to trigger the button’s action programmatically.
However, in SwiftUI, there is no direct equivalent of sendActions(for:) for views like Button. What is the recommended approach to programmatically simulate a SwiftUI button tap and trigger its action?
Is there an alternative mechanism to achieve this (and the other events under UIControl.Event), especially in scenarios where we want to test interactions or trigger actions without direct user input?
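For context, the closest workaround I have found is to factor the button's action out so it can be called directly; a minimal sketch (the names here are mine, not a hidden SwiftUI API):

import SwiftUI

struct ContentView: View {
    @State private var count = 0

    var body: some View {
        VStack {
            // The Button and the programmatic path share one function.
            Button("Increment", action: increment)
            Button("Simulate tap") { increment() } // programmatic "tap"
            Text("Count: \(count)")
        }
    }

    private func increment() { count += 1 }
}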
I'm not quite sure where the problem is, but I will describe what I am doing to recreate the issue, and am happy to provide whatever information I can to be more useful.
I am changing the ActivationPolicy for my app in order to make it unobtrusive when in the background (e.g. hiding it from the Dock and using only a menu bar status item). When the user activates the app with a hotkey, it changes from NSApplicationActivationPolicyAccessory back to NSApplicationActivationPolicyRegular. This allows normal usage (Dock icon, menu bar, etc.).
This works fine, except in a rare situation which I finally just tracked down. If there is a window open in the app and I use the hotkey to convert back to an accessory, and then disconnect and reconnect the display on which the app was previously displayed, when I convert the app back to "regular mode", the menu bar has disappeared (and I am left with an empty space at the top of the screen). I can also trigger this bug by having the display in question briefly mirror the other display (effectively "orphaning" the hidden app), and then restoring the original side-by-side configuration before activating the app again.
The app otherwise works, but the menu bar is missing. Switching back and forth with other apps does not fix the problem. Quitting and restarting the app resolves the issue. As does disabling the accessory only mode and forcing the app to always remain in "regular mode" with a dock icon (there is a preference for this in my app). Once fixed, I can then re-enable the "accessory mode" and all is well until the bug is triggered again.
The bug would normally occur quite sporadically, presumably requiring a particular combination of changing Spaces or displays, or having the computer go to sleep while this app was in accessory mode. Thus far, the above is the only way I have found that can replicate this issue on demand.
If I close all windows before hiding the app, then it works fine when I revert to "regular mode". It only happens if there is a window open at the time.
Using applicationDidChangeScreenParameters: on my AppDelegate indicates that there is a change in screen, and logging window.screen.frame for each open window in [NSApp orderedWindows] shows that the size changes from e.g. 1920x1080 to 0x0 and back while the display is disconnected or mirrored.
There is also an error in the console in Xcode when this happens -- invalid display identifier <some UUID>.
I have tried various options for window collectionBehavior, as well as various settings for Spaces (which I normally use). None of these changes has fixed the behavior thus far.
I use [NSApp hide:self]; from my AppDelegate to hide the app, and [[NSRunningApplication currentApplication] activateWithOptions:NSApplicationActivateAllWindows];[NSApp unhide:self]; to bring it back to the front.
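For reference, a trimmed Swift sketch of that toggle (the real app is Objective-C; names here are illustrative):

import AppKit

// Hide the app and drop to accessory mode (menu bar status item only).
func enterAccessoryMode() {
    NSApp.setActivationPolicy(.accessory)
    NSApp.hide(nil)
}

// Restore the Dock icon and bring all windows back to the front.
func enterRegularMode() {
    NSApp.setActivationPolicy(.regular)
    NSRunningApplication.current.activate(options: .activateAllWindows)
    NSApp.unhide(nil)
}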
I welcome any ideas for things to chase down, or requests for more specific information that would be useful.
Thank you!
Fletcher
I'm building an app using UITabBarController with two tabs: screen A and screen B. When tab B is selected and I tap on tab A, the order in which the events are triggered is:
For iOS < 18:
viewWillDisappear() of screen B
tabBarController(_:didSelect:) of UITabBarController
For iOS >= 18:
tabBarController(_:didSelect:) of UITabBarController
viewWillDisappear() of screen B
So my question is: is this a bug, or an intentional change from Apple in iOS 18.*?
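For context, a minimal sketch (class names are illustrative) of how I observe this ordering:

import UIKit

final class ScreenB: UIViewController {
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        print("viewWillDisappear() of screen B")
    }
}

final class MainTabBarController: UITabBarController, UITabBarControllerDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        delegate = self
    }

    func tabBarController(_ tabBarController: UITabBarController,
                          didSelect viewController: UIViewController) {
        print("tabBarController(_:didSelect:)")
    }
}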
I want SensorKit data for research purposes in my current application. I have applied for and received permission from Apple to access SensorKit data.
During implementation, I encountered an issue in which no data was being retrieved despite granting all the necessary permissions.
I am using the didCompleteFetch and didFetchResult delegate methods to retrieve data from SensorKit. The didCompleteFetch method is called, but where can I find the different event data (Device Usage, Ambient Light, etc.)? The didFetchResult method is never called.
Methods I am using:
1. func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest)
2. func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool
Could anyone please assist me in resolving this issue? Any guidance or troubleshooting steps would be greatly appreciated.
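For reference, here is roughly how my fetch is wired up (a trimmed sketch; identifiers are illustrative, and it assumes authorization was granted and startRecording() was called earlier):

import SensorKit

final class FetchCoordinator: NSObject, SRSensorReaderDelegate {
    let reader = SRSensorReader(sensor: .deviceUsageReport)

    func start() {
        reader.delegate = self

        // SensorKit holds samples for roughly 24 hours before apps may read
        // them, so a window ending "now" typically yields no didFetchResult
        // callbacks at all.
        let now = CFAbsoluteTimeGetCurrent()
        let request = SRFetchRequest()
        request.from = SRAbsoluteTime.fromCFAbsoluteTime(_cf: now - 7 * 24 * 3600)
        request.to = SRAbsoluteTime.fromCFAbsoluteTime(_cf: now - 24 * 3600)
        reader.fetch(request)
    }

    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        // The per-sensor payload arrives in result.sample; cast it to the type
        // matching the reader's sensor, e.g. SRDeviceUsageReport.
        if let report = result.sample as? SRDeviceUsageReport {
            print("screen wakes: \(report.totalScreenWakes)")
        }
        return true // keep delivering further results
    }

    func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest) {
        print("fetch complete")
    }
}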
Where does an NSRulerView get its magnification from, and how? I am not using NSScrollView's automatic magnification but my own mechanism. How do I relay the zoom factor to the NSRulerView?
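One approach that may work (an assumption about NSRulerView's behavior, not a documented contract): the ruler derives its markings from the client view's coordinate system, so expressing a custom zoom as a bounds-versus-frame scale on the document view should let the ruler pick it up. A sketch:

import AppKit

func applyZoom(_ zoom: CGFloat, to documentView: NSView) {
    // Shrinking the bounds relative to the frame magnifies the content,
    // and the ruler measures in the document view's (scaled) coordinates.
    let size = documentView.frame.size
    documentView.bounds = NSRect(x: 0, y: 0,
                                 width: size.width / zoom,
                                 height: size.height / zoom)
    documentView.enclosingScrollView?.horizontalRulerView?.needsDisplay = true
    documentView.enclosingScrollView?.verticalRulerView?.needsDisplay = true
}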
I have a use case in which there is a zoomable and pannable parent view, and a child view that needs to display a custom context menu on long press.
(The reason why I need to implement a custom context menu is this: https://developer.apple.com/forums/thread/773810)
It seems, however, that this setup produces a bug in SwiftUI. The problem is that the onChanged of the drag gesture is only invoked right before its onEnded, hence you cannot smoothly animate the drag:
struct ContentView: View {
    var body: some View {
        ZStack {
            Rectangle()
                .foregroundStyle(.green)
                .frame(width: 200, height: 200)
                .onLongPressGesture {
                    print("long press")
                }
        }
        .gesture(MagnifyGesture().onEnded { value in
            print("magnify end")
        })
        .gesture(DragGesture()
            .onChanged { value in
                print("drag change")
            }
            .onEnded { value in
                print("drag end")
            })
    }
}
Changing the DragGesture to a .highPriorityGesture() makes the drag's onChanged execute at the correct time, but results in the LongPressGesture only triggering when the user lifts their finger (which was originally not the case).
So it seems that the two gestures are in a sort of conflict with each other.
Is there a workaround?
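One thing that might be worth trying (an untested sketch, not a confirmed fix): attaching the drag as a simultaneous gesture so it no longer competes with the long press for exclusivity, i.e. replacing the second .gesture(...) call above with:

.simultaneousGesture(
    DragGesture()
        .onChanged { value in print("drag change") }
        .onEnded { value in print("drag end") }
)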
If you add the .scaleEffect() modifier to a parent view that contains children with contextMenu() modifiers, the context menu preview unfortunately keeps the original, unscaled size of the child view. This results in a very weird, glitchy user experience.
Unfortunately, providing a custom preview that is scaled up also does not help, since even though it is the right size, it gets cropped to the unscaled size of the child view.
Adding scaleEffect() to each child element individually (BEFORE the contextMenu() modifier!) does make the problem disappear. However, I would like to avoid this, since my use case is zooming into a complex graph with context menus on its nodes, and having to recalculate the position of each node manually seems to perform much worse than delegating that work to scaleEffect().
Tested on iOS 18.2 (device + simulator).
Is there a workaround?
Here is a minimal working example that demonstrates the problem:
struct ContentView: View {
    var body: some View {
        VStack(spacing: 20) {
            Rectangle()
                .frame(width: 100, height: 100)
                .contextMenu {
                    Button("Test") {}
                    Button("Test") {}
                }
            Rectangle()
                .frame(width: 100, height: 100)
                .contextMenu {
                    Button("Test") {}
                    Button("Test") {}
                }
        }
        .scaleEffect(1.5)
    }
}
Screenshot (problem: The two squares are the same size. However, the long-tapped upper square got shrunk down before the context menu got displayed.)
I have a custom keypad for accepting numeric input on iPads that I have been using for many years. This is longstanding, working code. With iOS 18, the touchUpInside (and other) events in the underlying Objective-C modules are no longer delivered to the file's owner module when activated from the interface. The buttons do seem to be properly activated, based on the visual cues (they change color when pressed). This occurs both in simulators and on hardware. Setting the target OS version does not help. What could be the cause and/or solution?
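A hypothetical sanity check (class and selector names are illustrative): log the wired target-action pairs to confirm the nib connections survived into iOS 18, and re-add the action in code to rule out an interface-wiring problem:

import UIKit

final class KeypadOwner: NSObject {
    @objc func keyTapped(_ sender: UIButton) {
        print("keyTapped fired")
    }

    func audit(_ button: UIButton) {
        for target in button.allTargets {
            let actions = button.actions(forTarget: target, forControlEvent: .touchUpInside) ?? []
            print("target: \(target), touchUpInside actions: \(actions)")
        }
        // Programmatic wiring as a cross-check against the interface file.
        button.addTarget(self, action: #selector(keyTapped(_:)), for: .touchUpInside)
    }
}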
Hi,
I have an existing Mac app on the App Store with a couple of widgets as part of the app. I want to now add a new widget to the WidgetBundle. When I build the updated app with Xcode, and then run the updated app, the widgets list doesn't seem to get updated in Notification Center or in the WidgetKit Simulator.
I do have the App Store version installed in the /Applications folder as well, so there might be some conflict. What's the trick to getting the widgets list to run the debug version?
I have DNG files that I want to open and show as EDR content in my app. It seems like the DNG files should have enough per-pixel information to show more colors than Display P3, but whenever I load the images using CIRAWFilter and then inspect the outputImage color space, it is always "DisplayP3", not something like "ITU-R BT.2100 PQ". There doesn't seem to be any way to make it load with a different color space for displaying EDR images.
Does this make sense for DNG files? It seems like they should support this.
If I open the same file using CIImage with the expandToHDR option e.g.
CIImage(contentsOf: rawURL, options: [.expandToHDR: true])
then it does have the desired EDR color space, but then I don't get any of the properties that are available via the CIRAWFilter class to manipulate the data.
Basically I just want to be able to open the DNG file via CIRAWFilter and then display it in my SwiftUI app as an EDR image by adding the allowedDynamicRange(.high) modifier.
Image("my-dng-image").allowedDynamicRange(.high)
Or do DNG files (just RAW, not ProRAW) not contain enough information to be displayed as EDR images? It seems like they should.
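One thing that might help (a hedged, untested sketch): CIRAWFilter has an extendedDynamicRangeAmount property to request EDR headroom from the decoder, and you can render the output into a half-float bitmap tagged with an extended-range color space rather than trusting outputImage's reported space:

import CoreImage

func makeEDRImage(from rawURL: URL) -> CGImage? {
    guard let rawFilter = CIRAWFilter(imageURL: rawURL) else { return nil }
    rawFilter.extendedDynamicRangeAmount = 2.0 // 0 = SDR ... 2 = full EDR

    guard let output = rawFilter.outputImage,
          let edrSpace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3) else {
        return nil
    }

    // Render into 16-bit float RGBA so values above 1.0 survive.
    let context = CIContext()
    return context.createCGImage(output,
                                 from: output.extent,
                                 format: .RGBAh,
                                 colorSpace: edrSpace)
}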
In SwiftUI I want to create a list with a LazyVStack, where each row item is a LazyHStack: a horizontally scrollable list of images, each with a description limited to three lines. The width of every item is fixed at 100, but the height of every item varies with its description text.
But in any given row, if the first item has a one-line image description and the remaining items in the same row have three-line descriptions, the LazyHStack truncates all the image descriptions in that row to one line, making all the items in the row the same height.
Why does LazyHStack not support items of varying height?
The expected behavior is that the height of each LazyHStack item should adjust automatically to its content. But it seems SwiftUI does not support a LazyHStack with items of varying height.
Will SwiftUI ever support this feature?
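For context, a minimal reconstruction of the layout described above (assumed, not the exact code; note the .fixedSize, which is often needed to stop a row from truncating multi-line text):

import SwiftUI

struct RowItem: Identifiable {
    let id = UUID()
    let text: String
}

struct GalleryRow: View {
    let items: [RowItem]

    var body: some View {
        ScrollView(.horizontal) {
            LazyHStack(alignment: .top) {
                ForEach(items) { item in
                    VStack {
                        Image(systemName: "photo")
                        Text(item.text)
                            .lineLimit(3)
                            // Let the text claim its full (multi-line) height.
                            .fixedSize(horizontal: false, vertical: true)
                    }
                    .frame(width: 100, alignment: .top)
                }
            }
        }
    }
}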
So I am looking to use a custom NSWindow in my application (so I can implement some enhanced resizing/dragging behavior, which is only possible by overriding NSWindow).
The problem is my whole application is currently SwiftUI-based (see the project here: https://github.com/msdrigg/Roam/blob/50a2a641aa5f2fccb4382e14dbb410c1679d8b0c/Roam/RoamApp.swift).
I know there is a way to make this work by dropping my @main SwiftUI app and replacing it with a SwiftUI root view hosted in a standard AppKit root app, but that feels like I'm going backwards.
Is there another way to get access (and override) the root NSWindow for a SwiftUI app?
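One pattern that may get partway there (a sketch; it lets you reach and reconfigure the hosting NSWindow, though it cannot change the window's class after SwiftUI creates it):

import SwiftUI
import AppKit

struct WindowAccessor: NSViewRepresentable {
    var onWindow: (NSWindow) -> Void

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        // The window is nil until the view is attached, so defer one runloop turn.
        DispatchQueue.main.async {
            if let window = view.window { onWindow(window) }
        }
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {}
}

// Usage: attach to the root view of the scene.
// .background(WindowAccessor { window in
//     window.isMovableByWindowBackground = true
// })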
This pertains to iPad apps and UITabBarController in UIKit.
Our internal app for employees uses a UITabBarController displayed at the bottom of the screen. Users prefer to keep it at the bottom, and it becomes challenging to argue against this when they point out that the iPhone version displays it "correctly" at the bottom. My response is to trust Apple's design team in keeping it at the top.
One workaround is to build the app with the previous Xcode version, Xcode 15 (via Xcode Cloud), targeting iPadOS 17. This ensures the tab bar is shown at the bottom of the screen. However, this approach has its drawbacks: Apple may not support it in the future, leading to missed security updates for the app, among other issues.
Exploring the UITabBarController mode property appears to be the solution I am seeking. To quote the documentation: "on iPad, if the tabs array contains one or more UITabGroup items, the system displays the content as either a tab bar or a sidebar, depending on the context. Otherwise, it displays the content only as a tab bar." The part "displays the content only as a tab bar" made me think this would be business as usual:
class ViewController: UITabBarController {
    override func viewDidLoad() {
        super.viewDidLoad()
        self.mode = .tabBar
    }
}
Unfortunately, this does not resolve the issue.
Is there an API method that can force the tab bar back to its previous bottom position?
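One possibility I have seen suggested (hedged; I have not explored the side effects): overriding the trait collection so the controller lays out as compact, which makes UIKit fall back to the iPhone-style bottom tab bar:

import UIKit

class ViewController: UITabBarController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Forces the compact (iPhone-style) layout, with the tab bar at the bottom.
        traitOverrides.horizontalSizeClass = .compact
    }
}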
The app uses multiple windows. When we split a window to launch another instance, the tab bar appears at the bottom. This behavior seems tied to the form factor, or potentially to how many items are in the tab bar.
We could build a custom tab bar to override this, but that goes against my “don’t reinvent the wheel” principle.
Any comments are welcome, and thank you for reading this (to the end).
Theo
Hi,
I have a few questions regarding widgets.
I would like to know whether widgets and app extensions are the same thing. This link (https://developer.apple.com/app-extensions/) says a widget is a type of app extension, but I am not quite sure, as a few links on the web say they are different, so I need to confirm here :)
Can a widget share the same bundle ID as the main app? Basically, can we use the same provisioning profile as the main app?
If we use the same bundle ID and provisioning profile, will there be any issue during the App Store submission process?
I'm creating an app which gamifies Screen Time reduction. I'm running into an issue with Apple's Screen Time settings, where the user can disable my app's "Screen Time access" and get around losing the game.
Is there a way to detect when this setting is disabled for my app? I've tried using AuthorizationCenter.shared.authorizationStatus, but that didn't do the trick. Does anyone have any ideas?
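A sketch of what I would try next (assuming authorizationStatus is published by AuthorizationCenter, which is an ObservableObject; note this only fires while the app is running, so a re-check on foregrounding would still be needed):

import FamilyControls
import Combine

final class ScreenTimeGate: ObservableObject {
    @Published var isAuthorized = false
    private var cancellable: AnyCancellable?

    init() {
        cancellable = AuthorizationCenter.shared.$authorizationStatus
            .receive(on: DispatchQueue.main)
            .sink { [weak self] status in
                // Flips to false if the user revokes Screen Time access.
                self?.isAuthorized = (status == .approved)
            }
    }
}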
Hi,
I have an existing AppKit-based Mac app that I have been working on for a few years. For a new feature, I wanted to have the app opened by a different app, so I set up the URL scheme under CFBundleURLTypes in my Info.plist and adopted this delegate callback:
- (void)application:(NSApplication *)application openURLs:(nonnull NSArray<NSURL *> *)urls
Now when I invoke the URL from the second app, it opens my app correctly, BUT this delegate method isn't called. What's interesting is that if I make a totally new app with a URL scheme and adopt this delegate method, it gets called without a problem!
So what about my original project could be responsible for this openURLs: method not being called? I've been searching for a solution for a couple of days without any luck. The macOS app's target has a deployment target of 10.15, and I'm running this on macOS 12.0 with Xcode 13.
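One avenue worth checking (hedged): if anything in the project, including a dependency, registers its own kAEGetURL Apple event handler, AppKit will not route the event to application:openURLs:. Registering the legacy handler yourself, early in launch, both tests that theory and works as a fallback. A Swift sketch (AppDelegate is an illustrative name):

import Cocoa

extension AppDelegate {
    func applicationWillFinishLaunching(_ notification: Notification) {
        // Must be installed before launching finishes, or the first URL is missed.
        NSAppleEventManager.shared().setEventHandler(
            self,
            andSelector: #selector(handleGetURL(event:replyEvent:)),
            forEventClass: AEEventClass(kInternetEventClass),
            andEventID: AEEventID(kAEGetURL)
        )
    }

    @objc func handleGetURL(event: NSAppleEventDescriptor, replyEvent: NSAppleEventDescriptor) {
        guard let urlString = event.paramDescriptor(forKeyword: AEKeyword(keyDirectObject))?.stringValue,
              let url = URL(string: urlString) else { return }
        print("Received URL: \(url)")
    }
}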
I've encountered an issue where using @Observable in SwiftUI causes extra initializations and deinitializations when a reference type is included as a property inside a struct. Specifically, when I include a reference type (a simple class Empty {}) inside a struct (Test), DetailsViewModel is initialized and deinitialized twice instead of once. If I remove the reference type, the behavior is correct.
This issue does not occur when using @StateObject instead of @Observable. Additionally, I've submitted a feedback report: FB16631081.
Steps to Reproduce
Run the provided SwiftUI sample code (tested on iOS 18.2 & iOS 18.3 using Xcode 16.2).
Observe the console logs when navigating to DetailsView.
Comment out var empty = Empty() in the Test struct.
Run again and compare console logs.
Change @Observable in DetailsViewModel to @StateObject and observe that the issue no longer occurs.
Expected Behavior
The DetailsViewModel should initialize once and deinitialize once, regardless of whether Test contains a reference type.
Actual Behavior
With var empty = Empty() present, DetailsViewModel initializes and deinitializes twice. However, if the reference type is removed, or when using @StateObject, the behavior is correct (one initialization, one deinitialization).
Code Sample
import SwiftUI

enum Route {
    case details
}

@MainActor
@Observable
final class NavigationManager {
    var path = NavigationPath()
}

struct ContentView: View {
    @State private var navigationManager = NavigationManager()

    var body: some View {
        NavigationStack(path: $navigationManager.path) {
            HomeView()
                .environment(navigationManager)
        }
    }
}

final class Empty { }

struct Test {
    var empty = Empty() // Comment this out to make it work
}

struct HomeView: View {
    private let test = Test()
    @Environment(NavigationManager.self) private var navigationManager

    var body: some View {
        Form {
            Button("Go To Details View") {
                navigationManager.path.append(Route.details)
            }
        }
        .navigationTitle("Home View")
        .navigationDestination(for: Route.self) { route in
            switch route {
            case .details:
                DetailsView()
                    .environment(navigationManager)
            }
        }
    }
}

@MainActor
@Observable
final class DetailsViewModel {
    var fullScreenItem: Item?

    init() {
        print("DetailsViewModel Init")
    }

    deinit {
        print("DetailsViewModel Deinit")
    }
}

struct Item: Identifiable {
    let id = UUID()
    let value: Int
}

struct DetailsView: View {
    @State private var viewModel = DetailsViewModel()
    @Environment(NavigationManager.self) private var navigationManager

    var body: some View {
        ZStack {
            Color.green
            Button("Show Full Screen Cover") {
                viewModel.fullScreenItem = .init(value: 4)
            }
        }
        .navigationTitle("Details View")
        .fullScreenCover(item: $viewModel.fullScreenItem) { item in
            NavigationStack {
                FullScreenView(item: item)
                    .navigationTitle("Full Screen Item: \(item.value)")
                    .toolbar {
                        ToolbarItem(placement: .cancellationAction) {
                            Button("Cancel") {
                                withAnimation(completionCriteria: .logicallyComplete) {
                                    viewModel.fullScreenItem = nil
                                } completion: {
                                    var transaction = Transaction()
                                    transaction.disablesAnimations = true
                                    withTransaction(transaction) {
                                        navigationManager.path.removeLast()
                                    }
                                }
                            }
                        }
                    }
            }
        }
    }
}

struct FullScreenView: View {
    @Environment(\.dismiss) var dismiss
    let item: Item

    var body: some View {
        ZStack {
            Color.red
            Text("Full Screen View \(item.value)")
                .navigationTitle("Full Screen View")
        }
    }
}
Console Output
With var empty = Empty() in Test
DetailsViewModel Init
DetailsViewModel Init
DetailsViewModel Deinit
DetailsViewModel Deinit
Without var empty = Empty() in Test
DetailsViewModel Init
DetailsViewModel Deinit
Using @StateObject Instead of @Observable
DetailsViewModel Init
DetailsViewModel Deinit
Additional Notes
This issue occurs only when using @Observable. Switching to @StateObject prevents it. This behavior suggests a possible issue with how SwiftUI handles reference-type properties inside structs when using @Observable.
Using a struct-only approach (removing Empty class) avoids the issue, but that’s not always a practical solution.
Questions for Discussion
Is this expected behavior with @Observable?
Could this be an unintended side effect of SwiftUI’s state management?
Are there any recommended workarounds apart from switching to @StateObject?
Would love to hear if anyone else has run into this or if Apple has provided any guidance!
SwiftUI Chart control/view: is there a way to align the legend to the center?
I only see two options, .leading or .trailing:
.chartLegend(position: .bottom, alignment: .leading, spacing: 10)
Code:
Chart {
    .....
    SectorMark..
}
.chartLegend(position: .bottom, alignment: .leading, spacing: 10)
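Possibly worth trying (hedged: the alignment parameter takes a standard Alignment value, so .center should be accepted even though only .leading and .trailing appear in the examples), replacing the modifier above with:

.chartLegend(position: .bottom, alignment: .center, spacing: 10)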
Issue: The characters in the SearchController search bar overlap when it gains focus.
Steps:
1. Set the system language to Arabic.
2. Launch the tvOS app.
3. Set the app language to English within the tvOS app.
4. Focus on the SearchController search bar.
5. Focus on any of the characters; you will see the overlapping issue.
In our application we have two use cases for a hotkey/shortcut identification API/method.
1. We have some predefined shortcuts that will ship with our macOS application. They may need to change dynamically, based on what the user has already set as shortcuts/hotkeys, and also to avoid any important system-wide shortcuts that the user may or may not have changed.
2. We allow the user to customize the shortcuts/hotkeys in our application, so we want to show which shortcuts the user already has in use system-wide and across their OS experience.
This gives rise to the need for an API that tells us which shortcuts/hotkeys are currently in use by the user, as well as the current system-wide OS shortcuts.
Please let me know if there are any APIs in AppKit or SwiftUI we can use for the above.
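For the system-wide part, one hedged pointer: Carbon's CopySymbolicHotKeys() returns the system's symbolic hotkeys (key code, modifier mask, and an enabled flag); I know of no public API that also enumerates third-party apps' global hotkeys. A sketch:

import Carbon.HIToolbox

func systemSymbolicHotKeys() -> [[String: Any]] {
    var unmanaged: Unmanaged<CFArray>?
    guard CopySymbolicHotKeys(&unmanaged) == noErr,
          let hotKeys = unmanaged?.takeRetainedValue() as? [[String: Any]] else {
        return []
    }
    // Each dictionary describes one shortcut: its virtual key code, its
    // modifier mask, and whether the user currently has it enabled.
    return hotKeys
}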