The code below is a test that triggers UI updates 30 times per second. I'm trying to keep most of the work off the main thread and only push to main once I have the string (which is cached). Why is updating SwiftUI 30 times per second so expensive? This code causes 10% CPU on my M4 Mac, but comment out the following line:
Text(model.timeString)
and it's 0% CPU. The reason I think I have too much work on main is what I saw in Instruments, but I'm no Instruments expert.
import SwiftUI
import UniformTypeIdentifiers
@main
struct RapidUIUpdateTestApp: App {
var body: some Scene {
DocumentGroup(newDocument: RapidUIUpdateTestDocument()) { file in
ContentView(document: file.$document)
}
}
}
struct ContentView: View {
@Binding var document: RapidUIUpdateTestDocument
@State private var model = PlayerModel()
var body: some View {
VStack(spacing: 16) {
Text(model.timeString) // only this changes
.font(.system(size: 44, weight: .semibold, design: .monospaced))
.transaction { $0.animation = nil } // no implicit animations
HStack {
Button(model.running ? "Pause" : "Play") {
model.running ? model.pause() : model.start()
}
Button("Reset") { model.seek(0) }
Stepper("FPS: \(Int(model.fps))", value: $model.fps, in: 10...120, step: 1)
.onChange(of: model.fps) { _, _ in model.applyFPS() }
}
}
.padding()
.onAppear { model.start() }
.onDisappear { model.stop() }
}
}
@Observable
final class PlayerModel {
// Publish ONE value to minimize invalidations
var timeString: String = "0.000 s"
var fps: Double = 30
var running = false
private var formatter: NumberFormatter = {
let f = NumberFormatter()
f.minimumFractionDigits = 3
f.maximumFractionDigits = 3
return f
}()
@ObservationIgnored private let q = DispatchQueue(label: "tc.timer", qos: .userInteractive)
@ObservationIgnored private var timer: DispatchSourceTimer?
@ObservationIgnored private var startHost: UInt64 = 0
@ObservationIgnored private var pausedAt: Double = 0
@ObservationIgnored private var lastFrame: Int = -1
// cache timebase once
private static let secsPerTick: Double = {
var info = mach_timebase_info_data_t()
mach_timebase_info(&info)
return Double(info.numer) / Double(info.denom) / 1_000_000_000.0
}()
func start() {
guard timer == nil else { running = true; return }
let desiredUIFPS: Double = 30 // or 60, 24, etc.
let periodNs = UInt64(1_000_000_000 / desiredUIFPS)
running = true
startHost = mach_absolute_time()
let t = DispatchSource.makeTimerSource(queue: q)
// ~30 fps, with leeway to let the kernel coalesce wakeups
t.schedule(
deadline: .now(),
repeating: .nanoseconds(Int(periodNs)), // 33_333_333 ns ≈ 30 fps
leeway: .milliseconds(30) // allow coalescing
)
t.setEventHandler { [weak self] in self?.tick() }
timer = t
t.resume()
}
func pause() {
guard running else { return }
pausedAt = now()
running = false
}
func stop() {
timer?.cancel()
timer = nil
running = false
pausedAt = 0
lastFrame = -1
}
func seek(_ seconds: Double) {
pausedAt = max(0, seconds)
startHost = mach_absolute_time()
lastFrame = -1 // force next UI update
}
func applyFPS() { lastFrame = -1 } // next tick will refresh string
// MARK: - Tick on background queue
private func tick() {
let s = now()
let str = formatter.string(from: s as NSNumber) ?? String(format: "%.3f", s)
let display = "\(str) s"
DispatchQueue.main.async { [weak self] in
self?.timeString = display
}
}
private func now() -> Double {
guard running else { return pausedAt }
let delta = mach_absolute_time() &- startHost
return pausedAt + Double(delta) * Self.secsPerTick
}
}
nonisolated struct RapidUIUpdateTestDocument: FileDocument {
var text: String
init(text: String = "Hello, world!") {
self.text = text
}
static let readableContentTypes = [
UTType(importedAs: "com.example.plain-text")
]
init(configuration: ReadConfiguration) throws {
guard let data = configuration.file.regularFileContents,
let string = String(data: data, encoding: .utf8)
else {
throw CocoaError(.fileReadCorruptFile)
}
text = string
}
func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
let data = text.data(using: .utf8)!
return .init(regularFileWithContents: data)
}
}
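For comparison, here is a minimal sketch of an alternative that avoids the timer and the main-queue hop entirely by letting SwiftUI drive the redraw with TimelineView. This is not part of the test above; the 30 fps period and three-decimal formatting are carried over as assumptions, and it does not replicate the pause/seek logic.
import SwiftUI

// Sketch only: TimelineView re-evaluates its content closure on the given
// schedule, so no @Observable write or dispatch to the main queue is needed.
struct ElapsedTimeLabel: View {
    let start: Date   // hypothetical start time supplied by the caller

    var body: some View {
        TimelineView(.periodic(from: .now, by: 1.0 / 30.0)) { context in
            // context.date is the timestamp for the current tick of the schedule.
            let elapsed = context.date.timeIntervalSince(start)
            Text(String(format: "%.3f s", elapsed))
                .font(.system(size: 44, weight: .semibold, design: .monospaced))
        }
    }
}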
In our app there is a UIWindow makeKeyAndVisible crash; so far it has appeared once. The crash stack and details are attached in crash.txt.
In the RCWindowSceneManager class's makeWindowKeyAndVisible method, we check and set the window's windowScene and then call makeKeyAndVisible:
public func makeWindowKeyAndVisible(_ window: UIWindow?) {
    guard let window else {
        return
    }
    if let currentWindowScene {
        if window.windowScene == nil || window.windowScene != currentWindowScene {
            window.windowScene = currentWindowScene
        }
        window.makeKeyAndVisible()
    }
}
I also set a breakpoint in a normal, non-crashing flow and captured the stack at that point.
Why does it crash, and how can we avoid it? Thank you.
The watchOS app and view lifecycles for WatchKit and SwiftUI are documented at https://developer.apple.com/documentation/watchkit/working-with-the-watchos-app-life-cycle and https://developer.apple.com/documentation/swiftui/migrating-to-the-swiftui-life-cycle.
watchOS 26 appears to change the app and view lifecycle from the behavior in watchOS 11, and it no longer matches the documented lifecycles.
On watchOS 11, with a @WKApplicationDelegateAdaptor set, the following sequence of events occurs on app launch:
WKApplicationDelegate applicationDidFinishLaunching in WKApplicationState .inactive.
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .inactive.
View .onAppear @Environment(.scenePhase) .inactive
App onChange(of: @Environment(.scenePhase)): .active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: .@Environment(.scenePhase) .active
In watchOS 26, this is now:
WKApplicationDelegate applicationDidFinishLaunching in WKApplicationState .inactive.
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .inactive.
App onChange(of: @Environment(.scenePhase)): .active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
View .onAppear @Environment(.scenePhase) .active
When resuming from the background in watchOS 11:
App onChange(of: @Environment(.scenePhase)): inactive
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .background.
App onReceive(.willEnterForegroundNotification): WKApplicationState(rawValue: 2)
View .onChange of: .@Environment(.scenePhase) inactive
App onChange(of: @Environment(.scenePhase)): active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: .@Environment(.scenePhase) active
The resume-from-background process in watchOS 26 is baffling and seems like it must be a bug:
App onChange(of: @Environment(.scenePhase)): inactive
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .background.
App onReceive(.willEnterForegroundNotification): WKApplicationState(rawValue: 2)
App onChange(of: @Environment(.scenePhase)): active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) active
App onChange(of: @Environment(.scenePhase)): inactive
WKApplicationDelegate applicationWillResignActive in WKApplicationState .active.
App onReceive(.willResignActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) inactive
App onChange(of: @Environment(.scenePhase)): active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) active
The app becomes active, then inactive, then active again.
The issues with these changes are:
They are undocumented.
If you relied on the previous process, this change can break your app.
A view no longer receives the .onChange(of: scenePhase) .active state change during the launch process.
This bizarre applicationWillEnterForeground - applicationDidBecomeActive - applicationWillResignActive - applicationDidBecomeActive sequence on app resume does not match the documented process and is just... strange.
Is this new process intended? Is it a bug? Can an Apple engineer explain this new resume-from-background process, and why the View is created slightly later in the app launch process so that it does not receive the .onChange(of: scenePhase) message?
In contrast, the iOS 26 app lifecycle has not changed, and the iOS 18/26 app lifecycle closely follows the watchOS 11 app lifecycle (or watchOS 11 closely mimics iOS 18/26 with the exception that watchOS does not have a SceneDelegate).
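For context, here is a minimal sketch of the kind of logging harness that produces sequences like the ones above. The type names are made up, the notification observers from the logs are omitted, and the two-parameter onChange form assumes watchOS 10 or later.
import SwiftUI
import WatchKit

// Hypothetical delegate used only to log lifecycle callbacks and the current WKApplicationState.
final class LifecycleLogger: NSObject, WKApplicationDelegate {
    func applicationDidFinishLaunching() {
        print("applicationDidFinishLaunching in", WKApplication.shared().applicationState)
    }
    func applicationWillEnterForeground() {
        print("applicationWillEnterForeground in", WKApplication.shared().applicationState)
    }
    func applicationDidBecomeActive() {
        print("applicationDidBecomeActive in", WKApplication.shared().applicationState)
    }
    func applicationWillResignActive() {
        print("applicationWillResignActive in", WKApplication.shared().applicationState)
    }
}

@main
struct LifecycleTestApp: App {
    @WKApplicationDelegateAdaptor(LifecycleLogger.self) private var delegate
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        WindowGroup {
            Text("Lifecycle test")
                .onAppear { print("View .onAppear, scenePhase:", scenePhase) }
                .onChange(of: scenePhase) { _, newPhase in
                    print("View .onChange(of: scenePhase):", newPhase)
                }
        }
        .onChange(of: scenePhase) { _, newPhase in
            print("App onChange(of: scenePhase):", newPhase)
        }
    }
}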
Hi there! I'm making an app that stores the user's profile data in SwiftData. I was originally going to use UserDefaults, but I thought SwiftData could save Images natively. That turned out not to be true, so I could switch back to UserDefaults and save images as Data, but I'd like to try to get this working first. Essentially, I have text fields and I save their values through a class, allProfileData. Here's the code for that:
import SwiftData
import SwiftUI
@Model
class allProfileData {
var profileImageData: Data?
var email: String
var bio: String
var username: String
var profileImage: Image {
if let data = profileImageData,
let uiImage = UIImage(data: data) {
return Image(uiImage: uiImage)
} else {
return Image("DefaultProfile")
}
}
init(email:String, profileImageData: Data?, bio: String, username:String) {
self.profileImageData = profileImageData
self.email = email
self.bio = bio
self.username = username
}
}
To save this, I create a new instance (I think; I'm new to this) and save it through the ModelContext:
import SwiftUI
import SwiftData
struct CreateAccountView: View {
@Query var profiledata: [allProfileData]
@Environment(\.modelContext) private var modelContext
let newData = allProfileData(email: "", profileImageData: nil, bio: "", username: "")
var body: some View {
Button("Button") {
newData.email = email
modelContext.insert(newData)
try? modelContext.save()
print(newData.email)
}
}
}
To fetch the data, I originally thought that @Query would fetch it, but I saw that it fetches asynchronously, so I also tried fetching manually. Both approaches returned nothing:
import SwiftData
import SwiftUI
@Query var profiledata: [allProfileData]
@Environment(\.modelContext) private var modelContext
let fetchRequest = FetchDescriptor<allProfileData>()
let fetchedData = try? modelContext.fetch(fetchRequest)
print("Fetched count: \(fetchedData?.count ?? 0)")
if let imageData = profiledata.first?.profileImageData,
let uiImage = UIImage(data: imageData) {
profileImage = Image(uiImage: uiImage)
} else {
profileImage = Image("DefaultProfile")
}
No errors. Thanks in advance
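For reference, here is a minimal end-to-end sketch of the pattern being attempted (the view and property names are placeholders), assuming a single container is configured with .modelContainer(for:) so the insert and the @Query read from the same store:
import SwiftUI
import SwiftData

// Minimal sketch: one model, one container, insert + save in a button,
// and a @Query that reflects the saved data from the same container.
@main
struct ProfileSketchApp: App {
    var body: some Scene {
        WindowGroup {
            ProfileSketchView()
        }
        .modelContainer(for: allProfileData.self)   // single shared container
    }
}

struct ProfileSketchView: View {
    @Environment(\.modelContext) private var modelContext
    @Query private var profiledata: [allProfileData]
    @State private var email = ""

    var body: some View {
        VStack {
            TextField("Email", text: $email)

            Button("Save") {
                let newData = allProfileData(email: email, profileImageData: nil, bio: "", username: "")
                modelContext.insert(newData)
                try? modelContext.save()
            }

            // @Query updates this list automatically once the insert is saved.
            Text("Stored profiles: \(profiledata.count)")
            Text(profiledata.first?.email ?? "none")
        }
        .padding()
    }
}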
Hello,
I'm trying to use a TabView inside the Sidebar of a NavigationSplitView.
I want to use .listStyle(.sidebar) in order to get the Liquid Glass effect.
However, I cannot find a way to remove the background of the TabView without changing the behavior of the TabView itself to paging with .tabViewStyle(.page):
NavigationSplitView {
    TabView(
        selection: .constant("List")
    ) {
        Tab(value: "List") {
            List {
                Text("List")
            }
        }
    }
} detail: {
}
Note: I want to change the background of the TabView container itself, not the TabBar.
Hey there,
I have an app that allows picking any folder via UIDocumentPickerViewController. Up until iOS 18, users were able to pick folders from connected servers (servers connected in the Files app) as well.
On iOS 26, the picker allows browsing into the connected servers, but the Select button is greyed out and does nothing when tapped.
Is this a known issue? It breaks the whole premise of my file synchronization application.
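For reference, this is roughly how such a folder picker is presented; a simplified sketch, not the app's actual code, with placeholder names:
import UIKit
import UniformTypeIdentifiers

final class FolderPickingViewController: UIViewController, UIDocumentPickerDelegate {

    // Present a picker restricted to folders; the user taps Select on a directory.
    func pickFolder() {
        let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.folder])
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
        guard let folderURL = urls.first else { return }
        // Access the picked folder as a security-scoped resource.
        guard folderURL.startAccessingSecurityScopedResource() else { return }
        defer { folderURL.stopAccessingSecurityScopedResource() }
        // ... enumerate/sync the folder contents here ...
    }

    func documentPickerWasCancelled(_ controller: UIDocumentPickerViewController) {
        // No folder selected.
    }
}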
I'm looking for clarification on a SwiftUI performance point mentioned in the recent Optimize your app's speed and efficiency | Meet with Apple video.
(YouTube link not allowed, but the video is available on the Apple Developer channel.)
At the 1:48:50 mark, the presenter says:
Writing a value to the Environment doesn't only affect the views that read the key you're updating. It updates any view that reads from any Environment key. [abbreviated quote]
That statement seems like a big deal if your app relies heavily on Environment values.
Context
I'm building a macOS application with a traditional three-panel layout. At any given time, there are many views on screen, plus others that exist in the hierarchy but are currently hidden (for example, views inside tab views or collapsed splitters).
Nearly every major view reads something from the environment—often an @Observable object that acts as a service or provider.
However, there are a few relatively small values that are written to the environment frequently, such as:
The selected tab index
The currently selected object on a canvas
The Question
Based on the presenter's statement, I’m wondering:
Does writing any value to the environment really cause all views in the entire SwiftUI view hierarchy that read any environment key to have their body re-evaluated?
Do environment writes only affect child views, or do they propagate through the entire SwiftUI hierarchy?
Example:
View A
└─ View B
├─ View C
└─ View D
If View B updates an environment value, does that affect only C and D, or does it also trigger updates in A and B (assuming each view has at least one @Environment property)?
Possible Alternative
If all views are indeed invalidated by environment writes, would it be more efficient to “wrap” frequently-changing values inside an @Observable object instead of updating the environment directly?
// Pseudocode, using the type-based .environment(_:) API for @Observable objects
@Observable final class SelectedTab {
    var index: Int = 0
}

// Injected once near the root:
ContentView()
    .environment(selectedTab)

struct TabSelectorView: View {
    @Environment(SelectedTab.self) private var selectedTab
    var body: some View {
        Button("Action") {
            // Would this avoid invalidating all views using the environment?
            selectedTab.index = 1
        }
    }
}
Summary
From what I understand, it sounds like the environment should primarily be used for stable, long-lived objects—not for rapidly changing values—since writes might cause far more view invalidations than most developers realize.
Is that an accurate interpretation?
Follow-Up
In Xcode 26 / Instruments, is there a way to monitor writes to @Environment?
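As a side note, one informal way to see when a view's body is re-evaluated (not specifically environment writes) is SwiftUI's Self._printChanges() debugging helper; a minimal sketch with a placeholder view:
import SwiftUI

struct CanvasSelectionView: View {
    @Environment(\.colorScheme) private var colorScheme   // any environment read

    var body: some View {
        // Logs the view name and which dependency changed each time body runs.
        let _ = Self._printChanges()
        Text("colorScheme: \(String(describing: colorScheme))")
    }
}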
Since iPadOS 26.1 I have noticed a new, annoying bug when changing the system's dark mode setting. The appearance of the UI changes, but no longer for view controllers that are presented as a popover. For these view controllers, traitCollectionDidChange() is still called (though sometimes with a very large delay), but checking the view controller's traitCollection property there no longer returns the correct appearance (which is probably why the visual appearance of the popover doesn't change anymore). So if dark mode was just switched on, traitCollectionDidChange() is called, but the traitCollection.userInterfaceStyle property still tells me that the system is in light mode.
More concretely, traitCollection.userInterfaceStyle seems to be set correctly only(!) when opening the popover; while the popover is open, it is never updated when dark mode changes.
This is also visible in the standard apps on the iPad, like the Apple Maps app: just tap the "map" icon at the top right to open the "Map mode" view. While that view is open, change the dark mode setting. All of the Maps app changes its appearance, with the exception of this "Map mode" view.
Does anyone know an easy workaround? Or do I really need to manually change the colors of all popover view controllers whenever dark mode changes? Using dynamic UIColors won't help, because they rely on the userInterfaceStyle property, which is no longer correct.
Bugreport: FB20928471
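For reference, this is the kind of check that reports the stale value while the popover is open; a simplified sketch, not the app's actual code:
import UIKit

final class PopoverContentViewController: UIViewController {

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)

        // On iPadOS 26.1 this is still called when dark mode is toggled,
        // but userInterfaceStyle still reports the old appearance while
        // the view controller is presented as a popover.
        switch traitCollection.userInterfaceStyle {
        case .dark:
            print("popover sees dark mode")
        default:
            print("popover sees light mode")
        }
    }
}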
I've been playing around with the long press gesture in a scroll view and noticed that gesture(LongPressGesture()) doesn't seem to work well with the scroll view's scrolling, which doesn't seem to be the intended behavior to me.
Take the following example: the blue rectangle is modified with onLongPressGesture and the red rectangle is modified with LongPressGesture (_EndedGesture<LongPressGesture> to be specific).
ScrollView {
Rectangle()
.fill(.blue)
.frame(width: 200, height: 200)
.onLongPressGesture {
print("onLongPressGesture performed")
} onPressingChanged: { _ in
print("onLongPressGesture changed")
}
.overlay {
Text("onLongPressGesture")
}
Rectangle()
.fill(.red)
.frame(width: 200, height: 200)
.gesture(LongPressGesture()
.onEnded { _ in
print("gesture ended")
})
.overlay {
Text("gesture(LongPressGesture)")
}
}
If you start scrolling from either of the rectangles (that is, start scrolling with your finger on either of the rectangles), the ScrollView will scroll.
However, if LongPressGesture is modified with either onChanged or updating, the ScrollView won't respond to scrolling if the scroll starts from the red rectangle. Even setting maximumDistance to 0 doesn't help. As for its counterpart onLongPressGesture, even after adding onPressingChanged to it, scrolling still works when started from the onLongPressGesture-modified view.
ScrollView {
Rectangle()
.fill(.blue)
.frame(width: 200, height: 200)
.onLongPressGesture {
print("onLongPressGesture performed")
} onPressingChanged: { _ in
print("onLongPressGesture changed")
}
.overlay {
Text("onLongPressGesture")
}
Rectangle()
.fill(.red)
.frame(width: 200, height: 200)
.gesture(LongPressGesture(maximumDistance: 0)
// scroll from the red rectangle won't work if I add either `updating` or `onChanged` but I put both here just to demonstrate
// you will need to add `@GestureState private var isPressing = false` to your view body
.updating($isPressing) { value, state, transaction in
state = value
print("gesture updating")
}
.onChanged { value in
print("gesture changed")
}
.onEnded { _ in
print("gesture ended")
})
.overlay {
Text("gesture(LongPressGesture)")
}
}
This doesn't seem right to me. I would expect a view modified with LongPressGesture(), regardless of whether the gesture has onChanged or updating, to still be able to start a scroll in a scroll view, just like onLongPressGesture.
I observed this behavior in a physical device running iOS 26.1, and I do not know the behavior on other versions.
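For comparison, one variation worth testing is attaching the long press via simultaneousGesture, which asks SwiftUI to recognize it alongside other gestures (including the scroll view's drag) instead of competing with them. This is a sketch and not part of the example above; I have not verified it on iOS 26.1.
import SwiftUI

struct SimultaneousLongPressDemo: View {
    var body: some View {
        ScrollView {
            Rectangle()
                .fill(.green)
                .frame(width: 200, height: 200)
                // simultaneousGesture recognizes the long press together with
                // other gestures, including the scroll view's drag, instead of
                // competing with them the way .gesture(...) does.
                .simultaneousGesture(
                    LongPressGesture()
                        .onChanged { _ in print("simultaneous long press changed") }
                        .onEnded { _ in print("simultaneous long press ended") }
                )
                .overlay {
                    Text("simultaneousGesture(LongPressGesture)")
                }
        }
    }
}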
In one of my apps, I am using .glassEffect(_:in:) to add a glass effect to various UI elements. The app always crashes when a UI element with the glassEffect(_:in:) modifier is being rendered. This only happens on a device running the iOS 26 public beta. I know this for certain because I connected that device to Xcode and ran the app on it. When I comment out the glassEffect modifier, the app doesn't crash.
Is it possible to check for particular releases with #available? If not, how should something like this be handled? Also, how do I handle such OS-level errors without the app crashing? Thanks.
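As far as I know, #available only expresses a minimum version (for example, if #available(iOS 26.1, *) means 26.1 or later), so it cannot target a single release exclusively. A hedged sketch of an exact-version check using ProcessInfo, with placeholder version numbers:
import Foundation

// Sketch: #available can only express "this version or later",
// e.g. `if #available(iOS 26.1, *) { ... }`.
// For an exact-version check, ProcessInfo reports the running OS version.
let version = ProcessInfo.processInfo.operatingSystemVersion

// Placeholder check: true only on 26.0.x releases.
let isIOS26Point0 = (version.majorVersion == 26 && version.minorVersion == 0)

if isIOS26Point0 {
    // e.g. skip the glass effect on the affected release
    print("Running on 26.0.\(version.patchVersion)")
}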
The same code that I have runs fine on iOS 26.0, but on iOS 26.1 there is a delay before the Button with role .confirm is shown properly and tinted.
Shown in the screen recording here ->
https://imgur.com/a/uALuW50
This is my code, which shows a slightly different button on iOS 18 vs iOS 26.
var body: some View {
    if #available(iOS 26.0, *) {
        Button("Save", systemImage: "checkmark", role: .confirm) {
            action()
        }.labelStyle(.iconOnly)
    } else {
        Button("Save") {
            action()
        }
    }
}
I’m trying to figure out how to extend PaperKit beyond a single fixed-size canvas.
From what I understand, calling PaperMarkup(bounds:) creates one finite drawing region, and so far I have not figured out a reliable way to create multi-page or infinite canvases.
Are any of these correct?
Creating multiple PaperMarkup instances, each managed by its own PaperMarkupViewController, and arranging them in a ScrollView or similar paged container to represent multiple pages?
Overlaying multiple PaperMarkup instances on top of PDFKit pages for paged annotation workflows?
Or possibly another approach that works better with PaperKit’s design?
I mean, it has to be possible, right? Apple's native Preview app almost certainly uses it, and there are so many other notes apps that get this behavior working correctly, even if it requires using a legacy framework other than PaperKit.
Curious if others have been able to find the right pattern for going beyond a single canvas.
Such a simple piece of code:
import SwiftUI
import WebKit
struct ContentView: View {
var body: some View {
WebView(url: URL(string: "https://www.apple.com"))
}
}
When I run this, the web content shows under the top notch’s safe area, and buttons inside that region aren’t tappable. I tried a bunch of things and the only “fix” that seems to work is .padding(.top, 1), but that leaves a noticeable white strip in non-portrait orientations.
What’s the proper way to solve this? Safari handles the safe area correctly and doesn’t render content there.
Hi there! I'm having an issue with my main window: there is a big space at the top without any logical explanation (at least for my limited knowledge).
Using the code below, I'm getting this window layout:
Does anybody have any guidance on how to get rid of that extra space at the top?
Thanks a lot!
import SwiftUI
import SwiftData
#if os(macOS)
import AppKit
#endif
// Helper to access and control NSWindow for size/position persistence
#if os(macOS)
struct WindowAccessor: NSViewRepresentable {
let onWindow: (NSWindow) -> Void
func makeNSView(context: Context) -> NSView {
let view = NSView()
DispatchQueue.main.async {
if let window = view.window {
onWindow(window)
}
}
return view
}
func updateNSView(_ nsView: NSView, context: Context) {
DispatchQueue.main.async {
if let window = nsView.window {
onWindow(window)
}
}
}
}
#endif
@main
struct KaraoPartyApp: App {
@StateObject private var songsModel = SongsModel()
@Environment(\.openWindow) private var openWindow
var body: some Scene {
Group {
WindowGroup {
#if os(macOS)
WindowAccessor { window in
window.minSize = NSSize(width: 900, height: 700)
// Configure window to eliminate title bar space
window.titleVisibility = .hidden
window.titlebarAppearsTransparent = true
window.styleMask.insert(.fullSizeContentView)
}
#endif
ContentView()
.environmentObject(songsModel)
}
.windowToolbarStyle(.unifiedCompact)
.windowResizability(.contentSize)
.defaultSize(width: 1200, height: 900)
.windowStyle(.titleBar)
#if os(macOS)
.windowToolbarStyle(.unified)
#endif
WindowGroup("CDG Viewer", id: "cdg-viewer", for: CDGWindowParams.self) { $params in
if let params = params {
ZStack {
#if os(macOS)
WindowAccessor { window in
window.minSize = NSSize(width: 600, height: 400)
// Restore window frame if available
let key = "cdgWindowFrame"
let defaults = UserDefaults.standard
if let frameString = defaults.string(forKey: key) {
let frame = NSRectFromString(frameString)
if window.frame != frame {
window.setFrame(frame, display: true)
}
} else {
// Open CDG window offset from main window
if let mainWindow = NSApp.windows.first {
let mainFrame = mainWindow.frame
let offsetFrame = NSRect(x: mainFrame.origin.x + 60, y: mainFrame.origin.y - 60, width: 800, height: 600)
window.setFrame(offsetFrame, display: true)
}
}
// Observe frame changes and save
NotificationCenter.default.addObserver(forName: NSWindow.didMoveNotification, object: window, queue: .main) { _ in
let frameStr = NSStringFromRect(window.frame)
defaults.set(frameStr, forKey: key)
}
NotificationCenter.default.addObserver(forName: NSWindow.didEndLiveResizeNotification, object: window, queue: .main) { _ in
let frameStr = NSStringFromRect(window.frame)
defaults.set(frameStr, forKey: key)
}
}
#endif
CDGView(
cancion: Cancion(
title: params.title ?? "",
artist: params.artist ?? "",
album: "",
genre: "",
year: "",
bpm: "",
playCount: 0,
folderPath: params.cdgURL.deletingLastPathComponent().path,
trackName: params.cdgURL.deletingPathExtension().lastPathComponent + ".mp3"
),
backgroundType: params.backgroundType,
videoURL: params.videoURL,
cdfContent: params.cdfContent.flatMap { String(data: $0, encoding: .utf8) },
artist: params.artist,
title: params.title
)
}
} else {
Text("No se pudo abrir el archivo CDG.")
}
}
.windowResizability(.contentSize)
.defaultSize(width: 800, height: 600)
WindowGroup("Metadata Editor", id: "metadata-editor") {
MetadataEditorView()
.environmentObject(songsModel)
}
.windowResizability(.contentSize)
.defaultSize(width: 400, height: 400)
WindowGroup("Canciones DB", id: "canciones-db") {
CancionesDBView()
}
.windowResizability(.contentSize)
.defaultSize(width: 800, height: 500)
WindowGroup("Importar canciones desde carpeta", id: "folder-song-importer") {
FolderSongImporterView()
}
.windowResizability(.contentSize)
.defaultSize(width: 500, height: 350)
}
.modelContainer(for: Cancion.self)
// Add menu command under Edit
.commands {
CommandGroup(replacing: .pasteboard) { }
CommandMenu("Edit") {
Button("Actualizar Metadatos") {
openWindow(id: "metadata-editor")
}
.keyboardShortcut(",", modifiers: [.command, .shift])
}
CommandMenu("Base de Datos") {
Button("Ver Base de Datos de Canciones") {
openWindow(id: "canciones-db")
}
.keyboardShortcut("D", modifiers: [.command, .shift])
}
}
}
init() {
print("\n==============================")
print("[KaraoParty] Nueva ejecución iniciada: \(Date())")
print("==============================\n")
}
}
Hi everyone,
I’m building a full-screen Map (MapKit + SwiftUI) with persistent top/bottom chrome (menu buttons on top, session stats + map controls on bottom). I have three working implementations and I’d like guidance on which pattern Apple recommends long-term (gesture correctness, safe areas, Dynamic Island/home indicator, and future compatibility).
Version 1 — overlay(alignment:) on Map
Idea: Draw chrome using .overlay(alignment:) directly on the map and manage padding manually.
Map(position: $viewModel.previewMapCameraPosition, scope: mapScope) {
UserAnnotation {
UserLocationCourseMarkerView(angle: viewModel.userCourse - mapHeading)
}
}
.mapStyle(viewModel.mapType.mapStyle)
.mapControls {
MapUserLocationButton().mapControlVisibility(.hidden)
MapCompass().mapControlVisibility(.hidden)
MapPitchToggle().mapControlVisibility(.hidden)
MapScaleView().mapControlVisibility(.hidden)
}
.overlay(alignment: .top) { mapMenu } // manual padding inside
.overlay(alignment: .bottom) { bottomChrome } // manual padding inside
Version 2 — ZStack + .safeAreaPadding
Idea: Place the map at the back, then lay out top/bottom chrome in a VStack inside a ZStack, and use .safeAreaPadding(.all) so content respects safe areas.
ZStack(alignment: .top) {
Map(...).ignoresSafeArea()
VStack {
mapMenu
Spacer()
bottomChrome
}
.safeAreaPadding(.all)
}
Version 3 — .safeAreaInset on the Map
Idea: Make the map full-bleed and then reserve top/bottom space with safeAreaInset, letting SwiftUI manage insets
Map(...).ignoresSafeArea()
.mapStyle(viewModel.mapType.mapStyle)
.mapControls {
MapUserLocationButton().mapControlVisibility(.hidden)
MapCompass().mapControlVisibility(.hidden)
MapPitchToggle().mapControlVisibility(.hidden)
MapScaleView().mapControlVisibility(.hidden)
}
.safeAreaInset(edge: .top) { mapMenu } // manual padding inside
.safeAreaInset(edge: .bottom) { bottomChrome } // manual padding inside
Question
I noticed:
Safe-area / padding behavior
– Version 2 requires the least extra padding and seems to create a small but partial safe-area spacing automatically.
– Version 3 still needs roughly the same manual padding as Version 1, even though it uses safeAreaInset. Why doesn’t safeAreaInset fully handle that spacing?
Rotation crash (Metal)
When using Version 3 (safeAreaInset + ignoresSafeArea), rotating the device portrait↔landscape several times triggers a
Metal crash:
failed assertion 'The following Metal object is being destroyed while still required… CAMetalLayer Display Drawable'
The same crash can happen with Version 1, though less often. I haven’t tested it much with Version 2.
Is this a known issue or race condition between Map’s internal Metal rendering and view layout changes?
Expected behavior
What’s the intended or supported interaction between safeAreaInset, safeAreaPadding, and overlay when embedding persistent chrome inside a SwiftUI Map?
Should safeAreaInset normally remove the need for manual padding, or is that by design?
Starting with iOS 18, UITabBarController no longer updates tab bar item titles when localized strings are changed or reassigned at runtime.
This behavior worked correctly in iOS 17 and earlier, but in iOS 18 the tab bar titles remain unchanged until the app restarts or the view controller hierarchy is reset. This regression appears to be caused by internal UITabBarController optimizations introduced in iOS 18.
Steps to Reproduce
Create a UITabBarController with two or more tabs, each having a UITabBarItem with a title.
Localize the tab titles using NSLocalizedString():
tabBar.items?[0].title = NSLocalizedString("home_tab", comment: "")
tabBar.items?[1].title = NSLocalizedString("settings_tab", comment: "")
Run the app.
Change the app’s language at runtime (without restarting), or manually reassign the localized titles again:
tabBar.items?[0].title = NSLocalizedString("home_tab", comment: "")
tabBar.items?[1].title = NSLocalizedString("settings_tab", comment: "")
Observe that the tab bar titles do not update visually.
Hi all,
when I launch my macOS app from Xcode 16 on ARM64, AppKit logs this error in the debug console:
It's not legal to call -layoutSubtreeIfNeeded on a view which is already being laid out. If you are implementing the view's -layout method, you can call -[super layout] instead. Break on _NSDetectedLayoutRecursion(void) to debug. This will be logged only once. This may break in the future.
Breaking on _NSDetectedLayoutRecursion doesn't help much; it just shows assembly from a call to a subclassed window method that looks like this:
-(void) setFrame:(NSRect)frameRect display:(BOOL)flag {
    if (!_frameLocked) [super setFrame:frameRect display:flag];
}
I have no direct call to -layoutSubtreeIfNeeded from a
-layout implementation in my code. I do have a few calls to this method from update methods; however, even if I comment all of them out, the error is still logged...
Finally, apart from that log, I cannot observe any layout problem when running the program. So I wonder: can this error be safely ignored?
Thanks!
This problem occurs only when using the Japanese keyboard.
・Behavior in iOS 26
After typing "あい" into a text field and pressing Enter, insert "う" between "あ" and "い".
As a result, the displayed text becomes "あううい".
Text field: "あい" → "あううい"
The range values at this point:
range.location = 2
range.length = 2
・Behavior in iOS versions before 26
After typing ""あい"" into a text field and pressing Enter, insert ""う"" between ""あ"" and ""い"".
As a result, the displayed text becomes ""あうい"".
Text field: ""あい"" → ""あうい""
The range values at this point:
range.location = 1
range.length = 1
The following source code can be used to reproduce the issue.
mainApp (displaying a UIKit ViewController from SwiftUI)
import SwiftUI
import UIKit
struct ViewControllerWrapper: UIViewControllerRepresentable {
func makeUIViewController(context: Context) -> ViewController {
return ViewController()
}
func updateUIViewController(_ uiViewController: ViewController, context: Context) {}
}
@main
struct konnitihaApp: App {
var body: some Scene {
WindowGroup {
ViewControllerWrapper() // Display the UIKit ViewController
}
}
}
ViewController
import UIKit
class ViewController: UIViewController, UITextFieldDelegate {
let textField = UITextField()
let displayLabel = UILabel()
// Additional properties used for tracking
var replacementText: String = ""
var textInsertLocation: Int = 0
var textSelectedLength: Int = 0
override func viewDidLoad() {
super.viewDidLoad()
view.backgroundColor = .white
// UITextField setup
textField.borderStyle = .roundedRect
textField.placeholder = ""
textField.delegate = self
textField.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(textField)
// UILabel setup
displayLabel.text = ""
displayLabel.layer.borderWidth = 2
displayLabel.layer.borderColor = UIColor.blue.cgColor
displayLabel.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(displayLabel)
// Layout
NSLayoutConstraint.activate([
textField.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 40),
textField.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 20),
textField.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -20),
textField.heightAnchor.constraint(equalToConstant: 40),
//label
displayLabel.topAnchor.constraint(equalTo: textField.bottomAnchor, constant: 500),
displayLabel.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 20),
displayLabel.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -20),
displayLabel.heightAnchor.constraint(equalToConstant: 40),
])
}
// UITextFieldDelegate
func textField(_ textField: UITextField, shouldChangeCharactersIn range: NSRange, replacementString string: String) -> Bool {
self.replacementText = string
self.textInsertLocation = range.location
self.textSelectedLength = range.length
print("rangeは",range)
// ラベルに最新の文字列を反映
if let text = textField.text, let textRange = Range(range, in: text) {
let updatedText = text.replacingCharacters(in: textRange, with: string)
if textField == self.textField {
displayLabel.text = updatedText
}
}
return true
}
}
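For comparison, a common technique for Japanese (and other multistage) input is to check the text field's marked text before acting on the change. A minimal sketch with a hypothetical standalone delegate, not part of the repro code above:
import UIKit

final class MarkedTextAwareDelegate: NSObject, UITextFieldDelegate {
    func textField(_ textField: UITextField, shouldChangeCharactersIn range: NSRange, replacementString string: String) -> Bool {
        // While the Japanese keyboard is composing (marked text is present),
        // `range` refers to the in-progress composition, so defer any
        // bookkeeping until the text is committed.
        if textField.markedTextRange != nil {
            return true
        }
        if let text = textField.text, let textRange = Range(range, in: text) {
            let updatedText = text.replacingCharacters(in: textRange, with: string)
            print("committed text will become:", updatedText)
        }
        return true
    }
}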
In SwiftUI, how can I change the text color when using Menu? I need each item in pippo to be .red, as you can see in the code, but .foregroundColor(.red) does not work.
Thanks
var pippo = ["AAA","BBB","CCC", "DDD"]
var body: some View {
Menu {
ForEach(pippo, id: \.self) { item in
Button {
//categoriaSelezionata = categoria
} label: {
Text(item)
.foregroundColor(.red)
}
}
} label: {
HStack {
Text("Select Theme")
.multilineTextAlignment(.center)
Image(systemName: "chevron.down")
.foregroundColor(.gray)
}
.frame(maxWidth: .infinity)
.padding(6)
}
Menu {
ForEach(pippo, id: \.self) { item in
Button(item) {
// action
}
.foregroundColor(.red)
}
} label: {
HStack {
Text("Select Theme")
Image(systemName: "chevron.down")
.foregroundColor(.gray)
}
.frame(maxWidth: .infinity)
.padding(6)
}
.menuStyle(BorderlessButtonMenuStyle()) // Try different styles
.menuIndicator(.hidden) // Hide the default indicator
}
NSVisualEffectView in AppKit has two main properties: material and blendingMode.
Material is well supported in SwiftUI, but I can't seem to find an equivalent for blendingMode.
What is the SwiftUI equivalent of NSVisualEffectView.BlendingMode?
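As far as I know there is no direct SwiftUI equivalent for blendingMode; the usual fallback is to wrap NSVisualEffectView in an NSViewRepresentable so both properties can be set explicitly. A minimal sketch, with a made-up type name:
import SwiftUI
import AppKit

// Minimal wrapper exposing both material and blendingMode to SwiftUI.
struct VisualEffectBackground: NSViewRepresentable {
    var material: NSVisualEffectView.Material = .sidebar
    var blendingMode: NSVisualEffectView.BlendingMode = .behindWindow

    func makeNSView(context: Context) -> NSVisualEffectView {
        let view = NSVisualEffectView()
        view.material = material
        view.blendingMode = blendingMode
        view.state = .active
        return view
    }

    func updateNSView(_ nsView: NSVisualEffectView, context: Context) {
        nsView.material = material
        nsView.blendingMode = blendingMode
    }
}

// Usage example:
// Text("Hello").padding().background(VisualEffectBackground(blendingMode: .withinWindow))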