I'm developing a grid of focusable elements with different sizes in SwiftUI for tvOS (similar to a TV channel grid).
Because the Focus Engine calculates the next view to focus based on the center of the currently focused view, sometimes it changes focus to an unexpected view. Here's an example:
Actual:
Expected:
Is it possible to customize the anchor point from which the focus engine traces a ray to the next view? I would prefer the leading edge in my case.
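I haven't found a direct way to change the anchor, but one workaround I'm experimenting with is grouping each row with focusSection() so the Focus Engine considers the row as a whole before picking a target inside it. A minimal, untested sketch, assuming a hypothetical Channel model (all names here are illustrative):

import SwiftUI

// Hypothetical channel model, only for illustration.
struct Channel: Identifiable {
    let id = UUID()
    let name: String
    let width: CGFloat
}

struct ChannelRow: View {
    let channels: [Channel]

    var body: some View {
        HStack(spacing: 16) {
            ForEach(channels) { channel in
                Button(channel.name) { }
                    .frame(width: channel.width, height: 80)
            }
        }
        // Marking each row as a focus section lets the row's whole frame
        // guide focus movement, which reduces (but does not fully control)
        // the unexpected jumps between items of different widths.
        .focusSection()
    }
}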
How do I save the state of my app when I open another app, so that it can be restored when I reopen it?
My app uses over 10 MB of memory, so when I open another app (mine goes to the background) it gets terminated entirely.
When I reopen it, it restarts from scratch.
That is not what I want: if I open Page A and the app then goes to the background, reopening it should bring back Page A instead of restarting.
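For reference, here is a minimal sketch of the @SceneStorage approach I'm considering, assuming the pages can be modeled with a simple enum (all names are illustrative):

import SwiftUI

enum Page: String {
    case home, pageA, pageB
}

struct RootView: View {
    // SceneStorage survives the process being terminated while in the
    // background and is restored when the scene is recreated.
    @SceneStorage("currentPage") private var currentPageRaw = Page.home.rawValue

    var body: some View {
        switch Page(rawValue: currentPageRaw) ?? .home {
        case .home:
            Button("Open Page A") { currentPageRaw = Page.pageA.rawValue }
        case .pageA:
            Text("Page A")
        case .pageB:
            Text("Page B")
        }
    }
}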
I have a view that lets the user position and size a bunch of subviews. I use .frame() and .position() to accomplish this.
Right now, if the user resizes the window, the views stay where they are, anchored to the top-left corner. What I'd like is for the views to scale as a whole with the window, maintaining their relative positions and their aspect ratios.
I can apply .scaleEffect(_:anchor:) to the containing view, and it scales them the way I want, but I'm not sure how to tie it to the window.
My first thought is to use a GeometryReader, but I don't really know what the "original" size would have been in order to compute a scale factor.
How else might I accomplish this?
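Here is a rough sketch of the GeometryReader idea, assuming I pick an arbitrary reference ("design") size that the positions were authored against; the scale factor is just the current size divided by that reference (the 800×600 default is an assumption):

import SwiftUI

struct ScalingCanvas<Content: View>: View {
    // The size the subview positions/sizes were originally authored for.
    let designSize: CGSize
    private let content: Content

    init(designSize: CGSize = CGSize(width: 800, height: 600),
         @ViewBuilder content: () -> Content) {
        self.designSize = designSize
        self.content = content()
    }

    var body: some View {
        GeometryReader { proxy in
            // A uniform scale preserves aspect ratios and relative positions.
            let scale = min(proxy.size.width / designSize.width,
                            proxy.size.height / designSize.height)
            content
                .frame(width: designSize.width, height: designSize.height)
                .scaleEffect(scale, anchor: .topLeading)
        }
    }
}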
Topic: UI Frameworks
SubTopic: SwiftUI
I have a Form with a custom TextField which uses a custom Text().
When I use .alignmentGuide on the Text(), it seems the origin reference point varies with the length of, but not by the length of, the TextField label String. This is a problem inside a Form. My workaround has been to not use a TextField label, but to enclose each TextField in a LabeledContent; then I can set the width of the label and align off of that.
How does Form cause TextField to set its width, and why, when using .alignmentGuide on Text(), does the TextField label length even matter?
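For reference, this is roughly what the LabeledContent workaround looks like (the field names and the 120-point label width are placeholders):

import SwiftUI

struct SettingsForm: View {
    @State private var name = ""
    @State private var address = ""

    var body: some View {
        Form {
            // Instead of giving each TextField its own label, wrap it in
            // LabeledContent so the label width can be pinned explicitly.
            LabeledContent {
                TextField("", text: $name, prompt: Text("Required"))
            } label: {
                Text("Name").frame(width: 120, alignment: .leading)
            }
            LabeledContent {
                TextField("", text: $address, prompt: Text("Optional"))
            } label: {
                Text("Address").frame(width: 120, alignment: .leading)
            }
        }
    }
}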
Topic:
UI Frameworks
SubTopic:
SwiftUI
I want to make my buttons inline with each other but spaced apart, a bit like the Apple top bar, but in SwiftUI.
My code:
struct NormalPageView: View {
    var body: some View {
        VStack {
            NavigationView {
                Form {
                    Section {
                        Image(systemName: "house.fill")
                        Spacer()
                        Image(systemName: "plus")
                        Spacer()
                        Image(systemName: "gearshape.fill")
                    }
                }
            }
        }
    }
}
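An untested sketch of the direction I'm considering: put all three icons in one HStack so they share a single Form row, with the Spacers pushing them apart.

import SwiftUI

struct InlineIconsView: View {
    var body: some View {
        NavigationStack {
            Form {
                Section {
                    // One HStack keeps the icons on a single row; the
                    // Spacers space them apart like a toolbar.
                    HStack {
                        Image(systemName: "house.fill")
                        Spacer()
                        Image(systemName: "plus")
                        Spacer()
                        Image(systemName: "gearshape.fill")
                    }
                }
            }
        }
    }
}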
I have an app whose body is a SwiftUI Scene:
var body: some Scene {
    WindowGroup {
        RootView(root: appDelegate.root)
            .edgesIgnoringSafeArea(.all)
            .onOpenURL { url in
                let stringUrl = url.absoluteString
                if stringUrl.starts(with: "http") {
                    handleUniversalLink(url: stringUrl)
                } else if stringUrl.starts(with: "fb") {
                    let _ = ApplicationDelegate.shared.application(
                        UIApplication.shared,
                        open: url,
                        sourceApplication: nil,
                        annotation: [UIApplication.OpenURLOptionsKey.annotation])
                }
            }
    }
}
I need to detect when a user taps on the status bar and call some functions when they do. Is this possible in Swift?
Topic: UI Frameworks
SubTopic: SwiftUI
This crash occurs at app launch, but not every time; the probability of occurrence is relatively low and it is not easy to reproduce. Please advise on how to deal with it. Thank you.
Please see the crash log below.
Crash log
I understand two key concepts from desktop platforms:
Screen Mirroring – The same content is displayed on both the primary and external screens.
Screen Extension – The external display shows different content that complements what's on the main screen.
My question pertains to the second point: Is it possible to extend the display on iOS and iPadOS devices?
I'm referring to this Apple documentation, which explains how to extend content from an iOS/iPadOS device to an external display.
I tested this in a sample iOS Xcode project. In the iOS Simulator, I was able to detect an "external display" and present a separate UIWindow on it. However, when I tried the same on a real device (iPhone 15 connected to a MacBook Pro via cable), the external display connection was not detected.
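For context, this is roughly the scene-based detection path from that documentation that I used in the simulator test; the configuration names here are placeholders from my test project, and the role spelling below requires iOS 16 or later:

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        // iOS 16+: external (non-interactive) displays connect with this role.
        if connectingSceneSession.role == .windowExternalDisplayNonInteractive {
            return UISceneConfiguration(name: "External Display",
                                        sessionRole: connectingSceneSession.role)
        }
        return UISceneConfiguration(name: "Default Configuration",
                                    sessionRole: connectingSceneSession.role)
    }
}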
I’d like to confirm whether screen extension is possible on a real iOS device. From my research, it appears that extension is only supported on iPadOS via Stage Manager, but I want to verify if there’s any way to achieve this on an iPhone. If so, are there any known apps that currently utilize extended display functionality on iOS?
If extension is not possible on iOS, why does the documentation mention iOS at all?
I'm encountering an issue displaying a large HTML string (over 11470 characters) in a UILabel. Specifically, the Arabic text within the string is rendering left-to-right instead of the correct right-to-left direction. I've provided a truncated version of the HTML string and the relevant code snippet below. I've tried setting the UILabel's text alignment to right, but this didn't resolve the issue. Could you please advise on how to resolve this bidirectional text rendering problem?
The results of the correct and incorrect approaches are shown in the image below.
Here's the relevant Swift code:
let labelView: UILabel = {
    let label = UILabel()
    label.textAlignment = .right
    label.translatesAutoresizingMaskIntoConstraints = false
    label.numberOfLines = 0
    label.semanticContentAttribute = .forceRightToLeft
    label.backgroundColor = .white
    label.lineBreakMode = .byWordWrapping
    return label
}()
// Important: the HTML string must exceed 11,470 characters to reproduce the issue.
let htmlString = """
<p style=\"text-align: center;\"><strong>İSTİÂZE</strong></p> <p>Nahl sûresindeki:</p>
<p dir="rtl" lang="ar"> فَاِذَا قَرَاْتَ الْقُرْاٰنَ فَاسْتَعِذْ بِاللّٰهِ مِنَ الشَّيْطَانِ الرَّج۪يمِ </p>
<p><strong>“</strong><strong>Kur’an okuyacağın zaman kovulmuş şeytandan hemen Allah’a sığın!</strong><strong>”</strong> (Nahl 16/98) emri gereğince Kur’ân-ı Kerîm okumaya başlarken:</p> <p dir="rtl" lang="ar">اَعُوذُ بِاللّٰهِ مِنَ الشَّيْطَانِ الرَّج۪يمِ</p> <p><em>“Kovulmuş şeytandan Allah’a sığınırım” </em>deriz. Bu sözü söylemeye “istiâze<em>” denilir. “Eûzü”</em>, sığınırım, emân dilerim, yardım taleb ederim, gibi anlamlara gelir. It must exceed 11470 characters.</p>
"""
if let data = htmlString.data(using: .utf8) {
    let options: [NSAttributedString.DocumentReadingOptionKey: Any] = [
        .documentType: NSAttributedString.DocumentType.html,
        .characterEncoding: String.Encoding.utf8.rawValue
    ]
    do {
        let attributedString = try NSAttributedString(data: data, options: options, documentAttributes: nil)
        labelView.attributedText = attributedString
    } catch {
        print("Error while processing the HTML string: \(error)")
    }
}
I'm using iOS 18.2 and Swift 6. Any suggestions on how to correct the bidirectional text rendering?
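One workaround I'm experimenting with (I'm not sure it is the right fix) is to post-process the parsed attributed string and force a right-to-left base writing direction on paragraphs that contain Arabic characters:

import UIKit

// Walk each paragraph of the attributed string and force a right-to-left
// base writing direction wherever the paragraph contains Arabic characters.
func forcingRTLForArabicParagraphs(_ attributed: NSAttributedString) -> NSAttributedString {
    let result = NSMutableAttributedString(attributedString: attributed)
    let text = result.string as NSString

    text.enumerateSubstrings(in: NSRange(location: 0, length: text.length),
                             options: .byParagraphs) { paragraph, range, _, _ in
        guard let paragraph,
              paragraph.range(of: "[\\u0600-\\u06FF]", options: .regularExpression) != nil
        else { return }
        let existing = result.attribute(.paragraphStyle, at: range.location,
                                        effectiveRange: nil) as? NSParagraphStyle
        let style = (existing?.mutableCopy() as? NSMutableParagraphStyle) ?? NSMutableParagraphStyle()
        style.baseWritingDirection = .rightToLeft
        result.addAttribute(.paragraphStyle, value: style, range: range)
    }
    return result
}

The attributed string from the HTML parsing step would then be assigned as labelView.attributedText = forcingRTLForArabicParagraphs(attributedString).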
I have an app on the Mac App Store (so sandboxed) that includes a QuickLook Preview Extension that targets Markdown files. It established a QLPreviewingController instance for the macOS QuickLook system to access and it works.
I'm in the process of updating it so that it displays inline images referenced in the file, as well as styling the file's text. However, despite setting the Downloads folder read-only access permission (and user-selected file access, though I know that shouldn't be required: no open/save dialogs here) in the extension's entitlements, the sandbox refuses to allow access to the test image: I always get a deny(1) file-read-data error in the log.
FWIW, the test file is referenced in the source Markdown as an absolute unix file path.
I've tried different signings and no joy. I’ve tried placing the referenced image in various other locations. Also no joy. All I can display is the error-case bundle image for 'missing image'.
Question is, is this simply something that QuickLook extensions cannot do from within the sandbox, or am I missing something? Is there anything extra I can do to debug this?
Hi,
Despite having CarPlay capabilities authorised for our navigation app, our users are seeing some odd behaviour in the appearance of the icon in the sidebar menu on the side of the CarPlay display.
The documentation suggests the quick bar shows the most recently used navigation app. Here are the steps to reproduce:
1. Open our app in CarPlay.
2. Switch to another non-navigation app via the CarPlay sidebar.
3. Note that our navigation app remains in the sidebar.
4. Switch back to our navigation app.
5. Search for a destination, select it, and tap 'Let's Go' to start navigation.
6. Switch to a non-navigation app via the CarPlay sidebar.
7. Note that our app is replaced by another navigation app in the sidebar (Google/Apple), despite being the most recently used.
Any ideas?
Hello,
I recently implemented offerCodeRedemption in my app's subscription/onboarding flow. When I did, it broke the camera functionality elsewhere in the app (totally unrelated code).
I was able to fix the issue by going back to the old "AppStore.presentOfferCodeRedeemSheet" call with UIKit. I'm not sure why this is happening, but it looks like a bug to me.
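For clarity, this is roughly the UIKit-based fallback that worked for me; how the window scene is located here is just illustrative:

import StoreKit
import UIKit

@MainActor
func presentOfferCodeSheet() async {
    // Find an active window scene to present the redemption sheet in.
    guard let scene = UIApplication.shared.connectedScenes
        .compactMap({ $0 as? UIWindowScene })
        .first(where: { $0.activationState == .foregroundActive }) else { return }
    do {
        try await AppStore.presentOfferCodeRedeemSheet(in: scene)
    } catch {
        print("Offer code redemption sheet failed: \(error)")
    }
}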
Topic: UI Frameworks
SubTopic: SwiftUI
Hi,
I am having trouble setting default focus on a TextField that is inside an alert. I expected the TextField to receive default focus when the alert is presented, but it does not. This is happening on macOS 15.2 with Swift (SwiftUI) and Xcode 16.2, and it hasn't worked on previous versions either.
Example:
ContentView().alert("Sample Alert", isPresented: $present) {
    AlertView()
} message: {
    Text("Sample alert message.")
}
struct AlertView: View {
    @Namespace private var namespace
    @Environment(\.dismiss) private var dismiss
    @State private var text = ""

    var body: some View {
        VStack {
            TextField(text: $text, prompt: Text("Enter text")) {}
                .onSubmit {
                    print(text)
                    dismiss()
                }
                .autocorrectionDisabled()
                .lineLimit(1)
                .prefersDefaultFocus(in: namespace)
            Button("OK") {
                dismiss()
            }
            Button("Cancel", role: .cancel) {
                dismiss()
            }
        }
        .focusScope(namespace)
    }
}
Topic: UI Frameworks
SubTopic: SwiftUI
When integrating SwiftData into an already existing app that uses Core Data for data management, I encounter errors.
When building the ModelContainer for the first time, the following error appears:
Error: Persistent History (184) has to be truncated due to the following entities being removed (all entities except the two for which I defined a SwiftData model).
class SwiftDataManager: ObservableObject {
    static let shared = SwiftDataManager()
    private let persistenceManager = PersistenceManager.shared

    private init() {}

    lazy var modelContainer: ModelContainer = {
        do {
            let storeUrl = persistenceManager.storeURL()
            let schema = Schema([
                HistoryIncident.self,
                HistoryEvent.self
            ])
            let modelConfig = ModelConfiguration(url: storeUrl)
            return try ModelContainer(for: schema, configurations: [modelConfig])
        } catch {
            fatalError("Could not create ModelContainer: \(error)")
        }
    }()
}
@Model public class HistoryIncident {
    var missionNr: String?
    @Relationship(deleteRule: .cascade) var events: [HistoryEvent]?

    public init() {}
}

@Model class HistoryEvent {
    var decs: String?
    var timestamp: Date?

    init() {}
}
As soon as I call the following function:
func addMockEventsToCurrentHistorie() {
    var descriptor = FetchDescriptor<HistoryIncident>()
    let key = self.hKey ?? ""
    descriptor.predicate = #Predicate { mE in
        key == mE.key
    }
    let historyIncident = try? SwiftDataManager.shared.modelContext.fetch(descriptor).first
    guard var events = historyIncident?.events else { return }
    events.append(contentsOf: createEvents())
}
I get the error:
CoreData: error: (1) I/O error for database at /var/mobile/Containers/Data/Application/55E9D59D-48C4-4D86-8D9F-8F9CA019042D/Library/ Private Documents/appDatabase.sqlite. SQLite error code:1, 'no such column: t0.Z1EVENTS'
/var/mobile/Containers/Data/Application/55E9D59D-48C4-4D86-8D9F-8F9CA019042D/Library/ Private Documents/appDatabase.sqlite. SQLite error code:1, 'no such column: t0.Z1EVENTS' with userInfo of { NSFilePath = "/var/mobile/Containers/Data/Application/55E9D59D-48C4-4D86-8D9F-8F9CA019042D/Library/ Private Documents/appDatabase.sqlite"; NSSQLiteErrorDomain = 1; }
Suppose there are two buttons in a VStack; the second button is unclickable. I'm running macOS 15.2 with Xcode 16.2.
import SwiftUI

struct ContentView: View {
    var body: some View {
        ScrollView(.horizontal) {
            VStack {
                Spacer()
                // this button is clickable
                Button("foo") {
                    print("foo")
                }
                // this button can't be clicked
                Button("bar") {
                    print("bar")
                }
            }
        }
    }
}
If I change .horizontal to .vertical and VStack to HStack, the second button behaves normally.
If I remove the ScrollView, everything works fine.
It worked fine before macOS 15.2.
Topic: UI Frameworks
SubTopic: SwiftUI
We were using the QuickLook delegate methods below to get the modified PDF file URL after sketching, but the multi-line text is not laid out properly on the PDF and part of the text is missing. The other PencilKit tools work as expected.
func previewController(_ controller: QLPreviewController, didSaveEditedCopyOf previewItem: QLPreviewItem, at modifiedContentsURL: URL)
func previewController(_ controller: QLPreviewController, didUpdateContentsOf previewItem: any QLPreviewItem)
We tested all of this code on iOS 18.2.
Please let us know if there is any way to retrieve the URL of the PDF with the edited text without the text being corrupted.
Hi,
I thought that drag-and-drop reordering would be very easy with SwiftUI, but apparently I was wrong (unless I'm missing something). It seems that SwiftUI's drag-and-drop reorder is only easy for List, which supports the .onMove modifier.
However, for UI like a Grid or a horizontal ScrollView with items in an HStack, I don't see any easy way to implement this. For example:
ScrollView(.horizontal) {
    HStack {
        ForEach(items) { item in
            ItemView(item)
        }
    }
}
Does anyone know the best way to implement drag-and-drop reordering for this horizontal scroll view?
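One direction that looks plausible is the Transferable-based drag and drop modifiers (iOS 16+). A rough, untested sketch with an illustrative Item model and a naive reorder on drop:

import SwiftUI
import UniformTypeIdentifiers

// Illustrative item model; it must be Codable so it can be made Transferable.
struct Item: Identifiable, Hashable, Codable, Transferable {
    let id: UUID
    var title: String

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .json)
    }
}

struct ReorderableRow: View {
    @State private var items: [Item] = (1...5).map { Item(id: UUID(), title: "Item \($0)") }

    var body: some View {
        ScrollView(.horizontal) {
            HStack {
                ForEach(items) { item in
                    Text(item.title)
                        .padding()
                        .background(Color.gray.opacity(0.2))
                        // Each item is both a drag source...
                        .draggable(item)
                        // ...and a drop target: dropping moves the dragged
                        // item next to the item it was dropped on.
                        .dropDestination(for: Item.self) { dropped, _ in
                            guard let dragged = dropped.first,
                                  let from = items.firstIndex(of: dragged),
                                  let to = items.firstIndex(of: item) else { return false }
                            items.move(fromOffsets: IndexSet(integer: from),
                                       toOffset: to > from ? to + 1 : to)
                            return true
                        }
                }
            }
        }
    }
}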
I'm testing using Group Activities and having no trouble iOS<->iOS or starting an activity on macOS and joining via iOS. However, when I start an activity and then try to join it from another macOS client, the starting side joins the session just fine, but the receiving side acts like I don't have the required app, even when it is already running.
I see the active SharePlay icon in the menu bar, and the Current Activity is shown, but instead of an "Open" button there is a "MyApp Required" string and a "View" button that goes to the App Store. (Where the app is not available yet, as expected, since I'm still working on it.) There is no GroupSession started on that Mac yet, obviously.
I'm looking for any hints to help debug what is going on. How does Group Activities find the app for the activity on macOS and how can I figure out why it isn't finding mine?
Thanks!
I'm trying to achieve a specific UI design in SwiftUI where the bottom section of my List has a different background color than the top section. For example, in the Medications portion of the Health app, the "Your Medications" section has a different background than the "Log" section at the top. How do I achieve this?
Here is some example code. I wonder if I'm supposed to use two Lists instead; but if I use two Lists and nest them in a ScrollView, the heights of the lists need to be specified, and since I'm working with dynamic content I don't think that is ideal.
class ProtocolMedication {} // Example model

struct HomeView: View {
    @Query private var protocolMedications: [ProtocolMedication]

    var body: some View {
        NavigationStack {
            List {
                // Upper sections with default background
                Section {
                    Text("Content 1")
                } header: {
                    Text("Log")
                }
                // Bottom section that needs different background
                Section {
                    ForEach(protocolMedications) { medication in
                        Text(medication.name)
                    }
                } header: {
                    Text("Your Medications")
                }
            }
            .listStyle(.insetGrouped)
        }
    }
}
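For what it's worth, the closest I've gotten so far is applying listRowBackground to the rows of the bottom section, as in the sketch below; I'm not sure it matches the Health app styling exactly, and the color is arbitrary:

import SwiftUI

struct SectionBackgroundExample: View {
    var body: some View {
        List {
            Section("Log") {
                Text("Content 1")
            }
            // listRowBackground on the Section applies to all of its rows.
            Section("Your Medications") {
                Text("Medication A")
                Text("Medication B")
            }
            .listRowBackground(Color(uiColor: .systemIndigo).opacity(0.15))
        }
        .listStyle(.insetGrouped)
    }
}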
In my collection view I have allowsSelection, allowsSelectionDuringEditing, and allowsMultipleSelectionDuringEditing set to true.
In my delegate's collectionView(_:shouldBeginMultipleSelectionInteractionAt:) I return true unconditionally, getting me the desired behavior of triggering edit mode with a two-finger swipe. So far so good.
The problem is that collectionView(_:shouldBeginMultipleSelectionInteractionAt:) is also called on a single-finger click from a trackpad. Tapping one of my cells is supposed to open a sub-screen, so with this happening there's no way to navigate my screen using a trackpad. This also goes against the documentation, which says this delegate method is supposed to get called because of a two-finger swipe.
Is there a way to keep that call from happening on a trackpad click? Or a way to distinguish whether I'm getting the call because of an actual two-finger swipe?