Explore the power of machine learning and Apple Intelligence within apps. Discuss integrating features, share best practices, and explore the possibilities for your app here.

Posts under Machine Learning & AI topic

Post

Replies

Boosts

Views

Activity

CoreML multifunction model runtime memory cost
Recently, I'm trying to deploy some third-party LLMs to Apple devices. The methodology is similar to https://github.com/Anemll/Anemll. The biggest issue I'm having now is the runtime memory usage. When there are multiple functions in a model (mlpackage or mlmodelc), the runtime memory usage for weights is somehow duplicated when I load all of them. Here's the detail: I created my multifunction mlpackage following https://apple.github.io/coremltools/docs-guides/source/multifunction-models.html I loaded each of the functions using the generated Swift class: let config = MLModelConfiguration() config.computeUnits = MLComputeUnits.cpuAndNeuralEngine config.functionName = "infer_512"; let ffn1_infer_512 = try! mimo_FFN_PF_lut4_chunk_01of02(configuration: config) config.functionName = "infer_1024"; let ffn1_infer_1024 = try! mimo_FFN_PF_lut4_chunk_01of02(configuration: config) config.functionName = "infer_2048"; let ffn1_infer_2048 = try! mimo_FFN_PF_lut4_chunk_01of02(configuration: config) I observed that RAM usage increases linearly as I load each of the functions. Using Instruments, I see that there are multiple HWX files generated and loaded, each of which contains all the weight data. My understanding of what's happening here: the CoreML framework does some MIL->MIL preprocessing before further compilation, which includes separating the CPU workload from the ANE workload. The ANE part of each function is moved into a separate MIL file, then each is compiled separately into its own HWX file. The problem is that the weight data of these HWX files is duplicated. Since the weight data of LLMs is huge, this causes out-of-memory issues on mobile devices. The improvement I'm hoping for from Apple: I hope the processed MIL files can be merged back into one before calling ANECCompile(), so that the weights can be shared. I don't have control over that in user space and I'm not sure if that is feasible. So I'm asking for help here. Thanks.
1
0
207
Apr ’25
Translation Framework: Code 16 "Offline models not available" despite status showing .installed
Hi everyone, I'm experiencing inconsistent behavior with the Translation framework on iOS 18. The LanguageAvailability.status() API reports language models as .installed, but translation fails with Code 16. Setup: Using the translationTask modifier with TranslationSession Batch translation with explicit source/target languages Languages: Portuguese→English, German→English Issue: let status = await LanguageAvailability().status(from: sourceLang, to: targetLang) // Returns: .installed // But translation fails: let responses = try await session.translations(from: requests) // Error: TranslationErrorDomain Code=16 "Offline models not available" Logs: Language model installed: pt -> en Language model installed: de -> en Starting translation: de -> en Error Domain=TranslationErrorDomain Code=16 "Translation failed" NSLocalizedFailureReason=Offline models not available for language pair What I've tried: Re-downloading languages in Settings Using source: nil for auto-detection A fresh TranslationSession.Configuration each time Questions: Is there a way to force model re-validation/re-download programmatically? Should translationTask show a download popup when Code 16 occurs? Has anyone found a reliable workaround? I've seen similar reports in threads 791357 and 777113. Any guidance appreciated! Thanks!
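A possible workaround rather than a confirmed fix: calling prepareTranslation() inside the translationTask closure asks the session to validate the language pair up front and can surface the system download UI when the .installed status is stale. A minimal sketch:

```swift
import SwiftUI
import Translation

struct BatchTranslatorView: View {
    @State private var configuration: TranslationSession.Configuration?
    let requests: [TranslationSession.Request]

    var body: some View {
        Text("Translating…")
            .translationTask(configuration) { session in
                do {
                    // Ask the session to verify (and if needed download) the
                    // language pair before the batch call runs.
                    try await session.prepareTranslation()
                    let responses = try await session.translations(from: requests)
                    print(responses.map(\.targetText))
                } catch {
                    print("Translation failed: \(error)")
                }
            }
            .onAppear {
                // German -> English, matching the failing pair from the logs.
                configuration = .init(source: .init(identifier: "de"),
                                      target: .init(identifier: "en"))
            }
    }
}
```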
1
0
451
Jan ’26
Embedding model missing once transferred to Xcode
I've created a "Transfer Learning BERT Embeddings" model with the default "Latin" language family and "Automatic" Language setting. This model performs exceptionally well against the test data set and functions as expected when I preview it in Create ML. However, when I add it to the Xcode project of the application to which I am deploying it, I am getting runtime errors that suggest it can't find the embedding resources: Failed to locate assets for 'mul_Latn' - '5C45D94E-BAB4-4927-94B6-8B5745C46289' embedding model Note, I am adding the model to the app project the same way that I added an earlier "Maximum Entropy" model. That model had no runtime issues. So it seems there is an issue getting hold of the embeddings at runtime. For now, "runtime" means in the Simulator. I intend to deploy my application to iOS devices once GM 26 is released (the app also uses AFM). I'm developing on Tahoe 26 beta, running on iOS 26 beta, using Xcode 26 beta. Is this a known/expected issue? Are the embeddings expected to be a resource in the model? Is there a workaround? I did try opening the model in Xcode and saving it as an mlpackage, then adding that to my app project, but that also didn't resolve the issue.
1
0
528
Sep ’25
Tone, Sentiment, language analysis on iPhone - Ideas
Hi everyone, I’m exploring ideas around on-device analysis of user typing behavior on iPhone, and I’d love input from others who’ve worked in this area or thought about similar problems. Conceptually, I’m interested in things like: High-level sentiment or tone inferred from what a user types over time using ML models Identifying a user’s most important or frequent topics over a recent window (e.g., “last week”) Aggregated insights rather than raw text (privacy-preserving summaries: e.g., your typo rate by hour to infer highly efficient time slots, or "take-a-break" warnings when typing errors increase) I understand the significant privacy restrictions around keyboard input on iOS, especially for third-party keyboards and system text fields. I’m not trying to bypass those constraints—rather, I’m curious about what’s realistically possible within Apple’s frameworks and policies. (For instance, Grammarly as a correction tool includes some information about tone) Questions I’m thinking through: Are there any recommended approaches for on-device text analysis that don’t rely on capturing raw keystrokes? Has anyone used NLP / Core ML / Natural Language successfully for similar summarization or sentiment tasks, scoped only to user-explicit input? For custom keyboards, what kinds of derived or transient signals (if any) are acceptable to process and summarize locally? Any design patterns that balance usefulness with Apple’s privacy expectations? If you’ve built something adjacent—journaling, writing analytics, well-being apps, etc.—I’d appreciate hearing what worked, what didn’t, and what Apple reviewers were comfortable with. Thanks in advance for any ideas or references 🙏
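For the sentiment piece specifically, the Natural Language framework already scores text on device with no keystroke capture involved; a minimal sketch, scoped to text the user explicitly hands the app:

```swift
import NaturalLanguage

// Score the overall sentiment of user-explicit text, fully on device.
// The .sentimentScore scheme yields a value from -1.0 (negative) to 1.0 (positive).
func sentimentScore(for text: String) -> Double {
    let tagger = NLTagger(tagSchemes: [.sentimentScore])
    tagger.string = text
    let (tag, _) = tagger.tag(at: text.startIndex, unit: .paragraph, scheme: .sentimentScore)
    return Double(tag?.rawValue ?? "0") ?? 0
}
```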
1
1
614
Feb ’26
Custom keypoint detection model through vision api
Hi there, I have a custom keypoint detection model and want to use it via Vision's CoreMLRequest API. Here are some complications for input and output: For input: My model expects a 512x512 image, which would be resized and padded from a 1920x1080 frame. I use the .scaleToFit option, but can I also specify the color used for padding? For output: My model outputs a CoreMLFeatureValueObservation; can I have it output in a format Vision recognizes, such as joints/keypoints? If my model is able to output in a format Vision recognizes, would it take care of restoring the coordinates back to the original frame (undoing the padding)? If not, how do I restore them from the .scaleToFit option? Best,
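On the last question: for custom outputs Vision hands back the raw feature values and does not undo the letterboxing itself, so the mapping back to the original frame is usually done by hand. A minimal sketch of the inverse transform, assuming keypoints in model pixel coordinates and a scaled image anchored at the origin (offset by half the padding instead if your pipeline centers the image):

```swift
import CoreGraphics

// Map a keypoint from 512x512 model space back to a 1920x1080 frame,
// undoing a scale-to-fit (letterbox) preprocess.
func unpad(point: CGPoint, modelSide: CGFloat = 512,
           frameSize: CGSize = CGSize(width: 1920, height: 1080)) -> CGPoint {
    let scale = min(modelSide / frameSize.width, modelSide / frameSize.height)
    let scaledW = frameSize.width * scale   // 512 for a 1920-wide frame
    let scaledH = frameSize.height * scale  // 288 for a 1080-high frame
    // Clamp away any padding region, then rescale to frame coordinates.
    let x = min(point.x, scaledW) / scale
    let y = min(point.y, scaledH) / scale
    return CGPoint(x: x, y: y)
}
```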
1
0
932
Oct ’25
iOS 26 beta breaking my model
I just recently updated to iOS 26 beta (23A5336a) to test an app I am developing that runs an MLModel loaded from a .mlmodelc file. On the current iOS version 18.6.2 the model runs as expected with no issues. However, on iOS 26 I now get an error when performing an inference where I pass a camera frame into the model. Below is the error I see when I attempt to run an inference. At the bottom it says "Failed with status=0x1d : statusType=0x9: Program Inference error status=-1 Unable to compute the prediction using a neural network model. It can be an invalid input data or broken/unsupported model". Does this indicate I need to convert my model or something? I don't understand, since it runs as normal on iOS 18. Any help getting this to run again would be greatly appreciated. Thank you. processRequest:model:qos:qIndex:modelStringID:options:returnValue:error:: Could not process request ret=0x1d lModel=_ANEModel: { modelURL=file:///var/containers/Bundle/Application/04F01BF5-D48B-44EC-A5F6-3C7389CF4856/RizzCanvas.app/faceParsing.mlmodelc/ : sourceURL=(null) : UUID=46228BFC-19B0-45BF-B18D-4A2942EEC144 : key={"isegment":0,"inputs":{"input":{"shape":[512,512,1,3,1]}},"outputs":{"var_633":{"shape":[512,512,1,19,1]},"94_argmax_out_value":{"shape":[512,512,1,1,1]},"argmax_out":{"shape":[512,512,1,1,1]},"var_637":{"shape":[512,512,1,19,1]}}} : identifierSource=1 : cacheURLIdentifier=01EF2D3DDB9BA8FD1FDE18C7CCDABA1D78C6BD02DC421D37D4E4A9D34B9F8181_93D03B87030C23427646D13E326EC55368695C3F61B2D32264CFC33E02FFD9FF : string_id=0x00000000 : program=_ANEProgramForEvaluation: { programHandle=259022032430 : intermediateBufferHandle=13949 : queueDepth=127 } : state=3 : [Espresso::ANERuntimeEngine::__forward_segment 0] evaluate[RealTime]WithModel returned 0; code=8 err=Error Domain=com.apple.appleneuralengine Code=8 "processRequest:model:qos:qIndex:modelStringID:options:returnValue:error:: ANEProgramProcessRequestDirect() Failed with status=0x1d : statusType=0x9: Program Inference error" UserInfo={NSLocalizedDescription=processRequest:model:qos:qIndex:modelStringID:options:returnValue:error:: ANEProgramProcessRequestDirect() Failed with status=0x1d : statusType=0x9: Program Inference error} [Espresso::handle_ex_plan] exception=Espresso exception: "Generic error": ANEF error: /private/var/containers/Bundle/Application/04F01BF5-D48B-44EC-A5F6-3C7389CF4856/RizzCanvas.app/faceParsing.mlmodelc/model.espresso.net, processRequest:model:qos:qIndex:modelStringID:options:returnValue:error:: ANEProgramProcessRequestDirect() Failed with status=0x1d : statusType=0x9: Program Inference error status=-1 Unable to compute the prediction using a neural network model. It can be an invalid input data or broken/unsupported model (error code: -1). Error Domain=com.apple.Vision Code=3 "The VNCoreMLTransform request failed" UserInfo={NSLocalizedDescription=The VNCoreMLTransform request failed, NSUnderlyingError=0x114d92940 {Error Domain=com.apple.CoreML Code=0 "Unable to compute the prediction using a neural network model. It can be an invalid input data or broken/unsupported model (error code: -1)." UserInfo={NSLocalizedDescription=Unable to compute the prediction using a neural network model. It can be an invalid input data or broken/unsupported model (error code: -1).}}}
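Not a root-cause fix, but a way to narrow it down: the log points into the ANE path (ANEProgramProcessRequestDirect), so excluding the Neural Engine is a quick test of whether the iOS 26 regression is ANE-specific. A minimal sketch:

```swift
import CoreML

// Diagnostic sketch: exclude the Neural Engine and check whether the same
// model predicts correctly on CPU/GPU under iOS 26.
let modelURL = Bundle.main.url(forResource: "faceParsing", withExtension: "mlmodelc")!
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU  // bypass the ANE as a test
let model = try MLModel(contentsOf: modelURL, configuration: config)
```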
1
0
1.2k
Sep ’25
FoundationModels Content Sanitizer Blocking Legitimate Text Processing
I'm developing a macOS application using the FoundationModels framework (LanguageModelSession) and encountering issues with the content sanitizer blocking legitimate text input. Issue Description: The content sanitizer is flagging text strings that contain certain substrings, even when they represent legitimate technical content. For example: F_SEEL_SEX1S.wav (sE Electronics SEX1S microphone model) Technical product identifiers Serial numbers and version codes Broader Concern: The content sanitizer appears to be applying restrictions that seem inappropriate for user-owned content. Even if a filename were something like "human sex.wav", users should have the right to process their own legitimate files on their own devices without content filtering interference. Error Messages: SensitiveContentSettings: Sanitizer model found unsafe content in value FoundationModels.LanguageModelSession.GenerationError error 2 Questions: 1. Is there a way to disable content sanitization for processing user-owned content? 2. What's the recommended approach for applications that need to handle arbitrary user text? 3. Are there APIs to process personal content without filtering restrictions? Environment: macOS 26.0 FoundationModels framework LanguageModelSession Any guidance would be appreciated.
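I'm not aware of a public switch that disables the sanitizer; a practical pattern is to catch the guardrail error per item so a single flagged filename doesn't fail the whole run. A sketch, assuming the shipped guardrailViolation error case:

```swift
import FoundationModels

// Per-item guardrail handling: a flagged input returns nil instead of
// aborting the batch.
func summarize(_ text: String) async -> String? {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "Summarize: \(text)")
        return response.content
    } catch LanguageModelSession.GenerationError.guardrailViolation {
        return nil  // flagged by the content sanitizer; skip or fall back
    } catch {
        return nil  // other generation failures
    }
}
```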
1
0
397
Jun ’25
Provide spoken voice search string
Hello, My goal is to enable users to perform a freeform search request for any product I sell using a spoken phrase, for example, "Hey Siri, search GAMING CONSOLES on MyCatalogApp". The result would launch MyCatalogApp and navigate to a search results page displaying gaming consoles. I have defined a SearchIntent (using the .system.search schema) and a Shortcut to accomplish this. However, Siri doesn't seem to be able to correctly parse the spoken phrase, extract the search string, and provide it as the criteria term within SearchIntent. What am I doing wrong? Here is the SearchIntent. Note the print() statement should output the search string (which in the scenario above would be "GAMING CONSOLES"), but it doesn't work. import AppIntents @available(iOS 17.2, *) @AppIntent(schema: .system.search) struct SearchIntent: ShowInAppSearchResultsIntent { static var searchScopes: [StringSearchScope] = [.general] @Parameter(title: "Criteria") var criteria: StringSearchCriteria static var title: LocalizedStringResource = "Search with MyCatalogApp" @MainActor func perform() async throws -> some IntentResult { let searchString = criteria.term print("**** Search String: \(searchString) ****") // tmp debugging try await MyCatalogSearchHelper.search(for: searchString) // fetch results via my async fetch API return .result() } } Here's the Shortcuts definition: import AppIntents @available(iOS 17.2, *) struct Shortcuts: AppShortcutsProvider { @AppShortcutsBuilder static var appShortcuts: [AppShortcut] { AppShortcut( intent: SearchIntent(), phrases: ["Search for \(\.$criteria) on \(.applicationName)."], shortTitle: "Search", systemImageName: "magnifyingglass" ) } } Thanks for any help!
1
0
570
5d
linear_quantize_activations taking 90 minutes + on MacBook Air M1 2020
In my quantization code, the line: compressed_model_a8 = cto.coreml.experimental.linear_quantize_activations( model, activation_config, [{'img':np.random.randn(1,13,1024,1024)}] ) has taken 90 minutes to run so far and is still not completed. From debugging, I can see that the line it's stuck on is line 261 in _model_debugger.py: model = ct.models.MLModel( cloned_spec, weights_dir=self.weights_dir, compute_units=compute_units, skip_model_load=False, # Don't skip model load as we need model prediction to get activations range. ) Is this expected behaviour? Would it be quicker to run on another computer with more RAM?
1
0
176
Mar ’25
Train adapter with tool calling
Documentation on adapter training lacks any details related to training on a dataset with tool calling. And the page about tool calling itself only explains how to use it from Swift, without any internal details useful for training. The question is: what should the schema look like for including tool calling in a dataset?
1
0
274
Jun ’25
Siri 2.0 (suggestions and future updates)
Hey dear developers! This post is meant as a place for future Siri updates and improvements, but also for wishes, so that everyone in this forum can share their opinions and ideas. Please stay friendly. Have fun! I had already thought about developing a demo app to demonstrate my idea for a better Siri. One change of my many: Wish Update: Siri's language recognition capabilities have been significantly enhanced. Instead of manually setting the language, Siri can now automatically recognize the language you intend to use, making language switching much more efficient. Simply speak the language you want to communicate in, and Siri will automatically recognize it and respond accordingly. Whether you speak English, German, or Japanese, Siri will respond in the language you choose.
1
1
941
Oct ’25
Unified Use Case Mail Categories & Spam
Hi Apple product owners. I am missing a unified concept that might be derived from the use cases for mail categories and mail spam in the Mail app on Mac. I need a recommendation on how to use categories in combination with the spam filter to get the most out of them. So I was looking for the use cases for the two functionality areas in order to figure out how to organise my mails with as much automation as possible before I start creating intelligent folders in addition. Where would you recommend I get this information from? I don't want to guess or read a lot of forum contributions that are based on guesses.
1
0
97
Apr ’25
What is the Foundation Models support for basic math?
I am experimenting with Foundation Models in my time tracking app to analyze users' tracked events, but I am finding that the model struggles with even basic computation of time, specifically converting from seconds to hours and minutes. To give just one example, when I prompt: "Convert 3672 seconds to hours, minutes, and seconds. Don't include the calculations in the resulting output" I get this: "3672 seconds is equal to 1 hour, 0 minutes, and 36 seconds". This is clearly wrong - it should be 1 hour, 1 minute, and 12 seconds. Another issue that I saw a lot is that seconds were treated as minutes, or that the hours were just completely off. What can I do to make the support for math better? Or is that just something that the model is not meant to be used for?
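What tends to work is keeping arithmetic out of the model entirely: compute the conversion deterministically in code (or expose it as a tool the session can call) and let the model only phrase the result. A sketch using Foundation's DateComponentsFormatter:

```swift
import Foundation

// Do the arithmetic in code and hand the model only the formatted result;
// small on-device LLMs are not reliable calculators.
func formatted(seconds: Int) -> String {
    let formatter = DateComponentsFormatter()
    formatter.allowedUnits = [.hour, .minute, .second]
    formatter.unitsStyle = .full
    return formatter.string(from: TimeInterval(seconds)) ?? "\(seconds) seconds"
}

// formatted(seconds: 3672) -> "1 hour, 1 minute, 12 seconds"
```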
1
0
218
Jun ’25
ImagePlayground API not working on Xcode Simulator Devices
Hi! I'm trying to use the ImagePlayground API in SwiftUI with the .imagePlaygroundSheet modifier. However, when the sheet is shown (in the preview or in the simulator) it displays the following message: "Image Playground is not available. Image Playground is not available on this iPhone.". I'm using an iPhone 16 Pro with iOS 18.3.1 in the Xcode (16.2) Simulator. Anyone else having this problem? How can I fix it?
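Image Playground requires Apple Intelligence, which the Simulator does not provide, so the sheet reports itself unavailable there; a physical device with Apple Intelligence enabled is the usual test route. Gating the UI on the environment's availability flag avoids the dead-end sheet, as in this sketch:

```swift
import SwiftUI
import ImagePlayground

struct PlaygroundButton: View {
    // false on the Simulator and on devices without Apple Intelligence
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground
    @State private var isPresented = false

    var body: some View {
        Button("Create Image") { isPresented = true }
            .disabled(!supportsImagePlayground)
            .imagePlaygroundSheet(isPresented: $isPresented) { url in
                print("Created image at \(url)")
            }
    }
}
```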
1
0
200
Apr ’25
Threading issues when using debugger
Hi, I am modifying the sample camera app that is here: https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview ... In the processPreviewImages, I am using the Vision APIs to generate a segmentation mask for a person/object, then compositing that person onto a different background (with some other filtering). The filtering and compositing is done via CoreImage. At the end, I convert the CIImage to a CGImage then to a SwiftUI Image. When I run it on my iPhone, it works fine, and has not crashed. When I run it on the iPhone with the debugger, it crashes within a few seconds with: EXC_BAD_ACCESS in libRPAC.dylib`std::__1::__hash_table<std::__1::__hash_value_type<long, qos_info_t>, std::__1::__unordered_map_hasher<long, std::__1::__hash_value_type<long, qos_info_t>, std::__1::hash, std::__1::equal_to, true>, std::__1::__unordered_map_equal<long, std::__1::__hash_value_type<long, qos_info_t>, std::__1::equal_to, std::__1::hash, true>, std::__1::allocator<std::__1::__hash_value_type<long, qos_info_t>>>::__emplace_unique_key_args<long, std::__1::piecewise_construct_t const&, std::__1::tuple<long const&>, std::__1::tuple<>>: It had previously been working fine with the debugger, so I'm not sure what has changed. Is there a difference in how the Vision APIs are executed if the debugger is attached vs. not?
1
0
402
Jan ’26
AXSpeech Crash
I have a terrible crash problem in my app when I use AVSpeechSynthesizer, and I can't reproduce it. Here is my code (it's a singleton): - (void)stopSpeech { if ([self.synthesizer isPaused]) { return; } if ([self.synthesizer isSpeaking]) { BOOL isSpeech = [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate]; if (!isSpeech) { [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord]; } } self.stopBlock ? self.stopBlock() : nil; } -(AVSpeechSynthesizer *)synthesizer { if (!_synthesizer) { _synthesizer = [[AVSpeechSynthesizer alloc] init]; _synthesizer.delegate = self; } return _synthesizer; } When the user leaves the page, I call the stopSpeech method. Then I get a lot of crash messages. Here is a crash log: # Crashlytics - plaintext stacktrace downloaded by liweican at Mon, 13 May 2019 03:03:24 GMT # URL: https://fabric.io/youdao-dict/ios/apps/com.youdao.udictionary/issues/5a904ed88cb3c2fa63ad7ed3?time=last-thirty-days/sessions/b1747d91bafc4680ab0ca8e3a702c52c_DNE_0_v2 # Organization: zzz # Platform: ios # Application: U-Dictionary # Version: 3.0.5.4 # Bundle Identifier: com.youdao.UDictionary # Issue ID: 5a904ed88cb3c2fa63ad7ed3 # Session ID: b1747d91bafc4680ab0ca8e3a702c52c_DNE_0_v2 # Date: 2019-05-13T02:27:00Z # OS Version: 12.2.0 (16E227) # Device: iPhone 8 Plus # RAM Free: 17% # Disk Free: 64.6% #19. Crashed: AXSpeech 0 libsystem_pthread.dylib 0x19c15e5b8 pthread_mutex_lock$VARIANT$armv81 + 102 1 CoreFoundation 0x19c4cf84c CFRunLoopSourceSignal + 68 2 Foundation 0x19cfc7280 performQueueDequeue + 464 3 Foundation 0x19cfc680c __NSThreadPerformPerform + 136 4 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24 5 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88 6 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256 7 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004 8 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 9 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300 10 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264 11 Foundation 0x19cfc66e4 __NSThread__start__ + 984 12 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 13 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 14 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 -- #0.
com.apple.main-thread 0 libsystem_malloc.dylib 0x19c11ce24 small_free_list_remove_ptr_no_clear + 768 1 libsystem_malloc.dylib 0x19c11f094 small_malloc_from_free_list + 296 2 libsystem_malloc.dylib 0x19c11f094 small_malloc_from_free_list + 296 3 libsystem_malloc.dylib 0x19c11d63c small_malloc_should_clear + 224 4 libsystem_malloc.dylib 0x19c11adcc szone_malloc_should_clear + 132 5 libsystem_malloc.dylib 0x19c123c18 malloc_zone_malloc + 156 6 CoreFoundation 0x19c569ab4 __CFBasicHashRehash + 300 7 CoreFoundation 0x19c56b430 __CFBasicHashAddValue + 96 8 CoreFoundation 0x19c56ab9c CFBasicHashAddValue + 2160 9 CoreFoundation 0x19c49f3bc CFDictionaryAddValue + 260 10 CoreFoundation 0x19c572ee8 __54-[CFPrefsSource mergeIntoDictionary:sourceDictionary:]_block_invoke + 28 11 CoreFoundation 0x19c49f0b4 __CFDictionaryApplyFunction_block_invoke + 24 12 CoreFoundation 0x19c568b7c CFBasicHashApply + 116 13 CoreFoundation 0x19c49f090 CFDictionaryApplyFunction + 168 14 CoreFoundation 0x19c42f504 -[CFPrefsSource mergeIntoDictionary:sourceDictionary:] + 136 15 CoreFoundation 0x19c4bcd38 -[CFPrefsSearchListSource alreadylocked_getDictionary:] + 644 16 CoreFoundation 0x19c42e71c -[CFPrefsSearchListSource alreadylocked_copyValueForKey:] + 152 17 CoreFoundation 0x19c42e660 -[CFPrefsSource copyValueForKey:] + 60 18 CoreFoundation 0x19c579e88 __76-[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:]_block_invoke + 40 19 CoreFoundation 0x19c4bdff4 __108-[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:]_block_invoke + 272 20 CoreFoundation 0x19c4bda38 normalizeQuintuplet + 340 21 CoreFoundation 0x19c42c634 -[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:] + 108 22 CoreFoundation 0x19c42cec0 -[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:] + 148 23 CoreFoundation 0x19c57c2d0 _CFPreferencesCopyAppValueWithContainerAndConfiguration + 124 24 TextInput 0x1a450e550 -[TIPreferencesController valueForPreferenceKey:] + 460 25 UIKitCore 0x1c87c71f8 -[UIKeyboardPreferencesController handBias] + 36 26 UIKitCore 0x1c887275c -[UIKeyboardLayoutStar showKeyboardWithInputTraits:screenTraits:splitTraits:] + 320 27 UIKitCore 0x1c88f4240 -[UIKeyboardImpl finishLayoutChangeWithArguments:] + 492 28 UIKitCore 0x1c88f47c8 -[UIKeyboardImpl updateLayout] + 1208 29 UIKitCore 0x1c88eaad0 -[UIKeyboardImpl updateLayoutIfNecessary] + 448 30 UIKitCore 0x1c88eab9c -[UIKeyboardImpl setFrame:] + 140 31 UIKitCore 0x1c88d5d60 -[UIKeyboard activate] + 652 32 UIKitCore 0x1c894c90c -[UIKeyboardAutomatic activate] + 128 33 UIKitCore 0x1c88d5158 -[UIKeyboard setFrame:] + 296 34 UIKitCore 0x1c88d81b0 -[UIKeyboard _didChangeKeyplaneWithContext:] + 228 35 UIKitCore 0x1c88f4aa0 -[UIKeyboardImpl didMoveToSuperview] + 136 36 UIKitCore 0x1c8f2ad84 __45-[UIView(Hierarchy) _postMovedFromSuperview:]_block_invoke + 888 37 UIKitCore 0x1c8f2a970 -[UIView(Hierarchy) _postMovedFromSuperview:] + 760 38 UIKitCore 0x1c8f39ddc -[UIView(Internal) _addSubview:positioned:relativeTo:] + 1740 39 UIKitCore 0x1c88d5d84 -[UIKeyboard activate] + 688 40 UIKitCore 0x1c894c90c -[UIKeyboardAutomatic activate] + 128 41 UIKitCore 0x1c893b3a4 -[UIPeripheralHost(UIKitInternal) _reloadInputViewsForResponder:] + 1332 42 UIKitCore 0x1c8ae66d8 -[UIResponder(UIResponderInputViewAdditions) reloadInputViews] + 80 43 UIKitCore 0x1c8ae23bc -[UIResponder becomeFirstResponder] + 804 44 UIKitCore 0x1c8f2a560 -[UIView(Hierarchy) 
becomeFirstResponder] + 156 45 UIKitCore 0x1c8d93e84 -[UITextField becomeFirstResponder] + 244 46 UIKitCore 0x1c8d578dc -[UITextInteractionAssistant(UITextInteractionAssistant_Internal) setFirstResponderIfNecessary] + 192 47 UIKitCore 0x1c8d45d8c -[UITextSelectionInteraction oneFingerTap:] + 3136 48 UIKitCore 0x1c86e0bcc -[UIGestureRecognizerTarget _sendActionWithGestureRecognizer:] + 64 49 UIKitCore 0x1c86e8dd4 _UIGestureRecognizerSendTargetActions + 124 50 UIKitCore 0x1c86e6778 _UIGestureRecognizerSendActions + 316 51 UIKitCore 0x1c86e5ca4 -[UIGestureRecognizer _updateGestureWithEvent:buttonEvent:] + 760 52 UIKitCore 0x1c86d9d80 _UIGestureEnvironmentUpdate + 2180 53 UIKitCore 0x1c86d94b0 -[UIGestureEnvironment _deliverEvent:toGestureRecognizers:usingBlock:] + 384 54 UIKitCore 0x1c86d9290 -[UIGestureEnvironment _updateForEvent:window:] + 204 55 UIKitCore 0x1c8af14a8 -[UIWindow sendEvent:] + 3112 56 UIKitCore 0x1c8ad1534 -[UIApplication sendEvent:] + 340 57 UIKitCore 0x1c8b977c0 __dispatchPreprocessedEventFromEventQueue + 1768 58 UIKitCore 0x1c8b99eec __handleEventQueueInternal + 4828 59 UIKitCore 0x1c8b9311c __handleHIDEventFetcherDrain + 152 60 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24 61 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88 62 CoreFoundation 0x19c4d1b24 __CFRunLoopDoSources0 + 176 63 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004 64 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 65 GraphicsServices 0x19e6cc79c GSEventRunModal + 104 66 UIKitCore 0x1c8ab7b68 UIApplicationMain + 212 67 UDictionary 0x10517e138 main (main.m:17) 68 libdyld.dylib 0x19bf928e0 start + 4 #1. Thread 0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8 1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340 2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4 #2. com.apple.uikit.eventfetch-thread 0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8 1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72 2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236 3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360 4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 5 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300 6 Foundation 0x19ce99e5c -[NSRunLoop(NSRunLoop) runUntilDate:] + 96 7 UIKitCore 0x1c8b9d540 -[UIEventFetcher threadMain] + 136 8 Foundation 0x19cfc66e4 __NSThread__start__ + 984 9 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 10 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 11 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #3. 
JavaScriptCore bmalloc scavenger 0 libsystem_kernel.dylib 0x19c0ddee4 __psynch_cvwait + 8 1 libsystem_pthread.dylib 0x19c15d4a4 _pthread_cond_wait$VARIANT$armv81 + 628 2 libc++.1.dylib 0x19b6b5090 std::__1::condition_variable::wait(std::__1::unique_lock<std::__1::mutex>&) + 24 3 JavaScriptCore 0x1a36a2238 void std::__1::condition_variable_any::wait<std::__1::unique_lock<bmalloc::Mutex> >(std::__1::unique_lock<bmalloc::Mutex>&) + 108 4 JavaScriptCore 0x1a36a622c bmalloc::Scavenger::threadRunLoop() + 176 5 JavaScriptCore 0x1a36a59a4 bmalloc::Scavenger::Scavenger(std::__1::lock_guard<bmalloc::Mutex>&) + 10 6 JavaScriptCore 0x1a36a73e4 std::__1::__thread_specific_ptr<std::__1::__thread_struct>::set_pointer(std::__1::__thread_struct*) + 38 7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #4. WebThread 0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8 1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72 2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236 3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360 4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 5 WebCore 0x1a5126480 RunWebThread(void*) + 600 6 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 7 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 8 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #5. com.twitter.crashlytics.ios.MachExceptionServer 0 UDictionary 0x1058a5564 CLSProcessRecordAllThreads (CLSProcess.c:376) 1 UDictionary 0x1058a594c CLSProcessRecordAllThreads (CLSProcess.c:407) 2 UDictionary 0x1058952dc CLSHandler (CLSHandler.m:26) 3 UDictionary 0x1058906cc CLSMachExceptionServer (CLSMachException.c:446) 4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #6. com.apple.NSURLConnectionLoader 0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8 1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72 2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236 3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360 4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 5 CFNetwork 0x19cae574c -[__CoreSchedulingSetRunnable runForever] + 216 6 Foundation 0x19cfc66e4 __NSThread__start__ + 984 7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #7. AVAudioSession Notify Thread 0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8 1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72 2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236 3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360 4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 5 AVFAudio 0x1a238a378 GenericRunLoopThread::Entry(void*) + 156 6 AVFAudio 0x1a23b4c60 CAPThread::Entry(CAPThread*) + 88 7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #8.
WebCore: LocalStorage 0 libsystem_kernel.dylib 0x19c0ddee4 __psynch_cvwait + 8 1 libsystem_pthread.dylib 0x19c15d4a4 _pthread_cond_wait$VARIANT$armv81 + 628 2 JavaScriptCore 0x1a3668ce4 ***::ThreadCondition::timedWait(***::Mutex&, ***::WallTime) + 80 3 JavaScriptCore 0x1a364f96c ***::ParkingLot::parkConditionallyImpl(void const*, ***::ScopedLambda<bool ()> const&, ***::ScopedLambda<void ()> const&, ***::TimeWithDynamicClockType const&) + 2004 4 WebKitLegacy 0x1a67b6ea8 bool ***::Condition::waitUntil<***::Lock>(***::Lock&, ***::TimeWithDynamicClockType const&) + 184 5 WebKitLegacy 0x1a67b9ba4 std::__1::unique_ptr<***::Function<void ()>, std::__1::default_delete<***::Function<void ()> > > ***::MessageQueue<***::Function<void ()> >::waitForMessageFilteredWithTimeout<***::MessageQueue<***::Function<void ()> >::waitForMessage()::'lambda'(***::Function<void ()> const&)>(***::MessageQueueWaitResult&, ***::MessageQueue<***::Function<void ()> >::waitForMessage()::'lambda'(***::Function<void ()> const&)&&, ***::WallTime) + 156 6 WebKitLegacy 0x1a67b91c0 WebCore::StorageThread::threadEntryPoint() + 68 7 JavaScriptCore 0x1a3666f88 ***::Thread::entryPoint(***::Thread::NewThreadContext*) + 260 8 JavaScriptCore 0x1a3668494 ***::wtfThreadEntryPoint(void*) + 12 9 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 10 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 11 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #9. com.apple.CoreMotion.MotionThread 0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8 1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72 2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236 3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360 4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 5 CoreFoundation 0x19c4cd0b0 CFRunLoopRun + 80 6 CoreMotion 0x1a1df0240 (Missing) 7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #10. Thread 0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8 1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340 2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4 #11. Thread 0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8 1 libsystem_pthread.dylib 0x19c1611f8 _pthread_wqthread + 532 2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4 #12. com.apple.CFStream.LegacyThread 0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8 1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72 2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236 3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360 4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 5 CoreFoundation 0x19c4e5094 _legacyStreamRunLoop_workThread + 260 6 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 7 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 8 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #13. Thread 0 libsystem_pthread.dylib 0x19c163cd0 start_wqthread + 190 #14. Thread 0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8 1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340 2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4 #15. Thread 0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8 1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340 2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4 #16.
Thread 0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8 1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64 2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72 3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344 4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #17. Thread 0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8 1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64 2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72 3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344 4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #18. Thread 0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8 1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64 2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72 3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344 4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #19. Crashed: AXSpeech 0 libsystem_pthread.dylib 0x19c15e5b8 pthread_mutex_lock$VARIANT$armv81 + 102 1 CoreFoundation 0x19c4cf84c CFRunLoopSourceSignal + 68 2 Foundation 0x19cfc7280 performQueueDequeue + 464 3 Foundation 0x19cfc680c __NSThreadPerformPerform + 136 4 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24 5 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88 6 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256 7 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004 8 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 9 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300 10 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264 11 Foundation 0x19cfc66e4 __NSThread__start__ + 984 12 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 13 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 14 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 #20. 
AXSpeech 0 (Missing) 0x1071ba524 (Missing) 1 (Missing) 0x1071b3e7c (Missing) 2 (Missing) 0x10718fba4 (Missing) 3 (Missing) 0x107184bc8 (Missing) 4 libdyld.dylib 0x19bf95908 dlopen + 176 5 CoreFoundation 0x19c5483e8 _CFBundleDlfcnLoadBundle + 140 6 CoreFoundation 0x19c486918 _CFBundleLoadExecutableAndReturnError + 352 7 Foundation 0x19ced5734 -[NSBundle loadAndReturnError:] + 428 8 TextToSpeech 0x1abfff800 TTSSpeechUnitTestingMode + 1020 9 libdispatch.dylib 0x19bf817d4 _dispatch_client_callout + 16 10 libdispatch.dylib 0x19bf52040 _dispatch_once_callout + 28 11 TextToSpeech 0x1abfff478 TTSSpeechUnitTestingMode + 116 12 libobjc.A.dylib 0x19b7173cc CALLING_SOME_+initialize_METHOD + 24 13 libobjc.A.dylib 0x19b71cee0 initializeNonMetaClass + 296 14 libobjc.A.dylib 0x19b71e640 initializeAndMaybeRelock(objc_class*, objc_object*, mutex_tt<false>&, bool) + 260 15 libobjc.A.dylib 0x19b7265a4 lookUpImpOrForward + 244 16 libobjc.A.dylib 0x19b733858 _objc_msgSend_uncached + 56 17 libAXSpeechManager.dylib 0x1ac167324 -[AXSpeechManager _initialize] + 68 18 Foundation 0x19cfc68d4 __NSThreadPerformPerform + 336 19 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24 20 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88 21 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256 22 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004 23 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436 24 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300 25 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264 26 Foundation 0x19cfc66e4 __NSThread__start__ + 984 27 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128 28 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44 29 libsystem_pthread.dylib 0x19c163cdc thread_start + 4 I changed my code like this, and it still has the same problem: - (void)stopSpeech { if (self.synthesizer != nil && [self.synthesizer isPaused]) { return; } // if ([self.synthesizer isSpeaking]) { // BOOL isSpeech = [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate]; // if (!isSpeech) { // [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord]; // } // } if (self.synthesizer != nil) { [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate]; // if (!isSpeech) { // [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord]; // } self.stopBlock ? self.stopBlock() : nil; } }
1
1
2.5k
4d
How to get access to VisionPro cameras?
Access to VisionPro cameras is required for a research project. The project is on mixed reality software development for healthcare applications in dentistry.
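Direct camera access on Apple Vision Pro is gated behind the visionOS Enterprise APIs (entitlement com.apple.developer.arkit.main-camera-access.allow, granted through a managed entitlement request), which Apple positions for in-house use cases such as research. A rough sketch of the documented ARKit enterprise path; treat the exact signatures as something to verify against the current SDK:

```swift
import ARKit

// Requires visionOS 2+ and the main-camera-access enterprise entitlement.
func streamCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()
    try await session.run([provider])

    // Pick a supported format for the main (left) camera.
    let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
    guard let format = formats.first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        // One CVPixelBuffer per frame, ready for Vision/CoreML processing.
        _ = frame.sample(for: .left)?.pixelBuffer
    }
}
```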
1
0
604
Jul ’25
Cannot find 'SystemLanguageModel' in scope
Hi everyone, I am using Xcode 16.4 on macOS Sequoia 15.5 with Apple Intelligence turned on. The following code gives the error message in the title: import NaturalLanguage @available(iOS 18.0, *) func testSystemModel() { let model = SystemLanguageModel.default print(model) } What am I missing?
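SystemLanguageModel comes from the FoundationModels framework, not NaturalLanguage, and FoundationModels first appears in the Xcode 26 / macOS 26 SDKs, so Xcode 16.4 on Sequoia 15.5 cannot resolve the symbol regardless of the import. On the newer toolchain, a minimal check looks like:

```swift
import FoundationModels  // SystemLanguageModel lives here, not in NaturalLanguage

@available(iOS 26.0, macOS 26.0, *)
func testSystemModel() {
    let model = SystemLanguageModel.default
    print(model.availability)  // reports whether the on-device model is usable
}
```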
1
0
279
Jun ’25
iOS 18.2 beta
I have recently been having trouble with my iOS 18.2 beta update. It has been 2 weeks since I updated to the iOS 18.2 beta and joined the Genmoji and Image Playground waitlist. I am wondering how much longer I have to wait until my request is approved.
1
0
773
Jan ’26
Foundation Models reliable for medicine purposes?
How reliable are the models when used on something like a cholesterol test, to inform, for example, whether it is worth going to see a doctor? I would like to use a Tool to attach simple blood test data to the session, so the model can analyze it and make a simple suggestion on whether it is necessary to see a doctor, etc. P.S.: local model.
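Leaving the medical-reliability question aside, the mechanics of attaching blood test data look roughly like the sketch below; the tool name, argument, and hardcoded values are illustrative, and the exact Tool protocol requirements (e.g., the output type of call) should be checked against the current FoundationModels SDK:

```swift
import FoundationModels

// Hypothetical tool exposing stored blood test values to the session.
struct BloodTestTool: Tool {
    let name = "getBloodTest"
    let description = "Returns the user's latest blood test values"

    @Generable
    struct Arguments {
        @Guide(description: "The marker to look up, e.g. cholesterol")
        var marker: String
    }

    func call(arguments: Arguments) async throws -> String {
        // A real app would read stored results; hardcoded for the sketch.
        "Total cholesterol: 212 mg/dL (reference range < 200 mg/dL)"
    }
}

// The session can now call the tool when the prompt asks about test results.
let session = LanguageModelSession(tools: [BloodTestTool()])
```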
1
0
215
Jun ’25
Unified Use Case Mail Categories & Spam
Hi Apple product owners. I am missing a unified concept which might be derived from the use cases for mail categories and mail spam for the app "Mail" on Mac. I need a recommendation on how to use categories in combination with the spam filter to get most out of it. So I was looking for the use cases for the 2 functionality areas in order to figure out how to organise my mails by using as much automation as possible before I start creating intelligent folders in addition. What can you recommend where I get this information from? I don't want to guess or read a lot of forum contributions which are based on guesses.
Replies
1
Boosts
0
Views
97
Activity
Apr ’25
What is the Foundation Models support for basic math?
I am experimenting with Foundation Models in my time tracking app to analyze users' tracked events, but I am finding that the model struggles with even basic computation of time, specifically converting from seconds to hours and minutes. To give just one example, when I prompt: "Convert 3672 seconds to hours, minutes, and seconds. Don't include the calculations in the resulting output" I get this: "3672 seconds is equal to 1 hour, 0 minutes, and 36 seconds". Which is clearly wrong: it should be 1 hour, 1 minute, and 12 seconds. Another issue I saw a lot is that seconds were treated as minutes, or that the hours were just completely off. What can I do to make the support for math better? Or is that just something the model is not meant to be used for?
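A common pattern here, offered as a sketch rather than an official recommendation: do the arithmetic deterministically in Swift and hand the model a pre-formatted string, so the model only does language work:

import Foundation

// Deterministic conversion; no model involved.
let totalSeconds = 3672
let hours = totalSeconds / 3600            // 1
let minutes = (totalSeconds % 3600) / 60   // 1
let seconds = totalSeconds % 60            // 12

// Feed the already-computed value into the prompt instead of the raw seconds,
// e.g. "Summarize: the user tracked \(formatted) on this task today."
let formatted = "\(hours) hour(s), \(minutes) minute(s), \(seconds) second(s)"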
Replies
1
Boosts
0
Views
218
Activity
Jun ’25
ImagePlayground API not working on Xcode Simulator Devices
Hi! I'm trying to use the ImagePlayground API in SwiftUI with the .imagePlaygroundSheet modifier. However, when the sheet is shown (in the preview or in the simulator) it displays the following message: "Image Playground is not available. Image Playground is not available on this iPhone.". I'm using an iPhone 16 Pro with iOS 18.3.1 in the Xcode (16.2) Simulator. Anyone else having this problem? How can I fix it?
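Image Playground requires Apple Intelligence support on the device, and the Simulator generally does not provide it, so testing on a physical device that supports Apple Intelligence is the usual route. A small sketch that gates the feature on the supportsImagePlayground environment value (PlaygroundButton is a hypothetical view name):

import SwiftUI
import ImagePlayground

struct PlaygroundButton: View {
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground
    @State private var isPresented = false
    @State private var createdImageURL: URL?

    var body: some View {
        if supportsImagePlayground {
            Button("Create Image") { isPresented = true }
                .imagePlaygroundSheet(isPresented: $isPresented) { url in
                    createdImageURL = url // file URL of the generated image
                }
        } else {
            // Shown on the Simulator and on devices without Apple Intelligence.
            Text("Image Playground is not available on this device.")
        }
    }
}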
Replies
1
Boosts
0
Views
200
Activity
Apr ’25
Threading issues when using debugger
Hi, I am modifying the sample camera app that is here: https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview ... In processPreviewImages, I am using the Vision APIs to generate a segmentation mask for a person/object, then compositing that person onto a different background (with some other filtering). The filtering and compositing are done via Core Image. At the end, I convert the CIImage to a CGImage, then to a SwiftUI Image. When I run it on my iPhone, it works fine and has not crashed. When I run it on the iPhone with the debugger attached, it crashes within a few seconds with: EXC_BAD_ACCESS in libRPAC.dylib`std::__1::__hash_table<std::__1::__hash_value_type<long, qos_info_t>, std::__1::__unordered_map_hasher<long, std::__1::__hash_value_type<long, qos_info_t>, std::__1::hash, std::__1::equal_to, true>, std::__1::__unordered_map_equal<long, std::__1::__hash_value_type<long, qos_info_t>, std::__1::equal_to, std::__1::hash, true>, std::__1::allocator<std::__1::__hash_value_type<long, qos_info_t>>>::__emplace_unique_key_args<long, std::__1::piecewise_construct_t const&, std::__1::tuple<long const&>, std::__1::tuple<>>: It had previously been working fine with the debugger, so I'm not sure what has changed. Is there a difference in how the Vision APIs are executed if the debugger is attached vs. not?
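For reference, here is a minimal sketch of the mask-and-composite pipeline described above; the helper name and the balanced quality level are assumptions, not the poster's actual code, and it does not explain the debugger-only crash, but it shows the Vision and Core Image calls involved:

import Vision
import CoreImage
import CoreImage.CIFilterBuiltins
import CoreVideo

// Hypothetical helper: segment the person in `pixelBuffer` and composite over `background`.
func compositePerson(in pixelBuffer: CVPixelBuffer, over background: CIImage) throws -> CIImage? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])
    guard let maskBuffer = request.results?.first?.pixelBuffer else { return nil }

    let foreground = CIImage(cvPixelBuffer: pixelBuffer)
    var mask = CIImage(cvPixelBuffer: maskBuffer)
    // The mask usually comes back at a lower resolution; scale it to the image extent.
    mask = mask.transformed(by: CGAffineTransform(
        scaleX: foreground.extent.width / mask.extent.width,
        y: foreground.extent.height / mask.extent.height))

    let blend = CIFilter.blendWithMask()
    blend.inputImage = foreground
    blend.backgroundImage = background
    blend.maskImage = mask
    return blend.outputImage
}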
Replies
1
Boosts
0
Views
402
Activity
Jan ’26
AXSpeech Crash
I have a terrible crash problem in my app when I use AVSpeechSynthesizer, and I can't reproduce it. Here is my code (it's a singleton):

- (void)stopSpeech {
    if ([self.synthesizer isPaused]) {
        return;
    }
    if ([self.synthesizer isSpeaking]) {
        BOOL isSpeech = [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        if (!isSpeech) {
            [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord];
        }
    }
    self.stopBlock ? self.stopBlock() : nil;
}

- (AVSpeechSynthesizer *)synthesizer {
    if (!_synthesizer) {
        _synthesizer = [[AVSpeechSynthesizer alloc] init];
        _synthesizer.delegate = self;
    }
    return _synthesizer;
}

When the user leaves the page, I call the stopSpeech method. Then I get a lot of crash reports. Here is a crash log:

# Crashlytics - plaintext stacktrace downloaded by liweican at Mon, 13 May 2019 03:03:24 GMT
# URL: https://fabric.io/youdao-dict/ios/apps/com.youdao.udictionary/issues/5a904ed88cb3c2fa63ad7ed3?time=last-thirty-days/sessions/b1747d91bafc4680ab0ca8e3a702c52c_DNE_0_v2
# Organization: zzz
# Platform: ios
# Application: U-Dictionary
# Version: 3.0.5.4
# Bundle Identifier: com.youdao.UDictionary
# Issue ID: 5a904ed88cb3c2fa63ad7ed3
# Session ID: b1747d91bafc4680ab0ca8e3a702c52c_DNE_0_v2
# Date: 2019-05-13T02:27:00Z
# OS Version: 12.2.0 (16E227)
# Device: iPhone 8 Plus
# RAM Free: 17%
# Disk Free: 64.6%

#19. Crashed: AXSpeech
0 libsystem_pthread.dylib 0x19c15e5b8 pthread_mutex_lock$VARIANT$armv81 + 102
1 CoreFoundation 0x19c4cf84c CFRunLoopSourceSignal + 68
2 Foundation 0x19cfc7280 performQueueDequeue + 464
3 Foundation 0x19cfc680c __NSThreadPerformPerform + 136
4 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
5 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
6 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256
7 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
8 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
9 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
10 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264
11 Foundation 0x19cfc66e4 __NSThread__start__ + 984
12 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
13 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
14 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

--
#0. com.apple.main-thread
0 libsystem_malloc.dylib 0x19c11ce24 small_free_list_remove_ptr_no_clear + 768
1 libsystem_malloc.dylib 0x19c11f094 small_malloc_from_free_list + 296
2 libsystem_malloc.dylib 0x19c11f094 small_malloc_from_free_list + 296
3 libsystem_malloc.dylib 0x19c11d63c small_malloc_should_clear + 224
4 libsystem_malloc.dylib 0x19c11adcc szone_malloc_should_clear + 132
5 libsystem_malloc.dylib 0x19c123c18 malloc_zone_malloc + 156
6 CoreFoundation 0x19c569ab4 __CFBasicHashRehash + 300
7 CoreFoundation 0x19c56b430 __CFBasicHashAddValue + 96
8 CoreFoundation 0x19c56ab9c CFBasicHashAddValue + 2160
9 CoreFoundation 0x19c49f3bc CFDictionaryAddValue + 260
10 CoreFoundation 0x19c572ee8 __54-[CFPrefsSource mergeIntoDictionary:sourceDictionary:]_block_invoke + 28
11 CoreFoundation 0x19c49f0b4 __CFDictionaryApplyFunction_block_invoke + 24
12 CoreFoundation 0x19c568b7c CFBasicHashApply + 116
13 CoreFoundation 0x19c49f090 CFDictionaryApplyFunction + 168
14 CoreFoundation 0x19c42f504 -[CFPrefsSource mergeIntoDictionary:sourceDictionary:] + 136
15 CoreFoundation 0x19c4bcd38 -[CFPrefsSearchListSource alreadylocked_getDictionary:] + 644
16 CoreFoundation 0x19c42e71c -[CFPrefsSearchListSource alreadylocked_copyValueForKey:] + 152
17 CoreFoundation 0x19c42e660 -[CFPrefsSource copyValueForKey:] + 60
18 CoreFoundation 0x19c579e88 __76-[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:]_block_invoke + 40
19 CoreFoundation 0x19c4bdff4 __108-[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:]_block_invoke + 272
20 CoreFoundation 0x19c4bda38 normalizeQuintuplet + 340
21 CoreFoundation 0x19c42c634 -[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:] + 108
22 CoreFoundation 0x19c42cec0 -[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:] + 148
23 CoreFoundation 0x19c57c2d0 _CFPreferencesCopyAppValueWithContainerAndConfiguration + 124
24 TextInput 0x1a450e550 -[TIPreferencesController valueForPreferenceKey:] + 460
25 UIKitCore 0x1c87c71f8 -[UIKeyboardPreferencesController handBias] + 36
26 UIKitCore 0x1c887275c -[UIKeyboardLayoutStar showKeyboardWithInputTraits:screenTraits:splitTraits:] + 320
27 UIKitCore 0x1c88f4240 -[UIKeyboardImpl finishLayoutChangeWithArguments:] + 492
28 UIKitCore 0x1c88f47c8 -[UIKeyboardImpl updateLayout] + 1208
29 UIKitCore 0x1c88eaad0 -[UIKeyboardImpl updateLayoutIfNecessary] + 448
30 UIKitCore 0x1c88eab9c -[UIKeyboardImpl setFrame:] + 140
31 UIKitCore 0x1c88d5d60 -[UIKeyboard activate] + 652
32 UIKitCore 0x1c894c90c -[UIKeyboardAutomatic activate] + 128
33 UIKitCore 0x1c88d5158 -[UIKeyboard setFrame:] + 296
34 UIKitCore 0x1c88d81b0 -[UIKeyboard _didChangeKeyplaneWithContext:] + 228
35 UIKitCore 0x1c88f4aa0 -[UIKeyboardImpl didMoveToSuperview] + 136
36 UIKitCore 0x1c8f2ad84 __45-[UIView(Hierarchy) _postMovedFromSuperview:]_block_invoke + 888
37 UIKitCore 0x1c8f2a970 -[UIView(Hierarchy) _postMovedFromSuperview:] + 760
38 UIKitCore 0x1c8f39ddc -[UIView(Internal) _addSubview:positioned:relativeTo:] + 1740
39 UIKitCore 0x1c88d5d84 -[UIKeyboard activate] + 688
40 UIKitCore 0x1c894c90c -[UIKeyboardAutomatic activate] + 128
41 UIKitCore 0x1c893b3a4 -[UIPeripheralHost(UIKitInternal) _reloadInputViewsForResponder:] + 1332
42 UIKitCore 0x1c8ae66d8 -[UIResponder(UIResponderInputViewAdditions) reloadInputViews] + 80
43 UIKitCore 0x1c8ae23bc -[UIResponder becomeFirstResponder] + 804
44 UIKitCore 0x1c8f2a560 -[UIView(Hierarchy) becomeFirstResponder] + 156
45 UIKitCore 0x1c8d93e84 -[UITextField becomeFirstResponder] + 244
46 UIKitCore 0x1c8d578dc -[UITextInteractionAssistant(UITextInteractionAssistant_Internal) setFirstResponderIfNecessary] + 192
47 UIKitCore 0x1c8d45d8c -[UITextSelectionInteraction oneFingerTap:] + 3136
48 UIKitCore 0x1c86e0bcc -[UIGestureRecognizerTarget _sendActionWithGestureRecognizer:] + 64
49 UIKitCore 0x1c86e8dd4 _UIGestureRecognizerSendTargetActions + 124
50 UIKitCore 0x1c86e6778 _UIGestureRecognizerSendActions + 316
51 UIKitCore 0x1c86e5ca4 -[UIGestureRecognizer _updateGestureWithEvent:buttonEvent:] + 760
52 UIKitCore 0x1c86d9d80 _UIGestureEnvironmentUpdate + 2180
53 UIKitCore 0x1c86d94b0 -[UIGestureEnvironment _deliverEvent:toGestureRecognizers:usingBlock:] + 384
54 UIKitCore 0x1c86d9290 -[UIGestureEnvironment _updateForEvent:window:] + 204
55 UIKitCore 0x1c8af14a8 -[UIWindow sendEvent:] + 3112
56 UIKitCore 0x1c8ad1534 -[UIApplication sendEvent:] + 340
57 UIKitCore 0x1c8b977c0 __dispatchPreprocessedEventFromEventQueue + 1768
58 UIKitCore 0x1c8b99eec __handleEventQueueInternal + 4828
59 UIKitCore 0x1c8b9311c __handleHIDEventFetcherDrain + 152
60 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
61 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
62 CoreFoundation 0x19c4d1b24 __CFRunLoopDoSources0 + 176
63 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
64 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
65 GraphicsServices 0x19e6cc79c GSEventRunModal + 104
66 UIKitCore 0x1c8ab7b68 UIApplicationMain + 212
67 UDictionary 0x10517e138 main (main.m:17)
68 libdyld.dylib 0x19bf928e0 start + 4

#1. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4

#2. com.apple.uikit.eventfetch-thread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
6 Foundation 0x19ce99e5c -[NSRunLoop(NSRunLoop) runUntilDate:] + 96
7 UIKitCore 0x1c8b9d540 -[UIEventFetcher threadMain] + 136
8 Foundation 0x19cfc66e4 __NSThread__start__ + 984
9 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
10 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
11 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#3. JavaScriptCore bmalloc scavenger
0 libsystem_kernel.dylib 0x19c0ddee4 __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x19c15d4a4 _pthread_cond_wait$VARIANT$armv81 + 628
2 libc++.1.dylib 0x19b6b5090 std::__1::condition_variable::wait(std::__1::unique_lock<std::__1::mutex>&) + 24
3 JavaScriptCore 0x1a36a2238 void std::__1::condition_variable_any::wait<std::__1::unique_lock<bmalloc::Mutex> >(std::__1::unique_lock<bmalloc::Mutex>&) + 108
4 JavaScriptCore 0x1a36a622c bmalloc::Scavenger::threadRunLoop() + 176
5 JavaScriptCore 0x1a36a59a4 bmalloc::Scavenger::Scavenger(std::__1::lock_guard<bmalloc::Mutex>&) + 10
6 JavaScriptCore 0x1a36a73e4 std::__1::__thread_specific_ptr<std::__1::__thread_struct>::set_pointer(std::__1::__thread_struct*) + 38
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#4. WebThread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 WebCore 0x1a5126480 RunWebThread(void*) + 600
6 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
7 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
8 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#5. com.twitter.crashlytics.ios.MachExceptionServer
0 UDictionary 0x1058a5564 CLSProcessRecordAllThreads (CLSProcess.c:376)
1 UDictionary 0x1058a594c CLSProcessRecordAllThreads (CLSProcess.c:407)
2 UDictionary 0x1058952dc CLSHandler (CLSHandler.m:26)
3 UDictionary 0x1058906cc CLSMachExceptionServer (CLSMachException.c:446)
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#6. com.apple.NSURLConnectionLoader
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 CFNetwork 0x19cae574c -[__CoreSchedulingSetRunnable runForever] + 216
6 Foundation 0x19cfc66e4 __NSThread__start__ + 984
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#7. AVAudioSession Notify Thread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 AVFAudio 0x1a238a378 GenericRunLoopThread::Entry(void*) + 156
6 AVFAudio 0x1a23b4c60 CAPThread::Entry(CAPThread*) + 88
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#8. WebCore: LocalStorage
0 libsystem_kernel.dylib 0x19c0ddee4 __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x19c15d4a4 _pthread_cond_wait$VARIANT$armv81 + 628
2 JavaScriptCore 0x1a3668ce4 ***::ThreadCondition::timedWait(***::Mutex&, ***::WallTime) + 80
3 JavaScriptCore 0x1a364f96c ***::ParkingLot::parkConditionallyImpl(void const*, ***::ScopedLambda<bool ()> const&, ***::ScopedLambda<void ()> const&, ***::TimeWithDynamicClockType const&) + 2004
4 WebKitLegacy 0x1a67b6ea8 bool ***::Condition::waitUntil<***::Lock>(***::Lock&, ***::TimeWithDynamicClockType const&) + 184
5 WebKitLegacy 0x1a67b9ba4 std::__1::unique_ptr<***::Function<void ()>, std::__1::default_delete<***::Function<void ()> > > ***::MessageQueue<***::Function<void ()> >::waitForMessageFilteredWithTimeout<***::MessageQueue<***::Function<void ()> >::waitForMessage()::'lambda'(***::Function<void ()> const&)>(***::MessageQueueWaitResult&, ***::MessageQueue<***::Function<void ()> >::waitForMessage()::'lambda'(***::Function<void ()> const&)&&, ***::WallTime) + 156
6 WebKitLegacy 0x1a67b91c0 WebCore::StorageThread::threadEntryPoint() + 68
7 JavaScriptCore 0x1a3666f88 ***::Thread::entryPoint(***::Thread::NewThreadContext*) + 260
8 JavaScriptCore 0x1a3668494 ***::wtfThreadEntryPoint(void*) + 12
9 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
10 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
11 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#9. com.apple.CoreMotion.MotionThread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 CoreFoundation 0x19c4cd0b0 CFRunLoopRun + 80
6 CoreMotion 0x1a1df0240 (Missing)
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#10. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4

#11. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c1611f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4

#12. com.apple.CFStream.LegacyThread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 CoreFoundation 0x19c4e5094 _legacyStreamRunLoop_workThread + 260
6 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
7 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
8 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#13. Thread
0 libsystem_pthread.dylib 0x19c163cd0 start_wqthread + 190

#14. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4

#15. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4

#16. Thread
0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#17. Thread
0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#18. Thread
0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

#19. Crashed: AXSpeech
0 libsystem_pthread.dylib 0x19c15e5b8 pthread_mutex_lock$VARIANT$armv81 + 102
1 CoreFoundation 0x19c4cf84c CFRunLoopSourceSignal + 68
2 Foundation 0x19cfc7280 performQueueDequeue + 464
3 Foundation 0x19cfc680c __NSThreadPerformPerform + 136
4 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
5 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
6 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256
7 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
8 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
9 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
10 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264
11 Foundation 0x19cfc66e4 __NSThread__start__ + 984
12 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
13 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
14 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#20. AXSpeech
0 (Missing) 0x1071ba524 (Missing)
1 (Missing) 0x1071b3e7c (Missing)
2 (Missing) 0x10718fba4 (Missing)
3 (Missing) 0x107184bc8 (Missing)
4 libdyld.dylib 0x19bf95908 dlopen + 176
5 CoreFoundation 0x19c5483e8 _CFBundleDlfcnLoadBundle + 140
6 CoreFoundation 0x19c486918 _CFBundleLoadExecutableAndReturnError + 352
7 Foundation 0x19ced5734 -[NSBundle loadAndReturnError:] + 428
8 TextToSpeech 0x1abfff800 TTSSpeechUnitTestingMode + 1020
9 libdispatch.dylib 0x19bf817d4 _dispatch_client_callout + 16
10 libdispatch.dylib 0x19bf52040 _dispatch_once_callout + 28
11 TextToSpeech 0x1abfff478 TTSSpeechUnitTestingMode + 116
12 libobjc.A.dylib 0x19b7173cc CALLING_SOME_+initialize_METHOD + 24
13 libobjc.A.dylib 0x19b71cee0 initializeNonMetaClass + 296
14 libobjc.A.dylib 0x19b71e640 initializeAndMaybeRelock(objc_class*, objc_object*, mutex_tt<false>&, bool) + 260
15 libobjc.A.dylib 0x19b7265a4 lookUpImpOrForward + 244
16 libobjc.A.dylib 0x19b733858 _objc_msgSend_uncached + 56
17 libAXSpeechManager.dylib 0x1ac167324 -[AXSpeechManager _initialize] + 68
18 Foundation 0x19cfc68d4 __NSThreadPerformPerform + 336
19 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
20 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
21 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256
22 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
23 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
24 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
25 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264
26 Foundation 0x19cfc66e4 __NSThread__start__ + 984
27 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
28 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
29 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

I changed my code like this; it still has the same problem:

- (void)stopSpeech {
    if (self.synthesizer != nil && [self.synthesizer isPaused]) {
        return;
    }
//    if ([self.synthesizer isSpeaking]) {
//        BOOL isSpeech = [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
//        if (!isSpeech) {
//            [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord];
//        }
//    }
    if (self.synthesizer != nil) {
        [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
//        if (!isSpeech) {
//            [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord];
//        }
        self.stopBlock ? self.stopBlock() : nil;
    }
}
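The crashed thread is the accessibility speech thread, which suggests the synthesizer may be signaled or torn down on one thread while another is still using it. As a hedged sketch (an assumption about the cause, not a confirmed fix), one approach is to funnel every call to the synthesizer through the main queue so creation, speaking, and stopping are serialized; SpeechController and stopBlock mirror the singleton above but are otherwise hypothetical, shown here in Swift:

import AVFoundation

final class SpeechController: NSObject, AVSpeechSynthesizerDelegate {
    static let shared = SpeechController()

    // Created lazily on first use; only ever touched on the main queue.
    private lazy var synthesizer: AVSpeechSynthesizer = {
        let synthesizer = AVSpeechSynthesizer()
        synthesizer.delegate = self
        return synthesizer
    }()

    var stopBlock: (() -> Void)?

    func stopSpeech() {
        // Serialize all synthesizer access on the main queue.
        DispatchQueue.main.async {
            guard !self.synthesizer.isPaused else { return }
            if self.synthesizer.isSpeaking {
                self.synthesizer.stopSpeaking(at: .immediate)
            }
            self.stopBlock?()
        }
    }
}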
Replies
1
Boosts
1
Views
2.5k
Activity
4d