Accelerometer sampling rate limits for third-party iOS apps

Hello everyone,

I am developing an iOS application that relies on accelerometer data for precise motion and reaction-time measurements.

Based on practical testing, it appears that third-party iOS applications receive accelerometer data at a maximum rate of approximately 100 Hz, regardless of hardware capabilities or requested update intervals.

I would like to ask for clarification on the following points:

Is there an officially supported way for third-party iOS apps to access accelerometer data at sampling rates higher than ~100 Hz?

If the hardware supports higher sampling rates, is this limitation intentionally enforced at the iOS level for third-party applications?

Are there any public APIs, entitlements, or documented approaches that allow access to higher-frequency sensor data, or is this restricted to system/internal components only?

Thank you in advance for any clarification.

Is there an officially supported way for third-party iOS apps to access accelerometer data at sampling rates higher than ~100 Hz?

So, my first question here is why you think you need a higher sampling rate? I ask because every developer who's asked this question has done so having gone through the same general development experience:

  1. They built an initial motion analysis engine which "sort of" worked, but had strange/inconsistent results and/or "glitches".

  2. They increased the update rate and the situation improved marginally, but not completely.

  3. They continued increasing the update rate and saw improvements without actually resolving the issue, until they hit the limit.

  4. Based on that experience, they've decided that the problem is caused by update frequency and if they can JUST get data a little bit faster everything will work fine.

The problem here is that, in my experience, that entire analysis was built on the fundamentally false premise that the update frequency was actually the source of the problem.

In actuality, the real problem was caused by timing issues introduced by a combination of factors. More specifically:

  • Their analysis engine ignored the "timestamp" property of the data, treating all events as a "stream" of constant data delivered at a fixed interval.

  • The threading pattern of their app introduced unexpected delivery "gaps", making the timing between events inconsistent.

  • The details of their event delivery and/or data processing further distorted the data, making the existing timing gaps even worse.
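To make the first bullet concrete, here's a minimal sketch in plain Swift (a hypothetical `Sample` struct stands in for CMAccelerometerData, whose real `timestamp` property is seconds since boot) contrasting the "fixed interval" assumption with timestamp-aware math:

```swift
import Foundation

// Hypothetical stand-in for CMAccelerometerData; `timestamp` mirrors
// the real property (seconds since device boot, not wall-clock time).
struct Sample {
    let timestamp: TimeInterval
    let x: Double
}

// Naive approach: assume every event arrived exactly `nominalInterval`
// apart. Any delivery gap silently corrupts anything integrated over time.
func naiveVelocityX(_ samples: [Sample], nominalInterval: TimeInterval) -> Double {
    samples.dropFirst().reduce(0) { $0 + $1.x * nominalInterval }
}

// Timestamp-aware approach: derive dt from consecutive timestamps, so a
// 40 ms delivery gap contributes 40 ms of integration, not 10 ms.
func timestampAwareVelocityX(_ samples: [Sample]) -> Double {
    var v = 0.0
    for (prev, cur) in zip(samples, samples.dropFirst()) {
        v += cur.x * (cur.timestamp - prev.timestamp)
    }
    return v
}
```

With a perfectly regular stream the two agree; insert one delivery gap and they diverge immediately, which is exactly the kind of "glitch" described above.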

Increasing the update rate marginally "improves" results, but that's primarily because the increased number of events tends to make smaller data gaps less "obvious". It will never really resolve the issue, regardless of how high you make the rate. Indeed, higher update rates actually start to introduce new problems, as they further complicate the threading issues.

Next, I want to return to what you said here:

...it appears that third-party iOS applications receive accelerometer data.

The word "receive" is actually a trap that many motion developers fall into. The implicit assumption here is that motion analysis is a "real-time" activity, so what matters is getting the data as quickly as you can. The problem is that this just isn't true. More specifically:

  • Unless you're doing motion controls, you're updating the screen FAR less frequently than the update rate. For most apps, polling CMSensorRecorder 5-20 times/sec looks exactly the same as "real-time" updates.

  • If you are doing motion controls, then you shouldn't be messing around with any of this and should just poll the motion properties, exactly as described here.

  • In many cases, the user isn't actually looking at their device or receiving any truly "real-time" feedback, so when the analysis actually happens doesn't REALLY matter at all.

In any case, there are basically two solutions to this:

(1) Use CMSensorRecorder (Easy Way):

CMSensorRecorder lets you sidestep this entire issue by providing you with an undistorted event list gathered at a precise, fixed rate (50 Hz), allowing you to basically ignore all of the issues I outlined above.
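For reference, a minimal sketch of that flow, assuming accelerometer recording is available on the device and the usual motion authorization is in place. `fetchWindow` is a hypothetical convenience of my own; the `Sequence` shim is the common way to iterate CMSensorDataList (which only exposes NSFastEnumeration) from Swift:

```swift
import Foundation

// Hypothetical convenience: the window of recorded data to fetch.
func fetchWindow(endingAt end: Date, length: TimeInterval) -> (start: Date, end: Date) {
    (start: end.addingTimeInterval(-length), end: end)
}

#if canImport(CoreMotion)
import CoreMotion

// CMSensorDataList only exposes NSFastEnumeration, so a small Sequence
// shim is the usual way to iterate it from Swift.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        NSFastEnumerationIterator(self)
    }
}

func recordThenAnalyze() {
    guard CMSensorRecorder.isAccelerometerRecordingAvailable() else { return }
    let recorder = CMSensorRecorder()

    // Ask the system to record accelerometer data (at 50 Hz) for the
    // next 20 minutes; recording continues even if the app suspends.
    recorder.recordAccelerometer(forDuration: 20 * 60)

    // Later (even on a subsequent launch), pull the stored samples back
    // and analyze them at whatever pace you like.
    let window = fetchWindow(endingAt: Date(), length: 20 * 60)
    guard let list = recorder.accelerometerData(from: window.start, to: window.end) else { return }
    for element in list {
        guard let sample = element as? CMRecordedAccelerometerData else { continue }
        // sample.startDate + sample.acceleration form an undistorted,
        // precisely timed event list.
        _ = (sample.startDate, sample.acceleration.x)
    }
}
#endif
```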

(2) Fix the issues above (Hard Way):

More specifically, that means first reworking your motion analysis engine so that it properly accounts for event timing, including interpolating between motion events to fill those gaps. You'll then need to tune your event delivery system to minimize the distortion it introduces.
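One common way to "account for event timing" is to resample the irregular event stream onto a uniform grid with linear interpolation. A sketch in plain Swift (a hypothetical `Sample` struct stands in for CMAccelerometerData; timestamps are assumed strictly increasing):

```swift
import Foundation

// Hypothetical stand-in for CMAccelerometerData.
struct Sample {
    let timestamp: TimeInterval
    let x: Double
}

// Linearly resample an irregular, timestamp-ordered stream onto a fixed
// grid. Delivery gaps get filled by interpolation instead of silently
// shifting every later event earlier in time.
func resample(_ samples: [Sample], to interval: TimeInterval) -> [Sample] {
    guard let first = samples.first, let last = samples.last,
          first.timestamp < last.timestamp else { return samples }
    var out: [Sample] = []
    var i = 0                       // index of the current segment's left endpoint
    var t = first.timestamp
    while t <= last.timestamp + 1e-9 {
        // Advance to the segment [samples[i], samples[i+1]] containing t.
        while i + 2 < samples.count && samples[i + 1].timestamp < t { i += 1 }
        let a = samples[i], b = samples[i + 1]
        let f = (t - a.timestamp) / (b.timestamp - a.timestamp)
        out.append(Sample(timestamp: t, x: a.x + f * (b.x - a.x)))
        t += interval
    }
    return out
}
```

After resampling, the rest of the analysis engine can safely assume a fixed interval, because the data now actually has one.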

Critically, if you decide to go this route, then you're almost certainly going to need to introduce some form of "batch processing" model instead of trying to process each event as it arrives. That's because:

  • If your motion analysis has any complexity, then doing that analysis on the receiving queue risks stalling data delivery.

  • Transferring every motion event off the receiving thread means generating 100 dispatches/sec, creating significant thread churn/noise, wasting energy, and adding more thread chaos that you need to compensate for.

...and the only good way to address those issues is to have the receiving thread queue up data, then deliver that data in batches. In practice, that ends up working a lot like an implementation that simply polled CMSensorRecorder at X/second.
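That batching pattern can be sketched like this in plain Swift (the names are mine; in a real app, `add` would be called once per event on the CoreMotion delivery queue, and `process` would hand the batch to your analysis queue):

```swift
import Foundation

// Accumulates events on the (serial) receiving queue and hands them to
// the analysis stage in batches, so heavy processing never runs on the
// delivery queue and you don't pay a cross-thread hop per event.
final class MotionBatcher<Event> {
    private var pending: [Event] = []
    private let batchSize: Int
    private let process: ([Event]) -> Void

    init(batchSize: Int, process: @escaping ([Event]) -> Void) {
        self.batchSize = batchSize
        self.process = process
    }

    // Called once per event on the receiving queue: a cheap append only.
    func add(_ event: Event) {
        pending.append(event)
        if pending.count >= batchSize { flush() }
    }

    // Hands the accumulated batch off. Shown synchronously for clarity;
    // a real app would dispatch the batch to a processing queue here.
    func flush() {
        guard !pending.isEmpty else { return }
        let batch = pending
        pending.removeAll(keepingCapacity: true)
        process(batch)
    }
}
```

At 100 events/sec with a batch size of 10, the processing side wakes 10 times per second instead of 100, which is exactly the "poll at X/second" shape described above.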

Is there an officially supported way for third-party iOS apps to access accelerometer data at sampling rates higher than ~100 Hz?

No.

If the hardware supports higher sampling rates, is this limitation intentionally enforced at the iOS level for third-party applications?

Yes, and not just for 3rd party apps. The issue here is that, on top of all the issues I outlined above, waking a thread 100 times per second is at the outer edge of what would generally be considered "reasonable" behavior. Notably, the kernel starts complaining at 150 thread wakes/second, and 100 Hz means you're already 2/3 of the way there. Earlier I mentioned that you NEED to be batching events for processing, and that's because dispatching every motion event to a second thread basically means 200 wakeups/second. An API that allowed sustained collection at higher rates would require a structure that looked a lot like CMSensorRecorder, not the standard motion APIs.

Speaking of which... the reason CMSensorRecorder runs at 50/sec is that that's the "standard" interval we collect data at, so it lets us consolidate "all" of the data collection into a single store.

That leads to here:

Are there any public APIs, entitlements, or documented approaches that allow access to higher-frequency sensor data?

No, nothing like this exists.

or is this restricted to system/internal components only?

As I outlined above, higher frequency updates are rarely as beneficial as they might seem. It's a big system, and I can't claim to have looked in great detail, but much of our own code actually collects at a lower frequency (generally between 15-60/sec), and I'm not aware of any case where we're collecting at higher than 100/sec.

__
Kevin Elliott
DTS Engineer, CoreOS/Hardware
