Ever tried to finish an article mid-ride and felt your stomach revolt? That unpleasant disconnect, your eyes fixed on a still screen while your inner ear senses motion, is exactly what Google is trying to fix.

Google has been quietly working on a feature provisionally called Motion Cues (and possibly rebranded to Motion Assist) that aims to reduce motion sickness when you use your phone in a moving vehicle. The idea is simple and clever: add subtle, moving visual elements on the edges of the screen that track the vehicle’s motion, giving your brain a visual cue to match what your body feels.

How it works — the dots, the sensors, and the trick

Motion Cues uses data from your phone’s accelerometer and gyroscope to animate tiny dots (or other subtle indicators) along the display edges. When the car accelerates, brakes, or corners, those dots shift in real time. That visual motion reduces the sensory conflict that often causes nausea, making it easier to read or watch videos on a bumpy train or in the back seat of an Uber.
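The core trick can be sketched in a few lines. The snippet below is a hypothetical illustration, not Google's implementation: it assumes a `DotAnimator` class (my own name) that low-pass filters lateral acceleration samples and scales them into a pixel offset for the edge dots. The filter constant and gain are made-up values.

```kotlin
// Hypothetical sketch of mapping accelerometer input to edge-dot motion.
// The real Motion Cues internals are unknown; this only illustrates the idea.
class DotAnimator(
    private val smoothing: Double = 0.8, // assumed low-pass factor
    private val gain: Double = 12.0      // assumed m/s^2 -> pixel scale
) {
    private var filtered = 0.0

    // lateralAccel: a lateral acceleration sample in m/s^2
    // (on Android this would come from SensorManager; omitted here).
    // Returns a horizontal offset in pixels for the edge dots.
    fun offsetFor(lateralAccel: Double): Double {
        // Low-pass filter to suppress road-vibration noise.
        filtered = smoothing * filtered + (1 - smoothing) * lateralAccel
        return filtered * gain
    }
}

fun main() {
    val animator = DotAnimator()
    // Simulate a sustained gentle turn: constant 2 m/s^2 lateral acceleration.
    var offset = 0.0
    repeat(20) { offset = animator.offsetFor(2.0) }
    println("dot offset after sustained turn: %.1f px".format(offset))
}
```

The smoothing step matters: raw accelerometer data is noisy, and jittery dots would likely make the sensory mismatch worse, not better.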

This isn’t entirely new territory: Apple introduced Vehicle Motion Cues in iOS 18, and independent Android apps such as KineStop have offered a similar fix for years. What Google appears to be building is a native, system-level implementation so users don’t need a third‑party app or special permissions to get relief.

The overlay problem — why it didn’t launch sooner

Engineers found Motion Cues code tucked away in Play Services and Android Canary builds, but the early implementation hit a practical barrier: Android prevents ordinary apps from drawing over sensitive system screens (lock screen, Quick Settings, notifications, Settings) for security reasons. The initial Motion Cues used the standard overlay API, so the dots would vanish whenever a system panel was visible — exactly the moments you might still need them.

Rather than ship a half-baked overlay, Google appears to be moving the rendering into SystemUI. The code references (MotionCuesService, IMotionCuesCallback, MotionCuesData, and MotionCuesSettings) suggest Play Services would compute positions and styling while SystemUI draws the visuals on a privileged layer. That change sidesteps overlay limitations while keeping the display reliable across the UI.
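That split could look roughly like the sketch below. The names mirror the identifiers seen in the teardown, but every signature here is a guess; the actual interfaces would be AIDL definitions crossing the Play Services/SystemUI process boundary, which this single-process Kotlin sketch only approximates.

```kotlin
// Hypothetical sketch of the compute/render split described above.
// Real MotionCuesService / IMotionCuesCallback shapes are not public.

// Data computed by Play Services: dot positions and styling.
data class MotionCuesData(val offsets: List<Double>, val opacity: Double)

// Implemented by SystemUI, which draws on a privileged layer.
fun interface IMotionCuesCallback {
    fun onCuesUpdated(data: MotionCuesData)
}

class MotionCuesService {
    private val callbacks = mutableListOf<IMotionCuesCallback>()

    fun register(cb: IMotionCuesCallback) { callbacks += cb }

    // Called on each sensor tick: push fresh positions to the renderer.
    fun publish(data: MotionCuesData) {
        callbacks.forEach { it.onCuesUpdated(data) }
    }
}
```

Because SystemUI owns the drawing, the dots can persist over the lock screen, Quick Settings, and notifications, which an app-level overlay cannot reach.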

To stop random apps from cluttering the view or spoofing system UI, the API includes a new permission (DRAWMOTIONCUES) that’s restricted to privileged, platform-signed apps. In other words: Google wants the dots to be available everywhere without opening the door to malicious overlays — a balance between usability and security. For readers interested in how seriously vendors treat system-level security, related fixes and protections have been central to recent platform patching efforts like those described in two zero-day fixes that forced Apple and Google to rush updates.
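For context, Android gates system-only capabilities like this with manifest protection levels. The snippet below is a hypothetical sketch of how such a declaration could look; the real declaration has not been published, and both the fully qualified permission name and the `protectionLevel` value are assumptions modeled on how Android restricts comparable permissions.

```xml
<!-- Hypothetical sketch only: the actual platform declaration for
     DRAWMOTIONCUES is not public. "signature|privileged" is how Android
     typically restricts a permission to platform-signed or priv-apps. -->
<permission
    android:name="android.permission.DRAWMOTIONCUES"
    android:protectionLevel="signature|privileged" />

<!-- A privileged system app (e.g. SystemUI) would then request it: -->
<uses-permission android:name="android.permission.DRAWMOTIONCUES" />
```

A third-party app that added the `uses-permission` line would simply never be granted it, which is the whole point of the gate.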

Where and when you might see it

Because the feature relies on a system-level API, it likely needs a full OS update. Reports point to Android 17 as the most probable landing spot, though a late Android 16 quarterly release could theoretically include it. Google may debut Motion Cues (or Motion Assist) on Pixel devices first, with OEMs like Samsung adopting it within their One UI 9 / Android 17 builds.

The company also appears cautious about rollout. Only system apps will be granted the new permission, and Google could pair Motion Cues with a broader Transiting mode that auto-adjusts settings when the phone detects you’re traveling.

Real-world questions: will it actually help everyone?

A few practical points matter:

  • Sensor accuracy: Older phones with noisy gyros or poor sampling may give less-effective cues. Expect initial limitations on legacy devices.
  • Customization: Leaks and teardowns hint Google might allow intensity, color, and shape adjustments, which would be useful — people’s sensitivity varies.
  • Privacy and power: Continuous sensor reading raises questions about battery drain and whether motion data is logged. Google will likely make this opt-in, mirroring recent privacy-first patterns.

If you want an immediate workaround, KineStop on the Play Store already provides the overlay-based approach (grant it display-over-apps permission and test if it helps on your device).

Competition and context

Apple’s Vehicle Motion Cues gave this concept mainstream attention in iOS 18, and third‑party apps have proven the idea can work. Native implementation on Android matters because it removes permission friction and makes the feature reliable across system screens. For commuters and families who rely on screen time while traveling, even a modest reduction in nausea could be transformative.

Google’s careful approach — moving rendering into SystemUI and gating access with a restricted permission — shows how accessibility features often intersect with platform security. That slow, cautious path explains the delay between discovery in Canary builds and a public release.

A future where you can finish a message on a twisting highway without queasiness is within reach. When Motion Cues arrives, it may not be flashy. It just might let you keep reading.

Tags: Android, Motion Cues, Accessibility, Android 17, Motion Sickness