
Simulation Sickness, VR Sickness, and How to Fix Both

  • Published on: September 16, 2025
  • Updated on: September 16, 2025
  • Reading Time: 6 mins

Authored By: Tiger Louck, XR Developer

The first thing many users notice when putting on a VR headset for the first time is the unreal sensation of being in a new, virtual world.

The second thing they usually notice is that they’re getting dizzy.

Education teams love the wow factor of VR. They do not love session drop‑offs, early exits, and “I can’t continue” reports. This guide explains the two kinds of discomfort your learners may feel in stereo VR, why they happen, and what to build so your programs stay usable. The goal is simple: more learners completing more minutes, with fewer sick notes.

Motion sickness and VR sickness are a constant consideration within the panoply of accessibility concerns in XR development, because they represent one of the great barriers to adoption of XR (and especially VR) technologies. This concern can torpedo virtually any design objective: whether it's a flight simulator or an art project, emesis pretty much always gets in the way of what you're trying to do.

Before we get into how to mitigate motion sickness in VR, let’s get some terminology out of the way.

This blog post will tackle accessibility challenges relating to stereo-rendered platforms, meaning virtual environments that display two views to the user, one for each eye. Within this context, two types of "sickness" can be observed, and we will explore their history, causes, and mitigation.

[Image: A person interacting with a VR headset with raised hands.]

Part 1: Simulation Sickness

The first type of sickness users can experience in VR is "Simulation Sickness." This is a type of motion sickness and, as a result, shares the same root cause as familiar forms like carsickness and seasickness.

Humans have two main methods of understanding their position and orientation in the world. Visual orientation uses reference points, such as a horizon, to produce the sensation of motion. Vestibular sensation does the same thing using the sense of inertia in specific organs, primarily in the inner ear. These two systems normally complement each other, each providing the best information in different situations.

Simulation sickness occurs when there is a consistent difference in the understanding of the world between these systems. Reading in the backseat of a car, for instance, tends to induce motion sickness because the visual system registers no motion (you're looking down into the static interior of the car), but the vestibular system registers the acceleration of the car as it moves down the road.

In VR, the inverse happens: the user visually registers the sensation of motion while the vestibular system feels no such thing (this is called “vection” and is a common problem among pilots).

Mitigating motion sickness in VR applications involves reducing this sensory incongruity. The easiest way to do this is just…don’t move the user within the virtual world.

3 Tips to Reduce Motion Sickness Within Your VR World

1. Try the Tunnel Vision Effect

A lot of experiences require motion within the virtual space (it’s hard not to move when learning to drive a truck), so we need to get more clever. For example, you may have seen a VR experience where virtual motion is accompanied by a vignette or “tunnel vision” effect. This effect restricts the user’s field of view to stop them from registering the world of the simulation as a reference point for their sense of orientation.

Adding a Tunnel Vision effect can have mixed results. Users are still focused on what little of the world they can see while they navigate, and the disorientation from having to move around with blinders on can be just as counterproductive.
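If you do ship a vignette, it tends to work better when its strength follows the rig's virtual speed rather than snapping on and off. Here is a minimal sketch of that idea in TypeScript; the `ComfortVignette` interface and the tuning constants are illustrative stand-ins, not any particular engine's API.

```typescript
interface ComfortVignette {
  /** 1.0 = fully open; lower values narrow the visible field. */
  setRadius(radius: number): void;
}

const MAX_SPEED = 3.0;   // m/s of virtual motion at which the vignette is tightest
const MIN_RADIUS = 0.4;  // never close the view entirely
const SMOOTHING = 8.0;   // higher = faster response

let currentRadius = 1.0;

function updateVignette(vignette: ComfortVignette, rigSpeed: number, dt: number): void {
  // Map virtual speed to a target radius: stationary = fully open, fast = narrow.
  const t = Math.min(rigSpeed / MAX_SPEED, 1.0);
  const target = 1.0 - t * (1.0 - MIN_RADIUS);

  // Ease toward the target so the vignette never snaps shut, which can
  // itself be jarring.
  currentRadius += (target - currentRadius) * Math.min(SMOOTHING * dt, 1.0);
  vignette.setRadius(currentRadius);
}
```

Because the easing runs every frame, the view narrows gradually as the user accelerates and reopens as soon as they stop.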

2. Add a Static Reference Frame

Providing a static reference frame within the simulation could be a more effective method. I mentioned truck driving earlier, which is a great example. The user does not move relative to the cab of the vehicle, so their reference frame becomes the truck, instead of the outside world.
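In engine terms, this usually amounts to parenting the player rig to the moving vehicle so that the cab, not the outside world, defines "stationary." A minimal sketch of that idea, assuming a hypothetical `Transform` scene-graph node:

```typescript
interface Transform {
  parent: Transform | null;
  localPosition: { x: number; y: number; z: number }; // meters, relative to parent
}

function seatPlayerInCab(playerRig: Transform, cab: Transform): void {
  // Once parented, the rig inherits the truck's motion automatically: the
  // player's position relative to the cab stays fixed, so the visible
  // interior gives the visual system a stable anchor while the outside
  // world streams past the windows.
  playerRig.parent = cab;
  playerRig.localPosition = { x: 0, y: 1.2, z: 0.3 }; // driver's-seat offset (illustrative)
}
```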

3. Try World Grabbing

My personal favorite way of solving this problem, at least within the context of individual locomotion, is world-grabbing. Rather than simply sliding around at the push of a joystick, the user moves by grabbing and pulling on the world with their controllers, either through designated handles and surfaces or by grabbing ahold of thin air (to see examples of this in action, check out Gorilla Tag and Hotdogs, Horseshoes, and Hand Grenades).
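The core of the technique is simple: while the grab is held, move the rig opposite to the hand's displacement so the grabbed point stays pinned under the hand. A minimal sketch, with hand positions assumed to be in tracking space and rig rotation ignored for brevity; all types here are illustrative:

```typescript
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });

let prevHandPos: Vec3 | null = null; // hand position last frame, in tracking space

function updateWorldGrab(grabHeld: boolean, handPos: Vec3, rigPos: Vec3): Vec3 {
  if (!grabHeld) {
    prevHandPos = null; // grab released; the next grab starts a fresh gesture
    return rigPos;
  }
  if (prevHandPos === null) {
    prevHandPos = handPos; // first frame of the grab
    return rigPos;
  }
  // The hand moved by `delta` in tracking space, so move the rig by `-delta`
  // in world space: the grabbed point stays pinned under the hand, and all
  // motion maps 1:1 to the user's real arm movement.
  const delta = sub(handPos, prevHandPos);
  prevHandPos = handPos;
  return add(rigPos, { x: -delta.x, y: -delta.y, z: -delta.z });
}
```

Because motion maps 1:1 to a deliberate physical gesture, users tend to anticipate and tolerate it far better than joystick sliding.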

 

Part 2: Virtual Reality Sickness

Didn’t we just talk about Virtual Reality Sickness? Incredibly, no. Simulation sickness is a type of motion sickness, and virtual reality sickness is most closely described as a complex type of eye strain.

Let’s do a quick exercise: hold up a finger about six inches from one eye, close the other eye, and focus on that finger. Note that the finger is in focus, but whatever is behind it is blurry. If you focus on what is behind it, the finger instead becomes blurry.

This exercise demonstrates that when a human looks at something, there are two types of focus going on:

  • Convergence, which is the pointing of both eyes at the same spot.
  • Accommodation, in which the lens inside your eye changes shape to bring different depths into focus.

A quirk of stereo rendering is that it can only represent depth through convergence; from the perspective of an individual eye, the entire image sits at a single focal depth. The resulting mismatch is called the vergence-accommodation conflict.
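To see the conflict in numbers, compare where the eyes converge with where the optics force them to focus. A back-of-the-envelope sketch, using an illustrative IPD and focal distance (actual values vary by person and headset):

```typescript
const IPD = 0.063;          // interpupillary distance in meters (typical adult)
const FOCAL_DISTANCE = 1.5; // fixed focal plane of the headset optics, in meters (illustrative)

// Vergence angle for an object rendered at distance d (meters).
const vergenceAngleDeg = (d: number): number =>
  2 * Math.atan(IPD / (2 * d)) * (180 / Math.PI);

for (const d of [0.25, 0.5, 1.5, 10]) {
  console.log(
    `rendered depth ${d} m: eyes converge at ${vergenceAngleDeg(d).toFixed(1)}°, ` +
    `but the lenses still focus at ${FOCAL_DISTANCE} m`,
  );
}
```

The closer a virtual object is rendered, the wider that gap grows, which is one reason very close virtual objects are particularly fatiguing.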

Sensitivity to this effect is less common than regular simulation sickness, but it is harder to mitigate. A small number of users cannot so much as put on a headset without near-instant headaches.

How to Mitigate Virtual Reality Sickness

This effect can only be mitigated by displays that are capable of representing true focal depth. These include technologies like light field displays or Meta's Butterscotch Varifocal and Half-Dome prototypes, none of which are ready for mass production (and light field displays, in particular, have only recently exited the realm of science fiction).

If you’re looking for the next big leap in XR tech, pay attention to the solutions that appear for this problem.

 

Questions to Ask Your VR Vendor

1. Which locomotion options ship by default, and which are optional in settings?

2. What comfort settings are user‑adjustable in‑app?

3. What is your plan for users who cannot tolerate headsets on day one?

4. How do you collect and report comfort metrics in pilots?

5. What is the accuracy and noise of the position tracking for your hardware?

VR can help learners master hard tasks faster, but only if they can stay in the experience. Design to reduce visual–inner‑ear disagreement and to avoid eye‑strain triggers. Give users control over comfort settings and a seatbelt in the form of a stable frame, snap turns, and short first sessions. Offer a desktop path for those who need it. Pilot with diverse users and track comfort like a core outcome. When you do that, you turn VR from a one‑time demo into a repeatable learning tool that people finish.

Magic EdTech designs simulations that learners can finish. Our team builds comfort‑first VR with stable environments, snap turns, optional teleport or grab‑to‑move, clear UI, and desktop parity for sensitive users. If you are scoping a new module or seeing early exits, we can help.

Set up a quick review or ask for a short playable demo built to your use case.

 

Written By:

Tiger Louck

XR Developer

Tiger Louck is a future-forward XR engineer driving immersive innovation at the intersection of emerging tech and spatial experience. With deep expertise in Unity development, game design, and AR/VR systems, Tiger has architected and delivered next-gen simulations and learning environments across education, gaming, and industrial sectors. He brings precision to performance optimization, user interface design, and content engineering, translating complex spatial challenges into intuitive, high-impact XR solutions. From construction overlays to community-centered VR experiences, Tiger thrives in agile pipelines, building what’s next in real-time 3D interaction.

FAQs

How can you make a learner's first VR sessions more comfortable?

Start with brief, seated sessions that use a stable frame (e.g., a cockpit or desk) and avoid forced camera motion. Offer snap‑turns, teleport, or grab‑to‑move instead of smooth locomotion, and include an optional vignette. Give a desktop/web alternative so that sensitive learners can complete the lesson without a headset.

Which locomotion options should ship as the defaults?

Ship with snap‑turns, teleport, or world‑grabbing as the first‑run defaults; leave smooth, continuous movement as an opt‑in. Keep acceleration short and predictable, and anchor the user to a consistent reference (vehicle cab, room frame) whenever the scene moves. Let learners adjust comfort settings in‑app.

How should comfort be measured during pilots?

Track session length, early exits, pause frequency, and feature toggles (e.g., vignette on/off, turn mode). Pair telemetry with a quick post‑session comfort check to catch nausea or eyestrain signals. Review both streams to spot problem scenes and iterate before full rollout.
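As a sketch of what those comfort events might look like in practice (the event names, fields, and `send` transport are illustrative, not a standard schema):

```typescript
type ComfortEvent =
  | { kind: "session_end"; sessionSeconds: number; completed: boolean }
  | { kind: "early_exit"; sceneId: string; sessionSeconds: number }
  | { kind: "setting_toggled"; setting: "vignette" | "turn_mode"; value: string }
  | { kind: "comfort_check"; score: 1 | 2 | 3 | 4 | 5 }; // post-session self-report

function send(event: ComfortEvent): void {
  // Stand-in transport; in production this would batch events and POST
  // them to your analytics endpoint.
  console.log(JSON.stringify(event));
}

// Example: a learner turned the vignette off mid-session, then reported
// mild discomfort afterward: a pairing worth flagging for scene review.
send({ kind: "setting_toggled", setting: "vignette", value: "off" });
send({ kind: "comfort_check", score: 2 });
```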

How do you design scenes that minimize motion sickness?

Keep the user fixed relative to a stable object (driver’s seat, control console) so the world moves around them, not them through the world. Minimize camera bob, avoid sudden rotations, and prefer short, purposeful moves over long smooth cruises. Provide clear focal targets to reduce disorientation.

What about learners who cannot use a headset at all?

Offer a parity path outside VR: desktop simulations, guided videos, or interactive 3D without a headset. Include subtitles, readable UI, and input options that don’t require fine motor precision. Make comfort controls easy to find so that learners can tailor the experience and finish the module.
