User spotlight: Ride FX – Immersive VR

Learn how immersive studio Ride FX uses Unreal Engine and Anchorpoint to manage massive 8K VR projects and coordinate a global team of artists without struggling with version control.

Matthäus Niedoba
20 Mar 2026
Updated on 20 Mar 2026
4 min read

Elia Vermander and Romain Ferchat from Ride FX break down how their team uses Unreal Engine to create large-scale immersive VR experiences and scenographic exhibitions.

Ride FX is a design agency and production studio that creates real-time, immersive cultural experiences for museums and exhibitions. In this interview, they share their process of scaling an international remote team, transitioning to Unreal Engine, and managing their massive 8K VR assets using Anchorpoint.

Hello Elia and Romain, can you tell us a bit about yourself? How did you start the company and what kind of products do you deal with?

Elia: My name is Elia Vermander. I'm the founder of Ride FX. We are a design agency and production studio rolled into one, specialized in creating immersive experiences. We do everything up to building complete scenographic exhibitions, always including digital aspects, audiovisual elements, and game elements.

Can you tell us about your projects in the AR/VR space?

Elia: Our first big VR project was a recreation of the D-Day landings in Normandy, co-produced with a museum situated on Omaha Beach. Everything had to be rendered in 8K 360, making it a very ambitious project that required scaling the studio from 6 to 25 artists. We also created an immersive room recreating ancient Pompeii, using massive projectors on monumental walls instead of VR headsets. Right now, our biggest development is the launch of our proprietary free-roaming, multiplayer VR platform, Myriad XR. It lets people walk around environments like the ocean floor or ancient Rome.

The trailer of the D-Day immersive experience located directly at the historical Omaha Beach site. The project focused on delivering deep emotional impact without cheap effects, supported by a passionate team and a production partnership built on meaningful context.

Wearing a VR headset on the location where the historical event actually happened
What people saw in the experience

Which engine are you using for these experiences and why?

Elia: It's Unreal Engine all the way for us. We decided to ride the wave with Unreal Engine and focus on creating beautiful images directly inside it. Delivering very high-resolution renders for our museum clients would have cost a fortune if we used traditional offline rendering. With Unreal Engine, we have a "what you see is what you get" engine, and we are able to render everything on our RTX cards in the studio, which means no dependency on render farms.

How did Anchorpoint fit into that process?

Elia: For a long time, we worked from home and needed a version control system to share project files more smoothly and efficiently. Pressure from the artists pushed me to adopt Anchorpoint. We originally tried installing Perforce, but it was very complicated. Then someone recommended Anchorpoint, and 10 minutes later it was up and running. I am not a technical person, but I was able to take care of it myself.

Recently, Romain Ferchat, our 3D Supervisor, analyzed our pipeline and we scaled up Anchorpoint so every artist has their own account. Now, we are working with Anchorpoint on 6 to 7 different projects, pulling and pushing at the asset level instead of juggling multiple platforms, which used to cause chaos.

How do you manage rendering across a distributed global team?

Elia: We have an international team, which allows us to work around the clock. We have people in the Americas and in Asia who can launch renders and make our machines work continuously. For example, during the Pompeii project, a team member in India used Anchorpoint remotely to pull the project onto all 20 of our studio workstations. He was switching between machines to check renders, and Anchorpoint ensured that he always had the right project version for those renders at every moment.

Are you currently building custom pipeline tooling?

Romain: At the moment, all our resources are committed to the in-house tech we're working on to ensure we can iterate quickly on our VR projects. However, automating that pipeline is very high up on our list.

For example, we want a full Python script that takes mocap cleanup data from Maya, automatically names it, and places it in the proper folder. Anchorpoint would know where that folder is and would automatically send the animator a notification. Another goal is an automation tool where, if we have a new project, it automatically builds the entire pipeline repository based on the project type (like a locomotion-based VR project or a 360 movie), populating it with the exact tools and add-ons required.
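The first automation goal described above, taking a cleaned mocap export from Maya, naming it, and filing it into the proper folder, could be sketched in plain Python. Everything here is an illustrative assumption: the `<shot>_<take>` naming convention, the `anim/<shot>/mocap` folder layout, and the `ingest_mocap` function are hypothetical, and the Anchorpoint notification step is omitted since it depends on their setup:

```python
import re
import shutil
from pathlib import Path

# Hypothetical convention: exports arrive named "<shot>_<take>.ma"
# and are filed as "<shot>_<take>_mocap_v<NNN>.ma" under anim/<shot>/mocap.
MOCAP_PATTERN = re.compile(r"(?P<shot>[A-Za-z0-9]+)_(?P<take>\d+)$")

def ingest_mocap(export_path: Path, project_root: Path) -> Path:
    """File a cleaned Maya mocap export into the project's animation
    folder, auto-bumping the version number based on existing files."""
    match = MOCAP_PATTERN.match(export_path.stem)
    if match is None:
        raise ValueError(f"Unrecognized mocap file name: {export_path.name}")
    shot, take = match.group("shot"), match.group("take")

    target_dir = project_root / "anim" / shot / "mocap"
    target_dir.mkdir(parents=True, exist_ok=True)

    # Next version = number of existing versions for this shot/take + 1.
    version = len(list(target_dir.glob(f"{shot}_{take}_mocap_v*"))) + 1
    target = target_dir / f"{shot}_{take}_mocap_v{version:03d}{export_path.suffix}"

    shutil.move(str(export_path), str(target))
    # A real pipeline would notify the animator here (e.g. via Anchorpoint).
    return target
```

A version-controlled folder being the single source of truth is what makes this kind of script viable: once the file lands in the right place, the tool's sync and notification features can take over.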

What are the next big goals you are tackling right now?

Elia: The biggest development is the public launch of our free-roaming VR experiences using Myriad XR. We are launching a huge exhibition in Singapore on the topic of the Earth's oceans, featuring a free-roaming VR experience and an immersive 300-square-meter projection room. Following that, we are launching a unique D-Day free-roaming experience that will allow people to live through that day with full suspension of disbelief, utilizing multi-sensorial effects like wind and heat.

We help clients from cultural and historical institutions adopt technology that would otherwise be completely out of reach for them, with great success.

Read more about Ride FX and watch the trailer of D-Day Night.