Meta has revealed the first technical details of its new Horizon Engine.
Introduced at Connect 2025 two weeks ago, Meta Horizon Engine is a new engine for Horizon Worlds and Quest’s new Immersive Home, where in both cases it replaces Unity.
The new Immersive Home is available as part of Horizon OS v81, currently only on the Public Test Channel (PTC), while in Horizon Worlds it’s available in worlds made using the upcoming Horizon Studio, which right now means only a few first-party worlds from Meta.
At Connect, Mark Zuckerberg said that Meta has spent “the last couple of years” building the new engine “from scratch”, and claimed that it brings 4x faster world loading and support for 100+ users in the same instance.
“This engine is fully optimized for bringing the metaverse to life. It’s much faster performance and to load things, much better graphics, much easier to create with”, Zuckerberg said. But there wasn’t much in the way of specific technical details.
Now, Meta has published a blog post describing some details of Horizon Engine.
Horizon Engine “automatically scales from high-end cloud rendering to running on mobile phones”, Meta says, and can support “crowds of live avatars in a single shared space, expansive environments that can be streamed as sub-levels”, with “automated management of object quality through automatically generated LODs”.
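Meta doesn’t explain how that LOD management works, but “automatically generated LODs” conventionally means the engine bakes several detail levels per mesh and swaps between them based on distance or screen coverage. Here’s a tiny illustrative sketch of that general idea in TypeScript, with every name and number invented rather than taken from Horizon Engine:

```typescript
// Illustration only: Horizon Engine's LOD system isn't documented.
// This just shows the conventional pattern: several pre-generated
// detail levels per object, selected by camera distance.

interface LodLevel { triangles: number; maxDistance: number }

// Hypothetical pre-generated LOD chain, from full detail to a far-away mesh.
const lods: LodLevel[] = [
  { triangles: 20000, maxDistance: 10 },
  { triangles: 5000, maxDistance: 40 },
  { triangles: 800, maxDistance: Infinity },
];

// Pick the first level whose distance budget covers the camera distance.
function selectLod(distance: number): LodLevel {
  return lods.find(l => distance <= l.maxDistance) ?? lods[lods.length - 1];
}

console.log(selectLod(25).triangles); // 5000: mid-detail mesh at 25 units away
```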
Here’s what Meta says about the core systems of Horizon Engine (illustrative sketches of the scripting and networking patterns follow the list):
• Assets: A robust, data-driven, and creator-controlled asset pipeline that supports modern native workflows with standard tools and familiar middleware like PopcornFX for effects, FMOD for sound, Noesis for UI, and PhysX for physics.
• Audio: A spatialized audio system that stitches together experience sound, immersive media formats, and hybrid-mixed VoIP into a single immersive experience.
• Avatar: First-class integration of Meta Avatars, providing consistent embodiment and interaction behavior across the platform, and networked cross-instance crowd systems.
• Networking: A secure and scalable actor-based network topology that allows for low latency player-predicted interactions with server validation and creator-defined networked components.
• Rendering: A mobile and VR-first forward renderer, with a physically based shading model, built-in light baker, probe-defined lighting for dynamic objects, and a creator-defined material framework through a robust, extensible, and stackable surface shader system.
• Resource Management: A resource manager, streaming system, and multi-threaded task scheduling framework to maintain the user experience by balancing quality and cost across the variable performance envelope of Quest and other platforms.
• Scripting: An extensible TypeScript authoring environment to unify logic and control flow in worlds, with creator-defined components and clear entity lifecycles.
• Simulation: A data-oriented ECS-based simulation system capable of efficiently simulating millions of networked entities.
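The Scripting and Simulation bullets — creator-defined components with clear entity lifecycles on top of a data-oriented ECS — describe a well-established pattern, even though Meta hasn’t published the actual API. As a rough, purely hypothetical sketch in TypeScript (every name here is invented; Horizon Engine’s real interfaces are unknown):

```typescript
// Hypothetical sketch only: all names (World, Component, etc.) are invented
// to illustrate "creator-defined components with clear entity lifecycles"
// on an entity-component style simulation, not Horizon Engine's actual API.

type EntityId = number;

// A creator-defined component is data plus explicit lifecycle hooks,
// so the engine controls exactly when creator logic runs.
interface Component {
  onAttach?(world: World, entity: EntityId): void;          // entity gains this component
  onTick?(world: World, entity: EntityId, dt: number): void; // per-frame simulation step
  onDetach?(world: World, entity: EntityId): void;          // entity loses it / is destroyed
}

class World {
  private nextId: EntityId = 1;
  private components = new Map<EntityId, Component[]>();

  spawn(...comps: Component[]): EntityId {
    const id = this.nextId++;
    this.components.set(id, comps);
    for (const c of comps) c.onAttach?.(this, id);
    return id;
  }

  destroy(id: EntityId): void {
    for (const c of this.components.get(id) ?? []) c.onDetach?.(this, id);
    this.components.delete(id);
  }

  tick(dt: number): void {
    for (const [id, comps] of this.components) {
      for (const c of comps) c.onTick?.(this, id, dt);
    }
  }
}

// Example creator-defined component: a door that slides open over time.
class SlidingDoor implements Component {
  private openAmount = 0;
  onTick(_w: World, _e: EntityId, dt: number): void {
    this.openAmount = Math.min(1, this.openAmount + dt * 0.5);
  }
}

const world = new World();
const door = world.spawn(new SlidingDoor());
world.tick(1 / 60); // one simulation step
world.destroy(door);
```

A production data-oriented ECS would store each component type in packed arrays rather than per-entity lists, which is what makes claims like “millions of networked entities” plausible; the lifecycle hooks are the part creators would actually see.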
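Likewise, the Networking bullet’s “low latency player-predicted interactions with server validation” matches the classic client-side prediction and reconciliation scheme used by most multiplayer engines. Here’s a minimal sketch of that general pattern, again with invented names rather than Horizon Engine’s actual netcode:

```typescript
// Hypothetical sketch only: Meta hasn't detailed its netcode. This illustrates
// the generic "predict locally, validate on the server, reconcile" pattern.

interface Input { seq: number; moveX: number } // one frame of player input
interface State { x: number }

const SPEED = 3; // units per tick, shared by client and server

function applyInput(s: State, i: Input): State {
  return { x: s.x + i.moveX * SPEED };
}

class PredictingClient {
  state: State = { x: 0 };
  private pending: Input[] = []; // inputs sent but not yet validated by the server

  // Apply input locally right away (prediction), and remember it.
  predict(input: Input): void {
    this.state = applyInput(this.state, input);
    this.pending.push(input);
  }

  // The server sends back authoritative state up to some input sequence number.
  // Drop acknowledged inputs, reset to the validated state, and replay the rest.
  reconcile(serverState: State, ackedSeq: number): void {
    this.pending = this.pending.filter(i => i.seq > ackedSeq);
    this.state = serverState;
    for (const i of this.pending) this.state = applyInput(this.state, i);
  }
}

// The server runs the same simulation, is the source of truth, and
// rejects inputs that fail validation (e.g. impossible move speeds).
class AuthoritativeServer {
  state: State = { x: 0 };
  process(input: Input): { state: State; ackedSeq: number } {
    if (Math.abs(input.moveX) <= 1) {
      this.state = applyInput(this.state, input);
    }
    return { state: this.state, ackedSeq: input.seq };
  }
}

const client = new PredictingClient();
const server = new AuthoritativeServer();
client.predict({ seq: 1, moveX: 1 });               // instant local response
const reply = server.process({ seq: 1, moveX: 1 }); // arrives a round-trip later
client.reconcile(reply.state, reply.ackedSeq);      // client and server now agree
```

The appeal of this pattern is that the player sees their input applied immediately, while the server stays authoritative and can reject anything that fails validation.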
It’s notable that Meta describes the engine as “running on mobile phones”. Horizon Worlds on smartphones is currently cloud streamed (poorly, in my experience), meaning each mobile session has a non-trivial cost to Meta, and requires a strong and consistent internet connection from the user. If the company moves to native rendering on phones, it could result in a more responsive and reliable experience, bolstering its ambitions of competing with the likes of Fortnite and Roblox.

