Here's a demo mixing some DOM content with WebGL content using Three.js.
What I'd like to do is the same thing, but in VR. However, this obviously won't work with WebVR because that is purely WebGL.
Unfortunately, the only way to achieve "mixed mode" with DOM and WebGL is probably the same technique as yours: split the page into two halves and stick a phone in a headset. That won't work with an Oculus or similar headset unless we make some serious hacks that probably wouldn't ship to the app stores.
But nonetheless, for those phone-in-headset users, it'd be nice to achieve this.
So maybe we can do exactly the same thing as in my "mixed mode" demo, and just render two of them, one for each eye.
But things will get complicated. What I want to do, for example, is render an Ace editor (an HTML-based code editor) or similar for both eyes.
We'd have to consider:
- How do we focus the cursor on the editor?
- How do we modify both instances of the editors for each eye?
- etc
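On the focus question, one option is to let only one instance be focusable and route all captured input to it; the copy never receives focus at all. Here's a minimal sketch of that routing idea. The `makeEditor`/`routeKey` names and the editor shape are hypothetical stand-ins, not Ace's actual API; with Ace you'd call something like `editor.focus()` and insert into the real instance's session only.

```javascript
// Minimal stand-in for the "real" editor instance: it owns the text state.
// The copy for the other eye would hold no state of its own.
function makeEditor() {
  return { text: "", insert(ch) { this.text += ch; } };
}

// Route a captured key press to the real editor, never the copy.
// In the browser this would be fed from a keydown listener on the page.
function routeKey(event, realEditor) {
  if (event.key.length === 1) realEditor.insert(event.key);
}

const real = makeEditor();
["h", "i"].forEach(key => routeKey({ key }, real));
console.log(real.text); // "hi"
```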
We can probably do it with a MutationObserver, mapping every change from one eye to the other: only one eye holds the real instance, and the other is always just a copy of it. Reflecting DOM mutations from one eye to the other will obviously get somewhat heavy.
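The mirroring idea above might look roughly like this. It's a sketch, not a full implementation: `mirrorMutation` and the record shape are hypothetical, and to stay runnable outside a browser the core logic works on plain objects shaped like DOM nodes rather than real elements. In the browser, the records would come from an actual `MutationObserver` on the real eye's subtree, e.g. `observer.observe(leftRoot, { childList: true, characterData: true, attributes: true, subtree: true })`.

```javascript
// Walk the same child-index path in the copy that `path` describes in the
// real tree, so a mutation at one location maps to the same location.
function nodeAtPath(root, path) {
  return path.reduce((node, i) => node.children[i], root);
}

// Apply one mutation record to the copy. Only the cases a code editor
// exercises most are handled; a full version would also cover removals,
// reordering, etc.
function mirrorMutation(record, copyRoot) {
  const target = nodeAtPath(copyRoot, record.path);
  switch (record.type) {
    case "characterData": // text typed into the editor
      target.text = record.text;
      break;
    case "attributes":    // e.g. class changes from syntax highlighting
      target.attrs[record.name] = record.value;
      break;
    case "childList":     // a new line/span inserted
      target.children.splice(record.index, 0, structuredClone(record.node));
      break;
  }
}

// Tiny demo: the "right eye" starts as a deep copy of the "left eye",
// then one text mutation is replayed onto it.
const left  = { tag: "pre", attrs: {}, text: "", children: [
                { tag: "span", attrs: {}, text: "let x", children: [] } ] };
const right = structuredClone(left);

mirrorMutation({ type: "characterData", path: [0], text: "let x = 1" }, right);
console.log(right.children[0].text); // "let x = 1"
```

The real cost is exactly the "heavy" part mentioned above: every keystroke produces records that have to be located and replayed in the copy, so batching records per animation frame would probably be necessary.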
Any thoughts on how to achieve "VR mixed mode"?