[web-vr] browsing through a CAVE

Lorne Covington lists at noirflux.com
Thu Oct 29 23:58:23 UTC 2015


Very cool Andrew!

Using a graphics package (I use vvvv), doing a VR Wall (a one-sided CAVE) 
is actually very simple, and even a three-sided setup will easily run on 
a modern graphics card.  I drive six full-HD screens on systems with two 
cards.  (My first VR Wall attempt: https://vimeo.com/33088117)

Seems to me all that is needed for a VR wall in WebVR is simply the 
ability to specify the viewer position (from the tracking camera) in the 
absence of an HMD.  Orientation is not used.  Ryan, where is that VRPN 
code?  I could not find it by searching the MozVR repository.
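
To make that first point concrete, here is a rough, untested sketch of 
what a wall actually needs; the wall size, coordinate frame, and names 
are my own assumptions, not anything from MozVR or OSVR.  Given a tracked 
head position in physical units you build an asymmetric (off-axis) 
frustum for the fixed wall, and orientation never enters into it:

    // Off-axis frustum for one wall from a tracked head position.
    // Assumes the wall is the z = 0 plane, centered on the origin,
    // WALL_W x WALL_H meters, and the head is at {x, y, z} with z > 0
    // in the same meter coordinates.
    var WALL_W = 3.0, WALL_H = 2.0;   // physical wall size (assumed)
    var NEAR = 0.1, FAR = 100.0;

    function wallFrustum(head) {
      var s = NEAR / head.z;          // scale wall edges to the near plane
      return {
        left:   s * (-WALL_W / 2 - head.x),
        right:  s * ( WALL_W / 2 - head.x),
        bottom: s * (-WALL_H / 2 - head.y),
        top:    s * ( WALL_H / 2 - head.y),
        near:   NEAR,
        far:    FAR
      };
    }

    // Each frame: place the camera at the head position with no rotation
    // and hand these extents to whatever glFrustum-style
    // left/right/bottom/top projection call the renderer exposes.

So on the API side, a head position in known units really is all a single 
wall needs.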

Multiple walls would require multiple renderers, each with its own 
orientation and a shared camera position.  So you could approach that as 
you did, Andrew, with multiple browser instances (even on the same 
machine?) and explicitly synced cameras, but how will you handle the 
timing of events between the instances so they do not slip in time, such 
as moving objects with their own behaviors that cross screens?  Seems 
like it will require a master/slave arrangement for global time.  How is 
this handled in Liquid Galaxy?  This is a question I'm very interested 
in, even just for syncing time across multiple WebGL instances.
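
In case it helps frame the question, the naive arrangement I have in mind 
looks like this (hypothetical and untested, not taken from MozVR or 
Liquid Galaxy): the master broadcasts its simulation clock and the shared 
camera position over a websocket, and each slave animates from the last 
clock it received instead of from its own:

    // Minimal shared three.js setup (names are mine; assumes three.js
    // is loaded on the page).
    var scene    = new THREE.Scene();
    var camera   = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 100);
    var renderer = new THREE.WebGLRenderer();
    renderer.setSize(innerWidth, innerHeight);
    document.body.appendChild(renderer.domElement);

    // Stand-in for "objects with their own behaviors": everything is
    // driven from an explicit time value, never from a local clock.
    function updateWorld(t) { /* move objects as a function of t */ }

    var ws = new WebSocket('ws://sync-server:8080');  // relay, hypothetical

    // --- master page ---
    var start = performance.now();

    function masterFrame() {
      var t = (performance.now() - start) / 1000;     // global time, seconds
      if (ws.readyState === WebSocket.OPEN) {
        ws.send(JSON.stringify({ t: t, pos: camera.position.toArray() }));
      }
      updateWorld(t);
      renderer.render(scene, camera);
      requestAnimationFrame(masterFrame);
    }
    requestAnimationFrame(masterFrame);

    // --- each slave page (same setup, but no local clock at all) ---
    var last = { t: 0, pos: [0, 0, 0] };
    ws.onmessage = function (e) { last = JSON.parse(e.data); };

    function slaveFrame() {
      camera.position.fromArray(last.pos);   // shared position only
      updateWorld(last.t);                   // master's clock, not ours
      renderer.render(scene, camera);
      requestAnimationFrame(slaveFrame);
    }
    requestAnimationFrame(slaveFrame);

The slaves will always run a message or so behind the master, so in 
practice you would extrapolate last.t rather than hold it, but as long as 
every behavior is keyed to that shared t the screens should not drift 
apart.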

I can understand that handling multiple views/renderers in WebVR, while 
it would solve the sync problem, is probably not in the architecture and 
would be difficult to add.  Or would it?
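
For what it's worth, outside of the WebVR API a single page can already 
do the multi-view rendering with plain three.js: one scene, one camera 
per wall, all sharing a position but each with its own fixed orientation, 
rendered into separate viewports.  The sync problem disappears because 
there is only one clock.  A rough, untested sketch, assuming a reasonably 
recent three.js (setViewport/setScissor), the same setup and updateWorld 
as above, and a headPosition vector updated from the tracker:

    // One page, three wall views: the cameras share the tracked head
    // position and differ only in yaw, and a single animation loop is
    // the only clock, so nothing can slip between walls.
    var walls = [-Math.PI / 2, 0, Math.PI / 2].map(function (yaw) {
      var cam = new THREE.PerspectiveCamera(90, 1, 0.1, 100);
      cam.rotation.y = yaw;                  // left, front, right wall
      return cam;
    });

    var headPosition = new THREE.Vector3();  // updated from the tracker

    renderer.setScissorTest(true);

    function frame(now) {
      updateWorld(now / 1000);                         // one shared clock
      var w = renderer.domElement.width / walls.length;
      var h = renderer.domElement.height;
      walls.forEach(function (cam, i) {
        cam.position.copy(headPosition);               // position only
        renderer.setViewport(i * w, 0, w, h);
        renderer.setScissor(i * w, 0, w, h);
        renderer.render(scene, cam);
      });
      requestAnimationFrame(frame);
    }
    requestAnimationFrame(frame);

For real walls each camera would get the off-axis projection from the 
earlier sketch rather than a symmetric 90-degree one, but the point is 
that one WebGL context can already drive several views in lockstep; 
whether WebVR itself could expose something like that is the part I can't 
judge.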

Thanks folks - ever onward!

- Lorne


http://noirflux.com


On 10/21/2015 8:11 PM, Andrew Leahy wrote:
> Hi, I've been hacking on WebVR in multi-screen, multi-PC environments a 
> little bit...
>
> Here's the MozVR Sechelt demo running on a cluster with Rift "master" 
> using ChromeVR
>
> https://www.youtube.com/watch?v=d3CMK2Hynb0
>
> Here the front 3 screens are driven by a small cluster of HP 
> ChromeBoxes which just run Chrome, using websockets to sync the 
> three.js cameras to give the illusion of a single joined-up world.
>
> We don't want people throwing up, so I'm not sharing the Rift wearer's 
> camera POV (heading/pitch/roll), just their location. You can see 
> towards the end of that video, as I sit down and move around, that the 
> front screens shift and track only my location in the virtual world.
>
> A wraparound CAVE/VR naturally immerses the viewers, so there's no 
> particular reason to have the world spin around unnecessarily.
>
> I'm involved in the Google Liquid Galaxy community. We're very 
> enthusiastic about the possibilities of connecting goggle-based VR 
> with large-screen immersive VR worlds.
>
> Cheers, Andrew | eResearch | Western Sydney University
>
>
> On 22 October 2015 at 07:13, Ryan Pavlik <ryan at sensics.com> wrote:
>
>     Sounds neat to me! :) (As someone who has done a lot of work in/with
>     CAVEs, but hasn't had access to one in a while...)
>
>     While there isn't active work on this at the moment that I know of,
>     OSVR is designed to be able to support CAVEs as well as HMDs (and
>     head-tracked desktops, etc.) within its single unified computational
>     display model.  CAVEs and the like are actually a good deal simpler
>     to properly render to than HMDs (primarily because of the optics in
>     HMDs getting between the viewer and the screen), so this would
>     actually make a relatively low-barrier-of-entry community
>     contribution if you have access to one.  (For starters you can
>     probably assume just a single render node for a reasonably
>     "standard" industrial-grade system like METaL
>     http://www.vrac.iastate.edu/facilities/metal/ - running anything,
>     much less WebVR, on the massive render cluster of the C6
>     http://www.vrac.iastate.edu/facilities/c6/ takes some effort, so it
>     makes sense to put it off as "future work".)
>
>     The majority of CAVE tracking systems and input devices are already
>     supported through OSVR via VRPN compatibility.  So, if you've got
>     (access to) a CAVE, you can at least get tracking and input devices
>     like wands, etc. into WebVR already with the patch Georgiy submitted
>     to Mozilla.
>
>     Ryan
>
>     On 10/21/2015 2:35 PM, Stefan Mayrhofer wrote:
>     > On 2015-10-21 21:27, Stefan Mayrhofer wrote:
>     >> What about connecting a browser to a CAVE as an interface for VR?
>     > what's a CAVE?
>     > https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment

