
VR @ OU - Workshop Outline


What is the OVAL?

OVAL stands for Oklahoma Virtual Academic Laboratory. 
Here's a short demonstration of the system in action to get things started:

  • Users gain expertise with the OVAL quickly, turning it (in the long term) into a sort of 3D productivity tool.
  • What you’ll see – whether you’re looking at the monitors in front of us or the screen at the front of the room – is a list of available 3D assets, which can be pulled in and manipulated in real time over a cloud-networked set of VR headsets. Anything done to the model is seen by co-researchers, regardless of where their stations are located. In short, what you are seeing is real-time analysis of 3D assets (from a variety of fields) across a network of headsets (a minimal sketch of this kind of synchronization follows this list).
  • For the instructor, that translates to 360 degrees of interactive content. An instructor can design the learning experience from the ground up, including the components they deem educationally relevant and removing distracting elements. (There’s no way to check your smartphone in the OVAL, for example.)
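
To make the cloud-networked sharing described above concrete, here is a minimal sketch of the kind of state synchronization involved: each station broadcasts the shared model's transform, and peers apply the most recent update so every headset sees the same state. The peer addresses, port, and message format are assumptions for illustration only; the OVAL itself handles this inside its Unity3D application.

```python
# Illustrative sketch only (not the OVAL's actual Unity networking code):
# each station broadcasts the shared model's transform, and every peer
# applies the most recent update so all headsets show the same state.
import json
import socket
import time

PEER_ADDRESSES = [("192.168.0.12", 9999)]   # hypothetical peer stations
LISTEN_PORT = 9999

def broadcast_transform(sock, position, rotation, scale):
    """Send the model's current transform to every networked station."""
    message = json.dumps({
        "timestamp": time.time(),
        "position": position,     # (x, y, z) in meters
        "rotation": rotation,     # (x, y, z, w) quaternion
        "scale": scale,           # uniform scale factor
    }).encode("utf-8")
    for address in PEER_ADDRESSES:
        sock.sendto(message, address)

def receive_transform(sock):
    """Return the newest transform received from any peer station."""
    data, _ = sock.recvfrom(4096)
    update = json.loads(data.decode("utf-8"))
    # A real client would apply this update to the shared 3D model here.
    return update

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LISTEN_PORT))
    broadcast_transform(sock, (0.0, 1.5, 2.0), (0.0, 0.0, 0.0, 1.0), 1.0)
```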

What are the benefits of using the OVAL?

Hyper-malleability of content

  • Scale and perspective are instantaneously changeable in a way a physical specimen does not allow.
  • New perspectives on “old” data: viewing the inside of a protein molecule instead of a flat textbook page, for instance.
  • Previously invisible, fragile, or distant objects become accessible.

Embodied interactions

  • Technical training not required. Anyone can upload a model and walk in to use the system.
  • Builds on intuitive interaction types, like leaning in to look closer.

Cloud-based

  • Teach and learn from anywhere.
  • Upload models remotely (from dorm, office, etc.)

Literature Survey

  • "Visualizing Archeological Excavations based on Unity3D" 
  • Virtual Reality for Assembly Methods Prototying: a Review" 
  • "Perceptual and Interface Design for Virtual Reality"
  • Oculus Rift "Best Practices Guide"
  • "Virtual Fossils revolutionize the Study of Human Evolution"
  • "As if Being There: Mediated Reality for Crime Scene Investigation"
  • "Immersive and Collaborative Data Visualization Using Virtual Reality Platforms"

What is the technology involved?
This system comprises both hardware and software elements.

  • The railed chair, which provides range of motion and cable management, was custom-designed by the University of Oklahoma’s physics fabrication lab. Its design was driven by usability feedback gathered during an earlier proof-of-concept phase, in which users were inadvertently striking their monitors, desks, or keyboards, limiting engagement with the virtual environment.
  • Off-the-shelf PC peripherals were also pivotal to the OVAL’s hardware implementation. These include a 3D mouse (3dconnexion’s Space Navigator), which, unlike a traditional mouse, can control motion along a vertical z-axis (a simple sketch of this kind of input mapping follows this list); the Leap Motion hand tracker, which brings the user’s fingers and forearms into virtual reality; and the Oculus Rift Development Kit 2, the head-mounted display (i.e., VR goggles) central to the immersive virtual reality experience.
  • These hardware components are tied together on the software side using the Unity3D game engine, which is available at no cost for amateur developers and for a relatively small licensing fee if distribution of software is planned. The executable application combines the software “packages” that power the off-the-shelf peripherals above with a library of custom-scripted interface elements, allowing shared manipulation and analysis of an arbitrary set of 3D assets across a network of head-mounted displays. We presently label this software OVAL v0.5, a functional beta build.
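
As a rough illustration of the input mapping just described, the sketch below applies a 6-degree-of-freedom delta (the kind of data a device like the 3D mouse reports each frame) to a model's pose, including motion along the vertical z-axis that a flat mouse cannot provide. The names, scaling factors, and read pattern are assumptions for illustration, not the OVAL's actual Unity3D scripts.

```python
# Illustrative sketch of mapping 6-degree-of-freedom input to a model pose.
# Names, scaling factors, and the simulated input below are assumptions.
from dataclasses import dataclass, field

TRANSLATE_SPEED = 0.01   # meters per input unit (assumed)
ROTATE_SPEED = 0.5       # degrees per input unit (assumed)

@dataclass
class Pose:
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    euler_degrees: list = field(default_factory=lambda: [0.0, 0.0, 0.0])

def apply_6dof_input(pose, tx, ty, tz, rx, ry, rz):
    """Translate the model along x, y, and the vertical z-axis, then rotate it."""
    pose.position[0] += tx * TRANSLATE_SPEED
    pose.position[1] += ty * TRANSLATE_SPEED
    pose.position[2] += tz * TRANSLATE_SPEED   # the vertical motion a flat mouse lacks
    pose.euler_degrees[0] += rx * ROTATE_SPEED
    pose.euler_degrees[1] += ry * ROTATE_SPEED
    pose.euler_degrees[2] += rz * ROTATE_SPEED
    return pose

if __name__ == "__main__":
    model = Pose()
    # One simulated frame of input: lift the model and twist it slightly.
    apply_6dof_input(model, tx=0.0, ty=0.0, tz=5.0, rx=0.0, ry=2.0, rz=0.0)
    print(model)
```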

Who has used the system?

Last semester, classes from multiple colleges tested the system as part of their curricula, including Architecture, Art and Art History, Chemistry, Journalism, and English.
These early adopters include:

  • The College of Architecture (interior design) – the student/professional dilemma and spatial thinking
  • Chemistry/Biochemistry – navigating and viewing chemical structures in a 3D format
  • Art and Art History (sculpture) – the life of an artist in the 21st century
  • English (digital humanities) – the valuable physicality of textual assets and preservation
  • Journalism – storytelling in 360 degrees (the NYT VR app, for example)
  • Library and Information Science – 3D repository development and web hosting

For now, only two students can engage with the content at a time, while others watch and trade off once a goal is accomplished. Six more OVAL workstations will be in place by the Fall 2016 semester, at the Law School Library and the iHub, to support up to eight simultaneous VR participants. At that point, larger research groups and small classes can inhabit a "virtual classroom".

Outputs?

  • Screen capture of a structural component, for D2L upload or publication and presentation, is accomplished by holding down both 3D mouse buttons for 5 seconds (see the sketch after this list).
  • Video capture, initiated outside of OVAL by third-party software, allows fly-through sessions to be replayed in downstream presentations.
  • Anatomy - video capture of a skeletal-system fly-through for a YouTube tutorial.
  • Biochemistry - video capture for use in a grant-funding presentation.
  • Voice chat
    • "Always on" voice chat means that users can communicate with remote OVAL workstations (soon to be) deployed across the OU campus.

Who can use the system?

If yours is a field that has as its focus some feature of the physical universe, we can generally deploy your content into the OVAL. Some fields, of course, benefit most from face-to-face interaction and conversation with an instructor – I know, I’m a philosophy major – but if you have an artifact, a site, a structure, or some other spatially extended object of study, then we can help you capture and interact with that object across a network of headsets anywhere in the world.
Anyone can walk in off the street, sit down, and pull up their asset.

How do I find and upload 3D models for a personal fly-through?

Existing Content

  • Large institutions, like NASA and the Smithsonian, are now building freely accessible online asset repositories.
  • Sketchfab, a community-driven 3D model hosting service, is also a great place to find content for a VR fly-through (see the download sketch after this list).
  • More repositories are coming online all the time.
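
As a small illustration of pulling existing content down for use in the OVAL, the sketch below saves an openly licensed model file from a repository to disk. The URL is a placeholder, not a real repository link; check each repository's terms and download instructions.

```python
# Sketch of downloading an openly licensed model so it can be uploaded to the
# OVAL. The URL below is a hypothetical placeholder, not a real repository link.
import urllib.request

MODEL_URL = "https://example.org/models/sample_artifact.obj"  # placeholder
LOCAL_PATH = "sample_artifact.obj"

def download_model(url=MODEL_URL, destination=LOCAL_PATH):
    """Save a 3D model file locally so it can be submitted to the OVAL."""
    with urllib.request.urlopen(url) as response, open(destination, "wb") as out:
        out.write(response.read())
    return destination

if __name__ == "__main__":
    print(f"Saved model to {download_model()}")
```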

Generating Content

  • Structure Scanner = ease of use but lower-quality captures
  • Still camera + Autodesk Memento = high-quality captures but sensitive to lighting and materials


Uploading the Model to OVAL via the Innovation @ the EDGE website.
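
For illustration only, the sketch below shows what a scripted form upload of a model file might look like. The endpoint URL and field names are hypothetical placeholders; in practice, uploads go through the Innovation @ the EDGE website's own form.

```python
# Illustrative sketch of submitting a model file for OVAL deployment.
# The endpoint URL and form field names are hypothetical placeholders.
import requests

UPLOAD_URL = "https://example.edu/oval/upload"   # placeholder, not the real address

def upload_model(file_path, uploader_email):
    """POST a model file plus contact info to the (assumed) upload form."""
    with open(file_path, "rb") as model_file:
        response = requests.post(
            UPLOAD_URL,
            files={"model": model_file},
            data={"email": uploader_email},
            timeout=30,
        )
    response.raise_for_status()
    return response.status_code

if __name__ == "__main__":
    upload_model("sample_artifact.obj", "student@ou.edu")
```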

What’s next?

The beauty of this system is that it can be scaled up relatively easily. It’s not too difficult to imagine a classroom-sized number of headsets, networked for an instructor-guided session. Indeed, we can eventually send these headsets to distance or non-traditional learners (for about the cost of an expensive textbook), thereby allowing them to join in OVAL sessions remotely.


Bottom line: Analysis and manipulation of any 3D asset regardless of physical location, background/discipline, or level of technical expertise.