Intel
Prototyped hardware and software for the RealSense depth camera product family and shipped open source developer SDKs. Perceptual Computing Lab projects spanned VR, AR, BCI (brain-computer interfaces), haptics, computer vision, projection-based experiences, transparent displays, and holographic displays.
"Art comes into play as the expression way of how technology is used." – Johan Jervoe, Vice President of Sales and Marketing at Intel
Highlighted Projects
Project
RealSense Depth Camera Prototyping
Deliverables
Hardware Prototypes, Software Use Case Examples
Role
Creative Technologist
Collaborators
Camera Sensor Manufacturers, Electrical Engineering, Mechanical Engineering, Leadership, Business Partnerships, Computer Vision Researchers
Brief
Tasked with creating a depth camera product line, we knew it would take a few years to ship, so we partnered with Creative Labs to build the first couple of generations while we developed the first Intel-manufactured RealSense depth camera, based on unstructured-IR-light stereo vision. My goal for this product was to be more robust than the Microsoft Kinect by allowing the camera to work in bright outdoor settings, which opened up a different set of use cases and product fit. Beyond simply outputting depth, we wanted the camera to support hand tracking from first- and third-person views, full-body tracking, and facial tracking, and later to embed SLAM for 6DOF tracking of the camera itself.
Early RealSense prototypes used two PlayStation Eye cameras, custom wired to be gen-locked and frame-synced and fitted with IR emitters. This provided a good platform to test a number of computer vision algorithms along with custom lenses, sensor spacing, and IR light wavelengths relative to the sensor. Later I worked directly with camera sensor manufacturers to spec a sensor pair and chipset that provided all the features we wanted in an IR stereo camera, at the resolution and frame rates needed.
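The depth principle behind an IR stereo pair like this can be sketched as triangulation from disparity between the two rectified images. A minimal sketch follows; the focal length and baseline values are purely illustrative assumptions, not the specs of any shipped RealSense camera.

```python
# Sketch of stereo triangulation, the core idea behind an IR stereo
# depth camera: depth = (focal_length * baseline) / disparity.
# focal_length_px and baseline_m below are illustrative values only.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 700.0,
                         baseline_m: float = 0.055) -> float:
    """Return depth in meters for a pixel disparity between the
    rectified left and right images of a stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return (focal_length_px * baseline_m) / disparity_px

# A larger disparity means a closer object.
near = depth_from_disparity(77.0)  # 0.5 m
far = depth_from_disparity(7.7)    # 5.0 m
```

This inverse relationship is why baseline (sensor spacing) and lens choice mattered in the prototypes: a wider baseline or longer focal length stretches disparity and improves depth resolution at range.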
RealSense cameras were embedded in drones, cars, AR headsets, VR headsets, tablets, computers, phones, and installation art, and were used by many tech enthusiasts working in the field of computer vision.
Project
6DOF VR with Hand Tracking
Deliverables
Hardware Prototypes, Software Prototype Experiences (Unity, C++)
Role
Creative Technologist
Collaborators
Computer Vision Researchers
Brief
When the Oculus Rift Dev Kit launched on Kickstarter, I ordered it immediately and quickly realized a few features would greatly improve the overall experience. First, I added 6DOF tracking using an OptiTrack motion capture system so you could walk through a VR space 1:1 with the real world, instead of being limited to a rotation-only VR system tethered to a PC. Next, to provide better input than a keyboard and mouse, I added the Razer Hydra, which contributed two 6DOF motion controllers with reliable button input. The third step was to add hand tracking interactions that let you manipulate 3D content in VR with your hands, as you can today with the Quest 2, and the last was to add depth sensing for scene understanding.
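The 6DOF step above amounts to driving the VR camera from the position and orientation of a mocap-tracked rigid body each frame. A minimal sketch of that transform, assuming a hypothetical pose format (position plus unit quaternion) rather than the project's actual OptiTrack integration code:

```python
import numpy as np

# Hedged sketch: building a world transform for the VR camera from a
# mocap rigid-body pose, as with the OptiTrack-tracked Rift described
# above. Function name and conventions are illustrative assumptions.

def pose_to_matrix(position, quaternion):
    """Build a 4x4 world transform from a position (x, y, z) and a
    unit quaternion (w, x, y, z), as a mocap system might report."""
    w, x, y, z = quaternion
    rot = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    m = np.eye(4)
    m[:3, :3] = rot
    m[:3, 3] = position
    return m

# Identity rotation, head 1.7 m off the floor: the camera transform is
# pure translation, so walking 1:1 moves the virtual viewpoint.
head = pose_to_matrix((0.0, 1.7, 0.0), (1.0, 0.0, 0.0, 0.0))
```

Applying this matrix to the render camera every frame is what turns a seated, rotation-only headset into a walkable 1:1 volume.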
Shortly before leaving to work on HoloLens at Microsoft, I passed the project on to a friend and colleague at the Perceptual Computing Lab, who evolved it into Project Alloy, a fully self-contained VR headset with a mixed reality view, hand tracking, and a depth camera. Unfortunately, Intel cancelled the project even though it was far ahead of what Oculus had at the time; it would take Oculus multiple years before they released the Quest 2 with hand tracking.