
Planet Arduino

Archive for the ‘Virtual Reality’ Category

In virtual reality, anything is possible, yet accurately modeling real-world objects in a digital space remains a huge challenge due to the lack of weight and feedback that physical objects would otherwise provide. Inspired by working with digital cameras and the inherent imperfection they bring to their videos, Bas van Seeters has developed a rig that translates the feeling of a camera into VR with only a few components.

The project began with a salvaged Panasonic MS70 VHS camcorder, chosen for its spacious interior and easily adjustable wiring. An Arduino UNO Rev3 was then connected to the camera’s start/stop recording button as well as an indicator light and a potentiometer for changing the in-game focus. The UNO is responsible for reading the inputs and writing the data to USB serial so that a Unity plugin can apply the correct effects. Van Seeters even included a two-position switch for selecting between wide and telescopic fields of view.
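The firmware side of a rig like this is straightforward. The following is a minimal, hypothetical sketch of the UNO's job (read the controls, stream them over USB serial); the pin assignments and the comma-separated message format are illustrative assumptions, not van Seeters' actual code:

```cpp
// Hypothetical sketch: read the camcorder's controls and stream them
// over USB serial for a Unity plugin to parse. Pin choices and the
// CSV message format are illustrative assumptions.
const int RECORD_BUTTON = 2;   // start/stop recording (active low)
const int ZOOM_SWITCH   = 3;   // wide vs. telescopic field of view
const int RECORD_LED    = 4;   // recording indicator light
const int FOCUS_POT     = A0;  // in-game focus

bool recording = false;
bool lastButton = HIGH;

void setup() {
  pinMode(RECORD_BUTTON, INPUT_PULLUP);
  pinMode(ZOOM_SWITCH, INPUT_PULLUP);
  pinMode(RECORD_LED, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // Toggle recording on a button press (falling edge, crude debounce)
  bool button = digitalRead(RECORD_BUTTON);
  if (button == LOW && lastButton == HIGH) {
    recording = !recording;
    digitalWrite(RECORD_LED, recording ? HIGH : LOW);
    delay(20);
  }
  lastButton = button;

  int focus = analogRead(FOCUS_POT);           // 0-1023
  int wide  = digitalRead(ZOOM_SWITCH) == LOW; // 1 = wide, 0 = telescopic

  // e.g. "REC:1,FOCUS:512,WIDE:0"
  Serial.print("REC:");    Serial.print(recording);
  Serial.print(",FOCUS:"); Serial.print(focus);
  Serial.print(",WIDE:");  Serial.println(wide);
  delay(33); // roughly 30 updates per second
}
```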

With the Arduino now sending data, the last step involved creating a virtual camcorder object in Unity and making it follow the movement of a controller in 3D space, thus allowing the player to track things in-game and capture videos. More details on the project can be found in van Seeters’ write-up here and in the video below!

The post Getting more realistic camera movements in VR with an Arduino appeared first on Arduino Blog.

When it comes to virtual reality, the visual technology is quite good and can convince our brains that we’re inside a digital 3D environment. Audio is also decent, thanks to established interaural 3D audio techniques. But current consumer VR setups fail to adequately incorporate our other senses, and that failure breaks the immersive illusion. To bring tongues into the mix, researchers from the University of Calgary and the University of Maine have developed TactTongue, which provides on-demand electrotactile stimulation.

TactTongue creates both tactile sensations and some measure of taste. Both are the result of electrical pulses delivered through electrodes on a mouthpiece worn over the tongue. That matrix of electrodes covers much of the tongue’s surface, adding a spatial layer to the equation. The user can, for example, feel a pulse moving from the tip of their tongue to the back-right portion of their tongue. TactTongue can also generate a feeling of taste (salt in particular), thanks to the physiological mechanism by which our tongues perceive saltiness through ion transfer.

The TactTongue hardware, implemented as a shield for Arduino UNO Rev3 boards, makes it possible to direct electrical pulses to specific parts of the tongue. The shield passes current down a flexible ribbon cable to electrode pads embedded in the cable.

The team behind TactTongue developed toolkits for prototyping “haptic experiences” with this technology. Those control which electrodes are active at any given time, as well as the pulse intensity and waveform patterns. It is possible to tie the haptic experiences to specific software events, in VR games and other software, to offer feedback or direction to users.
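The shield's actual design isn't detailed here, but the general idea of addressing one electrode at a time and shaping the pulse is easy to sketch. In this hypothetical example, the multiplexer select pins, pulse pin, and intensity mapping are all invented for illustration:

```cpp
// Illustrative only: address one electrode in a matrix through a
// hypothetical multiplexer, then emit a short pulse. The select pins,
// pulse width, and intensity mapping are assumptions, not the actual
// TactTongue shield design.
const int SELECT_PINS[4] = {4, 5, 6, 7}; // 4-bit address -> 16 electrodes
const int PULSE_PIN = 9;                 // PWM output to the stimulation driver

void selectElectrode(uint8_t index) {
  for (int bit = 0; bit < 4; bit++) {
    digitalWrite(SELECT_PINS[bit], (index >> bit) & 1);
  }
}

void pulseElectrode(uint8_t index, uint8_t intensity, uint16_t durationMs) {
  selectElectrode(index);
  analogWrite(PULSE_PIN, intensity); // PWM duty cycle as a crude intensity knob
  delay(durationMs);
  analogWrite(PULSE_PIN, 0);
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(SELECT_PINS[i], OUTPUT);
  pinMode(PULSE_PIN, OUTPUT);
}

void loop() {
  // Sweep a sensation from the tip (0) toward the back (15) of the tongue
  for (uint8_t e = 0; e < 16; e++) {
    pulseElectrode(e, 128, 50);
    delay(30);
  }
  delay(1000);
}
```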

The post Bringing tongue stimulation into the VR experience with TactTongue appeared first on Arduino Blog.


The entire purpose of virtual reality (VR) is to improve immersion beyond what can be achieved with a regular TV or monitor. So it can be frustrating to VR users when the darkness in their peripheral vision reminds them that they are, in fact, wearing a headset. To improve the situation with Valve Index VR headsets without adding too much cost, Staton developed a peripheral ambient lighting system called VR Ambilight.

VR Ambilight works in the same way as Ambilight ambient backlighting for TVs and monitors. The system looks at the colors of the pixels around the entire outer border of the screen, then sets the colors of LEDs to match. That creates a gentle transition from the screen to the surrounding wall. When applied to VR, it extends the screen content into the user’s periphery. Because the user can’t see anything in their periphery in detail, the colored light is enough to maintain the illusion and eliminate breaks in immersion.

The only hardware components necessary for this system were an Arduino Nano board and two short strips of WS2812B individually addressable RGB LEDs. The LEDs mount inside the Valve Index headset, with a thin sheet of translucent white plastic acting as a diffuser. Prismatik software works with Steam to detect the pixel colors along the screen edges, then uses a simple script to pass them along to the Arduino. The Valve Index has a handy built-in USB port, which helps keep the wiring nice and tidy.
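Prismatik can drive Arduino-based devices using the Adalight serial protocol, so the receiving sketch can be quite small. Here is a minimal sketch along those lines using the FastLED library; the LED count, data pin, and baud rate are placeholders that would need to match the actual build and Prismatik's settings:

```cpp
// Minimal Adalight-style receiver, assuming Prismatik is configured
// for an "Adalight" device. LED count, data pin, and baud rate are
// illustrative; match them to the strips and Prismatik settings.
#include <FastLED.h>

const int NUM_LEDS = 32; // two short strips
const int DATA_PIN = 6;
CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
  Serial.begin(115200);
}

// Block until a byte arrives, then return it
uint8_t readByte() {
  while (!Serial.available()) {}
  return Serial.read();
}

void loop() {
  // Wait for the "Ada" magic word that starts each frame
  if (readByte() != 'A' || readByte() != 'd' || readByte() != 'a') return;
  uint8_t hi = readByte(), lo = readByte(), sum = readByte();
  if (sum != (hi ^ lo ^ 0x55)) return; // bad header checksum, resync

  // Read one RGB triplet per LED, then latch the frame
  for (int i = 0; i < NUM_LEDS; i++) {
    leds[i].r = readByte();
    leds[i].g = readByte();
    leds[i].b = readByte();
  }
  FastLED.show();
}
```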

The post Add peripheral lighting to improve VR immersion appeared first on Arduino Blog.

We’re currently seeing something of a technological blitzkrieg as corporations and engineers attempt to solve the problem of tactility in virtual reality (VR). Modern VR headsets provide quite realistic visual and auditory immersion, but that immersion falls apart when users find themselves unable to physically interact with virtual objects. Developed by a team of National Chengchi University researchers, ELAXO is an Arduino-controlled exoskeleton glove that enables complex force feedback for VR applications.

ELAXO looks unwieldy — it is an exoskeleton glove made up of 3D-printed struts and joints. In the demonstrated setup, ELAXO mounts to the user’s wrist and has force feedback structures attached to their thumb and first two fingers. Each finger receives four servo motors, four small DC motors, and one larger DC motor. Those motors attach to joints to create on-demand physical resistance to movement.

For two fingers and a thumb, ELAXO requires a total of 12 servos, 12 small DC motors, and three large DC motors. Each finger also needs an infrared (IR) sensor, for a total of three. In addition, the large DC motors contain encoders that use two wires each. Controlling those takes a lot of I/O pins, which is why the ELAXO team chose an Arduino Mega board for their prototype. It controls the motors through eight dual TB6612FNG drivers. 

The Arduino powers the motors according to what happens in the VR world. For example, if a user tries to touch a stationary object, the motors on that finger might get full power to keep the joints from bending and to provide a feeling of solid resistance. Other actions, like rotating a knob, result in less resistance. By gaining granular control over the resistance of each joint, ELAXO can produce convincing force feedback. 
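The team's firmware isn't shown here, but the core idea of turning a VR event into joint resistance maps naturally onto the TB6612FNG's drive and coast modes. The following single-channel sketch is a rough illustration; the pin assignments and one-byte serial protocol are assumptions, not ELAXO's actual code:

```cpp
// Illustrative single-joint controller: map a resistance command from
// the VR application (one byte, 0-255, over serial) to one TB6612FNG
// channel. Pin assignments and the protocol are assumptions.
const int AIN1 = 7, AIN2 = 8, PWMA = 9, STBY = 10;

void setup() {
  pinMode(AIN1, OUTPUT);
  pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT);
  pinMode(STBY, OUTPUT);
  digitalWrite(STBY, HIGH); // enable the driver
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    uint8_t resistance = Serial.read();
    if (resistance == 0) {
      // Coast: no resistance, the joint moves freely
      digitalWrite(AIN1, LOW);
      digitalWrite(AIN2, LOW);
      analogWrite(PWMA, 0);
    } else {
      // Drive against the joint's motion; full power feels like a
      // solid object, partial power like a stiff knob
      digitalWrite(AIN1, HIGH);
      digitalWrite(AIN2, LOW);
      analogWrite(PWMA, resistance);
    }
  }
}
```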

The post This strange exoskeleton glove enables VR force feedback appeared first on Arduino Blog.

Virtual reality technology has come a long way in the last decade, but there are still some major things that could be done to make it even more interactive and immersive. For one, what if VR users could actually feel the ground they walk on rather than simply see it? A team of students from the Industrial Design program at KAIST in South Korea and the computer science program at the University of Chicago set out to tackle this by creating a fairly large 1.8 by 0.6 meter platform that can accurately recreate the feeling of varied terrain. Called the Elevate, this device uses a total of 1,200 pins that can individually raise and lower with 15mm of resolution.

Each pin consists of a block of wood that protrudes from the platform, a comb-shaped section used to move the pin, and a locking bar to prevent unintended movement. At the core of the device is the shape generator, whose job is to individually actuate each pin. It does this by moving row-by-row across the 60 rows, pushing or pulling all of the pins within each row via a timing belt and DC motor. There are 10 actuator modules in total, each containing an Arduino Nano, a regulator, two geared DC motors, a hall effect sensor, and a pair of magnets. The locking mechanism is controlled by an Arduino Uno and two servo drivers, and horizontal movements are handled by another Uno with two microstepper controllers.
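As a rough sketch of how one actuator module might step its carriage from row to row, the hall effect sensor can act as an index that fires each time a magnet passes. Everything here (pins, speeds, active-low sensor behavior) is an assumption for illustration rather than the team's actual firmware:

```cpp
// Illustrative actuator-module logic: advance the carriage one row at
// a time using a hall effect sensor that triggers as a magnet passes.
// Pins and the active-low sensor behavior are assumptions.
const int MOTOR_PWM = 5; // geared DC motor driving the timing belt
const int MOTOR_DIR = 4;
const int HALL_PIN  = 2; // goes LOW when a magnet is over the sensor

void moveToNextRow() {
  digitalWrite(MOTOR_DIR, HIGH);
  analogWrite(MOTOR_PWM, 180);             // start moving
  while (digitalRead(HALL_PIN) == LOW) {}  // clear the current magnet
  while (digitalRead(HALL_PIN) == HIGH) {} // run until the next magnet
  analogWrite(MOTOR_PWM, 0);               // stop on the new row
}

void setup() {
  pinMode(MOTOR_PWM, OUTPUT);
  pinMode(MOTOR_DIR, OUTPUT);
  pinMode(HALL_PIN, INPUT_PULLUP);
}

void loop() {
  moveToNextRow();
  // ...engage the comb to push or pull this row's pins, then repeat...
  delay(500);
}
```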

The resulting terrain is quite spectacular, as this much granularity means really fine details can be replicated. When paired with the VR game, participants testing the device consistently rated their experience on the Elevate as far better than playing in VR alone.

To learn more about this project, check out the video below and the team’s paper here.

The post Elevate is a walkable pin array floor that generates shape-changing terrain for VR appeared first on Arduino Blog.

When it comes to virtual reality, the challenge isn’t displaying convincing visuals — VR headset manufacturers have already figured that out. The real challenge is how to tickle our other senses, like smell, taste, and especially touch. To give people the ability to feel the fur of animals in VR, engineers have built this strange haptic device, called HairTouch, equipped with adjustable hair.

HairTouch, which is controlled by an Arduino Mega board, serves a very specific purpose: to let VR users feel hair or fibers of varying lengths. That is an absurdly narrow goal and this device definitely won’t ever make it to market, but that doesn’t make it any less interesting. If you’ve ever wanted to feel the difference between a virtual tabby cat and a virtual Maine Coon, this is the haptic feedback gadget that you’ve been looking for.

Using a series of servo motors, HairTouch adjusts the bristles of a brush. It can control how far those bristles protrude, which is also related to their rigidity. It also adjusts the angle of the bristles, so the user can differentiate the feel of a Pomeranian from a Collie. Those adjustments correspond to the VR object that the user is currently touching. The engineers designed HairTouch to attach to VR controllers, so, at least theoretically, it can work with existing VR systems.
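A controller for something like this could be quite compact. This hypothetical sketch maps two-byte serial commands from the VR application onto two servos, one for bristle protrusion (and thus stiffness) and one for bristle angle; the pins, angle ranges, and protocol are invented for illustration:

```cpp
// Illustrative sketch: two servos set bristle protrusion/stiffness and
// bristle angle from two-byte serial commands sent by the VR app.
// The pin choices, angle ranges, and protocol are assumptions.
#include <Servo.h>

Servo lengthServo; // how far the bristles protrude (also sets stiffness)
Servo angleServo;  // which way the bristles lean

void setup() {
  lengthServo.attach(9);
  angleServo.attach(10);
  Serial.begin(115200);
}

void loop() {
  // Expect pairs of bytes: [bristle length 0-255][bristle angle 0-255]
  if (Serial.available() >= 2) {
    uint8_t len = Serial.read();
    uint8_t ang = Serial.read();
    lengthServo.write(map(len, 0, 255, 0, 90));  // short/stiff -> long/soft
    angleServo.write(map(ang, 0, 255, 45, 135)); // lean one way -> the other
  }
}
```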

The post This haptic device lets you feel the hair of virtual reality animals appeared first on Arduino Blog.

A few decades ago you might have been satisfied with a crude wireframe flight simulator or driving a race car with the WASD keys. Today, gamers expect more realism, and [600,000 milliliters] is no different. At first, he upgraded his race car driving chair and put on VR goggles. But watching the world whiz by in VR isn’t the same when you can’t feel the wind on your face. Armed with a 3D printer, some software, and some repurposed PC fans, he can now feel the real wind in virtual reality. You can see the build in the video below.

The electronics are relatively straightforward and there is already software available. The key, though, is the giant 3D-printed ducts that direct the airflow. These are big prints, so they won’t fit on every printer, but printers are getting bigger every day. The fan parts are from Thingiverse, but the enclosures are custom and you can download them from the blog post.

It turns out cheap motor controllers are sometimes DOA, so it apparently took a few iterations to find one that would drive both motors to 100%. An Arduino works as a bridge between the PC and the fans. How does it work? Looks windy to us.
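The bridging job is about as simple as Arduino firmware gets. As a hypothetical example, assuming the PC software sends a single wind-speed byte per update, the sketch only needs to copy that value to the fans' PWM pins:

```cpp
// Illustrative bridge: read a wind-speed value from the PC over serial
// and drive two fans with PWM through a motor controller. Pins and the
// one-byte protocol are assumptions, not the actual build's firmware.
const int FAN_LEFT  = 5; // PWM-capable pins
const int FAN_RIGHT = 6;

void setup() {
  pinMode(FAN_LEFT, OUTPUT);
  pinMode(FAN_RIGHT, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    uint8_t speed = Serial.read(); // 0 = still air, 255 = full blast
    analogWrite(FAN_LEFT, speed);
    analogWrite(FAN_RIGHT, speed);
  }
}
```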

The next project is haptics and we’ve seen that before. If you want to build a cockpit on the cheap, check out the junkyard.

When using a virtual reality (VR) system, you may need to flip a switch, touch a button, etc., all of which can be represented by a carefully coordinated series of pixels in front of your eyes. As a physical alternative — or augmentation — researchers at the National Chiao Tung University in Hsinchu, Taiwan have developed a system of interchangeable physical control panels, called FaceWidgets, that reside on the backside of the head-mounted unit itself.

When a wearer places their palm near their face (and headset), this is sensed and an on-screen canvas appears depending on the application. They can then manipulate these widgets both physically and in the virtual world to control the experience. 

Physical interactions are detected with the help of an Arduino Mega, and the facial control pad even extends and retracts for optimal positioning via a motor shield and stepper motors.
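The extend/retract behavior might look something like the following sketch, which uses the standard Stepper library and a generic analog proximity reading as the palm detector. The threshold, step counts, and pins are all assumptions rather than details from the paper:

```cpp
// Illustrative extend/retract logic: when the palm is detected near the
// face, run the stepper to extend the control pad; retract otherwise.
// The proximity threshold, step counts, and pins are assumptions.
#include <Stepper.h>

const int STEPS_PER_REV = 200;
Stepper panelStepper(STEPS_PER_REV, 8, 9, 10, 11); // via motor shield pins

const int PROXIMITY_PIN = A0; // analog proximity sensor reading
bool extended = false;

void setup() {
  panelStepper.setSpeed(60); // RPM
}

void loop() {
  bool palmNear = analogRead(PROXIMITY_PIN) > 600; // tune this threshold
  if (palmNear && !extended) {
    panelStepper.step(400);  // extend toward the user's hand
    extended = true;
  } else if (!palmNear && extended) {
    panelStepper.step(-400); // retract flush with the headset
    extended = false;
  }
}
```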

We present FaceWidgets, a device integrated with the backside of a head-mounted display (HMD) that enables tangible interactions using physical controls. To allow for near range-to-eye interactions, our first study suggested displaying the virtual widgets at 20 cm from the eye positions, which is 9 cm from the HMD backside. We propose two novel interactions, widget canvas and palm-facing gesture, that can help users avoid double vision and allow them to access the interface as needed. Our second study showed that displaying a hand reference improved performance of face widgets interactions. We developed two applications of FaceWidgets, a fixed-layout 360 video player and a contextual input for smart home control. Finally, we compared four hand visualizations against the two applications in an exploratory study. Participants considered the transparent hand as the most suitable and responded positively to our system.

One of the more interesting ideas being experimented with in VR is 1:1 mapping of virtual and real-world objects, so that virtual representations can be physically interacted with in a normal way. Tinker Pilot is a VR spaceship simulator project by [LLUÍS and JAVI] that takes this idea and runs with it, aiming for the ability to map a cockpit’s joysticks, switches, and other hardware to real-world representations. What does that mean? It means a virtual cockpit with flight sticks, levers, and switches that have working physical versions that actually exist exactly where they appear to be.

A few things about the project design caught our eye. One is the serial communications protocol intended to interface easily with microcontrollers, allowing for feedback between the program and any custom peripherals. (By the way, this is the same approach Kerbal Space Program took with KSPSerialIO, which enables custom mission control hardware at whatever level of complexity a user may wish to implement.)
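Tinker Pilot's actual protocol isn't spelled out here, but the general pattern for this kind of peripheral is familiar: stream control states in small framed packets and listen for feedback bytes in return. This sketch is purely hypothetical; the frame marker, message layout, and pins are invented for illustration:

```cpp
// Hypothetical controller peripheral: stream switch and axis states to
// the simulator and accept feedback bytes (here, an indicator LED).
// The framing, pins, and message codes are invented for illustration;
// Tinker Pilot's real protocol may differ.
const int GEAR_SWITCH = 2;
const int THROTTLE    = A0;
const int STATUS_LED  = 13;

void setup() {
  pinMode(GEAR_SWITCH, INPUT_PULLUP);
  pinMode(STATUS_LED, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // Outgoing frame: 0xA5 marker, switch byte, throttle high/low bytes
  int throttle = analogRead(THROTTLE);
  Serial.write(0xA5);
  Serial.write(digitalRead(GEAR_SWITCH) == LOW ? 1 : 0);
  Serial.write(throttle >> 8);
  Serial.write(throttle & 0xFF);

  // Incoming feedback: any nonzero byte lights the status LED
  if (Serial.available()) {
    digitalWrite(STATUS_LED, Serial.read() ? HIGH : LOW);
  }
  delay(20);
}
```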

The possibilities are demonstrated starting around 1:09 in the teaser trailer (embedded below) in which a custom controller is drawn up in CAD, then 3D-printed and attached to an Arduino, and finally the 3D model is imported into the cockpit as a 1:1 representation of the actual working unit, with visual positional feedback.

Unlike the chair experiment we saw, which attached a Vive Tracker to a chair, there is no indication that Tinker Pilot needs positional trackers on individual controls. In a cockpit layout, controls can reasonably be expected to remain in fixed positions relative to the cockpit, meaning that they can be set up as 1:1 representations of a physical layout and otherwise left alone. The kind of experimentation that is available today even to individual developers or small teams is remarkable, and it’s fascinating to see these ideas being explored.

As seen here, “Standard controllers for virtual reality (VR) lack sophisticated means to convey realistic, kinesthetic impression on size, resistance or inertia.” To overcome these limitations, André Zenner and Antonio Krüger at the German Research Center for Artificial Intelligence (DFKI) have come up with Drag:on — a haptic feedback device that changes air resistance and weight distribution using a commercially available hand fan.

Drag:on uses a pair of MG996R servos to actuate the fan, shifting its weight and air resistance as needed to simulate a virtual environment. The assembly is attached to an HTC Vive tracker, and an Arduino Nano provides control and computer interface via a USB serial link.
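Opening and closing the fan is the whole actuation story, so the control sketch can be small. In this hypothetical version, the VR side sends a single byte for the target surface area and the two servos mirror each other; the pins, angle ranges, and protocol are assumptions:

```cpp
// Illustrative Drag:on-style control: open or close the hand fan with
// two MG996R servos based on a target surface area received over
// serial. Pin numbers, angle ranges, and the protocol are assumptions.
#include <Servo.h>

Servo leftFan;
Servo rightFan;

void setup() {
  leftFan.attach(9);
  rightFan.attach(10);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    uint8_t area = Serial.read(); // 0 = folded flat, 255 = fully fanned out
    int angle = map(area, 0, 255, 0, 170);
    leftFan.write(angle);         // the two servos mirror each other
    rightFan.write(170 - angle);
  }
}
```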

Drag:on leverages the airflow occurring at the controller during interaction. By dynamically adjusting its surface area, the controller changes the drag and rotational inertia felt by the user. In a user study, we found that Drag:on can provide distinguishable levels of haptic feedback. Our prototype increases the haptic realism in VR compared to standard controllers and when rotated or swung improves the perception of virtual resistance. By this, Drag:on provides haptic feedback suitable for rendering different virtual mechanical resistances, virtual gas streams, and virtual objects differing in scale, material and fill state.


More details on the project can be found in the researchers’ paper here.


