
Planet Arduino

Archive for the ‘vr’ Category

In virtual reality, anything is possible, yet accurately modeling real-world objects in a digital space remains a huge challenge due to the lack of the weight and feedback that physical objects provide. Inspired by working with digital cameras and the inherent imperfection they bring to their videos, Bas van Seeters has developed a rig that translates the feeling of a camera into VR with only a few components.

The project began with a salvaged Panasonic MS70 VHS camcorder, chosen for its spacious interior and easily adjustable wiring. An Arduino UNO Rev3 was then connected to the camera’s start/stop recording button, as well as an indicator light and a potentiometer for changing the in-game focus. The UNO is responsible for reading the inputs and writing the data to USB serial so that a Unity plugin can apply the correct effects. Van Seeters even included a two-position switch for selecting between wide and telescopic fields of view.
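Van Seeters’ actual firmware isn’t reproduced in the post, but the input-reading side could look something like this minimal sketch. The pin assignments and the CSV-over-serial message format here are illustrative assumptions, not details from the project:

```cpp
// Hypothetical sketch: read the camcorder's repurposed controls and
// stream their state over USB serial for a Unity plugin to parse.
// Pin assignments and message format are illustrative, not from the project.

const int RECORD_BUTTON_PIN = 2;   // start/stop recording button
const int RECORD_LED_PIN    = 3;   // tally light indicator
const int ZOOM_SWITCH_PIN   = 4;   // wide/telescopic two-position switch
const int FOCUS_POT_PIN     = A0;  // focus ring potentiometer

bool recording = false;
bool lastButton = HIGH;

void setup() {
  pinMode(RECORD_BUTTON_PIN, INPUT_PULLUP);
  pinMode(ZOOM_SWITCH_PIN, INPUT_PULLUP);
  pinMode(RECORD_LED_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // Toggle recording on a falling edge of the button.
  bool button = digitalRead(RECORD_BUTTON_PIN);
  if (lastButton == HIGH && button == LOW) {
    recording = !recording;
    digitalWrite(RECORD_LED_PIN, recording ? HIGH : LOW);
  }
  lastButton = button;

  int focus = analogRead(FOCUS_POT_PIN);            // 0..1023
  int tele  = digitalRead(ZOOM_SWITCH_PIN) == LOW;  // 1 = telescopic

  // One CSV line per update: recording,focus,fov
  Serial.print(recording); Serial.print(',');
  Serial.print(focus);     Serial.print(',');
  Serial.println(tele);

  delay(16);  // roughly 60 updates per second
}
```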

With the Arduino now sending data, the last step involved creating a virtual camcorder object in Unity and making it follow the movement of a controller in 3D space, thus allowing the player to track things in-game and capture videos. More details on the project can be found in van Seeters’ write-up here and in the video below!

The post Getting more realistic camera movements in VR with an Arduino appeared first on Arduino Blog.

When it comes to virtual reality, the visual technology is quite good and can convince our brains that we’re inside a digital 3D environment. Audio is also decent, thanks to established interaural 3D audio techniques. But current consumer VR setups fail to adequately incorporate our other senses, and that failure breaks the immersive illusion. To bring tongues into the mix, researchers from the University of Calgary and the University of Maine have developed TactTongue, which provides on-demand electrotactile stimulation.

TactTongue creates both physical, tactile sensations and some measure of taste. Both are the result of electrical pulses delivered through electrodes on a mouthpiece worn over the tongue. That matrix of electrodes covers much of the tongue’s surface, adding a spatial dimension to the stimulation. The user can, for example, feel a pulse moving from the tip of their tongue to the back-right portion of it. TactTongue can also generate a sensation of taste (salt in particular), thanks to the physiological mechanism by which our tongues perceive saltiness through ion transfer.

The TactTongue hardware, implemented as a shield for Arduino UNO Rev3 boards, makes it possible to direct electrical pulses to specific parts of the tongue. The shield passes current down a flexible ribbon cable to electrode pads embedded in the cable.

The team behind TactTongue developed toolkits for prototyping “haptic experiences” with this technology. Those toolkits control which electrodes are active at any given time, as well as the pulse intensity and waveform patterns. Haptic experiences can be tied to specific software events, in both VR games and conventional software, to offer feedback or direction to users.
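The shield’s actual pin mapping and driver electronics aren’t described here, so the following sketch is only a rough illustration of the spatial idea: sweeping a short pulse across a hypothetical row of electrode driver pins, from the tip of the tongue toward the back.

```cpp
// Illustrative only: the real TactTongue shield's interface isn't public
// in this post, so this sketch just shows the general idea of sweeping a
// pulse along the tongue with a controllable intensity.

const int NUM_ELECTRODES = 6;
const int electrodePins[NUM_ELECTRODES] = {3, 5, 6, 9, 10, 11};  // PWM pins

void setup() {
  for (int i = 0; i < NUM_ELECTRODES; i++) {
    pinMode(electrodePins[i], OUTPUT);
  }
}

// Drive one electrode site with a pulse at a given intensity (0-255).
void pulseSite(int electrode, int intensity, int durationMs) {
  analogWrite(electrodePins[electrode], intensity);
  delay(durationMs);
  analogWrite(electrodePins[electrode], 0);
}

void loop() {
  // Sweep front-to-back: the user feels the pulse travel along the tongue.
  for (int i = 0; i < NUM_ELECTRODES; i++) {
    pulseSite(i, 180, 120);  // fixed intensity, 120 ms per site
    delay(60);               // gap between sites
  }
  delay(1000);
}
```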

The post Bringing tongue stimulation into the VR experience with TactTongue appeared first on Arduino Blog.

Virtual reality (VR) technology has improved dramatically in recent years, and there are now a number of VR headsets on the market that provide high-quality visual immersion. But VR systems still struggle to stimulate our other senses. When you can’t feel the virtual objects that you can see, the immersion falls apart. That’s why an international team of researchers has developed GuideBand, an arm-mounted contraption that physically guides players within VR.

This device looks a bit like an external fixation apparatus for securing broken bones. It straps onto the user’s arm and has three motors controlled by an Arduino Mega via TB6612FNG motor drivers. The first motor moves the device’s gantry radially around the user’s arm. The second motor adjusts the angle of attack, offset perpendicularly from the forearm. The third motor acts as a winch and pulls a cable attached to a strap on the user’s arm.
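The paper’s firmware isn’t included in this summary, but driving a single motor channel through a TB6612FNG follows a standard pattern. Here is a minimal, hypothetical sketch for the winch motor, with illustrative pin choices and timings:

```cpp
// Hedged sketch of one TB6612FNG channel, as GuideBand might drive its
// winch motor. Pin choices are illustrative; the actual wiring may differ.

const int AIN1 = 7;   // direction pin 1
const int AIN2 = 8;   // direction pin 2
const int PWMA = 9;   // speed (PWM)
const int STBY = 10;  // driver standby (HIGH = enabled)

void setup() {
  pinMode(AIN1, OUTPUT);
  pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT);
  pinMode(STBY, OUTPUT);
  digitalWrite(STBY, HIGH);
}

// Positive speed winds the cable in (pull); negative pays it out.
void setWinch(int speed) {
  digitalWrite(AIN1, speed >= 0 ? HIGH : LOW);
  digitalWrite(AIN2, speed >= 0 ? LOW : HIGH);
  analogWrite(PWMA, constrain(abs(speed), 0, 255));
}

void loop() {
  setWinch(200);   // tug the user's forearm
  delay(500);
  setWinch(-120);  // pay the cable back out
  delay(500);
  setWinch(0);     // slack
  delay(2000);
}
```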

The unique layout of GuideBand lets it impart the feeling of pulling on the user’s forearm, like a parent tugging their child through a grocery store. That guidance could correspond directly to action in the virtual world, such as an NPC (non-player character) pulling the player out of the way of danger. Or it can provide more subtle direction, like a game tutorial demonstrating how the player should move to interact with a virtual object.

As with many other VR haptic feedback systems, GuideBand is highly experimental and we don’t expect to see it on the market anytime soon. But it is still an interesting solution to a specific problem with virtual reality.

You can read the team’s published paper here.

The post This arm-mounted contraption provides guidance in VR appeared first on Arduino Blog.

If you want a virtual reality headset for your computer, but don’t want to dig deep into your pockets, this project by “jamesvdberg” (AKA Killer Robotics) presents a low-cost alternative. 

Although it won’t pack the capabilities of an Oculus or HTC Vive, jamesvdberg’s VR rig can be replicated for just $80 using a Google Cardboard-compatible shell, along with a 5” 800×480 Raspberry Pi LCD screen and an Arduino Micro for control.

The DIY device tracks head movements using an MPU6050 IMU, with the Micro sending the data to a PC as mouse input. Game visuals are fed back to the screen over HDMI, split into discrete images for each eye to create a side-by-side 3D effect.
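As a rough sketch of that head-tracking pipeline (the register addresses are standard MPU6050 ones, but the scaling and axis mapping are guesses rather than the project’s values), an ATmega32U4-based Micro can read the gyro over I2C and emit relative mouse motion through its native USB HID:

```cpp
// Simplified version of the head-tracking idea: read gyro rates from an
// MPU6050 over I2C and emit relative mouse motion from the Micro's native
// USB HID. Scaling and axis mapping are assumptions, not the project's.

#include <Wire.h>
#include <Mouse.h>

const int MPU_ADDR = 0x68;

int16_t readWord(uint8_t reg) {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(reg);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 2, true);
  return (Wire.read() << 8) | Wire.read();
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);  // PWR_MGMT_1: wake the sensor from sleep
  Wire.write(0);
  Wire.endTransmission(true);
  Mouse.begin();
}

void loop() {
  int16_t gyroX = readWord(0x43);  // axis mapping depends on mounting
  int16_t gyroZ = readWord(0x47);

  // Convert angular rate into small relative mouse steps.
  Mouse.move(gyroZ / 500, gyroX / 500);
  delay(10);
}
```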

Those interested in building their own version can find the tutorial here.  

If you’ve ever used a VR system and thought that what was really missing was the feeling of being hit in the face, then a team of researchers at National Taiwan University may have just the solution.

ElastImpact takes the form of a head-mounted display with two impact drivers situated roughly parallel to the wearer’s eyes for normal (straight-on) impacts, plus a third that rotates about the front of the face for side blows.

Each impact driver first stretches an elastic band using a gearmotor, then releases it with a micro servo when an impact is required. The system is controlled by an Arduino Mega, along with a pair of TB6612FNG motor drivers. 
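A single impactor’s wind-then-release cycle might look like the following hypothetical sketch; the pins, servo angles, and timings are assumptions for illustration, not values from the paper:

```cpp
// Sketch of the wind-then-release cycle described above. The servo acts as
// a brake holding the stretched elastic; pins, angles, and timings are
// illustrative assumptions rather than the paper's values.

#include <Servo.h>

const int AIN1 = 7, AIN2 = 8, PWMA = 9, STBY = 10;  // TB6612FNG channel A
const int BRAKE_SERVO_PIN = 5;
const int BRAKE_HOLD = 20, BRAKE_RELEASE = 90;      // servo angles (deg)

Servo brake;

void setup() {
  pinMode(AIN1, OUTPUT); pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT); pinMode(STBY, OUTPUT);
  digitalWrite(STBY, HIGH);
  brake.attach(BRAKE_SERVO_PIN);
  brake.write(BRAKE_HOLD);
}

// Run the gearmotor to stretch the band against the engaged brake.
void windElastic(int ms) {
  digitalWrite(AIN1, HIGH); digitalWrite(AIN2, LOW);
  analogWrite(PWMA, 255);
  delay(ms);
  analogWrite(PWMA, 0);
}

void loop() {
  brake.write(BRAKE_HOLD);    // engage the brake
  delay(300);
  windElastic(1500);          // store impact energy in the band
  delay(500);
  brake.write(BRAKE_RELEASE); // let go: the band snaps, delivering the hit
  delay(3000);
}
```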

Impact is a common effect in both daily life and virtual reality (VR) experiences, e.g., being punched, hit, or bumped. Impact force is produced instantly, which distinguishes it from other force feedback, e.g., push and pull. We propose ElastImpact to provide 2.5D instant impact on a head-mounted display (HMD) for realistic and versatile VR experiences. ElastImpact consists of three impact devices, also called impactors. Each impactor blocks an elastic band with a mechanical brake driven by a servo motor, and stretches the band with a DC motor to store the impact power. When the brake is released, the impact is delivered instantly. Two impactors are affixed to both sides of the head and connected to the HMD to provide normal-direction impact toward the face (i.e., 0.5D in the z-axis). The third impactor is connected to a proxy collider in a barrel in front of the HMD and rotated by a DC motor in the tangential plane of the face to provide 2D impact (i.e., the xy-plane). Through a just-noticeable difference (JND) study, we characterize how well users can distinguish impact forces on the head in the normal direction and in the tangential plane, separately. Based on the results, we combine normal and tangential impact into 2.5D impact, and perform a VR experience study to verify that the proposed 2.5D impact significantly enhances realism.

VR environments are meant to be immersive, but if you’ve ever thought that what was missing was actually being pummeled by robotic fists, then James Bruton’s newest project could be just the thing.

Bruton recently teamed up with students from Portsmouth University to build a robot that works in the real world, and coordinates its movements with a virtual setting displayed on the human’s headset.

The robot itself is controlled by an Arduino Mega, and features a differential (tank) drive with encoders for feedback. Shoulders can tilt from left to right, and the actual punching motion is handled by pneumatic actuators built from modified bicycle pumps. Robo-fists are covered by boxing gloves to keep humans relatively safe, and flesh-based competitors are given a small shield and sword-bat with which to fight back!
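Bruton’s code isn’t shown in the post, but the encoder-feedback portion of a differential drive can be sketched roughly as follows, with hypothetical pins and a made-up proportional gain:

```cpp
// Rough sketch of encoder-feedback differential drive as described: count
// encoder ticks in interrupts and trim each side's PWM so the robot tracks
// straight. Pins and gains are hypothetical, not from Bruton's code.

volatile long leftTicks = 0, rightTicks = 0;

const int LEFT_PWM = 5, RIGHT_PWM = 6;  // motor driver speed inputs
const int LEFT_ENC = 2, RIGHT_ENC = 3;  // external-interrupt pins

void leftISR()  { leftTicks++; }
void rightISR() { rightTicks++; }

void setup() {
  pinMode(LEFT_PWM, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(LEFT_ENC), leftISR, RISING);
  attachInterrupt(digitalPinToInterrupt(RIGHT_ENC), rightISR, RISING);
}

void loop() {
  // Simple proportional correction: if one wheel has travelled further,
  // slow it down slightly so both sides stay in step.
  noInterrupts();
  long error = leftTicks - rightTicks;
  interrupts();

  int base = 180;                            // nominal forward speed
  int trim = constrain(error * 2, -50, 50);  // made-up gain and limits
  analogWrite(LEFT_PWM, constrain(base - trim, 0, 255));
  analogWrite(RIGHT_PWM, constrain(base + trim, 0, 255));
  delay(20);
}
```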

I worked on this project with final-year degree students in Computer Games Technology at Portsmouth University’s CCI faculty. The robot hardware is controlled over a serial interface; the team built a VR game which controls the robot, so when you get hit in VR you get hit in real life! The robot is tracked back into VR with Vive trackers so it stays in sync.

Haptic feedback is commonly used with handheld controllers and the like. In a virtual reality environment, however, it could also be applied to the other interface surface attached to your body: the VR headset itself.

That’s the idea behind FacePush, which employs an Arduino Uno-powered pulley system to place tension on the straps of an HTC Vive headset. The wearer feels a corresponding pushing force through the headset, creating yet another way to help immerse users in a virtual world.
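The paper’s control scheme isn’t detailed here, but one plausible minimal arrangement has the game send tension commands over serial for the Uno to translate into pulley motor output. The “F,<value>” command format and pin choices below are invented for illustration:

```cpp
// Hedged sketch of the FacePush idea: the game sends a force value over
// serial and the Uno drives a pulley motor to tension the headset straps.
// The command format and motor interface are assumptions for illustration.

const int MOTOR_PWM = 9;
const int MOTOR_DIR = 8;

void setup() {
  pinMode(MOTOR_PWM, OUTPUT);
  pinMode(MOTOR_DIR, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // Expect lines like "F,128\n" where 0-255 sets strap tension.
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    if (line.startsWith("F,")) {
      int force = constrain(line.substring(2).toInt(), 0, 255);
      digitalWrite(MOTOR_DIR, HIGH);  // wind in = tighten straps
      analogWrite(MOTOR_PWM, force);  // 0 releases the tension
    }
  }
}
```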

Applications tried so far include a boxing game, a dive simulator, and 360-degree guidance. You can check it out in a short demo below, and read more about it in the full research paper here.

While touchscreens are nice, wouldn’t it be even better if you could simply wave your hand at your computer to get it to do what you want? That’s the idea behind this Iron Man-inspired gesture control device by B. Aswinth Raj.

The DIY system uses an Arduino Nano mounted to a disposable glove, along with Hall effect sensors, a magnet attached to the thumb, and a Bluetooth module. This smart glove uses the finger-mounted sensors as left and right mouse buttons, and has a blue circle in the middle of the palm that the computer can track via a webcam and a Processing sketch to generate a cursor position.
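Because the Nano lacks native USB HID support, the glove presumably reports its sensor states to the PC over the Bluetooth link. A minimal sketch along those lines, with assumed pins and a made-up one-character protocol, could be:

```cpp
// Illustrative glove firmware: digital Hall effect sensors act as mouse
// buttons and a serial Bluetooth module (HC-05-style) relays the events to
// the Processing sketch on the PC. Pins and protocol are assumptions.

#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);  // RX, TX to the Bluetooth module

const int LEFT_HALL  = 2;   // index finger sensor
const int RIGHT_HALL = 3;   // middle finger sensor

void setup() {
  pinMode(LEFT_HALL, INPUT_PULLUP);
  pinMode(RIGHT_HALL, INPUT_PULLUP);
  bt.begin(9600);
}

void loop() {
  // A sensor pulls low when the thumb magnet comes close.
  bool left  = digitalRead(LEFT_HALL) == LOW;
  bool right = digitalRead(RIGHT_HALL) == LOW;

  // One character per state; the PC side maps these to mouse clicks.
  if (left)       bt.println('L');
  else if (right) bt.println('R');
  else            bt.println('N');

  delay(30);
}
```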

You can see it demonstrated in the video below, drawing a stick man literally by hand, and also controlling an LED on the Nano. Check out this write-up for code and more info on the build!

[Florian] has been putting a lot of work into VR controllers that can be used without interfering with a regular mouse + keyboard combination, and his most recent work has opened the door to successfully emulating a Vive VR controller in Steam VR. He uses Arduino-based custom hardware on the hand, a Leap Motion controller, and fuses the data in software.

We’ve seen [Florian]’s work before in successfully combining a Leap Motion with additional hardware sensors. The idea is to compensate for the fact that the Leap Motion sensor is not very good at detecting some types of movement, such as tilting a fist towards or away from yourself — a movement similar to aiming a gun up or down. At the same time, an important goal is for any added hardware to leave fingers and hands free.
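One way a hand-mounted sensor could fill that gap is by streaming a gravity-referenced pitch estimate for the PC-side script (such as FreePIE) to fuse with the Leap Motion data. This is an illustration of the idea rather than [Florian]’s actual firmware:

```cpp
// Sketch of the kind of data the hand-mounted hardware could supply: a
// pitch estimate from an MPU6050 accelerometer, streamed over serial for
// the fusion script on the PC. Details here are illustrative guesses.

#include <Wire.h>

const int MPU_ADDR = 0x68;

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);  // PWR_MGMT_1: wake the sensor from sleep
  Wire.write(0);
  Wire.endTransmission(true);
  Serial.begin(115200);
}

void loop() {
  // Read the three accelerometer axes starting at register 0x3B.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);
  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();

  // Gravity-referenced pitch: the "aiming a gun up or down" axis the
  // Leap Motion struggles with.
  float pitch = atan2(-ax, sqrt((float)ay * ay + (float)az * az)) * 180.0 / PI;
  Serial.println(pitch);
  delay(20);
}
```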

[Florian]’s DIY VR hand controls emulate the HTC Vive controllers in Valve’s Steam VR Tracking with a software chain that works with his custom hardware. His DIY controller doesn’t need to be actively held because by design it grips the hand, leaving fingers free to do other tasks like typing or gesturing.

Last time we saw [Florian]’s work, development was still heavy and there wasn’t any source code shared, but there’s now a git repository for the project with everything you’d need to join the fun. He adds, “I see a lot of people with Wii nunchucks looking to do this. With a few edits to my FreePIE script, they should easily be able to enable whatever buttons/orientation data they want.”

We have DIY hardware emulating Vive controllers in software, and we’ve seen interfacing to the Vive’s Lighthouse hardware with DIY electronics. There’s a lot of hacking around going on in this area, and it’s exciting to see what comes next.


Filed under: Arduino Hacks, Virtual Reality


