
Planet Arduino

Archive for the ‘Virtual Reality’ Category

VR environments are meant to be immersive, but if you've ever felt that what was missing was actually being pummeled by robotic fists, then James Bruton's newest project could be just the thing.

Bruton recently teamed up with students from Portsmouth University to build a robot that works in the real world, and coordinates its movements with a virtual setting displayed on the human’s headset.

The robot itself is controlled by an Arduino Mega, and features a differential (tank) drive with encoders for feedback. Shoulders can tilt from left to right, and the actual punching motion is handled by pneumatic actuators built from modified bicycle pumps. Robo-fists are covered by boxing gloves to keep humans relatively safe, and flesh-based competitors are given a small shield and sword-bat with which to fight back!

I worked on this project with final-year degree students in Computer Games Technology at Portsmouth University's CCI faculty. The robot hardware is controlled over a serial interface, and the team built a VR game which controls the robot, so when you get hit in VR you get hit in real life! The robot is tracked back into VR with Vive trackers so it stays in sync.
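
Neither the firmware nor the exact serial protocol is published, but a minimal sketch along these lines shows how an Arduino Mega could map single-character serial commands from the VR game onto the tank drive and a punch valve. The pin numbers and command set here are assumptions for illustration, not Bruton's actual code.

```cpp
// Hypothetical sketch: drive a differential (tank) base and fire a
// pneumatic punch valve from single-character serial commands.
// Pin assignments and the command set are illustrative only.
const int LEFT_PWM = 4, LEFT_DIR = 5;
const int RIGHT_PWM = 6, RIGHT_DIR = 7;
const int PUNCH_VALVE = 8;

void setup() {
  Serial.begin(115200);            // link to the VR game running on the PC
  pinMode(LEFT_DIR, OUTPUT);
  pinMode(RIGHT_DIR, OUTPUT);
  pinMode(PUNCH_VALVE, OUTPUT);
}

void drive(int left, int right) {  // -255..255 per side
  digitalWrite(LEFT_DIR, left >= 0);
  digitalWrite(RIGHT_DIR, right >= 0);
  analogWrite(LEFT_PWM, abs(left));
  analogWrite(RIGHT_PWM, abs(right));
}

void loop() {
  if (Serial.available()) {
    switch (Serial.read()) {
      case 'F': drive(200, 200);   break;  // forward
      case 'B': drive(-200, -200); break;  // reverse
      case 'L': drive(-150, 150);  break;  // spin left
      case 'R': drive(150, -150);  break;  // spin right
      case 'S': drive(0, 0);       break;  // stop
      case 'P':                            // throw a punch
        digitalWrite(PUNCH_VALVE, HIGH);
        delay(150);
        digitalWrite(PUNCH_VALVE, LOW);
        break;
    }
  }
}
```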

While you’ve been hearing about virtual reality for the last 20 years or so, the hardware required to build such a rig is finally within the reach of consumers. As seen here, Relativty is a SteamVR-compatible headset that can be made for around $100.

Relativty uses a 3D-printed frame to house its 2560 x 1440 LCD screen, along with a pair of 80mm Fresnel lenses to properly focus the image. Control is accomplished via an Arduino Due and an MPU-6050 accelerometer, which feeds head-tracking info to an external gaming system. 
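
Relativty's own firmware handles the sensor fusion, but the tracking side boils down to reading the MPU-6050 over I2C and streaming samples to the host. The register addresses below are the standard MPU-6050 ones; the CSV output format is an assumption for illustration, not Relativty's actual protocol.

```cpp
// Illustrative sketch only: stream raw MPU-6050 samples to the host.
// The real Relativty firmware fuses on-board; this just shows the I2C plumbing.
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;   // default MPU-6050 I2C address

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1 register
  Wire.write(0);                 // wake the sensor up
  Wire.endTransmission();
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);              // start of accelerometer data registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);   // accel, temp, gyro: 14 bytes

  int16_t ax = Wire.read() << 8 | Wire.read();
  int16_t ay = Wire.read() << 8 | Wire.read();
  int16_t az = Wire.read() << 8 | Wire.read();
  Wire.read(); Wire.read();      // skip temperature
  int16_t gx = Wire.read() << 8 | Wire.read();
  int16_t gy = Wire.read() << 8 | Wire.read();
  int16_t gz = Wire.read() << 8 | Wire.read();

  // Simple CSV output a host-side tracker could parse.
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.print(az); Serial.print(',');
  Serial.print(gx); Serial.print(',');
  Serial.print(gy); Serial.print(',');
  Serial.println(gz);
  delay(2);
}
```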

At this point, the device is clean though fairly basic, and will hopefully be the start of a truly excellent open source project as features are added.

Haptic feedback is something commonly used with handheld controllers and the like. However, in a virtual reality environment, it could also be used with the other interface surface attached to your body: the VR headset itself.

That’s the idea behind FacePush, which employs an Arduino Uno-powered pulley system to place tension on the straps of an HTC Vive headset. A corresponding pushing force is felt by the wearer through the headset in response to this action, creating yet another way to help immerse users in a virtual world. 

Applications tried so far include a boxing game, a dive simulator, and 360-degree guidance. You can check it out in a short demo below, and read more about it in the full research paper here.
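
The paper describes motor-driven pulleys; purely as a sketch of the idea, here is how an Uno could tension the straps with two hobby servos in response to a one-byte force value from the game. The servo pins and the serial protocol are assumptions, not the authors' implementation.

```cpp
// Rough sketch of the FacePush idea: tighten headset straps with two servos
// when the game sends a force value over serial. Not the authors' actual code.
#include <Servo.h>

Servo leftPulley, rightPulley;

void setup() {
  Serial.begin(9600);
  leftPulley.attach(9);
  rightPulley.attach(10);
  leftPulley.write(0);     // slack position
  rightPulley.write(0);
}

void loop() {
  if (Serial.available()) {
    // One byte 0..255 from the game, mapped to servo travel 0..90 degrees.
    int force = Serial.read();
    int angle = map(force, 0, 255, 0, 90);
    leftPulley.write(angle);
    rightPulley.write(angle);
  }
}
```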

Like a lot of 16-year-olds, [Maxime Coutée] wanted an Oculus Rift. Unlike a lot of 16-year-olds, [Maxime] and friends [Gabriel] and [Jonas] built one themselves for about a hundred bucks and posted it on GitHub. We’ll admit that at 16 we weren’t throwing around words like quaternions and antiderivatives, so we were duly impressed.

Before you assume this is just a box to put a phone in like a Google Cardboard, take a look at the bill of materials: an Arduino Due, a 2K LCD screen, a Fresnel lens, and an accelerometer/gyro. The team notes that the screen is what will push the price unpredictably, but they got by for about a hundred euro. At the current exchange rate, if you add up all the parts, they went a little over $100, but they were still under $150 assuming you have a 3D printer to print the mechanical parts.

The system uses two custom libraries that you could use even if you wanted a slightly different project. FastVR creates 3D virtual reality using Unity and WRMHL allows Unity to communicate with an Arduino. Both of these are on the team’s GitHub page, as well.

There was one other member of the team, their math teacher [Jerome Dieudonne] who they call [Sensei]. According to [Jerome] he is “… the theoretician of the team. I teach them math and I help them solving algorithm issues.” He must be very proud and we always applaud when someone takes the time to share what they know with students.

We don’t know what’s next for this group, but we will be keeping an eye on them. Maybe they’ll work on smell-o-vision next.

As reported by the Creative Applications Network, “Tangibles Worlds explores the effects of tactile experience as a catalyst for full immersion in VR.”

The project by Stella Speziali takes the form of three separate boxes, along with an Oculus Rift headset. When a hand is placed in one of these boxes, the user is virtually transported to another dimension of sight and sound, controlled by IR distance sensors, flex sensors, capacitive wire, and several other devices interfaced with an Arduino Mega.

Each box contains an IR distance sensor, which detects when a hand is inserted and displays the virtual world attributed to the box. This new virtual world surrounds the user. A sensor is placed on each wall within the boxes; it recognizes the hand and activates an animation inside the virtual world. I tried to map the sensors in the virtual universe so that a little clue is given to the user, leading him to trigger the animations.

The idea behind this installation is to go beyond “traditional” VR controllers for an entirely new level of interaction. The video seen here gives an excellent preview of the strangeness of this type of interface, though using it with a headset and sensors would likely be an altogether different experience!
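
The write-up doesn't include the firmware, but the detection side is easy to sketch: one analog IR distance sensor per box reports when a hand breaks a threshold, and the Mega tells the Unity/Oculus application which box is active. The pin mapping, threshold, and serial format below are guesses for illustration.

```cpp
// Illustrative sketch: report which box currently has a hand inside it.
// Analog pins, the 'hand present' threshold and the serial format are guesses.
const int NUM_BOXES = 3;
const int irPins[NUM_BOXES] = {A0, A1, A2};
const int HAND_THRESHOLD = 400;   // raw ADC value; tune per sensor

int activeBox = -1;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int detected = -1;
  for (int i = 0; i < NUM_BOXES; i++) {
    if (analogRead(irPins[i]) > HAND_THRESHOLD) {
      detected = i;
      break;
    }
  }
  if (detected != activeBox) {    // only report changes
    activeBox = detected;
    Serial.println(activeBox);    // -1 means no hand in any box
  }
  delay(20);
}
```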

It wasn’t too long ago that one could conjecture that most hackers are not avid video game players. We spend most of our free time taking things apart, tinkering with microcontrollers and reading the latest [Jenny List] article on Hackaday.com. When we do think of video games, our neurons generally fire in the direction of emulating a console on a single board computer, such as a Raspberry Pi or a Beaglebone. Or even emulating the actual console processor on an FPGA. Rarely do we venture off into 3D programs meant to make modern video games. If we can’t export an .STL with it, we’re not interested. It’s just not our bag.

Oculus Rift changed this. The VR headset was originally invented for 3D video games, but quickly became a darling to hackers the world over. Virtual Reality technology is far bigger than just video games, and brings opportunity to many fields such as real estate, construction, product visualization, education, social interaction… the list goes on and on.

The Oculus team got together with the folks over at Unity in the early days to make it easy for video game makers to create content for the Rift. Unity is a game engine designed with a shallow learning curve and is available for free for non-commercial use. The Oculus Rift can be integrated into a Unity environment by checking a setting and importing a small package, available on the Oculus site. This makes it easy for anyone interested in VR technology to get a Rift and start pumping out content.

Hackers have taken things a step further and have written scripts that allow Unity to communicate with an Arduino. VR is fun. But VR plus physical reality is just downright exciting! In this article, we’re going to walk you through setting up your Oculus Rift and Unity game engine to communicate with the outside world via an Arduino.

Off the Shelf Options

If you head over to the Unity Asset Store and run a search for Arduino, you get a few options. Sadly, searches for Raspberry Pi do not yield any fruit. There are a few generic serial communication options such as Simple Serial and SD Serial, but these options are not free and do not, at face value, appear to be well supported. Uniduino looks promising, but it’s thirty bucks and there’s not much activity on the forum. The obvious choice to play around with on a rainy day is ARDunity. There’s a free version that still has plenty of capability to experiment with, and it’s well supported and documented. It’s written in more of a WYSIWYG style that can be off-putting to coders, but it will have to suffice until someone bangs out a more advanced version.

Getting Started

We’re assuming that you already have Arduino and Oculus set up on your PC. If you don’t have an Oculus Rift, we recommend going to the Oculus site and installing the software anyway. It will allow you to test the Unity/Arduino communication through the Oculus runtime even if you don’t have the hardware. This way, when you do get a headset, you won’t have to do anything; just execute the .exe and you’ll see your work in VR. Note that Oculus used to prevent the software from installing on computers that did not meet the minimum requirements, but has since toned down this nuisance and now allows the software to be installed on most computers, including laptops!

Setting up Unity is straightforward – simply go to the site and download the installer. The latest version at the time of this article is 5.6.1. You’ll want to grab the Personal version – it’s about five gigabytes, so give it some time to download. Once installed, head over to the Oculus site and grab the Unity tools import package.

Open Unity and create a new project. Then head back to the Unity Asset Store and install the ARDunity Basic import package; it opens inside Unity in a tab called Asset Store. Click on the tab and then import the package. Then go to Edit–>Project Settings–>Player and set the API compatibility level to .NET 2.0. This will clear the error you see at the bottom of the screen. Then, under Other Settings, be sure to select the Virtual Reality Supported option.

Now import the Oculus package by going to Asset–>Import Package–>Custom Package and point towards the Oculus package you downloaded earlier. Restart the Unity program if needed.

Putting It All Together

At this point, everything should be set to get Unity talking to your Arduino through the Oculus Rift. Put an LED on D2 and note your comm port.  From the Project tab (lower left),  expand the folders ARDunity–>Examples–>LED–>Digital. Double click the last example – ReactingTrigger(DigitalLED).unity.
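
Before handing things over to Unity, it's worth confirming the LED and board are good on their own. ARDunity generates the real sketch in the next step; this throwaway blink test is just a sanity check of the wiring described above.

```cpp
// Quick sanity check before using the ARDunity-generated sketch:
// blink the LED wired to D2 so you know the wiring and board are good.
const int LED_PIN = 2;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  delay(500);
  digitalWrite(LED_PIN, LOW);
  delay(500);
}
```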

Opening that example will load it into the editor. You should now see two cubes in the view window, and a new folder called ARDunity under the Hierarchy tab on the top left. Click on ARDunity and you’ll see the Inspector tab (far right) change to include a button that says “Export Sketch”. Hit the button and it will export the Arduino sketch to a place of your choosing. The .cpp and .h files will be generated automatically, along with some other dependencies. I shouldn’t have to tell you to have a look around, but don’t forget to compile and upload it after you’ve had your fill.

View of scene after opening LED example

Just below the Export Sketch button will be the Comm Serial (Script) window. Search for and select the comm port for your Arduino. After you find your comm port, hit the little play button on top of the main scene window. Once you do this, you’ll see a “Connect” button appear where Export Sketch was. Go ahead and connect. Now click on the Scene tab above the main viewing window. This will allow you to manipulate the cubes. Go ahead and grab the white cube and run it into the green one. If you did everything right, you’ll see your LED light up when the two cubes collide.

The gears in your noggin should be turning right about now… if you can toggle I/O from within a game engine, there is some seriously cool stuff you can do! But we’ve only scratched the surface. Let’s get this working in Virtual Reality!

Enter the Rift

In order to enter the virtual world, you need a couple of things: a character controller and something to walk on. Hit the play button again if you haven’t already; this will disconnect everything and allow you to edit the world. Go to Game Object–>3D Object–>Plane and adjust the plane so the two cubes are hovering above it. Then un-collapse ARDunity and expand the OVR folder to OVR–>Prefabs. Select OVRPlayerController.prefab and drag it into the main hierarchy in the upper left. You’ll see the player controller appear in the main Scene window. Drag it away from the cubes, and then, under the Inspector tab on the right-hand side, de-select the Use Profile Data option in the OVR Player Controller (Script) section.

Now, select ARDunity under the Hierarchy tab, press play and then connect. If you have an Oculus Rift connected, you should be able to put it on, run into the green cube and light the LED on the Arduino. If you don’t have one, just use the arrow keys to do the same.

Conclusion

So where can you go with this? The biggest thing that jumps out to us is haptic feedback. Imagine instead of a cube, you have a wall or table or something of that nature. Have the Arduino trigger some type of feedback when you touch or bump into the object.
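
As a starting point, the Arduino side of that idea can stay very simple: listen for a "collision" byte from Unity and pulse a vibration motor. The pin choice and one-byte protocol below are made up for illustration.

```cpp
// Hypothetical haptic extension: buzz a vibration motor on D3 whenever
// Unity reports that the player has bumped into a virtual object.
const int MOTOR_PIN = 3;

void setup() {
  Serial.begin(9600);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() && Serial.read() == 'H') {  // 'H' = hit
    digitalWrite(MOTOR_PIN, HIGH);
    delay(100);                                      // short haptic pulse
    digitalWrite(MOTOR_PIN, LOW);
  }
}
```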

Dig into the code and let us know of any bugs or improvements. Be sure to follow some other examples as well, and check out the video below for a demonstration. If you’ve done anything cool with Unity, show us in the comments.

 


Filed under: Arduino Hacks, Featured, Interest, Virtual Reality

Whether you choose to control this vehicle with your mind or a joystick, the camera mounted on it will give you a new view of the world.

Maker “Imetomi” was inspired to create a tracked robot after he was able to salvage a camera off of a cheap drone. This became the basis of his FPV setup, which he fitted onto a little tracked vehicle. Although this would have been enough for most people, in addition to building a joystick-based controller, he also made it work with a brainwave headset.

Imetomi now has something that he can drive around virtually, spying on passersby, as long as it stays within the VR transmitter’s 50-meter range. Be sure to check out the video below, where the small bot shows off its impressive all-terrain capabilities, and read his Instructables write-up here.

 

[Florian] has been putting a lot of work into VR controllers that can be used without interfering with a regular mouse + keyboard combination, and his most recent work has opened the door to successfully emulating a Vive VR controller in Steam VR. He uses Arduino-based custom hardware on the hand, a Leap Motion controller, and fuses the data in software.

We’ve seen [Florian]’s work before in successfully combining a Leap Motion with additional hardware sensors. The idea is to compensate for the fact that the Leap Motion sensor is not very good at detecting some types of movement, such as tilting a fist towards or away from yourself — a movement similar to aiming a gun up or down. At the same time, an important goal is for any added hardware to leave fingers and hands free.
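
[Florian]'s actual hardware and fusion code live in his repository; purely to illustrate the kind of data the extra sensor contributes, a sketch like this could estimate the fist's pitch from an accelerometer and stream it for the PC-side fusion to use. The choice of an MPU-6050 and the output format are assumptions, not his design.

```cpp
// Illustrative only: estimate fist pitch from an MPU-6050 accelerometer and
// stream it so host-side fusion can fill the gap in the Leap Motion data.
#include <Wire.h>
#include <math.h>

const uint8_t MPU_ADDR = 0x68;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B); Wire.write(0);          // wake the sensor
  Wire.endTransmission();
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                         // accelerometer registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)6);
  int16_t ax = Wire.read() << 8 | Wire.read();
  int16_t ay = Wire.read() << 8 | Wire.read();
  int16_t az = Wire.read() << 8 | Wire.read();

  // Gravity-referenced pitch: the tilt the Leap Motion struggles to see.
  float pitch = atan2(-ax, sqrt((float)ay * ay + (float)az * az)) * 180.0 / PI;
  Serial.println(pitch);
  delay(10);
}
```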

[Florian]’s DIY VR hand controls emulate the HTC Vive controllers in Valve’s Steam VR Tracking with a software chain that works with his custom hardware. His DIY controller doesn’t need to be actively held, because by design it grips the hand, leaving fingers free to do other tasks like typing or gesturing.

Last time we saw [Florian]’s work, development was still heavy and there wasn’t any source code shared, but there’s now a git repository for the project with everything you’d need to join the fun. He adds that “I see a lot of people with Wii nunchucks looking to do this. With a few edits to my FreePIE script, they should easily be able to enable whatever buttons/orientation data they want.”

We have DIY hardware emulating Vive controllers in software, and we’ve seen interfacing to the Vive’s Lighthouse hardware with DIY electronics. There’s a lot of hacking around going on in this area, and it’s exciting to see what comes next.


Filed under: Arduino Hacks, Virtual Reality



Light graffiti, virtual reality, and motion control combine to make one amazing robotic sidekick.

Filmmaking has advanced in a staggering way over the last 50 or even 10 years, but what if you were to augment your filming rig with a VR headset and add a camera that automatically tracks where a VR controller is? Jaymis Loveday did just this, and in the video below, you’ll see a very interesting result: he’s painting with light in thin air.

If you’ve ever tried light graffiti, you’re familiar with what he’s doing. The problem is figuring out what and where exactly you’ve painted, and his system seems to solve this.

Later in the video, he interacts with a virtual world while the real world is still in the shot, for a kind of mixed reality filming experience. The possibilities for this kind of interface are staggering, so hopefully we’ll see even more strange art in the future!

… it’s a project I’ve had in my mind since I first used the Vive over a year ago, at PAX Prime in Seattle. As soon as I waved the controller in front of my face and noticed the tracking speed and accuracy I started mentally designing camera tracking systems. I wanted a VR system in my life because I love games, but I needed a Vive for filmmaking science experiments.

You can see more about this rig on his website here or on the project’s Reddit post. You can also check out Loveday’s previous, video-tracking-only version here.


