
Planet Arduino

Archive for the ‘augmented reality’ Category

Researchers across several universities have developed a controller that provides tangible interaction for 3D augmented reality data spaces.

The device comprises three orthogonal arms, embodying the X, Y, and Z axes, which extend from a central point. These form an interactive space for 3D objects, with linear potentiometers and a rotary button on each axis serving as the user interface.

At the heart of it all is an Arduino Mega, which takes in data from the sliders to section a model. This enables users to peer inside a representation with an AR headset, “slicing off” anything that gets in the way by defining a maximum and minimum view plane. Each slider is motorized, allowing them to move together and provide force feedback.
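The researchers’ firmware isn’t reproduced here, but a minimal sketch of the idea, assuming three linear potentiometers wired to analog pins A0 to A2 and a simple serial protocol (both are guesses, not the project’s actual code), might look like this:

// Minimal sketch (assumed pin mapping and serial format, not the project's firmware):
// read one slider per axis and stream normalized clipping-plane values to the host.
const int sliderPins[3] = {A0, A1, A2};   // X, Y, Z linear potentiometers (assumed wiring)

void setup() {
  Serial.begin(115200);
}

void loop() {
  Serial.print("CLIP");
  for (int axis = 0; axis < 3; axis++) {
    float pos = analogRead(sliderPins[axis]) / 1023.0;  // 0.0 to 1.0 along the arm
    Serial.print(' ');
    Serial.print(pos, 3);
  }
  Serial.println();
  delay(20);  // roughly 50 Hz updates for the AR application
}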

Possible applications include medical imaging and CAD modeling, among many others. More details on the Embodied Axes project can be found in the researchers’ paper here.

Want to see something super cool? Go grab your copy of Make: Vol. 68 and download the Digi-Key AR Guide to Boards app, then put them together to watch real magic happen. 

Read more on MAKE

The post Make’s Guide to Boards Has a Hidden Secret! appeared first on Make: DIY Projects and Ideas for Makers.

As robotics advance, the future could certainly involve humans and automated elements working together as a team. The question then becomes: how do you design such an interaction? A team of researchers from Purdue University attempts to provide a solution with their GhostAR system.

The setup records human movements for playback later in augmented reality, while a robotic partner is programmed to work around this “ghost.” This enables a user to then plan out how to collaborate with the robot and work out kinks before actually performing a task.

GhostAR’s hardware includes an Oculus Rift headset and IR LED tracking, along with actual robots used in development. Simulation hardware consists of a six-axis Tinkerkit Braccio robot, as well as an Arduino-controlled omni-wheel base that can mount either a robot arm or a camera as needed.

More information on the project can be found in the team’s research paper here.

We present GhostAR, a time-space editor for authoring and acting Human-Robot-Collaborative (HRC) tasks in-situ. Our system adopts an embodied authoring approach in Augmented Reality (AR), for spatially editing the actions and programming the robots through demonstrative role-playing. We propose a novel HRC workflow that externalizes user’s authoring as demonstrative and editable AR ghost, allowing for spatially situated visual referencing, realistic animated simulation, and collaborative action guidance. We develop a dynamic time warping (DTW) based collaboration model which takes the real-time captured motion as inputs, maps it to the previously authored human actions, and outputs the corresponding robot actions to achieve adaptive collaboration. We emphasize an in-situ authoring and rapid iterations of joint plans without an offline training process. Further, we demonstrate and evaluate the effectiveness of our workflow through HRC use cases and a three-session user study.
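The core alignment step of that collaboration model is classic dynamic time warping. Purely as an illustration (the actual GhostAR model aligns captured body poses, not scalar traces), a DTW distance between two 1-D motion traces can be computed like so:

// Illustrative only: classic dynamic time warping between two 1-D traces,
// the kind of alignment the paper's collaboration model builds on.
#include <vector>
#include <cmath>
#include <algorithm>
#include <limits>

double dtwDistance(const std::vector<double>& a, const std::vector<double>& b) {
  const size_t n = a.size(), m = b.size();
  std::vector<std::vector<double>> cost(
      n + 1, std::vector<double>(m + 1, std::numeric_limits<double>::infinity()));
  cost[0][0] = 0.0;
  for (size_t i = 1; i <= n; i++) {
    for (size_t j = 1; j <= m; j++) {
      double d = std::fabs(a[i - 1] - b[j - 1]);       // local distance
      cost[i][j] = d + std::min({cost[i - 1][j],       // insertion
                                 cost[i][j - 1],       // deletion
                                 cost[i - 1][j - 1]}); // match
    }
  }
  return cost[n][m];
}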

Those familiar with the Dragon Ball Z franchise will recognize the head-mounted Scouter computer often seen adorning character faces. As part of his Goku costume, Marcin Poblocki made an impressive replica of this device, featuring a see-through lens that shows the “strength” of the person he’s looking at, based on a distance measurement taken using a VL53L0X sensor. 

An Arduino Nano provides processing power for the headset, and light from a small OLED display is reflected on the lens for AR-style viewing.

It’s not exactly a perfect copy, but it’s an actual working device. Inspired by Google’s virtual glasses, I made a virtual distance sensor.

I used an Arduino Nano, an OLED screen, and a laser distance sensor. The laser sensor takes readings (not calibrated yet) and displays the number on the OLED screen. A Perspex mirror reflects the image (at 45 degrees) to the lens (taken from cheap Google Cardboard virtual glasses), and then it’s projected on a clear Perspex screen.

So you will still see everything, but in the clear Perspex you will also see the distance to the object you’re looking at. On the OLED screen I typed ‘Power’ instead of distance, because that’s what this device is supposed to measure in DBZ. 😀
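As a rough sketch of the same idea, assuming the Adafruit VL53L0X and SSD1306 libraries and a 128x64 I2C OLED (which may differ from the files Poblocki published), the Nano-side firmware could look something like this:

// Minimal sketch in the same spirit (assumes Adafruit VL53L0X and SSD1306 libraries
// and a 128x64 I2C OLED; the code linked on Thingiverse may differ).
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>
#include <Adafruit_VL53L0X.h>

Adafruit_SSD1306 display(128, 64, &Wire, -1);
Adafruit_VL53L0X lox;

void setup() {
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);  // common I2C address for these modules
  lox.begin();
  display.setTextSize(2);
  display.setTextColor(SSD1306_WHITE);
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);           // take a single ranging measurement

  display.clearDisplay();
  display.setCursor(0, 0);
  display.print("Power:");
  display.setCursor(0, 24);
  if (measure.RangeStatus != 4) {             // 4 means out of range
    display.print(measure.RangeMilliMeter);
    display.print(" mm");
  } else {
    display.print("---");
  }
  display.display();
  delay(100);
}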

Print files, as well as the code and circuit diagram needed to hook this head-mounted device up, are available on Thingiverse. For those who don’t have a DBZ costume in their immediate future, the concept could be expanded to a wide variety of other sci-fi and real-world applications.

Do you want to be the very best? Do you want to become a Pokemon Go master? Then here are 5 projects to help you level up and catch 'em all.

Read more on MAKE

The post 5 Projects Fit for a Pokemon Go Master appeared first on Make: DIY Projects and Ideas for Makers.

As part of a recent Microsoft HoloLens hackathon in San Francisco, Maker Ian Sterling developed a new app that interacts with your smart home via augmented reality. The proof of concept, dubbed “IoTxMR,” allows a user to simply glance at a gadget and control it through gestures.

As you can see in the video below, IoTxMR enables Sterling to connect various Android and Arduino-based devices with the HoloLens to create a customized interdependent network. It also features a mixed reality experience called “virtual zen mode,” complete with calming sounds and light orbs in his surrounding environment.

During a recent interview with Digital Trends, Sterling revealed:

The primary goal of the app is to provide a 3D spatial UI for cross-platform devices — Android Music Player app and Arduino-controlled fan and light — and to interact with them using gaze and gesture control.

The connectivity between Arduino and a mixed reality device is something which holds a huge amount of creative opportunity for developers to create some very exciting applications — be it [Internet of Things], robotics, or other sensor data visualization. Besides this, our app features some fun ways to connect devices. Our demo featured a connection between a music player and a light in order to set a certain mood in your home.
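The write-up doesn’t show the Arduino side of the fan and light, but a plausible minimal sketch, with made-up pin choices and a single-character command protocol standing in for whatever the hackathon code actually used, would be:

// One plausible sketch only: single-character commands toggle relays for the fan and
// the light (pins and protocol are assumptions, not the hackathon code).
const int FAN_PIN = 7;
const int LIGHT_PIN = 8;

void setup() {
  pinMode(FAN_PIN, OUTPUT);
  pinMode(LIGHT_PIN, OUTPUT);
  Serial.begin(9600);   // commands could equally arrive over Wi-Fi or Bluetooth
}

void loop() {
  if (Serial.available()) {
    switch (Serial.read()) {
      case 'F': digitalWrite(FAN_PIN, HIGH);   break;  // fan on
      case 'f': digitalWrite(FAN_PIN, LOW);    break;  // fan off
      case 'L': digitalWrite(LIGHT_PIN, HIGH); break;  // light on
      case 'l': digitalWrite(LIGHT_PIN, LOW);  break;  // light off
    }
  }
}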

Although just a demo, IoTxMR does highlight the endless possibilities that AR platforms like HoloLens offer in the not-too-distant future.

The Reality Editor (Credit: Fluid Interface Group/MIT)

Augmented reality has yet to find a foothold in widespread applications, but MIT has just released an AR app that allows you to control IoT devices.

Read more on MAKE

The post MIT’s Reality Editor Controls IoT Devices via Augmented Reality appeared first on Make: DIY Projects, How-Tos, Electronics, Crafts and Ideas for Makers.

Jul 31

Open Hybrid Gives you the Knobs and Buttons to your Digital Kingdom

arduino hacks, Arduino Yún, augmented reality, internet hacks, mit media lab, open hybrid

With the sweeping wave of complexity that comes with new appliance tech, it’s easy to start grumbling over having to pull your phone out every time you want to turn the kitchen lights on. [Valentin] realized that our new interfaces aren’t making our lives much simpler, and both he and the folks at the MIT Media Lab have developed a solution.

Open Hybrid takes the interface out of the phone app and superimposes it directly onto the items we want to operate in real life. The Open Hybrid Interface is viewed through the lens of a tablet or smart mobile device. With a real-time video stream, an interactive set of knobs and buttons superimpose themselves on the objects they control. In one example, holding a tablet up to a light brings up a color palette for color control. In another, sliders superimposed on a Mindstorms tank-drive toy become the control panel for driving the vehicle around the floor. Object behaviors can even be tied together so that applying an action to one object, such as turning off one light, will apply to other objects, in this case putting all other lights out.

Beneath the surface, Open Hybrid is developed on OpenFrameworks, with the hardware interface handled by an Arduino Yún running custom firmware. Creating a new application, though, has been simplified to be achievable with web-friendly languages (HTML, JavaScript, and CSS). The net result is that their toolchain cuts out the heavy need for extensive graphics knowledge to develop a new control panel.
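Open Hybrid ships its own Yún firmware, but as a rough illustration of how a web front end can drive a pin on the Yún, the stock Bridge library can expose a REST-style URL such as http://arduino.local/arduino/light/255 (the endpoint name here is invented for the example):

// Illustration only, not Open Hybrid's actual firmware: a Bridge-based endpoint
// that lets a web page set the brightness of a light on an assumed PWM pin.
#include <Bridge.h>
#include <BridgeServer.h>
#include <BridgeClient.h>

BridgeServer server;
const int LIGHT_PIN = 9;   // assumed PWM pin for a dimmable light

void setup() {
  Bridge.begin();
  server.listenOnLocalhost();   // requests are forwarded by the Yún's Linux side
  server.begin();
  pinMode(LIGHT_PIN, OUTPUT);
}

void loop() {
  BridgeClient client = server.accept();
  if (client) {
    String command = client.readStringUntil('/');   // e.g. "light"
    command.trim();
    if (command == "light") {
      int level = client.parseInt();                // 0-255 brightness from the URL
      analogWrite(LIGHT_PIN, constrain(level, 0, 255));
      client.print(level);
    }
    client.stop();
  }
  delay(50);
}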

If you can spare a few minutes, check out [Valentin’s] SolidCon talk on the drive to design new digital interfaces that echo those we’ve already been using for hundreds of years.

Last but not least, Open Hybrid may have been born in the Labs, but its evolution is up to the community as the entire project is both platform independent and open source.

Sure, it’s not mustaches, but it’s definitely more user-friendly.


Sep 17

Gravity Touch Bluetooth Glove powered by Arduino Micro

arduino, Arduino micro, augmented reality, Featured, glove, micro, Virtual Reality


Arduino user Jubeso submitted an Instructable to our blog explaining the 10 steps to build an input device for gaming.

The Gravity Touch Bluetooth glove is specifically designed to interact with augmented reality glasses like Google Glass, Meta, and Moverio BT, or with VR headsets like the Oculus Rift, Samsung Gear VR, vrAse, and Durovis Dive:

Those new products are amazing, and they need new types of input devices. This Instructable will describe how to build your own “Gravity Touch Bluetooth glove,” and I will also give you some tips to build your own Durovis Dive VR headset so that you will be able to enjoy full mobile VR. Because this glove will be of most use for VR games, I have created a Unity3D plugin for Android that handles the communication between your app and the glove. It means that you will be able to use your Gravity Touch glove to interact with your Unity3D VR game.

The Arduino code and the Java class I wrote to handle the communication between the glove and the Android device will also be available, so that you will be able to adapt them for your needs.
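As a hedged sketch of the data path only, assuming the FreeIMU library’s getYawPitchRoll() helper and a serial Bluetooth module on the Micro’s hardware UART (the Instructable’s actual firmware and protocol may differ), the glove side could stream orientation like this:

// Sketch of the data path only, not the Instructable's firmware. Assumes the FreeIMU
// library; its example sketches pull in extra sensor headers that are omitted here.
#include <Wire.h>
#include <FreeIMU.h>

FreeIMU imu;
float ypr[3];   // yaw, pitch, roll in degrees

void setup() {
  Serial1.begin(9600);   // HC-05-style Bluetooth module on the Micro's hardware UART (assumed)
  Wire.begin();
  imu.init();
}

void loop() {
  imu.getYawPitchRoll(ypr);
  Serial1.print(ypr[0]); Serial1.print(',');
  Serial1.print(ypr[1]); Serial1.print(',');
  Serial1.println(ypr[2]);
  delay(20);   // roughly 50 Hz orientation updates for the Unity3D plugin
}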

 

The bill of materials, among other things, contains an Arduino Micro, a FreeIMU (an open hardware framework for orientation and motion sensing), and 3 m of flexible soft electric wire.



