Couldn’t fit a real robot, so we made a virtual one instead.

Final augmented reality scene on iPad Pro 12.9"

tl;dr We made a 1:1, mobile-friendly, augmented-reality robotic fabrication scene. This is an article about why and how we did it. If you’re reading this on mobile, click here to open it up. If you’re reading this on a desktop, scan the QR code below.

As part of an upcoming in-office exhibition showcasing BVN’s latest robotics research, there was an aspiration to include a live robotic demonstrator exhibit. What better way to demystify the often misunderstood accessibility of robotic fabrication than the real deal, right?

There was an issue, however: these machines (in particular the Kuka KR-120 we’ve been using) are big, expensive, power-hungry, and dangerous. So installing one in a live, open-plan office environment, on a commercial floor plate? Somewhat challenging…

So we developed a quick 1:1 augmented-reality scene of our ideal setup instead: a cheap, shareable, reasonably compelling and 100% WHS-compliant alternative to the real deal.

If you’re reading this on mobile, click here to open it up. If you’re reading this on a desktop, scan the QR code to the right, or play with the non-animated scene directly below*.


How does it work?

Instead of developing a dedicated exhibition mobile app that people would need to download (read: never download), we chose to work with an increasingly popular three-dimensional file type known as .USDZ. This file type compresses geometry, materials, animation and audio into a single <15 MB file, and is considered native to our target platform (Apple/iOS)**.

As far as Apple is concerned, the .USDZ should be considered the three-dimensional sibling of the JPEG. As such, Apple has made it seriously easy to preview and share these files on all things Apple. Whether you want to send your file by message, attach it to an email, host it on your website, or AirDrop it to a friend, the .USDZ will preview in augmented reality or an on-screen viewer as simply as previewing an image file.

USDZ preview interface on iPhone

 

How to make the file?

The key ingredients for a good-looking .USDZ file are high-fidelity geometry, considered materials, coordinated animation and, if you fancy, audio. Luckily for us, our robotic fabrication processes were already simulated in the three-dimensional modelling environment Rhino (via Grasshopper), so we already had ideal base geometry. Further, by the very nature of simulation analysis, we had (in effect) the coordinated animation component too. Both just needed a little TLC.

The three-dimensional scene components within Rhino.

Geometry:

The geometry of the robotic fabrication scene had to be cleaned, neatly ordered and grouped. As you can see via the coloured scene above and below, the robot has been broken down into its individual arm components. The size of the scene corresponds to the 5m x 5m floor space allocated in the office exhibition plan.

This scene was exported to .OBJ and imported into our animation software Blender.
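
For the curious, the import itself is a one-liner from Blender’s Python console (the file name here is made up, and the exact operator depends on your Blender version):

import bpy

# Pull the cleaned Rhino export into the current Blender scene.
# Blender 2.8x-3.x uses bpy.ops.import_scene.obj; Blender 4.0+ renamed it
# to bpy.ops.wm.obj_import.
bpy.ops.import_scene.obj(filepath="robot_scene.obj")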

Visualising each arm component of the robot and its corresponding axis

Animation:

In order to animate the virtual robot geometry, we needed to translate the instructions our real-world robot reads to perform its sequence of movements into something animation software like Blender can understand.

If you can think of a robotic arm as a six-jointed human arm, then you might be able to imagine that each arm component only has to consider its rotation relative to the joint it’s connected to. Really, it’s only in the combination of many simple rotations that any movement appears complex (consider the mechanics of a freestyle swimming stroke, for example).

As such, you can think of a robot’s position as essentially just six rotation values (one for each arm component), and its movement as a long list of rotations played out over some period of time. This is the kind of data Blender can read very well.

# Three robot positions in space, each represented by six rotation values
26.410, -60.693, 120.787, 168.191, 60.724, -234.340
24.25, -59.351, 120.533, 165.239, 62.071, -233.116
23.538, -58.903, 120.449, 164.255, 62.520, -232.708
...
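
As a minimal sketch (the file name and comma-separated layout are assumptions), those rows can be read into one list of values per axis, matching the A01Rot to A06Rot names used in the Blender script further below:

# Split each row of the exported print file into six per-axis rotation lists.
axis_rotations = [[], [], [], [], [], []]
with open("robot_rotations.csv") as f:
    for line in f:
        values = line.strip().split(",")
        for axis, value in zip(axis_rotations, values):
            axis.append(float(value))

A01Rot, A02Rot, A03Rot, A04Rot, A05Rot, A06Rot = axis_rotations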

Having translated our robot’s real-world print file into a sequence of simple rotation values (around 100,000 individual positions), it’s time to coordinate them with each arm component of the robot in Blender.

Scene Geometry with Materials Applied

Blender & Python:

With our Rhino scene now sitting pretty within Blender, we need to do a little additional housekeeping. In short, we need to group, order and name the arm components hierarchically (so that when everything starts animating, the tail doesn’t wag the dog, or the hand doesn’t shake the arm, so to speak).
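
For illustration, here’s a rough sketch of that parenting step from Blender’s Python console, assuming the arm components are named A01 through A06 as in the script below:

import bpy

# Parent each arm component to the one before it, so rotating A01 carries
# A02-A06 along with it, and so on down the chain.
arm_names = ["A01", "A02", "A03", "A04", "A05", "A06"]
for parent_name, child_name in zip(arm_names, arm_names[1:]):
    child = bpy.data.objects[child_name]
    parent = bpy.data.objects[parent_name]
    child.parent = parent
    # Preserve the component's current world position when parenting.
    child.matrix_parent_inverse = parent.matrix_world.inverted()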

Once this is done, the robot is finally ready to be animated. I wrote a sketchy Python script for Blender that reads those rotation values we grabbed earlier from Rhino and applies them to their correspondingly named arm components. Wooden spoon award for quality Pythonic syntax, I know...

A01.rotation_euler.z = -math.radians(float(A01Rot[n]))
A02.rotation_euler.x = -math.radians(float(A02Rot[n]))
A03.rotation_euler.x = -math.radians(float(A03Rot[n]))
A04.rotation_euler.y = -math.radians(float(A04Rot[n]))
A05.rotation_euler.x = -math.radians(float(A05Rot[n]))
A06.rotation_euler.y = -math.radians(float(A06Rot[n]))
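
Put together, a sketch of the full loop might look something like this, stepping through the rotation lists and dropping a keyframe on each arm component (the frame spacing and sub-sampling are assumptions; with ~100,000 positions you wouldn’t keyframe every single one):

import bpy
import math

objs = bpy.data.objects
A01, A02, A03, A04, A05, A06 = (objs[name] for name in
                                ("A01", "A02", "A03", "A04", "A05", "A06"))

step = 50  # only keyframe every 50th position to keep the file size sane
for frame, n in enumerate(range(0, len(A01Rot), step)):
    A01.rotation_euler.z = -math.radians(float(A01Rot[n]))
    A02.rotation_euler.x = -math.radians(float(A02Rot[n]))
    A03.rotation_euler.x = -math.radians(float(A03Rot[n]))
    A04.rotation_euler.y = -math.radians(float(A04Rot[n]))
    A05.rotation_euler.x = -math.radians(float(A05Rot[n]))
    A06.rotation_euler.y = -math.radians(float(A06Rot[n]))
    for obj in (A01, A02, A03, A04, A05, A06):
        obj.keyframe_insert(data_path="rotation_euler", frame=frame)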

Exporting & Conversion:

With our scene geometry, materiality and coordinated animation complete, it’s time to finally embed all that work into a .USDZ file.

This is a two-step process for iOS devices (and only a one-step process for Android):

  1. Export from Blender to .GLB or .GLTF (that’s it for Android!)

  2. Convert the .GLB file into .USDZ via Apple’s Reality Converter app.

  3. Open on your Apple device! (that link one more time…)

Frustratingly, you will need a macOS device (MacBook, Mac mini, etc.) for Step 2.
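
For Step 1, here’s a rough sketch of the equivalent export call from Blender’s Python API (the same options live under File → Export → glTF 2.0; parameter names reflect recent Blender releases):

import bpy

# Export the whole scene, materials and baked keyframes included, as a
# single binary .GLB ready for Android or for Reality Converter.
bpy.ops.export_scene.gltf(
    filepath="robot_scene.glb",
    export_format='GLB',
    export_animations=True,
)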

 

Where to next?

Retail and e-commerce are already implementing this form of app-less augmented reality experience across their websites as a means to allow customers to spatially and aesthetically evaluate products in their own personal space before purchasing online (e.g. IKEA Place). Improved customer engagement, extended product browsing session times, and significantly reduced product return rates have been big wins for companies investing in this space.

For architecture and design, I think the opportunity is clear:

We model and materialise three-dimensional objects every day, mostly for review by clients who are not necessarily spatially minded. Shouldn’t we be leveraging AR to speak in a more accessible and compelling spatial language?

The cool thing for us is that it’s a low-investment hypothetical for an architecture studio to test. If we were to remove the more exotic use case of an animated robot arm, the workflow of creating accessible AR objects for design teams and clients is really quite simple and quick. The geometry and materials already exist in our concept sketch models, documentation models, and visualisation models… we really only need to translate them.

The steps for this workflow then simply become:

  1. Model/assemble your geometry

  2. Fine tune your geometry and materials within Blender.

  3. Export to .GLB / .GLTF

  4. Convert to .USDZ

  5. Victory!


*Unfortunately, animations embedded within .GLB files (which Android devices rely on) play back inconsistently within Google’s Scene Viewer platform. Apologies if this is the case for your particular device. A problem easily fixed, but not one we have focused upon.

**Devices running iOS 13 and above.