
Reality Mix-up: Science Applications

This talk was originally given at the Melbourne Augmented Reality Meetup, hosted by CSIRO Data61. I wrote up this transcript while the talk was still fresh in my mind.

If you haven't been to Melbourne Museum lately, check out CSIRAC, Australia's first computer, built in 1949. It remained in service for some 15 years, used punch cards, and could work through about 1,000 times the number of equations a single person could do in one day.

punch cards

Punch cards were data, but easily replaceable: someone could (with patience) spot a mistake, replace a card, and feed it back into the machine. Once the data was verified and correct, it could be replicated and posted to other people, who could then run similar calculations.

python REPL

Computer terminals were the next leap. You could type on a physical keyboard, and the computer would respond immediately. This kind of instantaneous interactivity changed the way scientists worked: analyses that weren't feasible before could suddenly be done in a fraction of the time.
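To give a flavour of that immediacy, here is the kind of throwaway calculation a scientist can now run in a Python session; the readings are hypothetical, invented purely for illustration:

```python
# Quick interactive exploration, terminal-style:
# type an expression, see the answer immediately, refine, repeat.
measurements = [9.81, 9.79, 9.82, 9.80, 9.78]  # hypothetical lab readings (m/s^2)

mean = sum(measurements) / len(measurements)        # average of the readings
spread = max(measurements) - min(measurements)      # crude measure of variation

print(f"mean = {mean:.3f}, spread = {spread:.3f}")  # prints: mean = 9.800, spread = 0.040
```

A calculation like this, once a day's work on punch cards, is now a few keystrokes with an instant answer.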

caves of qud

Computer scientists and programmers weren't the only people who embraced computers. Storytellers, filmmakers, and musicians alike learned to program, or collaborated with programmers. They created interactive experiences through digital media and broke new ground.


Today we live in a world where photorealistic experiences are rendered at interactive rates. Computing power continues to rise while becoming more affordable and more efficient. It's not just automation and interaction, but also the digitisation of processes and workflows.

Today we live in a world where motion capture technology that cost $100,000+ twenty years ago can be bought for home use for less than $2,000. Authoring and content creation tools like Unity are maturing, and many open source and commercial efforts are building on each other.


Improvisation became innovation. ReMoTe is one such synthesis: the remote worker's view of their location is streamed back to an instructor, who can project stereoscopic overlays, such as their hands captured through a Kinect, back to the worker on site. It was cheaper to collaborate remotely than to fly someone over. It was like a Skype call, but in 3D, and right in front of you.


Zebedee is a hand-held LIDAR scanner (you'd still lug a trolley behind you) that can digitise an entire location as you simply walk around it. The spinning laser continuously records a point cloud, and the spring-mounted head gives it plenty of resolution and reach.


I want to explore what mixed reality can bring to the table for scientists and researchers. The work above was done by Eleanor McMurtry, one of my students last summer. Given the data, we can quickly tell where the front door and the halls are, and leave each other notes. 3D data is much more intuitive than a floor plan, and doesn't rely on existing knowledge of the place.

insect scan

My colleagues Matt Adcock and Stuart Anderson set up this insect photo station at Data61. They mounted the sample on a spinning disc and automated the whole photography process. The result was 100+ photos of a single insect, used to reconstruct it in 3D and preserve the fragile sample for future use.

Many insects can now be viewed up close, from whatever angle feels natural to the interested scientist, without anyone having to handle the fragile sample. Better yet, measuring limb lengths and body sizes could also be automated, saving even more time down the track.


CluckAR, developed by Choice Australia, is a consumer-facing mobile application. It recognises the Australian egg brands found at supermarkets, and it will show you how free range the chickens really are.


We are still exploring what mixed reality can bring to research. I want to share some of the lessons we've learned along the way, and tell you a bit about where things are headed, especially for mixed reality on the web.

Mixed reality is a visualisation platform. We can bring existing workflows from the entertainment industry to data science and visual analytics. Real-time exploration of historic and new contexts is here, and the tools are becoming more accessible for modern developers and non-developers alike. For scientific visualisation, the added bonus is all the existing tools we can bring into our research.

Mixed reality encourages physical exploration. There is something compelling about being able to get up close to your dataset and look at it in the ways you are naturally inclined to. It is sometimes difficult to get the angle you want with a mouse and keyboard, and nothing is more natural than your own body movement and your own eyes.

Mixed reality will be delivered across the web. Just as movies and music have already moved to streaming platforms, 3D content will also benefit from streaming delivery. For one, you no longer have to wait for a massive download to get things going: your browser downloads the things you can see first, and fills in objects as they become available.

after the flood demo

There is much more to see in where technology is headed. It is unlikely that in 5 years we will all have contact lenses delivering mixed reality magic. But the devices are maturing, tools are shared and built upon, and web services are increasingly offered for others to create on. Right now, web demos like After The Flood show that browser technology is already good enough to deliver immersive experiences.

We have people just as good as, if not better than, anywhere else, regardless of the obstacles we face in government, internet speeds, and hardware. No matter where you live, as long as you have access to the tools to develop mixed reality technology, you have the opportunity to shape the industry.

Device manufacturers are still pushing hard, at least in my experience, to win over early-adopting developers. Mixed reality will not replace our existing reality, but it will enhance our lives, like Google Translate with its real-time text replacement. There's no app yet that we cannot live without, but many are quite handy in the right context.

I hope I have given you some glimpses of what we have worked on, where mixed reality is going in terms of existing technologies, and what may be coming in the future.
