Magic Leap Play, LEAPcon friends, and the future of XR as Language & Healthcare


Magic Leap.

The augmented reality glasses that were talked up like they had it: the real deal, the holy smokes, the bona fide extended enhanced amazing powerful joyous reality-altering machinery of the finest computation on your face!

Did they deliver with Magic Leap One?

You can count, right?

We’re on version one of version one of a new reality. So yes, indeed, it has bugs and foibles. The FOV is just big enough to start giving a taste of proper synthesis of graphics and view. Yet it’s still a floating rectangle of perception. A big rectangle, but edged nevertheless.

For the voices getting upset that this tech is not good enough, I’ll just say that it’s good enough for everyone using it. Magic Leap One is an overall rad device with lots of computing power, solid inputs and some excellent touches that make it a really wonderful development platform. So good you could even live with it! Magic Leap’s whole team did a stellar job here, thank you!

We’re undergoing a big shift in a tech capability space that just 7 years ago was a gimmicky, searching approach to making goofy experiences that changed the world! Not the whole world, of course, just the world of the computer sensor. We had lots of dreams and visions, but mostly showed each other PowerPoints and hopeful hand gestures. Times were dark yet hopeful.

First time using ML1 at Steve Lukas’ house


Fast forward to now. Magic Leap is a Cambrian Explosion moment for the new ways of computing, now called XR. Magic Leap and HoloLens are enough to catalyze these new ways of computing among those of us who research, create and build. There are other models from ODG too, and the tech modules are shuffling around, searching for form as new companies conglomerate.

It’s been funny watching the term “XR” come into use as it seemed to emerge spontaneously to signify a merging, or crossing, of all concepts related to computer mediation as applied to human perception.

The Magic Leap One is a very slick device and also a very alpha device. The displays have a limited field of view, and there are some fun differences between the color balance of each eye’s display and even between their focal planes. It washes out completely outdoors.

None of that changes the fact: Magic Leap One WORKS! It works well. Well enough to where we are seeing great things fast from an extremely excited community.

I have been dreaming of the launch of a generally useful, robust look-through spatial computer headset. I did not and still do not care what ‘brand’ it is, as long as the dang thing lets me do my visions and stories!

So when Magic Leap was announced I started befriending Twitter people who had an ML1 or were getting one. Anyone in the XR space doing fun and meaningful things. And it’s been so fun watching the people, techniques and demos emerge out of the woodwork around the next phase of getting XR from gimmick to holy shit.

Don a VR helmet and meet John Hanacek on registration day @ LEAPCon #LEAPCon #MagicLeap

At LEAPcon 1 I met amazing people. The Magic Leap talks were inspiring, then AT&T got up on stage and I laughed loud enough that I bet you heard me if you were in that tiny presentation room. If you’ve read my blog at all, or know me, you know that I can often see the very real dark side of any given wonderful thing. Convenience often has a dark side. So I’m watching this space and tech closely, yet also am finding reasons for optimism. Specifically I’m meeting reasons for optimism.

The people give me hope, especially my newest tribe, #LeapNation, which emerged on Twitter right after I met Steve Lukas of AcrossXR (now at Magic Leap), who started the group chat. The “Magic Leap” has been very literal for me: suddenly I’m collaborating and discussing with over 50 talented developers pushing the boundaries and laughing the whole way.

Was great to see a real-life Lumin Runtime app from Sven Mesecke and his team at Rebel Camp in Oregon @dachsghund

Basically every app you can currently get from other developers on Magic Leap is a fully ‘immersive’ experience that does not utilize the UI/UX framework and OS layer that Magic Leap provides with Lumin Runtime. Rebel Camp has a real app in that OS environment, and it was great to see, even as a proof of concept. This is big, and I’m very glad that Rebel Camp is building the expertise to make Lumin OS apps, because these are amazing test-beds for collaborating with Magic Leap on adding intelligence into the operating system while developers get that ‘for free’. If Android can evolve into an XR operating system, I’m all for helping. Hats off to Rebel Camp for diving into the Lumin rabbit hole and emerging with a rabbit.

Sven told me he had preceded me a little, having found answers in the forums to the same questions I was asking, which he assured me weren’t crazy: how the hell do I get models with textures into ML1 reliably?!

Developing for Helio browser is very fun, and I’m excited to see what official WebXR support looks like as it comes to the headset in 2019.
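Once WebXR does land, a Helio page should be able to feature-detect it before asking for an AR session. Here's a minimal sketch against the WebXR Device API draft; whether Helio will support exactly the `immersive-ar` mode shown here is an assumption on my part:

```javascript
// Hedged sketch: feature-detect WebXR before requesting an AR session.
// navigator.xr and the 'immersive-ar' mode come from the WebXR Device API
// draft; Helio's exact support surface at launch is an assumption.
async function startAR() {
  // Outside a WebXR-capable browser (or before support ships),
  // fall back gracefully instead of throwing.
  if (typeof navigator === "undefined" || !navigator.xr) {
    return null;
  }
  const supported = await navigator.xr.isSessionSupported("immersive-ar");
  if (!supported) {
    return null;
  }
  // In practice this must be called from a user gesture (e.g. a button click).
  return navigator.xr.requestSession("immersive-ar");
}
```

On a browser without WebXR this resolves to null, so the same page can serve a flat fallback today while the headset path lights up when support arrives.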

I have a special interest in the medical and therapeutic applications of XR, and it was my great pleasure to talk with two surgeons, Rafael Grossman and Alex Johnson of Johns Hopkins.

I sketched out with them a prototype idea that will keep brewing as many forms of knowing around energies, health and ‘trauma’ continue intersecting: contact-less surgery using vibrations and intention tracking channeled through the hands.

They are largely not satisfied with the current surgical robots, and so a vision of direct hands-on manipulation of tissues with a fidelity down to microns without the need for held tools excited them to say the least!


Brian (@ImmersionXR) from Immersion Neurotechnologies showed me his demo after we had talked over Twitter for weeks. His work getting EEG data into Magic Leap is amazing and path-leading.

At the Medical talks at LEAPcon I was very impressed by the quality of research and solutions people have been developing.

After the talk I learned from a Leaper that the Magic Leap One is already capturing heart-beat data and just using it for noise cancelling. She was as coy as the Cheshire cat about whether we’d get access to that data any time soon, so make sure to add your voice to the “we need heart beat please!” chorus, haha.

My interest in medical applications is currently channeled into uniting a knowledge base and toolbase, with a special interest in visualizing and working with human bioenergy.

Important note about the reflectivity of Magic Leap One for first responders & therapists:

During training as a California State Beach Lifeguard we were told specifically not to wear reflective sunglasses on the job. If victims see their wound in the reflection of your lenses they can go into shock, which makes your job as a first responder almost impossibly difficult. Most of lifeguarding first aid is keeping people out of shock. In my years as a guard I experienced the immense health benefit of ‘tactical lying’ to victims about the state of their condition. If someone can’t move their arms and legs, they don’t need you to freak out and tell them that. If someone has a massive gash on their head and asks “how bad is it?”, you can tell them it’s fine and they’ll be fine. But that won’t work if they see their blood in your reflective lenses! So for all emerging medical prototypes with ML1, just remember that you have reflective lenses on, and be careful how you insert the ML1 into healthcare situations!

I’m finding humor, curiosity and heart are the keys to making this new medium emerge as something useful. Don’t take it too seriously, yet take it seriously enough to get passionate.

The first time I used the Magic Leap was with Steve Lukas (@slukas) showing me his freshly received headset; then I received mine on 9/9/18. Every day with it has been a horizon-widening experience. The form factor of ‘holographic headset’ is just singing with my imagination, and then comes all the learning of how to translate ideas to code!

This is a hugely exciting time to get in and shape the medium of AR/VR/XR/whatever; you’re shaping reality, so stay open and find where everything intersects. My research goal is to use Magic Leap and other similar headsets to bring together many different knowledge threads in new ways of visualizing. We are to have a medium for thinking the unthinkable, as Bret Victor says.

We seem further along in some areas than I thought we would be and on pace for others.

Overall, the thing that draws me in most about Magic Leap and this look-through style of Mixed Reality is that I enjoy high-quality physical environments and things. I want to expand and merge new aspects into a consensus reality rather than exist in many walled-off realities. The best part about the reality prime that we share by default is that no one owns the light that bounces off a painting or the sound that leaves my vocal cords as we speak and breathe.

The XR future that I care about is characterized by balance and by a revival of sacred aspects, such as plants being amazing beyond our tech, and real paintings made of atoms being wicked cool. It’s so fun to develop for a new holographic medium while sitting among my partner Kyra V Brandt’s oil paintings. There is a feeling of elevating and grounding the computer medium into something humane, fundamentally artistic over specific. Emergent over prescribed. Gestural and free is how we can be with computers, helping our minds and hearts sing our most authentic forms and dance our most authentic songs.

Easter egg:
Check out Universe Creator by Les Bird if you have a Magic Leap One and stay tuned for more from this community!

Look for the red beanie easter egg in the videos below ;) #LeapNation


Video: semantic meaning mapping in XR (Magic Leap, HoloLens)
Now we are entering a time of holographic spatial computation. A magical computer is still computational; a computational magic is still magical. Through our intentions, channeled via tools, we find ourselves building the world anew each day. Choose wisely and play deeply. Optimism and pessimism are each self-fulfilling prophecies.

The Deep Potentials of XR

Everything that follows here is a resource for understanding the magnitude of the shift that the human experience can undergo through our passionate utilization of technology for ‘story being’.

I’ve gotten to meet Ken Perlin, who has been an inspiration for my ambitions about how the ‘computer’ as a medium really ought to behave.

Ken Perlin (Academy Award Winner & Computer Science Professor, New York University (NYU)) discusses his research on the future language of AR. Perlin demonstrates his research by drawing figures (through HTML5 and JavaScript), and explaining how these drawings are used as a language.
Talk by Helen Papagiannis, Augmented Reality Artist, Speaker at NEXT Berlin 2012
People are alive -- they behave and respond. Creations within the computer can also live, behave, and respond... if they are allowed to. The message of this talk is that computer-based art tools should embrace both forms of life -- artists behaving through real-time performance, and art behaving through real-time simulation.



The excellent Ramez Naam rushes ahead to a world where XR is in our very minds, exploring the remapping of this conservative old world onto radically new possibilities in Water. It is a wake-up call to all of us working on this: behavior and choices are more fundamental than technology; we are responsible for a ‘good’ world.


The merger of VR and AR into a functional whole seems more likely now than when I wrote these stories in 2015, of a world where VR was used for school and AR happened in the background. It was easier for me to ground VR in stories back then. Now I feel I can ground XR and even make demos. I am working on new visions where the world itself is the school. You are encouraged to view the world as a school regardless of the technology; we just seek to augment the learning potential in any given moment with the best alignment.


Art by Kyra V Brandt. Long before the Wayfair app came out, I decided to see if the ML1 gallery app could handle a little room planning. It was tough to get the pictures placed exactly where I wanted them. Some intelligence with the mesh would go a long way here, as I kept losing the picture behind the wall occlusion!

From the Magic Leap developer samples, built in Unity. A really fun use of the controller as a remote control for little 'vehicles'. This is a surprisingly addictive little experience that you get to open up in Unity or Unreal and learn how it's put together!


Went to DEFCON 26 in Las Vegas for just a few days and it was really fun! The badges are an amazing avenue I want to explore with more interactive versions - the battery tech needs improvement!

A sandstorm rolled in one night.

Really enjoyed seeing this robot at DEFCON 26. The demo where someone put a plant on top of one of these and it could move the plant into the light, and dance when the plant needed watering, is one of my favorite things I've ever seen!

Eye-opening talk from DEFCON 26, what hidden doors lurk inside of 'secure' systems??

Met some amazing people and am super stoked to go to DEFCON 27 !

SIGGRAPH 2017 & The Birthing Of Interactive External Imagination


I attended SIGGRAPH 2017 in LA and was struck by how mature the technologies are. It felt like a capstone moment, when so much of what I had dreamt of over recent years started to look tantalizingly possible. This post is a collection of links from myself and others; please click around and explore the future As We May Think. There has long existed a dream of what an augmenting medium could be, and a profound new chapter feels close at hand if we choose to build it...


At SIGGRAPH I was delighted to see Ken Perlin in the fascinating demo #MeetMike. I recorded his talk on my phone, so it's not the best quality, but ironically the main thing I want to share is not the visuals, incredibly impressive as they are, but rather Perlin's message: we need to work with humanity as we build social computer experiences. Perlin would rather see a cartoon with great expressivity than a photo-realistic avatar with rigid muscles. Perlin is a pioneering researcher and thinker currently working on HoloJam and Chalktalk.

That said, the #MeetMike demo was extremely impressive in showing a glimmer of what a photo-realistic avatar could be like. The possibility has been lurking, but Mike Seymour and his team brought it to life and gave it an application, and they did an amazing job!

In my touring around the booths I encountered Imverse in the back of the room, with a cool first take on making the creation of Mixed Reality really easy. Talking with Javier, the CEO, was really fun; his tool already has some of the elements of Perlin's HoloJam, but could use a lot more interactivity in creating real-time advanced animations, as Javier agreed and is working on. Unlike HoloJam, though, Imverse is a commercially-ready application that works on more standard VR gear, without the need for a fully tracked room, making it ripe for experimentation by us mere mortals without lab space!

Another highlight: I was really blown away wearing ODG's Mixed Reality glasses because they were so light! They run Android and will cost a bit over $1,000. Their form factor made me think, "yup, this one will actually get used in the field like it should." They're a self-contained computer, which limits graphics, but mobility is my favorite element of what makes Mixed and Augmented Reality so special: computing out in the real world, with our hands free!

With Apple's ARKit and small glasses like this, we are getting ready for Mixed (or Augmented) Reality sooner than I thought!

But before any of this we also need to think hard about computation and what we're trying to do - what is the Center Of "Why"?

And I'm intent on exploring where we've already been, to make sure we're always breaking new ground and/or refining the garden we're already in. A recent ACM panel gives a great crash history through the present, along with the ethics.

Most of the improvements we need are in conceptual understanding of computation - SIGGRAPH showed me that the raw technology is ready for anything! So it's up to us to imagine a more robust version of just what it means to "use a computer". We need smarter interfaces and a more expressive form of programming, and we can begin experimenting today even before all the gadgets are ready.

If I talked with you at SIGGRAPH, I probably told you I'm tired of clicking little buttons and using inert spaces where I must instruct EVERY element of the action I wish to perform. I mean it. I want to paint with math and math with paint. I believe that computing is a shared substrate for all of human endeavor, and I know it needs to let us be fully human within it.

In graduate school I started exploring the current research on making a much more expressive interface for computing based on the mighty pen, hypercharged by the affordances of a digital substrate: As We May Sketch

I am fascinated by the dream that we might eventually Converse with Computers in new forms of conversation, even if it is just to make graphics in a more user-friendly way.

There are already methods for approaching "artificial" intelligence in computers that will let us switch from thinking of machine learning as a data problem to having machine learning become an expressive experience allowing us to bridge rigid logics with blurry imaginations.

I am struck to my core that computing can and ought to move beyond us as explicit instructors and toward us as cooperative participants with an increasingly dynamic and intelligent substrate ready to hold our creations and let our minds soar. We in our physical bodies and full humanity are at the center of the vision of the future.

Still, beyond our wonderful maturing pen-based tablets with 2D motion screens, we are entering a time when graphics can move outside of rectangles: a time to explore more Humane Representations of Thought. We do need to keep humanity in mind, both our soaring potentials and our consistent ambitions and nagging flaws; books like Rainbows End help situate the affordances against stories.

Back in 2015 I wrote a collection of three scenes from a School Astride the Metaverse, an attempt to envision a school that bridges material and virtual reality into a kind of whole:

This vision is not complete; it is a playground for thinking about possible styles of school in the future. The first two scenes are grounded, and the last one is more fanciful. There's an entire arena to explore with mixed reality and even more simulation-space games. I'm working on it, but for now I want to re-surface these, since they are starting to look almost conservative in light of this year's tech!

This is not meant as a description of the future, but rather a constellation of ideas about how the future of education might look in a world where technology keeps getting better and cheaper, yet overall amounts of money spent on education remain relatively constant. This is not a utopia, this is meant to explore the feeling of what certain compromises and perspectives might manifest as. You certainly do not have to agree with or love or even like this vision, but I do hope that you want to discuss it and the ideas within since this is a world not too far away. Mere decades really for some parts of it; other parts are less clear...

It's important to think and converse about holistic visions for the future of education, not just individual technology artifacts in the classroom and moderate systems deployments. This is my attempt to help add a little flavor to the conversation.

Let's make a future where learning is an adventure and allow new generations to soar past our wildest dreams.