Unleashing the Magic: The Future of AR with Smart Clothing

Discover how gloves, contact lenses, and fiber batteries can revolutionize the way we interact with digital environments.

Data Analyst looking at graphs and charts with AR - Powered by MidJourney AI

The other day I got to play with the Apple Vision Pro. I have to say that the experience was amazing and like nothing I have ever seen before. It was everything I expected after having watched the reviews online.

But I’m not going to rave about the Vision Pro today. If you read my other post on AR & VR, you should already have a pretty good idea of what I imagine it will allow us to do. Besides, there are enough people out there praising the Vision Pro already. Instead, I’ll propose some ways in which we can do better. And most of the technology to do it, we either already have or will have in the coming years.

When the Xbox Kinect came out, I bought it because I wanted the Minority Report experience: wave my hands around and feel like a wizard. At times this worked well, and at other times not so much. Kinect Sports and Kinect Star Wars had great integration, but exploring the world from your living room felt limited and immersion-breaking. Even so, the Kinect still feels like magic to this day.

When I tried the Vision Pro, it felt like it was missing some of the integration the Kinect had. The pinching motion with my hand didn’t come naturally to me. It did minimize the motion I had to make, and the experience was very comfortable, but it also made me feel disconnected from the way I interact with objects in the real world, where I reach for things and grab them.

Minority Report already had a solution to this back in 2002: gloves. And the good news is that both Meta and Apple hold patents in this area, which means we can expect haptic feedback gloves to hit the market at some point in the near future.

I would take this a few steps further, though. Instead of forcing all the tech into heavy glasses or heavy gloves, we could spread it out across smart clothing powered by fiber batteries. And keep in mind that flexible batteries were named one of the top 10 emerging technologies by the World Economic Forum this year.

In the future, fiber batteries may be charged by your body heat, sunlight, and kinetic energy. You can’t beat that level of efficiency when talking about sustainable tech.

Now, with our t-shirts turned into fiber batteries, we can space out computing components rather than stacking them on top of each other for the sake of compactness. This should make cooling easier, which in turn would allow for more advanced computing.

Now that we have solved the computing problem, we can stream information from the clothing to other devices. Introducing smart contact lenses. Expect these to hit the market hard in the coming years. They can create the augmented reality experience we see in video games and movies: things appear before your eyes out of thin air. The best part is that we can use body heat to power them, which makes them far more sustainable!

One of the biggest problems with these contact lenses is figuring out what you’re looking at. But with haptic gloves, they don’t need to solve that alone. Your clothes would receive a signal from your hands and a signal from your eyes and orchestrate everything together. Thus the lenses and gloves would only need to handle sensor input, while the data processing would take place in the shirt on your back.
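To make this orchestration idea concrete, here is a minimal sketch of how the “shirt” might fuse the two sensor streams: gaze picks a candidate object, and the grab only succeeds if the hand is close enough to it. All names, data structures, and thresholds here are illustrative assumptions, not any real device API.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    position: tuple  # (x, y, z) in metres, hypothetical world coordinates

def _distance(a, b):
    """Euclidean distance between two 3D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def closest_object(point, objects):
    """Return the virtual object nearest to a 3D point."""
    return min(objects, key=lambda o: _distance(point, o.position))

def resolve_grab(gaze_point, hand_point, objects, reach=0.3):
    """Fuse eye and hand signals: gaze selects a candidate object,
    and the grab succeeds only if the hand is within `reach` of it."""
    target = closest_object(gaze_point, objects)
    if _distance(hand_point, target.position) <= reach:
        return target
    return None
```

For example, if the gaze lands near a virtual sword and the hand is within 30 cm of it, `resolve_grab` returns the sword; if the hand is elsewhere, no grab is registered. The key design point is that neither sensor decides alone: the lenses and gloves stay dumb, and the fusion logic lives in one place.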

The question then becomes: how do we know how objects should feel, and how easy or difficult should it be to lift virtual objects? Lifting a heavy sword in a video game should feel heavy, but not so heavy that you can’t play the game anymore, unless you’re a powerlifting world champion.

But digital physics has already been solved in most video-game engines. At worst, it’s a matter of calibration or configuration in the settings menu.
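The calibration step above can be sketched in a few lines: map a virtual object’s mass to a haptic resistance level, scaled to the user’s configured strength, but clamped so nothing ever becomes unliftable. The function name, the strength setting, and the clamp value are all hypothetical assumptions for illustration.

```python
def haptic_resistance(virtual_mass_kg, user_strength_kg=20.0, max_fraction=0.6):
    """Map a virtual object's mass to a haptic resistance level in [0, 1].

    Hypothetical calibration: resistance scales with mass relative to the
    user's configured strength, then is clamped at `max_fraction` so that
    a 100 kg virtual sword stays heavy-feeling but still liftable.
    """
    raw = virtual_mass_kg / user_strength_kg
    return min(raw, max_fraction)
```

With these defaults, a 5 kg object produces a resistance of 0.25, while a 100 kg object is clamped to 0.6 — heavy, but playable. The clamp is exactly the kind of setting a powerlifting world champion might raise in the settings menu.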

The greater idea is that touching an object in virtual space should feel like touching a physical one, so you can better anticipate the challenges you would face when it’s translated into a real-world object. This would further improve time-to-market and allow engineers to be more creative in their work, while trusting AI to tackle the nitty-gritty details.

And if you are in a smart-environment, you can now operate everything with your voice, eyes and hands.

The world we live in is looking more and more like magic, even though we understand how the technology works. Go ahead and imagine telling your grandmother that you can sit on the couch and just wave your hand… and some robot dusts your home and picks up those socks from your son’s bedroom floor.

I don’t know about you, but I never would have thought it possible to be living in a world as advanced as the one we’ll inhabit 5–10 years from now. Even the thought of explaining this to my younger self seems mind-blowing.

What do you think? What does the future hold for us? How can we further improve our digital user experience?

At Aeon Cortex we explore the past and present, and consider how they will impact the future — for better or worse. Subscribe, join the conversation, and help us shape a world worth living in!