50. Jussi Mäkinen: Mixed Reality Surfaces Transforming Our Perception
Available now
The Future is Transparent: How Mixed Reality Surfaces Will Transform Our Perception of Reality by 2035
What if every transparent surface you encounter—your car windshield, the windows of skyscrapers, even the glass walls of your home—could become a portal into an augmented world? What if technology could expand human perception beyond our biological limits, revealing layers of reality we've never been able to see?
Welcome to the future that Jussi Mäkinen and his team at Distance Technologies are building. It's a future where computational optics enhances our vision in ways that seemed impossible just a decade ago. But this transformation also invites us to ask deeper questions: what does it mean to see, to know, and to trust our own perception when technology becomes part of that process?
The Revolution Hiding in Plain Sight
Picture this: It's 2035. You're driving through Helsinki on a foggy winter morning. Your windshield isn't just glass anymore—it's become an intelligent interface between you and the world. Through Distance Technologies' breakthrough in computational optics, the windshield projects navigation arrows that appear to hover directly on the road itself, highlights pedestrians emerging from the mist with gentle outlines, and warns you of black ice forming exactly where your tires will touch the pavement.
But here's what makes this different from every science fiction movie you've seen: you're not wearing anything on your head. No bulky VR headset. No awkward AR glasses. The mixed reality surfaces are simply... there. Seamlessly integrated into the transparent materials that already surround you.
"There will be a future for AR glasses in some specific context," Jussi explains during our conversation. "But if you want to really kind of like accelerate the future, you need to remove these barriers. Like take these barriers away one by one so that you won't need to put a headset on your head or put glasses on your head."
The Magic Behind the Glass
So how does Distance Technologies achieve what seems like technological wizardry? The answer lies in something Jussi calls "infinite pixel depth"—a term that sounds abstract until you understand its profound implications.
Traditional heads-up displays project flat images onto your windshield, like stickers floating in space. Your eyes constantly refocus between the windshield surface and the road beyond, creating fatigue and distraction. But Distance Technologies has solved this through computational optics that sends different images to your left and right eyes.
"We send two different images to your left and right eye," Jussi explains. "So your left and right eye see a different image. They don't see a similar image because we track the user's eyes and the head. And then we have something called computational optics where we can guide the light from a single source display."
The result? Digital information that appears to exist in the physical world itself, at the exact depth where your eyes naturally focus. A warning about a pedestrian appears exactly where the pedestrian stands, not on your windshield. Navigation arrows paint themselves onto the road surface ahead. The boundary between digital and physical dissolves.
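The geometry behind this can be sketched in a few lines. The snippet below is an illustrative toy, not Distance Technologies' actual pipeline: it uses simple ray-plane intersection to compute where a windshield would need to draw a marker for each tracked eye so that the marker appears at a real-world depth. All numbers (eye spacing, windshield distance, hazard position) are assumptions for illustration.

```python
import numpy as np

def project_to_plane(eye: np.ndarray, point: np.ndarray, plane_z: float) -> np.ndarray:
    """Intersect the ray from the eye through the 3D target point
    with the windshield plane at z = plane_z; return the (x, y) hit."""
    t = (plane_z - eye[2]) / (point[2] - eye[2])
    hit = eye + t * (point - eye)
    return hit[:2]

# Hypothetical geometry (metres): eyes ~6.4 cm apart, windshield 0.8 m ahead,
# a hazard (say, a pedestrian) 20 m down the road, 1 m to the left.
left_eye  = np.array([-0.032, 0.0, 0.0])
right_eye = np.array([ 0.032, 0.0, 0.0])
hazard    = np.array([-1.0, 0.0, 20.0])

left_px  = project_to_plane(left_eye,  hazard, plane_z=0.8)
right_px = project_to_plane(right_eye, hazard, plane_z=0.8)

# The horizontal offset between the two drawings is the binocular
# disparity that makes the overlay appear 20 m away, not on the glass.
disparity = right_px[0] - left_px[0]
```

Because each eye gets its own rendering, the overlay carries real binocular disparity: as the target depth approaches the windshield plane itself, the disparity shrinks to zero and the marker appears on the glass; at road distances, your eyes converge and focus as if the marker were physically out there.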
The Promise of Expanded Perception
This technology opens remarkable possibilities. Imagine being able to see in conditions where you're currently blind—driving safely through fog, darkness, or blizzards. Picture having thermal vision to detect people or animals before they enter your path. Think about visualizing invisible data flows, electromagnetic fields, or environmental conditions that normally escape human perception.
But as with any transformative technology, expanded perception also raises important questions about design, access, and intentionality.
Who Decides What Enhancement Means?
When mixed reality surfaces become ubiquitous, important questions arise about design philosophy and user agency. If your car windshield can display information, what should it show? Safety warnings, certainly. Navigation assistance, probably. But what about advertisements? Social media notifications? Real-time commentary on the people you pass?
Jussi's approach at Distance Technologies is to provide the canvas, not paint the picture. "We at Distance, we are not building that experience, the OEMs, the car brands will build it," he explains. "We just give them a whole new canvas."
This creates both opportunities and responsibilities. On one hand, it enables innovation—different companies can experiment with different approaches, and users can choose experiences that align with their values. On the other hand, it means the quality of augmented perception will vary dramatically based on who designs it and with what intentions.
The key, perhaps, is maintaining user control. Jussi emphasizes that the technology should be "shy"—present when needed, invisible when not.
The Automotive Gateway
Distance Technologies is starting with automotive applications, and the reasoning is strategic. Cars already have computing power, power supplies, and a contained environment. Plus, the use case is immediately compelling: safer driving through enhanced perception.
"The self-driving experience is actually just as good as how you can see what the car sees in front of you," Jussi observes. "If you go into self-driving mode now—with my car, for example, it's not really self-driving, it's more like automatic cruise control—you can see in the display what cars are around this car. So you see what the car sees."
Currently, that information lives on a separate dashboard screen—disconnected from your natural field of view. But imagine if the car's sensor data—cameras, LIDAR, thermal imaging—could be visualized directly on the windshield, overlaid precisely onto the real-world objects they're detecting.
"The more trust you have in your self-driving car, I think, is correlated to how you can see what the car sees," Jussi explains. "So if we visualize what the car sees in front of you, on top of the reality, you actually start to trust the car much more."
This isn't just about better interfaces. It's about merging human perception with machine perception. And that opens fascinating possibilities: as we learn to see through the car's sensors—to perceive thermal signatures, to process LIDAR depth maps, to visualize AI predictions—we may be expanding human awareness in ways previously only imagined. We're augmenting our biological senses with technological ones, creating a hybrid form of perception that could help us navigate increasingly complex environments more safely and effectively.
The Defense Connection: Seeing Through Darkness
The implications become even more stark in defense applications. Distance Technologies has showcased technology that allows military vehicle operators to drive at night as if it were daylight—by merging infrared camera data with LIDAR depth sensing and projecting it onto the windshield through computational optics.
"We had an infrared camera, a low light camera basically, and we capture that environment in stereo mode," Jussi describes. "So we have two infrared cameras and they create this kind of infrared map of the image in front of you."
But here's what's crucial: the system combines depth data with thermal imaging to create a three-dimensional representation of the world at night. Each pixel isn't just colored—it's positioned at its actual distance from the viewer. The result is XR night driving, where darkness becomes transparent.
For soldiers, this is a life-saving capability. But extrapolate this technology into civilian life by 2035 or 2045. What happens when seeing through darkness, seeing through walls, seeing heat signatures—capabilities once limited to military specialists—become available through the mixed reality surfaces that surround us?
Are we prepared for a world where vision itself becomes a technology that can be upgraded, enhanced, and democratized? Where a person with computational optics could see endangered species in a forest at night to protect them, detect structural weaknesses in buildings to prevent disasters, or visualize air quality to make healthier decisions? The same technology that could be misused also holds tremendous potential for environmental protection, public safety, and accessibility for people with visual impairments.
The Philosophy of Manufactured Perception
Here's where we need to confront something profound: we already live in a reality where we perceive only a minuscule fraction of what's happening around us.
Human vision captures only a narrow slice of the electromagnetic spectrum. We can't see radio waves, infrared, ultraviolet, or X-rays—yet these phenomena surround us constantly. We can't perceive the quantum fluctuations underlying material reality, the complex data flows connecting our digital devices, or the vast majority of sensory information that other species detect routinely.
In a very real sense, we're already living in a filtered, limited version of reality. Our biological sensors constrain what we can know.
Mixed reality surfaces and computational optics promise to loosen those constraints. Thermal vision. Enhanced depth perception. Real-time data visualization. Information overlays that bring invisible patterns into view. These technologies don't just augment reality—they expand the bandwidth of human perception.
But—and this is critical—someone programs what information becomes visible. Someone decides which data streams get filtered and displayed. Someone owns the algorithms that translate sensor inputs into human-perceivable overlays.
This is where Distance Technologies' philosophy becomes important. As Jussi reiterates, they are not building the experience themselves—they are handing the OEMs and car brands a whole new canvas to build upon.
In other words, Distance Technologies creates the technological foundation—the ability to turn transparent surfaces into mixed reality interfaces—but the car manufacturers, building owners, and other stakeholders decide what appears on those surfaces.
This is simultaneously liberating and concerning. Liberating because it enables diverse applications and innovation. Concerning because it fragments control over perception into countless corporate and governmental hands.
Brand-Differentiated Realities
One of the most fascinating implications Jussi mentions almost casually: each car brand might create its own visual experience.
"I think the future of driving experience is not anymore like how you steer the wheel or push the gas pedal, but what kind of things and how do you see the world through the car."
Think about that. Not just "driving a Volvo" versus "driving a BMW," but experiencing the world through Volvo's perceptual framework versus BMW's. Each manufacturer curating a different overlay of information, emphasizing different aspects of the environment, interpreting sensor data through different algorithmic lenses.
This brand-differentiated reality could be exciting—imagine choosing a car partly based on how it helps you see and understand the world. One brand might excel at environmental awareness, another at social connectivity, another at pure safety. It's personalization at the perceptual level.
The key question becomes: how do we preserve interoperability and shared understanding while allowing for diverse design approaches?
Beyond Automobiles: Every Surface Becomes a Portal
Jussi's vision extends far beyond windshields. He sees potential in every transparent surface we encounter.
"If I look around me in this office that I am, I see a lot of transparent surfaces behind me, glasses looking at the other world," he reflects. "I see an opportunity when I look at transparent surface and I see that the future of mixed reality shouldn't be bound to a form factor like glasses that are on your head. They could be anywhere around."
Imagine walking through a city in 2045 where every building facade can become an interface. Emergency evacuation routes projecting from walls during a crisis. Historical reconstructions overlaying ancient architecture onto modern streets. Real-time environmental data visualizing air quality as you walk. Social information bubbles identifying friends in crowds.
The mixed reality surfaces aren't just in vehicles—they're woven into the infrastructure itself.
During our conversation, Jussi mentions an intriguing possibility: sensors embedded in city infrastructure communicating with vehicles to enable richer augmented experiences. Street markers that cars can read to enter autonomous mode. Building facades that project information visible only from certain angles or to certain vehicles.
We're talking about a hybrid physical-digital environment where the city itself becomes an active participant in shaping perception.
The Career Opportunities of Tomorrow
This technological shift will create entirely new professions and career paths. Here are some opportunities emerging as mixed reality surfaces become ubiquitous:
• Augmented Reality Experience Designers for Automotive Brands Creative agencies that design how each car brand "sees" the world through its windshield. They make sure a Volvo feels like a Volvo whether you're driving in Stockholm or Tokyo—same values, different cities. Example: A Volvo windshield in Stockholm highlights Nordic design and sustainable routes, while the same model in Tokyo emphasizes precision and urban efficiency—different contexts, same Scandinavian soul.
• Spatial Filmmakers & "Surface Cinema" Directors Content creators who make films and experiences designed specifically for transparent surfaces—not traditional screens. They understand how to tell stories that blend digital imagery with the physical world you're looking at. Example: Museums with glass walls that show historical reconstructions—walk past a window and watch ancient Rome materialize over the modern plaza outside, or see Shakespeare's Globe Theatre appear where you're standing.
• Smart Surface Material Engineers Companies making the next generation of glass and transparent materials specifically designed to work with these new display technologies. These aren't just regular windows—they're "smart surfaces" built from the ground up for mixed reality. Example: Glass that's embedded with special layers to make projections sharper and clearer, or windows that can automatically adjust how much light they let through based on what's being displayed.
• Perceptual Safety Consultants Experts who test how much information people can handle on augmented surfaces without getting overwhelmed or distracted. They figure out the sweet spot between helpful and too much. Example: Testing how many navigation arrows can appear on a windshield before drivers start making mistakes, or designing systems that make sure emergency warnings always catch your attention no matter what else is displayed.
• Urban Mixed Reality Architects City planners who design buildings and public spaces where the glass surfaces tell stories, provide information, or help during emergencies. They turn entire cities into interactive environments. Example: During a fire, every glass surface in a building automatically shows personalized escape routes to each person, or museum districts where windows display what the neighborhood looked like 200 years ago.
• Vehicle Social Network Architects Developers creating the "social media of autonomous vehicles"—platforms where cars interact with each other, check into locations via sensors, charge at "feeding stations," work autonomously for their owners, and build reputations that affect their market value. Example: Your autonomous Tesla works as a rideshare while you sleep, earns five-star ratings from passengers, "levels up" its insurance tier, and increases its resale value—while sharing road hazard data with other vehicles in real-time through a decentralized network.
These aren't science fiction careers—they're logical extensions of what happens when computational optics becomes infrastructure. Entrepreneurs, executives, and professionals paying attention now will be positioned to lead these emerging fields.
The Question of Agency and Design
This brings us to important considerations about how these technologies develop: as mixed reality surfaces proliferate, how do we ensure they serve human flourishing?
The challenge isn't just technical—it's about values embedded in design. When computational optics can reveal or conceal information, emphasize certain data while de-emphasizing others, the design choices become profoundly important.
But there's also tremendous potential for empowerment. Imagine local communities deploying mixed reality surfaces to visualize environmental data that corporations might prefer to hide. Citizen groups using transparent interfaces to project information challenging official narratives. Artists transforming public spaces through layered visual experiences.
The technology itself is neutral. The question is: will it democratize perception or concentrate control over it?
Jussi seems aware of these tensions. When discussing the learning curve for older drivers adapting to augmented windshields, he returns to the idea of "shy" technology.
"Our technology or the visuals is not supposed to be immediately like, you know, it's full of graphics," he explains. "On the contrary, opposite of that one, it needs to be shy. It needs to be so shy that, you know, when it's needed, it's there. But when you need it, then you can kind of like control it."
The principle here is important: augmentation serving human intention rather than overwhelming it. Information available on demand rather than forcibly imposed. The user maintains agency over their perceptual experience.
This design philosophy—shy technology, user control, augmentation that respects human perception rather than dominating it—points toward a more hopeful path forward.
What We Already Don't See
Perhaps the most mind-expanding insight from thinking about mixed reality surfaces is how they highlight what we already don't perceive.
Right now, in the space around you, countless phenomena are occurring that your senses simply cannot detect. Radio communications. Infrared heat signatures. Microscopic biological activity. Quantum superpositions. Electromagnetic fields. Chemical gradients in the air. Ultrasonic sounds. The data packets flowing through nearby devices.
You exist in a vastly richer reality than the thin slice your biology lets you perceive. And you've adapted to this limitation so thoroughly that you rarely even think about it.
Computational optics and mixed reality surfaces offer the possibility of expanding that perceptual bandwidth—of revealing layers of reality that have always existed but remained invisible. In that sense, this technology doesn't create artificial realities so much as unveil hidden ones.
The infrared night vision that Distance Technologies demonstrated for military vehicles? It's not inventing heat signatures. It's simply making visible the thermal radiation that was always there, that nocturnal animals perceive routinely, but that human eyes cannot detect.
This reframing is crucial. These technologies don't necessarily distance us from reality—they might actually bring us closer to it, by compensating for the limitations of human sensory biology.
The Path Forward: Barriers and Possibilities
Distance Technologies faces significant challenges in bringing this vision to life. Manufacturing at scale. Integration with existing automotive systems. Regulatory approval for augmented driving. Competition from established players and tech giants.
But Jussi is optimistic, drawing from his decade of experience at Varjo building VR headsets with human-eye resolution.
"The good thing is that we have 10 years of experience optimizing latencies for mixed reality headsets and so on," he notes. "So we know a thing or two about how to manage that kind of latency."
The company is focusing initially on defense and automotive because those industries have clear use cases, available computing infrastructure, and willingness to invest in cutting-edge technology. But the long-term vision is broader: ubiquitous mixed reality surfaces that transform how we interact with built environments.
"I do believe in a future where the self-driving experience is actually just as good as how you can see what the car sees in front of you," Jussi reflects. And then, expanding further: "The next world might be that, you're, you know, in a five decade or so you're strolling around the city and transparent surfaces like around me are getting augmented without you needing to actually put anything on your head."
Consciousness, Perception, and Existence
This brings us full circle to the deepest questions. If reality is, as James Glattfelder suggested in our previous conversation, fundamentally constructed from information, then computational optics represents an evolution in how that information reaches us.
Of course, information has always been filtered, curated, and shaped—by media, by education, by cultural narratives, by our own cognitive biases. What's different now is that computational optics makes this process more immediate and seamless. The mediation happens not on a separate screen you consciously choose to look at, but overlaid directly onto your primary field of view. The line between "looking at information" and "looking at the world" blurs.
Are mixed reality surfaces tools for enhancing perception—or infrastructure for constructing new forms of reality?
The answer might be: both. And that's what makes this moment in history so crucial.
We're developing technologies that can expand human awareness beyond biological limits, revealing dimensions of reality previously hidden. But we're also creating systems that could be used to filter, manipulate, and control what populations perceive as true.
The same infrastructure that enables us to see through darkness could blind us to inconvenient truths. The same computational optics that enhance our ability to navigate complex environments could simplify them in ways that serve particular interests.
Understanding these technologies—how they work, who controls them, what they reveal and what they obscure—becomes a form of epistemic self-defense. A way of maintaining agency in a world where perception itself becomes programmable.
The Choice Ahead
By 2035, if Distance Technologies and similar companies succeed, we'll inhabit a world where transparent surfaces routinely augment our perception. Car windshields will overlay navigation and safety information. Building windows will display environmental data and interactive interfaces. Perhaps even the glass in our homes will transform into portals for remote communication or entertainment.
The question isn't whether this future arrives—the technology is already being developed. The question is: how do we shape it?
Can we build systems that prioritize transparency about what information is being displayed? That give users genuine agency—the ability to filter, customize, or disable augmentation? That ensure diverse perspectives and competing approaches rather than monopolistic control?
The exciting part is that we're still early enough in this technological evolution to influence its direction. Jussi's vision—removing barriers between human vision and computer vision, making augmentation seamless and ubiquitous—opens tremendous possibilities. It could help us see environmental damage in real-time, navigate cities more safely, understand complex systems more intuitively, and expand human consciousness in ways we're only beginning to imagine.
The challenge is ensuring these capabilities develop in ways that empower people rather than just serving narrow interests.
Opening Our Minds to New Ways of Seeing
Perhaps the most valuable lesson from contemplating mixed reality surfaces and computational optics isn't about the technology at all. It's about recognizing that our everyday perception is already limited, filtered, and constructed.
We perceive a narrow slice of electromagnetic radiation as "light." We interpret chemical molecules as "smells" and "tastes." We translate pressure waves into "sound." Our brains constantly edit our sensory experience, filling in blind spots, filtering out inconsistencies, constructing a stable model of reality from fragmentary inputs.
In a profound sense, we already live in augmented reality. The augmentation just happens biologically rather than technologically.
Technologies like those Distance Technologies is developing make this reality explicit. They externalize the process of selective perception, turning it from unconscious biology into visible infrastructure. And in making it visible, they give us the opportunity to question, understand, and potentially influence how reality is constructed.
Maybe that's the real revolution. Not the technology itself, but the consciousness it provokes about the nature of perception, reality, and truth.
As we move toward 2035 and beyond—toward Jussi's vision of cities where every transparent surface can become an interface, where digital and physical seamlessly merge, where human and machine perception blend—we're not just adopting new tools. We're confronting fundamental questions about what it means to see, to know, to trust our senses, and to share a common reality with others.
The mixed reality surfaces being developed today will shape the perceptual landscape of tomorrow. Understanding them, questioning them, and participating in decisions about their deployment isn't just about technology adoption. It's about the future of human consciousness itself.
____
This article is based on the Mizter Rad Show episode #50 featuring Jussi Mäkinen from Distance Technologies and was polished by AI.
Listen to the full conversation with Jussi Mäkinen on the Mizter Rad Show, where we explore the future of perception, the technology of computational optics, and what it means to see the world through augmented surfaces.
Stay curious, question everything, and maybe, just maybe, start paying attention to the transparent surfaces around you—because soon, they'll be paying attention back (if they have not started to listen already).
Mizter Rad