Lenovo and How ‘Star Trek: The Next Generation’ Got the Holodeck Wrong

Like many of you, I was fascinated by the concept of holodecks when “Star Trek: The Next Generation” (TNG) first aired. For many of us, it set the standard for what a future metaverse or 3D web implementation should become.

The holodeck was a photorealistic virtual world based on the idea of hard light: photons rendered as solid objects to entertain the crew. A starship that spends long periods away from its home port needs some form of leisure; Soviet submarines had enormous pools for the same reason.

While the concept wasn’t new, digital realities had rarely been realized beyond games built mainly to entertain. Simulations created for more practical purposes, such as military training, date back decades, but only a tiny fraction of people had ever encountered them. That limited exposure, along with significant production-cost constraints, likely kept the show from taking the technology where it could have gone. The error becomes apparent as we explore ways to recreate holodeck-like experiences today.

Let’s examine how TNG took holodeck technology in the wrong direction or, at the very least, didn’t use it the way it would be used in the real world. We’ll close with my product of the week: a smartphone and smartwatch from Gabb Wireless that will keep your children, and perhaps older adults, much safer.

Simulation-to-Interface Optimization

The issue with TNG’s holodeck technology wasn’t apparent to me when I first watched the show, or even afterward. It became evident while I was listening to the keynotes at Lenovo’s virtual Tech World event last week.

Lenovo may have the most current tools for exploring commercial interfaces and metaverse-like constructs. At the event, the company showcased strong relationships with core technology providers that support it wherever mixed reality is used, for instance, in holodeck-like video conferencing based on VR. Unlike Meta’s initial prototypes, these offerings appear to have legs.

Lenovo’s tools include various conference and huddle-room solutions built around improved avatars, including one that scans participants live with a 3D scanner, creating an immersive experience far closer to a holodeck than Facebook’s cartoon-like characters.

It’s a bit like the holographic doctor in “Star Trek: Voyager,” who, thanks to a unique mobile emitter, could appear not just in places equipped with holo-emitters but could even leave Voyager entirely.

Several episodes across the various series recreated a ship’s bridge and control interfaces inside the holodeck, tricking characters, and sometimes viewers, into believing they weren’t in a simulation.

What’s the problem?

So, if you can create anything out of light, even people, why would you want fixed interfaces on your vessel, and why would you limit yourself to a living crew?

How the Metaverse Could Change Human-to-Machine Interfaces

We’ve frequently discussed how the massive AI revolution will remove the need to master technology tools. As we’ve already seen with AI-based writers and artists, users need only describe what they want to achieve to get the desired result. If they want a piece written on a particular subject, they describe the task, and the AI produces the paper. They can likewise describe what they’d like conveyed in visual form, and the AI creates it.

Fast-forward a few hundred years to the era of the “Star Trek” stories.

Doesn’t that suggest the human-machine interfaces throughout the Enterprise would be rendered in hard light, dynamically adapting to the needs of the user and the situation at hand? And that they might be largely redundant anyway because the AI would already be doing most of the work the crew does on its own?

Physical Drones vs. Hard Light Human Digital Twins

“Star Trek: Discovery” recently featured drones as a tool, and TNG did include an android known as Data. However, what is the point of heavily staffing a starship at all if you could create digital humans that can’t be distinguished from living ones?

Furthermore, if you can create complex objects out of light, why shouldn’t your control interfaces respond to the situation rather than being fixed? And if you can project crew members anywhere, in any role, why put them in a precarious position at the ship’s skin on the upper deck instead of in a centrally armored location deep inside the vessel?

I’m making this point because, with the advent of any new technology, we initially imitate how we used to do things. Over time, we move away from those obsolete models and eventually build on what the new technology makes possible. When we talk about the metaverse, we talk about digital twins. However, what happens if you only need the twin and not the physical object?

For example, if you’re working from the metaverse, your interface can be anything you imagine it to be, so long as your body can use it effectively. There is no need to build a physical cubicle or office; they can all be rendered with metaverse technologies, and you would bring into objective reality only what you’ve created that actually needs to exist there.

Imagine you’re writing a report, attending a conference, doing research, or designing new products. In these scenarios, metaverse technology can offer options you won’t find in real life and a more accessible interface to other technologies, such as 3D printers that turn what you imagine into physical reality.

In daily life, you may reside in a tiny home, but in the metaverse, you could live in a massive digital mansion that needs far less upkeep than a real one. You could complete a remodel simply by describing what you’d like changed: no contractors, no costs, and no living with a long-term result you regret.
