Telepresence

[Image: An example of a telepresence helmet, suitable for a human-like creature.]

Telepresence is defined as a biological-computer-machine condition in which a user receives sufficient information about a remote, real-world site through a machine that the user feels physically present at that site. With telepresence, a user’s physical presence (body) exists at one location (the home site) while his ability to act and interact with a remote, real-world environment occurs at one or more other locations (the remote site or sites). Telepresence is achieved by using advanced sensors, communications, remote action, and feedback stimuli to allow personnel to project their presence to remote sites, giving a user the ability to see, hear, touch, taste, and smell those sites. Users with telepresence capabilities are able to act and receive stimuli just as if they were present at the remote, real-world location.

The Joz use telepresence to direct robotic platforms to conduct high-risk battlefield recovery and maintenance. Telepresence is also used to operate materials-handling equipment and conduct port discharge and depot repair operations. In those cases, operators have greater control over processes without incurring additional safety or manpower burdens within the battlespace.

Achieving telepresence requires a telepresence system composed of three essential subsystems (and their related technologies): the home site, the communications link, and the remote site. The technologies associated with these three subsystems are similar to those found in virtual reality, except that in telepresence the user must receive feedback stimuli from, and be able to exercise control over, the remote site. The ultimate goal of a telepresence system is a transparent link between user and machine that passes information so naturally between user and environment that the user achieves a complete sense of immersion in the remote-site environment.
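As a minimal, purely illustrative sketch of this three-subsystem structure, the outline below models a home site, a communications link, and a remote site as simple data types. Every class name, field, and figure (such as the latency values) is an assumption introduced here for clarity, not a specification taken from this article.

    from dataclasses import dataclass, field

    # Illustrative only: the names below simply mirror the
    # home site -> communications link -> remote site layout.

    @dataclass
    class HomeSite:
        """Where the user's body is; renders feedback and captures commands."""
        feedback_channels: list[str] = field(
            default_factory=lambda: ["visual", "audio", "haptic"])

    @dataclass
    class CommunicationsLink:
        """Carries commands outbound and sensory feedback inbound."""
        bandwidth_gbps: float = 10.0        # assumed value
        round_trip_latency_ms: float = 5.0  # assumed value

    @dataclass
    class RemoteSite:
        """Where the effectors (e.g. a robotic platform) act and sense."""
        effectors: list[str] = field(
            default_factory=lambda: ["camera", "microphone", "manipulator"])

    @dataclass
    class TelepresenceSystem:
        home: HomeSite
        link: CommunicationsLink
        remote: RemoteSite

        def is_transparent(self, latency_budget_ms: float = 20.0) -> bool:
            # "Transparency" is reduced here to a single latency check for brevity.
            return self.link.round_trip_latency_ms <= latency_budget_ms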

To accomplish this, sensory impressions obtained from the remote site and delivered to the home site must engage the senses fully, with sufficient breadth, volume, and quality. As an example, consider that you are at a home site viewing a video of a remote-site environment on a theater-sized screen. The video captures only a small portion of the much larger real-world environment present at the remote site. If the video image on a large, curved screen fills most of your field of view and depicts natural human motion, then the visual element becomes perceptually real. Engaging the other senses similarly and synchronously lets you progress toward the feeling of being fully immersed in the remote, real-world environment. The ideal situation occurs when high-quality, high-resolution, and consistent information is presented to all of your senses.

Telepresence requires a complete biological-computer-machine interface that incorporates audio, visual, haptic (touch), olfactory (smell), and gustatory (taste) technologies, with home-site elements perceptually identical to remote, real-world elements.

To date, within the Federation the senses of sight and hearing have been the focus of intense research and have formed the core of virtual reality systems, because these are the most important senses by which we receive information about our surroundings. The contributions of smell, touch, and taste are not as great, and current technologies for reproducing smell and taste are difficult to implement. One might argue that, compared with the other senses, taste (and perhaps smell) plays a marginal role in creating a full-immersion experience. Nevertheless, technological progress will eventually enable all human senses to be engaged in the telepresence experience.

The Joz, in contrast, have studied the complete-immersion process, in which the body is fully immersed in a semiconductive, bioreactive gel that stimulates the nerve endings when the gel interacts with microburst electrical currents. In addition, specialized implants translate some of these signals (in a specific wavelength) into auditory, olfactory, or gustatory stimuli as needed.

Aside from the biological-computer-machine sensory interface, communication between the home and remote sites must be “real time” so the user feels that he is indeed in the remote, real-world environment. Latency at either the home or the remote site detracts from the realism that telepresence systems seek to achieve. While any communications link may be used by a telepresence system, a faster-than-light communications method is usually employed. The specific type of link depends upon distance, bandwidth requirements, latency tolerance, availability of services between sites, and so on. The Joz use fractal communications to achieve high-fidelity immersion in their military applications.
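As a rough, hypothetical illustration of the “real time” requirement, the sketch below sums the delays on the round trip from user command to returned feedback and compares the total against an assumed perceptual threshold. The 20 ms figure and the component delays are assumptions made for the example, not values stated in this article.

    # Illustrative latency-budget check; all numeric values are assumptions.
    PERCEPTUAL_THRESHOLD_MS = 20.0  # assumed limit before the user notices lag

    def total_latency_ms(home_processing_ms: float,
                         link_one_way_ms: float,
                         remote_processing_ms: float) -> float:
        # A command travels out, the effector acts, and feedback travels back,
        # so the link delay is paid twice per interaction.
        return home_processing_ms + 2 * link_one_way_ms + remote_processing_ms

    def feels_real_time(home_ms: float, link_ms: float, remote_ms: float) -> bool:
        return total_latency_ms(home_ms, link_ms, remote_ms) <= PERCEPTUAL_THRESHOLD_MS

    # Example: 3 ms at each end plus a 5 ms one-way link gives 16 ms in total,
    # which stays inside the assumed 20 ms budget.
    assert feels_real_time(3.0, 5.0, 3.0)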

Sensory information from the remote site is gathered by effectors. An example of an effector is a complex robotic system. Robots controlled by users from remote locations carry out operations required in the field. Robots must be able to perform myriad tasks with the versatility of their users and, in some cases, with strength exceeding theirs.

Joz robots can see and hear and, surprisingly, also possess extensive haptic, olfactory, and gustatory sensory elements. Although primarily purpose-built for specific tasks, these advanced robots can perform multiple tasks and adapt.

Advanced quantum computation makes telepresence a reality. With the ability to perform billions of calculations in milliseconds, the quantum computers used by the Joz make it possible to control complex maneuvers through telepresence.

In the Joz model, full-immersion telepresence is realized with completely noninvasive sensory stimulation delivered directly to the brain. Command and sensory information sent from a user to a robot is transmitted through noninvasive brain-computer interfaces; in effect, users can control robots just by thinking. Finally, configurable, remotely assembling components (robot swarms) can adapt to remote, real-world environments, thereby enhancing telepresence capabilities and applications.
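A toy sketch of this control path, assuming a greatly simplified decoder and swarm, is given below. The function and class names, the command set, and the terrain types are all hypothetical and are not drawn from any actual brain-computer-interface or swarm system.

    import random

    # Hypothetical sketch: a decoded "thought" command is broadcast to a small
    # swarm, and each unit adapts its role to the terrain it senses locally.

    def decode_intent(neural_sample: list[float]) -> str:
        """Stand-in for a noninvasive BCI decoder mapping a signal to a command."""
        return "advance" if sum(neural_sample) > 0 else "hold"

    class SwarmUnit:
        def __init__(self, unit_id: int):
            self.unit_id = unit_id
            self.role = "idle"

        def adapt(self, command: str, terrain: str) -> None:
            # Each unit reconfigures itself for the command and local terrain.
            self.role = f"{command}:{terrain}"

    def command_swarm(neural_sample: list[float], swarm: list[SwarmUnit]) -> None:
        command = decode_intent(neural_sample)
        for unit in swarm:
            terrain = random.choice(["open", "rubble", "confined"])  # sensed locally
            unit.adapt(command, terrain)

    swarm = [SwarmUnit(i) for i in range(4)]
    command_swarm([0.2, -0.1, 0.4], swarm)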