5 ways to interact with computers in the future

Since the birth of the personal computer, the mouse has been the primary way humans interact with computers. Computers have become far more numerous and mobile, yet the way we interact with them has barely changed.

Recently, revolutionary products and inventions have begun to offer entirely new human-computer interaction experiences. In the pursuit of greater efficiency, the mouse and keyboard may one day be abandoned altogether.

The following emerging technologies may change the way we interact with computers.

Multi-touch

A mouse or a laptop's trackpad lets you double-click icons and drag windows with clicks, but multi-touch can execute complex commands with simple finger movements. The well-known iPhone screen uses this technology: pinch two fingers to zoom a picture in or out, or swipe to scroll a page.

A series of Apple products, the iPhone, iPod touch, MacBook, and the latest, the iPad, use this technology extensively, and other manufacturers have begun to follow in Apple's footsteps. Even Apple's new "Magic Mouse" is a touch device with gesture recognition. Multi-touch improves the efficiency of command input on ordinary computers: five fingers replace the single mouse pointer on the screen. But when people use multi-touch directly on a screen, their hands block the view. R. Clayton Miller invented an interface called 10/GUI to address this limitation of multi-touch: users rest their hands on a large touch surface and operate the screen with all ten fingers.

On the screen, ten visible dots represent the user's ten fingers. By pinching or moving their fingers, users can open programs and scroll pages.
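The pinch gesture mentioned above reduces to simple geometry: the ratio of the distance between two fingers before and after a movement gives the zoom factor. A minimal sketch (the function name and coordinate convention are illustrative assumptions, not any particular platform's API):

```python
import math

def pinch_scale(prev_touches, curr_touches):
    """Compute a zoom factor from two finger positions before and after a move.

    Each argument is a pair of (x, y) tuples. The ratio of the distance
    between the two fingers gives the scale to apply to the content.
    """
    def distance(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)

    return distance(curr_touches) / distance(prev_touches)

# Fingers moving apart -> zoom in (factor > 1)
print(pinch_scale([(100, 100), (200, 100)], [(50, 100), (250, 100)]))  # 2.0
```

A factor above 1 zooms in, below 1 zooms out; real touch frameworks apply the same ratio continuously as the fingers move.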

Gesture sensing

Both the mouse wheel and the iPhone can sense motion, but only gesture sensing allows actions to be performed in three-dimensional space.

In recent years, Nintendo's Wii game console has introduced gesture sensing to the masses, and many manufacturers have since released gesture-sensing products for gamers.

Bloom, a Los Angeles company, targets desktop users. It has developed a product called "g-speak": wearing a special glove, a user stands in front of wall-mounted screens and, with a pointing gesture like a policeman's, moves images and data from one screen to another (similar technology appeared in Steven Spielberg's 2002 film "Minority Report").
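At its core, turning a pointing gesture into an on-screen target is a ray-plane intersection: the glove reports a 3D hand position and direction, and the system finds where that ray hits the screen. A simplified sketch, assuming the screen lies in the z = 0 plane (the function and coordinate frame are illustrative, not g-speak's actual interface):

```python
def point_on_screen(hand_pos, hand_dir):
    """Project a 3D pointing gesture onto a screen lying in the z = 0 plane.

    hand_pos: (x, y, z) position of the hand, with z > 0 in front of the
    screen. hand_dir: (dx, dy, dz) direction the hand points; dz must aim
    toward the screen (dz < 0). Returns the targeted (x, y) point, or None
    if the gesture points away from the screen.
    """
    x, y, z = hand_pos
    dx, dy, dz = hand_dir
    if dz >= 0:
        return None  # pointing away from (or parallel to) the screen
    t = -z / dz  # ray parameter at which the z-component reaches zero
    return (x + t * dx, y + t * dy)

# Hand 1 m from the screen, pointing straight ahead and slightly right
print(point_on_screen((0.0, 1.5, 1.0), (0.2, 0.0, -1.0)))  # (0.2, 1.5)
```

With several screens, each gets its own plane, so the same gesture can "carry" data from one surface to another.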

Christian Rishel, CEO of Bloom, believes this interface can free people from drowning in data: "If you are submerged in a sea of data, then you have to find the right data at the right time."

Rishel said that early adopters of this expensive technology include the military and oil companies, but he believes that within five to ten years all computers will use some form of it.

By moving past the two-dimensional limits of today's human-computer interaction, Rishel believes, the technology will make interaction more efficient and rewarding. "We will fish the data out and paint it on the walls," he joked.

Speech Recognition

What if we could talk directly to the computer? The concept of speech recognition has been around for decades, and a series of software products has been developed. Most act as transcription machines, and input speed is often only a third of normal speaking speed. According to Nuance, the Massachusetts company that develops "Dragon NaturallySpeaking", its product lets people with physical disabilities who cannot use a traditional keyboard or mouse operate computers. Peter Mahoney, the company's vice president and head of the Dragon project, said: "We already have a core group of customers... They use our software 100% of the time they are on their computers."

Peter Mahoney gave some examples of how Dragon recognizes voice commands and executes them. In Microsoft Word, for instance, saying "underline" while dictating makes the software underline the text. Users can issue layout commands ("new line") and menu commands ("restore changes") simply by speaking them. Saying "go online" opens the browser, and voice recognition also lets users select hyperlinks by reading them aloud. Other applications such as e-mail can likewise be driven by simple voice commands.
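The pattern described above, where some phrases are commands and everything else is dictated text, can be sketched as a small dispatcher. This is an illustrative toy, not Dragon's actual engine; the phrase names and underscore "underlining" are assumptions:

```python
def make_dispatcher():
    """Build a toy voice-command handler over a dictated text buffer."""
    text = []  # one entry per line of dictated text

    def underline():
        # Mark the current line as underlined (here, with underscores).
        if text:
            text[-1] = f"_{text[-1]}_"

    commands = {
        "new line": lambda: text.append(""),
        "underline that": underline,
    }

    def handle(phrase):
        """Treat known phrases as commands; dictate everything else."""
        action = commands.get(phrase.lower())
        if action:
            action()
        else:
            if not text:
                text.append("")
            text[-1] += phrase
        return "\n".join(text)

    return handle

handle = make_dispatcher()
handle("hello world")
print(handle("underline that"))  # _hello world_
```

Real recognizers work the same way at a higher level: the acoustic model produces a phrase, and a grammar decides whether it is a command or content.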

“The voice interface is very flexible, and its application prospects are limitless,” Mahoney said in a telephone interview. “It has much more potential than physical equipment.”

Eye tracking

Since we look at what we are about to click, why not just use our eyes to do it?

Eye-tracking technology uses high-definition cameras and invisible infrared light sources to detect where the eye is looking. The technique has proved very useful in scientific and advertising research, but most of the time it is used by people with disabilities, and it is very expensive. GUIDe (Gaze-enhanced User Interface Design) is a research project dedicated to popularizing eye tracking. Its "EyePoint" software lets a user rest a hand on a trigger pad and fix their eyes on a point on the screen; the area around that point is magnified, and the user then squeezes the pad to activate the target.
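The magnification step exists because gaze estimates are noisy: the user's first look picks a region, the region is enlarged, and the refined look inside the enlarged view is mapped back to screen coordinates. A conceptual sketch of that mapping (the function, parameters, and linear model are assumptions for illustration, not the actual EyePoint implementation):

```python
def eyepoint_click(gaze, refined, magnification=4):
    """Sketch of a 'look, magnify, confirm' selection step.

    gaze: (x, y) initial fixation point; a region around it is shown
    magnified, centered on the same point. refined: (x, y) where the
    user's second look lands in the magnified view when the pad is
    squeezed. Returns the click target in original screen coordinates.
    """
    gx, gy = gaze
    rx, ry = refined
    # Undo the magnification about the gaze point to recover the
    # true (smaller) offset on the unmagnified screen.
    return (gx + (rx - gx) / magnification, gy + (ry - gy) / magnification)

print(eyepoint_click((400, 300), (440, 300)))  # (410.0, 300.0)
```

The key design point is that the eye never needs pixel precision; it only needs to be accurate to within the magnified region.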

After trying the EyePoint software, testers felt that gaze-based control "will make human-computer interaction faster and easier... because they are already looking at the target." Manu Kumar led the project at Stanford University a few years ago.

Kumar said EyePoint also puts less strain on the wrist than a traditional mouse, though the error rate of "look and click" is slightly higher than that of "point and click". "I firmly believe that this technology will develop to replace the mouse." So far, however, high cost remains the biggest obstacle to the spread of eye tracking.

Brain-controlled computer

Just think, and the computer does it for you. This ultimate form of human-computer integration is arriving faster than you might expect, but some obstacles remain before it can enter the ordinary consumer market.

A brain-computer interface (BCI) translates the pulses of nerve cells directly into actions on a screen or a machine. Like speech recognition, BCI has helped many people with physical impairments, such as stroke patients and patients with amyotrophic lateral sclerosis. Over the past ten years, BCI has helped many patients who cannot move their bodies use computers.

One problem has long plagued the development of commercial BCI for healthy users: electrodes must be implanted in the brain to obtain clear neural signals, which can cause infection, rejection, and scarring. Non-invasive alternatives such as EEG, however, need only a cap of electrodes worn on the scalp to read activity from the cerebral cortex, and these techniques have advanced rapidly in recent years.

At the CeBIT exhibition held in Germany earlier this month, Guger Technologies showed a new device called "Intendix", which it calls "the world's first BCI speller". The screen displays a virtual keyboard of numbers and letters; when you concentrate on the character you want, Intendix detects your brain activity and selects it. The company claims Intendix can help injured and sick people communicate, that learning to use it takes only a few minutes, and that it can spell 5 to 10 characters per minute. That is, of course, too slow for healthy users, and the device costs as much as $12,000.
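Spellers of this kind typically flash the rows and columns of the character grid and score the brain's response to each flash; the intended character sits at the intersection of the strongest row and column. A simplified sketch of that selection principle (illustrative only, not Intendix's actual algorithm; the scores stand in for averaged EEG responses):

```python
def speller_select(grid, row_scores, col_scores):
    """Pick the character a flashing-grid BCI speller would select.

    grid: list of equal-length strings, the character matrix on screen.
    row_scores / col_scores: averaged brain-response strength for each
    row and column flash. The intended character is at the intersection
    of the strongest row and the strongest column.
    """
    best_row = max(range(len(row_scores)), key=row_scores.__getitem__)
    best_col = max(range(len(col_scores)), key=col_scores.__getitem__)
    return grid[best_row][best_col]

grid = ["ABC", "DEF", "GHI"]
print(speller_select(grid, [0.1, 0.9, 0.2], [0.3, 0.2, 0.8]))  # F
```

Because each selection needs many flashes to average out EEG noise, throughput stays at a few characters per minute, which matches the speeds quoted above.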

A related field of research, neural prosthetics, connects devices to the human brain and drives them with neural signals; it too may eventually lead to workable desktop applications.

Whatever the future of human-computer interaction holds, it seems we will all keep using the mouse for a long time yet, just as we always have.
