
2022-07-29

Yesterday, today and tomorrow of human-computer interaction

What is human-computer interaction? In short, it is the way people interact with machines. People engage with computers because they regard them as effective tools for computing services. Since the computer came into being, its computing power has grown ever more formidable, and the human attitude toward the computer has swung between seeing it as merely a dull machine and wondering whether it might become as smart as people, or even smarter. This ambivalent attitude has evolved along with the evolution of the computer itself.

The past and present of human-computer interaction

To date, the computer has passed through several typical forms: the abacus, the relay calculator, the mainframe computer, the personal computer, the intelligent mobile terminal, and so on. The evolution of the computer's form has directly driven changes in the form of human-computer interaction.

The abacus is a computing tool with Chinese characteristics. Because it cannot store intermediate results, it is complex to use and slow in calculation. The relay calculator is the prototype of today's computer, but its calculation speed and practicality were greatly limited.

The first real computer came into being in 1946. It covered an area of 170 square meters, weighed more than 30 tons, and was said to be half the size of a football field. Input and output were done through perforated paper tape. Early computer interaction was not only complex but also slow.

In November 1983, Steve Jobs of Apple first demonstrated the Macintosh computer at the Comdex exhibition. Since then, the black-screen character interface of the personal computer has gradually been replaced by vivid, personalized graphical interfaces. The graphical interface made the mouse and keyboard powerful tools for operating the computer freely, and it also made tactile and pen-based interaction possible.

In 1971, Samuel Hurst, a young researcher at the University of Kentucky Research Foundation, developed the AccuTouch touch sensor, which became a bellwether for touch technology and laid the foundation for the development of transparent touch screens. Three years later, Hurst developed the first true touch screen.

The year 1985 saw the first mobile phone in the modern sense. It placed the power supply and antenna in a box and weighed up to 3 kg. It was so cumbersome and inconvenient that interactive operation was impossible.

In 1993, with rapid miniaturization and the development of touch screen technology, IBM and BellSouth launched the IBM Simon, the first true touch-screen smartphone in history. At that time, however, the touch screen had no obvious advantage over ordinary keys in terms of cost, durability, maintainability, or usage habits. This situation was not completely changed until the iPhone developed by Apple came out. Since then, touch has become the mainstream of interaction.

Since the 1990s, with the development of handwriting recognition, speech recognition, and network communication technology, human-computer interaction has undergone great changes. An ordinary child who understands nothing of programming or networking principles may now use a computer more fluently than the computer scientists and network engineers of earlier eras. This change of mode has made human-computer interaction simple, and it took place in less than 30 years. Predictably, computers will become more intelligent in the future, and it will be even more natural and simple for humans to use them.

Will robots be smarter than humans in the future?

As depicted in some science-fiction movies, the future mode of human-computer interaction is free, intelligent, and natural: Baymax, the warm-hearted intelligent health-care robot in Big Hero 6; Jarvis, the artificial-intelligence butler in Iron Man 2; and TARS, the capable, smart, and humorous robot in Interstellar. These future computers understand human language through speech recognition and related technologies, judge people's emotions through gesture recognition and emotion detection, and judge a person's current physiological state through non-contact biometric detection and recognition. With the help of an expert knowledge base, such intelligent computers can make decisions and judgments about a person's health, current emotions, and current state. To some extent, robots in the future may be even smarter than humans. For example, Baymax can obtain pulse and heartbeat data without contact and detect EEG activity; Jarvis can follow Iron Man's instructions, display a model of the interior of an atomic nucleus in mid-air by holographic projection, and even correct Iron Man's slips of the tongue.

In the future, these healthy and intelligent humanoid computers may surpass people in many respects. The future of human-computer interaction, though it has not yet arrived, is bright and exciting.

Of course, science fiction has never lacked works that question humanity's abuse of scientific and technological achievements: HAL 9000, the steward of deep-space exploration in 2001: A Space Odyssey; Ash and Bishop, the androids in the Alien series who obey only the tasks set by their masters; and David, the artificial intelligence in Prometheus who betrays humans to complete his task. Such computers violate the Three Laws of Robotics. From the perspective of human moral judgment they are evil, yet they are right in terms of the mission their owners gave them when they were made. How to develop computer intelligence and human-computer interaction is therefore a question worth pondering.

Bidding farewell to single-mode interaction

Multimodal natural human-computer interaction is the development trend of the next generation of human-computer interaction. A multimodal interaction mode that integrates vision, hearing, touch, smell, and even taste conveys information more efficiently and expressively than a single visual or auditory mode.

Multimodal natural human-computer interaction is mainly divided into three modules: multimodal interactive information input, multimodal interactive information fusion and processing, and multimodal interactive information feedback. The input module receives a person's audio-visual, tactile, olfactory, and gustatory signals; the fusion and understanding module then forms perception and cognition from these signals; and finally, feedback is generated for the user with the help of an expert knowledge base and retrieval technology. Together these modules constitute a multimodal natural human-computer interaction system.
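As a rough illustration of how these three modules might fit together, here is a minimal Python sketch. All of the names (Percept, fuse, feedback) and the toy knowledge base are hypothetical and purely illustrative; a real system would plug in actual speech, gesture, and emotion recognizers and a far richer knowledge base.

```python
from dataclasses import dataclass
from typing import Dict, Optional

# Hypothetical container for one round of multimodal input (Module 1).
@dataclass
class Percept:
    speech_text: Optional[str] = None   # output of a speech recognizer
    gesture: Optional[str] = None       # output of a gesture recognizer
    emotion: Optional[str] = None       # output of an emotion detector

def fuse(percept: Percept) -> Dict[str, str]:
    """Module 2: fuse the separate channels into one interpretation."""
    return {
        "intent": percept.speech_text or "unknown",
        "mood": percept.emotion or "neutral",
        "gesture": percept.gesture or "none",
    }

def feedback(interpretation: Dict[str, str], knowledge_base: Dict[str, str]) -> str:
    """Module 3: look up a response in a (toy) expert knowledge base."""
    answer = knowledge_base.get(interpretation["intent"], "Sorry, I did not understand.")
    if interpretation["mood"] == "sad":
        answer = "I'm here for you. " + answer   # adapt the reply to the user's emotion
    return answer

if __name__ == "__main__":
    kb = {"what's the weather": "It looks sunny today."}
    user_input = Percept(speech_text="what's the weather", emotion="sad")  # Module 1
    print(feedback(fuse(user_input), kb))
```

The point of the sketch is the data flow, input channels fused into a single interpretation that drives knowledge-based feedback, not any particular recognition technique.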

Looking at the future trend of natural human-computer interaction, a highly convenient multimodal spoken-dialogue mode is the most natural and ideal form of interaction. For example, using a multimodal natural human-computer interaction system built by the Institute of Automation of the Chinese Academy of Sciences, users can converse freely with two digital virtual humans on different topics, including coffee, weather, popular science, and entertainment. This new multi-channel, integrated human-machine dialogue mode is an effective exploration of the next generation of human-machine interfaces.

At present, although multimodal natural human-computer dialogue has made great progress, environmental noise, users' dialects and accents, and changes in illumination still cause errors in the computer's acquisition and understanding of audio-visual information. In addition, computers still have difficulty understanding natural language, which can make their answers irrelevant when interacting with people. Such phenomena seriously degrade the experience of human-computer interaction. The fundamental reason is that, although the computing power of computers keeps growing, it remains very difficult for them to reason, associate, and imagine the way people do. Making computers intelligent, or giving them the ability to think, is therefore the key to whether they can interact freely with people.

Will computers develop consciousness?

With the recent success of deep neural network learning algorithms in academia and industry, many people are willing to believe that computers will evolve the same consciousness and thinking ability as people. In 2009, the European Blue Brain Project (at the Swiss Federal Institute of Technology in Lausanne) used microcircuit chips to simulate a cerebral cortex of 360,000 cells. When these microcircuit chip groups were shown pictures or played sounds of nature, they spontaneously generated gamma waves. The gamma wave is a kind of fast electrical oscillation in the brain that may be involved in learning, memory, and attention. This brain wave, cycling 25 to 80 times per second, cannot itself be regarded as consciousness, but it is an important feature for verifying the existence of consciousness-like brain activity.
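To make the 25-80 Hz figure concrete, the following short Python sketch band-pass filters a signal to the gamma range and estimates how much of its power falls in that band. It assumes NumPy and SciPy, uses a synthetic signal in place of real recordings, and is only an illustration of what "detecting gamma waves" means computationally, not the Blue Brain Project's actual method.

```python
import numpy as np
from scipy import signal

fs = 1000  # sampling rate in Hz (assumed for this synthetic example)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic "EEG": a 40 Hz gamma component buried in slower activity and noise.
eeg = (0.5 * np.sin(2 * np.pi * 10 * t)      # alpha-range background
       + 0.2 * np.sin(2 * np.pi * 40 * t)    # gamma component
       + 0.1 * np.random.randn(t.size))      # measurement noise

# Band-pass filter to the gamma band (25-80 Hz) mentioned in the text.
sos = signal.butter(4, [25, 80], btype="bandpass", fs=fs, output="sos")
gamma = signal.sosfiltfilt(sos, eeg)

# A simple indicator: power in the gamma band relative to the whole signal.
gamma_ratio = np.mean(gamma ** 2) / np.mean(eeg ** 2)
print(f"Gamma-band power ratio: {gamma_ratio:.2f}")
```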

The Blue Brain Project has indeed convinced many scientists and engineers that computers will eventually develop consciousness, and some research institutions (and even governments) have launched corresponding brain-inspired programs.

Whether or not computers develop consciousness, either outcome opens up many possibilities for the development of human-computer interaction.

If computers generate consciousness and, through evolution, develop their own language, vision, emotion, and other physiological systems, it would show that the thinking ability humans are most proud of can be designed and evolved according to certain patterns. And if robots were created by humans, were humans themselves also designed and evolved?

If, on the same planet, the evolutionary capability of robots always exceeds that of humans, then humans could make major breakthroughs in fields such as deep-sea exploration and deep-space migration with the help of fully intelligent computers. Future human-computer interaction research would then focus mainly on how to use artificial intelligence to explore outward into space, and how to use miniature intelligent robots to enter the human body and serve human health from within. Of course, because intelligent robots would have their own emotional systems, human-computer interaction technology might also evolve into a means of cultural exchange; for example, it might become the basis for deciding whether people can marry robots.

If the evolution of artificial intelligence surpasses that of human beings, then once robots develop an awareness of life and death and begin competing for resources, a common plot in science-fiction movies could become reality: machines defeat humans, and humans become the tools of machines. In that case, human-computer interaction technology would be an important tool for studying the means and weapons of human resistance to robots.

If computers cannot generate consciousness, it suggests that the law governing human-like intelligence has not yet been found; perhaps, beyond external conditions, human intelligence itself involves an element of randomness. Human-computer interaction technology developed along this direction would then be an important tool for humans to use computers in industrial control, computing services, medical diagnosis, outer-space exploration, and so on. Even so, humans would still have a long way to go in exploring artificial intelligence.

Where will human-computer interaction go? Whatever form it takes in the future, I will appreciate it. (Yang Minghao, associate researcher, State Key Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences)
