Psychophysiological Measurement. This approach requires that adaptive systems respond to variations in operator workload (Hancock & Lowenthal, 1985; Riley, 1985; Parasuraman, 1996). An example of such an approach is that by Pope, Bogart, and Bartolome (1995) and by Prinzel, Freeman, Scerbo, Mikulka, and Pope (2000), who used a closed-loop system to dynamically regulate the level of engagement that the subject had with a tracking task. The system indexes engagement on the basis of EEG brainwave patterns.
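The closed-loop principle behind such a biocybernetic system can be made concrete in code. The sketch below is a minimal illustration under stated assumptions, not the actual implementation: the engagement index beta / (alpha + theta) follows the definition published by Pope et al. (1995), but the band-power inputs, the 40-sample window, the slope-based switching rule, and all names are simplifying assumptions.

```python
import numpy as np

def engagement_index(beta, alpha, theta):
    """EEG engagement index beta / (alpha + theta), as defined by Pope et al. (1995)."""
    return beta / (alpha + theta)

class BiocyberneticLoop:
    """Toy negative-feedback allocator: a downward trend in engagement returns
    the tracking task to manual control to re-engage the operator; an upward
    trend hands the task back to automation."""

    def __init__(self, window=40):
        self.window = window   # number of index samples per allocation decision
        self.samples = []
        self.mode = "manual"

    def update(self, beta, alpha, theta):
        self.samples.append(engagement_index(beta, alpha, theta))
        self.samples = self.samples[-self.window:]
        if len(self.samples) == self.window:
            # least-squares slope of the engagement index across the window
            slope = np.polyfit(np.arange(self.window), self.samples, 1)[0]
            self.mode = "manual" if slope < 0 else "automatic"
        return self.mode
```

Under negative feedback the loop is self-stabilizing: declining engagement triggers manual control, which drives engagement back up and in turn permits re-automation; Pope et al. contrasted this with a positive-feedback configuration to validate candidate EEG indices.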
Human Performance Modeling. Another approach would be to model the performance of the operator. This approach would allow the system to develop a number of standards for operator performance that are derived from models of the operator. An example is Card, Moran, and Newell's (1983) discussion of a "model human processor"; they described aspects of the human processor that could be used to model various levels of human performance. Another example is the work of Geddes (1985) and his colleagues (Rouse, Geddes, & Curry, 1987-1988).

Operator Performance Measurement. A related approach is to measure operator performance directly (Hancock & Chignell, 1987). Some criteria for performance would be specified in the system parameters, and to the degree that the operator deviates from those criteria (i.e., commits errors), the system would invoke levels of adaptive automation. For example, Kaber, Prinzel, Clamann, and Wright (2002) used operator performance and workload measures to drive adaptive automation support of air traffic control tasks; secondary-task measures (e.g., Sternberg's list task), in which an attempt is made to determine the operator's residual attentional capacity, have been used for the same purpose.

Mission Analysis. A final strategy would be to monitor the activities of the mission or task itself (Morrison & Gluckman, 1994). Although this method of adaptive automation may be the most accessible at the current state of technology, Bahri et al. (1992) stated that such monitoring systems lack sophistication and are not well integrated and coupled so as to monitor operator workload or performance (Scerbo, 1996). An example of a mission analysis approach to adaptive automation is that of Barnes and Grossman (1985), who developed a system that uses critical events to allocate among automation modes. In this system, the detection of critical events, such as emergency situations or high-workload periods, invoked automation. (The sketch at the end of this section illustrates both this critical-event logic and the performance-criterion logic described above.)

Adaptive Automation Human Factors Issues

A number of issues, however, have been raised by the use of adaptive automation, and many of these issues are the same as those raised almost 20 years ago by Curry and Wiener (1980). These issues are therefore applicable not only to advanced automation concepts, such as adaptive automation, but also to traditional forms of automation already in place in complex systems (e.g., airplanes, trains, process control). Although one can certainly argue that adaptive automation is "dressed-up" automation and therefore inherits many of the same problems, the trend toward such forms of automation also carries unique issues of its own (Billings & Woods, 1994). Many of these problems are unwanted consequences of a pace of technological change that outstrips today's ability to comprehend all the implications for crew performance. As Curry and Wiener put it: "It is unrealistic to call for a halt to cockpit automation until the manifestations are completely understood. We do, however, call for those designing, analyzing, and installing automatic systems in the cockpit to do so carefully; to recognize the behavioral effects of automation; to avail themselves of present and future guidelines; and to be watchful for symptoms that might appear in training and operational settings." In a similar vein, Hancock's (1997) examination of the teleology for technology suggests that automation will continue to impact our lives, requiring humans to co-evolve with the technology.

Apparent Simplicity, Real Complexity. Woods (1996) stated that conventional wisdom about automation makes technology change seem simple. Automation can be seen as simply exchanging the human agent for a machine agent; it further promises more options and methods, frees up operator time for other things, provides new computer graphics and interfaces, and reduces human error. However, the reality is that technology change has often resulted in the design of automated systems that are strong, silent, clumsy, and difficult to direct. Woods (1996) stated that these types of systems "are not team players." The literature has described these characteristics as follows:

- Strong automation refers to automation that acts autonomously and possesses authority. A number of researchers (Billings, 1997; Wiener, 1989) have noted that increased autonomy and authority create new monitoring and coordination demands for the operators of a system (Woods, 1996; Sarter & Scerbo, 1994; Schreckenghost, 1992).
- Silent automation provides poor feedback about its activities; it lacks transparency (Malin et al., 1991).
- Clumsiness was coined by Wiener (1989) to refer to automation that reduces workload demands when demands are already low (e.g., the transit phase of flight) but increases them when attention and resources are needed elsewhere (e.g., the descent phase of flight). An example is a copilot having to reprogram the FMS to change the plane's arrival during an already busy descent.
- Data overload points to the increase in information in modern automated contexts (Billings, 1997).

These characteristics of automation have relevance for defining the scope of human factors issues likely to plague adaptive automation design if significant attention is not directed toward ensuring human-centered design. The human factors research community has noted that these characteristics can lead to issues of allocation of function (i.e., when and how functions should be allocated adaptively); manual skill decay and the "out-of-the-loop" performance problem; reduced flexibility of the system to respond to novel events; effects of adaptive automation on mental models, situation models, and representational models; and issues related to the design of the automation itself.

This last issue points to a significant concern in the human factors community: how to design adaptive automation so that it reflects what has been called "team-centered" design; that is, the design should allow for coordination between machine agents and human practitioners. However, many researchers have noted that automated systems tend to fail as team players (Billings, 1991; Sarter & Woods, 1996). The reason, as Woods (1996) put it, is "apparent simplicity, real complexity."
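Finally, here is the sketch promised in the Mission Analysis paragraph above. It is a minimal illustration of the rule-based core that the performance-measurement and critical-event invocation strategies share; the event names, the 0.15 error criterion, the ten-sample window, and the three automation levels are hypothetical choices, not parameters drawn from Barnes and Grossman (1985) or Kaber et al. (2002).

```python
from dataclasses import dataclass, field

# Illustrative event set and automation modes (assumed, not from the literature).
CRITICAL_EVENTS = {"emergency", "high_workload_period"}
AUTOMATION_LEVELS = ["manual", "decision_support", "full_automation"]

@dataclass
class AdaptiveAllocator:
    error_criterion: float = 0.15   # tolerated deviation from performance criteria
    level: int = 0                  # index into AUTOMATION_LEVELS
    errors: list = field(default_factory=list)

    def record_performance(self, error_rate):
        """Performance-measurement strategy: deviation from the criterion
        raises the automation level; sustained recovery lowers it."""
        self.errors = (self.errors + [error_rate])[-10:]
        mean_error = sum(self.errors) / len(self.errors)
        if mean_error > self.error_criterion:
            self.level = min(self.level + 1, len(AUTOMATION_LEVELS) - 1)
        elif mean_error < self.error_criterion / 2:
            self.level = max(self.level - 1, 0)
        return AUTOMATION_LEVELS[self.level]

    def on_mission_event(self, event):
        """Mission-analysis strategy: a detected critical event invokes full
        automation, echoing Barnes and Grossman's critical-event allocation."""
        if event in CRITICAL_EVENTS:
            self.level = len(AUTOMATION_LEVELS) - 1
        return AUTOMATION_LEVELS[self.level]

if __name__ == "__main__":
    allocator = AdaptiveAllocator()
    print(allocator.record_performance(0.25))       # above criterion -> decision_support
    print(allocator.on_mission_event("emergency"))  # critical event -> full_automation
```

A production system would, as Bahri et al. (1992) implied, need these two triggers integrated and coupled with operator workload measurement rather than operating as independent rules.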