References
- R. G. Hanumansetty, Model based approach for context aware and adaptive user interface generation, Ph.D. Dissertation, Computer Science, Virginia Tech, Blacksburg, VA, USA, 2004.
- M. A. Goodrich and A. C. Schultz, Human-robot interaction: A survey, Found. Trends Human-Computer Interact. 1 (2007), no. 3, 203-275.
- B. Larochelle et al., Establishing human situation awareness using a multi-modal operator control unit in an urban search and rescue human-robot team, Proc. IEEE Int. Work. Robot Hum. Interact. Commun., Atlanta, GA, USA, July 31-Aug. 3, 2011, pp. 229-234.
- J. M. Riley et al., Situation awareness in human-robot interaction: Challenges and user interface requirements, Human-Robot Interact. Futur. Mil. Oper., CRC Press, Burlington, VT, USA, 2010, pp. 171-192.
- M. Hou et al., Advances and challenges in intelligent adaptive interface design, Human-Machine Syst. Des., Wiley, Hoboken, NJ, USA, 2015, pp. 369-424.
- M. Hou et al., Optimizing operator-agent interaction in intelligent adaptive interface design: A conceptual framework, IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 41 (2011), no. 2, 161-178.
- F. Fortmann and A. Ludtke, An intelligent SA-adaptive interface to aid supervisory control of a UAV swarm, IEEE Int. Conf. Ind. Informat., Bochum, Germany, July 29-31, 2013, pp. 768-773.
- D. Donath, A. Rauschert, and A. Schulte, Cognitive assistant system concept for multi-UAV guidance using human operator behaviour models, HUMOUS'10, Toulouse, France, Apr. 26-27, 2010.
- G. R. Arrabito et al., Human factors issues for controlling uninhabited aerial vehicles: Preliminary findings in support of the Canadian Forces Joint Unmanned Aerial Vehicle Surveillance Target Acquisition System project, Technical Report 2009-043, Defence R&D Canada, Toronto, Canada, 2010, available at http://pubs.drdc.gc.ca.
- P. A. Akiki, A. K. Bandara, and Y. Yu, Adaptive model-driven user interface development systems, ACM Comput. Surv. 47 (2014), no. 1, 9:1-9:33.
- S. Rowe and C. R. Wagner, An introduction to the joint architecture for unmanned systems (JAUS), Cybernet Systems Corporation, Ann Arbor, MI, USA, 2008.
- M. Ilbeygi and H. Shah-Hosseini, A novel fuzzy facial expression recognition system based on facial feature extraction from color face images, Eng. Appl. Artif. Intell. 25 (2012), no. 1, 130-146. https://doi.org/10.1016/j.engappai.2011.07.004
- J. L. Franke et al., Holistic contingency management for autonomous unmanned systems, Proc. AUVSI's Unmanned Syst. North Am., 2006.
- Q. Limbourg, USIXML: A user interface description language supporting multiple levels of independence, Eng. Adv. Web Applicat.: Proc. Workshops in connection with Int. Conf. Web Eng. (ICWE 2004), Munich, Germany, July 28-30, 2004, pp. 325-338.
- J. Guerrero-Garcia et al., A theoretical survey of user interface description languages: Preliminary results, Latin American Web Congr. (Joint LA-WEB/CLIHC Conf.), Merida, Mexico, Nov. 9-11, 2009, pp. 36-43.
- R. P. Guidorizzi, Security: Active authentication, IT Prof. 15 (2013), no. 4, 4-7.
- M. Abramson, Cognitive fingerprints, AAAI Spring Symp. Series, Palo Alto, CA, USA, Mar. 23-25, 2015.
- K. Jensen and G. Rozenberg, High-level Petri nets: Theory and application, Springer Science & Business Media, Berlin, Heidelberg, 2012.
- K. Jensen, L. M. Kristensen, and L. Wells, Coloured Petri nets and CPN tools for modelling and validation of concurrent systems, Int. J. Softw. Tools Technol. Transf. 9 (2007), no. 3-4, 213-254. https://doi.org/10.1007/s10009-007-0038-x
- K. Jensen and L. M. Kristensen, Colored Petri nets: A graphical language for formal modeling and validation of concurrent systems, Commun. ACM 58 (2015), no. 6, 61-70. https://doi.org/10.1145/2663340
- G. D. A. Brown, I. Neath, and N. Chater, A temporal ratio model of memory, Psychol. Rev. 114 (2007), no. 3, 539-576. https://doi.org/10.1037/0033-295X.114.3.539
- M. R. Endsley, Measurement of situation awareness in dynamic systems, Hum. Factors 37 (1995), no. 1, 65-84. https://doi.org/10.1518/001872095779049499
- S. G. Hart, NASA-task load index (NASA-TLX); 20 years later, Proc. Hum. Factors Ergon. Soc. Annu. Meeting 50 (2006), 904-908.
- M. Hou and R. D. Kobierski, Intelligent adaptive interfaces: Summary report on design, development, and evaluation of intelligent adaptive interfaces for the control of multiple UAVs from an airborne platform, Technical Report DRDC-TORONTO-TR-2006-292, Defence R&D Canada, Toronto, Canada, 2006.
- T. Chen, Management of multiple heterogeneous unmanned aerial vehicles through transparency capability, Ph.D. Dissertation, Queensland University of Technology, Australia, 2016.
- J. J. Roldan et al., Multi-robot interfaces and operator situational awareness: Study of the impact of immersion and prediction, Sensors 17 (2017), no. 8, 1720:1-1720:25.
- C. Fuchs et al., Adaptive consoles for supervisory control of multiple unmanned aerial vehicles, Int. Conf. Human-Computer Interact., vol. 8007, Springer, Berlin, Heidelberg, 2013, pp. 678-687.
- A. Rauschert and A. Schulte, Cognitive and cooperative assistant system for aerial manned-unmanned teaming missions, NATO Res. Technol. Agency, Hum. Factors Med. Panel, Task Gr. HFM-170: Superv. Control of Mult. Uninhabited Syst.: Methodol. and Enabling Oper. Interface Technol., RTO-TR-HFM-170 (2012), 1-16.