
Transactions of the Institute of Systems, Control and Information Engineers Vol. 14 (2001), No. 9

ONLINE ISSN: 2185-811X
PRINT ISSN: 1342-5668
Publisher: THE INSTITUTE OF SYSTEMS, CONTROL AND INFORMATION ENGINEERS (ISCIE)



Integrated Person Identification and Emotion Recognition from Facial Images

Dadet PRAMADIHANTO, Yoshio IWAI, Masahiko YACHIDA

pp. 421-429

Abstract

Deformation of the face caused by changes in expression varies among individuals. In this paper, we propose an integration of face identification and facial expression recognition to overcome these individual differences in expression change. A face is modeled as a graph whose nodes represent facial feature points. This model is used for automatic detection of the face and its feature points, and for tracking the feature points by flexible feature matching. Face identification is performed by comparing the graph representing the input image with the individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and the change in expression using B-spline curves. Individual and average expression models are generated and then used to identify facial expressions and the degree of expression change. The expression model used for facial expression recognition is selected according to the result of face identification.
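
The identification step described above can be illustrated with a minimal sketch: faces are stored as graphs of labeled feature points, and an input graph is assigned to the person whose model it displaces least. This is not the authors' implementation; the node names, 2-D coordinates, and the simple sum-of-distances matching score are assumptions for illustration (the paper uses flexible feature matching).

```python
# Minimal sketch of graph-based face identification (illustrative assumptions only).
import numpy as np

def graph_distance(input_nodes, model_nodes):
    """Sum of Euclidean distances between corresponding feature points."""
    return sum(np.linalg.norm(np.asarray(input_nodes[k]) - np.asarray(model_nodes[k]))
               for k in model_nodes)

def identify_face(input_nodes, person_models):
    """Return the person whose face-graph model best matches the input graph."""
    return min(person_models, key=lambda p: graph_distance(input_nodes, person_models[p]))

# Hypothetical usage with 2-D feature-point coordinates
models = {
    "alice": {"left_eye": (30, 40), "right_eye": (70, 40), "mouth": (50, 80)},
    "bob":   {"left_eye": (28, 42), "right_eye": (72, 42), "mouth": (50, 85)},
}
observed = {"left_eye": (29, 41), "right_eye": (71, 41), "mouth": (50, 81)}
print(identify_face(observed, models))   # -> "alice"
```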


Effect of Input Device on Search Time of Tree Structured Menu

Asako KIMURA, Shusaku KURODA, Hirokazu KATO, Seiji INOKUCHI

pp. 430-438

Abstract

To design the user interface of a menu selection system based on human kansei and intention, two design elements, (1) the menu tree structure and (2) the input device, must be chosen appropriately for presenting and selecting the given items. This paper investigates and analyzes the effect of both (1) and (2) on users' search processes in a hierarchical menu selection system. The results of experiments in which the input device and the hierarchical menu structure were varied, together with models of user behavior, show that both (1) and (2) affect search time and usability.
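
As a rough illustration of how menu structure and input device can jointly determine search time, the toy model below combines the number of levels traversed with a per-level decision cost that grows with the number of items displayed. This is not the paper's model; the constants, the logarithmic decision cost, and the fixed per-selection device cost are assumptions.

```python
# Toy search-time model for a tree-structured menu (illustrative assumptions only).
import math

def search_time(depth, breadth, t_decide=0.3, t_select=0.5):
    """Estimated time (s) to reach a leaf: one decision plus one selection per level."""
    per_level = t_decide * math.log2(breadth + 1) + t_select
    return depth * per_level

# A broad-and-shallow layout vs. a narrow-and-deep one for 64 leaf items
print(search_time(depth=2, breadth=8))   # 8 x 8 tree
print(search_time(depth=6, breadth=2))   # binary tree
```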


A State Graph Model of Negotiation Considering Subjective Factors

Masahide YUASA, Yoshiaki YASUMURA, Katsumi NITTA

pp. 439-446

Abstract

In this paper, for decision support in negotiation over a computer network, we propose a state graph model that takes subjective factors into account. The model assumes that participants evaluate proposals by considering not only the utility value but also the opponent's attitude. To verify the model, we examine negotiations in which participants exchange proposals accompanied by facial expressions. The experimental results show that facial expressions affect the opponent's behavior. In addition, we measured the participants' perspiration during negotiation to identify their mental state. The results indicate that the state graph model of negotiation is valid.
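
A minimal sketch of the evaluation idea in the abstract, with all names and weights assumed for illustration: a negotiation state carries both the utility of the current proposal and the perceived attitude of the opponent (for example, read from a facial expression), and acceptance depends on a weighted combination of the two rather than on utility alone.

```python
# Sketch of proposal evaluation with subjective factors (illustrative assumptions only).
from dataclasses import dataclass

@dataclass
class NegotiationState:
    utility: float    # utility of the proposal for this participant (0..1)
    attitude: float   # perceived opponent attitude, e.g. from facial expression (-1..1)

def subjective_score(state, w_utility=0.7, w_attitude=0.3):
    """Evaluate a proposal from both its utility and the opponent's attitude."""
    return w_utility * state.utility + w_attitude * state.attitude

def accept(state, threshold=0.6):
    return subjective_score(state) >= threshold

print(accept(NegotiationState(utility=0.8, attitude=-0.5)))  # a hostile expression can block acceptance
print(accept(NegotiationState(utility=0.6, attitude=0.9)))   # a friendly expression can help
```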


Automatic Motion Generation Corresponding to Music Based on 3D Spring Model

Qi WANG, Naoki SAIWAKI, Shogo NISHIDA

pp. 447-457

Abstract

This paper proposes an approach to automatic motion generation for a 3D spring model driven by music. Music factors are extracted from a MIDI (Musical Instrument Digital Interface) file. Instead of designing and calculating the animation of an object directly, a sequence of motion controllers, each of which carries out a set of motions affected by the music factors, is generated automatically by a synthesis algorithm. A prototype system was developed to evaluate the motion generator, and it was confirmed that users are satisfied with the motion generated from the music.
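
The mapping from music factors to motion can be sketched as follows, under assumptions that are not taken from the paper: note-on events (given here as plain time/velocity pairs rather than parsed from a real MIDI file) excite a damped spring, so louder notes produce larger displacements. The spring constants and the one-dimensional simplification are illustrative only.

```python
# Sketch of music-driven spring motion (illustrative assumptions only).
import numpy as np

def spring_motion(events, duration=4.0, dt=0.01, k=40.0, c=2.0, m=1.0):
    """Simulate x(t) of a 1-D damped spring kicked at note-on times."""
    n = int(duration / dt)
    x, v = 0.0, 0.0
    trajectory = np.zeros(n)
    impulses = {round(t / dt): vel / 127.0 for t, vel in events}
    for i in range(n):
        v += impulses.get(i, 0.0)            # note-on adds velocity proportional to loudness
        a = (-k * x - c * v) / m             # damped spring dynamics
        v += a * dt
        x += v * dt
        trajectory[i] = x
    return trajectory

# (time in seconds, MIDI velocity) pairs standing in for parsed note-on events
notes = [(0.5, 100), (1.0, 64), (2.0, 127)]
print(spring_motion(notes).max())
```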


Development of a Physiological Measuring System for the Evaluation of Drowsiness during Driving Vehicle

Yasuyuki NISHIO, Koichi KOJIMA

pp. 458-465

Abstract

Early detection of drowsiness at the wheel is important for preventing traffic accidents. This study analyzes the physiological responses of drivers. EEG, electrodermal activity, and blinking were measured both in experiments using driving simulators and in driving tests on a circuit. Analysis of the pattern changes in skin impedance and blinking was found to be useful for evaluating the human arousal level. A system for real-time evaluation based on skin impedance changes was also developed. The system consists of a skin impedance monitor and a personal computer that can be installed in a car. It displays the observed arousal level every 15 seconds and is a useful tool for evaluating a driver's current arousal level.
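
As a hedged sketch of the real-time evaluation idea (not the authors' algorithm), the snippet below classifies each 15-second window of skin impedance by counting spontaneous fluctuations, treating a quiet trace as a sign of lowered arousal. The sampling rate, the change threshold, and the class boundaries are assumptions.

```python
# Sketch of window-based arousal evaluation from skin impedance (illustrative assumptions only).
import numpy as np

def arousal_level(impedance_window, change_threshold=0.5):
    """Classify one 15-s window of skin impedance (kOhm) into a coarse arousal level."""
    fluctuations = np.sum(np.abs(np.diff(impedance_window)) > change_threshold)
    if fluctuations >= 10:
        return "alert"
    elif fluctuations >= 3:
        return "relaxed"
    return "drowsy"

# Simulated 15-s window sampled at 10 Hz
rng = np.random.default_rng(0)
window = 50 + np.cumsum(rng.normal(0, 0.3, 150))
print(arousal_level(window))
```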
