
        Empathizing with Emotional Robot Based on Cognition Reappraisal

        China Communications, 2017, Issue 9

        Xin Liu, Lun Xie*, Zhiliang Wang

        1 School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China

        * Corresponding author, email: xielun@ustb.edu.cn

        I. INTRODUCTION

        Nowadays, intelligent robots are being designed for natural human-robot communication and used in early education, emotion therapy for autism, mental health services and similar fields. A robot's intelligence (cognition, emotion, personality and so on) must not only meet the interacting person's behavioral requirements but also satisfy their psychological needs, and the trend toward emotionally intelligent robots is irreversible[1]. Robots will evolve into intelligent agents with anthropomorphic and diversified emotions, and will even begin to communicate with empathy[2].

        Several valued and far-reaching approaches to emotional modeling have been proposed in the robot emotional intelligence research field. Based on facial expressions, Ekman proposed six prototypical emotions organized in a discrete model[3], and this approach was followed by several authors[4-8]. Velásquez also put forward emotion-based control for an autonomous robot; in his research, six prototypical emotions (anger, fear, sorrow, happiness, disgust, and surprise) were implemented with innate personality and acquired learning[9]. Unlike discrete models, emotional space models treat the operating range of emotions as a continuous multidimensional space where each point represents an emotion and each dimension represents a fundamental property common to all emotions. To capture these emotional properties, Wundt used the classical 3D emotional space (pleasant/unpleasant, excitement/depression, tension/relaxation), and a large number of dimensional emotion theories have been proposed over the years[10-14]. One of the most widely accepted is the Pleasure-Arousal-Dominance (PAD) space[15]. Hollinger, Becker-Asano et al. also used and developed the PAD space to determine artificial emotions for social robots[16,17]. Miwa constructed a 3D psychological vector space, Arousal-Pleasant-Certain, with machine learning, dynamic regulation and personality[18]. In addition, Breazeal's Arousal-Valence-Stance (AVS) space for the social robot Kismet was a noteworthy emotional space model[19].

        Based on a robot mechanical platform with 13 degrees of freedom, this paper adopts a multi-joint combined drive to enhance the collaborative capability of the motion joints, and posture of arbitrary direction and precision is achieved by two-way PWM control.

        According to Hoffman, empathy is “an affective response more appropriate to another’s situation than to one’s own”[20]. Empathy depends on many factors, such as social context, culture and behavioral characteristics (e.g. facial expression, gesture, voice pitch, etc.) in the interaction[21]. Engen thus argued that “empathy is an important contributor to natural social interaction, allowing us to predict and understand others’ behavior and react accordingly”[22]. D’Ambrosio pointed out that empathy has two components: cognitive empathy and affective empathy[23]. Adam Smith wrote in The Theory of Moral Sentiments that empathy is composed of the understanding of others and the corresponding emotional responses[24]. Emotional communication can therefore make social robots more attractive and evoke empathy in people through their imagination. An increasing number of scholars now hold that empathy is a complex fusion of cognition and emotion, requiring that emotions be understood and shared with each other[25]. Andreea Niculescu et al. created two robot characters, Olivia and Cynthia, to evaluate the importance of empathy arising from humor expression in human-robot interaction[26]. Vanessa Evers et al. discussed the effects of an environmental monitoring robot's empathy based on touch behaviors[27]. Angelica Lim et al. presented a model which suggests that empathy is an emergent behavior and implemented this model in a multimodal emotionally intelligent robot[28]. Luisa Damiano et al. dealt with contemporary emotional and empathic robots (e.g. cognitive robots, affective developmental robots, epigenetic robots, assistive robots, social robots and rehabilitation robots) to support the development of human-robot ecologies[29]. These related works suggest that empathy can help a robot recognize the interacting person's emotions and respond appropriately. The capacity for personalized cognitive analysis and emotional regulation is an essential part of human-robot interaction.

        This paper proposes a cognitive emotional regulation method in a 3D active field state space to address cognition deficits and emotion deficiency in human-robot emotional interaction. First, the Gross cognitive strategy[30] is introduced to the domain of affective computing, yielding efficient computational models of guiding cognitive reappraisal. Second, the paper presents the key methods: dynamic settings considering the robot's cognition, current emotion and external stimulus, aimed at describing the transition probability among emotions, are constructed on a Hidden Markov Model (HMM) to implement personalized emotional regulation in the emotional space, and the HMM then yields an observational emotional behavior sequence for robot expression. Moreover, the robot, with its 13 degrees of freedom, produces these emotional behaviors (facial expressions, upper limbs and chassis) under the response suppression strategy of Gross emotional regulation. Finally, the emotion regulation model is assessed in a human-robot experiment.

        The rest of this paper is organized as follows: Section 2 discusses the mechanical structure of the emotional robot. Section 3 presents guiding cognitive reappraisal and then defines the emotional space and the related state transition process, namely the cognitive emotional regulation model. Section 4 presents the experimental design, results and discussion. The conclusion and directions for future work are offered in Section 5.

        II. EMOTIONAL ROBOT’S MECHANICAL DESIGN

        The emotional robot's mechanical structure matches the proportions of an adult's torso and limbs[31]. As shown in figure 1, it has 13 degrees of freedom and 100 kinds of facial expressions. The robot can therefore use a large number of behaviors and facial emotional expressions to communicate with the interacting person in real time.

        The robot's face is a 7-inch Liquid Crystal Display (LCD) for showing expressions. The upper body is composed of 10 motors and many connecting pieces. The robot adopts a multi-joint combined drive to enhance the collaborative capability among motion joints, and posture of arbitrary direction and precision is achieved by a two-way PWM control technique under the ModBus communication protocol. The emotional robot's chassis is designed with three omni-wheels in a completely symmetric layout, so the friction about the chassis's geometric center is nearly equal and the robot can move in a straight line in any direction, simplifying turning paths and improving flexibility and controllability[32]. The robot's design supports obstacle avoidance and path planning.
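The symmetric three-omni-wheel drive described above has standard inverse kinematics: each wheel's speed is the projection of the desired chassis velocity onto its rolling direction plus a rotation term. A minimal sketch follows; the wheel mounting angles (0°, 120°, 240°), wheel radius and chassis radius are illustrative assumptions, not dimensions taken from the paper.

```python
import math

def omni3_wheel_speeds(vx, vy, omega, wheel_radius=0.05, chassis_radius=0.15):
    """Inverse kinematics for a symmetric 3-omni-wheel chassis.

    vx, vy: desired chassis velocity (m/s); omega: yaw rate (rad/s).
    Returns the angular speed (rad/s) commanded to each wheel.
    Mounting angles and dimensions are illustrative assumptions.
    """
    angles = [0.0, 2.0 * math.pi / 3.0, 4.0 * math.pi / 3.0]
    speeds = []
    for a in angles:
        # Rolling-direction component of chassis motion plus the rotation term
        v_wheel = -math.sin(a) * vx + math.cos(a) * vy + chassis_radius * omega
        speeds.append(v_wheel / wheel_radius)
    return speeds
```

Because the layout is symmetric, a pure translation command makes the three wheel speeds sum to zero, and a pure rotation drives all wheels at the same speed.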

        III. COGNITIVE EMOTIONAL REGULATION

        3.1 Spatial Description of Emotion

        Unlike traditional finite-state approaches, the emotional regulation process is a continuum, which gives the robot vitality and human-like behavior. The robot is free to change its emotion within its emotion space S. An emotion is considered a spatial location at time t in the active field state space. The active field state space is a continuous emotional space in which each emotion is treated as an energy source that produces a field. In particular, the robot's emotion and the detected emotions of the interacting person can both be represented in this field space and drive the emotion shift in the robot. Psychodynamics, as proposed by Freud[33], holds that emotion is driven by internal and external forces and that each state has a corresponding energy. The energy is determined by a particle's potential energy in the active field state space. In our human-robot interaction, the interacting person's expression is acquired by a camera on the robot; we call it the input expression. Based on Ekman's emotion theory, the input expression is mapped into the emotional space by facial expression recognition and corresponds to one of seven emotional categories (anger, disgust, fear, happiness, sadness, surprise, calm). This emotion is called the stimulus emotion.

        3.2 Guiding Cognitive Reappraisal

        Fig. 1 Emotional robot’s mechanical design

        Gross proposed five emotional regulation strategies: situation selection, situation modification, attention deployment, cognitive reappraisal and response suppression[30]. Cognitive reappraisal, the more antecedent-focused strategy in the early emotional regulation stage, is composed of guiding and spontaneous evaluation. Guiding cognitive reappraisal always bears on the intensity of the guiding emotion and the current emotional stance. Energy is transferred through work done by a force, so emotional regulation is modeled as a process of work done by a force. Here, the guiding emotion, a positive force, comes from an intervener's guidance of the robot. The intervener is a third party who gives encouragement and comfort to the robot. When an individual is in trouble, or psychological expectation differs from reality, negative emotions such as sadness, anxiety, anger and pain are produced. In general, guiding cognitive reappraisal revises the stimulus emotion state and improves the negative emotional experience. It helps the robot keep a positive attitude in HRI.

        To implement guiding cognitive reappraisal, the interacting person gives the robot encouragement and comfort via language, behavior and expression. In this paper, the encouragement comes from the interacting person's emotional expression. When guiding cognitive reappraisal occurs, it changes the position of the stimulus emotion in the active field. As shown in figure 2, we suppose that the stimulus emotion after cognitive reappraisal appears on the straight line joining the stimulus emotion and the guiding emotion in the AVS space, and that the probability of a particular position obeys a Gaussian distribution. In the cognition process, the intensity of the guiding emotion α affects the scattering of the distribution, and the stance of the robot's current emotion s (stance) decides the center of the distribution. So the mathematical expectation and variance are:

        Fig. 2 Emotional space in the active field

        Here, R is the distance between the stimulus emotion and the guiding emotion, and e is a constant.
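The sampling step above can be sketched as follows. Since the paper's exact expectation and variance formulas are not reproduced in this text, the mappings from the guiding intensity α and the current stance to the Gaussian parameters below are illustrative assumptions, not the authors' equations.

```python
import random

def reappraise(stimulus, guiding, alpha, stance, sigma_scale=0.2):
    """Sample the post-reappraisal stimulus emotion on the segment joining
    the stimulus and guiding emotions in AVS space.

    stimulus, guiding: (arousal, valence, stance) coordinates.
    alpha in (0, 1]: intensity of the guiding emotion; here a stronger
    guiding emotion tightens the distribution (assumed mapping).
    stance in [0, 1]: sets the center of the distribution along the
    segment (assumed mapping).
    """
    mu = stance                          # distribution center from current stance
    sigma = sigma_scale * (1.0 - alpha)  # stronger guidance -> less scatter
    # Fraction along the segment, clipped to stay between the two emotions
    frac = min(1.0, max(0.0, random.gauss(mu, sigma)))
    return tuple(s + frac * (g - s) for s, g in zip(stimulus, guiding))
```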

        3.3 Emotional intensity attenuation

        In the AVS emotional space, the Valence axis expresses the emotional property that governs the attenuation of emotional intensity. In other words, emotional intensity attenuation is influenced by the coordinate value v on Valence. So the emotional attenuation coefficient is:

        Moreover, emotional intensity gradually weakens with time. The Third Law of Emotional Intensity, derived from the Weber-Fechner Law[34] by Qiu Dehui[35], describes the relationship between internal feeling intensity and external stimulus intensity. It shows that emotional intensity has a negative exponential relationship with duration, so the exponential law of emotional intensity is:

        Here, I is the emotional intensity over time, I0 is the initial emotional intensity, and T is the duration.

        According to formulas (3) and (4), emotional intensity attenuation follows the rule:
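Combining the valence-dependent coefficient with the exponential law gives a decay of the form I = I0·e^(−δT). A minimal sketch follows; the specific dependence of δ on valence below is an assumed form for illustration, since the paper's coefficient formula is not reproduced in this text.

```python
import math

def attenuated_intensity(i0, t, valence, k=0.5):
    """Exponential attenuation of emotional intensity over time.

    i0: initial intensity; t: elapsed duration; valence in [-1, 1].
    The coefficient delta = k * (1 + valence) is an assumed monotone
    mapping (more positive valence -> faster decay here), standing in
    for the paper's unreproduced formula.
    """
    delta = k * (1.0 + valence)      # valence-dependent decay rate (assumed)
    return i0 * math.exp(-delta * t)
```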

        3.4 Emotional interaction modeling based on HMM

        3.4.1 HMM in Emotional Interaction

        Emotion, a psychological experience, is acted out through behaviors. Emotional interaction can therefore be divided into two steps: first, personalized emotional regulation based on cognitive reappraisal, and second, emotional expression. This paper regards the emotional interaction process as a double stochastic process modeled by a Hidden Markov Model (HMM). In emotional regulation, the robot's next emotion is related only to its current emotion; this regulation corresponds to the Markov process in the HMM. In the Markov process the states are the robot's emotions, and in the second stochastic process the states are the robot's expressional behaviors. The interaction between the robot's current emotion and the external stimulus emotion forms the first Markov process, which yields the robot's next observational emotional behavior without response suppression. The second stochastic process generates the robot's expressional behaviors during the emotional regulation.

        ● Initialization

        The state probability at time t is the forward probability α_t(i), and the initial state distribution π = {π_i} is subject to a uniform distribution over the N emotions, π_i = 1/N. The initialization is α_1(i) = π_i · b_i(o_1).

        ● Induction

        The observable behavior at time t is o_t, and the forward probability at time t + 1 is α_{t+1}(j) = [Σ_{i=1..N} α_t(i) · a_{ij}] · b_j(o_{t+1}).

        ● Termination

        The probability of the robot's behavior sequence O in the HMM is P(O | λ) = Σ_{i=1..N} α_T(i).

        Here, the size of the problem is the order N of the emotional matrix, and the time complexity of the recursive algorithm is O(N²T).
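The initialization, induction and termination steps above are the standard HMM forward algorithm; they can be written compactly as follows, with the emotion-transition matrix A, the behavior-emission matrix B and the uniform initial distribution π.

```python
def forward_probability(A, B, pi, obs):
    """Standard HMM forward algorithm: P(observation sequence | model).

    A[i][j]: probability of moving from emotion i to emotion j.
    B[j][o]: probability that emotion j expresses behavior o.
    pi[i]:   initial emotion distribution (uniform in the paper).
    obs:     list of observed behavior indices.
    Runs in O(N^2 * T) time for N emotions and T observations.
    """
    n = len(pi)
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(o_{t+1})
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: P(O | model) = sum_i alpha_T(i)
    return sum(alpha)
```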

        3.4.2 Transition probability for emotional regulation

        Dynamic psychology shows that, like other physical dynamical systems, emotional drive requires energy. Field theory, proposed by Kurt Lewin, explains the relationship between psychological state and behavior, namely B = f(P, E). Here, B is behavior, P is person, E is environment, and f is a field function[36]. Based on Kismet's emotional space, the concept of a field is introduced into the emotional space to describe emotional spatiotemporal properties and measure energy change among emotions. This method explains the emotional occurrence and regulation process with cognition. In this emotional model, the interaction between the stimulus emotion and the robot's current emotion in the active field forms an emotional space, as shown in figure 2. Here, the size of a field source is determined by the activated intensity of the emotion, and the position of the field source by the emotional category[36].

        In the above emotional space, the emotional potential ε describes the field from an energy perspective, and its value is closely related to the current and stimulus emotions. The emotional potential at a point on the straight line between the current emotion and the cognitive stimulus emotion is computed as:

        A stimulus may change the individual's emotion. According to Fechner's Law, the felt emotion intensity changes logarithmically with the stimulus emotion intensity, so the intensity of the robot's current emotion changes logarithmically with the stimulus intensity that triggers it, namely:

        Here, the intensity of the stimulus emotion is based on the range of the facial expression from the external stimulus.
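The logarithmic response above can be sketched in one line. The scale constant k and the +1 offset (which keeps the felt intensity non-negative at zero stimulus) are illustrative assumptions; the paper's exact constants are not reproduced in this text.

```python
import math

def perceived_intensity(stimulus_intensity, k=1.0):
    """Fechner-style response: felt intensity grows logarithmically
    with the stimulus intensity that triggers it.

    k and the +1 offset are illustrative assumptions.
    """
    return k * math.log(1.0 + stimulus_intensity)
```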

        Because individual emotion is driven and produced by energy, the more energetic an emotion, the higher its degree of activation, and vice versa. In the active field state space, the next emotion is chosen by the potential generated by the cognitive stimulus and the robot's own emotions. The greater the emotional potential a point possesses, the higher the probability that it becomes the next emotion. An emotional activation threshold effectively solves the problems of emotional oversensitivity and overflow: when the emotional potential lies within a certain interval, the emotion may be activated; otherwise, the emotion has no activation probability.

        The two-point form of straight-line equation between the current and cognitive stimulus emotion is:

        So the parameter equation is:

        The transition probability from the current emotion i to the next emotion j is:

        In the AVS emotional space, a higher value of the activation threshold indicates that more varied emotions are possible, whereas the stimulus may trigger fewer emotions. Moreover, emotional similarity is proportional to distance in the emotional space, so a smaller value of L in the interval is more likely to cause a gentle emotional change, and a higher value of H is more likely to cause a jump.
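The threshold-and-normalize step described above can be sketched as follows. The paper's potential formula itself is not reproduced here; this only shows how candidate potentials on the segment are gated by the activation interval [L, H] and converted into transition probabilities.

```python
def transition_probabilities(potentials, low, high):
    """Convert emotional potentials of candidate next emotions into
    transition probabilities with an activation threshold.

    potentials: potential value at each candidate point on the segment
    between the current and cognitive stimulus emotions.
    low, high: activation interval; candidates outside it get probability 0.
    """
    # Gate: only potentials inside [low, high] can be activated
    activated = [p if low <= p <= high else 0.0 for p in potentials]
    total = sum(activated)
    if total == 0.0:
        return [0.0] * len(potentials)  # nothing crosses the threshold
    # Normalize: greater potential -> greater transition probability
    return [p / total for p in activated]
```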

        3.4.3 Robot emotion expression

        Emotion is constantly aroused and experienced by the individual, and many emotion researchers maintain that behavioral expression is the core component of emotional response, so behavioral expression is used to express the robot's current emotion in our HRI research. In Gross emotional regulation, the last strategy, response suppression, is a response-focused strategy in the late emotional regulation stage that can reduce subjective negative emotional behavior via self-control[30]. That is, response suppression is correlated with the controllability of negative emotion: it improves negative emotional behavior and enhances the positive emotional experience. Psychological research shows that the higher the emotional activity, the lower the controllability (i.e., the worse the response suppression effect). Because the Arousal axis in the AVS emotional space mainly expresses the activation degree of emotion, the arousal value a affects the robot's response suppression. Pre-processing the arousal a yields the suppression factor:

        The standard action range is the robot's behavior without response suppression; it is the robot's intended behavior. The actual action range after response suppression is:

        The action range is the moving scope of the robot's expression and behavior. We define the action range without response suppression as 1; with response suppression the moving scope can vary by ±100%, so the action range lies between 0 and 2.
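The scaling above can be sketched in a few lines. The paper's exact pre-processing of arousal into the suppression factor is not reproduced in this text, so the linear mapping below (higher arousal, weaker suppression) is an assumed monotone form; only the [0, 2] output interval comes from the text.

```python
def suppressed_action_range(standard_range, arousal):
    """Scale the robot's intended (standard) action range by a response
    suppression factor derived from arousal.

    standard_range: intended range, normalized to 1.0 without suppression.
    arousal in [-1, 1]: higher arousal -> weaker suppression (assumed
    mapping: factor = 1 + arousal, giving a factor in [0, 2]).
    The result is clipped to the documented [0, 2] interval.
    """
    factor = 1.0 + arousal
    return min(2.0, max(0.0, standard_range * factor))
```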

        IV. EXPERIMENT AND IMPLEMENTATION IN HRI

        4.1 Experiment and simulation

        The cognitive emotional model presented in this paper was applied to a real HRI scenario to verify the actual interaction effect. The emotional robot adopted a facial expression recognition method based on Gabor wavelets and nearest neighbor analysis to obtain the user's emotions in HRI and the guiding emotion from the intervener in cognitive reappraisal[32][37]. The result of emotion recognition is shown in figure 3(a).

        Fig. 3 (a) Real-time expression recognition results; (b) Human-robot emotion interaction with behaviors

        Fig. 4 Probability distribution of cognitive stimulus emotion

        According to section 3, the distribution of emotional transition probability is calculated from the robot's cognitive stimulus emotion and its own current emotion. In this example, the coordinates of the guiding emotion were given, and the original external stimulus derived from an expression was sadness with its own coordinates; thus the cognitive stimulus emotion followed a Gaussian distribution whose mathematical expectation and standard deviation are shown in figure 4. From figure 4 we can identify the point where the probability reaches its maximum. The probability in question is the transition probability from the objective stimulus emotion to the robot's subjective emotion when the guiding emotion differs from the stimulus emotion. If the cognitive stimulus emotion differs from the robot's own current emotion, the robot's emotion will change. Figure 5 shows emotional potential curves for different current emotions.

        The robot acts out its emotion in behaviors including facial expressions and the movements and gestures of the upper limbs and chassis[32]. Table 1 shows a robot emotion regulation process with a sadness stimulus and happiness guiding cognition as an instance. In the beginning, the robot was in a calm state. When the sadness stimulus occurred, the emotion shifted toward heartbreak and loud sobbing. However, the actual emotional state was not this intense, because of the positive guiding and the response suppression. The robot's emotion gradually calmed over time according to the emotional intensity attenuation rule in section 3.3.

        4.2 Implementation in HRI

        The HRI experiment setting is shown in figure 6. It includes human-robot communication and interactive evaluation. In human-robot communication: (1) the interacting person's facial expression is acquired by a camera in the robot's chest, and the emotion extracted from the facial expression is taken as the stimulus emotion; (2) the stimulus emotion is translated into the cognitive stimulus emotion via the guiding cognitive reappraisal of section 3.2; (3) the robot's next emotion state is produced from the current emotion and the cognitive stimulus emotion in the emotional interaction model of section 3.4; (4) the robot's emotion is expressed in behaviors, as in figure 3(b), to implement the HRI. After the HRI, the interacting person evaluates the interaction through a questionnaire system, shown at the bottom left of figure 6. The evaluation covers the acceptability, accuracy, richness, fluency, interestingness, friendliness and exaggeration of the robot's behavior. Each aspect is rated on a 5-point scale based on satisfaction, where 1 represents very dissatisfied and 5 represents very satisfied.

        Table I Robot emotion regulation process with sadness stimulus and happiness guiding cognition

        There were five groups in the interactive impact evaluation in our research. The recruited interacting people were teenagers and young adults: 10 middle school students (13-18 years old, 5 male and 5 female), 10 undergraduates (19-22 years old, 5 male and 5 female) and 10 postgraduates (23-28 years old, 5 male and 5 female). Accompanied by the robot, the interacting person could move freely around the 25-square-meter HRI laboratory for emotional interaction. In each group, the interacting person made 20 facial expressions in 10 minutes according to their mood, and the robot gave corresponding emotional expressions. All thirty interacting people took part in each interaction process (each person was included in the 5 different groups) to reveal differences among the groups and to evaluate interactive satisfaction for each emotional stimulus.

        Fig. 5 Transition probability curves with different current emotions

        Fig. 6 HRI experiment setting

        Table II Interacting people’s evaluation results

        Fig. 7 The median value of each evaluation

        The experimental setting of the robot for the 5 groups is shown in table 2, together with the average evaluation results from the interacting people. The satisfaction of Group 1, without any emotional model, is the lowest, while Group 5 is closest to users' needs. Moreover, cognitive reappraisal has the strongest effect on improving the users' experience, response suppression comes second, and intensity attenuation third. The median value of each evaluation is shown in figure 7. As can be seen, compared with interaction without cognitive reappraisal, response suppression or intensity attenuation, cognitive reappraisal improves the acceptability, accuracy, richness, interestingness, friendliness and exaggeration of the robot's behavior; response suppression is effective for accuracy, richness and exaggeration; and intensity attenuation benefits acceptability, accuracy, richness, fluency and exaggeration. Moreover, the combination of cognitive reappraisal, response suppression and intensity attenuation gives the robot's emotional behaviors a comprehensive improvement in HRI. Figure 8 presents the statistical results of the evaluations from the interacting people in human-robot emotion interaction with behaviors. In general, a robot with more emotional capabilities becomes more widely accepted in HRI.

        V. DISCUSSION

        Fig. 8 Statistical results of the evaluation from interacting people in human-robot emotion interaction with behaviors

        Even though the results obtained show that the cognitive emotional model based on the HMM has good applicability and flexibility for emotional interaction modeling, several research directions should be considered in future work.

        In fact, implementing empathy in human-robot interaction is constrained by social context, culture and behavioral characteristics. Deep learning, which is widely used for large-scale feature extraction from big data, could provide diverse datasets as key components of emotion input (e.g. situation, personal background, habit, physiology, etc.), but such methods are rarely combined with emotional spaces[38][39]. This paper attends only to behaviors, which we map into the active field space, elaborating the interaction of emotion states from dynamical laws. We expect that the application of deep learning will promote cross-modal feature fusion and parameter setting in follow-up research.

        The proposed cognitive emotional model is based on an HMM. One main highlight is that we give a heuristic algorithm to determine the transition probabilities and focus on the observed sequence probability in the AVS space. There have been a number of studies on emotional inference and detection models for HRI[29][40], but none of them unify emotional prediction, transfer and expression. Although the method cannot comprehensively describe the real mechanism of emotional production and expression (e.g. weeping for joy), we obtain an effective model of the emotional regulation process and of appropriate behaviors under external stimuli.

        Because this model involves only emotional intensity attenuation, the continuous prediction of spontaneous affect still needs to be improved[41]. We performed only small-sample experiments with 5 control groups to evaluate empathy when cognitive reappraisal, response suppression and intensity attenuation were added to the emotional interaction; although the statistical results show a clear positive improvement, we will certainly consider expanding the sample size and seeking more effective evaluation approaches for affective computing.

        VI. CONCLUSION

        Emotion is an inner feeling and is not directly observable. The robot's physical structure, as the basis of its outward manifestation, plays a significant role in emotional interaction. Based on a robot mechanical platform with 13 degrees of freedom, this paper adopted a multi-joint combined drive to enhance the collaborative capability of the motion joints, and posture of arbitrary direction and precision was achieved by two-way PWM control. Tests show that the control error of each joint's posture is less than 0.5°. Moreover, the chassis with 3 omni-wheels and a fibre optic gyroscope helps the robot implement more intelligent interactive functions such as path planning and autonomous obstacle avoidance. On that basis, empathizing, the main distinguishing feature of our work, was realized by emotional regulation operating in a continuous 3D emotional space, enabling a wide range of intermediate emotions to be obtained. In this emotional space, first, the emotional distribution after guiding cognitive reappraisal is obtained from the intensity of the guiding emotion and the stance of the robot's current emotion; second, the robot's emotional intensity weakens with time according to emotional valence; third, the robot's actual action range is shaped by response suppression, which is related to arousal. From this, the energy-driven emotional regulation process can be analyzed quantitatively through arousal, valence and stance. In general, the HMM emotional regulation model based on cognitive reappraisal in the active field allows the robot to imitate human emotional regulation, and the questionnaire-based experimental results provide evidence that a robot with cognition and emotional control abilities can better serve interacting people's emotional needs in HRI.

        ACKNOWLEDGEMENTS

        This work has been supported by the Beijing Natural Science Foundation (No. 4164091), the China Postdoctoral Science Foundation (No. 2015M580048), the Fundamental Research Funds for the Central Universities (No. FRF-TP-15-034A1), the National Natural Science Foundation of China (No. 61672093, 61432004), and the National Key Research and Development Plan (2016YFB1001404).

        REFERENCES

        [1] L. F. Rodriguez, F. Ramos, “Computational models of emotions for autonomous agents: major challenges”, Artificial Intelligence Review, vol.43, no.3, pp 437-465, 2015.

        [2] D. Luisa, D. Paul, H. Lehmann, “Towards Human-Robot Affective Co-evolution Overcoming Oppositions in Constructing Emotions and Empathy”, International Journal of Social Robotics,vol. 7, no.1, pp 7-18, 2015.

        [3] P. Ekman, “Lie catching and microexpressions”, Oxford University Press, pp 118-133, 2009.

        [4] W. Wang, Z. L. Wang, X. J. Gu, S. Y. Zheng, “Research on the computational model of emotional decision-making”, International Journal of Kansei Information, vol.2, no.3, pp 167-172, 2011.

        [5] C. D. Liu, Y. N. Chung, P. C. Chung, “An interaction-embedded HMM framework for human behavior understanding: with nursing environments as examples”, IEEE Transactions on Information Technology in Biomedicine, vol.5, no.14,pp 1236-1241, 2010.

        [6] S. Boucenna, S. Anzalone, E. Tilmont et al, “Learning of social signatures through imitation game between a robot and a human partner”, IEEE Transactions on Autonomous Mental Development, vol.6, no.3, pp 213-225, 2014.

        [7] W. Joshua, B. Robins, F. Amirabdollahian, et al, “Using the humanoid robot KASPAR to autonomously play triadic games and facilitate collaborative play among children with autism”,IEEE Transactions on Autonomous Mental Development, vol.6, no.3, pp 183-198, 2014.

        [8] U. Tariq, K. H. Lin, Z. Li, et al, “Recognizing emotions from an ensemble of features”, IEEE Transactions on Systems, Man, and Cybernetics — Part B: Cybernetics, vol.42, no.4, pp 1017-1026, 2012.

        [9] J. Velásquez, “An emotion-based approach to robotics”, IEEE/RSJ International Conference on Intelligent Robots and Systems,1999.

        [10] W. Wundt, “Principles of physiological psychology”, New York: Macmillan Press.

        [11] H. Schlosberg, “Three dimensions of emotion”,Psychological Review, vol.61, no.2, pp 81-88,1954.

        [12] J. Panskepp, “Affective neuroscience: The foundations of human and animal emotions”, Oxford University Press, 2004.

        [13] G. Valenza, A. Lanata, E. P. Scilingo, “The role of nonlinear dynamics in affective valence and arousal recognition”, IEEE Transactions on Affective Computing, vol.3, no.2, pp 237-248, 2012.

        [14] I. Hupont, S. Baldassarri, E. Cerezo, “Facial emotional classification: from a discrete perspective to a continuous emotional space”, Pattern Analysis and Applications, vol.16, no.1, pp 41-45,2013.

        [15] J. Russell, A. Mehrabian, “Evidence for a three-factor theory of emotions”, Journal of Research and Personality, vol.11, no.3, pp 273-294,1977.

        [16] G. A. Hollinger, Y. Georgiev, A. Manfredi, B. A. Maxwell, Z. A. Pezzementi, B. Mitchell, “Design of a social mobile robot using emotion-based decision mechanisms”, Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October, pp 3093-3098, 2006.

        [17] B. A. Christian, W. Ipke, “Affective computing with primary and secondary emotions in a virtual human”, Autonomous Agents and Multi-Agent Systems, vol.20, no.1, pp 32-49, 2010.

        [18] M. Zecca, S. Roccella, H. Miwa, “On the development of the emotion expression humanoid robot WE-4RII with RCH-1”, Proceedings of the 4th IEEE/RAS International Conference on Humanoid Robots, Tokyo, Japan, November, pp 235-252, 2005.

        [19] C. Breazeal, “Function meets style: insights from emotion theory applied to HRI”, IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews, vol.34, no.2, pp 187-194, 2004.

        [20] M. L. Hoffman, “Empathy and Moral Development: Implications for Caring and Justice”, Cambridge University Press, 2001.

        [21] M. S. Gou, V. Vouloutsi, K. Grechuta, S. Lallée, P. F. M. J. Verschure, “Empathy in Humanoid Robots”, The 3rd International Conference, Living Machines, Milan, Italy, July, pp 423-426, 2014.

        [22] H. G. Engen, T. Singer, “Empathy circuits”, Current Opinion in Neurobiology, vol.23, no.2, pp 275–282, 2013.

        [23] F. D. Ambrosio, M. Olivier, D. Didon, C. Besche,“The basic empathy scale: a French validation of a measure of empathy in youth”, Personality and Individual Differences, vol.46, no.2, pp 160–165, 2009.

        [24] A. Smith, “The Theory of Moral Sentiments (sixth edition)”, MεταLibri, 1790.

        [25] N. Eisenberg, J. Strayer, “Critical issues in the study of empathy”, New York: Cambridge University, 1987, pp 3-13.

        [26] A. Niculescu, B. V. Dijk, A. Nijholt, H. Li, S. Lan See, “Making Social Robots More Attractive: The Effects of Voice Pitch, Humor and Empathy”, International Journal of Social Robotics, vol.5, no.2, pp 171-191, 2013.

        [27] V. Evers, W. Andi, P. Gregor, F. Groen, “The Evaluation of Empathy, Autonomy and Touch to Inform the Design of an Environmental Monitoring Robot”, The 2nd International Conference on Social Robotics, Singapore, November, pp 285-294, 2010.

        [28] A. Lim, H. G. Okuno, “A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese”, International Journal of Social Robotics, vol.7, no.1, pp 35-49, 2015.

        [29] L. Damiano, P. Dumouchel, H. Lehmann, “Towards Human-Robot Affective Co-evolution: Overcoming Oppositions in Constructing Emotions and Empathy”, International Journal of Social Robotics, vol.7, no.1, pp 7-18, 2015.

        [30] J. J. Gross, “Emotion regulation: affective, cognitive and social consequences”, Psychophysiology, vol.39, pp 281-291, 2002.

        [31] Z. G. Yu, Q. Huang, G. Ma, et al, “Design and development of the humanoid robot BHR-5”, Advances in Mechanical Engineering, 2014.

        [32] L. Xie, F. Gong, Z. L. Wang, “Three wheels omni-directional mobile control device and autism children monitoring system”, 201110066274.7, China, 2012.

        [33] S. Freud, “New introductory lectures on psychoanalysis”, W W Norton & Company, 1990.

        [34] G. T. Fechner, “Elemente der Psychophysik, Part 1”, Kessinger Legacy Reprints, 1860.

        [35] D. H. Qiu, “Mathematical emotion”, Hunan People’s Publishing House, 2001.

        [36] K. Lewin, “Principles of topological psychology”, Munshi Press, 2008.

        [37] X. Liu, L. Xie, Z. L. Wang, D. M. Fu, “Micro-expression capture and recognition based on 3D-gradient projection descriptor”, Journal of Huazhong University of Science and Technology (Natural Science Edition), vol.42, no.12, pp 122-127, 2014.

        [38] A. Vinciarelli, G. Mohammadi, “A survey of personality computing”, IEEE Transactions on Affective Computing, vol.5, no.3, pp 273-291, 2014.

        [39] H. P. Martinez, Y. Bengio, G. N. Yannakakis,“Learning deep physiological models of affect”,IEEE Computational Intelligence Magazine, vol.8,no.2, pp 20-33, 2013.

        [40] S. Kumano, K. Otsuka, D. Mikami, M. Matsuda, J. Yamato, “Analyzing interpersonal empathy via collective impressions”, IEEE Transactions on Affective Computing, vol.6, no.4, pp 324-335, 2015.

        [41] M. A. Nicolaou, H. Gunes, M. Pantic, “Continuous prediction of spontaneous affect from multiple cues and modalities in valence-arousal space”, IEEE Transactions on Affective Computing, vol.2, no.2, pp 92-105, 2011.
