Tuesday 27 November 2007

Recent Activity

With several pieces of coursework due in at the moment, activity on the Virtual Character has been put on hold.

My intention is to try to get the emotion engine complete, or close to it, over Christmas, as well as building on the VR project (mentioned below) to start the framework for the visual interface.

Quick update: despite Prolog's flexibility, it will be easier to program the emotion framework in C++, given that I can then load emotion states from external files and also change the emotional reaction to stimuli. This allows for the evolution of emotions, learning and so on.

Thursday 8 November 2007

Emotion Design 2

This is the continuation of my Emotion Engine design, recording what I have achieved today. Not an awful lot, but I think I might actually be ready to start structuring the Prolog!

Further Emotion Engine Design

Now comes the time of actually attempting to put the design into Prolog. It's been a while since I last used Prolog, so this process is something of a trial-and-error exercise. I'm going to attempt to keep a record of this process, though I'm likely to forget to record something.

In addition, following a recent lecture, it seemed that a further emotion was missing from the emotion engine - Desire. Desire drives us in many aspects of our lives, particularly in the early part of our development.

In fact, it seems to me that disgust is the opposite of desire, so the two can relate that way. I just need to make sure that when a negative disgust is activated (i.e. desire), it has an effect on the emotional state.

An additional aspect that came to my mind following the above thought was "learning". In order to be an "intelligent" character, does my character need to include learning?

The emotions now stand as follows, given as Positive/Negative: [Happiness/Sadness], [Peace/Anger], [Love/Hate], [Friendliness/Hostility], [Desire/Disgust].

All emotions will be on a 0-100 scale, where 0 is the negative pole, 49-50 is neutral and 100 is the positive pole.
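
As a first sketch of how that scale might sit in Prolog (the Axis-Level pair list and the predicate names are my own working assumptions, not a settled design):

% One possible representation: each axis named by its positive pole, held as
% Axis-Level pairs on the 0-100 scale (50 = neutral).
initial_state([happiness-50, peace-50, love-50, friendliness-50, desire-50]).

% clamp(+Value, -Clamped): keep a level inside the 0-100 range.
clamp(V, 0)   :- V < 0, !.
clamp(V, 100) :- V > 100, !.
clamp(V, V).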

Right, the new rules, including desire (a rough Prolog sketch of a couple of them follows the list):

1. increased anger increases sadness (or decreases happiness)
2. increased sadness doesn't affect anger
3. increased happiness decreases anger
4. the character can be calm (i.e. not angry) without being happy
5. anger should wear off
6. consistently happy character should lead to increased love
7. consistently sad/angry character should lead to hate (anger should affect more than sadness)
8. certain words should evoke love/hate response
9. strong love/hate emotions should lead to increased emotional response elsewhere
* NB. Strong love increases happiness, reduces anger, reduces hostility, reduces disgust
Strong hate decreases happiness and increases hostility
10. love above a certain level should activate flirtatiousness (?)
11. hate above a certain level should activate hostility
12. disgust should reduce all other emotions, so that it takes priority

New rules:
13. love increases desire
14. hate reduces desire
15. happiness increases desire
16. sadness reduces desire (minimally, not enough to increase disgust as such)
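
As a very rough first pass at structuring a couple of these rules in Prolog - purely a sketch: the adjust/4 helper, the half-strength knock-on amounts and the reading of rule 12 as a pull towards the negative pole are all my own assumptions:

% The state is a list of Axis-Level pairs on the 0-100 scale, each axis named
% by its positive pole (so low peace means anger, low desire means disgust).

% adjust(+Axis, +Delta, +State0, -State): shift one axis, clamped to 0-100.
adjust(Axis, Delta, State0, [Axis-New|Rest]) :-
    select(Axis-Old, State0, Rest),
    New is max(0, min(100, Old + Delta)).

% Rule 1: increased anger (a drop on the peace axis) also drags happiness down.
anger_stimulus(Amount, State0, State) :-
    adjust(peace, -Amount, State0, S1),
    Knock is Amount // 2,            % half-strength knock-on, integer amounts assumed
    adjust(happiness, -Knock, S1, State).

% Rule 12: increased disgust (a drop on the desire axis) damps the other axes
% so that it takes priority.
disgust_stimulus(Amount, State0, State) :-
    adjust(desire, -Amount, State0, S1),
    Damp is Amount // 2,
    adjust(happiness, -Damp, S1, S2),
    adjust(peace, -Damp, S2, S3),
    adjust(love, -Damp, S3, S4),
    adjust(friendliness, -Damp, S4, State).

For instance, anger_stimulus(20, [happiness-50, peace-50, love-50, friendliness-50, desire-50], S) should leave peace at 30 and happiness at 40.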

Interesting thought:
Emotions usually have two dimensions: their degree (i.e. which emotion and how strong) and their focus. For instance, I love this, I hate that. So far this idea of focus isn't employed. It could probably be handled in the overlying C "brain" program - the focus could be set by the input and the emotional response calculated from the current state (the highest emotional level wins).
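
A tiny sketch of the "highest emotional level wins" part, assuming the brain program hands over a list of Emotion-Level pairs for the current focus (the predicate name is my own):

% strongest_emotion(+Pairs, -Emotion-Level): the emotion at the highest level
% wins, and that is what the response towards the current focus is based on.
strongest_emotion([E], E).
strongest_emotion([A1-L1, A2-L2 | Rest], Best) :-
    (   L1 >= L2
    ->  strongest_emotion([A1-L1 | Rest], Best)
    ;   strongest_emotion([A2-L2 | Rest], Best)
    ).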

Emotion Design 1

This is a record I've kept on my computer whilst thinking through the problem of designing the Emotion Engine. As yet I still haven't actually designed it. Oops. Getting there, though. Just trying to remember Prolog and work out how to structure the predicates. Anyway, the text below is from the 25th October:

Emotion Engine Design

The emotion engine needs to take a current state and a set of actions (which may be only one action) and process those actions to give a new emotional state.
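
In Prolog terms, the top level might look something like this sketch (process_actions/3 and apply_action/3 are assumed names, and the placeholder clause stands in for the rules listed further down):

% process_actions(+State0, +Actions, -State): fold every incoming action into
% the current emotional state to give the new state.
process_actions(State, [], State).
process_actions(State0, [Action|Actions], State) :-
    apply_action(Action, State0, State1),
    process_actions(State1, Actions, State).

% apply_action/3 is where the numbered rules would be encoded; a placeholder:
apply_action(_Action, State, State).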

Emotion is given as a function with several elements. These elements could come from my own approximation given on my blog (happiness, anger, love and hostility), or alternatively from an adaptation of Ekman's model [1] as noted in AlphaWolves etc.: fear, anger, sadness, happiness, disgust and surprise. Ekman's model seems to assume that you feel one emotion at a time.

My model lacks something of Ekman's; however, it is simpler in one sense. A new model, improving on my own in light of Ekman's, might be this: Happiness, Anger, Love, Hostility, Disgust. It seems unlikely that anything will be able to "surprise" the character, so including this emotional aspect would only complicate the processing; also, I have chosen not to give my character fear (both in the interests of simplification and of kindness).

The rules that I created for my emotional model are below. These will need to be manipulated or added to in light of the additional aspect of Disgust.

1. increased anger increases sadness (or decreases happiness)
2. increased sadness doesn't affect anger
3. increased happiness decreases anger
4. the character can be calm (i.e. not angry) without being happy
5. anger should wear off
6. consistently happy character should lead to increased love
7. consistently sad/angry character should lead to hate (anger should affect more than sadness)
8. certain words should evoke love/hate response
9. strong love/hate emotions should lead to increased emotional response elsewhere
* NB. Strong love increases happiness, reduces anger, reduces hostility, reduces disgust
Strong hate decreases happiness and increases hostility
10. love above a certain level should activate flirtatiousness (?)
11. hate above a certain level should activate hostility

New rules:
12. disgust should reduce all other emotions, so that it takes priority

The Prolog action/emotion processor should be created using these rules.

Look ahead: the next step would be the Next Action function, taking the received action and the new emotional state to decide on a response. Responses would be limited, so it would need to choose the most appropriate one.
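
A minimal sketch of that choice in Prolog, assuming a small set of canned responses keyed by the emotion they best express (the response names and pairings are invented purely for illustration):

% A limited response set, keyed by the emotion it best expresses.
response(love,      compliment).
response(happiness, cheerful_reply).
response(anger,     curt_reply).

% next_action(+State, -Response): choose the response matching the strongest
% emotion in the new state, falling back to a neutral reply.
next_action(State, Response) :-
    strongest(State, Emotion-_Level),
    (   response(Emotion, Response)
    ->  true
    ;   Response = neutral_reply
    ).

% strongest(+Pairs, -Emotion-Level): the highest emotional level wins.
strongest([E], E).
strongest([A1-L1, A2-L2 | Rest], Best) :-
    (   L1 >= L2
    ->  strongest([A1-L1 | Rest], Best)
    ;   strongest([A2-L2 | Rest], Best)
    ).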

References:
[1] EKMAN, P. 1992. An Argument for Basic Emotions. In: STEIN, N. and OATLEY, K. (eds.) Basic Emotions. Hove, UK: Lawrence Erlbaum, 169-200.