Jim is the current incarnation of the emotion-based character. He doesn't have a pretty 3D interface, but he does use emotions and also demonstrates learning, at least to some degree.
If you want to give him a try, you can download a Windows version or a pre-compiled Linux version.
I'm keeping the code under wraps, for the time being. At least until I've finished most of the demonstrations etc. related to the project.
Version 0.1 (the version released so far) comes with a relatively untrained character, so you will need to train it yourself. Brief instructions are included in the downloads.
Friday, 7 March 2008
Conference Paper
Today my conference paper was submitted. It will be published, and I'll also be presenting it in May!
For your reading pleasure, you may view the PDF version of my conference paper here.
Tuesday, 4 March 2008
Character Brain Working
Well, the character's brain seems to be working OK. I'm currently going through the process of training it on which actions are appropriate responses to the different user actions. It's getting there.
At present only a Linux version exists, and the last time I tried it wouldn't compile on Windows (something to do with the enormous arrays used to hold the "appropriateness" data...)
Something I have also thought of doing is introducing long-term memory. Using an external file the emotional state could be saved at the end of a session and loaded at the beginning of the next one... this could be an interesting addition.
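The long-term memory idea could be sketched roughly like this (a minimal Python sketch, assuming the emotional state is a simple set of named values; the emotion names and file name here are my illustrative assumptions, not the project's actual code):

```python
import json

# Hypothetical sketch: persist the emotional state to an external file
# at the end of a session and reload it at the start of the next one.
# Emotion names, default values and file name are illustrative only.

STATE_FILE = "emotions.json"
DEFAULT_STATE = {"happiness": 50, "sadness": 50, "love": 0, "hate": 0}

def save_state(state, path=STATE_FILE):
    """Write the current emotional state to an external file."""
    with open(path, "w") as f:
        json.dump(state, f)

def load_state(path=STATE_FILE):
    """Load the previous session's state, falling back to defaults."""
    try:
        with open(path) as f:
            return json.load(f)
    except (OSError, ValueError):
        return dict(DEFAULT_STATE)
```

Falling back to a default state keeps the character usable even when no previous session exists.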
At the moment the character doesn't really display any goals, but I'm not sure how to factor this into the current algorithms. It may be impossible at this stage.
I'm still upset that the visual aspects have had to be ignored. Unless a miracle happens, it seems unlikely that they can be integrated before the end of the project. Still, the investigation into the use of emotion as a decision-making tool has been quite interesting!
Labels:
Character,
Emotion,
Milestone,
Programming
Thursday, 28 February 2008
Character Actions
Progress has been slow of late with OpenSG, to the point where I'm going to need to contact my project supervisor.
However, in order to make the most of the situation, I have been spending time developing the remaining framework of the character - taking an input, making an emotional decision and then giving an output. I will at some point publish the algorithms used in these processes, but in the meantime I wanted to publish the actual list of inputs and outputs.
Natural Language Processing simply wasn't an option with everything else I have to do, so it's more a matter of having a defined list of inputs / outputs.
There's going to be some learning involved, so that the character learns the most appropriate output, rather than it being hardcoded in. This should make it a little more interesting, and a little less predictable!
Inputs:
1. Say "Hello!"
2. Say "I love you!"
3. Say "Where's the toilet?"
4. Say "You smell!"
5. Say "I'm sad."
6. Say "You're sad!"
7. Say "AI sucks"
8. Say "You're quite clever, aren't you?"
9. Say "What do you know about AI?"
10. Say "Goodbye" (procedure for exiting?)
11. Do: Kiss
12. Do: Hug
13. Do: Smile at
14. Do: Stroke
15. Do: Rub belly
16. Do: Poke tongue out playfully
17. Do: Hold hand
18. Do: Give present
19. Do: Dance with
20. Do: Tickle
21. Do: Kick
22. Do: Poke
23. Do: Laugh at
24. Do: Poke tongue out at, in rude way
25. Do: Smack
26. Do: Push
27. Do: Blow a raspberry
28. Do: Trip up
29. Do: Scream loudly
30. Do: Shake fist at
Outputs:
1. Say "The toilet is not here."
2. Say "Artificial Intelligence? That's far too big a subject for me to know! Look on Wikipedia!"
3. Say "Hello!"
4. Say "Thank you!"
5. Wave
6. Say "It's a beautiful day!"
7. Smile
8. Laugh
9. Say "I just want to die!"
10. Cry
11. Sulk
12. Dance
13. Sit
14. Gaze into space
15. Say "That won't help!"
16. Say "I don't care about AI!"
17. Stamp feet
18. Kiss
19. Say "I love you!"
20. Blush
21. Scowl at
22. Punch
23. Say "I hate you!"
24. Say "Would you like to be my friend?"
25. Cuddle
26. Tickle
27. Push you away
28. Say "Get away from me, you minger!"
29. Poke you
30. Drool
31. Say "Hmmmmm... me want!"
32. Lick lips
33. Say "Ugh!"
34. Turn up nose
35. Shake head
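The learning idea behind these lists could be sketched as follows (an illustrative Python sketch, not the project's actual algorithm: a table of "appropriateness" scores per input/output pair, nudged up or down by feedback, with occasional random exploration; all names and numbers here are my assumptions):

```python
import random

# Illustrative sketch: learn which output is "appropriate" for each
# input by keeping a score table and adjusting scores from feedback.
# The table sizes match the input/output lists above.

NUM_INPUTS, NUM_OUTPUTS = 30, 35

# appropriateness[i][o] = learned score for answering input i with output o
appropriateness = [[0.0] * NUM_OUTPUTS for _ in range(NUM_INPUTS)]

def choose_output(input_id, explore=0.1):
    """Mostly pick the best-scoring output, sometimes explore randomly."""
    if random.random() < explore:
        return random.randrange(NUM_OUTPUTS)
    row = appropriateness[input_id]
    return row.index(max(row))

def give_feedback(input_id, output_id, reward, rate=0.2):
    """Move the score toward the observed reward (e.g. +1 good, -1 bad)."""
    row = appropriateness[input_id]
    row[output_id] += rate * (reward - row[output_id])
```

The small exploration chance is what keeps the character "a little less predictable" while it converges on sensible responses.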
Thursday, 14 February 2008
Character Animation in OpenSG
Today I've been investigating programming in OpenSG, by playing around with a really simple program that was used for my Virtual Reality project.
I also did a search around for a character to use - now that I'm using Blender, Generi (which is a Maya character rig) is no longer suitable. After a little searching I came across a little chap called Ludwig, created by Jason Pierce.
I've plugged it into OpenSG and all seemed reasonably OK, but after playing around a bit and doing some research it looks like exporting from Blender into OpenSG exports a fixed model. Oh dear. So if I want to do any animation (which clearly I do) I'm going to have to program the individual parts as individual nodes, and then get them to move around that way.
Tomorrow's job is to investigate creating a simple face using individual nodes.
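The individual-nodes approach could look something like this (a language-agnostic sketch in Python: each part gets its own transform node under a common parent, so parts can be animated by updating their transforms; the class and method names are illustrative, not OpenSG's actual API):

```python
# Sketch of per-part animation: each body part is its own node with a
# local transform, hung under a shared parent node. Names are invented
# for illustration and do not correspond to OpenSG's real classes.

class TransformNode:
    def __init__(self, name, x=0.0, y=0.0, z=0.0):
        self.name = name
        self.position = [x, y, z]
        self.children = []

    def add_child(self, node):
        self.children.append(node)
        return node

    def translate(self, dx, dy, dz):
        """Animate this part by nudging its local transform each frame."""
        self.position = [p + d for p, d in zip(self.position, (dx, dy, dz))]

face = TransformNode("face")
left_eye = face.add_child(TransformNode("left_eye", -1.0, 1.0, 0.0))
mouth = face.add_child(TransformNode("mouth", 0.0, -1.0, 0.0))

# e.g. a simple "smile" could raise the mouth a little over several frames
mouth.translate(0.0, 0.1, 0.0)
```

Because only the mouth's local transform changes, the rest of the face stays put - which is exactly why a fixed exported model won't do.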
Thursday, 17 January 2008
Emotion Engine Complete
The emotion engine has gone through some changes, both to make it more efficient and also to make it slightly less predictable.
The emotion engine now includes a randomisation effect to increase or decrease all of the emotions whenever the system is running without input. The idea is, if nothing is happening to the character the emotions should continue to fluctuate and change. If no one talks to you, you still manage to get happier or sadder, don't you?
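That idle-time randomisation effect could be sketched like so (a minimal Python sketch under my own assumptions about emotion names, step size and value range):

```python
import random

# Sketch of the idle-time randomisation: when no input arrives, every
# emotion drifts up or down by a small random amount, clamped to a
# fixed range. Step size and the 0-100 range are assumptions.

def idle_tick(emotions, max_step=2, lo=0, hi=100):
    """Randomly nudge each emotion, keeping values within [lo, hi]."""
    for name in emotions:
        nudged = emotions[name] + random.randint(-max_step, max_step)
        emotions[name] = min(hi, max(lo, nudged))
    return emotions
```

Called once per update cycle with no user input, this keeps the character's mood gently wandering rather than frozen.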
If you are particularly mean to the test character (called "Larry") he gets extremely sad and struggles to get out of this rut. It seems the emotion engine I've designed may suffer from depression... However, this is probably quite acceptable - if someone kept repeatedly kicking me, I would get quite upset and it would take a lot to make me happier.
Test out "Larry" here. Please comment on this blog if there are any problems!
A quick note for future versions:
It may be interesting for some actions to have longer lasting effects. Such that if something really good/bad happens, it continues to affect the emotions after the initial action. This would be something to work into the overlying program, rather than the emotion engine itself.
Monday, 7 January 2008
Emotion Engine Functional!
The emotion engine is partially functional - the emotion object code has been tested in a very small (and extremely basic) program and the functions to run the emotions all seem to work.
However, during testing it seems that if you are particularly horrible to Larry (the test emotional character) he can get very hateful very quickly and not come out of it particularly fast. Therefore a new set of rules will be added which state:
17 - hate wears off naturally (only a little bit)
18 - love wears off (again, limited as above)
This marks an important milestone. Now that the emotions work, the character can use this engine to begin to appear intelligent!
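The wear-off rules could be sketched like this (a hedged Python sketch, assuming the emotions are simple numeric values; the decay amount and emotion names are my assumptions):

```python
# Sketch of the proposed wear-off rules (17 and 18): hate and love both
# decay naturally by a small fixed amount each update, so Larry can
# eventually climb out of an extreme mood. The decay size is assumed.

def apply_wear_off(emotions, decay=1, floor=0):
    """Reduce hate and love slightly, never dropping below the floor."""
    for name in ("hate", "love"):
        if name in emotions:
            emotions[name] = max(floor, emotions[name] - decay)
    return emotions
```

Run on every update cycle, a small decay like this stops extreme hate from becoming a permanent state without making it wear off unrealistically fast.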