Multitouch gestures and personalized feelings

NAO learned two important things today:

1) Detecting multitouch gestures on its head buttons: one, two, or all three buttons pressed at the same time.

2) Learning to recognize different faces and storing personal feelings towards these people. This will help NAO react differently when interacting with different people.
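The multitouch detection can be sketched as a small mapping from the three head sensors (front, middle, rear) to a gesture label. On the robot the booleans would come from ALMemory events such as "FrontTactilTouched"; the function name and labels below are our own illustration, not the team's code.

```python
# Sketch: combine the states of the three head touch sensors into one
# named multitouch gesture. The boolean inputs stand in for the values
# ALMemory would deliver on the robot.

def classify_gesture(front, middle, rear):
    """Return a label for the current combination of pressed buttons."""
    pressed = [name for name, state in
               (("front", front), ("middle", middle), ("rear", rear)) if state]
    if not pressed:
        return "none"
    return "+".join(pressed)  # e.g. "front+rear" or "front+middle+rear"
```

For example, pressing the front and rear buttons together yields the gesture label "front+rear".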

Team 3 will start combining all previously programmed parts to create a more complex project for the presentation.

Adjusting NAO's personality

NAO's personality parameters can be modified using the touch sensors on its head. Pressing the buttons increases or decreases different parameters and affects NAO's eye color. A ready-to-use block for Choregraphe's library is currently being developed to ease programming and make it even more flexible.

The next steps concerning its personality will be implementing a lock to ensure that personality parameters cannot be changed by two actions running simultaneously, and reacting to multitouch gestures on its head.
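The planned lock could look like the following minimal sketch, which serializes parameter updates with a standard mutex so two behaviours cannot modify the same value at once. The parameter names mirror the three described in this post; the class and method names are hypothetical.

```python
import threading

# Sketch: guard the personality parameters with a lock so that only one
# action block can adjust them at a time, and clamp values to [-1, 1].

class Personality:
    def __init__(self):
        self._lock = threading.Lock()
        self.params = {"fatigue": 0.0, "mood": 0.0, "feelings": 0.0}

    def adjust(self, name, delta):
        with self._lock:                 # only one adjustment at a time
            value = self.params[name] + delta
            self.params[name] = max(-1.0, min(1.0, value))
            return self.params[name]
```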


Further information on NAO's personality

Today's goal was to create and implement a simple personality in NAO's Choregraphe. The following parameters will be used for human-like behaviours such as certain quotes, movements and facial expressions (using the eye LEDs):

1) Fatigue
general parameter
ranging between tired and active

2) Mood
general parameter
ranging between bad and good

3) Feelings
personalized parameter
ranging between hate and love

The team implemented two different ways of passing these parameters from the initialization block to a block performing some kind of action:

1) Wired
The properties are combined into an array and sent from one block's outlet to the next block's inlet. This works fine, but it hurts the readability of the graphical code and will probably not work when different action blocks run simultaneously.
Project file: 2012-02-01_Personality_v2.crg

2) Wireless
Blocks are autonomous code elements and very limited when it comes to sharing common properties. Using ALMemory lets us store the properties in NAO's global memory and access them from any other block. This makes coding more flexible.
Project file: 2012-02-01_PassingParameters4.crg
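The wireless idea can be sketched with a stand-in for ALMemory: one shared store that any block can write to or read from, using the same insertData/getData call names that ALMemory provides. On the robot the plain class below would simply be replaced by the ALMemory proxy; the "Personality/Mood" key is our own example.

```python
# Stand-in for the "wireless" approach: a shared memory mimicking
# ALMemory's insertData/getData, so independent blocks need no wires.

class SharedMemory:
    def __init__(self):
        self._data = {}

    def insertData(self, key, value):   # same call name as ALMemory
        self._data[key] = value

    def getData(self, key):             # same call name as ALMemory
        return self._data[key]

# The initialization block writes the parameters once...
memory = SharedMemory()
memory.insertData("Personality/Mood", 0.8)

# ...and any action block, anywhere in the diagram, reads them back.
mood = memory.getData("Personality/Mood")
```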

Tomorrow's workshop will be dedicated to these two tasks:
1) Using input sensors to adjust the properties and thereby change the behavior of the robot.
2) Implementing the robot's memory using face detection/recognition, so that it interacts differently based on previously stored feelings towards each person.
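The second task above could be sketched as a store of one feeling value per recognised person, from which a reaction is chosen. On the robot the person labels would come from NAO's face recognition; the function name, thresholds and reaction labels here are assumptions for illustration.

```python
# Sketch: remember a feeling in [-1, 1] (hate .. love) per recognised
# person and derive a reaction from it on each encounter.

feelings = {}   # face label -> feeling value

def meet(person, change=0.0):
    """Update the stored feeling for this person and return a reaction."""
    feelings[person] = max(-1.0, min(1.0, feelings.get(person, 0.0) + change))
    value = feelings[person]
    if value > 0.3:
        return "friendly"
    if value < -0.3:
        return "reserved"
    return "neutral"
```

For example, a person NAO has had positive interactions with gets a "friendly" reaction, while an unknown person starts out "neutral".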

First step towards implementing NAO's personality (1/02/2012)

Today we focused on the management of various sensors of the robot.

By controlling all the sensors of the robot, we can easily create a link with the outside world.

In parallel, Mac is working on the implementation of a personality for the NAO robot.

By combining our work, we can quickly create a personality for NAO.

We have focused our efforts on:

- The initialization of the robot (language, volume, etc.)
- The various sensors of the robot (on the head, hands, etc.)
- The walk of the robot with obstacle detection
- The recording of a customized movement
- Catching an object (ball, goblet, etc.)

Here is a screenshot of the programming environment of the robot.

And finally, here is a picture of the robot catching an object.

Work is progressing well.


Project: NAO
Topic: Interactive humanoid
– Artist: BARRET Pascale
– Coach: GROBET Damien
– Expert: ALMEIDA Filipe
– FURLAN Jérémy
– PINKERS Christophe
– SOUSA Duarte
– autonomous behavior
– overcome several obstacles and objects
– interact with people
– create a personality and psychological profile
– tourist guide
– act as presenter
– interface with other technologies (Kinect, Arduino) and objects (ball, etc.)
– object recognition
– use all built-in sensors and actuators
– communication with the PC (i.e. to change background images)
– dialogue and small talk
– sending information, data or pictures via email
– describe the personality
– implement the personality
– create animations
– ways to express feelings (speech, sound, movement, led colors, etc.)
– explore the built-in blocks
– plan additional behaviors
– create the environment (table, objects, screen, etc.)
– blog (
– preparing the presentation
– 2/3 slides for the debriefing

Hello everyone!

Welcome to LARAS – LAboratory for Research in the field of Arts and Sciences.