Developmental Robotics; Creativity and Simulation

June 1st - June 5th

Monday, June 1st

Key points: Creativity, Unsupervised vs. Supervised neural networks, Developmental Robotics, Microsoft Surface

I started my research experience here at BMC this summer late; however, I came in with a slew of ideas, which I discussed with Doug. We talked over my thoughts on creativity and developmental robots: whether genuine creativity is possible for robots (which I now presume it is), how it could be measured, and what kind of environment best fosters it. We also spoke more generally about Developmental Robotics and about removing the anthropomorphic bias inherent in other forms of robotics research. I agreed with Doug that removing this bias, and letting the robot generate its own self-motivations, is key to discovering whether creativity is possible for robots. From there we began discussing unsupervised vs. supervised neural networks with the rest of the research group; Meena is our expert in this area.
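
Since the supervised/unsupervised distinction will keep coming up, here is a toy contrast between the two kinds of update rules, just to pin down the terminology. This is textbook material written from memory, not code from our group.

import random

# Toy contrast between a supervised and an unsupervised update rule.

# Supervised: a perceptron nudges its weights toward a provided target label.
def perceptron_update(weights, inputs, target, lr=0.1):
    prediction = 1 if sum(w * x for w, x in zip(weights, inputs)) > 0 else 0
    error = target - prediction
    return [w + lr * error * x for w, x in zip(weights, inputs)]

# Unsupervised: a Hebbian rule strengthens weights between co-active units,
# with no teacher signal at all.
def hebbian_update(weights, inputs, lr=0.1):
    output = sum(w * x for w, x in zip(weights, inputs))
    return [w + lr * output * x for w, x in zip(weights, inputs)]

weights = [random.uniform(-0.5, 0.5) for _ in range(3)]
print(perceptron_update(weights, [1, 0, 1], target=1))
print(hebbian_update(weights, [1, 0, 1]))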

I spent the rest of the day learning about the Microsoft Surface for which we will do some programming this summer.

Tuesday and Wednesday, June 2nd and 3rd

Key points: Webots and Humanoid Robot Simulation

For the next two days Doug introduced me to the Webots robot simulator, which is the simulator we will be using for our humanoid robot, the Robonova-1. Webots is very cool and, when you are not programming, very simple to use. It allows the user to create worlds, robots, and the code to control those robots, as well as the physics governing the world the robot lives in. I learned that the main goal of this project would be to create an abstract library, written in the Python programming language, for programming humanoid robots. This project would work in conjunction with the [Pyro Project].
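
To give a feel for the kind of interface such a library might offer, here is a minimal sketch. Everything in it is hypothetical: the class name, the joint names, and the backend methods are placeholders I made up, not part of Webots or the Pyro Project.

# Purely illustrative sketch of the kind of interface the abstraction
# library might expose; none of these names exist yet anywhere.

class HumanoidRobot(object):
    """Hypothetical high-level wrapper around a simulated humanoid."""

    def __init__(self, backend):
        # 'backend' stands in for whatever simulator-specific controller
        # object (e.g. a Webots controller) ends up doing the real work.
        self.backend = backend

    def set_joint(self, name, angle):
        """Move a single named joint to an angle in radians."""
        self.backend.set_joint_angle(name, angle)

    def play_motion(self, frames):
        """Step through a list of {joint: angle} keyframes."""
        for frame in frames:
            for joint, angle in frame.items():
                self.set_joint(joint, angle)
            self.backend.step()

# Hypothetical usage:
#   robot = HumanoidRobot(webots_backend)
#   robot.play_motion([{"left_knee": 0.3}, {"left_knee": 0.0}])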

Thursday and Friday, June 4th and 5th

Key Points: Learning Webots Worlds and Simple Robot Models

I spent all of Thursday and Friday learning how to create worlds in Webots and how to build very basic robot models. I primarily studied the lighting of worlds, the creation of objects, the creation of boundaries around objects (to keep my robot from walking through walls), object materials, textures, and such. I eventually created my own simple world, which consists of a fairly simple maze. The end point of the maze is a blue box; when the blue box is spotted, my robot will stop moving around.

Maze.jpg

I also worked on a differential-wheels model of a robot (a robot that is just a body with two wheels, one on each side). I gave this little robot two eyes, containing the two IR sensors (ir0 and ir1), and a little hat containing a camera. I don't have a name for her yet; suggestions would be appreciated.

Differentialwheels.jpg
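
The "stop at the blue box" behavior is not actually written yet; roughly, it should amount to checking the camera image for mostly-blue pixels. The sketch below assumes the Webots Python camera calls (getImageArray, getWidth, getHeight) and a made-up threshold, so treat the details as placeholders rather than my working controller.

# Rough sketch only: decide whether the camera currently sees the blue box.
# Assumes a Webots Camera device whose getImageArray() returns pixels as
# image[x][y] == [red, green, blue]; the threshold is an arbitrary guess.

BLUE_FRACTION = 0.2  # how much of the image must look blue to count as "seen"

def sees_blue_box(camera):
    image = camera.getImageArray()
    if image is None:
        return False
    width, height = camera.getWidth(), camera.getHeight()
    blue_pixels = 0
    for x in range(width):
        for y in range(height):
            r, g, b = image[x][y]
            if b > 100 and b > r and b > g:  # crude "this pixel is blue" test
                blue_pixels += 1
    return blue_pixels >= BLUE_FRACTION * width * height

# Inside the controller loop the check would look something like:
#   if sees_blue_box(camera):
#       self.setSpeed(0, 0)  # end of the maze reached, stop moving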

June 8th - June 13th

Monday and Tuesday, June 8th and 9th

Key Points: Code for Webots and Reading on Creativity and Developmental Robotics

Over the weekend I finished all the reading on Doug's Developmental Robotics research and his PhD thesis. His thesis was awesome: he did research on whether robots can learn to make analogies, which they did. It's a bigger version of what I might want to achieve this summer. I also spoke to a Psychology professor and a Philosophy professor here to guide me in my research.

After speaking with them and with Doug, I realized that the creativity coming from my robot would be most similar to creativity in babies. My robot will only have so many primitive actions, just as a baby has very few abilities at birth (sucking, movement of limbs, etc.). The creativity exhibited by my robot will be based on these primitives. For example, rather than my robot doing art, something creative might be the way my robot takes a movement it already knows and combines it with another movement to create a newer, more complex movement (it would be even better if it were to serve some purpose). Another example might be the robot using one of its primitive movements to manipulate its environment in order to serve some purpose, or in an attempt to relieve its own boredom, such as a baby bouncing to music or pushing a button to reveal some cool consequence (like hearing a noise or seeing a picture). My mother pointed out to me that she found it most interesting when babies realize that they are in control of their own bodies. She particularly likes it when babies stare at their hands, flipping them back and forth, realizing that they have control over them. If a robot could make a discovery about its own body, and use this discovery to manipulate its environment or itself, that would be very interesting and very creative.
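
To make the "combining primitives" idea a little more concrete, here is a toy sketch of what it could mean in code. The primitive names, the way they are glued together, and the novelty score are all invented purely for illustration; nothing here is a design we have committed to.

import random

# Toy illustration of building new actions out of known motion primitives.
# Both the primitives and the novelty measure are made up for this sketch.

primitives = {
    "wave_arm":  ["shoulder_up", "shoulder_down"],
    "step":      ["hip_forward", "knee_bend", "hip_back"],
    "turn_head": ["neck_left", "neck_right"],
}

def combine(name_a, name_b):
    """Concatenate two known primitives into one longer action."""
    return primitives[name_a] + primitives[name_b]

def novelty(sequence, history):
    """Crude measure: an action is interesting only if we haven't done it."""
    return 0 if tuple(sequence) in history else len(sequence)

history = set()
for _ in range(5):
    a, b = random.sample(list(primitives), 2)
    candidate = combine(a, b)
    if novelty(candidate, history) > 0:
        history.add(tuple(candidate))
        print("new combined action: %s + %s -> %s" % (a, b, candidate))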

I spent Monday and Tuesday working on Webots controller code. The basic code for the robot to move around the maze was originally written in C; I translated this code into Python (there are VERY few publicly available controllers written in Python, so making the switch was difficult, as the library was totally new to me). The code is quite simple. I mostly struggled with the translation in terms of object-oriented programming (C is not object oriented, Python is). Because of this, I had trouble determining what belonged in which class and whether I needed to create new classes or whether the classes already existed in the imported libraries.

C Code

/*
 * File:         mybot_simple.c
 * Date:         August 8th, 2006
 * Description:  A really simple controller which moves the MyBot robot
 *               and avoids the walls
 * Author:       Simon Blanchoud
 * Modifications:
 *
 * Copyright (c) 2008 Cyberbotics - www.cyberbotics.com
 */

#include <webots/robot.h>
#include <webots/differential_wheels.h>
#include <webots/distance_sensor.h>

#define SPEED 60
#define TIME_STEP 64

int main()
{
  wb_robot_init(); /* necessary to initialize webots stuff */
  
  /* Get and enable the distance sensors. */
  WbDeviceTag ir0 = wb_robot_get_device("ir0");
  WbDeviceTag ir1 = wb_robot_get_device("ir1");
  wb_distance_sensor_enable(ir0, TIME_STEP);
  wb_distance_sensor_enable(ir1, TIME_STEP);
  
  while(wb_robot_step(TIME_STEP)!=-1) {
    
    /* Get distance sensor values */
    double ir0_value = wb_distance_sensor_get_value(ir0);
    double ir1_value = wb_distance_sensor_get_value(ir1);

    /* Compute the motor speeds */
    double left_speed, right_speed;
    if (ir1_value > 500) {

        /*
         * If both distance sensors are detecting something, this means that
         * we are facing a wall. In this case we need to move backwards.
         */
        if (ir0_value > 500) {
            left_speed = -SPEED;
            right_speed = -SPEED / 2;
        } else {

            /*
             * We turn proportionally to the sensor values because the
             * closer we are to the wall, the more we need to turn.
             */
            left_speed = -ir1_value / 10;
            right_speed = (ir0_value / 10) + 5;
        }
    } else if (ir0_value > 500) {
        left_speed = (ir1_value / 10) + 5;
        right_speed = -ir0_value / 10;
    } else {

        /*
         * If nothing has been detected we can move forward at maximal speed.
         */
        left_speed = SPEED;
        right_speed = SPEED;
    }

    /* Set the motor speeds. */
    wb_differential_wheels_set_speed(left_speed, right_speed);
  }
  
  return 0;
}

My Python Code

# File:        Mybot_Simple_Python.py 
# Date:        6/8/2009 
# Description:  Python Testing Program  
# Author:       AshGavs

from controller import *  
timestep = 64
speed = 60  
class Mybot_Simple_Python (DifferentialWheels):
  
  def run(self):
    distanceSensor0 = self.getDistanceSensor('ir0') # Get the IR sensors and enable them
    distanceSensor1 = self.getDistanceSensor('ir1')
    distanceSensor0.enable(timestep)
    distanceSensor1.enable(timestep)

    while (self.step(timestep) != -1): # Advance the simulation one time step
      ir0_value = distanceSensor0.getValue()
      ir1_value = distanceSensor1.getValue()

      if(ir1_value > 500):
        if(ir0_value > 500): # Both IR sensors sense a wall: back up while turning
          left_speed = speed*-1
          right_speed = (speed*-1)/2
        else:
          left_speed = (ir1_value*-1)/10 # Only IR1 senses a wall: turn away from it
          right_speed = (ir0_value/10) + 5

      elif(ir0_value > 500): # Only IR0 senses a wall: turn the other way
        left_speed = (ir1_value/10)+5
        right_speed = (ir0_value*-1)/10
      else: # no wall detected, move forward
        left_speed = speed
        right_speed = speed

      self.setSpeed(left_speed, right_speed) # set the speed of the wheels
      
controller = Mybot_Simple_Python()
controller.run()


Wednesday, June 10th

Key Points: Testing Python on a Humanoid Robot

I have decided to start work on using Python on a humanoid robot. Though there is no Webots model of the RN-1 (the robot we will end up using), I started working with the Kondo KHR-2HV, a similar bot. Its controller was written in C++, which is object oriented; however, the code is very complicated and will be difficult to translate. For now I am reading and attempting to understand the code, the model, and the physics of the world, and determining whether it would be more worthwhile to build my own world and model from scratch (which we might need to do eventually) or to experiment with this robot.

Humanoid.jpg
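
Whenever a Python controller does run on the KHR-2HV model, I expect its core to look something like the sketch below. I am assuming the Servo-based Webots Python calls here, and the joint names are guesses that would have to match whatever devices the KHR-2HV world actually defines.

# Very rough sketch of a minimal humanoid controller in Python.
# Assumes Webots Servo devices; "LeftShoulder"/"RightShoulder" are guessed
# names and would need to match the devices defined in the KHR-2HV model.

import math
from controller import Robot

TIME_STEP = 64

class SimpleHumanoid(Robot):
  def run(self):
    left_shoulder = self.getServo("LeftShoulder")
    right_shoulder = self.getServo("RightShoulder")

    t = 0.0
    while self.step(TIME_STEP) != -1:
      # Swing both arms back and forth as a first sanity check.
      angle = 0.5 * math.sin(t)
      left_shoulder.setPosition(angle)
      right_shoulder.setPosition(-angle)
      t += TIME_STEP / 1000.0

SimpleHumanoid().run()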