Humanoid Robotics 2

From IPRE Wiki

6/8/2010 *Our Super Robot*

Today is the first day of my internship. Melanie and I worked on the robot. We mainly checked the possible range of motion of the different motors and determined the appropriate ranges that make the robot's motion more human-like.

Ying01.jpg


There are 20 motors and one CM2+ in the robot. Here are the data from today's work (a small Python sketch of how these limits could be stored follows the list):

motor 19: 0 = standing on toes; 1023 = foot bending up toward itself (heel)
motor 20: 512 = standing straight
motor 18: 0 = knee bent back; 300 = standing (possible range for the robot: 0-600/700)
motor 16: 0 = up front; 1024 = back; 512 = standing
motor 17: 600 = left; 512 = standing
motor 3: no limit
motor 2: needs to be changed, because its motion range cannot be matched by a human
motor 1: 500 = up
motor 4: 0 = front; 1024 = back
motor 5: 0 = up (should be changed; it cannot go down)
motor 6: no limit
motor 15: 512 = forward; 400 = turn left; 800 = turn right; 512 = standing
motor 9: 512 = forward; 600 = turn right; 200 = turn left
motor 8: 512 = forward; 0 = turn right; motion range 200-800
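
As a way of keeping these limits usable later, here is a small Python sketch of how the standing pose and the measured safe ranges could be stored. This is just my own bookkeeping (it never ran on the robot), and only the motors measured today are filled in:

    # Raw servo values are 0-1023.  Standing pose from today's measurements.
    STANDING_POSE = {20: 512, 18: 300, 16: 512, 17: 512, 15: 512, 9: 512, 8: 512}

    # Safe ranges we actually wrote down; motors not listed default to the full range.
    SAFE_RANGE = {18: (0, 600), 8: (200, 800)}

    def clamp(motor_id, position):
        """Clip a commanded position into the measured safe range, if we have one."""
        lo, hi = SAFE_RANGE.get(motor_id, (0, 1023))
        return max(lo, min(hi, position))

    print(clamp(18, 900))   # -> 600: keeps the knee inside its human-like range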

6/9/2010 *Connecting the Lower Half*

This morning, I mainly tried to use the daisy chains to connect the lower half of the robot. I ran into real difficulties when trying to plug in some of the daisy chains. If I were the one who designed the motors, I would have added more jacks to each motor, which would give more flexibility when connecting the daisy chains.

It is not easy to connect the motors in an organized way. Since the daisy-chain cables have length limits, there are only a few connection choices, and at the same time we have to be careful not to hamper the robot's motion.


In the afternoon, we tried to use the USB2Dynamixel to connect the PC and the servos. We downloaded USB2Dynamixel Manager 1.03 and successfully controlled the servos through the device. However, when we tried to talk to the servos directly through Python, we failed. This is the major problem we are going to solve tomorrow.
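
For reference, here is a minimal sketch of what commanding one servo over the USB2Dynamixel from Python might look like, using pyserial and the Dynamixel 1.0 packet format. This is not the code we ran (that is exactly what failed today), and the port name, baud rate, and servo ID are assumptions:

    # A minimal sketch of commanding one AX-12 servo through the USB2Dynamixel
    # with pyserial (Dynamixel protocol 1.0).  Port, baud rate, and servo ID are
    # assumptions -- adjust them for your own setup.
    import serial

    PORT = "/dev/ttyUSB0"   # hypothetical: wherever the USB2Dynamixel shows up
    BAUD = 1000000          # AX-12 default; some setups use 57600

    def write_goal_position(ser, servo_id, position):
        """Send a WRITE_DATA (0x03) packet setting Goal Position (address 30)."""
        position = max(0, min(1023, position))      # AX-12 positions are 0-1023
        params = [30, position & 0xFF, (position >> 8) & 0xFF]
        length = len(params) + 2                    # instruction byte + checksum byte
        body = [servo_id, length, 0x03] + params
        checksum = (~sum(body)) & 0xFF
        ser.write(bytes([0xFF, 0xFF] + body + [checksum]))

    with serial.Serial(PORT, BAUD, timeout=0.1) as ser:
        write_goal_position(ser, servo_id=20, position=512)   # motor 20: standing straight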


6/10/2010 *Working On the Insides*

This morning, we had a meeting about the electronic core of the robot. We ran into some challenges with the size of the batteries and the PC.

The task for this morning was to make the robot stand and see how much weight it can support.

The standing position of the robot does not necessarily have to be perfectly upright; a slightly different pose may also give good weight distribution.


My task for the afternoon was to read the background information about Tekkotsu. From the material, I learned how to run behaviors and how to build state machines.

6/11/2010 *Starting the Simulation*

Today I mainly worked on starting Mirage in Linux with Professor Blank. At the same time, I took measurements of the robot based on the parameters Mirage requires. Here are the results:

Ying02.jpg


I made this diagram after watching the Tekkotsu tutorial on YouTube (http://www.youtube.com/watch?v=rA9tm0gTln8); here are some notes on it:

1. Begin by defining the z axis: the z axis points along the axis of rotation (or along the axis of translation for a prismatic joint). The x axis of the base frame is a free choice, but it is constrained for subsequent joints. The y axis is then constrained so as to complete a right-handed coordinate frame.

2. Add another joint: its z axis is defined by the axis of actuation. The D-H parameters are derived from the common normal between consecutive z axes. The new x axis is collinear with the common normal, with its origin at the intersection with the new z axis. Notice that the origin is not necessarily at the center of the actuator; it may lie in "open space".

3. Four parameters specify the joint-to-joint transformation: d, θ, r, α (the full transform is written out after these notes).

d is the depth along the previous joint's z axis.

θ is the angle about the previous z axis that aligns its x axis with the new origin.

r is the distance along the rotated x axis (alternatively, the radius of rotation about the previous z axis).

α rotates about the new x axis to put the z axis in its desired orientation.

Special case: parallel z axes. The common normal is then not unique, so there is a free parameter here; choose any convenient d.
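
To make the four parameters concrete, here is the joint-to-joint homogeneous transform they describe, written out in LaTeX. This is my own summary of the standard D-H convention, not something copied from the video:

   % Transform from joint frame i-1 to joint frame i using the D-H parameters:
   % translate d along the previous z, rotate theta about the previous z,
   % translate r along the new x, then rotate alpha about the new x.
   {}^{i-1}T_i
     = \mathrm{Trans}_z(d)\,\mathrm{Rot}_z(\theta)\,\mathrm{Trans}_x(r)\,\mathrm{Rot}_x(\alpha)
     = \begin{pmatrix}
         \cos\theta & -\sin\theta\cos\alpha &  \sin\theta\sin\alpha & r\cos\theta \\
         \sin\theta &  \cos\theta\cos\alpha & -\cos\theta\sin\alpha & r\sin\theta \\
         0          &  \sin\alpha           &  \cos\alpha           & d           \\
         0          &  0                    &  0                    & 1
       \end{pmatrix}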

______________________________________________

The tutorial summarized above is indeed very confusing. I watched the video over five times and took notes, but I am still not able to make sure all of the measurements are correct. I am going to discuss this with Professor Blank next Monday!


6/14/2010 *Simulating the Robot in Mirage*

Simulation01.jpg
Simulation02.jpg

Today, I worked on the simulation of our robot. There are two kinds of joints in the DHWizard: revolute joints and prismatic joints. Since all of the servos in the lower part of the robot are revolute joints, I reconstructed the hip and the legs in Mirage; see the picture on the left. Because there are only two joint types to select from, and their shapes are quite different from the parts we used for our robot, there are inevitably some gaps between the joints. It looks as if the joints are floating in the air.

After constructing the robot in the simulator, I explored the xml file of the simulation I created to see whether I could change the color or even the material of the robot. And the answer is YES! I can change the color of the robot to pink (some girls may like that), green, blue, or even arbitrary colors defined by a six-digit hex code.
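
For reference, the "six-digit code" is just the red, green, and blue components written in hexadecimal. Here is a tiny Python helper as a general illustration of that encoding (my own aside, not Mirage's own API):

    # Convert between an (R, G, B) triple (0-255 each) and a six-digit hex color code.
    def rgb_to_hex(r, g, b):
        return "%02X%02X%02X" % (r, g, b)

    def hex_to_rgb(code):
        return tuple(int(code[i:i+2], 16) for i in (0, 2, 4))

    print(rgb_to_hex(255, 105, 180))   # -> "FF69B4", a pink
    print(hex_to_rgb("00FF00"))        # -> (0, 255, 0), pure green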

6/15/2010 *Correcting the Wrong Joints and Moving On to the Upper Half of the Robot*

Yesterday afternoon, I found a problem in my simulation. Whenever I reopened the file, the servos just clustered together and the legs needed to be stretched out. I later found that the problem resulted from the incorrect orientation of the servos as well as wrong distance parameters. I corrected them this morning and moved on to the upper half of the robot.

At noon, the little bluepig needed a short nap, so it crashed. Professor Blank and I restarted it many times, but it still didn't work. Fortunately, the problem was solved in the afternoon!

Because of bluepig's nap, I started to work on Chiara. I typed commands to turn the LED on and off and tested the hardware. The camera of the robot somehow doesn't work, and we will have to take the red box apart later to check it. The picture here shows the whole joint simulation of our robot.





6/16/2010 *Putting a Mesh on the Robot*

GoogleSU.jpg

This morning, I received an email from Professor Blank. He told me that Google SketchUp and an xml converter together can reportedly be used to make meshes for our robot's joints, so that the simulation can look exactly like our robot. So, I began by downloading Google SketchUp and learning how to use it. This software is surprisingly easy to use. (Making 3D models is no longer a dream!!!!!!) I can give a flat shape its third dimension simply by dragging it up or down. The tutorial videos are very helpful, and I strongly recommend that people who want to create a 3D wonderland and live inside it begin with Google SketchUp!

Here is the link to the tutorials:[1]

In order to make it export an xml document, I had to add a plugin to the software. However, because there is no GUI for setting the destination path, I had to edit the .rb file. This plugin also requires you to have an xml converter and to point the plugin to it.

In the afternoon, I tried to test the exported .mesh.xml file on the bluepig computer but failed. Somehow the computer ran really slowly when running the mesh program. I later found that I need one more converter to change .mesh.xml to .mesh, and that is my task for tomorrow.


6/17/2010 & 6/18/2010 *Make A Success!*

Servomodel.jpg

With help from the Tekkotsu mailing list, I successfully found out how to change the mesh in Mirage. Here are the steps:
1. Use Google SketchUp to draw a mesh. You have to download the Ogre plugin for exporting .mesh.xml and .material files.
2. Select the object that should be exported and click "export selected".
3. Use the OgreXMLConverter (which is included in Mirage) to convert the .mesh.xml file to a .mesh file. (Please check whether there are errors in the log.)
4. Remember to copy the .material file into the media folder, which enables the mesh to actually show up.
5. Copy the .mesh file into the media folder or any folder you created.
6. After using the default joints to create a structure, you can change the name of the mesh, for example from Chiara/Pan to Chiara/New or NewFolder/NewMesh.
7. If the mesh does not show up, it is probably because of one of the following reasons:
●1) Mirage has trouble importing the .material file that was created along with the .mesh file. (You may also copy the text of the .material file into the existing material file. I did both when I was exploring, so I don't really know which one actually works.)
●2) It has trouble with the scale. The mesh you created with Google SketchUp might be too small to see. When checking whether the mesh exists, try changing the X, Y, Z scale parameters (making them 100, for example); you may find the mesh was just too small to be seen. (Thanks to the Tekkotsu group for their help; I figured this out with their assistance!)

That's my experience making meshes. When you are making meshes, you should also mind the orientation: check the orientation of the default meshes, so that when you replace them with new ones the model will not look weird.

I am currently working on the servo models. I will upload them here when I am done.

(One day later)

Okay, I am done. Look at this:


To download the model, please click here: Media:servo.skp

6/21/2010 *Some Problems Appeared Right Before Doug's Departure!*

Today, when I converted the .mesh.xml file into .mesh and made it appear in Mirage, some of the faces were missing. I tried to reverse them, but that still didn't work (now I think I may have grouped them or made them into components, so that when I changed some of them, the others changed too). Finally, I gave up and tried to use a model downloaded from the 3D Warehouse instead. At first, some of the faces of that model were missing too, but they could be easily fixed by reversing them. File:Robotmodel.jpg Okay, maybe I should develop the meshes based on this model first and then change them later, after I figure out what actually happened when I made my own mesh.

Another problem I met is that the coordinate axes somehow couldn't be accurately aligned with the rotation axis. I may have to figure out how to do that tomorrow.

6/22/2010 *Dividing the Whole into Pieces*

This morning, I received an email telling me that there are Robotis servo models available online. So I downloaded them and processed them further in order to make them look like our robot. Because the robot is currently in the shop, I had to build the meshes based on the robot's dancing video and couldn't actually examine its details.

6/23/2010 *Dividing the Whole into Pieces*

Robotsim.jpg

I made the basic structure with self-designed meshes today. From the orientation of the same servo in different positions, we can see that the right and left arms and legs cannot use the same meshes (which was my original expectation). Thus, I have to make two meshes of the same design but with opposite orientations. Another problem I met is how to make the metal parts. The metal should move with the joints. However, every single piece of metal is connected to two servos, which means that if one servo moves, the metal connected to it should move along with it; but in the simulation, it seems the metal only moves with the servo that is in the same mesh. I will figure out how to do this tomorrow.






6/25/2010 *Dressing up the Robot*

Simpic.jpg
Simpic2.jpg

Today, the meshes of the robot were mostly done. I left only one mesh unfinished, which is actually a connecting piece. I found it really difficult to determine the orientation of the meshes, especially when you want to be lazy and use one mesh for both the right and left parts of the body; it can easily go wrong. One important lesson I learned from making the meshes this time is that some of the meshes can have 0 in all dimensions, so that your meshes won't stick out where you don't want them to during some motions. This simulation is okay for testing motions; however, if I have time, I would like to further improve the details.

Next week, I am planning to see whether I can make my robot walk. If I cannot, I may make a maze in the virtual world and go into the details of the robot.

6/28/2010 *Start using Tekkotsu*

Today I started reading the Tekkotsu material at http://www.tekkotsu.org/porting.html#configuration. Because I was advised to make an info.h file based on an existing one, I decided first to thoroughly study the original Chiara file. I downloaded the KHR2 files from http://bugs.tekkotsu.org/show_bug.cgi?id=285 and tried to read them. I found that this robot is quite well constructed; it may work if I just change some variables. After one morning, I finished changing the info.h file according to the actual configuration of our robot. But I don't really know how to link it with the file I created in DHWizard. I may work on that tomorrow.

Meanwhile, I tried hard to get rid of the "left-behind robot shape" in Mirage but somehow did not get any good result. I found that the left-behind shape sometimes disappears after the file has been open for a long time. Does that mean the image is too large for this computer to process? However, other robot simulations work fine...


6/29/2010 *Simulating the Maze*

Mazeforrobot.jpg

Though I changed some code in the info.h file, I still could not figure out where the testing software is. So I decided to wait until Professor Blank comes back. Of course, I did not stop working: I started to make the environment, i.e., the maze in which the real robot will be tested. I searched the internet for how to create an environment and load it into Mirage but didn't find any satisfactory result. I don't know what specifically can be done about this, but I can think of one way to integrate this environment with the robot:
1. Export the whole environment as a mesh with Google SketchUp.
2. Put the mesh into Mirage as a joint, positioned with respect to the baseframe.
3. Make it fixed and unmovable relative to the baseframe.

7/1/2010 *Why couldn't I do what we did before?*

Today, Doug finally came back. We tried to launch Mirage together with Tekkotsu, but we somehow failed. Each time, Mirage launches and then crashes once we run Tekkotsu. To facilitate the troubleshooting process, here are the paths of the files related to our simulation:
1) the .kin file of our robot simulation: /usr/local/Tekkotsu/tools/dhwizard/code/, file name: robotwithhands_explore
2) the .mesh files of our robot simulation: /usr/local/Tekkotsu/tools/mirage/media/newrobt (all the .mesh files are inside)
3) the .material file: /usr/local/Tekkotsu/tools/mirage/media/

7/7/2010 *A Better Simulation*

Gort.jpg

Today, I worked until late at night to redo the simulation for the Atlanta conference. I used a different and cleverer approach: first construct the whole robot in SketchUp, and then export it mesh by mesh, servo by servo. This saves time when adjusting the orientations and relative distances. Here is the simulation in SketchUp; I haven't loaded it into Mirage yet. I also sent an email to the Tekkotsu development team asking for help with making the robot work.






7/8/2010 *Twins in Different World*

Gortvsgort.jpg

Today I further refined the simulation. I wanted to load it into Mirage but failed to make the converter work.








8/11/2010 *Summary of This Summer: A Detailed Method to Create a Simulation with the Tekkotsu Platform*

How to Make a Robot Simulation on the Tekkotsu Platform

■Software you may need:

1.Google Sketchup from: http://sketchup.google.com/

2.Ogre Plugin for Google Sketchup from: http://www.ogre3d.org/forums/viewtopic.php?t=20481

The following can be downloaded from: http://www.tekkotsu.org/downloads.html

3.Mirage (in the typical Tekkotsu package)

4.DHWizard (in the typical Tekkotsu package)

5.Tekkotsu package

Of course, a lot of people use Blender instead, which is powerful software for making 3D models.


■ If you are going to add a robot whose design is really close to ours, I would suggest developing your biped robot based on our files, which is much easier than basing it on the Chiara or ERS series (which is what we did). You can download these files, edit them, and then replace the old ones with them (or place them in the appropriate directories).

Here is a list of the files you need to edit (if your robot is a humanoid robot, you can download the files offered here, which can save you a lot of work):

Note: "original location" means either where you can find the existing file or where you should put the file you make.

1.Media:RobotInfo.h (you have to edit based on your robot)

original location: Tekkotsu/Shared

2.Media:RobotInfo.cc (you have to edit based on your robot)

original location: Tekkotsu/Shared

3.Media:GORT.kin (you can name your robot anything; it does not have to be GORT. This is the file you create in DHWizard.)

original location: project/ms/config (you should put it here if you have created your own)

4.Media:hal-GORT.plist (this is what you get after compiling; you cannot edit this file directly)

original location: project/defaults(you should put it here if you have created your own)

5.Media:Environment.conf (remember to edit this one before you compile your code)

original location: Tekkotsu/project/Environment.conf


■Step by step instruction:

1. First, you have to measure the dimensions of your robot accurately and identify which parts move together and how they are connected.

2. Then, you build the robot in SketchUp. When I first did this, I built the robot parts in SketchUp individually instead of the whole robot, so I didn't have trouble with the coordinate system. But the second time, when I built the whole robot and exported the meshes, the default rotation axis was the axis of the whole environment, which means all the parts rotate around the same axis. For now I don't know how to solve this problem, but I think we can try changing the coordinate system of the environment or changing the relative position of the robot in order to make each part rotate around the expected axis. When you export the meshes, make sure they are selected, and use Tools/Export to Ogre to export. If you don't have this option, the plugin may not have installed correctly; please install the plugin again following the instructions in the download package. The files you export from SketchUp through the Ogre plugin are .mesh.xml files.

3. To convert the .mesh.xml files into .mesh files, you need to use the OgreXMLConverter in the Tekkotsu package. Just type something like this in your terminal: OgreXMLConverter test.mesh.xml ("test.mesh.xml" is your file name). Put these .mesh files (after conversion) into Tekkotsu/tools/mirage/media/yourrobotname. (If you have many meshes, see the small batch-conversion sketch after this step.)
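
Converting the meshes one by one gets tedious when a robot has twenty servos, so here is a minimal Python sketch that calls OgreXMLConverter on every .mesh.xml file in a folder and copies the results into the Mirage media folder. The folder names are assumptions based on our layout; adjust them for your own setup:

    # Batch-convert every .mesh.xml in SRC_DIR to .mesh and copy the results into
    # the Mirage media folder.  Both paths are assumptions -- edit them for your setup.
    import glob
    import os
    import shutil
    import subprocess

    SRC_DIR = "exported_meshes"   # where the SketchUp/Ogre plugin wrote the .mesh.xml files
    DEST_DIR = "/usr/local/Tekkotsu/tools/mirage/media/yourrobotname"

    if not os.path.isdir(DEST_DIR):
        os.makedirs(DEST_DIR)
    for xml_path in glob.glob(os.path.join(SRC_DIR, "*.mesh.xml")):
        # OgreXMLConverter writes foo.mesh next to foo.mesh.xml
        subprocess.check_call(["OgreXMLConverter", xml_path])
        mesh_path = xml_path[:-len(".xml")]   # strip the trailing ".xml"
        shutil.copy(mesh_path, DEST_DIR)
        print("converted %s -> %s" % (xml_path, DEST_DIR))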

4. Run Mirage from Tekkotsu/tools/mirage (the location may vary in different versions of Tekkotsu).

5. Run DHWizard from Tekkotsu/tools/bin/DHWizard (the location may vary in different versions of Tekkotsu). Create your own .kin file with the default joints: build a skeleton for the robot with the default joint mesh, and then change the mesh paths to those in "yourrobotname". For example, if I have a mesh named "head.mesh", I change the path to "yourrobotname/head". Pay attention to the scale. If you change the path of the mesh but the mesh doesn't show up, try setting the scale to 1000 in X, Y, and Z. If you are still not able to see it, check the error messages. If you are building a humanoid robot, you need to add one 0-scale joint each for the left leg and the right leg; these two are the reference frames for the legs. (There is a bug in DHWizard: if you open the file from the Open option in DHWizard, two simulations will actually show up and you can only control one. To get rid of the extra robot, open the file from the terminal at the same time you run DHWizard.)

6. One important thing is that you should set the body as the baseframe, no matter whether it is a spider or a humanoid robot. Every extension of the new robot should branch out from the baseframe. The names of the joints should be highly organized, because they are the key for the Tekkotsu platform to recognize the joints correctly.

7. After the steps above, we have to go on and edit the robot's Info source files. We can make a copy of ChiaraInfo.h (or just download ours from the link above) and rename it to YourrobotnameInfo.h. Open the file and edit the code based on your robot, such as the number of arms, legs, and sensors. Remember that the names in these few lines should match the names of the servos in your .kin file, and the order matters. The ranges of the servos should also match those in the .kin file.

8. Put YourrobotnameInfo.h into the Tekkotsu/Shared directory.

9. Edit the Tekkotsu/Shared/RobotInfo.h file to add the entry:

   #elif TGT_Yourrobotname
   #       include "Shared/YourrobotnameInfo.h"
   namespace RobotInfo { using namespace YourrobotnameInfo; }

underneath the first entry that has an "if" (since this is an "elif")

Edit the Tekkotsu/Shared/RobotInfo.cc file to add the entry:

   #include " YourrobotnameInfo.h"
   namespace YourrobotnameInfo {
       const char* const TargetName="Yourrobotname";
       const YourrobotnameCapabilities capabilities;
   }

among the other entries

10. Put Yourrobotname.kin into the project/ms/config directory.

11. Edit Environment.conf to set TEKKOTSU_TARGET_MODEL to TGT_Yourrobotname.

12. Build the project by typing "make". (When we compiled the project, we had some issues with the files under Tekkotsu/Behaviors/Demos, so we just moved the Demos folder out of Tekkotsu.)

13. The hal-yourrobotname.plist file should now have been generated.

14. Put hal-yourrobotname.plist into project/defaults.

15. Then, run the project:

cd PROJECT

./tekkotsu-Yourrobotname -c mirage.plist

16. If you want to use the ControllerGUI to control it, run:

cd Tekkotsu/tools/mon

./ControllerGUI localhost

(Thank you very much to Jason Tennyson and Ethan Tira-Thompson for their help with this project.)