Humanoid Robotics 2

6/8/2010 *Our Super Robot*

Today is the first day of my internship. Melanie and I worked on the robot. We mainly checked the possible range of motion of the different motors and worked out the ranges that make the robot's movement more human-like.

Ying01.jpg


There are 20 motors and one CM2+ in the robot. Here are the data from today's work:

motor19: 0 = standing on toes; 1023 = heel bent up toward the body

motor20: 512 = standing straight

motor18: 0 = knee back; 300 = standing (robot's possible range: 0-600/700)

motor16: 0 = up front; 1024 = back; 512 = standing

motor17: 600 = left; 512 = standing

motor3: no limit

motor2: needs to be changed because the motion range cannot be reached by a human

motor1: 500 = up

motor4: 0 = front; 1024 = back

motor5: 0 = up (should be changed; cannot go down)

motor6: no limit

motor15: 512 = forward/standing; 400 = turn left; 800 = turn right

motor9: 512 = forward; 600 = turn right; 200 = turn left

motor8: 512 = forward; 0 = turn right; motion range 200-800
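If we end up driving the servos from software, a small helper like the sketch below could keep commands inside these measured limits. This is only an illustration: the dictionary of ranges and the default 0-1023 span are taken from our notes and my own assumptions, not from any official specification.

 # Hypothetical safety limits taken from the ranges measured above.
 # Positions are raw Dynamixel units (0-1023); motors not listed
 # default to the full range.
 MOTOR_LIMITS = {
     18: (0, 700),    # knee: 0 = back, ~300 = standing
     15: (400, 800),  # 400 = turn left, 800 = turn right, 512 = standing
     9:  (200, 600),  # 200 = turn left, 600 = turn right
     8:  (200, 800),  # usable range 200-800
 }
 
 def clamp_goal(motor_id, position):
     """Clip a goal position to the measured safe range for this motor."""
     low, high = MOTOR_LIMITS.get(motor_id, (0, 1023))
     return max(low, min(high, position))
 
 print(clamp_goal(18, 900))  # -> 700, keeps the knee inside its range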

6/9/2010 *Connecting the Lower Half*

This morning, I mainly tried to use the daisy chains to connect the lower half of the robot. I ran into real trouble plugging some of them in. If I had designed the motors, I would have added more jacks to each one, which would give more flexibility in routing the daisy chains.

It is not easy to connect the motors in an organized way. Because the daisy chains have limited lengths, there are only so many connections they can reach, and at the same time we have to be careful not to hamper the robot's motion.


In the afternoon, we tried to use the USB2Dynamixel to connect the PC to the servos. We downloaded USB2Dynamixel Manager 1.03 and successfully controlled the servos through the device. However, when we tried to talk to the servos directly from Python, we failed. This is the main problem we are going to solve tomorrow.
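For the record, the Dynamixel 1.0 protocol is just bytes over a serial line, so one possible approach is to send a WRITE_DATA instruction to the goal-position register through the USB2Dynamixel using pyserial. This is a sketch of the idea, not the code that failed that day; the library choice, the port name, and the 1 Mbps baud rate are all assumptions.

 import serial  # pyserial
 
 def write_goal_position(ser, servo_id, position):
     """Send a Dynamixel protocol 1.0 WRITE_DATA packet to register 30 (goal position)."""
     params = [0x1E, position & 0xFF, (position >> 8) & 0xFF]  # address 30, low byte, high byte
     body = [servo_id, len(params) + 2, 0x03] + params          # length field, WRITE_DATA instruction
     checksum = (~sum(body)) & 0xFF
     ser.write(bytes([0xFF, 0xFF] + body + [checksum]))
 
 # Assumed device name; on Linux the USB2Dynamixel usually appears as /dev/ttyUSB0.
 ser = serial.Serial("/dev/ttyUSB0", baudrate=1000000, timeout=0.1)
 write_goal_position(ser, 20, 512)  # motor 20: 512 = standing straight
 ser.close()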


6/10/2010 *Working On the Insides*

This morning, we had a meeting about the electronic core of the robot. We ran into some challenges with the size of the batteries and the PC.

The task for this morning was to make the robot stand and see how much weight it can support.

The standing position of the robot does not have to be perfectly upright; a bent pose can still have good weight distribution.


My task in the afternoon was to read the background material on Tekkotsu. From it I learned how to run behaviors and how to build state machines.

6/11/2010 *Starting the Simulation*

Today I mainly worked with Professor Blank on starting Mirage in Linux. At the same time, I measured the robot according to the parameters Mirage requires. Here are the results:

Ying02.jpg


I made this diagram after watching the Tekkotsu tutorial on YouTube (http://www.youtube.com/watch?v=rA9tm0gTln8); here are some notes on it:

1. Begin by defining the z axis: the z axis points along the axis of rotation, or along the axis of translation for a prismatic joint.

The x axis of the base frame is a free choice; it is constrained for subsequent joints.

The y axis is then constrained to complete a right-handed coordinate frame.

2. Add another joint: its z axis is defined by the axis of actuation. The D-H parameters are derived from the common normal between consecutive z axes. The new x axis is collinear with the common normal, with its origin at the intersection with the new z. Notice that the origin is not at the center of the actuator; the origin may be in "open space".

Four parameters specify the joint-to-joint transformation: d, θ, r, α.

d: the depth along the previous joint's axis.

θ: the angle about the previous z that aligns its x with the new origin.

r: the distance along the rotated x axis (alternatively, the radius of rotation about the previous z).

α: the rotation about the new x axis that puts z in its desired orientation.

Special case: parallel z axes. The common normal is then not unique, so d is a free parameter; choose any convenient d.

______________________________________________

The tutorial summarized above is quite confusing. I watched the video more than five times and took notes, but I still cannot be sure all my measurements are correct. I am going to discuss this with Professor Blank next Monday!
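To keep the four parameters straight, it helps to write the standard D-H transform out as code: translate d along the previous z, rotate θ about the previous z, translate r along the new x, then rotate α about the new x. The sketch below is just the textbook matrix in numpy, nothing specific to the DHWizard.

 import numpy as np
 
 def dh_transform(d, theta, r, alpha):
     """Homogeneous transform between consecutive joints from classic D-H parameters:
     d     - depth along the previous joint's z axis
     theta - rotation about the previous z axis
     r     - distance along the rotated x axis (radius about the previous z)
     alpha - rotation about the new x axis to orient the new z
     """
     ct, st = np.cos(theta), np.sin(theta)
     ca, sa = np.cos(alpha), np.sin(alpha)
     return np.array([
         [ct, -st * ca,  st * sa, r * ct],
         [st,  ct * ca, -ct * sa, r * st],
         [0.0,      sa,       ca,      d],
         [0.0,     0.0,      0.0,    1.0],
     ])
 
 # Example: parallel z axes (alpha = 0), a convenient d of 0, and a 30 mm link.
 print(dh_transform(d=0.0, theta=np.pi / 2, r=30.0, alpha=0.0))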


6/14/2010 *Simulating the Robot in Mirage*

Simulation01.jpg
Simulation02.jpg

Today, I worked on the simulation of our robot. There are two kinds of joints in the DHWizard: revolute joints and prismatic joints. Since all of the servos in the lower part of the robot are revolute joints, I reconstructed the hip and the legs in Mirage; see the picture on the left. Because there are only two joint types to choose from, and their shapes are quite different from the servos we actually used, there are inevitably some gaps between the joints. It looks like the joints are floating in the air.

After constructing the robot in the simulator, I explored the XML file of the simulation I created to see whether I could change the color or even the material of the robot. And the answer is YES! I can change the color of the robot to pink (some girls may like that), green, blue, or even arbitrary colors defined by a six-digit color code.

6/15/2010 *Correcting the Wrong Joints and Moving On to the Upper Half of the Robot*

Yesterday afternoon, I found a problem in my simulation: whenever I reopened the file, the servos clustered together and the legs had to be stretched back out. Later I found that the problem resulted from the incorrect orientation of the servos as well as wrong distance parameters. I corrected them this morning and moved on to the upper half of the robot.

At noon, the little bluepig needed a short nap, so it crashed. Professor Blank and I restarted it many times, but it still didn't work. Fortunately, the problem was solved in the afternoon!

Because of the bluepig's nap, I started to work on Chiara. I typed in commands to turn the LEDs on and off and tested the hardware. The robot's camera somehow doesn't work, so we will have to take the red box apart later to check it. The picture here shows the whole joint simulation of our robot.





6/16/2010 *Putting a Mesh on the Robot*

GoogleSU.jpg

This morning, I received an email from Professor Blank. He told me that Google SketchUp together with an XML converter can reportedly be used to make meshes for our robot's joints, so that the simulation can look exactly like our robot. So I began by downloading Google SketchUp and learning how to use it. This software is surprisingly easy to use. (Making 3D models is no longer a dream!!!!!!) I can give a shape its third dimension simply by dragging it up or down. The tutorial videos are very helpful, and I strongly recommend that anyone who wants to create a 3D wonderland and live inside it start with Google SketchUp!

Here is the link to the tutorials:[1]

In order to make it export an XML document, I have to add a plugin to the software. However, because there is no GUI for setting the destination path, I have to edit the .rb file directly. The plugin also requires you to have an XML converter and to point the plugin to it.

In the afternoon, I tried to test the exported .mesh.xml file on the bluepig computer but failed. Somehow the computer ran really slowly when handling the mesh. I later found that I need one more converter to change .mesh.xml into .mesh, and that is my task for tomorrow.


6/17/2010 & 6/18/2010 *Make A Success!*

Servomodel.jpg

With help from the Tekkotsu mailing list, I finally figured out how to change the mesh in Mirage. Here are the steps:

1. Use Google SketchUp to draw a mesh. Then download an Ogre plugin that exports .mesh.xml and .material files.

2. Select the object to export and click "export selected".

3. Use the OgreXMLConverter (which is included with Mirage) to convert the .mesh.xml file into a .mesh file. (Check the log for errors.)

4. Remember to copy the .material file into the media folder; this is what lets the mesh actually show up.

5. Copy the .mesh file into the media folder or any folder you created.

6. After using the default joints to create a structure, you can change the name of the mesh, for example from Chiara/Pan to Chiara/New or NewFolder/NewMesh.

7. If the mesh does not show up, it is probably for one of the following reasons:

1) Mirage has trouble importing the .material file created alongside the .mesh file. (You can also copy the text of the .material file into the existing material file. I did both when I was exploring, so I don't really know which one actually works.)

2) It has trouble with the scale. The mesh you created with Google SketchUp might be too small to see. When checking whether the mesh exists, try changing the X, Y, and Z scale parameters (to 100, for example); you may find the mesh was just too small to be seen. (Thanks to the Tekkotsu group for their help; I figured this out with them!)
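Steps 3 to 5 could be wrapped in a short script so the copying isn't forgotten. The sketch below assumes OgreXMLConverter is on the PATH and that Mirage's media folder lives at the path shown; both are guesses that will differ between installs, and the script only wraps the command-line tool.

 import shutil
 import subprocess
 from pathlib import Path
 
 MEDIA_DIR = Path("~/Tekkotsu/Mirage/media").expanduser()  # assumed install location
 
 def convert_and_install(mesh_xml, material_file):
     """Convert an exported .mesh.xml to .mesh and copy both files into Mirage's media folder."""
     mesh_xml = Path(mesh_xml)
     mesh_out = mesh_xml.with_suffix("")  # "servo.mesh.xml" -> "servo.mesh"
     subprocess.run(["OgreXMLConverter", str(mesh_xml), str(mesh_out)], check=True)
     shutil.copy(mesh_out, MEDIA_DIR)
     shutil.copy(material_file, MEDIA_DIR)  # without the .material the mesh stays invisible
 
 convert_and_install("servo.mesh.xml", "servo.material")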

That's my experience with making meshes. When you are making them, you should also mind the orientation: check the orientation of the default mesh so that when you replace it with a new one, the model will not look weird.

I am currently working on the servo models. I will upload it here when I am done.

(One day later)

Okay, I am done. Look at this:


To download the model, please click here: Media:servo.skp

6/21/2010 *Some Problems Appeared Right Before Doug Leaves!*

Today, when I converted the .mesh.xml file into .mesh and loaded it in Mirage, some of the faces were missing. I tried to reverse them, but it still didn't work (now I think I may have grouped them or turned them into components, so that when I changed some of them, the others changed too). Finally, I gave up and tried a model downloaded from the 3D Warehouse instead. At first some of its faces were missing too, but they could easily be fixed by reversing them. File:Robotmodel.jpg Maybe I should develop the meshes based on this model first and change them later, once I figure out what actually happened with my own mesh.

Another problem I met is that the coordinate axes somehow couldn't be accurately aligned with the rotation axis. I will have to figure out how to do that tomorrow.

6/22/2010 *Dividing the Whole into Pieces*

This morning, I received an email telling me that there are Robotis servo models available online. So I downloaded them and tried to process them further to match our robot. Because the robot is currently in the shop, I had to build the meshes based on the video of the robot dancing and couldn't actually examine its details.

6/23/2010 *Dividing the Whole into Pieces*

Robotsim.jpg

I made the basic structure with self-designed meshes today. From the orientation of the same servo in different positions, I learned that the right and left arms and legs cannot use the same meshes (which was my original expectation). Thus, I have to make two meshes of the same design but with opposite orientation. Another problem I met is how to handle the metal brackets. The metal should move with the joints; however, every piece of metal is connected to two servos, so when one servo moves, the bracket and the attached servo should move together. In the simulation, though, the metal only moves with the servo that shares its mesh. I will figure out how to handle this tomorrow.
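Conceptually, the mirrored mesh is the same geometry with one coordinate negated and the triangle winding reversed so the faces still point outward. The numpy sketch below only illustrates that idea on raw vertex and face arrays; in practice the mirroring happens inside Google SketchUp rather than in code.

 import numpy as np
 
 def mirror_mesh(vertices, faces, axis=0):
     """Mirror a triangle mesh across the plane where the given axis equals 0.
     vertices: (N, 3) array of points; faces: (M, 3) array of vertex indices."""
     mirrored = vertices.copy()
     mirrored[:, axis] *= -1.0        # reflect across the chosen plane
     flipped = faces[:, ::-1].copy()  # reverse winding so normals still face outward
     return mirrored, flipped
 
 # Tiny example: one triangle on the right side becomes its left-side counterpart.
 verts = np.array([[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
 tris = np.array([[0, 1, 2]])
 print(mirror_mesh(verts, tris))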






6/25/2010 *Dressing up the Robot*

Simpic.jpg
Simpic2.jpg

Today, the meshes for the robot were mostly done. I left only one mesh unfinished, which is actually just a connecting piece. I found it really difficult to determine the orientation of the meshes, especially when you want to be lazy and use one mesh for both the right and left sides of the body; it can easily go wrong. One important lesson from making the meshes this time is that some meshes can have 0 in all dimensions, so that they won't stick out when you don't want them to in certain motions. This simulation is good enough for testing motions; however, if I have time, I would like to improve the details further.

Next week, I am planning to see whether I can make my robot walk. If I cannot, I may make a maze in the virtual world and go into the details of the robot.

6/28/2010 *Start using Tekkotsu*

Today I started reading the Tekkotsu material at http://www.tekkotsu.org/porting.html#configuration. Because I was advised to make an info.h file based on an existing one, I decided to first thoroughly study the original Chiara file. I downloaded the KHR2 files from here: http://bugs.tekkotsu.org/show_bug.cgi?id=285 and tried to read them. This robot is quite well constructed, and it may work if I just change some variables. After one morning, I finished changing the info.h file according to the actual configuration of our robot. But I don't really know yet how to link it with the file I created in the DHWizard; I may work on that tomorrow.

I also tried hard to get rid of the "left-behind robot shape" in Mirage but did not get any good result. I found that the leftover shape sometimes disappears after the file has been open for a while. Does that mean the model is too large for this computer to process? Other robot simulations work fine, though...


6/29/2010 *Simulating the Maze*

Maze.jpg

Though I changed some code in the info.h file, I still could not figure out where the test-out software is, so I decided to wait until Professor Blank comes. Of course, I did not stop: I started to make the environment, the maze in which the real robot will be tested. I searched the internet for how to create an environment and load it into Mirage, but didn't find any satisfactory result. I don't know exactly what can be done about this, but I can think of one way to integrate the environment with the robot:

1. Export the whole environment as a mesh with Google SketchUp.

2. Put the mesh into Mirage as a joint, positioned with respect to the baseframe.

3. Make it fixed and unmovable relative to the baseframe.
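Before modeling the maze in SketchUp, it might help to sketch the layout as a grid and compute the wall footprints from it, so the walls can be drawn at a consistent size. The layout and the 200 mm cell size below are made up for illustration; the real maze will be modeled by hand.

 # Hypothetical maze layout: '#' cells are walls, '.' cells are open floor.
 LAYOUT = [
     "#######",
     "#.....#",
     "#.###.#",
     "#.#...#",
     "#.#.###",
     "#.....#",
     "#######",
 ]
 CELL_MM = 200.0  # assumed cell size in millimetres
 
 def wall_blocks(layout, cell=CELL_MM):
     """Return one (x, y, width, depth) rectangle per wall cell, in millimetres."""
     blocks = []
     for row, line in enumerate(layout):
         for col, ch in enumerate(line):
             if ch == "#":
                 blocks.append((col * cell, row * cell, cell, cell))
     return blocks
 
 for block in wall_blocks(LAYOUT):
     print(block)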