Exploring Search and Rescue with TevBot

From IPRE Wiki

TevBot: The Amazing Hexapod


Senior Thesis Proposal

Exploring Search and Rescue with an Autonomous Hexapod Robot

Teyvonia Thomas

Thesis Advisors: Doug Blank and David Nice




Search and rescue refers to the process of searching for and assisting people or property in danger. Search and rescue operations are carried out in a variety of settings and circumstances, including mountainous regions, lakes, deserts and forests, after structural collapses, and during wartime. These operations can be extremely dangerous for rescue workers, and as a result mobile robots are increasingly being used to explore and map these locations. The deployment of robots for search and rescue operations has many advantages. One significant advantage is that miniature mobile robots can enter confined or toxic areas that are not easily accessible to humans and search dogs.

The goal of this thesis is to use an autonomous hexapod robot (TevBot) to navigate through flat and rugged terrain searching for simulated people while mapping the terrain and relaying victim locations, hazardous regions and passable openings to a central location which can then be used to plan a rescue mission.

There are several subtasks to be solved in order to execute such a complex operation. One central aspect of this search is navigation. Navigation involves the ability to move efficiently from a start location to a goal location and to determine whether or not the target has been reached. To navigate successfully, TevBot must be able to determine its position relative to surrounding objects in order to interact with them appropriately. The robot must also be able to determine where its body parts are in relation to each other and to the objects it handles. Doing this requires a variety of sensors. Infrared and sonar sensors will be used to measure distance and luminosity, to avoid collisions with obstacles, and to keep the robot from falling off ledges. Body marks and an onboard camera will also be used to determine the position and orientation of the robot’s body parts with respect to its surroundings. The robot is also equipped with a microphone, which will be useful for detecting sounds.
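As a first illustration of how these proximity readings could drive navigation, here is a minimal sketch of a reactive avoidance step. The `choose_action` function and the 20 cm threshold are hypothetical stand-ins, not TevBot's actual control code; the behavior names match the movement behaviors planned for the robot.

```python
# Sketch of a reactive obstacle-avoidance step using three proximity
# readings (left, center, right), in cm. choose_action() and the
# threshold below are illustrative assumptions, not TevBot's real API.

OBSTACLE_THRESHOLD = 20  # assumed trigger distance in cm

def choose_action(left, center, right, threshold=OBSTACLE_THRESHOLD):
    """Pick a movement behavior from three distance readings."""
    if left > threshold and center > threshold and right > threshold:
        return "STEP FORWARD"      # path ahead is clear
    if left > right:
        return "SPIN LEFT"         # more room on the left
    if right > left:
        return "SPIN RIGHT"        # more room on the right
    return "STEP BACKWARD"         # boxed in; back out

print(choose_action(50, 50, 50))  # STEP FORWARD
print(choose_action(10, 5, 40))   # SPIN RIGHT
```

A loop that reads the sensors, calls this function, and executes the returned behavior would give simple wall-and-cliff avoidance without any map.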

A variety of search algorithms, such as genetic algorithms, hill climbing, random search and simulated annealing, will also be explored in order to determine which is most efficient. Object recognition is an important part of detecting victims; several neural network approaches will be explored for this task. Different algorithms will also be investigated for mapping the environment. The result will be a human-robot interface with an intuitive, user-friendly graphical user interface that can be operated by people with a range of computer skills. The graphical user interface will be used for controlling and monitoring TevBot and for displaying useful information such as images, maps of the environment, sensor readings, the robot's battery state, surrounding obstacles and the victims' locations.
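To make the comparison concrete, here is a toy hill-climbing sketch of the kind of search being considered. The score function is a made-up stand-in for a real measurement (for example, how strongly a candidate heading points toward a target); it is not part of the thesis code.

```python
# Toy hill climbing: adjust a heading (in degrees) to maximize a
# scalar score. score() is a hypothetical landscape with its peak
# at 90 degrees, standing in for a real camera or sensor measurement.

def score(heading):
    return -abs(heading - 90)  # best possible score is 0, at 90 degrees

def hill_climb(start, step=5, iterations=100):
    heading = start
    for _ in range(iterations):
        candidates = [heading - step, heading + step]
        best = max(candidates, key=score)
        if score(best) <= score(heading):
            break  # local maximum reached; stop climbing
        heading = best
    return heading

print(hill_climb(0))  # 90
```

Random search, simulated annealing and genetic algorithms would replace the "take the better neighbor" rule with their own candidate-generation and acceptance rules, which is what makes an efficiency comparison between them meaningful.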



TevBot's Mechanical Components

TevBot’s chassis (body), legs, Fluke mount and gripper were constructed from strong, lightweight aluminum. The parts were all designed in AutoCAD and manufactured on a TRAK milling machine. TevBot weighs 2.9 kg and is 14 cm tall when standing. All six legs are identical. The two front legs and two rear legs have a wider range of motion than the middle legs, but each leg has at least 45 degrees of motion about its joint motor without interfering with adjacent legs.

Mechanical Components.jpg

Electronic Components

Electronic Components.jpg

The Sensor Module houses an IrDA receiver which can be used to communicate with other robots via infrared. It has three proximity sensors (left, right and center) that can measure distance and luminosity, a microphone to detect sounds and a piezo-electric speaker that can be used to create music and beeps.

The Fluke is a board with Bluetooth capability, a camera, IR sensors, LEDs and a microprocessor. I connected a TTL-to-RS232 level shifter between the CM-5's Zig-100 port and the Fluke's serial port and was able to send messages to the CM-5. With the Fluke I can control TevBot wirelessly.

The Microcontroller (CM-5 unit) uses a serial connection to communicate with each of the servo motors as well as with the sensor module. Each servo has a pre-programmed ID (which can be changed) that allows the individual servos to be identified and addressed easily.
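The Bioloid servos the CM-5 addresses speak the Dynamixel instruction-packet protocol, so the ID-based addressing above can be sketched by building such a packet. This is an illustrative sketch of the packet format, not code that runs on the CM-5 itself.

```python
# Sketch of a Dynamixel-style instruction packet of the kind the CM-5
# sends to an individual servo. Format (Dynamixel protocol 1.0):
#   0xFF 0xFF <id> <length> <instruction> <params...> <checksum>
# where length = number of params + 2, and the checksum is the bitwise
# complement of the sum of id, length, instruction and params.

def build_packet(servo_id, instruction, params=()):
    length = len(params) + 2
    body = [servo_id, length, instruction, *params]
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF, *body, checksum])

# PING (instruction 0x01) addressed to the servo with ID 1:
print(build_packet(1, 0x01).hex())  # ffff010201fb
```

Because every packet carries the target servo's ID, a single serial bus can reach all of the leg motors, which is why the pre-programmed IDs make identification and communication straightforward.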



Week 01: Jan 25 - 31

Accomplishments:

  • Submitted Proposal
  • First Senior Conference Meeting

Feedback:

  • Narrow focus of research


What's Next?

  • Explore SLAM
  • Explore Object Recognition

Week 02: Feb 01 - 07

Challenge One: Recognizing Objects

Main Objective: Program TevBot to recognize an orange body

To Do:

  • Add the following Behaviors to TevBot: STEP FORWARD, STEP BACKWARD, SPIN RIGHT, SPIN LEFT, MOVE RIGHT, MOVE LEFT, STOP
    • <Video of TevBot carrying out a sequence of these steps goes here>
  • Explore SLAM - "Simultaneous localization and mapping is a technique used by robots and autonomous vehicles to build up a map within an unknown environment while at the same time keeping track of their current position." - Wikipedia

Accomplishments:

  • Added more movements
  • Readings on object recognition
  • Explored different options for a GUI

Week 03: Feb 08 - 14

Accomplishments:

  • Programmed TevBot to take pictures while walking and spinning
  • Made videos of TevBot's movements so far
    • <This week's videos go here>

Problems Encountered:

  • Hardware Malfunction: Loose wires, Bluetooth issues


What's Next?

  • Improve Hardware:
    • Check if CM-5 batteries are dead. If they are, reorder new ones
    • Get plugs for batteries on Fluke
    • Build base (rubber feet) for TevBot
    • Replace gripper with sensor module. This will make it easier for TevBot to detect objects in front of her as well as to avoid falling off tables

Other Goals:

  • Program TevBot to recognize big orange face (stool top) near it (on the table)
  • Program the robot to recognize/find a big orange Halloween basket on the floor
    • <Video of TevBot searching for the basket goes here>

Anticipated Problems:

  • Avoiding obstacles: location of sensor module; more sensors needed - e.g. on the back of the robot, possibly bump sensors on the legs
  • Lighting will affect vision processing
  • Power - movement restricted without a battery pack

Readings So Far

Mobile Robot for Search and Rescue

Mapping and Exploration for Search and Rescue with Humans and Mobile Robots

An Autonomous Team of Search and Rescue Robots

Human-Robot Teaming for Search and Rescue

SLAM for Dummies: A Tutorial Approach to Simultaneous Localization and Mapping


Week 04: February 15 - 20

Accomplishments:

TevBot Resting


TevBot Standing


Challenges:

  • Battery Issues
  • Gaits/Dragging motion

What's Next:

  • More Gait Analysis and refinement
  • Carry out simple object recognition experiments
  • Work on GUI
  • Explore Mapping and Search Algorithms

Week 05: February 20 - 27

Accomplishments:

  • Gait Analysis:
    • Readings:
      • Metachronal Wave Gait Generation for Hexapod Robots
        • The paper discusses the development of Cyclic Genetic Algorithms to produce a metachronal wave gait for autonomous legged robots
        • The metachronal wave is a very common walking pattern used by terrestrial arthropods such as insects and spiders
        • In this gait, the animal begins by lifting its hind legs first followed by the legs in front (only when the back legs are on the ground). This ensures that there is sufficient support for the animal's weight as well as stability while the animal is moving forward
        • CGA - "an evolutionary learning algorithm specialized to solve tasks that require continual cycles of sequential instructions"
        • Steps:
          1. Take capability measures of the robot: range and rate of motion - store in data structure - represents range of possible gaits
          2. Develop appropriate CGA and use it to train the robot model
          3. Test the gaits produced on the actual robot
        • The gaits produced were tested successfully on the semi-autonomous hexapod robot ServoBot
      • Walking Genetic Algorithms for Gait Synthesis in a Hexapod Robot
        • Explores the usage of a complex motor pattern generator (neural network) to develop gaits for a six-legged robot
      • Walking Hexapod Robot in Disaster Recovery: Developing Algorithm for Terrain Negotiation and Navigation
        • Explores Tripod and Wave Gait
        • Use of Bump Sensors for Obstacle and Collision Avoidance

  • Met with Physics adviser, David Nice:
    • Discussed Gait Analysis and Goals/Challenges of Research
    • Focusing on a specific problem (such as gait development and optimization) vs. developing fundamental components of search and rescue - basic walk, mapping, GUI - optimize later

  • Carried out Simple Object Recognition Experiments
    • Experiment 1: Recognize an Orange Stool
    • Experimental Setup:
      • Stool to right of TevBot - on table
TevbotExperiment1.jpg


      • Steps:
        1. Take a picture and analyze it: determine whether a large fraction (more than 20% of the pixels in the frame) of the pixels are orange
        2. If yes, then announce that you've found the stool and state the position of the stool in relation to robot's position
        3. If no stool found, then keep searching
      • Searching involves rotating the camera/Fluke about its mount through 150 degrees; if the stool is not found during these rotations, the robot spins right (~90 degrees) and repeats until the stool is found
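The orange test in step 1 above can be sketched as a pixel count over the frame. The RGB thresholds below are assumed values for what counts as "orange", not the thresholds actually tuned for the stool; an image is treated as a flat list of (r, g, b) tuples.

```python
# Sketch of the "more than 20% orange pixels" test from the experiment
# above. The is_orange() thresholds are illustrative assumptions.

def is_orange(r, g, b):
    # Roughly orange: strong red, moderate green, little blue.
    return r > 150 and 50 < g < 150 and b < 100

def found_stool(pixels, fraction=0.20):
    """Return True if more than `fraction` of the pixels look orange."""
    orange = sum(1 for (r, g, b) in pixels if is_orange(r, g, b))
    return orange > fraction * len(pixels)

# A frame that is mostly orange vs. one with no orange at all:
mostly_orange = [(255, 100, 0)] * 30 + [(0, 0, 255)] * 10
print(found_stool(mostly_orange))        # True
print(found_stool([(0, 0, 255)] * 40))   # False
```

As noted in the anticipated problems, lighting shifts the RGB values, so fixed thresholds like these are fragile; that is one motivation for exploring learned recognizers later.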

  • GUI Development
    • Looked at PyQT, PyGTK
  • Thesis Planning and Management
    • Gantt Chart

My Gantt Chart

Gantt Chart Templates


  • Thesis Writing
    • LaTeX Tutorial

Challenges:

  • Bluetooth disconnections/errors - while taking pictures
  • Gait Development

What's Next:

  • Writing Thesis: Background/Motivation, Introduction, Analysis of the Problem
  • Technical Documentation
  • More Object Recognition Experiments
  • Gait Development
  • GUI Development

Week 06: February 28 - March 6

Goals:

  • Improve Gait:
    • Add Rubber Feet to TevBot
    • Reprogram Walking Motions
  • Start Writing Thesis: Background/Motivation, Introduction, Analysis of the Problem
  • Program CM-5 in C
  • Carry out Search/Object Recognition Experiment in Lab (In simple constructed Environment)
  • Carry out Obstacle Detection Experiments
  • Start Mapping Algorithm

Accomplishments:

  • Explored recent developments in programming the CM-5 in C:

C Library for CM-5

Another link to C Library for CM-5 Development

Example.c file can be found here

Another link to Example.c file here

Describes how to build a small C program to run on the ATmega128 microcontroller

Challenges:

  • Unable to compile C files


What's Next:

  • Try to fix compilation errors
  • Explore other ways to program CM-5

Week 07: March 7 - March 15 - SPRING BREAK

Accomplishments:

Week 08: March 15 - March 20

Goals:

  • Implement SLAM
  • Submit first third of thesis

Implementing SLAM:
Notes from: SLAM for Dummies

  • Hardware:
    • a mobile robot
    • a range measurement device
  • The Extended Kalman Filter is a fundamental part of the SLAM process - it is used to estimate the robot's position based on features/landmarks in the robot's environment
  • The SLAM Process:
    • Get data about the robot's environment using sensors (such as sonar/laser)
      • Considerations for range measurement device:
        • angle/width of measurements
        • threshold
    • Get Odometry Data
    • Get sensor data and odometry data 'at the same time'
    • Landmarks - decide which features of the environment will be used as landmarks:
      • Unique features in the environment
      • must be easily distinguishable from other landmarks and other areas of the surrounding
      • must be re-observable
      • must be stationary and plentiful
    • Landmark Extraction:
    • Data Association:
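The predict-with-odometry, correct-with-landmarks cycle at the heart of this process is easiest to see in one dimension. This is a plain (linear) Kalman filter sketch, not the full EKF from the tutorial, and the noise values are illustrative, not measured.

```python
# One-dimensional sketch of the SLAM estimation cycle: predict the
# robot's position from odometry (uncertainty grows), then correct it
# with a range-based position measurement (uncertainty shrinks).
# All variances below are made-up illustrative values.

def kf_predict(x, p, odometry, odometry_var):
    """Shift the estimate by the odometry reading; variance grows."""
    return x + odometry, p + odometry_var

def kf_update(x, p, measured_x, measurement_var):
    """Blend in a measurement; variance shrinks."""
    k = p / (p + measurement_var)              # Kalman gain
    return x + k * (measured_x - x), (1 - k) * p

x, p = 0.0, 1.0                                # initial estimate
x, p = kf_predict(x, p, odometry=1.0, odometry_var=0.5)
x, p = kf_update(x, p, measured_x=1.2, measurement_var=0.5)
print(round(x, 3), round(p, 3))  # 1.15 0.375
```

The full EKF does the same thing with a state vector holding the robot pose and every landmark position, linearizing the motion and measurement models at each step; data association decides which observed feature corresponds to which stored landmark before the update is applied.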

Accomplishments:


Tips on writing thesis

A Hybrid Planning Approach for Robots in Search and Rescue

Using Blob detection and An Ising-Based Neural Network for Victim Detection:

  • Blob Detection
  • Ising-Based Neural Network
    • A Neural Network - any network designed to function in a way similar to natural neural structures such as the human brain.
    • Concerned with the following questions:
      • How can a neural network function as a memory?
      • How can we use this network to carry out object recognition?
      • Two Main Components:
        1. Spins – connected such that the orientation of any given spin Si is influenced by the direction of other spins due to the interaction energy J*Si*Sj.
        2. The Energy Function:
          • E = - J ∑<ij> Si Sj
          • J - the exchange constant; the sum runs over all pairs of neighboring spins <ij>
          • More generally, with a coupling Jij for each pair:
            • Jij > 0: the interaction is ferromagnetic - spins align
            • Jij < 0: the interaction is antiferromagnetic - spins antiparallel
            • Jij = 0: the spins are noninteracting
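The energy function above can be computed directly for a small one-dimensional chain of spins with nearest-neighbor interactions and a single exchange constant J. This is a minimal sketch to illustrate the formula, not part of the victim-detection network.

```python
# E = -J * sum over neighbor pairs <ij> of s_i * s_j, for a 1-D chain
# of spins (each spin is +1 or -1) with nearest-neighbor coupling.

def ising_energy(spins, J=1.0):
    return -J * sum(a * b for a, b in zip(spins, spins[1:]))

print(ising_energy([1, 1, 1, 1]))    # -3.0 : aligned spins, lowest energy
print(ising_energy([1, -1, 1, -1]))  # 3.0  : antiparallel, highest energy
```

With J > 0 the aligned (ferromagnetic) configuration minimizes E, which is exactly the "spins prefer to align" behavior the neural-network analogy below relies on.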


  • Relationship between Ising Model and Brain
  • The brain also consists of a large number of basic units, called neurons
  • Each neuron connected to many other neurons
  • The state of a neuron can be described by its firing rate (the rate at which it emits pulses): R = f ( ∑ Vi ), where Vi is the input signal from dendrite i.
  • It is therefore possible to model a neuron as a simple Ising spin with two possible states, up or down, under the assumption that a neuron is either firing or not firing
  • Therefore: E = - ∑<ij> Jij si sj
    • Jij - the strength of the synaptic connection: the influence of neuron i on the firing rate of neuron j; the sum runs over all pairs of spins i and j in the network
    • Result: the network prefers states that minimize the energy -> parallel alignment
  • How can a Lattice of Spins Function as a Memory?
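One standard answer to the question above is a Hopfield-style network: store a spin pattern in the couplings Jij = si * sj, then let the spins repeatedly flip to lower the energy, so a corrupted pattern relaxes back to the stored one. The tiny sketch below illustrates that idea under the single-pattern case; it is not the Ising-based detector itself.

```python
# Minimal Hopfield-style memory: one pattern of +/-1 spins is stored
# in the couplings J_ij = s_i * s_j (zero self-coupling). Updating
# each spin to match the sign of its local field lowers the energy,
# pulling a noisy pattern back toward the stored one.

def store(pattern):
    n = len(pattern)
    return [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
            for i in range(n)]

def recall(weights, state, sweeps=5):
    state = list(state)
    for _ in range(sweeps):
        for i in range(len(state)):
            field = sum(weights[i][j] * state[j] for j in range(len(state)))
            state[i] = 1 if field >= 0 else -1
    return state

stored = [1, -1, 1, -1, 1]
weights = store(stored)
noisy = [1, -1, -1, -1, 1]       # one spin flipped
print(recall(weights, noisy))    # [1, -1, 1, -1, 1]
```

Because the stored pattern sits at an energy minimum, the lattice acts as a content-addressable memory: a partial or corrupted input is completed to the nearest stored state, which is the property that makes such networks interesting for object recognition.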

Bioloid Control Programmer main file

Main Program on CM-5: