First Day of Safety, Security and Rescue Robots 2010 (SSRR-2010)

Currently I’m participating in the Safety, Security and Rescue Robots 2010 workshop in Bremen.

The first day is now over and a lot of interesting talks have been given:

Tetsuya Kinugasa presented a flexible displacement sensor in his talk “Measurement of Flexed Posture for Mono-tread Mobile Track Using New Flexible Displacement Sensor”. His group develops and uses this sensor to control the posture of a robot that is a combination of snake, worm and tank.

Jimmy Tran presented his work on “Canine Assisted Robot Deployment for Urban Search and Rescue”. The basic idea is as simple as it is brilliant: equip a dog to find victims and to inform the operators about them. Dogs are already well established in rescue work and have high mobility; they can easily overcome large piles of rubble and are able to carry video cameras or rescue material. His approach is to use the dogs to deploy a small robot next to a victim, which then allows the medical status of the person to be investigated. The idea is ingenious.

“Development of leg-track hybrid locomotion to traverse loose slopes and irregular terrain” is so far the most interesting technical approach of this workshop. It shows how a track-like vehicle can be combined with a semi-walker.

Donny Kurnia Sutantyo presented his work on “Multi-Robot Searching Algorithm Using Levy Flight and Artificial Potential Field”, while Julian de Hoog showed a solution for team exploration in “Dynamic Team Hierarchies in Communication-Limited Multi-Robot Exploration”.
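To give a rough idea of what a Lévy-flight search step looks like, here is a minimal C++ sketch: step lengths are drawn from a heavy-tailed distribution (many short moves, occasional long jumps) and a simple repulsive term keeps the robots spread out. This is only my illustration of the general technique, not the implementation from the paper.

```cpp
// Sketch of one Levy-flight search step with a simple repulsive
// "artificial potential" term between robots. Illustrative only --
// not the implementation presented in the paper.
#include <cmath>
#include <random>
#include <vector>

struct Vec2 { double x, y; };

constexpr double kTwoPi = 6.283185307179586;

// Heavy-tailed step length: L = Lmin * u^(-1/alpha), u ~ U(0,1).
// Small alpha -> occasional very long jumps (Levy-like behaviour).
double levyStepLength(std::mt19937& rng, double lMin = 0.1, double alpha = 1.5) {
    std::uniform_real_distribution<double> u(1e-6, 1.0);
    return lMin * std::pow(u(rng), -1.0 / alpha);
}

// Inverse-square repulsion: pushes a robot away from nearby teammates.
Vec2 repulsion(const Vec2& self, const std::vector<Vec2>& others, double gain = 1.0) {
    Vec2 f{0.0, 0.0};
    for (const auto& o : others) {
        double dx = self.x - o.x, dy = self.y - o.y;
        double d2 = dx * dx + dy * dy + 1e-9;   // avoid division by zero
        f.x += gain * dx / d2;
        f.y += gain * dy / d2;
    }
    return f;
}

// One search step: random heading scaled by a Levy step length,
// biased by the repulsive term so the team spreads out.
Vec2 nextPosition(const Vec2& self, const std::vector<Vec2>& others, std::mt19937& rng) {
    std::uniform_real_distribution<double> angle(0.0, kTwoPi);
    double len = levyStepLength(rng);
    double a = angle(rng);
    Vec2 rep = repulsion(self, others);
    return {self.x + len * std::cos(a) + rep.x,
            self.y + len * std::sin(a) + rep.y};
}
```

The occasional long jumps are what let a Lévy-flight search cover a large area much faster than plain Brownian wandering, which is the usual motivation for this kind of search strategy.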

The invited speaker Bernardo Wagner presented the work of his department. The Leibniz University of Hannover has worked intensively in the field of “Perception and Navigation with 3D Laser Range Data in Challenging Environments”.

“Potential Field based Approach for Coordinate Exploration with a Multi-Robot Team” was the topic of Alessandro Renzaglia’s talk.

Bin Li showed another nice approach to a shape-shifting robot in “Cooperative Reconfiguration between Two Specific Configurations for A Shape-shifting Robot”. His robot is able to reconfigure itself by rearranging its three motion segments.

Jorge Bruno Silva presented an approach to trajectory planning that respects time constraints in “Generating Trajectories With Temporal Constraints for an Autonomous Robot”.

Noritaka Sato closed the day by presenting a novel HMI approach for teleoperation. Instead of showing only the direct camera image, his group uses temporally shifted (past) images to generate an artificial bird’s-eye view, similar to the third-person view in racing video games: “Teleoperation System Using Past Image Records Considering Moving Objects”.
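As far as I understand the idea, it can be sketched like this: the system buffers recent camera frames together with the robot pose at capture time and, for display, picks the frame that was recorded a little behind the current pose, then overlays the robot’s current position on it. The following C++ sketch is my own interpretation; frame selection and overlay details are assumptions, not the authors’ system.

```cpp
// Sketch of the "past image record" idea: buffer recent camera frames
// together with the robot pose at capture time, then display the frame
// taken a bit behind the current pose with the robot's current position
// drawn into it. Frame selection and overlay details are assumptions.
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

struct Pose { double x, y, theta; };
struct Frame { std::vector<std::uint8_t> image; Pose capturePose; };

class PastImageView {
public:
    void addFrame(Frame f) {
        buffer_.push_back(std::move(f));
        if (buffer_.size() > kMaxFrames) buffer_.pop_front();
    }

    // Pick the stored frame whose capture pose is closest to a point
    // a fixed distance behind the robot's current pose; that frame
    // acts as the "virtual external camera".
    const Frame* selectBackView(const Pose& current, double backOffset = 1.0) const {
        Pose target{current.x - backOffset * std::cos(current.theta),
                    current.y - backOffset * std::sin(current.theta),
                    current.theta};
        const Frame* best = nullptr;
        double bestDist = 1e12;
        for (const auto& f : buffer_) {
            double dx = f.capturePose.x - target.x;
            double dy = f.capturePose.y - target.y;
            double d = dx * dx + dy * dy;
            if (d < bestDist) { bestDist = d; best = &f; }
        }
        return best;   // caller overlays the current robot pose onto this image
    }

private:
    static constexpr std::size_t kMaxFrames = 200;
    std::deque<Frame> buffer_;
};
```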

I am looking forward to the next talks.


Interesting designs for Rescue Robots – Part 2

Professor Dr. Satoshi Tadokoro from Tohoku University presents his ASC. The ASC (Active Scope Camera) is a search camera for use in emergency situations. Basically it is a flexible endoscope that is able to move by itself: with the help of vibrating inclined cilia, the endoscope can crawl like a caterpillar into the smallest voids (>30 mm). Its maximum speed is 47 mm/s and its operating range is 8 m. This allows rescue workers to search rubble for victims or to check its structure.

The following video shows Professor Dr. Satoshi Tadokoro at the Tokyo International Fire and Safety Exhibition 2008 presenting the ASC.

During the collapse of the Historical Archive of the City of Cologne (March 2009), Professor Dr. Satoshi Tadokoro, Professor Dr. Robin R. Murphy (Texas A&M University), Clint Arnett (Project Coordinator for Urban Search and Rescue at TEEX) and members of the Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS) tried to support the local fire department. On that occasion I was able to test the ASC, which was in use during this disaster.

The ASC performs extremely well. It crawls into the rubble at a reasonable speed and is, after a little training, easy to use. The biggest problem is the user interface: the ASC camera system does not compensate for tilting or turning when the “robot” flips or turns over, which happens quite often. Hence, it is hard for the operator to keep track of the orientation. In addition, the opening angle of the camera is extremely small, which handicaps situational awareness even further.
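Just to make concrete what “compensating tilting” could mean: with an orientation sensor on the camera head, the display could counter-rotate the image by the measured roll angle so that “up” stays up for the operator. The following OpenCV-based sketch shows this generic idea; it is not part of the ASC system.

```cpp
// Illustration of the missing feature: counter-rotate the camera image
// by the measured roll angle so the horizon stays level for the operator.
// Generic sketch only -- not part of the ASC system.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

cv::Mat compensateRoll(const cv::Mat& frame, double rollRadians) {
    // Rotate around the image centre by the negative roll angle (in degrees).
    cv::Point2f center(frame.cols / 2.0f, frame.rows / 2.0f);
    double rollDegrees = rollRadians * 180.0 / CV_PI;
    cv::Mat rotation = cv::getRotationMatrix2D(center, -rollDegrees, 1.0);
    cv::Mat upright;
    cv::warpAffine(frame, upright, rotation, frame.size());
    return upright;
}
```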


Open source is FAIR – IAIS released the “Fraunhofer Autonomous Intelligent Robotics Devices Library” as open source

Developing and programming robotic systems can sometimes be an unsatisfying task. This feeling is usually not caused by problems that occur during “high level” problem solving; it mostly appears when you try to get the system itself up and running. So tools and solutions are needed to help us overcome these initialization barriers.

The Fraunhofer Institute for Intelligent Analysis and Information Systems, or Fraunhofer IAIS for short, now offers a library that helps developers get a wide range of sensors and actuators up and running. In addition, it includes various algorithms for everyday robotics problems such as Simultaneous Localization And Mapping (SLAM) or image processing. The so-called “Fraunhofer Autonomous Intelligent Robotics Devices Library”, or FAIR library for short, is a C/C++ development library which is actively used in the VolksBot® projects and is released as an open source project under a Creative Commons license.
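To give a feel for what such a device library buys you, here is a hypothetical sketch of a device-abstraction layer of the kind FAIRlib provides. All class and method names are invented for illustration; they are not the actual FAIRlib API.

```cpp
// Hypothetical sketch of a device-abstraction layer as provided by
// libraries like FAIRlib. Names and signatures are invented for
// illustration and are NOT the actual FAIRlib API.
#include <memory>
#include <string>
#include <vector>

// Common interface every range-sensor driver implements, so higher-level
// code (SLAM, obstacle avoidance) does not care about the concrete hardware.
class RangeSensor {
public:
    virtual ~RangeSensor() = default;
    virtual bool connect(const std::string& device) = 0;   // e.g. "/dev/ttyUSB0"
    virtual std::vector<float> readScan() = 0;             // distances in metres
};

// One concrete driver per supported device; the application only sees the
// RangeSensor interface and can swap hardware without changing its code.
class ExampleLaserDriver : public RangeSensor {
public:
    bool connect(const std::string& device) override {
        (void)device;           // open the serial/USB port here
        return true;
    }
    std::vector<float> readScan() override {
        return {};              // request and parse one scan here
    }
};

std::unique_ptr<RangeSensor> makeSensor() {
    return std::make_unique<ExampleLaserDriver>();
}
```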

FAIRlib will soon be available at the SourceForge project “OpenVolksBot”.

Update: The initial version is now available (see also here) and is published under the CC-by-sa-nc license.


Dennis W. Hong presents RoMeLa

RoMeLa, the Robotics and Mechanisms Laboratory at Virginia Tech, is currently working on “Robot Evolution Through Intelligent Design”. This means they take evolution-inspired designs and try to adapt them for robotic purposes. Dennis W. Hong, PhD, and his students have created a lot of really interesting new robots, for example three-legged robots, snake-like robots and humanoids (e.g. DARwIn). In this talk from the TEDxNASA conference, Dr. Hong offers a short overview of their research.

By the way, if you’re wondering about that motto and how “evolution” can meet “intelligent design”, here is the answer. Hong tells us:

“Though it has both evolution and intelligent design in the sentence, it has nothing to do with either – “we” push the boundaries and come up with the next generation robotics (robot evolution) through us doing rigorous research and designing them intelligently (intelligent design). I think it is a clever tag line for our lab.”


Johnny Chung Lee’s HMI Projects

Johnny Chung Lee is a researcher currently working for “Microsoft – Applied Sciences” in Redmond. He gained his Ph.D. in Human-Computer Interaction at Carnegie Mellon University with his thesis “Projector-Based Location Discovery and Tracking” [website].

On his website he has some great projects related to HMI that could also be helpful in robotics.

For example, his experiments on the usage of the Wii controllers,

or his projects on Projector Calibration and RFID usage.
