Patent abstracts

Industrial Robot

ISSN: 0143-991x

Article publication date: 21 June 2011

Citation

(2011), "Patent abstracts", Industrial Robot, Vol. 38 No. 4. https://doi.org/10.1108/ir.2011.04938daa.013

Publisher: Emerald Group Publishing Limited

Copyright © 2011, Emerald Group Publishing Limited



Article Type: Web sites, patent abstracts and book review. From: Industrial Robot: An International Journal, Volume 38, Issue 4

Title: Robot with vision-based 3D shape recognition

Applicant(s): Honda Research Institute Europe GmbH (DE)

Publication number: US2010286827 (A1)

Publication date: 11 November 2010

Abstract

The invention relates to a method for processing video signals from a video sensor, in order to extract 3D shape information about objects represented in the video signals, the method comprising the following steps: providing a memory in which objects are stored in a 3D shape space, the shape space being an abstract feature space encoding the objects’ 3D shape properties, and mapping a 2D video signal representation of an object in the shape space, the coordinates of the object in the shape space indicating the object’s 3D shape.
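The idea of a "shape space" can be illustrated with a toy sketch: stored objects are coordinate vectors in an abstract feature space, a 2D view descriptor is mapped into that space by a learned transform, and the nearest stored coordinates indicate the 3D shape. The feature extractor, the linear map `W`, and the example coordinates below are all placeholder assumptions, not the patented method.

```python
import numpy as np

# Stored objects: name -> coordinates in a hypothetical 3-D shape space
# (made-up values for illustration only).
SHAPE_SPACE = {
    "sphere":   np.array([1.0, 0.0, 0.0]),
    "cube":     np.array([0.0, 1.0, 0.0]),
    "cylinder": np.array([0.0, 0.0, 1.0]),
}

def map_view_to_shape_space(view_descriptor, W):
    """Project a 2D view descriptor into shape space via a learned linear map W."""
    return W @ view_descriptor

def recognize(view_descriptor, W):
    """Return the stored object whose shape-space coordinates lie nearest
    to the mapped view descriptor."""
    coords = map_view_to_shape_space(view_descriptor, W)
    return min(SHAPE_SPACE,
               key=lambda name: np.linalg.norm(SHAPE_SPACE[name] - coords))
```

In the patent the mapping would be learned from training views; here an identity map suffices to show the lookup step.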

Title: Target tracking method for omnidirectional vision

Applicant(s): Tianjin University of Technology

Publication number: CN101860729 (A)

Publication date: 13 October 2010

Abstract

The invention discloses a target tracking method for the omnidirectional vision navigation of a robot. The method comprises the following steps: acquiring omnidirectional images with a fisheye lens; initializing tracking on the acquired image and, during initialization, constructing the characteristic parameters of the image target to be tracked in HSV color space; and locating the target to be tracked in the distorted fisheye image, and setting a corresponding mark, by using a histogram that combines the target's color features with the histogram's probability distribution. The method enhances the separability of the tracked target from the background; the target can be tracked even when the image is visibly distorted; and the tracking system is real-time, robust, and accurate. In particular, under varying illumination and against complex backgrounds, the method can effectively recover tracking after the target has been completely occluded or has left the field of view, and can track a target whose speed changes abruptly within the field of view.
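The histogram step described above resembles classic color-histogram back-projection (the abstract does not give the exact formulation, so this is a minimal sketch under that assumption): hue values are quantized into bins, each pixel is scored by the target histogram's probability for its bin, and the target position is taken as the score-weighted centroid.

```python
import numpy as np

N_BINS = 16  # hue bins; OpenCV-style hue range 0..179 is assumed

def hue_histogram(hue_patch):
    """Normalized histogram of quantized hue values in the target patch."""
    bins = (hue_patch.astype(int) * N_BINS // 180).clip(0, N_BINS - 1)
    hist = np.bincount(bins.ravel(), minlength=N_BINS).astype(float)
    return hist / hist.sum()

def back_project(hue_image, hist):
    """Per-pixel probability that the pixel belongs to the target."""
    bins = (hue_image.astype(int) * N_BINS // 180).clip(0, N_BINS - 1)
    return hist[bins]

def locate_target(hue_image, hist):
    """Probability-weighted centroid (row, col) of the back-projection."""
    prob = back_project(hue_image, hist)
    ys, xs = np.mgrid[0:hue_image.shape[0], 0:hue_image.shape[1]]
    total = prob.sum()
    return float((prob * ys).sum() / total), float((prob * xs).sum() / total)
```

A real tracker would add the distortion handling and recovery logic the patent claims; this sketch shows only the color-histogram localization step.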

Title: Ground obstacle detection method based on binocular stereo vision of robot

Applicant(s): Beijing Institute of Technology

Publication number: CN101852609 (A)

Publication date: 6 October 2010

Abstract

The invention discloses a ground obstacle detection method based on the binocular stereo vision of a robot, belonging to the technical field of intelligent robots. The method comprises the following steps: according to the binocular baseline length and the focal length, the ground parallax values of all rows in an image are derived from the known imaging geometry; based on the ground parallax values, a back-projection model calculates the 3D coordinates of the scene point corresponding to each pixel, so as to make an initial judgement of whether the pixel belongs to an obstacle or to the ground; obstacle and ground points are rendered in different colors; the results are post-processed and false obstacles are removed; and 3D errors are removed and a grid map is established. The method is applicable to various complicated indoor environments, can precisely identify various obstacles, has very high real-time performance, and provides very good preparation for obstacle avoidance by the robot.
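The per-row ground parallax mentioned above has a standard closed form for a level, forward-looking stereo pair over a flat floor (an assumption; the patent's exact geometry is not given): from d = fB/Z and the flat-floor relation Z = fh/(v − v0), the focal length cancels and the expected ground disparity in image row v is d(v) = B(v − v0)/h. Pixels whose measured disparity clearly exceeds this are obstacle candidates.

```python
def ground_disparity(v, v0, baseline, cam_height):
    """Expected disparity (pixels) of a flat-floor point in image row v (v > v0).

    Derived from d = f*B/Z with Z = f*h/(v - v0) for a level camera at
    height h: the focal length cancels, leaving d = B*(v - v0)/h.
    """
    return baseline * (v - v0) / cam_height

def classify_pixel(v, measured_d, v0, baseline, cam_height, tol=1.0):
    """Label a pixel 'obstacle' if its measured disparity lies clearly
    above the flat-ground model, else 'ground'. tol absorbs matching noise."""
    expected = ground_disparity(v, v0, baseline, cam_height)
    return "obstacle" if measured_d > expected + tol else "ground"
```

With a 0.1 m baseline, a camera 0.5 m above the floor, and principal row v0 = 240, row 340 should see a ground disparity of 20 px; anything substantially larger sticks up out of the floor.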

Title: Robot airship control system for overhead line inspection and control method thereof

Applicant(s): Zhejiang University (CN)

Publication number: CN101807080 (A)

Publication date: 18 August 2010

Abstract

The invention discloses a robot airship control system for overhead line inspection and a control method thereof. The control system comprises an on-board system and a ground-based system. In the on-board system, a master controller DSP is connected to various sensors, a wireless communication module, a motor drive circuit, an image processor DSP and other components through analogue-to-digital conversion ports, serial ports and other ports, and the image processor DSP is connected to an infrared CCD through an HPI port. In the ground-based system, an interface single-chip microcomputer is connected to a ground PC, wireless communication equipment and a manual remote controller through serial ports, I/O and other ports. The on-board system and the ground-based system communicate through their respective wireless communication modules, and a wireless camera transmits image information to the ground-based system over a wireless video link for display on an image monitor. The control system has three modes, namely an infrared vision navigation mode, a GPS navigation mode and a manual remote control mode, and has the advantages of high inspection efficiency, strong cruising ability and high safety.
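The three navigation modes suggest a simple mode-arbitration layer. The priority scheme below (manual override first, then infrared line-following when the IR camera sees the line, then GPS waypoints) is a hypothetical sketch, not stated in the abstract.

```python
from enum import Enum

class NavMode(Enum):
    INFRARED_VISION = "infrared vision navigation"
    GPS = "GPS navigation"
    MANUAL = "manual remote control"

def select_mode(manual_override, line_visible_in_ir):
    """Pick the active navigation mode (assumed priority scheme):
    a manual command always wins; otherwise follow the power line by
    infrared vision when the IR CCD sees it, else fall back to GPS."""
    if manual_override:
        return NavMode.MANUAL
    if line_visible_in_ir:
        return NavMode.INFRARED_VISION
    return NavMode.GPS
```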

Title: Laser weeding robot

Applicant(s): Jiangsu University (CN)

Publication number: CN101589705

Publication date: 2 December 2009

Abstract

The invention discloses a laser weeding robot and relates to the field of agricultural robots. The laser weeding robot consists of an autonomous mobile vehicle, a transverse motion device, a laser weeding device and a control system. A primary vision system detects forward information to guide the autonomous mobile vehicle along the crop rows, while a secondary vision system simultaneously identifies and locates weeds; the control system steers the vehicle's forward motion along the rows, and, at the same time, according to the weed position information, the transverse motion device moves a focusing lens perpendicular to the row direction so as to focus a laser beam accurately on the weed and cut or burn it with the heat the laser generates. The laser weeding robot is accurate and quick in action, is suitable for weeding between rows and around crop seedlings, avoids covering the seedlings with ploughed earth, greatly reduces the rate of damage to the crop seedlings, and is low in energy consumption and highly versatile.
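The link between the secondary vision system and the transverse motion device can be sketched as a simple mapping from the weed's horizontal pixel coordinate to a lens position across the track. The linear, camera-centered mapping and the clamping below are illustrative assumptions, not the patented calibration.

```python
def lens_target_position(weed_px_x, img_width, track_width_m):
    """Map a weed's horizontal pixel coordinate to a transverse lens
    position in metres from the track centre, assuming the camera is
    centred over the weeding track and the mapping is linear.
    The result is clamped to the physical travel of the lens carriage."""
    frac = weed_px_x / (img_width - 1) - 0.5   # -0.5 .. +0.5 across the image
    pos = frac * track_width_m
    half = track_width_m / 2
    return max(-half, min(half, pos))
```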

Title: System for performing non-contact type human-machine interaction by vision

Applicant(s): University of Science and Technology Beijing (CN)

Publication number: CN101441513 (A)

Publication date: 27 May 2009

Abstract

The invention relates to a system for non-contact human-machine interaction by vision, and belongs to the field of human-machine interaction. Non-contact interaction between a person and a robot is achieved by circulating information over the visual channel. The system comprises: a machine vision unit, which acquires a face image of the user under specific illumination conditions; an information processing and analysis unit, which processes the face image, calculates the user's gaze direction, extracts intention information from the user's eye-movement state, and identifies the interactive information the robot feeds back in response to that intention; and an interactive information display unit, which presents the robot's feedback by generating visual stimuli for the user and serves as the most direct platform for interaction between the robot and the user. The system makes full use of the natural, direct and convenient way in which the visual channel acquires and expresses information, makes the human-machine interaction process more convenient, quicker and easier to operate, and places low demands on channel bandwidth.

Title: Robotic tire spraying system

Inventor(s): Todd E. Hendricks, Sr (USA)

Publication number: US2009061099 (A1)

Publication date: 5 March 2009

Abstract

A robotic spray system is provided for accurately spraying mold release onto any size or shaped green tire. The system analyzes individual green tires using an integrated vision system. The system controls the robotic spray position, the fan, fluid, atomizing air, and tire rotation speed for optimal spray coverage on both the inside and outside of green tires. The system includes a conveyor, an overhead mounted camera located over an infeed station, and a second camera located perpendicular to the green tire’s tread and several feet away from the center of the tire. Pictures of the green tire in the station are used to estimate the center and radius of the tire and locate the angle of the bar code with respect to the center of the tire. Reference points are provided from the camera images and robot positions are calculated to control the spraying.
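Estimating the tire's center and radius from the overhead camera image can be done by fitting a circle to detected edge points. The algebraic least-squares (Kåsa-style) fit below is one standard technique; the patent does not specify which fitting method is used.

```python
import numpy as np

def fit_circle(xs, ys):
    """Least-squares circle fit to edge points (xs, ys).

    Rewriting (x-a)^2 + (y-b)^2 = r^2 as the linear model
    x^2 + y^2 = 2a*x + 2b*y + c with c = r^2 - a^2 - b^2,
    the centre (a, b) and radius r follow from one lstsq solve."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs**2 + ys**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return (a, b), r
```

In practice the edge points would come from thresholding or edge detection on the infeed-station image; here synthetic points on a known circle verify the fit.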

Title: Terminal operating system of robot and minimally invasive surgery robot with vision navigation

Applicant(s): Beihang University (CN)

Publication number: CN101411630 (A)

Publication date: 22 April 2009

Abstract

The invention discloses a terminal operating system for a robot and a vision-navigated minimally invasive surgical robot. The terminal operating system comprises a substrate on which a locking device is arranged; the locking device comprises a main locking frame and a side locking frame, and surgical puncture needles are mounted on both frames. The terminal operating system also comprises a calibration template carrying a black-and-white checkered texture that the robot's vision system can identify; the calibration template is parallel to the upper surface of the substrate, and the template and the substrate can rotate relative to each other. The vision system of the surgical robot identifies the position of the calibration template, and from this information the robot deduces the positions of the surgical puncture needles. The terminal operating system has a simple structure, low cost, and strong adaptability and universality; it can be used to control and guide scalpels in neurosurgery, osteological surgery, orthopaedics, and abdominal and chest surgery, and serves not only for calibrating spatial positions but also for tracking the position of the robot's terminal manipulator.
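Recovering the template's pose from its detected corner points can be sketched as a 2D rigid registration between the known corner layout and the corners found in the image. The 2D Kabsch/Procrustes solve below is one standard way to do this; the patent does not state its actual pose-estimation method.

```python
import numpy as np

def rigid_transform_2d(src, dst):
    """Least-squares rotation R and translation t such that
    dst_i ~= R @ src_i + t, for matched 2D point sets (n x 2 arrays).
    This is the 2D Kabsch/Procrustes solution via SVD."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dc - R @ sc
    return R, t
```

The in-plane rotation of the template relative to the substrate then falls out as atan2(R[1, 0], R[0, 0]).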
