Patent abstracts

Industrial Robot

ISSN: 0143-991x

Article publication date: 8 March 2010


Citation

(2010), "Patent abstracts", Industrial Robot, Vol. 37 No. 2. https://doi.org/10.1108/ir.2010.04937bad.004

Publisher: Emerald Group Publishing Limited

Copyright © 2010, Emerald Group Publishing Limited


Article Type: Patent abstracts From: Industrial Robot: An International Journal, Volume 37, Issue 2

Title: Nerve artificial limb hand driven and controlled by brain-computer interface and control method thereof
Applicant: Suzhou Institute of Xi’an Jiaotong (CN)
Patent number: CN101455596
Publication date: June 17, 2009

Abstract: The present invention discloses a prosthetic hand driven and controlled by a brain-computer interface, and a control method thereof. The prosthetic hand comprises the following components: a brain-electrical signal collecting device, installed in the brain, for collecting brain-electrical signals; a signal processing device, which amplifies and filters the brain-electrical signals collected by the collecting device; a signal extraction, recognition and transmission device, which performs feature extraction and action-mode recognition on the processed signals and transmits the result to a prosthetic hand driving device; and the prosthetic hand driving device, which, on receiving that result, drives the prosthetic hand to complete the corresponding action and feeds back the action information. By driving the prosthetic hand through brain-electrical control, the device avoids the problems of muscle fatigue and poor repeatability caused by myoelectric control, permits precise control of the prosthetic hand and realizes hand function more fully. The invention is suitable for the drive control of prosthetic hands used by persons with upper-arm disabilities.
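The four-stage chain the abstract describes (collect, amplify/filter, feature extraction plus action-mode recognition, drive with feedback) can be sketched as below. This is a minimal illustration, not the patent's implementation: the gain, window size, features, thresholds and action names are all invented for the example.

```python
# Hypothetical sketch of the patent's four-stage pipeline:
# collect -> amplify/filter -> feature extraction + action-mode
# recognition -> drive + feedback. All names and thresholds are
# illustrative assumptions, not taken from the patent.

def amplify_and_filter(samples, gain=1000.0, window=3):
    """Amplify raw EEG samples, then apply a simple moving-average filter."""
    amplified = [s * gain for s in samples]
    filtered = []
    for i in range(len(amplified)):
        lo = max(0, i - window + 1)
        filtered.append(sum(amplified[lo:i + 1]) / (i + 1 - lo))
    return filtered

def extract_features(filtered):
    """Crude features: mean amplitude and mean absolute first difference."""
    mean = sum(filtered) / len(filtered)
    diffs = [abs(b - a) for a, b in zip(filtered, filtered[1:])]
    activity = sum(diffs) / len(diffs) if diffs else 0.0
    return mean, activity

def recognize_action(features, grip_threshold=0.5):
    """Map features to an action mode; the threshold is a placeholder."""
    _, activity = features
    return "grip" if activity > grip_threshold else "release"

def drive_hand(action):
    """Drive the hand and return feedback, mirroring the patent's feedback loop."""
    return {"action": action, "completed": True}

# One pass through the pipeline on synthetic samples:
raw = [0.0001, 0.0009, -0.0004, 0.0012, -0.0007, 0.0015]
feedback = drive_hand(recognize_action(extract_features(amplify_and_filter(raw))))
```

In a real system the recognition stage would be a trained classifier rather than a fixed threshold; the sketch only shows how the stages compose.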

Title: Intelligent wheelchair control system based on brain-machine interface and brain-electrical signal processing method thereof
Applicant: University of Tianjin (CN)
Patent number: CN101301244
Publication date: November 12, 2008

Abstract: The invention relates to an intelligent wheelchair control system based on a brain-machine interface, and a method for processing its EEG signal. The system comprises, connected in turn: a pre-amplification signal preprocessing circuit that collects the EEG signal of a subject; a signal-collection-card A/D converter; a signal processing device; a signal-collection-card D/A converter; an interface circuit; an end-around lamp control panel; and a wheelchair. The lamp control panel is also connected to the subject so as to receive the feedback signal the subject sends out. The method comprises the following steps: the differential input of the EEG signal is completed by the signal collection card; the EEG signal is filtered; an RMS smoothing algorithm is applied to the filtered signal; the smoothed signal is split into two channels, one entering the main control channel after 400-500 ms averaging and the other entering an auxiliary control channel after 50 ms averaging; and the signals are output when they are alpha-wave signals. The system is simple and convenient to operate: by placing two electroencephalograph detection electrodes at the occipital and ear lead positions of the subject, it can help disabled people, the elderly and severely paralysed patients who retain normal thinking to move about freely, thereby improving their quality of life.
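The claimed processing chain (filter, RMS smoothing, a slow 400-500 ms main channel and a fast 50 ms auxiliary channel, output gated on alpha activity) can be sketched as follows. The sampling rate, window lengths, filter and the zero-crossing alpha test are all assumptions for illustration; the patent does not specify them here.

```python
# Hypothetical sketch of the claimed chain: filter -> RMS smoothing ->
# split into a ~450 ms main channel and a ~50 ms auxiliary channel ->
# gate the output on an alpha-band (8-13 Hz) check. FS, window sizes
# and the alpha test are illustrative assumptions.
import math

FS = 200  # assumed sampling rate, Hz

def moving_average(signal, n):
    """Average over a sliding window of n samples (crude low-pass filter)."""
    return [sum(signal[max(0, i - n + 1):i + 1]) / min(i + 1, n)
            for i in range(len(signal))]

def rms_smooth(signal, n):
    """RMS value over a sliding window of n samples."""
    return [math.sqrt(sum(x * x for x in signal[max(0, i - n + 1):i + 1])
                      / min(i + 1, n))
            for i in range(len(signal))]

def process(signal):
    filtered = moving_average(signal, 5)             # stand-in for the filter stage
    smoothed = rms_smooth(filtered, 10)
    main = moving_average(smoothed, int(0.45 * FS))  # ~450 ms main channel
    aux = moving_average(smoothed, int(0.05 * FS))   # ~50 ms auxiliary channel
    return main, aux

def is_alpha(signal, fs=FS):
    """Rough alpha test: dominant frequency estimated from zero crossings."""
    mean = sum(signal) / len(signal)
    centred = [x - mean for x in signal]
    crossings = sum(1 for a, b in zip(centred, centred[1:]) if a * b < 0)
    freq = crossings * fs / (2.0 * len(signal))
    return 8.0 <= freq <= 13.0

# A synthetic one-second 10 Hz "alpha" burst:
alpha = [math.sin(2 * math.pi * 10 * t / FS) for t in range(FS)]
main, aux = process(alpha)
```

A production system would use a proper band-pass filter and spectral estimate for the alpha decision; the sketch only conveys the two-channel structure.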

Title: Real-time simulation system for under-driven double-feet walking robot
Applicant: University of Jilin (CN)
Patent number: CN101493855 (A)
Publication date: July 29, 2009

Abstract: The invention belongs to the field of simulation technology and provides a real-time simulation system for an under-actuated biped walking robot. The system consists of several basic platforms, principally a real-time control platform, a joint simulation platform and a mechatronic system simulation platform. The real-time control platform is directly connected with the physical robot's drive system to control the robot's movement in real time; it is also connected with the joint simulation platform for data transmission; the joint simulation platform is connected with the mechatronic system simulation platform to complete real-time joint simulation analysis; and each platform consists of different functional modules connected by data links. Based on the system, real-time control algorithm design, virtual prototype design of an under-actuated walking robot, controller design, real-time joint simulation and experimental result analysis can all be performed, and the system completes its various operating tasks through a man-machine interface.

Title: System for performing non-contact type human-machine interaction by vision
Applicant: University Beijing Science and Tech. (CN)
Patent number: CN101441513 (A)
Publication date: May 27, 2009

Abstract: The invention relates to a system for non-contact man-machine interaction by vision and belongs to the field of man-machine interaction. Non-contact interaction between a human and a robot is achieved through information circulation on the visual channel. The system comprises: a machine vision unit for acquiring a face image of the user under specific illumination conditions; an information processing and analysis unit for processing the face image, calculating the user's gaze direction, analysing the intention information extracted from the user's eye-movement state, and identifying the interactive information fed back by the robot in response to that intention information; and an interactive information display unit, which presents the robot's feedback by generating visual stimuli for the user and serves as the most direct platform for interaction between robot and user. The system makes full use of the naturalness, directness and convenience with which the visual channel acquires and expresses information, makes the man-machine interaction process more convenient, faster and easier to operate, and places lower requirements on the bandwidth of the information channel.
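The gaze-direction step the abstract mentions can be illustrated with a toy geometric rule: given eye-corner and pupil-centre coordinates already located in the face image (the detection itself is outside this sketch), classify gaze from where the pupil sits between the corners. The coordinates, margin and labels are invented for the example and are not the patent's method.

```python
# Hypothetical sketch of a coarse gaze classifier. Inputs are (x, y)
# pixel coordinates of the two eye corners and the pupil centre,
# assumed to come from an upstream face-image processing stage.
# The margin value and direction labels are illustrative assumptions.

def gaze_direction(eye_left, eye_right, pupil, margin=0.15):
    """Classify gaze from the pupil's horizontal position between the corners."""
    span = eye_right[0] - eye_left[0]
    rel = (pupil[0] - eye_left[0]) / span  # 0.0 at left corner, 1.0 at right
    if rel < 0.5 - margin:
        return "left"
    if rel > 0.5 + margin:
        return "right"
    return "centre"

# Example: pupil well toward the right corner of the eye.
direction = gaze_direction((100, 50), (140, 50), (128, 50))
```

A real system would estimate a continuous 3D gaze vector from both eyes and head pose; the sketch only shows the mapping from image features to an interaction decision.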

Title: Transfer of knowledge from a human skilled worker to an expert machine – the learning process
Applicant: Tairob Ltd (IL)
Patent number: US2009132088 (A1)
Publication date: May 21, 2009

Abstract: A learning environment and method that form a first milestone towards an expert machine implementing the master-slave robotic concept. The present invention is a learning environment and method for teaching the master expert machine, in which a skilled worker transfers his professional knowledge to the machine in the form of elementary motions and subdivided tasks. The invention further provides a stand-alone learning environment in which a human wearing one or two innovative gloves equipped with 3D feeling sensors transfers task-performing knowledge to a robot through a learning process different from the master-slave learning concept. The 3D force/torque, displacement, velocity/acceleration and joint forces are recorded during the knowledge transfer by a computerized processing unit, which prepares the acquired data for the mathematical transformations used to transmit commands to the motors of a robot. The objective of the new robotic learning method is a learning process that will pave the way to a robot with “human-like” tactile sensitivity, to be applied to material handling or man/machine interaction.
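The record-then-transform step can be sketched as a small recorder: glove readings (force, displacement, velocity) are logged during a demonstration and then turned into timestamped motor commands by a placeholder transformation. The field names and the proportional mapping are assumptions for illustration, not Tairob's actual processing.

```python
# Hypothetical sketch of demonstration recording and command generation.
# A real system would record 3D force/torque vectors per joint and apply
# kinematic transformations; here a scalar proportional map stands in.

class DemonstrationRecorder:
    def __init__(self):
        self.samples = []

    def record(self, t, force, displacement, velocity):
        """Log one timestamped glove reading during the demonstration."""
        self.samples.append({"t": t, "force": force,
                             "displacement": displacement,
                             "velocity": velocity})

    def to_motor_commands(self, gain=0.5):
        """Placeholder transform: motor command proportional to recorded velocity."""
        return [{"t": s["t"], "command": gain * s["velocity"]}
                for s in self.samples]

# Record a short synthetic demonstration, then generate commands.
rec = DemonstrationRecorder()
for t in range(3):
    rec.record(t * 0.01, force=1.0 + t, displacement=0.1 * t, velocity=0.2 * t)
commands = rec.to_motor_commands()
```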

Title: Intelligent independent robot core controller
Applicant: University of Beijing (CN)
Patent number: CN101138843 (A)
Publication date: March 12, 2008

Abstract: The present invention relates to the core controller of an intelligent autonomous robot, comprising a central control module, a sensor module, a CAN-bus transmission module and a movement control module. The central control module, which comprises a core controller connected to a storage device and an interface to the outside, runs a trimmed embedded RT-Linux operating system, a number of hardware driver programs and an application program that facilitates secondary development. The sensor module, comprising a pressure sensor and an electronic compass, is connected to the central control module through the interface. The CAN-bus transmission module is connected to the central control module, and the movement control module, connected to the CAN-bus transmission module, exchanges information with the core controller over the CAN bus. The invention provides good control of autonomous movement, environment sensing and man-machine interaction, and supplies the software and hardware support needed for correct autonomous behaviour of the intelligent robot.
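The exchange between the central control module and the movement control module over the CAN bus can be sketched as packing a command into an 8-byte CAN data frame and unpacking it on the other side. The node identifier and the payload layout are illustrative assumptions; the patent does not disclose the actual protocol.

```python
# Hypothetical sketch of a CAN frame exchange between the central
# control module and the movement control module. The CAN ID and the
# big-endian two-velocity payload layout are invented for illustration.
import struct

MOTION_NODE_ID = 0x120  # assumed CAN identifier of the movement control module

def pack_velocity_command(left_mm_s, right_mm_s):
    """Pack two signed 16-bit wheel velocities (mm/s) into an 8-byte CAN frame."""
    data = struct.pack(">hh4x", left_mm_s, right_mm_s)  # 2+2 bytes + 4 pad bytes
    return {"can_id": MOTION_NODE_ID, "dlc": len(data), "data": data}

def unpack_velocity_command(frame):
    """Recover the two velocities on the receiving node."""
    left, right = struct.unpack(">hh", frame["data"][:4])
    return left, right

# Central module sends a "turn in place" command:
frame = pack_velocity_command(250, -250)
```

On real hardware the dictionary would be replaced by a CAN driver's frame type (e.g. under SocketCAN), but the pack/unpack discipline is the same.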

Title: Adaptive patient training routine for biological interface system
Inventor(s): Flaherty J.C. (USA); Serruya Mijail D. (USA); Morris Daniel S. (USA); Caplan Abraham H. (USA); Saleh Maryam (USA); and Donoghue John P. (USA)
Patent number: US2007032738 (A1)
Publication date: February 8, 2007

Abstract: Various embodiments of a biological interface system and related methods are disclosed. The system may comprise a sensor comprising a plurality of electrodes for detecting multicellular signals emanating from one or more living cells of a patient and a processing unit configured to receive the multicellular signals from the sensor and process the multicellular signals to produce a processed signal. The processing unit may be configured to transmit the processed signal to a controlled device that is configured to receive the processed signal. The system is configured to perform an integrated patient training routine to generate one or more system configuration parameters that are used by the processing unit to produce the processed signal.

Title: Biological interface system with neural signal classification systems and methods
Applicant: Cyberkinetics Neurotechnology (USA); Sebald Daniel J. (USA); Branner Almut (USA); Korver Kirk F. (USA); Pungor Andras (USA); and Flaherty J. Christopher (USA)
Patent number: WO2007058950 (A2)
Publication date: May 24, 2007

Abstract: Systems and methods for neural signal classification in the processing of a patient's multicellular signals are disclosed. The system includes a preprocessing device operatively coupled to an input channel and configured to receive multicellular signals collected from a sensor, at least a portion of which is configured to be disposed within the patient's brain. The preprocessing device is also configured to filter the multicellular signals to extract a neural signal portion, which includes a neural spike portion and a local field potential portion. The system also includes a neural spike processing device operatively coupled to the preprocessing device. The system is configured to project information associated with a neural spike onto a feature space indicative of the correlation of the spike with a benchmark signal, to adaptively determine a spike-sorting statistical model for the feature-space samples, and to identify one or more types of voluntary stimuli based on analysis of the feature space, wherein the projected information is grouped in clusters, each cluster defining a particular type of voluntary stimulus.
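The project-then-cluster idea can be illustrated with a toy spike sorter: each waveform is projected to a 2-D feature point, points are assigned to the nearest centroid, and each cluster stands for one type of voluntary stimulus. The features, centroids and labels are invented for the example; the patent's adaptive statistical model is far richer than this fixed nearest-centroid rule.

```python
# Hypothetical sketch of spike sorting in a feature space. Feature
# choice (peak, width-above-half-peak), centroid positions and the
# stimulus labels are all illustrative assumptions.

def spike_features(waveform):
    """Project a waveform onto a toy 2-D feature space."""
    peak = max(waveform)
    width = sum(1 for v in waveform if v > peak / 2.0)  # samples above half-peak
    return (peak, float(width))

def nearest_cluster(point, centroids):
    """Assign the feature point to the closest centroid (squared distance)."""
    def d2(c):
        return (point[0] - c[0]) ** 2 + (point[1] - c[1]) ** 2
    return min(centroids, key=d2)

# Two pre-learned clusters, each standing for one voluntary stimulus type.
centroids = {(50.0, 3.0): "imagined grasp", (120.0, 6.0): "imagined release"}

def classify(waveform):
    return centroids[nearest_cluster(spike_features(waveform), centroids)]

small = [0, 10, 48, 30, 5]        # low-amplitude spike
large = [0, 60, 125, 118, 40, 0]  # tall spike
```

An adaptive implementation would re-estimate cluster parameters as new spikes arrive (e.g. with a Gaussian mixture model); the sketch shows only the projection-and-assignment structure.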
