In this paper, we discuss the problem of using visual and other sensors in the manipulation of a part by a robotic manipulator in a manufacturing workcell. Our emphasis is on the part localization problem involved. We introduce a new sensor-fusion approach that fuses sensory information from different sensors at various spatial and temporal scales. Relative spatial information obtained by processing visual data is mapped into the absolute task space of the robot by fusing it with encoder information. Data obtained in this way can be superimposed on data obtained from displacement-based vision algorithms at coarser time scales to improve overall reliability. Tracking plans that reflect this sensor fusion are proposed. The localization ...
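As a rough illustration of this fusion idea, the following minimal Python sketch (not the paper's implementation; the class, method, and parameter names are hypothetical) maps a camera-derived relative part offset into absolute task space using the encoder-derived end-effector position at the fast rate, and superimposes a coarser displacement-based vision estimate at the slow rate through a simple complementary blend.

```python
import numpy as np

class MultiRateFusion:
    """Hypothetical sketch: fuse fast encoder data with relative vision
    measurements, and blend in coarse displacement-based vision estimates."""

    def __init__(self, blend_gain=0.3):
        self.part_pos_abs = np.zeros(3)   # current absolute part-position estimate
        self.blend_gain = blend_gain      # weight given to the coarse vision update

    def encoder_update(self, ee_pos_abs, part_offset_rel):
        # Fast path: the encoder gives the end-effector position in absolute
        # task space; the camera gives the part position relative to the
        # end effector. Their sum is the part position in task space.
        self.part_pos_abs = ee_pos_abs + part_offset_rel
        return self.part_pos_abs

    def coarse_vision_update(self, part_pos_from_displacement):
        # Slow path: superimpose the displacement-based vision estimate on
        # the fused estimate (a simple complementary blend stands in here
        # for whatever weighting the actual system would use).
        self.part_pos_abs = ((1.0 - self.blend_gain) * self.part_pos_abs
                             + self.blend_gain * part_pos_from_displacement)
        return self.part_pos_abs
```

The blend gain is an assumed tuning parameter; in practice the weighting would depend on the relative reliability and timing of the two estimates.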
In this paper, we propose a new self-calibrated manipulation scheme for a robot on a mobile platform...
This paper describes a real-time hierarchical system that combines (fuses) data from vision and touc...
This paper shows how multi-sensor fusion with position, force, and vision sensors can help to improve...
In this paper, we consider the problem of real-time planning and control of a robot manipulator in a...
This paper discusses the subject of detecting the position and orientation of a moving part, which i...
Industrial robots are traditionally programmed using only the internal joint position sensors, in a ...
Current manufacturing practices require complete physical separation between people and active in...
Abstract — In this paper, we present a new method for sensor fusion in robot assembly. In our approa...
It is standard practice in advanced telerobotic systems to use a detailed model of the operating env...
Industrial robots are fast and accurate when working with known objects at precise locations in well...
This work presents a method of information fusion involving data captured by both a standard CCD cam...
The effort for reduced cycle times in manufacturing has supported the development of remote welding ...
This work presents a method of information fusion involving data captured by both a standard charge-...
This paper is devoted to the control problem of a robot manipulator for a class of constrained motio...
In contrast to stationary systems, mobile robots have an arbitrarily expandable workspace. As a resu...