RGB-D sensors such as the Microsoft Kinect or the Asus Xtion are inexpensive 3D sensors. A depth image is computed by measuring the distortion of a known infrared (IR) light pattern projected into the scene. While these sensors are versatile devices, they have some limitations: the distance they can measure is limited, and they suffer from reflection problems on transparent, shiny, or very matte and absorbing objects. If more than one RGB-D camera is used, the IR patterns interfere with each other, resulting in a massive loss of depth information. In this paper, we present a simple and powerful method to overcome these problems. We propose a stereo RGB-D camera system that combines the advantages of RGB-D cameras with those of stereo camera systems. The idea is to use the IR images of two sensors as a stereo pair to generate a depth map. The IR patterns emitted by the IR projectors are exploited to improve dense stereo matching even when the observed objects or surfaces are texture-less or transparent. The resulting disparity map is then fused with the depth map provided by the RGB-D sensor to fill the holes that appear due to interference or to transparent or reflective objects. Our results show that the density of depth information is increased, especially for transparent, shiny, or matte objects.
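The fusion step described above — match the two IR images, triangulate the disparities into depth, and use that depth to fill the invalid pixels of the RGB-D depth map — could be prototyped as follows. This is a minimal pure-NumPy sketch, not the authors' implementation: the naive SSD block matcher, the focal length `focal_px`, and the baseline `baseline_m` are all illustrative assumptions.

```python
import numpy as np

def block_match(ir_left, ir_right, max_disp=16, half_win=3):
    """Naive SSD block matcher over rectified IR images.

    Returns a per-pixel disparity map (0 = no match computed).
    A real system would use a robust dense matcher instead.
    """
    h, w = ir_left.shape
    disp = np.zeros((h, w), dtype=np.float32)
    L = ir_left.astype(np.float32)
    R = ir_right.astype(np.float32)
    for y in range(half_win, h - half_win):
        for x in range(half_win + max_disp, w - half_win):
            patch = L[y - half_win:y + half_win + 1,
                      x - half_win:x + half_win + 1]
            best_cost, best_d = np.inf, 0
            for d in range(max_disp):
                cand = R[y - half_win:y + half_win + 1,
                         x - d - half_win:x - d + half_win + 1]
                cost = np.sum((patch - cand) ** 2)
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

def fuse_depth(depth_rgbd, ir_left, ir_right,
               focal_px=580.0, baseline_m=0.075):
    """Fill zero-valued (invalid) RGB-D depth pixels with stereo-IR depth.

    focal_px and baseline_m are hypothetical calibration values;
    depth = focal_px * baseline_m / disparity (standard triangulation).
    """
    disp = block_match(ir_left, ir_right)
    stereo_depth = np.zeros_like(disp)
    matched = disp > 0
    stereo_depth[matched] = focal_px * baseline_m / disp[matched]
    fused = depth_rgbd.astype(np.float32).copy()
    holes = (fused == 0) & matched       # holes from interference/reflection
    fused[holes] = stereo_depth[holes]
    return fused

# Usage on synthetic data: a random IR texture shifted by a known disparity,
# with a rectangular "hole" punched into the RGB-D depth map.
rng = np.random.default_rng(0)
h, w, true_disp = 20, 40, 4
left = rng.integers(0, 255, (h, w)).astype(np.float32)
right = np.zeros_like(left)
right[:, :-true_disp] = left[:, true_disp:]   # left shifted by true_disp
depth = np.full((h, w), 1.0, dtype=np.float32)
depth[8:12, 20:30] = 0.0                      # simulated interference hole
fused = fuse_depth(depth, left, right)
```

Because the synthetic right image is an exact shift of the left, the SSD matcher recovers the disparity in the hole region and the fused map carries triangulated depth there, while valid RGB-D pixels are left untouched.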