Introduction to Robotics: Features and Uncertainty. October 4, 2010. Nikolaus Correll
Review: Vision. Convolution-based filters, thresholds. Goal: condensing information. Exercise (eight test images on the slide; q1-q4 are quadrant features): a: (q1 ~ q2), b: (q3 ~ q4), c: (q1 > q3); a & b & c -> Heart
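A minimal sketch of the convolve-then-threshold idea reviewed here, assuming a synthetic grayscale image and a simple 3x3 averaging kernel (both made up for illustration):

```python
import numpy as np
from scipy.ndimage import convolve

# Hypothetical 8-bit grayscale image; any 2-D array works.
image = np.random.randint(0, 256, size=(64, 64)).astype(float)

# 3x3 averaging (smoothing) kernel -- one example of a convolution-based filter.
kernel = np.ones((3, 3)) / 9.0
smoothed = convolve(image, kernel, mode="nearest")

# Threshold: condense the filtered image into a binary feature map.
binary = smoothed > 128
print(binary.sum(), "pixels above threshold")
```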
Midterm: October 11. Today: quick review of the CULearn exercises
Week 2: Reading
What is a possible reason that nature did not evolve wheels, except for a few animals that use rolling as a means of locomotion?
- Because rotational actuators are not part of nature's repertoire.
- Because wheeled locomotion is not efficient on soft and/or uneven ground.
- Not true, there are various examples of wheel-based locomotion in nature.
What is the difference between static and dynamic stability?
- Dynamic stability is when a robot does not fall over even when moving.
- Static stability considers "snapshots" of robot poses, whereas dynamic stability addresses sequences of statically stable poses.
- Dynamic stability requires motion for the system to be stable; static stability does not.
What is the prime purpose of a suspension system in a mobile robot?
- To prevent damage to equipment on the robot
- To guarantee that the robot base is always parallel to the ground
- To ensure that all wheels have maximum ground contact
Week 3: Reading
How do you calculate the forward kinematics of a wheeled robot?
- I calculate the contribution of each wheel to the degrees of freedom of the robot in robot coordinates, then add them up, and finally transform them into world coordinates.
- The world coordinates can be expressed in robot coordinates using a simple rotation matrix.
- I calculate the 1st and 2nd moments of the rotational center of the robot and transform those into world coordinates using a 3x3 rotation matrix.
What is key when calculating wheel kinematic constraints?
- The angle of the wheel plane needs to be fixed.
- Rolling and sliding constraints should not be zero for the robot to move.
- The rolling speed must add up to the robot motion along the direction of the wheel.
Which one of the following configurations has steerability degree 1 and mobility degree 2?
- A robot that can translate along two axes and rotate its main body with a single steering wheel.
- A robot that can steer one wheel, which leads to a change in translation along one axis and rotation of its main body.
- A robot with two steering wheels that can independently drive the robot AND let it rotate on the spot.
What is a good recipe to drive a differential-wheel robot to a desired position? (A code sketch follows this list.)
- Calculate the robot speed as a function of the robot's wheel speeds (forward kinematics). Use this information to predict how to change the wheel speeds in order to drive the error (expressed in polar coordinates) using the controller from Section 3.6.2.4.
- Use the control law from Section 3.6.2.4 to calculate the desired robot speed in polar coordinates. Now transform the polar coordinates into robot coordinates (inverse kinematics) and from there into world coordinates (forward kinematic model).
- Calculate first the relation between forward and angular speed at the robot's center and its wheel speeds (forward kinematic model). Determine how to set the wheel speeds in order to achieve a desired robot speed (inverse kinematics). Calculate the error in polar coordinates, and use the control law from Section 3.6.2.4 to calculate the desired robot speed.
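A sketch of that last recipe (forward kinematic model, inverse kinematics, error in polar coordinates, control law). The wheel geometry and gains are made-up values, and the rho/alpha/beta law only follows the general form referenced as Section 3.6.2.4; it is illustrative, not the textbook's exact implementation:

```python
import math

# Made-up robot geometry and gains (illustrative only).
WHEEL_RADIUS = 0.03   # m
AXLE_LENGTH  = 0.10   # m
K_RHO, K_ALPHA, K_BETA = 0.5, 1.5, -0.3   # stable if k_rho > 0, k_beta < 0, k_alpha - k_rho > 0

def polar_error(x, y, theta, x_g, y_g, theta_g):
    """Error between the current pose and the goal pose, expressed in polar coordinates."""
    dx, dy = x_g - x, y_g - y
    rho   = math.hypot(dx, dy)            # distance to goal
    alpha = math.atan2(dy, dx) - theta    # heading error toward the goal
    beta  = theta_g - theta - alpha       # final orientation error
    return rho, alpha, beta

def control_law(rho, alpha, beta):
    """Desired robot speed (v, omega) from the polar error."""
    return K_RHO * rho, K_ALPHA * alpha + K_BETA * beta

def inverse_kinematics(v, omega):
    """Wheel speeds (rad/s) that realize a desired (v, omega) for a differential drive."""
    w_right = (v + omega * AXLE_LENGTH / 2) / WHEEL_RADIUS
    w_left  = (v - omega * AXLE_LENGTH / 2) / WHEEL_RADIUS
    return w_left, w_right

rho, alpha, beta = polar_error(0, 0, 0, 1.0, 0.5, 0)
v, omega = control_law(rho, alpha, beta)
print(inverse_kinematics(v, omega))
```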
Week 4: Reading
Your robot is facing a wall with its distance sensor, and even though the robot is not moving, its readings appear to be random. This is most likely a problem with the sensor's
- Resolution
- Dynamic range
- Bandwidth
- Precision
Your robot is equipped with an infrared distance sensor that delivers very accurate readings that reflect even very small changes in distance. Unfortunately, the sensor does not work well when sunlight is penetrating the lab windows. This is a problem of
- Sensitivity of the sensor
- Cross-sensitivity of the sensor
- Accuracy of the sensor
Why do you require four satellites to establish your position with GPS?
- There are four unknowns: x, y, z and orientation
- There are four unknowns: x, y, z and clock skew
- There are only three unknowns; a compass is required for orientation
How does a laser range finder work?
- A laser beam changes its amplitude at high speed. The Doppler effect leads to a phase shift of the amplitude-modulated laser signal. This phase shift can be measured and is proportional to the distance.
- The amplitude of the laser beam changes with a specific frequency whose wavelength is larger than the maximum range of the laser. Upon reflection the phase of this beam is shifted. This phase shift can be measured and is proportional to the distance.
- A laser beam with a wavelength of 824 nm is reflected from a surface and its reflection is recorded on a linear camera, which is used to measure the time between the emission of the ray and its arrival.
Week 5: Reading
What makes color-based object recognition using image sensors difficult?
- Colors are expressed in terms of their red, green and blue components. The associated gains change drastically as a function of the lighting conditions and make even red and green objects ambiguous to distinguish.
- The way the sensor sees the image is different from that of the human eye and therefore requires careful calibration.
- Colors are easy to distinguish and this is therefore one of the easiest problems in vision.
What is not a valuable cue to detect depth from a single monocular image?
- Blurring
- Known size of an object
- Disparity
All of the vision-based range extraction mechanisms suffer from the following problem:
- Depth is difficult to estimate for objects that are far away
- Changes in lighting conditions change the way color is perceived
- Only stereo-vision range estimates fail in the far field
Range estimates based on stereo vision can be improved by increasing the baseline between the cameras. What are the trade-offs?
- The sensor requires considerably more space, and range to objects that are close cannot be estimated because one of the cameras might not see them anymore.
- The sensor requires considerably more space and is more difficult to calibrate.
- The sensor just requires more space.
Uncertainty. All sensors are noisy. Today: How to model uncertainty? How does uncertainty propagate from sensors to features? Example: line detection.
The Gaussian/Normal Distribution. Defined by mean and variance. A good approximation for some sensors.
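A small sketch of the density itself, showing that mean and variance fully determine it and that roughly 68% of the probability mass lies within one standard deviation of the mean:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Normal density: fully defined by the mean mu and the variance sigma**2."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-5.0, 5.0, 10001)
p = gaussian_pdf(x, mu=0.0, sigma=1.0)
dx = x[1] - x[0]
within_one_sigma = np.sum(p[np.abs(x) <= 1.0]) * dx   # numerical integral over [-1, 1]
print(round(within_one_sigma, 3))                     # ~0.683
```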
Current Stats: Weeks 1-4 vs. Spring 2010. Bimodal distribution: undergraduates/graduates. Different performance thresholds for undergrads/grads. Spring 2010: two different distributions. N=27, Max=48. (Histogram axes: number of students vs. points; number of students vs. overall score in %.)
Week 6: Reading
Why is a Gaussian distribution the model of choice for representing uncertainty in robotic sensing?
- Sensor readings are subject to uncertainty and this uncertainty behaves like a Gaussian distribution.
- The true distribution of noise on most sensors is unknown, but the mathematical properties of the Gaussian model make it the model of choice, being applicable to most sensors.
- Because the likelihood is very high that all the sensor readings are within 3 standard deviations.
What is the reasoning behind the derivation of Equations 4.73 and 4.74 (least-squares optimization)?
- The derivative of a function is zero at its extreme values (maximum or minimum), and thus finding the value where the derivative of the least-squares error is zero minimizes it. The value for which the least-squares error is minimal is the best fit for the line.
- Finding the angle of the line that best matches the set of points requires a double integration (double sum).
- Finding the best-fitting line is a complex numerical optimization problem for which no analytical solution can be found.
In order to detect an edge in the image
- You have to find areas in the image where the frequency, i.e. the change between neighboring pixels, is high
- You have to find areas in the image that are brighter than others
- You have to find areas in the image that are darker than others
How can you calculate the variance for the detection of a feature that relies on multiple sensors?
- The variance for feature detection corresponds to that of the sensor with the highest variance. This is represented by the Jacobian that encodes the dependencies between all sensors' error models.
- The variance for feature detection is the product of the variances of all sensors involved in its detection. This is represented by the Jacobian that encodes the dependencies between all sensors' error models.
- The variance for feature detection is a weighted sum of the individual variances of each sensor, weighted by the dependency of the sensors on each other.
Example Feature: Detecting Lines. Camera, laser scan. N.B.: every single point is subject to uncertainty!
Line Fitting. What is the uncertainty associated with each line feature?
Example: Line Fitting. Given: range measurements (ρi, θi). Desired: the line parameters r, α.
Solution (Line fitting) Additional trick: weight each measurement by the variance expected at this distance.
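A sketch of such a weighted fit: a closed-form perpendicular least-squares fit of the line x cos(α) + y sin(α) = r to range points, with each point weighted by an assumed range-dependent variance. This is a generic formulation, not necessarily identical to the textbook's Equations 4.73/4.74:

```python
import numpy as np

def fit_line_polar(rho, theta, weights=None):
    """Weighted perpendicular least-squares fit of x*cos(a) + y*sin(a) = r
    to range measurements (rho_i, theta_i). Returns (alpha, r)."""
    x, y = rho * np.cos(theta), rho * np.sin(theta)
    w = np.ones_like(x) if weights is None else weights
    xm, ym = np.average(x, weights=w), np.average(y, weights=w)
    u, v = x - xm, y - ym
    # Closed-form minimizer of the weighted sum of squared perpendicular distances.
    alpha = 0.5 * np.arctan2(-2 * np.sum(w * u * v), np.sum(w * (v**2 - u**2)))
    r = xm * np.cos(alpha) + ym * np.sin(alpha)
    if r < 0:                       # keep r non-negative by flipping the line normal
        r, alpha = -r, alpha + np.pi
    return alpha, r

# Example: noisy points on the line x = 1 (alpha = 0, r = 1), weighted by 1/variance,
# where the variance is assumed to grow with measured range (the "additional trick").
theta = np.linspace(-0.5, 0.5, 17)
rho = 1.0 / np.cos(theta) + np.random.normal(0, 0.01, theta.size)
sigma2 = (0.01 * rho) ** 2          # assumed range-dependent variance model
alpha, r = fit_line_polar(rho, theta, weights=1.0 / sigma2)
print(alpha, r)
```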
Whiteboard
Error propagation. What is the variance of α and r? Error propagation law: C_Y = F_X C_X F_X^T, where Y are the output variables, X the input variables, C_X and C_Y are the covariance matrices, and F_X is the Jacobian of Y with respect to X.
Example: Line fitting. 17 measurements. f: (ρi, θi) -> (r, α). We need: the covariance of the line parameters (α, r). We know: the variances of the individual measurements (ρi, θi).
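A numerical sketch of the propagation step for this example: the Jacobian F of the line fit with respect to the 17 range readings is approximated by finite differences and pushed through C_Y = F C_X F^T. For brevity the bearing angles are treated as exact and the per-reading noise level is an assumption:

```python
import numpy as np

def fit_line(rho, theta):
    """Compact line fit: returns (alpha, r) for the line x*cos(a) + y*sin(a) = r."""
    x, y = rho * np.cos(theta), rho * np.sin(theta)
    u, v = x - x.mean(), y - y.mean()
    alpha = 0.5 * np.arctan2(-2 * np.sum(u * v), np.sum(v**2 - u**2))
    r = x.mean() * np.cos(alpha) + y.mean() * np.sin(alpha)
    return np.array([alpha, r])

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian F of f at x (rows: outputs, columns: inputs)."""
    y0 = f(x)
    F = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        F[:, i] = (f(x + dx) - y0) / eps
    return F

# 17 range readings rho_i; bearings theta_i are treated as exact here.
theta = np.linspace(-0.5, 0.5, 17)
rho = 1.0 / np.cos(theta)
sigma_rho = 0.01                                  # assumed 1-cm std. dev. per reading
C_X = np.eye(rho.size) * sigma_rho**2             # input covariance (independent readings)

F = numerical_jacobian(lambda r_: fit_line(r_, theta), rho)
C_Y = F @ C_X @ F.T                               # error propagation law
print("var(alpha) =", C_Y[0, 0], "var(r) =", C_Y[1, 1])
```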
Summary. Every sensor has noise and makes reasoning uncertain. Sensor measurements can be combined into features. The uncertainty of these features can be calculated using the error propagation law. Knowing how uncertainty behaves helps you decide.
Line segmentation: Split-and-Merge
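A sketch of the split step of Split-and-Merge (the merge step, which joins adjacent collinear segments, is omitted); the threshold and the example scan are made up:

```python
import numpy as np

def point_line_distance(p, a, b):
    """Perpendicular distance of point p from the line through a and b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    length = ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
    if length == 0.0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs((bx - ax) * (py - ay) - (by - ay) * (px - ax)) / length

def split(points, threshold):
    """Split step: fit the endpoint-to-endpoint line, find the farthest point,
    and recurse on both halves if that point deviates more than the threshold."""
    dists = [point_line_distance(p, points[0], points[-1]) for p in points]
    i = int(np.argmax(dists))
    if dists[i] > threshold and 0 < i < len(points) - 1:
        return split(points[:i + 1], threshold) + split(points[i:], threshold)
    return [(points[0], points[-1])]        # segment given by its two endpoints

# Two walls meeting near (1, 0): expect two segments.
pts = [(x, 0.0) for x in np.linspace(0.0, 1.0, 10)] + [(1.0, y) for y in np.linspace(0.1, 1.0, 9)]
print(split(pts, threshold=0.05))
```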
Other features: Segments
Here: Watershed. Source: Gary Bradski (c) 2008
Watershed algorithm. http://cmm.ensmp.fr/~beucher/wtshed.html Demo: OpenCV pyramid_segmentation
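The demo above refers to OpenCV's (older) pyramid_segmentation sample; below is a sketch of the watershed itself with the Python API, following the common distance-transform seeding recipe. The file name and thresholds are placeholders:

```python
import cv2
import numpy as np

img = cv2.imread("scene.png")                          # placeholder file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Seed markers: sure foreground from the distance transform, sure background from dilation.
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
dist = cv2.distanceTransform(binary, cv2.DIST_L2, 5)
_, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, cv2.THRESH_BINARY)
sure_fg = sure_fg.astype(np.uint8)
sure_bg = cv2.dilate(binary, None, iterations=2)

_, markers = cv2.connectedComponents(sure_fg)          # one label per seed region
markers = markers + 1                                  # reserve 0 for the unknown band
markers[cv2.subtract(sure_bg, sure_fg) == 255] = 0

# Watershed floods the unknown band from the seeds; boundary pixels are labeled -1.
markers = cv2.watershed(img, markers)
print(len(np.unique(markers)) - 2, "segmented regions")  # exclude boundary and background
```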
Alternative line features: Hough Transform. Demo: OpenCV houghlines
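A sketch of the houghlines demo's core steps with the Python API: edge detection followed by cv2.HoughLines, which returns lines in the same (ρ, θ) parameterization used for line fitting above. The file name and vote threshold are placeholders:

```python
import cv2
import numpy as np

img = cv2.imread("corridor.png")                       # placeholder file name
edges = cv2.Canny(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 50, 150)

# Each returned line satisfies x*cos(theta) + y*sin(theta) = rho.
lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)     # 120 votes is an arbitrary choice
for r, t in (lines[:, 0] if lines is not None else []):
    a, b = np.cos(t), np.sin(t)
    x0, y0 = a * r, b * r
    p1 = (int(x0 - 1000 * b), int(y0 + 1000 * a))      # two far-apart points on the line
    p2 = (int(x0 + 1000 * b), int(y0 - 1000 * a))
    cv2.line(img, p1, p2, (0, 0, 255), 2)
cv2.imwrite("corridor_lines.png", img)
```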
Hough Transform. Source: K. Grauman / D. Scaramuzza
Project Assignments. 16 undergraduate students, 10 graduate students. 5 groups -> 5+5+5+5+6: ~2 graduate students + ~3 undergrads each. Goal: implement a controller for Ratslife. Grad students have to submit a controller. Undergraduates have to present (final/design reviews).
Exercise: Introduction to Ratslife
Exercise 2: Locomotion
If you were to write a controller, what do you think would be your best bet to generate the joint value ji for joint i at time t? Hint: look at how the dog in ghostdog.wbt is controlled. (A gait sketch follows this list.)
- ji(t+1) = a j1(t) + b
- ji(t+1) = a sin(t-b) + c
- ji(t+1) = a(t-b)^2 + c
Can you implement a new gait in ghostdog.wbt that lets the robot trot? What do you need to do except adding a case TROT to the finite state machine in ghostdog.c? Try this out before answering the question!
- Calculate the servo speed so that both front and hind pairs are out of phase, but one front leg is in phase with one hind leg.
- Calculate the servo speed so that the phase between front and hind legs is always shifted by 90 degrees.
- Calculate the servo speed so that all legs are phase-shifted by 45 degrees.
Which of the motions in Figure 2.1 is only dynamically stable?
- From 1->2
- From 3->4
- From 5->8
- None
What is a straightforward way to presumably double the speed of the forward motion? To test this, edit the file with a text editor. If you don't get the desired speed-up, why is this?
- The inertia and limited motor speed and torque hinder the robot from executing the motions twice as fast.
- The motors are simply not fast enough.
- Just changing the timing of the gait does not affect its actual execution.
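A sketch of the sinusoidal joint-value idea from the first question, with a trot phase assignment (diagonal legs in phase, the two diagonal pairs half a cycle apart). Amplitude, offset, and the per-leg phases are made-up values, not the ghostdog.wbt parameters:

```python
import math

# Each joint follows j_i(t) = a*sin(t - b_i) + c, the sinusoidal form from the exercise.
AMPLITUDE, OFFSET = 0.4, 0.0          # radians (illustrative)
TROT_PHASE = {                        # diagonal legs in phase, pairs half a cycle apart
    "front_left": 0.0,
    "hind_right": 0.0,
    "front_right": math.pi,
    "hind_left": math.pi,
}

def hip_angle(leg, t):
    return AMPLITUDE * math.sin(t - TROT_PHASE[leg]) + OFFSET

for t in (0.0, 0.5, 1.0):
    print(t, {leg: round(hip_angle(leg, t), 2) for leg in TROT_PHASE})
```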
Exercise 3: Control
What happens when you increase KR and lower KA in the controller?
- The robot will drive curves with a larger radius.
- The robot will drive curves with a smaller radius.
- The robot will just drive straight; the values need to be exactly as they are.
How does your controller deal with the obstacles?
- Collision avoidance subsumes navigation. Obstacles are avoided and navigation is resumed as soon as the obstacles are cleared.
- The robot plans around the obstacles.
- The robot gets stuck in the obstacles.
Build an obstacle with a U-shape by shifting the obstacles in the arena (press the shift key and move them with the mouse) and let the robot run into this. What happens?
- The robot follows the inner perimeter of the U-obstacle to get out of it and eventually reaches the goal.
- The robot goes back and forth into the U-obstacle. Some kind of planning would be needed.
- The shape of the obstacle does not matter.
Exercise 4: Control
How can you tell that the robot is lying on its front or its back?
- I need to integrate the direction of acceleration in order to determine the direction the robot has fallen.
- I need to identify the direction of the acceleration exerted by the gravity of earth.
- I use the accelerometer to detect a fall and then use the camera to detect whether the robot is facing down or not.
Can you use the Nao's accelerometer for integrating the robot's position? If you are not sure, try it!
- Sure, the acceleration allows me to calculate the position and small errors around the mean will cancel each other out.
- It is not possible to calculate position simply from acceleration. Problems are the gravity of earth and the fact that even small errors have a fatal impact after the required double integration.
What is the problem with the resulting map?
- The laser scanner's reading accuracy is pretty bad, leading to a rather noisy map.
- Accumulating odometry errors render the resulting map useless very fast.
- The environment is hard to map.
What problem does this robot create?
- It continues to collide with the mapping robot.
- Dynamic obstacles make their way into the map.
- The other robot moves too fast to be mapped accurately.
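A quick numerical illustration of why double-integrating accelerometer readings fails: even small zero-mean noise produces a position estimate that drifts without bound. The sample rate and noise level are made-up values:

```python
import numpy as np

# True acceleration is zero; the sensor only reports noise (plus, in reality, any
# imperfect gravity compensation). Double integration amplifies that noise.
rng = np.random.default_rng(0)
dt, steps = 0.01, 6000                     # 60 s at 100 Hz (illustrative)
noise = rng.normal(0.0, 0.05, steps)       # 0.05 m/s^2 accelerometer noise

velocity = np.cumsum(noise) * dt           # first integration
position = np.cumsum(velocity) * dt        # second integration
print("position error after 60 s: %.2f m" % abs(position[-1]))
```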

Homework: Read Chapter 5 through Section 5.5 (pages 181-212)