COMPLEX WELD SEAM DETECTION
   USING COMPUTER VISION
                      Glenn Silvers, 16115327

A presentation submitted in partial fulfilment of the requirements for the degree of:
   Bachelor of Engineering (Mechatronic and Robotic) (Honours)

                             Supervisor
                             Dr Gu Fang
THE NEED FOR RESEARCH

   To allow for the effective and simple communication
    between humans and machines.

   Simplification of complex setup tasks.

   Reduce time needed to complete tasks.

   Improved repeatability.
KEY OBJECTIVES

   To use computer vision to define the user’s hand, enabling movement of a welding robot via gestures.

   The definition, and therefore tracking, of the hand must run in real time to allow adequate control over the robot’s motion.

   To define the region of interest where the weld
    seam lies to allow for seam detection.
CURRENT PROJECTS



   Imirok: Real-time imitative robotic arm control for home robot applications (Heng-Tze et al., 2011).

   Recognising Hand Gestures with Microsoft’s Kinect (Tang, 2011).

   Recognition of Arm Movements (Duric et al., 2002).
MICROSOFT KINECT




   http://www.itsagadget.com/2010/11/microsofts-kinect-sensor-is-now-officially-released.html
KINECT EXPLODED




http://hackedgadgets.com/wp-content/uploads/2010/11/inside-the-microsoft-kinect_2.jpg
THE KINECT’S SENSORS

   The Kinect makes use of three different types of
    sensors:
        A Depth Sensor
        A Colour Camera

        A Four Element Microphone Array




   Each sensor offers the programmer a wide range of possibilities for inventing new applications.
PLAN OF ATTACK
1. Accessing Kinect Data

2. Detecting and Tracking the Hand

   3.1 Gesture Recognition
       3.1.1 Commands for Movement of the Robot

   3.2 Extracting HSV Values from Hand
       3.2.1 Seam ROI Detection
1. ACCESSING KINECT’S DATA

   To access the Kinect’s data streams it is first necessary to understand its data structures.

   Once the data streams have been opened, it is then a matter of converting the data into a usable format.
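A minimal sketch of this step: the project’s own stream-opening code used the Kinect SDK (not shown on these slides), but OpenCV’s OpenNI capture backend is one compact way to pull the depth and colour streams into a usable cv::Mat format:

```cpp
#include <opencv2/opencv.hpp>

int main() {
    // Open the Kinect through OpenCV's OpenNI backend (a stand-in here for
    // the Kinect SDK stream-opening calls used in the actual project).
    cv::VideoCapture kinect(cv::CAP_OPENNI);
    if (!kinect.isOpened()) return 1;

    cv::Mat depth, colour;
    while (kinect.grab()) {
        // Depth in millimetres (CV_16UC1) and the BGR colour image.
        kinect.retrieve(depth,  cv::CAP_OPENNI_DEPTH_MAP);
        kinect.retrieve(colour, cv::CAP_OPENNI_BGR_IMAGE);

        cv::imshow("depth", depth * 8);   // brightened for display only
        cv::imshow("colour", colour);
        if (cv::waitKey(30) == 27) break; // Esc to quit
    }
    return 0;
}
```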
2. DETECTING AND TRACKING THE HAND
   OpenCV libraries

       Created by Intel in 1999.

       Initially designed to provide optimised computer vision code so that programmers would not need to start projects from scratch.

       Has its own inbuilt data structures.

       Over 500 functions that span many different areas of computer vision.
2. DETECTING AND TRACKING THE HAND
CONTINUED
   Detecting the hand:
       Utilised the Kinect’s depth stream.

       Applied a threshold to the depth values to create a depth region of interest where only the hand would be visible.
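A minimal sketch of the thresholding, assuming a 16-bit Kinect depth map in millimetres; the 1.2–1.6 m band is the one quoted in the speaker notes:

```cpp
#include <opencv2/opencv.hpp>

// depth: CV_16UC1 depth map from the Kinect, values in millimetres.
// Returns a binary mask in which only objects between 1.2 m and 1.6 m
// from the sensor (the band quoted in the speaker notes) remain visible.
cv::Mat depthRegionOfInterest(const cv::Mat& depth) {
    cv::Mat mask;
    cv::inRange(depth, cv::Scalar(1200), cv::Scalar(1600), mask);
    return mask;  // CV_8UC1: 255 inside the band, 0 elsewhere
}
```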
2. DETECTING AND TRACKING THE HAND
CONTINUED
   Detecting the hand:
      Once the hand is within the frame it is necessary for the program to identify it as a hand.
      The first step is to find the contours of the hand (Suzuki and Abe, 1985).
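In OpenCV this is a single call: cv::findContours implements the Suzuki–Abe border-following algorithm cited above. Taking the largest contour as the hand candidate is an assumption of this sketch rather than a detail given on the slide:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// mask: the binary depth region of interest from the previous step.
// Returns the largest contour, taken here as the hand candidate.
std::vector<cv::Point> largestContour(const cv::Mat& mask) {
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask.clone(), contours,
                     cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    double bestArea = 0;
    std::vector<cv::Point> best;
    for (const auto& c : contours) {
        double area = cv::contourArea(c);
        if (area > bestArea) { bestArea = area; best = c; }
    }
    return best;
}
```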
2. DETECTING AND TRACKING THE HAND
CONTINUED
   Detecting the hand:
       The next step is to enclose the contour with the use of the minimum perimeter polygon algorithm (Sklansky, 1972).
2. DETECTING AND TRACKING THE HAND
CONTINUED
   Detecting the hand:
       Looking at points where the contour and the minimum
        perimeter polygon meet shows the position of the
        fingers.
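A sketch of these two steps combined: cv::convexHull (OpenCV’s implementation follows Sklansky’s algorithm) encloses the contour, and requesting the hull as indices returns exactly the contour points where the two curves meet, i.e. the fingertip candidates:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// contour: the hand outline from cv::findContours.
// Returns the contour points that lie on the convex hull (the
// "minimum perimeter polygon"); these mark the fingertip candidates.
std::vector<cv::Point> fingertipCandidates(const std::vector<cv::Point>& contour) {
    std::vector<int> hullIdx;  // hull as indices into the contour
    cv::convexHull(contour, hullIdx, /*clockwise=*/false, /*returnPoints=*/false);

    std::vector<cv::Point> tips;
    for (int i : hullIdx) tips.push_back(contour[i]);
    return tips;
}
```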
2. DETECTING AND TRACKING THE HAND
CONTINUED
   Detecting the hand:
      To identify only the fingers, a weighted average is performed over the fingertip candidates.
      This rejects all points lower than the thumb, leaving the four fingers and the thumb identified.
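The slides do not spell out the weighting, so this sketch simply uses the average y-coordinate of the candidates as the cut-off, which is one plausible reading of rejecting “all points lower than the thumb” (image y grows downwards):

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// candidates: the hull/contour intersection points from the previous step.
// Keeps only points above the average height, discarding candidates that
// sit below the thumb (wrist and palm-edge points).
std::vector<cv::Point> filterFingertips(const std::vector<cv::Point>& candidates) {
    if (candidates.empty()) return {};

    double meanY = 0;
    for (const auto& p : candidates) meanY += p.y;
    meanY /= candidates.size();

    std::vector<cv::Point> tips;
    for (const auto& p : candidates)
        if (p.y < meanY) tips.push_back(p);  // y grows downwards in images
    return tips;
}
```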
3.1 GESTURE RECOGNITION
   The robot being used in this research is a
    Fanuc 100iC with a Lincoln Electric
    welding attachment.

   As this robot has six degrees of freedom
    there will need to be six different gestures
    to allow for complete control.

   So far, three gestures have been coded and tested.

   The remaining gestures need to be coded
    into the program.
3.1.1 COMMANDS FOR MOVEMENT OF THE
ROBOT

   Once the gestures have been recognised by
    the program, the robot movement speed will
    be limited to 10%.

   The reason for this is simply to ensure the
    health and safety of the users.

   Whilst the operator continues to make a gesture to the Kinect, the robot will continually move in accordance with that gesture.

   As a safety precaution, for all gestures to be
    recognised five fingers need to be identified.
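A hypothetical sketch of how these safety rules could wrap the gesture output. Gesture, handleGesture and sendMoveCommand are illustrative stand-ins, since the actual communication layer to the robot is still to be implemented:

```cpp
#include <cstdio>

// Illustrative gesture labels; the project defines six in total, one per
// degree of freedom, of which three are coded so far.
enum class Gesture { None, MoveXPos, MoveXNeg, MoveYPos };

// Hypothetical stand-in for the link to the Fanuc controller.
void sendMoveCommand(int dx, int dy, int dz, double speed) {
    std::printf("move (%d, %d, %d) at %.0f%% speed\n", dx, dy, dz, speed * 100);
}

const double kSpeedLimit = 0.10;  // robot speed capped at 10% (per the slide)

void handleGesture(Gesture g, int fingersVisible) {
    // Safety interlock from the slide: a gesture is only acted on when
    // all five fingers have been identified.
    if (fingersVisible != 5 || g == Gesture::None) return;

    // While the operator holds the gesture, this is called every frame,
    // so the robot keeps moving in accordance with that gesture.
    switch (g) {
        case Gesture::MoveXPos: sendMoveCommand(+1, 0, 0, kSpeedLimit); break;
        case Gesture::MoveXNeg: sendMoveCommand(-1, 0, 0, kSpeedLimit); break;
        case Gesture::MoveYPos: sendMoveCommand(0, +1, 0, kSpeedLimit); break;
        default: break;
    }
}
```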
3.2 EXTRACTING HSV VALUES FROM THE
HAND
   Obtain a colour image from the Kinect.

   The colour image is converted to the HSV colour space.

   An XY depth point from the hand is transformed into the colour image.

   An HSV value is extracted for the hand.

   This makes use of a Kinect function that transforms point positions between the depth image and the colour image.
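A minimal sketch of the extraction itself. The depth-to-colour mapping is done by the Kinect function mentioned above, so here the already-transformed pixel position is simply taken as an input:

```cpp
#include <opencv2/opencv.hpp>

// colour: BGR frame from the Kinect's colour camera.
// handPx: the hand's XY depth point after transformation into colour-image
// coordinates (done by the Kinect SDK, not shown here).
cv::Vec3b extractHandHSV(const cv::Mat& colour, cv::Point handPx) {
    cv::Mat hsv;
    cv::cvtColor(colour, hsv, cv::COLOR_BGR2HSV);
    return hsv.at<cv::Vec3b>(handPx);  // (H, S, V) at the hand point
}
```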
3.2 EXTRACTING HSV VALUES FROM THE
HAND CONTINUED
   The below images show the HSV value
    extraction.
3.2.1 SEAM REGION OF INTEREST
DETECTION
   A gesture causes an image to be taken
    from an on-board camera.

   The user then moves their hand into the
    image and after five seconds, video from
    the on-board camera starts recording.

   The tip of the finger is then run along
    the joint or seam that is to be welded.

   The same gesture is used to stop the
    recording.
3.2.1 SEAM REGION OF INTEREST
DETECTION

   The reason for the five second delay:

       So that the user has enough time to move their
        finger to the starting point of the seam before
        the recording starts.


   This ensures that the correct starting point
    of the seam is identified.
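A sketch of this start/stop flow under stated assumptions: startStopGestureSeen() stands in for the project’s gesture check, the file name and frame rate are illustrative, and the five-second delay is simplified to a blocking wait:

```cpp
#include <opencv2/opencv.hpp>
#include <chrono>
#include <thread>

// Hypothetical stand-in for the project's gesture classifier.
bool startStopGestureSeen() { return false; /* replace with the real check */ }

void seamRecordingLoop(cv::VideoCapture& onboardCam) {
    cv::Mat frame;
    cv::VideoWriter writer;
    bool recording = false;

    while (onboardCam.read(frame)) {
        if (startStopGestureSeen()) {
            if (!recording) {
                // Five-second delay so the user can move their finger to
                // the start of the seam before recording begins.
                std::this_thread::sleep_for(std::chrono::seconds(5));
                writer.open("seam.avi",
                            cv::VideoWriter::fourcc('M', 'J', 'P', 'G'),
                            30.0, frame.size());
                recording = true;
            } else {
                writer.release();  // the same gesture stops the recording
                recording = false;
            }
        }
        if (recording) writer.write(frame);
    }
}
```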
3.2.1 SEAM REGION OF INTEREST
DETECTION CONTINUED
   To identify the tip of the finger:

       The image from the on-board camera and the first image from the video are converted into the HSV colour space.

       The images are thresholded using the HSV values extracted earlier.

       The first image of the video is then subtracted from the original image, which does not have the finger in it.

       All that is left is the hand and some background noise.
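A sketch of the thresholding and subtraction, assuming the HSV value extracted earlier; the +/- margins around it are illustrative, and cv::absdiff is used so the order of the subtraction does not matter:

```cpp
#include <opencv2/opencv.hpp>

// before:  on-board camera image without the finger in it.
// first:   first frame of the recorded video (finger at the seam start).
// handHSV: the HSV value extracted from the hand earlier.
cv::Mat isolateHand(const cv::Mat& before, const cv::Mat& first,
                    cv::Vec3b handHSV) {
    auto hsvMask = [&](const cv::Mat& bgr) {
        cv::Mat hsv, mask;
        cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv,  // illustrative margins around the extracted value
                    cv::Scalar(handHSV[0] - 10, handHSV[1] - 40, handHSV[2] - 40),
                    cv::Scalar(handHSV[0] + 10, handHSV[1] + 40, handHSV[2] + 40),
                    mask);
        return mask;
    };

    // Subtracting the two thresholded images leaves the hand plus some
    // background noise, as described on the slide.
    cv::Mat diff;
    cv::absdiff(hsvMask(first), hsvMask(before), diff);
    return diff;
}
```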
3.2.1 SEAM REGION OF INTEREST
DETECTION CONTINUED
   The below images show the image
    subtraction:
3.2.1 SEAM REGION OF INTEREST
DETECTION CONTINUED
   To ensure that the hand as a whole is identified, the image is dilated to give a stronger response:
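A one-call sketch of the dilation; the kernel shape and size are illustrative:

```cpp
#include <opencv2/opencv.hpp>

// diff: the noisy subtraction result from the previous step. Dilation
// grows the white (hand) pixels so the hand reads as one solid blob.
cv::Mat strengthenResponse(const cv::Mat& diff) {
    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(7, 7));
    cv::Mat dilated;
    cv::dilate(diff, dilated, kernel);
    return dilated;
}
```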
3.2.1 SEAM REGION OF INTEREST
DETECTION CONTINUED
   From this image the contours and
    minimum perimeter polygon of the hand
    are calculated:
3.2.1 SEAM REGION OF INTEREST
DETECTION CONTINUED
   Once again, the points where the contour and the minimum perimeter polygon meet mark the region of interest (white circles).
WHAT HAS BEEN ACHIEVED

   Definition of the hand and fingers.

   Tracking of the hand and fingers in real time.

   Successfully defined the region of interest of the
    seam to allow seam detection.
WHAT STILL NEEDS TO BE ACHIEVED

   Hand gesture coding needs to be finalised (approximately 80% complete).

   Communication with the robot.

   Complete an in-depth analysis of how repeatable
    the methods set out are with many different skin
    types.
THE MAJOR STEP FORWARD
   The major step forward from this research has been
    the way in which the hand is identified within the
    colour image.

   By defining the hand in the depth image and then
    extracting HSV values from the colour image, a
    hybrid skin detector is formed.

   No matter what race or skin colour the user has,
    this method will be able to segment their hand to
    allow for seam ROI definition.
REFERENCES
   HENG-TZE, C., ZHENG, S. & PEI, Z. 2011. Imirok: Real-time imitative robotic arm control for home robot applications. Pervasive Computing and Communications Workshops (PERCOM Workshops), 2011 IEEE International Conference on, 21-25 March 2011, 360-363.

   TANG, M. 2011. Recognising Hand Gestures with Microsoft’s Kinect. BEng (Electrical), Stanford University.

   DURIC, Z., LI, F. & WECHSLER, H. 2002. Recognition of arm movements. Automatic Face and Gesture Recognition, 2002. Proceedings. Fifth IEEE International Conference on, 21 May 2002, 348-353.

   http://www.itsagadget.com/2010/11/microsofts-kinect-sensor-is-now-officially-released.html. Accessed 16/4/2012.

   http://hackedgadgets.com/wp-content/uploads/2010/11/inside-the-microsoft-kinect_2.jpg. Accessed 16/4/2012.

   SUZUKI, S. & ABE, K. 1985. Topological structural analysis of digitized binary images by border following. Computer Vision, Graphics, and Image Processing, 30, 32-46.

   SKLANSKY, J. 1972. Measuring Concavity on a Rectangular Mosaic. Computers, IEEE Transactions on, C-21, 1355-1364.


Editor's Notes

  1. In today's world it is becoming more important to be able to communicate with machines in a way that is intuitive for the user, the outcome being the achievement of complex tasks with minimal effort on behalf of the workforce. In a workplace that is continually changing its tooling to allow different products to be manufactured, or perhaps even manufacturing custom parts, the setup is often complex and requires specially skilled workers to be on site, taking them away from other tasks such as maintenance. The setup time needed for a new part for robotic welding is often far longer than the actual job itself. If it were possible for the computer to identify where the weld seam needs to be completed from a simple gesture by a human, the time needed to complete the task would be drastically reduced.
  2. These three current projects give an outline of what other researchers are looking into for the purpose of interaction between humans and machines. They give an insight into how computer vision can be used to bridge the gap between humans and machines. The purpose of the first article was to show how it is possible to control a robotic arm with the use of computer vision and an ordinary webcam. The second article explains how it is possible to achieve recognition of hand gestures with the Microsoft Kinect. The third article is concerned with recognising certain sequences of arm movements as gestures. Over the next slides I will explain these three articles in greater detail.
  3. * This image shows that although the Kinect was designed for use in games, the hardware behind the sensor enables it to be used in complex computer vision applications.
  4. I would just like to start out by saying that every step of this project has been a challenge, for the most part because of my limited computer programming skills. It seemed like a project within a project to learn not only the necessary C++ programming skills but also the different functions and data structures of OpenCV, to allow me to write a program that satisfies the needs of this research.
  5. * Initialising and opening the Kinect's data streams was a very time-consuming task. In effect it took five weeks just to retrieve a depth image from the Kinect.
  6. For this project the OpenCV libraries were used considerably. They give the programmer a base on which to build sophisticated vision applications. To enable the use of OpenCV, its data structures need to be fully understood.
  7. Once the depth stream had been opened I was able to set a threshold on the depth image. This threshold was set so that only values between 1.2 and 1.6 metres would be shown in the image.
  8. The best way to visualise the minimum perimeter polygon is to imagine stretching an elastic band around your hand.
  9. Once the hand has been detected, the next step is to use the colour camera of the Kinect to obtain a colour image. The colour image is converted to the HSV colour space as it is not as susceptible to lighting variations. The weighted average XY point from the depth image is transformed into the colour image and an HSV value is extracted.
  10. A gesture is given to the Kinect, which takes an image from one of the two stereovision cameras mounted on the robot. The user then moves their hand into the image and after five seconds, video from the on-board camera starts recording. The tip of the finger is then run along the joint or seam that is to be welded. Once the finger has reached the end of the seam, the user gives the same gesture to the Kinect, stopping the recording.