Development of a faster classification system for metal parts using machine vision under different lighting environments

  • Quang-Cherng Hsu
  • Ngoc-Vu Ngo
  • Rui-Hong Ni
ORIGINAL ARTICLE

Abstract

A machine vision system for the automatic classification of metal parts is developed under different lighting environments and applied to the operation of a robot arm with six degrees of freedom (DOF). To obtain accurate positioning information, the overall image is captured by a CMOS camera mounted above the working platform. The effects of back-lighting and front-lighting environments on the proposed system were investigated. Under the front-lighting environment, four different conditions were tested, and for each condition, global and local contrast threshold operations were used to obtain good image quality. A quadratic transformation describing the relationship between image coordinates and world coordinates is proposed and compared with a linear transformation as well as the camera calibration model in the MATLAB toolbox. Experimental results show that the back-lighting environment yields better image quality, so the positions of object centers are determined more accurately than under front-lighting. According to the calibration results, the quadratic transformation is more accurate than the other methods: its maximum positive calibration deviations are 0.48 mm and 0.38 mm in the X and Y directions, respectively, and its maximum negative deviations are −0.34 mm and −0.43 mm. The proposed system is effective and robust and can be valuable to industry, as it offers an automated robotic system with improved flexibility for separating dissimilar items of nominal value, thereby reclaiming material investments and reducing the expense of repurchasing the same items.
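The abstract states only that the image-to-world mapping is quadratic and is fitted against calibration data; the exact polynomial form, the fitting procedure, and the grid geometry below are assumptions, not the authors' published method. The following Python sketch illustrates one plausible implementation: a six-term quadratic polynomial per world axis fitted by least squares, compared against a linear (affine) mapping on hypothetical calibration-grid points.

```python
# Minimal sketch (not the authors' code): fit a quadratic image-to-world
# coordinate mapping by least squares and compare it with a linear fit.
# The polynomial form and the calibration-grid data below are assumptions.
import numpy as np

def design_matrix(uv, order):
    """Regression matrix for pixel coordinates (u, v):
    order=1 -> [1, u, v]; order=2 -> [1, u, v, u*v, u**2, v**2]."""
    u, v = uv[:, 0], uv[:, 1]
    cols = [np.ones_like(u), u, v]
    if order == 2:
        cols += [u * v, u ** 2, v ** 2]
    return np.column_stack(cols)

def fit_mapping(uv, xy, order=2):
    """Least-squares coefficients mapping image (u, v) in pixels to world (x, y) in mm."""
    A = design_matrix(uv, order)
    coeffs, *_ = np.linalg.lstsq(A, xy, rcond=None)
    return coeffs

def apply_mapping(uv, coeffs, order=2):
    """Map pixel coordinates to world coordinates with the fitted coefficients."""
    return design_matrix(uv, order) @ coeffs

# Hypothetical calibration data: detected pixel centres of a 3 x 3 grid (uv)
# and the known positions of those points on the working platform in mm (xy).
uv = np.array([[102.3,  98.7], [512.1, 101.4], [920.8, 105.2],
               [100.9, 500.6], [515.4, 503.3], [923.7, 507.1],
               [ 99.5, 902.2], [517.0, 905.8], [926.1, 909.5]])
xy = np.array([[  0.0,   0.0], [ 50.0,   0.0], [100.0,   0.0],
               [  0.0,  50.0], [ 50.0,  50.0], [100.0,  50.0],
               [  0.0, 100.0], [ 50.0, 100.0], [100.0, 100.0]])

for order in (1, 2):  # linear vs. quadratic transformation
    c = fit_mapping(uv, xy, order)
    deviation = apply_mapping(uv, c, order) - xy  # residual in mm
    print(f"order {order}: max |deviation| = {np.abs(deviation).max():.3f} mm")
```

Per-axis maximum positive and negative residuals, as reported in the abstract, would be obtained from `deviation.max(axis=0)` and `deviation.min(axis=0)` instead of the overall absolute maximum.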

Keywords

Machine vision · Robot arm · Camera calibration · Image analysis


Notes

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.


Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2018

Authors and Affiliations

  1. Department of Mechanical Engineering, National Kaohsiung University of Science and Technology, Kaohsiung, Taiwan, Republic of China
