Automated Computer Vision System Based on Color Concentration and Level for Product Quality Inspection
The use of automated product quality inspection in industry is increasing rapidly. Quality reflects how well a product satisfies customer requirements, and it must be verified before products are shipped. This study presents a computer vision technique for product inspection, using soft drink beverages as the test product. A database is created to inspect the product based on color concentration and water level. The system uses Otsu's method for segmentation, a histogram of the combined red, green, blue (RGB) color model for feature extraction, and a quadratic distance classifier to classify the product by color concentration. For the water level, image coordinates are set to measure the acceptable water-level range. An Internet Protocol (IP) camera is used to validate the performance of the system. The results show that the proposed technique achieves 98% accuracy on 246 samples.
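The pipeline described above can be sketched in a few short routines: Otsu thresholding to segment the product, a concatenated RGB histogram as the color-concentration feature, a quadratic (Mahalanobis-style) distance classifier, and a coordinate-band check for the water level. This is a minimal illustration under assumed interfaces; all function and class names (`otsu_threshold`, `rgb_histogram`, `QuadraticDistanceClassifier`, `water_level_ok`) are hypothetical, not taken from the paper.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum_count = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0 = cum_count[t - 1] / total          # weight of the background class
        w1 = 1.0 - w0                          # weight of the foreground class
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mean[t - 1] / cum_count[t - 1]
        mu1 = (cum_mean[-1] - cum_mean[t - 1]) / (total - cum_count[t - 1])
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

def rgb_histogram(img, mask, bins=8):
    """Concatenated, normalized R, G, B histograms over the segmented pixels."""
    feats = []
    for c in range(3):
        h, _ = np.histogram(img[..., c][mask], bins=bins, range=(0, 256))
        feats.append(h / max(h.sum(), 1))
    return np.concatenate(feats)

class QuadraticDistanceClassifier:
    """Assign the class whose Mahalanobis distance to the feature is smallest."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_, self.inv_covs_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.means_[c] = Xc.mean(axis=0)
            # Small diagonal term keeps the covariance invertible.
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.inv_covs_[c] = np.linalg.inv(cov)
        return self

    def predict(self, X):
        preds = []
        for x in X:
            d = {c: (x - self.means_[c]) @ self.inv_covs_[c] @ (x - self.means_[c])
                 for c in self.classes_}
            preds.append(min(d, key=d.get))
        return np.array(preds)

def water_level_ok(liquid_mask, y_min, y_max):
    """Check that the topmost liquid row lies inside the accepted image band."""
    rows = np.where(liquid_mask.any(axis=1))[0]
    return rows.size > 0 and y_min <= rows[0] <= y_max
```

In use, each bottle image would be segmented with `otsu_threshold`, its masked pixels summarized by `rgb_histogram`, and the resulting feature vector classified against reference samples from the database, while `water_level_ok` flags under- or over-filled bottles from the segmentation mask alone.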
Keywords: Automatic visual inspection · Color classification · Internet Protocol (IP) · Level analysis · Otsu's method · Quadratic distance classifier · Red green blue (RGB) color
The authors would like to thank the Universiti Teknikal Malaysia Melaka (UTeM), the UTeM Zamalah Scheme, Rehabilitation Engineering & Assistive Technology (REAT) under the Center for Robotics & Industrial Automation (CeRIA), the Advanced Digital Signal Processing (ADSP) Research Laboratory, and the Ministry of Higher Education (MOHE), Malaysia, for sponsoring this work under project GLuar/STEVIA/2016/FKE-CeRIA/l00009 and for the use of the existing facilities to complete this project.