Shop floor data-driven spatial–temporal verification for manual assembly planning
Driven by the increasing demand for highly customized products, accurate and up-to-date information about the manufacturing process has become essential. In manual assembly, theoretical planning in simulation environments is a crucial step for detecting and avoiding unreasonable assembly operations. However, deviations between theoretical and actual assembly actions can cause the manual assembly plan to fail. Verification of the manual assembly plan is therefore essential to ensure the correctness of the actual assembly operations, bridging the cyber and physical worlds. The challenge lies in retrieving and utilizing actual data about manual activities on the shop floor. In this paper, a self-contained wearable tracking system is proposed and applied to collect shop-floor data during manual assembly operations. An unsupervised classification method is then applied to attach semantic knowledge to the shop-floor data derived from the workplace. On this basis, an automatic spatial–temporal verification of the manual assembly plan is carried out, providing indicators for optimizing the current plan. Experimental results show that the proposed approach can perform spatial–temporal verification of manual assembly tasks and provide objective evidence for improving the manual assembly plan.
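The abstract does not detail the unsupervised classification step, but the general idea — segmenting a wearable sensor stream into windows and clustering the windows into unlabeled activity classes — can be illustrated with a minimal sketch. The windowed mean/std features, the k-means clustering, and the synthetic two-phase signal below are illustrative assumptions, not the authors' actual method or data:

```python
import numpy as np

def window_features(signal, win=50):
    """Split a 1-D sensor stream into fixed-size windows and extract
    simple statistical features (mean, std) per window."""
    n = len(signal) // win
    windows = signal[:n * win].reshape(n, win)
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

def kmeans(X, k=2, iters=20):
    """Minimal k-means: assign each feature vector to one of k
    unlabeled activity clusters (deterministic spread-out init)."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Distance of every window to every cluster center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Synthetic stream: a low-motion phase followed by a high-motion phase.
rng = np.random.default_rng(1)
stream = np.concatenate([rng.normal(0.0, 0.05, 500),
                         rng.normal(1.0, 0.40, 500)])
X = window_features(stream)          # 20 windows, 2 features each
labels, _ = kmeans(X, k=2)           # unlabeled activity classes
```

Once windows carry cluster labels, semantic meaning (e.g., "reach", "fasten", "idle") could be attached to each cluster and the observed sequence compared against the planned one; how the paper maps clusters to semantics is not specified here.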
Keywords: Spatial–temporal verification · Manual assembly planning · Data-driven · Shop floor · Wearable system
The authors gratefully acknowledge the support of the National Key Research and Development Program of China (No. 2018YFB1701602), the Defense Industrial Technology Development Program (JCKY2018601C011), the Fundamental Research Funds for the Central Universities (2019RC26), and the Beijing Engineering Research Center of Pose Intelligent Equipment. We also thank Xu Jiaxing for preparing the manual assembly experiments.