Abstract
Human identification has been a prominent area in the fields of computer vision and artificial intelligence. In this paper, a novel human identification method is proposed based on the Cubic Bezier Curve (CBC) and statistical techniques applied to 3D joint movement data. Data are acquired from a motion capture system that provides accurate 3D motion information of the body joints. Such data has unique properties that distinguish it from images and videos. A simple Kalman filter is used to remove noise from the data, guided by smoothness and compactness criteria. The features of one upper joint (shoulder) and three lower joints (hip, knee and ankle) are computed using statistical moments and serve as the control points of the curve. The curve passes through these control points and describes the relationship among joint muscles during human walking. Statistical techniques are then applied to the CBC coordinates for human recognition. The joint rotation angle data is extracted from BioVision Hierarchical (BVH) data, because these four joints provide discriminative information for identification through the gait curve. The performance of our method is evaluated on the CMU database, achieving a 100% identification accuracy on the proposed dataset.
1 Introduction
The importance of reliable human identification lies in the fact that humans are extremely difficult to identify and track: among the many challenges are large variations in human posture, body size and activity. Human recognition through walking is a biometric that aims to recognize people by physiological or behavioural characteristics. More formal treatments cover biometrics [1], joint analysis [42], body segment parameters [12] and the range of human body joint movement [38]. The need for computerized person identification is increasing in many applications [13, 24]. Within human identification, a first problem of interest is gender identification. Researchers in gender recognition have focused on image processing techniques using face images [30, 35, 42] and gait appearance [22, 33, 44]. For multi-view gait, a covariate analysis method [32] has been used for human identification. Other approaches identify gender from speakers' voices [17, 40]. Many further works on human identification follow different approaches, such as spatio-temporal methods [34], three-mode PCA [11], integrated face and gait recognition from multiple views [10], joint kinematic and kinetic data [14], gait signatures [15], gait energy [16], matrix representation [43], moving light displays (MLD) [25] and other gait-based methods [7, 9, 19, 23, 26, 40, 41].
Inspired by the aforementioned research, which applied a variety of methods to video databases for gender recognition and human identification, we decided to use optical motion data (3D motion data) for human identification. With the explosive growth of motion capture data over the last few decades, interest in human motion understanding, analysis and synthesis has grown steadily. Consequently, the demand for high-quality motion capture data and for new applications based on consumer electronic devices is rising day by day.
Marker-based optical motion capture has become a standard technique for recording human motions for animation, computer games, medical studies and sports. As a result, there is a growing body of high-quality motion capture data available for scientific research [8, 31, 36]. On the other hand, human identification from motion capture data (BVH files) is still an active line of research. Numerous studies have used motion capture data for different tasks [27, 28, 32, 36, 45]. For example, as shown in Fig. 1, Hong et al. [20] used the 3D motion of human body joints and employed PCA to identify the person, and in [19] they demonstrated PCA1 and PCA2 on 3D motion data for the same purpose. These methods, however, used full-body joint data. Razali et al. [37] identified the person using motion capture data of the gait joints instead of the full body. Josinski et al. [26] demonstrated multiple techniques for human identification, using multilinear principal component analysis to reduce the high-dimensional data. Motion capture data comes in several file formats; one of them is the BVH format [29], which is used in our experiments. It includes the position of the root and the orientation of the other joints, and can be produced by a motion capture system and its practical implementation [39], as shown in Fig. 1. A BVH file has two parts: a header that describes the hierarchy and initial pose of the skeleton, and a second part that holds the channel data for each frame; Fig. 2 shows the skeleton joints of the file. In this paper, we present a method to identify humans from this format, since it contains digitized human movements. Our work is inspired by the study of Hoyet et al. [21], who evaluated the distinctiveness and attractiveness of human motions and identified differences between subjects.
We retain the gait joint information of walking persons, observing that this information varies between the joints of each person. Our approach, based on the Cubic Bezier Curve [3, 6] and statistical techniques [5], is applied to four joints (shoulder, hip, knee and ankle).
The features of the human body joints are selected by choosing a definite number of coordinate points from the 3D motion data. First, the features of each coordinate of the joint data are normalized. These features are used as data points, and the cubic Bezier curve is drawn through them by interpolation. The sampled coordinates are used to compute the variance between joint movements and the correlation between joints, using the ratio of variances over a pair of walks of the same person. Identification is achieved by matching the ratio values of the combined joint variance against threshold values stored in the database.
2 System Overview
In our method, human recognition is performed on 3D gait joint motion data captured during walking. We select four important joints (shoulder, hip, knee and ankle) and calculate the mean of each coordinate of each joint's 3D motion data. The mean coordinate points are interpolated to generate a cubic Bezier curve, and the points on this curve are then used to calculate the mean and the variance. The workflow of the proposed method is shown in Fig. 4.
3 Proposed Database
The CMU motion capture database [2] was built mainly to provide a source of motion data for animation and other research areas [5, 36]. The database contains 2605 motion clips of full-body MoCap data, performed by a total of 144 subjects (a few subjects are the same person). Here we consider simple walking motion data of seven subjects, prepared by following the procedure in [4]. Additionally, a simple Kalman filter is used to remove noise from the optical motion capture data, as illustrated in Fig. 3, where label A shows the data before filtering and label B the data after filtering.
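The denoising step above can be sketched with a minimal one-dimensional Kalman filter applied to a single channel (e.g. one joint angle over time). The process and measurement noise variances `q` and `r` below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def kalman_smooth(z, q=1e-3, r=0.09):
    """Smooth a noisy 1-D channel with a constant-position Kalman filter.

    z: sequence of noisy measurements (one channel of the MoCap data).
    q: assumed process-noise variance; r: assumed measurement-noise variance.
    """
    x_hat = float(z[0])          # state estimate
    p = 1.0                      # estimate covariance
    out = np.empty(len(z), dtype=float)
    for i, meas in enumerate(z):
        p = p + q                        # predict step (state is assumed static)
        k = p / (p + r)                  # Kalman gain
        x_hat = x_hat + k * (meas - x_hat)   # update with new measurement
        p = (1.0 - k) * p
        out[i] = x_hat
    return out
```

With small `q` relative to `r`, the filter trades a slight lag for substantially reduced jitter, which matches the "smooth and compact" behaviour described above.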
4 Workflow of Proposed Algorithm
The flowchart contains three units, as shown in Fig. 4: the first is the pre-processing unit, the second is the calculation unit and the third is the recognition unit.
4.1 Preprocessing Unit
In this unit, we follow the same procedure for the preprocessing unit that has been described in [4].
4.2 Calculation Unit
The calculation unit has two steps. The first step extracts the features of the concerned joints (shoulder, hip, knee and ankle); the second applies these features as data points through which the cubic Bezier curve passes. The variance of each coordinate of the curve is calculated, and then the mean of these variances is computed.
4.3 Recognition Unit
Finally, the output of the calculation unit is used as input to the recognition unit. It contains two main components: a detector and a database. The detector takes the Id_value, compares it with the Id values stored in the database, and returns the result of human identification.
5 Proposed Method for Calculation
As mentioned above, we are interested in human recognition. For this we use the 3D motion data of the relevant joints of the subjects during walking. The quantities used to measure recognition are based on the cubic Bezier curve and statistical techniques applied to joint movement. The calculation proceeds in the following steps.
5.1 Feature Extracting for Bezier Curve
The original 3D motions of a walking person are obtained from the motion capture system. A motion file consists of N frames and 96 channels. Let a walking motion file have frames \(t \in \{1,\dots,N\}\), each frame \(f^t\) holding the x, y and z coordinates of all skeleton nodes. This study uses the original skeleton joint (node) data. The selected joints (shoulder, hip, knee and ankle) of the skeleton are shown in Fig. 2. The data of these joints are extracted from the BVH file and normalized by the following objective function:
where \(i\in \{x,y,z\}\), J is joint position.
Here \(t=1,2,\dots,N\), where N is the number of frames in the motion data; \(\theta _x\), \(\theta _y\) and \(\theta _z\) are the normalized features of the shoulder joint, and \(x_{shu}\), \(y_{shu}\) and \(z_{shu}\) denote the coordinate values of the shoulder joint. Eq. (1) is used to normalize the shoulder joint features, denoted by \(P_0(\theta _x,\theta _y,\theta _z)\). Similarly, we compute the features of the other three joints: the hip joint is denoted by \(P_1(\theta _x,\theta _y,\theta _z)\), the knee joint by \(P_2(\theta _x,\theta _y,\theta _z)\) and the ankle joint by \(P_3(\theta _x,\theta _y,\theta _z)\). These points are called the features of the joints and are further used as the control points of the curve. \(S_p\) is defined as the collection of features of a walking sequence:
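The per-joint feature extraction above can be sketched as follows. Since Eq. (1) is not reproduced here, a per-coordinate min-max scaling of the temporal mean is assumed purely for illustration:

```python
import numpy as np

def joint_features(traj):
    """Compute a normalized (theta_x, theta_y, theta_z) feature for one joint.

    traj: (N, 3) array of one joint's x, y, z values over N frames.
    Assumption: the temporal mean of each coordinate, min-max scaled over
    the sequence; the paper's exact normalization may differ.
    """
    traj = np.asarray(traj, dtype=float)
    lo, hi = traj.min(axis=0), traj.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero
    return (traj.mean(axis=0) - lo) / span   # normalized feature point
```

Applying this to the shoulder, hip, knee and ankle trajectories yields the four control points \(P_0,\dots,P_3\).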
5.2 Bezier Curve
A Bezier curve r(t) of degree n is defined in terms of a set of control points \(P_i\;(i=0,1,\dots,n)\) and is due to Bezier [3, 6]; it is widely used in the computer graphics community for its solid mathematical foundation and intuitive manipulation. The Bezier curve is given by Eq. (2).
where \(p_i\) are control points, \(B_{i,n}\) are blending functions called Bernstein polynomials, \(t\in [0,1]\) and is defined as
Eqs. (3), (4) and (5) represent the curve parametrically as \(r(t)=( x(t),y(t),z(t) )\),
where \(0\le t \le 1\). We construct the 3D cubic Bezier curve of the walking person by computing the x, y and z coordinates using Eqs. (3), (4) and (5). This yields the curve passing through the four joints (\(p_0=shoulder\), \(p_1=hip\), \(p_2=knee\) and \(p_3=ankle\)). An example of a subject's cubic Bezier curve can be seen in Fig. 5.
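Eq. (2) with the Bernstein basis can be evaluated directly. The sketch below works for any degree, including the cubic case (n = 3) used here:

```python
import numpy as np
from math import comb

def bezier(points, t):
    """Evaluate a degree-n Bezier curve r(t) = sum_i B_{i,n}(t) * p_i.

    points: (n+1, d) array of control points; t: scalar or array in [0, 1].
    Returns an (len(t), d) array of curve points.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts) - 1
    t = np.atleast_1d(np.asarray(t, dtype=float))
    # Bernstein basis B_{i,n}(t) = C(n, i) * t^i * (1 - t)^(n - i)
    basis = np.stack([comb(n, i) * t**i * (1 - t)**(n - i)
                      for i in range(n + 1)])
    return basis.T @ pts
```

Sampling `bezier(P, np.linspace(0, 1, s))` with the four joint control points gives the s curve points used in the later variance computations.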
Suppose that the shoulder location is at \(J_0\), hip location at \(J_1\), knee location at \(J_2\) and ankle location \(J_3\). Therefore, the control points of the Bezier curve are computed as following:
The control points \(p_1\) and \(p_2\) are computed so that the curve passes through \(J_1\) and \(J_2\). The parameter \(t_1\) corresponds to \(J_1\) and is computed as:
and the parameter \(t_2\) corresponds to \(J_2\), computed as:
Therefore,
Solving the linear equations simultaneously, we obtain the values of \(p_1\) and \(p_2\).
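The construction above reduces to a small 2x2 linear solve per coordinate. A chord-length choice of \(t_1\) and \(t_2\) is assumed below for illustration, since the parameter formulas are not reproduced here; the paper's exact choice may differ:

```python
import numpy as np

def fit_cubic_through(J0, J1, J2, J3):
    """Find p1, p2 so the cubic Bezier with p0=J0, p3=J3 passes through
    J1 at t1 and J2 at t2 (chord-length parameters, an assumption)."""
    J = np.asarray([J0, J1, J2, J3], dtype=float)
    d = np.linalg.norm(np.diff(J, axis=0), axis=1)   # chord lengths
    t1 = d[0] / d.sum()
    t2 = (d[0] + d[1]) / d.sum()

    def bern(t):  # cubic Bernstein basis
        return np.array([(1 - t)**3, 3*t*(1 - t)**2, 3*t**2*(1 - t), t**3])

    b1, b2 = bern(t1), bern(t2)
    # Interpolation conditions r(t1) = J1, r(t2) = J2, unknowns p1, p2
    A = np.array([[b1[1], b1[2]], [b2[1], b2[2]]])
    rhs = np.stack([J[1] - b1[0]*J[0] - b1[3]*J[3],
                    J[2] - b2[0]*J[0] - b2[3]*J[3]])
    p12 = np.linalg.solve(A, rhs)
    return np.vstack([J[0], p12, J[3]])   # control points p0..p3
```

The returned control points define a cubic that interpolates all four joint locations.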
5.3 Computing 3D Cubic Bezier Curve
After obtaining the curve, a group of points on it can be calculated. These sampling points are used to compute the variance [12, 18]. Here \(cx_{joint}\), \(cy_{joint}\) and \(cz_{joint}\) are the x, y and z coordinates of the curve, as shown in Fig. 6.
The mean, and then variance of each coordinate of the curve is computed by using the following objective functions:
where \(L = 1,2,\dots ,s\) and s is the total number of coordinate values on the curve; \(cx_{joint}\), \(cy_{joint}\) and \(cz_{joint}\) are the means, and \(cx_L\), \(cy_L\) and \(cz_L\) are the L-th coordinates of the curve. Next, we compute the variation of each coordinate of the motion curve as
\(\delta _x\), \(\delta _y\) and \(\delta _z\) are the variances; cx, cy and cz are the L-th coordinates of the curve, and their average value is computed as:
f(xyz) is the average variation of each coordinate of the human temporal motion curve. We use this value to compute the threshold value for human identification.
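The per-coordinate variance and its average f(xyz) can be sketched directly; a population variance is assumed here:

```python
import numpy as np

def curve_variation(samples):
    """Compute (delta_x, delta_y, delta_z) and their average f(xyz).

    samples: (s, 3) array of points sampled on the cubic Bezier curve.
    """
    c = np.asarray(samples, dtype=float)
    deltas = c.var(axis=0)            # variance of each coordinate
    return deltas, deltas.mean()      # per-axis variances and f(xyz)
```

Feeding in the sampled curve points of one walk yields the single scalar variation value used in the replication step that follows.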
5.4 Computing the Optimized Curve
Replication is a statistical technique used to minimize the extraneous variation in an experiment. While performing the experiment we collected several replicates (each subject walks a couple of times) to increase the precision of the estimates, and the cubic Bezier curve is calculated for each walk using the definition of Eq. (6). If a person walks k times, k motion curves are generated; these curves can be seen in Fig. 7.
The variation of each coordinate of each curve is calculated, and then the average variation value of each curve is computed.
Let \(v_1,v_2,\dots ,v_k\) be the average variation values of the curves, computed using Eq. (13). The grand average of all values is computed from \(v_\lambda \), \(\lambda =1,2,\dots , k\), which gives the overall variation during walking. It is expressed by
We know that the subject walked k times, generating k CBC curves, each with control points passing through the joints (shoulder, hip, knee and ankle). The control points of the k curves are arranged in a matrix M:
The matrix M can be rewritten in equation form as
Here R denotes the number of curve control points (the number of joints). The optimized control points are computed from the corresponding points of each curve by the following objective function:
where OCP (Optimized Control Points) is a stack of control points, with \(op_w\in OCP\) and \(op_w\in (c_x,c_y,c_z)\), used to construct the curve. Applying Eq. (2) to the OCP stack, we construct a 3D optimal walking curve, written as
It has x, y and z coordinates, represented as ox, oy and oz in 3D space, as illustrated in Fig. 7.
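One plausible reading of the optimization step above is an element-wise average of corresponding control points across the k replicate curves; the sketch below makes that assumption explicit:

```python
import numpy as np

def optimized_control_points(curves):
    """Average corresponding control points across k replicate curves.

    curves: (k, 4, 3) array, the four control points of each of the
    k walking curves. Returns the (4, 3) OCP stack.
    Assumption: the paper's objective is read here as a per-point mean.
    """
    M = np.asarray(curves, dtype=float)
    return M.mean(axis=0)   # optimized control points (OCP)
```

The resulting OCP stack is then fed back through Eq. (2) to produce the single optimal walking curve.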
5.5 Computing Gait Signature
The optimal curve alone is not sufficient for computing the gait signature, because it holds only skeleton joint information and no temporal information; using it directly for the signature would cause confusion. We therefore embed a time-weighting factor and compute the variance of each coordinate so that it carries the temporal information of the motion. The first data value of the curve is given time factor 2 and the last value time factor 3, with the time factor increasing linearly between 2 and 3. The Time Factor Variance (TFV) along x for the joint motion curve is computed by Eq. (15).
Here ox(t) is the x coordinate of the joint motion curve at time t and \(\tau (t)\) is the time factor given to each curve value, calculated as
where ct is the current frame time of the corresponding curve value, \(t_o\) indicates the first value on the curve and \(t_e\) the last. The same procedure is carried out for the y and z coordinates to compute their TFVs, denoted \(v_{oy}\) and \(v_{oz}\) respectively and collected as \(\Phi (v)=\{v_{ox},v_{oy},v_{oz}\}\).
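The time-weighted variance can be sketched as follows. The linear ramp of \(\tau\) from 2 to 3 follows the text; treating the weighted variance as a normalized weighted second moment is an assumption, since Eq. (15) is not reproduced here:

```python
import numpy as np

def time_factor_variance(ox):
    """Time Factor Variance along one coordinate of the optimal curve.

    ox: 1-D array of curve values ordered in time. tau(t) grows linearly
    from 2 (first value) to 3 (last value), i.e. tau = 2 + (ct - t_o)/(t_e - t_o).
    """
    ox = np.asarray(ox, dtype=float)
    tau = np.linspace(2.0, 3.0, len(ox))   # linear time factor
    w = tau / tau.sum()                    # normalized weights (assumption)
    mu = np.sum(w * ox)                    # time-weighted mean
    return np.sum(w * (ox - mu) ** 2)      # time-weighted variance
```

Running this on ox, oy and oz gives the triple \(\Phi(v)\) used for the maximum influence factor.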
We then determine the maximum influence variation factor of the optimal curve during walking, written as
5.6 Typical Gait Signature
Although we have extracted core values during walking, we still need correlated typical values for each subject for identification. We therefore compute the variation between the correlated joint factor values during walking, called the typical gait signature; its computation uses Eqs. (13), (14) and (16). It is written as
The notation \(T_{pg}(id)\) is the typical identification signature value of the human subject. It is computed as the cross-relationship ratio between single and repeated walks, yielding a unique identification among the joints (shoulder, hip, knee and ankle). It is stored in the local database so that it can be used for identification. \(T_{pg}(id)\) is an improved identification value of the human subject, obtained by removing extraneous variations and incorporating the optimized curve variation among the selected body joints.
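The matching step in the recognition unit can be sketched as a nearest-signature lookup. The tolerance used below is purely illustrative; the paper stores threshold values in the database rather than specifying a fixed tolerance:

```python
def identify(t_pg, database, tol=0.05):
    """Match a typical gait signature against stored signatures.

    database: dict mapping subject id -> stored T_pg value.
    The nearest stored value within the (assumed) tolerance wins;
    otherwise the subject is reported as unknown (None).
    """
    best_id, best_err = None, float("inf")
    for sid, ref in database.items():
        err = abs(t_pg - ref)
        if err < best_err:
            best_id, best_err = sid, err
    return best_id if best_err <= tol else None
```

This mirrors the detector of Sect. 4.3, which compares the incoming Id_value against the stored Id values and returns the identification result.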
6 Experimental Results
To assess the performance of our approach, which uses statistical techniques and the cubic Bezier curve for people recognition, we ran a set of walking examples from the standard CMU MoCap database [2], using a total of 37 walking motion examples. Figure 8 shows the optimized curve values of each individual, computed from Eq. (18); its X axis gives the number of subjects and its Y axis the optimized curve values. Figure 9 illustrates the result of 5 motion examples from single walks of the subjects; its X axis is the same, but its Y axis denotes the typical signature values. For a more thorough evaluation, we tested the remaining 32 walking examples of 6 different people and achieved reliable accuracy; the graphical and confusion-matrix representations can be seen in Figs. 9 and 11. Figure 10 differs from Figs. 8 and 9 in that its X axis represents pairs of walking examples of different subjects, with signature values on the Y axis. The results show exactly 100% human recognition accuracy on our dataset, as reported in Table 1, whose first column gives the subject number, the second the subject name as in the CMU database, the third how many times the subject walked and the last the identification accuracy in percent. Comparisons of the proposed method with other existing approaches are given in Table 2.
7 Conclusion and Future Work
In this paper, we proposed a novel human identification method based on statistical methods and the cubic Bezier curve applied to three-dimensional human joint data. The results of our study suggest that human recognition through the gait joints (hip, knee and ankle) and one upper joint (shoulder) is reliable. To the best of our knowledge, we are the first to use this representation for human recognition; earlier researchers used it for animation, motion retargeting, and motion analysis and synthesis, but not for identification. On our data, the experimental results show a 100% recognition accuracy, so it is now possible to extract considerable information about human walking by studying only these four joints. We have presented a simple technique for human recognition from 3D motion data based on statistical methods and the cubic Bezier curve, and our representation is rich enough to show promising results. In future work, we would like to strengthen our results by studying a much larger database and considering additional parameters such as age, weight and height, using the same motion capture file format. We have already started to extend this work using a Kinect XBOX 360 device for identification in a real-time environment, which will reduce the cost of the system compared to a motion capture system.
References
Biometrics (2014). http://www.webopedia.com/TERM/B/biometrics.html. Accessed 10 Feb 2015
Carnegie mellon university graphics lab motion capture database (2013). https://mocap.cs.cmu.edu. Accessed 10 Feb 2015
Alan, W., Mark, W.: Advanced Animation and Rendering Techniques. Addison-Wesley, Reading (1992)
Ali, S., Mingquan, Z., Zhongke, W., Razzaq, A., Hamada, M., Ahmed, H.: Comprehensive use of hip joint in gender identification using 3-dimension data. TELKOMNIKA Indonesian J. Electr. Eng. 11(6), 2933–2941 (2013)
Anthony, D.: Statistics for Health, Life and Social Sciences. BookBoon (2011)
Be, P., et al.: Numerical Control-Mathematics and Applications. Wiley, London (1972)
BenAbdelkader, C., Cutler, R., Davis, L.: Motion-based recognition of people in eigengait space. In: Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 267–272. IEEE (2002)
Chai, Y., Ren, J., Zhao, R., Jia, J.: Automatic gait recognition using dynamic variance features. In: 7th International Conference on Automatic Face and Gesture Recognition, FGR 2006, pp. 475–480. IEEE (2006)
Chen, C., Liang, J., Zhao, H., Hu, H., Tian, J.: Frame difference energy image for gait recognition with incomplete silhouettes. Pattern Recogn. Lett. 30(11), 977–984 (2009)
Dai, H., Cai, B., Song, J., Zhang, D.: Skeletal animation based on bvh motion data. In: 2010 2nd International Conference on Information Engineering and Computer Science (ICIECS), pp. 1–4. IEEE (2010)
Davis, J.W., Gao, H.: Gender recognition from walking movements using adaptive three-mode PCA. In: Conference on Computer Vision and Pattern Recognition Workshop, CVPRW 2004, pp. 9–9. IEEE (2004)
Drillis, R., Contini, R., Bluestein, M.: Body segment parameters. Artif. Limbs 8(1), 44–66 (1964)
Golomb, B.A., Lawrence, D.T., Sejnowski, T.J.: Sexnet: a neural network identifies sex from human faces. In: NIPS, pp. 572–579 (1990)
Growney, E., Meglan, D., Johnson, M., Cahalan, T., An, K.N.: Repeated measures of adult normal walking using a video tracking system. Gait Posture 6(2), 147–162 (1997)
Guerra-Filho, G., Biswas, A.: The human motion database: a cognitive and parametric sampling of human motion. Image Vis. Comput. 30(3), 251–261 (2012)
Han, J., Bhanu, B.: Individual recognition using gait energy image. IEEE Trans. Pattern Anal. Mach. Intell. 28(2), 316–322 (2006)
Harb, H., Chen, L.: Gender identification using a general audio classifier. In: Proceedings of the 2003 International Conference on Multimedia and Expo, ICME 2003, vol. 2, pp. II-733. IEEE (2003)
Hayfron-Acquah, J.B., Nixon, M.S., Carter, J.N.: Automatic gait recognition by symmetry analysis. Pattern Recogn. Lett. 24(13), 2175–2183 (2003)
Hong, J., Kang, J., Price, M.E.: Gait analysis and identification. In: 2012 18th International Conference on Automation and Computing (ICAC), pp. 1–6. IEEE (2012)
Hong, J., Kang, J., Price, M.E.: Extraction of bodily features for gait recognition and gait attractiveness evaluation. Multimedia Tools Appl. 71(3), 1999–2013 (2014)
Hoyet, L., Ryall, K., Zibrek, K., Park, H., Lee, J., Hodgins, J., O’Sullivan, C.: Evaluating the distinctiveness and attractiveness of human motions on realistic virtual bodies. ACM Trans. Graph. (TOG) 32(6), 204 (2013)
Hu, M., Wang, Y.: A new approach for gender classification based on gait analysis. In: Fifth International Conference on Image and Graphics, ICIG 2009, pp. 869–874. IEEE (2009)
Huang, X., Boulgouris, N.V.: Gait recognition using multiple views. In: IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2008, pp. 1705–1708. IEEE (2008)
Jain, A., Huang, J.: Integrating independent components and linear discriminant analysis for gender classification. In: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 159–163. IEEE (2004)
Johansson, G.: Visual perception of biological motion and a model for its analysis. Percept. Psychophys. 14(2), 201–211 (1973)
Josinski, H., Switonski, A., Jedrasiak, K., Daniel, K.: Human identification based on gait motion capture data. In: Proceeding of the International MultiConference of Engineers and Computer Scientists, vol. 1 (2012)
Lou, H., Chai, J.: Example-based human motion denoising. IEEE Trans. Visual. Comput. Graph. 16(5), 870–879 (2010)
Lu, W., Liu, Y., Sun, J., Sun, L.: A motion retargeting method for topologically different characters. In: Sixth International Conference on Computer Graphics, Imaging and Visualization, CGIV 2009, pp. 96–100. IEEE (2009)
Meredith, M., Maddock, S.: Motion capture file formats explained. Technical report CS-01-11, Department of Computer Science, University of Sheffield, pp. 241–244 (2001)
Moghaddam, B., Yang, M.H.: Learning gender with support faces. IEEE Trans. Pattern Anal. Mach. Intell. 24(5), 707–711 (2002)
Müller, M., Röder, T., Clausen, M., Eberhardt, B., Krüger, B., Weber, A.: Documentation mocap database hdm05 (2007)
Ng, H., Tan, W.H., Abdullah, J.: Multi-view gait based human identification system with covariate analysis. Int. Arab J. Inf. Technol. 10(5), 519–526 (2013)
Nixon, M.S., Tan, T., Chellappa, R.: Human Identification Based on Gait, vol. 4. Springer, Verlag (2006)
Niyogi, S.A., Adelson, E.H.: Analyzing and recognizing walking figures in XYT. In: Proceedings of the 1994 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 1994, pp. 469–474. IEEE (1994)
Pal, S., Biswas, P., Abraham, A.: Face recognition using interpolated bezier curve based representation. In: Proceedings of the International Conference on Information Technology: Coding and Computing, ITCC 2004, vol. 1, pp. 45–49. IEEE (2004)
Qi, T., Feng, Y., Xiao, J., Zhuang, Y., Yang, X., Zhang, J.: A semantic feature for human motion retrieval. Comput. Animation Virtual Worlds 24(3–4), 399–407 (2013)
Razali, N.S., Manaf, A.: Gait recognition using motion capture data. In: 2012 8th International Conference on Informatics and Systems (INFOS), pp. MM-67. IEEE (2012)
Roach, K.E., Miles, T.P.: Normal hip and knee active range of motion: the relationship to age. Phys. Ther. 71(9), 656–665 (1991)
Roetenberg, D., Luinge, H., Slycke, P.: Xsens MVN: full 6DOF human motion tracking using miniature inertial sensors. Technical report, Xsens Motion Technologies BV (2009)
Shutler, J.D., Nixon, M.S., Harris, C.J.: Statistical gait description via temporal moments. In: Proceedings of the 4th IEEE Southwest Symposium Image Analysis and Interpretation, pp. 291–295. IEEE (2000)
Weimin, X., Ying, L., Hongzhe, H., Lun, X., ZhiLiang, W., et al.: New approach of gait recognition for human ID. In: Proceedings of the 2004 7th International Conference on Signal Processing, ICSP 2004, vol. 1, pp. 199–202. IEEE (2004)
Whittle, M.W.: Gait Analysis: An Introduction. Butterworth-Heinemann, Oxford (2014)
Xu, D., Yan, S., Tao, D., Zhang, L., Li, X., Zhang, H.J.: Human gait recognition with matrix representation. IEEE Trans. Circ. Syst. Video Technol. 16(7), 896–903 (2006)
Yu, S., Tan, T., Huang, K., Jia, K., Wu, X.: A study on gait-based gender classification. IEEE Trans. Image Proces. 18(8), 1905–1910 (2009)
Zordan, V.B., Van Der Horst, N.C.: Mapping optical motion capture data to skeletal motion using a physical model. In: Proceedings of the 2003 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 245–250. Eurographics Association (2003)
Acknowledgment
This research is supported by the National Natural Science Foundation of China under Grant No. 61170203, 61170170, The National Key Technology Research and Development Program of China under Grant No. 2012BAH33F04, Beijing Key Laboratory Program of China under Grant No. Z111101055281056.
© 2015 Springer International Publishing Switzerland
Ali, S., Wu, Z., Li, X., Idrees, M., Kang, W., Zhou, M. (2015). A Novel Method: 3-D Gait Curve for Human Identification. In: Zhang, YJ. (eds) Image and Graphics. ICIG 2015. Lecture Notes in Computer Science(), vol 9217. Springer, Cham. https://doi.org/10.1007/978-3-319-21978-3_29