
1 Introduction

Reliable human identification is important because human traits are difficult to identify and track; among the main challenges are large variations in human posture, body size and activity. Human recognition through gait is a newer biometric that aims to recognize people from the physiological or behavioural individuality of a human. More formal treatments exist of biometrics [1], joint analysis [42], body segment parameters [12] and the range of movement of human body joints [38]. The need for computerized person identification is increasing in many applications [13, 24]. Within the human identification area, a first problem of interest is gender identification: researchers in gender recognition have focused on image processing techniques applied to face images [30, 35, 42] and gait appearance [22, 33, 44]. In the study of multi-view gait, a covariance analysis method [32] has been used for human identification, and other approaches identify gender from speakers' voices [17, 40]. Beyond this, many research works have pursued human identification with different approaches, such as the spatio-temporal method [34], three-mode PCA [11], integrated face and gait recognition from multiple views [10], joint kinematic and kinetic data [14], gait signatures [15], gait energy [16], matrix representation [43], moving light displays (MLD) [25] and other gait-based methods [7, 9, 19, 23, 26, 40, 41].

Motivated by the research above, which applies different methods to different video databases for gender recognition and human identification, we decided to use optical motion capture data (3D motion data) for human identification. With the explosive growth of motion capture data over the last few decades, interest in human motion understanding, analysis and synthesis has increased steadily, and the demand for high-quality motion capture data and for new applications based on various consumer electronic devices is growing day by day.

The marker-based optical motion capture method has become a standard technique for recording human motions for animation, computer games, medical studies and sports. As a result, there is a growing body of high-quality motion capture data available for scientific research [8, 31, 36]. Human identification from motion capture data (BVH files), however, is still an active line of research. Numerous studies have used motion capture data for different tasks [27, 28, 32, 36, 45]. For example, Hong et al. [20] used the 3D motion of human body joints and employed PCA to identify a person, and in [19] they applied PCA1 and PCA2 to 3D motion data for identification; these methods, however, used full-body joint data. Razali et al. [37] identified a person using motion capture data of the gait joints instead of the full body. Josinski et al. [26] demonstrated multiple techniques for human identification, using multilinear principal component analysis to reduce the high-dimensional data. Motion capture data comes in several file formats; one of them is the BVH format [29], which is used in our experiments. It stores the position of the root and the orientation of the other joints, and it can be produced by a motion capture system; practical applications are shown in Fig. 1 [39]. A BVH file has two parts: a header that describes the hierarchy and initial pose of the skeleton, and a data part that gives the channel data for each frame; Fig. 2 shows the skeleton joints of the file. In this paper, we present a method to identify humans from this format, since it digitizes human movements. Our work is inspired by the study of Ludovic [21], which evaluated the distinctiveness and attractiveness of human motions and identified differences between subjects.
We use the gait joint information of walking persons, on the premise that this information varies between subjects. Our approach, based on the cubic Bezier curve [3, 6] and statistical techniques [5], is applied to four joints (shoulder, hip, knee and ankle).

Fig. 1.

Practical uses of MoCap data in everyday life.

Fig. 2.

Humanoid skeleton model.

The features of the human body joints are selected by choosing a definite number of coordinate points from the 3D motion data. First, the features of each coordinate of the joint data are normalized; these features are then used as data points, and a cubic Bezier curve is drawn through them by interpolation. Sampled coordinates on the curve are used to compute the variance of the joint movements and the correlation between joints, via the ratio of variances over a subject's repeated walks. Human identification is achieved by matching the combined joint-variance ratio against the threshold values stored in the database.

2 System Overview

In our method, human recognition is performed on 3D gait joint motion data recorded during walking. We select four important joints (shoulder, hip, knee and ankle) and calculate the mean coordinates of each joint's 3D motion data. These means are interpolated to generate a cubic Bezier curve, and points on the curve are then used to calculate the mean and the variance. The workflow of the proposed method is shown in Fig. 4.

3 Proposed Database

The CMU motion capture database [2] was built mainly to provide a source of motion data for animation and other research areas [5, 36]. It contains 2605 motion clips of full-body MoCap data, performed by a total of 144 subjects, a few of which correspond to the same person. Here we consider the simple walk motion data of seven subjects and prepare the data by following the procedure in [4]. In addition, a simple Kalman filter is used to remove noise from the optical motion capture data, as illustrated in Fig. 3: label A shows the data before filtering and label B the data after filtering.
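The paper does not specify the Kalman filter's model or parameters, so the following is only a minimal illustrative sketch: a 1D constant-position filter applied to one MoCap channel at a time, with hypothetical noise variances q and r of our own choosing.

```python
import numpy as np

def kalman_smooth(channel, q=1e-3, r=1e-1):
    """Smooth one MoCap channel with a 1D constant-position Kalman filter.

    channel: 1D sequence of raw values for a single joint coordinate.
    q, r: process and measurement noise variances (assumed values).
    """
    out = np.empty(len(channel), dtype=float)
    x, p = float(channel[0]), 1.0        # initial state estimate and covariance
    out[0] = x
    for t in range(1, len(channel)):
        p = p + q                        # predict: the state is assumed constant
        k = p / (p + r)                  # Kalman gain
        x = x + k * (channel[t] - x)     # update with the new measurement
        p = (1 - k) * p
        out[t] = x
    return out
```

In practice the filter would be run over every channel of the BVH motion data before feature extraction.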

Fig. 3.

An example of denoised MoCap data.

Fig. 4.

Block diagram of the proposed method.

4 Workflow of Proposed Algorithm

The flowchart in Fig. 4 contains three units: a pre-processing unit, a calculation unit and a recognition unit.

4.1 Preprocessing Unit

In this unit, we follow the same preprocessing procedure that is described in [4].

4.2 Calculation Unit

The calculation unit has two steps. The first step extracts the features of the joints of interest (shoulder, hip, knee and ankle); the second step uses these features as the data points through which the cubic Bezier curve passes. The variance of each coordinate of the curve is calculated, and then the mean of the variances is computed.

4.3 Recognition Unit

Finally, the output of the calculation unit is used as input to the recognition unit, which contains two main components: a detector and a database. The detector takes the Id_value, compares it with the Id values stored in the database, and returns the result of human identification.

5 Proposed Method for Calculation

As mentioned above, we are interested in human recognition. For this we use the 3D motion data of the relevant joints of the subjects during walking. The quantities used for recognition are based on the cubic Bezier curve and statistical techniques applied to joint movement. The calculation is carried out in the following steps.

5.1 Feature Extraction for the Bezier Curve

The original 3D motions of a walking person are obtained from the motion capture system. Each file consists of N frames and 96 channels. Let a walking motion file contain frames \(t=1,\dots ,N\), with each frame \(f^t\) holding the x, y and z coordinates of all skeleton nodes. This study uses the original skeleton joint (node) data. The selected joints (shoulder, hip, knee and ankle) of the skeleton are shown in Fig. 2. The data of these joints is extracted from the BVH file and normalized by the objective function

$$\begin{aligned} \theta _i = \frac{1}{N}\sum _{t=1}^{N} J_t^i \end{aligned}$$
(1)

where \(i\in \{x,y,z\}\) and \(J_t^i\) is the i-coordinate of the joint position in frame t.

Here \(t=1,2,\dots , N\) and N is the number of frames in the motion data. \(\theta _x\), \(\theta _y\) and \(\theta _z\) are the normalized features of a joint; for the shoulder, they are computed from the coordinate values \(x_{shu}\), \(y_{shu}\) and \(z_{shu}\). Eq. (1) thus gives the normalized shoulder feature point \(P_0(\theta _x,\theta _y,\theta _z)\). Similarly, we compute the features of the other three joints: the hip joint \(P_1(\theta _x,\theta _y,\theta _z)\), the knee joint \(P_2(\theta _x,\theta _y,\theta _z)\) and the ankle joint \(P_3(\theta _x,\theta _y,\theta _z)\). These points are called the features of the joints and are used as control points of the curve. Sp is defined as the collection of features of a walking sequence:

$$\begin{aligned} Sp = \{P_i\}^3_{i=0} \qquad where \, P_i = (\theta _x,\theta _y,\theta _z) \end{aligned}$$
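Assuming the joint trajectories have already been parsed out of the BVH file (parsing is beyond the scope of this sketch), the feature of Eq. (1) is simply a per-coordinate mean; the function name below is our own illustration, not from the paper.

```python
import numpy as np

def joint_feature(frames):
    """Normalized joint feature of Eq. (1): the per-coordinate mean of a
    joint's trajectory over all N frames.

    frames: array-like of shape (N, 3) holding the joint's x, y, z per frame.
    Returns the point (theta_x, theta_y, theta_z).
    """
    return np.asarray(frames, dtype=float).mean(axis=0)
```

Applying this to the four selected joints yields the points P0 (shoulder) through P3 (ankle) of the collection Sp.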

5.2 Bezier Curve

A Bezier curve r(t) of degree n is defined in terms of a set of control points \(P_i\) (\(i=0,1,\dots ,n\)) [3, 6]. It is widely used in the computer graphics community owing to its solid mathematical foundation and intuitive manipulation. The Bezier curve is given by Eq. (2).

$$\begin{aligned} r(t) = \sum _{i=0}^{n}B_{i,n}(t)p_i \end{aligned}$$
(2)

where \(p_i\) are the control points, \(t\in [0,1]\), and the \(B_{i,n}\) are blending functions called Bernstein polynomials, defined as

$$\begin{aligned} B_{i,n}(t) = \binom{n}{i} t^i(1-t)^{n-i} \end{aligned}$$

The curve can be represented parametrically as \(r(t)=( x(t),y(t),z(t) )\), with components given in Eq. (3):

$$\begin{aligned} x(t)&= \sum _{i=0}^{n}x_iB_{i,n}(t)\\ y(t)&= \sum _{i=0}^{n}y_iB_{i,n}(t) \\ z(t)&= \sum _{i=0}^{n}z_iB_{i,n}(t) \end{aligned}$$
(3)

where \(0\le t \le 1\). We construct the 3D cubic Bezier curve of the walking person by computing its x, y and z coordinates with Eq. (3). For \(n=3\), the curve is determined by the four joints (\(p_0=shoulder\), \(p_1=hip\), \(p_2=knee\), and \(p_3=ankle\)) and Eq. (2) reduces to Eq. (4). An example of a subject's cubic Bezier curve can be seen in Fig. 5.

$$\begin{aligned} r(t) = (1-t)^3p_0 + 3(1-t)^2tp_1 + 3(1-t)t^2p_2 + t^3p_3 \end{aligned}$$
(4)
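Eq. (4) can be evaluated directly; the following Python sketch (function name and NumPy usage are our own illustration) computes a point on the cubic curve for given control points.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate the cubic Bezier curve of Eq. (4) at parameter t in [0, 1].

    Control points are 3D points; t may be a scalar or an array of
    parameter values (the result then has one row per value of t).
    """
    p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p0, p1, p2, p3))
    t = np.asarray(t, dtype=float)[..., None]   # broadcast t over coordinates
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)
```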

Suppose that the shoulder is located at \(J_0\), the hip at \(J_1\), the knee at \(J_2\) and the ankle at \(J_3\). The control points of the Bezier curve are then computed as follows:

$$\begin{aligned} p_0 = J_0\, \quad and \quad \, p_3=J_3 \end{aligned}$$

The control points \(p_1\) and \(p_2\) are computed so that the curve passes through \(J_1\) and \(J_2\). The parameter \(t_1\) corresponding to \(J_1\) is computed by chord-length parameterization as:

$$\begin{aligned} t_1 = \dfrac{|J_0J_1|}{|J_0J_1|+|J_1J_2|+|J_2J_3|} \end{aligned}$$

and the parameter \(t_2\) corresponding to \(J_2\) is computed as:

$$\begin{aligned} t_2 = \dfrac{|J_0J_1|+|J_1J_2|}{|J_0J_1|+|J_1J_2|+|J_2J_3|} \end{aligned}$$

Therefore,

$$\begin{aligned} J_2 = r(t_2) = (1-t_2)^3p_0+3(1-t_2)^2t_2p_1+3(1-t_2)t^2_2p_2+t^3_2p_3 \nonumber \\ J_1 = r(t_1) = (1-t_1)^3p_0+3(1-t_1)^2t_1p_1+3(1-t_1)t^2_1p_2+t^3_1p_3 \nonumber \end{aligned}$$

Solving these two linear equations simultaneously yields the values of \(p_1\) and \(p_2\).
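This interpolation step can be sketched as follows, assuming the joint positions J0..J3 are 3D points: compute the chord-length parameters \(t_1\) and \(t_2\) above, then solve the resulting 2x2 linear system for \(p_1\) and \(p_2\) (the function name and NumPy usage are our own).

```python
import numpy as np

def interpolating_controls(J0, J1, J2, J3):
    """Cubic Bezier control points such that the curve interpolates J0..J3.

    p0 = J0 and p3 = J3; p1 and p2 solve the 2x2 linear system obtained
    from r(t1) = J1 and r(t2) = J2 with chord-length parameters t1, t2.
    """
    J0, J1, J2, J3 = (np.asarray(J, dtype=float) for J in (J0, J1, J2, J3))
    d01 = np.linalg.norm(J1 - J0)
    d12 = np.linalg.norm(J2 - J1)
    d23 = np.linalg.norm(J3 - J2)
    total = d01 + d12 + d23
    t1, t2 = d01 / total, (d01 + d12) / total

    def basis(t):
        # Bernstein weights of p1 and p2, plus the known p0/p3 contribution
        return (3 * (1 - t) ** 2 * t, 3 * (1 - t) * t ** 2,
                (1 - t) ** 3 * J0 + t ** 3 * J3)

    a1, b1, c1 = basis(t1)
    a2, b2, c2 = basis(t2)
    A = np.array([[a1, b1], [a2, b2]])
    rhs = np.stack([J1 - c1, J2 - c2])   # shape (2, 3)
    p1, p2 = np.linalg.solve(A, rhs)     # solved per coordinate
    return J0, p1, p2, J3
```

For equally spaced collinear joints the solve recovers the joints themselves as interior control points, as expected for a linear curve.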

Fig. 5.

Motion curve of a specific walking pose.

5.3 Computing 3D Cubic Bezier Curve

After obtaining the curve, a set of points on the curve is sampled. These sampling points are used to compute the variance [12, 18]. Here \(cx_{joint}\), \(cy_{joint}\) and \(cz_{joint}\) are the x, y and z coordinates of the curve, as shown in Fig. 6.

Fig. 6.

Example of a joint curve in 3D space.

The mean and then the variance of each coordinate of the curve are computed using the following objective functions:

$$\begin{aligned} cx_{joint} = \dfrac{1}{s}\sum _{L=1}^{s}cx_L \end{aligned}$$
(5)
$$\begin{aligned} cy_{joint} = \dfrac{1}{s}\sum _{L=1}^{s}cy_L \end{aligned}$$
(6)
$$\begin{aligned} cz_{joint} = \dfrac{1}{s}\sum _{L=1}^{s}cz_L \end{aligned}$$
(7)

where s is the total number of sampled coordinate values on the curve, \(L = 1,2,\dots ,s\), \(cx_{joint}\), \(cy_{joint}\) and \(cz_{joint}\) are the means, and \(cx_L\), \(cy_L\) and \(cz_L\) are the L-th coordinates of the curve. Next, we compute the variation of each coordinate of the motion curve as

$$\begin{aligned} (\delta _x|cx,cx_{joint}) = \dfrac{1}{s}\sum _{L=1}^{s}(cx_L-cx_{joint})^2\end{aligned}$$
(8)
$$\begin{aligned} (\delta _y|cy,cy_{joint}) = \dfrac{1}{s}\sum _{L=1}^{s}(cy_L-cy_{joint})^2\end{aligned}$$
(9)
$$\begin{aligned} (\delta _z|cz,cz_{joint}) = \dfrac{1}{s}\sum _{L=1}^{s}(cz_L-cz_{joint})^2 \end{aligned}$$
(10)

where \(\delta _x\), \(\delta _y\) and \(\delta _z\) are the variances of the sampled curve coordinates. Their average value is computed as:

$$\begin{aligned} f(xyz) = \dfrac{1}{3}\sum _{c\in \{x,y,z\}}\delta _c \end{aligned}$$
(11)

f(xyz) is the average variance of the human temporal motion curve over its coordinates. We use this value to compute the threshold value for human identification.
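The sampling and variance computation of Eqs. (8)-(11) can be sketched as follows; the sample count s and the function name are our own assumptions.

```python
import numpy as np

def curve_variance(p0, p1, p2, p3, s=100):
    """Sample s points on the cubic Bezier curve, compute the per-coordinate
    variances of Eqs. (8)-(10), and return their average f(xyz), Eq. (11)."""
    p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p0, p1, p2, p3))
    t = np.linspace(0.0, 1.0, s)[:, None]              # s parameter values
    pts = ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
           + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)  # sampled curve, (s, 3)
    means = pts.mean(axis=0)                    # cx_joint, cy_joint, cz_joint
    deltas = ((pts - means) ** 2).mean(axis=0)  # delta_x, delta_y, delta_z
    return float(deltas.mean())                 # f(xyz): average variance
```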

5.4 Computing the Optimized Curve

Replication is a statistical technique used to minimize extraneous variation in an experiment. While performing the experiment, we recorded several replicates (the subject walks a number of times) to increase the precision of the estimates, and the cubic Bezier curve is calculated for each walk. If a person walks k times, k motion curves are generated using Eq. (4). These curves can be seen in Fig. 7.

Fig. 7.

The process of constructing the optimal walking curve.

The variance of each coordinate of each curve is calculated, and then the average variance value of each curve is computed.

Let \(v_1,v_2,\dots ,v_k\) be the average variance values of the curves, computed using Eq. (11). Their grand average over \(v_\lambda \), where \(\lambda =1,2,\dots , k\), gives the overall variation during walking. It can be expressed by

$$\begin{aligned} G(v)=\dfrac{1}{k}\sum _{\lambda =1}^{k}v_\lambda \end{aligned}$$
(12)

The subject walked k times and generated k cubic Bezier curves; each curve has control points and passes through the joints (shoulder, hip, knee, and ankle). The control points of the k curves are represented in a matrix M as

$$\begin{aligned} M = \begin{pmatrix} p_{0\beta _1} & p_{1\beta _1} & \dots & p_{(R-1)\beta _1}\\ p_{0\beta _2} & p_{1\beta _2} & \dots & p_{(R-1)\beta _2}\\ \vdots & \vdots & \ddots & \vdots \\ p_{0\beta _k} & p_{1\beta _k} & \dots & p_{(R-1)\beta _k} \end{pmatrix} \in M^{k\times R} \end{aligned}$$

where row \(\beta \) (\(\beta =1,\dots ,k\)) holds the control points \(p_{0\beta },\dots ,p_{(R-1)\beta }\) of the \(\beta \)-th walk.

Here R is the number of curve control points (the number of joints). The optimized control points are computed by averaging the corresponding points of the k curves:

$$\begin{aligned} op_w = \dfrac{1}{k}\sum _{\beta =1}^{k}p_{w\beta }, \quad w=0,1,\dots ,R-1 \nonumber \\ OCP = [op_0,op_1,op_2,op_3]\nonumber \end{aligned}$$

where OCP (Optimized Control Points) is the stack of averaged control points, with \(op_w\in OCP\) and \(op_w\in (c_x,c_y,c_z)\). By applying Eq. (2) to the OCP stack, we construct a 3D optimal walking curve, written as

$$\begin{aligned} r_{new}(t) \end{aligned}$$

Its x, y and z coordinates are represented as ox, oy and oz in 3D space, as illustrated in Fig. 7.
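The averaging step that produces the OCP stack can be sketched as follows (the function name and array layout are our own assumptions); the optimal curve \(r_{new}(t)\) is then obtained by evaluating Eq. (2) on the returned control points.

```python
import numpy as np

def optimized_controls(walk_controls):
    """Average corresponding control points over k walks to obtain the
    OCP stack [op_0, op_1, op_2, op_3].

    walk_controls: array of shape (k, 4, 3), the control points of each
    walk's cubic Bezier curve. Returns an array of shape (4, 3).
    """
    return np.asarray(walk_controls, dtype=float).mean(axis=0)
```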

5.5 Computing Gait Signature

The optimal curve alone is not sufficient for computing the gait signature, because it carries only skeleton joint information and no temporal information; using it directly for the signature would be ambiguous. We therefore embed a time-weighting factor and compute the variance of each coordinate with the temporal information included. The first value of the curve is given time factor 2 and the last value time factor 3, with the factor increasing linearly from 2 to 3. The Time Factor Variance (TFV) along x for the joint motion curve is computed by Eq. (13).

$$\begin{aligned} v_{ox} = var(ox(t)\times \tau (t)) \end{aligned}$$
(13)

Here ox(t) is the x coordinate of the joint motion curve at time t, and \(\tau (t)\) is the time factor assigned to each curve value, calculated as

$$\begin{aligned} \tau (t) = 2+\dfrac{ct-t_0}{t_e-t_0} \end{aligned}$$

where ct is the current frame time of the corresponding curve value, \(t_0\) is the first value on the curve and \(t_e\) is the last. The same procedure is carried out for the y and z coordinates to compute their TFVs, denoted \(v_{oy}\) and \(v_{oz}\) respectively, and collected as \(\Phi (v)=\{v_{ox},v_{oy},v_{oz}\}\).
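The TFV of one coordinate can be sketched as follows, assuming uniformly spaced curve values so that \(\tau \) rises linearly from 2 to 3 (function name is our own).

```python
import numpy as np

def time_factor_variance(coord):
    """Time Factor Variance (Eq. (13)) of one coordinate of the optimal
    curve: each curve value is weighted by a factor rising linearly from
    2 (first value) to 3 (last value) before taking the variance."""
    coord = np.asarray(coord, dtype=float)
    tau = 2.0 + np.linspace(0.0, 1.0, len(coord))   # time factor in [2, 3]
    return float(np.var(coord * tau))
```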

Now we determine the maximum influencing variation factor of the optimal curve during walking. It is written as

$$\begin{aligned} G_{inf} = \max (\Phi (v_\mu )) \\ where \, \mu \in \{ox,oy,oz\} \nonumber \end{aligned}$$
(14)

5.6 Typical Gait Signature

Although we have extracted core values during walking, we still need the corresponding correlated typical values of a subject for identification. We therefore compute the variation between the correlated joint factor values during the walk, called the typical gait signature; its computation uses Eqs. (11), (12) and (14). It is written as

$$\begin{aligned} T_{pg}(id) = \dfrac{f(xyz)\times G(v)}{G_{inf}} \end{aligned}$$

The notation \(T_{pg}(id)\) is the typical identification signature value of the human subject. It is computed as the cross-relationship ratio between single and repeated walks, and it yields a unique identification among the joints (shoulder, hip, knee, and ankle). It is stored in the local database so that it can be used for identification. \(T_{pg}(id)\) is an improved identification value for the human subject, obtained by removing extraneous variation and incorporating the optimized curve variation among the selected body joints.
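The signature combines the three quantities computed above; a minimal sketch, assuming f(xyz), the per-walk variances and the TFVs have already been computed as in the preceding subsections (names are our own):

```python
import numpy as np

def typical_signature(f_xyz, walk_variances, tfvs):
    """Typical gait signature T_pg(id) of Sect. 5.6.

    f_xyz: average curve variance of a single walk, Eq. (11).
    walk_variances: the k per-walk average variances v_1..v_k (G(v), Eq. (12)).
    tfvs: time-factor variances {v_ox, v_oy, v_oz} (G_inf, Eq. (14)).
    """
    g_v = float(np.mean(walk_variances))   # grand average G(v)
    g_inf = float(max(tfvs))               # maximum influence factor G_inf
    return f_xyz * g_v / g_inf
```

The resulting value would be stored in the local database and matched by the detector of the recognition unit.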

Fig. 8.

Optimized curve variation values of individual subjects.

Fig. 9.

Test results for subjects walking one time.

Fig. 10.

Repeated walks of different subjects.

Fig. 11.

Results of identification in confusion matrix form.

6 Experimental Results

To assess the performance of the proposed approach, which uses statistical techniques and the cubic Bezier curve method for people recognition, we ran a set of walking examples from the standard CMU MoCap database [2], using 37 walking motion examples in total. Figure 8 shows the optimized curve values of each individual, computed from the optimized curve of Sect. 5.4; its X axis represents the number of subjects and its Y axis the optimized curve values. Figure 9 illustrates the results of 5 motion examples for subjects walking one time; its X axis is the same, but its Y axis denotes the typical signature values. To further validate the approach, we tested the remaining 32 walking examples of 6 different people and achieved reliable accuracy; the graphical and confusion matrix representations can be seen in Fig. 9 and Fig. 11. Figure 10 differs from Figs. 8 and 9 in that it contains repeated walking examples of different subjects, represented on the X axis, with signature values on the Y axis. The results show exactly 100% accuracy of human recognition on our dataset. The results are listed in Table 1: the first column gives the subject number, the second the subject name as in the CMU database, the third how many times the subject walked, and the last the identification accuracy in percent. We present the results of the proposed method, together with comparisons with other existing approaches, in Table 2.

Table 1. Results of our proposed method
Table 2. Comparison results with other approaches

7 Conclusion and Future Work

In this paper, we proposed a novel human identification method based on statistical methods and the cubic Bezier curve applied to three-dimensional human joint data. The results of our study suggest that human recognition through the gait joints (hip, knee and ankle) and one upper-body joint (shoulder) is reliable. To the best of our knowledge, we are the first to use such data for human recognition; earlier researchers used it for animation, motion retargeting, and motion analysis and synthesis, but not for identification. On our data, the experiments show a 100% accuracy rate for recognizing the subjects, suggesting that considerable information about human walking can be extracted by studying only four joints. We have presented a simple technique for human recognition from 3D motion data based on statistical methods and the cubic Bezier curve, and our representation is rich enough to show promising results. In future work, we plan to strengthen these results on a much larger database and to consider additional parameters such as age, weight and height, using the same motion capture file format. We have already started to extend this work using a Kinect Xbox 360 device for identification in a real-time environment, which will reduce the cost of the system compared with a motion capture system.