1 Introduction

Random numbers play an important role in many network protocols and encryption schemes used in network security applications [1], for example, visual cryptography, digital signatures, authentication protocols and stream ciphers. To determine whether a random sequence is suitable for a cryptographic application, NIST has published a series of statistical tests as standards.

In network security applications, stream ciphers play a key role: they have higher throughput and are easier to implement than block ciphers [2]. RC4, the famous stream cipher, is suitable for large packets in wireless LANs [3]. It has been used for encrypting Internet traffic in network protocols such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), Wi-Fi Protected Access (WPA), etc. [2].

The eSTREAM project collected stream ciphers from the international cryptology community [4] to promote the design of efficient and compact stream ciphers suitable for widespread adoption. After a series of tests, the algorithms submitted to eSTREAM were selected into two profiles: one more suitable for software and the other more suitable for hardware. Non-linear and recursive structures play essential roles in these new designs.

Different visual schemes are required to test the randomness of sequences produced by different stream ciphers. Along this direction, this chapter proposes a flexible framework that handles a set of meta measurements over different combinatorial projections.

2 Variant Combinatorial Visualization

Architecture of variant visualization is shown in Fig. 1.

Fig. 1
figure 1

Visualization architecture

The variant visualization architecture is separated into four core components: RGC, VSC, CC and VC.

  • RGC Randomness Generation Component generates a random sequence;

  • VSC Variant Statistic Component handles the statistical process using the variant measure method [5];

  • CC Combinatorial Component chooses combinations;

  • VC Visualization Component renders the visualization based on VSC measures and CC vectors.

The input n is the length of the binary sequence. The generator can be replaced by any stream cipher that produces a binary sequence. This section focuses on the variant measure module and the visual method module.

A visual example of RC4 will be described in Sect. 2.5.

2.1 Variant Logic Framework

The variant logic framework has been proposed in [6]. Li [7] used the variant measure method to generate different symmetry results [5] based on cellular automata schemes [8]. Under such a construction, even random sequences show symmetry properties in their distributions.

Under variant construction, the variant conversion operator can be defined as follows:

$$ C\left( {x,y} \right) = \left\{ {\begin{array}{*{20}l} { \bot , x = 0,y = 0} \hfill \\ { + , x = 0,y = 1} \hfill \\ { - , x = 1,y = 0} \hfill \\ {{ \top }, x = 1,y = 1} \hfill \\ \end{array} } \right. $$
(1)
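As a concrete illustration, the operator in Eq. (1) can be written as a lookup table; a minimal Python sketch (the names `C_TABLE` and `variant_convert` are illustrative, not from the chapter):

```python
# Variant conversion operator C(x, y) from Eq. (1):
# each pair of bits maps to one of the four variant symbols.
C_TABLE = {
    (0, 0): "⊥",
    (0, 1): "+",
    (1, 0): "-",
    (1, 1): "⊤",
}

def variant_convert(x, y):
    """Apply the operator C to a single bit pair (x, y)."""
    return C_TABLE[(x, y)]
```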

It is convenient to list the relevant variant logic variables, as shown in Table 1.

Table 1 The variant measure method

In the variant measure method, each binary sequence is converted into probabilities: the occurrences of each variable in \( \left\{ { \bot , + , - ,{ \top }} \right\} \) are counted and the probability of each variable is computed. The measurement method is shown in Table 1.

The variant measure method yields a set of measures for different 0–1 sequences. The following mechanism transforms stream cipher sequences into these measures.

The essential modules of the variant scheme are described as follows.

2.2 VSC Variant Statistic Component

The VSC component converts the binary sequence into a variant sequence in the VCM module and computes probabilities and entropies in the PECM module, respectively. The component is shown in Fig. 2.

Fig. 2
figure 2

Variant statistic component

VCM Variant Conversion Module

The VCM module transforms input binary sequences through the following steps:

  1. Step 1.

    Generate an n bit binary sequence \( S = S_{1} S_{2} S_{3} \ldots S_{n} \) by a stream cipher.

  2. Step 2.

    Shift S to the left by M bits (M is the shift length) and generate a new binary sequence \( S^{\prime } = S_{1}^{\prime } S_{2}^{\prime } S_{3}^{\prime } \ldots S_{n - M}^{\prime } = S_{1 + M} S_{2 + M} \ldots S_{n} \).

  3. Step 3.

    Convert the two sequences S and S′ to a variant sequence \( V = V_{i} = C\left( {S_{i} ,S_{i}^{\prime } } \right),\quad i = 1,2,3 \ldots \left( {n - M} \right) \).

  4. Step 4.

    Separate V (of length \( n - M \)) into \( \left( {n - M} \right)/N \) parts, where N is the length of each part and \( M \le N \le n - M \), to form a set of variant sequence groups

$$ \begin{aligned} G & = \left\{ {G_{1} ,G_{2} , \ldots ,G_{(n - M)/N} } \right\} \\ & = \left\{ {\left\{ {V_{1} ,V_{2} , \ldots ,V_{N} } \right\}, \ldots ,\left\{ {V_{n - M - N + 1} , \ldots ,V_{n - M} } \right\}} \right\} \\ \end{aligned} $$
  5. Step 5.

    Separate each item in G into \( N/M \) parts to establish the sequence group \( G^{\prime } \)

$$ \begin{aligned} G^{\prime } & = \left\{ {\left\{ {\left\{ {V_{1} , \ldots ,V_{M} } \right\}, \ldots ,\left\{ {V_{N - M + 1} , \ldots ,V_{N} } \right\}} \right\}, \ldots ,} \right. \\ & \quad \left. {\left\{ {\left\{ {V_{n - M - N + 1} , \ldots ,V_{n - N} } \right\}, \ldots ,\left\{ {V_{n - 2M + 1} , \ldots ,V_{n - M} } \right\}} \right\}} \right\} \\ \end{aligned} $$
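Steps 1–5 can be sketched in Python as follows (a minimal sketch, assuming the stream-cipher output of Step 1 is already available as a '0'/'1' string; the function name `vcm` is illustrative):

```python
# Variant conversion table from Eq. (1).
C_TABLE = {(0, 0): "⊥", (0, 1): "+", (1, 0): "-", (1, 1): "⊤"}

def vcm(bits, M, N):
    """Variant Conversion Module, Steps 2-5.

    bits: the n-bit output S of the stream cipher as a '0'/'1' string (Step 1).
    Returns the nested group G' of M-symbol variant parts.
    """
    shifted = bits[M:]                       # Step 2: S' = S shifted left by M bits
    V = "".join(C_TABLE[(int(a), int(b))]    # Step 3: variant sequence V
                for a, b in zip(bits, shifted))
    # Step 4: split V into groups of length N
    G = [V[i:i + N] for i in range(0, len(V), N)]
    # Step 5: split each group into parts of length M
    return [[g[j:j + M] for j in range(0, len(g), M)] for g in G]
```

With the parameters of Sect. 2.5 (n = 40, N = 16, M = 8), `vcm` reproduces the shifted sequence S′ and the grouped variant sequences of the worked example.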

PECM Probability and Entropy Computing Module

The PECM module takes the variant sequence group, separates it into its parts and computes probabilities and entropies. The equations for these parameters are given in Table 1. The main steps are as follows:

  6. Step 6.

    Compute the probability vector \( P = \left\{ {P_{ \bot } ,P_{ + } ,P_{ - } ,P_{{ \top }} } \right\} \) of each part in G′;

  7. Step 7.

    Calculate the distribution probability vector \( D = \left\{ {D_{ \bot } ,D_{ + } ,D_{ - } ,D_{{ \top }} } \right\} \) of each part in G based on the P vector;

  8. Step 8.

    Evaluate the entropy vector \( \left\{ {E_{ \bot } ,E_{ + } ,E_{ - } ,E_{{ \top }} } \right\} \) from the D vector.
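Step 6 amounts to simple counting; a minimal sketch (the name `prob_vector` is illustrative, and the exact formulas for the D and E vectors follow Table 1, which is not reproduced here):

```python
# The four variant symbols in the order {⊥, +, -, ⊤}.
SYMBOLS = ("⊥", "+", "-", "⊤")

def prob_vector(part):
    """Step 6: probability vector P = {P_⊥, P_+, P_-, P_⊤} of one part of G'."""
    return {s: part.count(s) / len(part) for s in SYMBOLS}
```

For the example part {+⊤+-+⊥⊤+} of Sect. 2.5 this gives P⊥ = 0.125, P+ = 0.5, P− = 0.125, P⊤ = 0.25, matching Step 6 of the example; Steps 7 and 8 then derive the D and E vectors from these probabilities according to Table 1.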

2.3 CC Combinatorial Component

The CC component is separated into two modules: the SM module performs the vector selection and the VDM module performs the visualization.

The visual data is a set of E vectors used as input for VC. For each E vector, a projection is chosen as a visual vector to compute the visual result. Since each of the four entropy components can be either selected or not, there are \( 2^{4} = 16 \) visual results.

Based on the number of variables in a combination, the combination set can be partitioned into five parts, i.e. the number of selected variables in a combination ranges from 0 to 4.

Let the classification be \( EC = \left\{ {EC_{0} ,EC_{1} ,EC_{2} ,EC_{3} ,EC_{4} } \right\} \). Since \( EC_{0} \) is the empty selection, it can be ignored; only the four remaining distributions are of concern in Sect. 2.4.
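The 16 projections and their classification by size can be enumerated directly; a short sketch (the names `E_VARS` and `EC` are illustrative):

```python
from itertools import combinations

# The four entropy variables of the E vector.
E_VARS = ("E⊥", "E+", "E-", "E⊤")

# EC_k collects all k-element combinations of the four variables.
EC = {k: list(combinations(E_VARS, k)) for k in range(5)}
```

The part sizes are 1, 4, 6, 4 and 1 (the binomial coefficients C(4, k)), 16 subsets in total; EC_0 contains only the empty selection and is ignored.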

2.4 Visualization Component

According to the variant measure method, in rectangular coordinates, let \( E_{ \bot } \) be the positive X axis, \( E_{{ \top }} \) the negative X axis, \( E_{ + } \) the positive Y axis and \( E_{ - } \) the negative Y axis. The axes are shown in Fig. 3.

Fig. 3
figure 3

Visualization axis

For \( EC_{1} = \left\{ {\left\{ {E_{ \bot } } \right\},\left\{ {E_{ + } } \right\},\left\{ {E_{ - } } \right\},\left\{ {E_{{ \top }} } \right\}} \right\} \), points are distributed along the axes.

For \( EC_{2} = \left\{ {\left\{ {E_{ \bot } ,E_{ + } } \right\},\left\{ {E_{ \bot } ,E_{ - } } \right\},\left\{ {E_{ \bot } ,E_{{ \top }} } \right\},\left\{ {E_{ + } ,E_{ - } } \right\},\left\{ {E_{ + } ,E_{{ \top }} } \right\},\left\{ {E_{ - } ,E_{{ \top }} } \right\}} \right\} \), points are distributed in the shadow area in Fig. 4.

Fig. 4
figure 4

Distribution areas of \( EC_{2} \)

For \( EC_{3} = \left\{ {\left\{ {E_{ \bot } ,E_{ + } ,E_{ - } } \right\},\left\{ {E_{ \bot } ,E_{ + } ,E_{{ \top }} } \right\},\left\{ {E_{ \bot } ,E_{ - } ,E_{{ \top }} } \right\},\left\{ {E_{ + } ,E_{ - } ,E_{{ \top }} } \right\}} \right\} \), points are distributed in the area of \( EC_{1} \) and the area of \( EC_{2} \).

For \( EC_{4} = \left\{ {\left\{ {E_{ \bot } ,E_{ + } ,E_{ - } ,E_{{ \top }} } \right\}} \right\} \), points are distributed in Fig. 5.

Fig. 5
figure 5

Distribution areas of \( EC_{4} \)
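One plausible reading of the axis assignment above (E⊥ on the positive X axis, E⊤ on the negative X axis, E+ on the positive Y axis, E− on the negative Y axis) is that a selection projects to a point whose coordinates combine the selected components with the signs of their half-axes, unselected components contributing zero. The sketch below is an interpretation reconstructed from Fig. 3 and Step 9 of Sect. 2.5, not a formula stated explicitly in the chapter:

```python
def project(selection, E):
    """Map a selected subset of the entropy variables {⊥, +, -, ⊤}
    to an (x, y) point on the visualization axes of Fig. 3.

    selection: set of selected symbols; E: dict symbol -> entropy magnitude.
    """
    x = (E["⊥"] if "⊥" in selection else 0.0) \
        - (abs(E["⊤"]) if "⊤" in selection else 0.0)
    y = (E["+"] if "+" in selection else 0.0) \
        - (abs(E["-"]) if "-" in selection else 0.0)
    return (x, y)
```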

2.5 Example

An example is given step by step to show how the algorithm runs. In the example, n, N and M are assigned to 40, 16 and 8, respectively.

  1. Step 1.

    Input a 40 bit binary sequence, {0101001011101011001011010111101101010101}.

  2. Step 2.

    Generate \( S^{\prime } \), {11101011001011010111101101010101}.

  3. Step 3.

    Generate V, \( \{ + { \top } + - + \bot { \top } + - - { \top } \bot { \top } + - { \top } \bot + { \top } + { \top } - + { \top } \bot { \top } - { \top } - + - { \top }\} \).

  4. Step 4.

    Separate V into a G vector. The G vector is \( \left\{ {\left\{ { + { \top } + - + \bot { \top } + - - { \top } \bot { \top } + - { \top }} \right\},\left\{ { \bot + { \top } + { \top } - + { \top } \bot { \top } - { \top } - + - { \top }} \right\}} \right\} \).

  5. Step 5.

    Separate G into the G′ vector. The G′ vector in the example is \( \left\{ {\left\{ { + { \top } + - + \bot { \top } + , - - { \top } \bot { \top } + - { \top }} \right\},\left\{ { \bot + { \top } + { \top } - + { \top }, \bot { \top } - { \top } - + - { \top }} \right\}} \right\} \).

  6. Step 6.

    Generate probability vector P of each sequence in G′. The P vector of \( \left\{ { + { \top } + - + \bot { \top } + } \right\} \) is \( \left\{ {P_{ \bot } = 0.125,P_{ + } = 0.5,P_{ - } = 0.125,P_{{ \top }} = 0.25} \right\} \).

  7. Step 7.

    Compute the distribution probability vector D of each sequence in G from P. The D vector of \( \left\{ { + { \top } + - + \bot { \top } + , - - { \top } \bot { \top } + - { \top }} \right\} \) is shown in Fig. 6.

    Fig. 6
    figure 6

    D vectors of \( \left\{ { + { \top } + - + \bot { \top } + , - - { \top } \bot { \top } + - { \top }} \right\} \)

  8. Step 8.

    Compute the entropy vector E of each sequence in G from D. The E vector of \( \left\{ { + { \top } + - + \bot { \top } + - - { \top } \bot { \top } + - { \top }} \right\} \) is shown in Fig. 7.

    Fig. 7
    figure 7

    E vectors of \( \left\{ { + { \top } + - + \bot { \top } + - - { \top } \bot { \top } + - { \top }} \right\} \)

  9. Step 9.

    Compute visual results from the E vectors. For the E vectors of \( \left\{ { + { \top } + - + \bot { \top } + - - { \top } \bot { \top } + - { \top }} \right\} \): if the selection is \( \left\{ {E_{ \bot } } \right\} \), the point is \( \left( {0.0,0.0} \right) \); if the selection is \( \left\{ {E_{ \bot } ,E_{{ \top }} } \right\} \), the points are \( \left( {0.0, - 0.693147} \right) \) and \( \left( {0.0,0.0} \right) \); if the selection is \( \left\{ {E_{{ \top }} ,E_{ - } } \right\} \), the points are \( \left\{ {E_{ - } - \left| {E_{{ \top }} } \right|} \right\} = \left( {0.0,0.0} \right) \) and \( \left( {0.0,0.693147} \right) \); if the selection is \( \left\{ {E_{ \bot } ,E_{{ \top }} ,E_{ - } } \right\} \), the points are \( \left\{ {E_{ \bot } ,E_{ - } - \left| {E_{{ \top }} } \right|} \right\} = \left( {0.0,0.0} \right) \) and \( \left( {0.0,0.693147} \right) \).

  10. Step 10.

    Separate the visual results into the EC classification. The visual results of G in the example are shown in Fig. 8.

    Fig. 8
    figure 8

    Visual result of the example

3 Results

3.1 Visual Result of RC4

The initial parameters: \( \left\{ {{\mathbf{n}}: 128{,}000,{\mathbf{N}}:128,{\mathbf{M}}:16} \right\} \)

The visual result is shown in Fig. 9.

Fig. 9
figure 9

Visual result of RC4 \( \left\{ {{\mathbf{n}}:128000,{\mathbf{N}}:128,{\mathbf{M}}:16} \right\} \)

The initial parameters: \( \left\{ {{\mathbf{n}}:128{,}000,{\mathbf{N}}:128,{\mathbf{M}}:24} \right\} \)

The visual result is shown in Fig. 10.

Fig. 10
figure 10

Visual result of RC4 \( \left\{ {{\mathbf{n}}:128000,{\mathbf{N}}:128,{\mathbf{M}}:24} \right\} \)

The initial parameters: \( \left\{ {{\mathbf{n}}:128{,}000,{\mathbf{N}}:1000,{\mathbf{M}}:8} \right\} \)

The visual result is shown in Fig. 11.

Fig. 11
figure 11

Visual result of RC4 \( \left\{ {{\mathbf{n}}:128000,{\mathbf{N}}:1000,{\mathbf{M}}:8} \right\} \)

The initial parameters: \( \left\{ {{\mathbf{n}}:100{,}000,{\mathbf{N}}:100,{\mathbf{M}}:24} \right\} \)

The visual result is shown in Fig. 12.

Fig. 12
figure 12

Visual result of RC4 \( \left\{ {{\mathbf{n}}:100000,{\mathbf{N}}:100,{\mathbf{M}}:24} \right\} \)

3.2 Visual Result of HC256

The initial parameters: \( \left\{ {{\mathbf{n}}:128{,}000,{\mathbf{N}}:128,{\mathbf{M}}:16} \right\} \)

The visual result is shown in Fig. 13.

Fig. 13
figure 13

Visual result of HC256 \( \left\{ {{\mathbf{n}}: 128000, {\mathbf{N}}: 128, {\mathbf{M}}: 16} \right\} \)

The initial parameters: \( \left\{ {{\mathbf{n}}:128{,}000,{\mathbf{N}}:128,{\mathbf{M}}:24} \right\} \)

The visual result is shown in Fig. 14.

Fig. 14
figure 14

Visual result of HC256 \( \left\{ {{\mathbf{n}}: 128000, {\mathbf{N}}: 128, {\mathbf{M}}: 24} \right\} \)

The initial parameters: \( \left\{ {{\mathbf{n}}:100{,}000,{\mathbf{N}}:100,{\mathbf{M}}:8} \right\} \)

The visual result is shown in Fig. 15.

Fig. 15
figure 15

Visual result of HC256 \( \left\{ {{\mathbf{n}}: 100000, {\mathbf{N}}: 100, {\mathbf{M}}: 8} \right\} \)

The initial parameters: \( \left\{ {{\mathbf{n}}:100{,}000,{\mathbf{N}}:100,{\mathbf{M}}:16} \right\} \)

The visual result is shown in Fig. 16.

Fig. 16
figure 16

Visual result of HC256 \( \left\{ {{\mathbf{n}}: 100000, {\mathbf{N}}: 100, {\mathbf{M}}: 16} \right\} \)

4 Conclusion

The visual results show similar symmetry properties in the sequences generated by RC4 and HC256. They exhibit interesting distributions and can be clearly distinguished through their combinatorial maps. From our models and illustrations, various maps can be integrated via their combinatorial projections to show different spatial distributions of random sequences. Under this configuration, the variant measure method provides a new analysis tool for stream cipher applications in further explorations.