
1 Introduction

Interpolation and approximation of acquired data are required in many areas. Usually, the data domain is tessellated, i.e. meshed by triangles or tetrahedra, using Delaunay tessellation [7]. However, this approach has some complications: Delaunay tessellation is not acceptable for some computational methods (other criteria for the shape of elements are required), and its computational and memory requirements grow in the case of higher dimensions. Also, if it is used for the interpolation of physical phenomena, there are additional problems with the smoothness of the interpolated phenomena over the resulting triangles or tetrahedra. In the case of approximation, i.e. when the input data are to be reduced with regard to the values given, standard mesh reduction methods are complex, as they are primarily designed for surface representation.

Meshless methods based on radial basis functions (RBF) are, in general, based on the principle of the partition of unity [2, 8, 28]. They are used for computations [1], e.g. for partial differential equations (PDE) [6, 30, 31], for interpolation and approximation of given data [16, 17], which lead to implicit representation [9, 18, 19] or explicit representation [10,11,12], for image inpainting [25, 26, 29], for vector data approximation (fluids, etc.) [24], etc. In the following, the RBFs for explicit representation are explored more deeply with respect to the shape parameter behavior and its estimation.

The RBFs interpolation is defined as [6, 20]:

$$ f\left( \varvec{x} \right) = \mathop \sum \limits_{j = 1}^{M} c_{j} \varphi \left( {r_{j} } \right)\quad r_{j} = \left\| {\varvec{x} - \varvec{x}_{j} } \right\| $$
(1)

where \( c_{j} \) are coefficients, \( \varphi \left( {r_{j} } \right) \) is an RBF kernel, and \( \varvec{x} \in R^{d} \) is, in general, an independent variable in \( d \)-dimensional space. If scalar values are given at \( N \) points, i.e. \( h_{i} = f\left( {\varvec{x}_{i} } \right) \), \( i = 1, \ldots ,N \), then a system of linear equations is obtained:

$$ h_{i} = f\left( {\varvec{x}_{i} } \right) = \mathop \sum \limits_{j = 1}^{M} c_{j} \varphi \left( {r_{j} } \right)\quad i = 1, \ldots ,N $$
(2)

This leads to a system of linear equations \( \varvec{Ax} = \varvec{b} \). If \( N = M \) and all points are distinct, an interpolation scheme is obtained; if \( N > M \), an approximation scheme is obtained, as the system of equations is overdetermined [6, 10, 13, 20, 21, 27].

However, in our case the interpolation scheme is used, i.e. \( N = M \), taking only the points of extrema, the inflection points and a few additional points, in order to study the estimation of a suboptimal selection of weights and shape parameters of the RBF interpolation. In this case, we obtain a system of linear equations:

$$ \left[ {\begin{array}{*{20}c} {\varphi_{1,1} } & \cdots & {\varphi_{1,M} } \\ \vdots & \ddots & \vdots \\ {\varphi_{M,1} } & \cdots & {\varphi_{M,M} } \\ \end{array} } \right]\left[ {\begin{array}{*{20}c} {c_{1} } \\ \vdots \\ {c_{M} } \\ \end{array} } \right] = \left[ {\begin{array}{*{20}c} {h_{1} } \\ \vdots \\ {h_{M} } \\ \end{array} } \right] $$
(3)
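For illustration, the following sketch assembles and solves the system (3) for a one-dimensional explicit curve, assuming a Gauss kernel and a fixed shape parameter; the function names and the test data are illustrative only, not part of the original experiments.

```python
# A minimal sketch of the interpolation scheme (1)-(3) for a 1-D explicit curve,
# assuming a Gauss kernel and a fixed shape parameter; names and test data are
# illustrative only.
import numpy as np

def gauss_kernel(r, eps):
    """Gauss RBF: phi(r) = exp(-(eps * r)^2)."""
    return np.exp(-(eps * r) ** 2)

def rbf_interpolate(x_ref, h_ref, eps):
    """Solve the M x M system (3) for the coefficients c_j."""
    r = np.abs(x_ref[:, None] - x_ref[None, :])    # pairwise distances r_ij
    A = gauss_kernel(r, eps)                       # matrix of phi(r_ij)
    return np.linalg.solve(A, h_ref)               # A c = h

def rbf_evaluate(x, x_ref, c, eps):
    """Evaluate f(x) = sum_j c_j phi(|x - x_j|), Eq. (1)."""
    r = np.abs(np.asarray(x)[:, None] - x_ref[None, :])
    return gauss_kernel(r, eps) @ c

# usage: interpolate a few samples of sin(x) and evaluate the interpolant densely
x_ref = np.linspace(0.0, 2.0 * np.pi, 9)
c = rbf_interpolate(x_ref, np.sin(x_ref), eps=1.0)
f = rbf_evaluate(np.linspace(0.0, 2.0 * np.pi, 200), x_ref, c, eps=1.0)
```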

The RBF interpolation was originally introduced as the multiquadric method by Hardy [5] in 1971. Since then, many different RBF interpolation schemes with specific properties have been developed [3, 4].

The RBF kernel functions can be split into two main groups (only some examples are listed):

  • Global functions:

    • Gauss - \( \varphi \left( r \right) = e^{{ - \left( {\varepsilon r} \right)^{2} }} \),

    • Multiquadric (MQ) - \( \varphi \left( r \right) = \sqrt {1 + \left( {\varepsilon r} \right)^{2} } \),

    • Inverse multiquadric (IMQ) - \( \varphi \left( r \right) = 1/\sqrt {1 + \left( {\varepsilon r} \right)^{2} } \),

    • MQ-LG - \( \varphi \left( r \right) = \frac{1}{9}\left( {4c^{2} + r^{2} } \right)\sqrt {r^{2} + c^{2} } - \frac{{c^{3} }}{3}\ln \left( {c + \sqrt {r^{2} + c^{2} } } \right) \) [31]

    • Thin Plate Spline (TPS)

      • \( \varphi \left( r \right) = r^{k} , k = 1,3,5,.. \),

      • \( \varphi \left( r \right) = r^{k} \ln r, k = 2,4,6,..\;{\text{etc}}. \)

  • Local - Compactly Supported RBF (CSRBF), e.g.

$$ \varphi \left( r \right) = \left( {1 - r/\varepsilon } \right)_{ + }^{4} \left( {1 + 4r/\varepsilon } \right)\quad 0 \le r \le \varepsilon $$

The subscript “+” means that the value is zero if the function argument is outside the interval.
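As an illustration of the formulas listed above, the following sketch transcribes the inverse multiquadric and the CSRBF example into code (the helper names are ours); the “\( + \)” truncation is realized with a simple maximum, so the CSRBF is zero outside its support automatically.

```python
# Direct transcriptions of two kernels listed above (our own helper names);
# the "(.)_+" truncation of the CSRBF is realized with np.maximum.
import numpy as np

def imq_kernel(r, eps):
    """Inverse multiquadric: phi(r) = 1 / sqrt(1 + (eps * r)^2)."""
    return 1.0 / np.sqrt(1.0 + (eps * r) ** 2)

def csrbf_kernel(r, eps):
    """CSRBF: phi(r) = (1 - r/eps)_+^4 * (1 + 4 r/eps), zero for r > eps."""
    t = np.maximum(1.0 - r / eps, 0.0)   # the "+" operator
    return t ** 4 * (1.0 + 4.0 * r / eps)
```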

Some of these are presented in Table 1 and Fig. 1.

Table 1. Typical examples of “local” functions – CSRBF (“\( + \)” means the value is zero outside \( \left\langle {0,1} \right\rangle \))
Fig. 1. CSRBF functions

New RBF functions were recently introduced by Menandro [15].

They generally depend on a shape parameter, which must be set carefully. Unfortunately, global functions lead to ill-conditioned systems of equations, while CSRBFs cause a “blobby” behavior [23].

In this contribution, a slightly different problem is solved. Given the points of an explicit curve, we want to approximate the curve while strictly complying with the following requirements for the approximated curve:

  • it has to pass through all points of extrema and also through all points of inflection,

  • it has to keep the values at the borders of the data interval.

Usually, the approximation case is solved using least square error (LSE) methods. The LSE application leads to good results in general; however, the LSE cannot guarantee the requirements stated above. On the other hand, signal theory says that the sampling frequency should be at least two times higher than the highest frequency in the data. Also, different parts of a signal might contain different highest frequencies.

Therefore, the signal reconstruction should follow the local properties of the signal; if it respected only the global properties, the sampling theorem would still be fulfilled, but the compression ratio of the approximated signal would be lower. However, in practical experiments the error behavior at the interval borders had to be improved by adding two additional points close to the border.

This means that, formally, a standard RBF interpolation scheme is obtained, in which only the points of extrema and inflection, the points at the interval borders and the two additional points are taken into account. The additional points only slightly improve the interpolation precision.
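A possible way to select these points of importance from the sampled data is sketched below: the interval borders are always kept, local extrema are detected as sign changes of the first differences and inflection points as sign changes of the second differences. This detector is our own illustration, not a method prescribed here; the two additional points near the borders would be appended to the returned index set.

```python
# A possible detector of the "points of importance" (our illustration):
# interval borders, local extrema as sign changes of the first differences,
# inflection points as sign changes of the second differences.
import numpy as np

def important_point_indices(y):
    """Return sorted indices of borders, local extrema and inflection points."""
    d1 = np.diff(y)                         # discrete first derivative
    d2 = np.diff(y, n=2)                    # discrete second derivative
    extrema = np.where(np.sign(d1[:-1]) * np.sign(d1[1:]) < 0)[0] + 1
    inflections = np.where(np.sign(d2[:-1]) * np.sign(d2[1:]) < 0)[0] + 1
    return np.unique(np.concatenate(([0, len(y) - 1], extrema, inflections)))
```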

However, the majority of RBFs depend on a “magic” constant – the shape parameter – which has a significant influence on the robustness, stability and precision of the computation. Usually, some standard estimation formula is used to determine one constant shape parameter, i.e. all RBFs have the same shape parameter. In the following, the case when each RBF kernel has its own, non-constant shape parameter is described.

2 Determination of RBF Shape Parameters

Signal reconstruction using radial basis functions has to respect the basic requirements stated above. The sampling frequency should locally be at least two times higher than the highest frequency, as the sampling theorem states. Therefore, if the values at the points of extrema and at the points of inflection are respected, together with the values at the interval borders, the sampling theorem is fulfilled [14].

The question of how to choose the shape parameters has not been considered yet. If a global constant shape parameter is to be used, i.e. the same for all RBFs in the RBF approximation, the following minimization process can be used:

$$ \varepsilon = argmin\left\{ {\left\| {h_{i} - \mathop \sum \limits_{j = 1}^{M} c_{j} \varphi \left( {r_{ij} ,\varepsilon_{j} } \right)} \right\|} \right\}\quad r_{ij} = \left\| {\varvec{x}_{\varvec{i}} - \varvec{x}_{j} } \right\| $$
(4)

where \( \varepsilon_{j} = \varepsilon_{k} \) for all \( j,k \), i.e. all shape parameters are the same.
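A minimal sketch of the minimization (4) for the global constant shape parameter follows. It reuses the helper functions from the first sketch and assumes that, for each candidate \( \varepsilon \), the coefficients \( c_{j} \) are recomputed by solving the interpolation system at the reference points, after which the residual norm is measured over all original samples; the search bounds are illustrative only.

```python
# A sketch of the minimization (4); rbf_interpolate / rbf_evaluate are the
# helpers from the first sketch, and measuring the residual over all original
# samples (x_all, h_all) is our assumption about the objective.
import numpy as np
from scipy.optimize import minimize_scalar

def global_eps_objective(eps, x_ref, h_ref, x_all, h_all):
    c = rbf_interpolate(x_ref, h_ref, eps)        # coefficients for this eps
    return np.linalg.norm(h_all - rbf_evaluate(x_all, x_ref, c, eps))

# hypothetical usage with illustrative search bounds:
# result = minimize_scalar(global_eps_objective, bounds=(0.1, 10.0),
#                          method='bounded', args=(x_ref, h_ref, x_all, h_all))
```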

However, if the shape parameters are allowed to differ from each other, the approximation will be more precise with fewer reference points, which also speeds up the evaluation of the RBF function. The question is whether there is a unique optimum. Therefore, a “Monte-Carlo” approach was taken and the minimization process was started from different starting vectors of shape parameters (uniformly generated using Halton’s distribution), using the Gauss function as the kernel:

$$ \varvec{\varepsilon}= argmin\left\{ {\left\| {h_{i} - \mathop \sum \limits_{j = 1}^{M} c_{j} \varphi \left( {r_{ij} ,\varepsilon_{j} } \right)} \right\|} \right\}\quad \begin{array}{*{20}c} {r_{ij} = \left\| {\varvec{x}_{\varvec{i}} - \varvec{x}_{j} } \right\|} \\ {\varvec{\varepsilon}= \left[ {\varepsilon_{1} , \ldots ,\varepsilon_{M} } \right]^{T} } \\ \end{array} $$
(5)
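The search for the shape parameter vector in (5) can be sketched analogously, restarting a local minimizer from several Halton-distributed starting vectors as described above. The objective again recomputes the coefficients for each candidate vector; the parameter ranges, the number of restarts and the Nelder-Mead solver are our assumptions, not the settings used in the experiments.

```python
# A sketch of the search (5) with a Gauss kernel and per-kernel shape
# parameters; coefficients are recomputed for each candidate vector, and the
# local minimizer is restarted from Halton-distributed starting vectors.
# Ranges, number of restarts and the solver are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import qmc

def vector_eps_objective(eps_vec, x_ref, h_ref, x_all, h_all):
    r_ref = np.abs(x_ref[:, None] - x_ref[None, :])
    A = np.exp(-(eps_vec[None, :] * r_ref) ** 2)       # phi(r_ij, eps_j)
    c = np.linalg.solve(A, h_ref)
    r_all = np.abs(x_all[:, None] - x_ref[None, :])
    return np.linalg.norm(h_all - np.exp(-(eps_vec[None, :] * r_all) ** 2) @ c)

def find_local_optima(x_ref, h_ref, x_all, h_all, n_starts=20):
    d = len(x_ref)
    starts = qmc.scale(qmc.Halton(d=d, seed=0).random(n_starts),
                       np.full(d, 0.1), np.full(d, 5.0))
    return [minimize(vector_eps_objective, s, method='Nelder-Mead',
                     args=(x_ref, h_ref, x_all, h_all)) for s in starts]
```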

It can be seen that finding an optimal shape parameter vector is computationally expensive. In the following, only one representative example from the experiments made is presented.

3 Experimental Results

Several explicit functions \( y = f\left( x \right) \) have been used to understand the behavior of the optimal shape parameter vector determination (Table 2).

Table 2. Examples of testing functions.

The experiments were implemented in MATLAB. They proved that there are several different shape parameter vectors which give a local minimum of the approximation error. In all cases, the error of the function values was less than \( 10^{ - 5} \), with a minimal number of points.

In the following, just two examples of the RBF approximation are presented; additional examples can be found in [32]. Figure 2 presents two functions and their RBF approximation using the points of importance found. Figure 3a presents the shape parameters for each local optimum found; the blue one is for a starting vector with an optimal global shape constant for all RBFs. Figure 3b presents the computed weights of the RBF approximation.

Fig. 2. Two examples of approximated functions with an approximation \( error < 10^{ - 5} \)

Fig. 3. Diagram of shape parameter vectors (a) and weights for the local optima found (b)

The radial distances in the graphs are transformed monotonically, but nonlinearly, to obtain “visually reasonable” graphs.


It can be seen that the shape parameters for each local optimum change significantly, while the computed weights remain similar. The experiments made on several explicit functions confirmed the hypothesis that there are several local optima, and a similar behavior was identified for all of them.

However, this leads to a serious question of how computationally reliable the use of RBFs is in the solution of ordinary and partial differential equations, in approximation, interpolation, etc., as the results depend on a “good” choice of the vector of shape parameters.

Detailed test results for several testing functions can be found at [32] http://wscg.zcu.cz/RBF-shape/contents-new.htm.

4 Conclusion

In this contribution, we briefly described preliminary experimental results on finding optimal shape parameter vectors, i.e. the case when each RBF has a different shape parameter, which leads to a higher precision of the approximation of the given data. The experiments were made on different explicit curves to verify the basic properties of the approach. They proved that there are several local optima of the shape parameter vector, which lead to different precision of the final approximation. However, it should be stated that the approximation was made with a relatively high compression ratio and the error was lower than \( 10^{ - 5} \), which is acceptable in many cases. The ability of the given approach to approximate data having different local frequencies will be explored in the future, together with an extension to explicit functions of two variables. The presented approach and the results obtained should also have an influence on the solution of partial differential equations (PDE), as the precision of a solution depends on a good shape parameter selection.

The presented approach will be studied especially for explicit functions of two variables, i.e. \( z = f\left( {x,y} \right) \), where an ordering of the points is not defined and finding the nearest neighbors is computationally expensive in the case of scattered data.