Longitudinal Studies on Mathematical Aptitude and Intelligence Quotient of North Eastern Tribes in Tripura

Abstract

Longitudinal studies on mathematical aptitude and intelligence quotient of North Eastern tribes in Tripura are conducted over successive interviews in a time span of more than 3 years, viz., 20 September 2011–28 November 2014. The analyzed longitudinal data indicate that both mathematical aptitude and intelligence quotient scores fluctuate over time and show an upward trend immediately after the first interaction with the interviewer, before stabilizing at a level slightly below the peak value of the scores. The average level of mathematical aptitude is low, although the level of the intelligence quotient score is comparatively high. Growth curves under different setups are estimated to infer the status of tribal education and lifestyle. The score status is seen to be improving over time, although with mild fluctuations. Proliferation rates of different scores are estimated under different assumptions. In general, the proliferation rates reach stability towards the end of the curves for large values of time. Postulating a simple model of association in scores over time based on martingales, we examine the fluctuation of scores. Excessive deviation results for martingales are derived. Under certain conditions on the martingale \(\{M_i:1\le i \le n\},\) the excessive deviation \(P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\) is seen to be \(O(e^{-\frac{\lambda ^2}{2}(1+o(1))});\;\lambda \rightarrow \infty .\) This is similar to the tail probability of the normal distribution. Deviations of observations from the response curve may be compared with the normal deviate to detect the presence of assignable causes.



Corresponding author

Correspondence to Ratan Dasgupta.


Appendix

To study the deviation of the observed growth from the nonparametric response curve derived under mild assumptions, and to obtain a bound on the maximum fluctuation of the deviations, we consider these errors to be martingale differences. Under this assumption a bound on the extreme fluctuation of observations from the response curve follows; an assignable cause may be looked into if the fluctuations are of a type other than that specified by martingale differences.

The following moment bound of Dasgupta (1993) holds for general stochastic processes. This includes martingales as a special case. Below we provide a slightly modified version of the result.

Theorem A

Let \(\{X_i, i\ge 1\}\) be a stochastic process with \(E[sgn(S_{i-1})X_i| \;|S_{i-1}|] \le 0,\) \(E(\sum _{i=1}^n\pm X_i)^2\le n\beta ^*_{2,n},\) where \(S_i=\sum _{j=1}^i X_j, \;\gamma _{\nu , n}=E|X_n|^\nu ,\; \beta ^*_{\nu ,n}=\max _{1\le j\le n} \gamma _{\nu , j}.\) If the l.h.s. of (1) is finite, then for \(\nu \ge 2\)

$$\begin{aligned} E|S_n|^\nu \le c_\nu n^{\nu /2}\beta ^*_{\nu ,n}, \text{ where } \;c_\nu =[2(\nu -1)\delta ]^{\nu /2} \end{aligned}$$
(1)

and for large \(n,\; \delta \approx (1+\frac{\nu }{2n}).\)

The extra factor 2 in \(c_\nu \) above appears because the expectation of the maximum of the terms in (2.2) of Dasgupta (1993) is bounded above by the sum of the expectations in (2.3) therein; i.e., the correct expression is \(E\max (|S_n|^{\nu -2}X_n^2,|S^*_n|^{\nu -2}X_n^2)< E(|S_n|^{\nu -2}X_n^2+|S^*_n|^{\nu -2}X_n^2).\) The bound is useful in estimating the remainder from the main part in nonlinear statistics that arise in many situations related to limit theorems; see, e.g., Dasgupta (1994). The modification does not affect the results of Dasgupta (1994), as \(L\) is a generic positive constant therein.
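As a quick numerical illustration (a sketch only; the symmetric \(\pm 1\) random walk and all parameter values below are assumptions for illustration, not data from the study), the bound (1) can be checked by Monte Carlo: for \(\pm 1\) steps \(\beta ^*_{\nu ,n}=1,\) so the bound reads \(E|S_n|^\nu \le c_\nu n^{\nu /2}.\)

```python
import numpy as np

# Monte Carlo illustration of the moment bound in Theorem A for a symmetric
# +/-1 random walk (an assumed example): here beta*_{nu,n} = E|X_j|^nu = 1,
# so the bound reads E|S_n|^nu <= c_nu * n^{nu/2} with c_nu = [2(nu-1)delta]^{nu/2}.

rng = np.random.default_rng(0)
n, nu, n_rep = 200, 4, 20000

steps = rng.choice([-1.0, 1.0], size=(n_rep, n))   # martingale differences X_j
S_n = steps.sum(axis=1)                            # partial sum S_n

empirical = np.mean(np.abs(S_n) ** nu)             # Monte Carlo estimate of E|S_n|^nu
delta = 1.0 + nu / (2.0 * n)                       # large-n approximation of delta
c_nu = (2.0 * (nu - 1) * delta) ** (nu / 2.0)
bound = c_nu * n ** (nu / 2.0)                     # beta*_{nu,n} = 1 for +/-1 steps

print(f"E|S_n|^{nu} (Monte Carlo) = {empirical:.3e}, Theorem A bound = {bound:.3e}")
```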

Consider \(M_i=\sum _{j=1}^i y_j,\) where \(M\) is a martingale and the \(y_j\) are the martingale differences. For a martingale \(M\) with finite \(\nu \)th moment, Doob's inequality gives

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge c)\le c^{-\nu }E|M_n|^{\nu } \end{aligned}$$
(2)

Thus, if a finite \(\nu (\ge 2)\)th order moment of the martingale differences \(y_j\) exists, i.e.,

$$\sup _{j\ge 1} E|y_j|^\nu <\infty $$

then from (1) and (2) one may write

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le \lambda ^{-\nu }c_\nu \beta ^*_{\nu ,n} \end{aligned}$$
(3)
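In more detail, taking \(c=\lambda n^{1/2}\) in (2) and applying (1) to \(E|M_n|^{\nu }\) (martingales being a special case of Theorem A, as noted above),

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le (\lambda n^{1/2})^{-\nu }E|M_n|^{\nu }\le (\lambda n^{1/2})^{-\nu }c_\nu n^{\nu /2}\beta ^*_{\nu ,n}=\lambda ^{-\nu }c_\nu \beta ^*_{\nu ,n}. \end{aligned}$$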

Next consider the case where all the moments of the martingale differences exist, but the moment generating function need not exist. Consider the following moment bounds for the martingale differences \(y_j.\)

Type 1:

$$\begin{aligned} \sup _{j\ge 1} E|y_j|^\nu \le L e^{w_0\nu ^\alpha } \end{aligned}$$
(4)

\(\forall \nu >1,\) and for some \(L>1,\) where \( w_0>0,\;\alpha >1.\) In what follows \(L (>1)\) is a generic constant. The above condition is equivalent to

$$\begin{aligned} \sup _{j\ge 1} E\exp [s\{\log _e(1+|y_j|)\}^{\alpha /(\alpha -1)}]< \infty \end{aligned}$$
(5)

where \(s=w_0^{-1/(\alpha -1)};\) see (4.15)–(4.16) and the Appendix on p. 87 of Dasgupta (2013) for similar assumptions. Assumption (4) is equivalent to finiteness of the m.g.f. in a neighborhood of zero, as in Dasgupta (2015), for the transformed random variable \(\{\log _e(1+|y_j|)\}^{\alpha /(\alpha -1)}.\) The original variable \(y_j\) has moment bounds of high magnitude. After the logarithmic transformation the variables are tamed, and a power of the transformed random variable possesses an m.g.f.
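One direction of this equivalence may be sketched as follows. With \(\beta =\alpha /(\alpha -1),\) maximizing \(\nu t-st^{\beta }\) over \(t\ge 0\) shows that \(\nu t\le w\nu ^{\alpha }+st^{\beta }\) for some constant \(w=w(s,\alpha )>0;\) taking \(t=\log _e(1+|y_j|)\) then gives

$$\begin{aligned} E|y_j|^\nu \le E(1+|y_j|)^\nu =E\,e^{\nu \log _e(1+|y_j|)}\le e^{w\nu ^\alpha }\,E\exp [s\{\log _e(1+|y_j|)\}^{\alpha /(\alpha -1)}], \end{aligned}$$

so finiteness of the expectation in (5) yields a moment bound of the form (4).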

The probability bound given in (3) may be sharpened under assumption (4)/(5) by minimizing the r.h.s. of (3) with respect to \(\nu \) under the specific moment bound. Under (4), write

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le \lambda ^{-\nu }c_\nu \beta ^*_{\nu ,n}<Le^{w\nu ^\alpha }\lambda ^{-\nu }=P^* \text{(say) }, w>w_0 \end{aligned}$$
(6)

Next write \(\log P^*=\log L+w\nu ^\alpha - \nu \log \lambda .\) Setting \(\frac{d}{d\nu }\log P^*=0\) at the minimum gives \(\alpha w\nu ^{\alpha -1}-\log \lambda =0,\) i.e., \(\nu =(\frac{\log \lambda }{\alpha w})^{1/(\alpha -1)}.\) Then \(\log P^*=\log L+(\alpha w\nu ^\alpha - \nu \log \lambda )+(1-\alpha )w\nu ^\alpha =\log L+(1-\alpha )w\nu ^\alpha =\log L+(1-\alpha )w(\frac{\log \lambda }{\alpha w})^{\alpha /(\alpha -1)}.\) The second derivative \(\frac{d^2}{d\nu ^2}\log P^*=\alpha (\alpha -1)w\nu ^{\alpha -2}>0,\) so this is indeed a minimum. Hence, we have the following result.
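The closed-form minimizer can be checked numerically; the parameter values below (\(L,\; w,\; \alpha ,\; \lambda \)) are illustrative assumptions only.

```python
import numpy as np

# Sketch: verify that the grid minimizer of log P* = log L + w*nu**alpha - nu*log(lam)
# matches the closed form nu = (log(lam)/(alpha*w))**(1/(alpha-1)) derived above.
# All parameter values are assumed for illustration.

L, w, alpha, lam = 2.0, 0.5, 2.0, 1.0e6

nu_grid = np.linspace(1.0, 100.0, 200001)
log_p_star = np.log(L) + w * nu_grid**alpha - nu_grid * np.log(lam)

nu_numeric = nu_grid[np.argmin(log_p_star)]
nu_closed = (np.log(lam) / (alpha * w)) ** (1.0 / (alpha - 1.0))
min_closed = np.log(L) + (1.0 - alpha) * w * nu_closed**alpha

print(f"grid minimizer        nu = {nu_numeric:.4f}")
print(f"closed-form minimizer nu = {nu_closed:.4f}")
print(f"minimum of log P*        = {min_closed:.4f}")
```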

Theorem 1

Under the moment bound given in (4)/(5) for the martingale differences, the following excessive deviation result holds for the martingale \(\{M_i:1\le i \le n\}\):

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le L e^{-(\alpha -1)w(\frac{\log \lambda }{\alpha w})^{\alpha /(\alpha -1)}}, \lambda \rightarrow \infty \end{aligned}$$
(7)

where \(w>w_0\) is given in (4).

Remark 1

The above result is sharper than the bound given in (3). Since \(\alpha >1,\) the bound, rewritten as \(L\,\lambda ^{-\frac{\alpha -1}{\alpha }(\frac{\log \lambda }{\alpha w})^{1/(\alpha -1)}},\) decays faster than any polynomial power of \(\lambda \) when \(\lambda \) is large; \(\lambda \rightarrow \infty .\)
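To see the rewriting, note that at \(\nu =(\frac{\log \lambda }{\alpha w})^{1/(\alpha -1)}\) one has \(w\nu ^{\alpha }=\nu \,w\nu ^{\alpha -1}=\nu \log \lambda /\alpha ,\) so the bound in (7) may be expressed as

$$\begin{aligned} L\,e^{-(\alpha -1)w(\frac{\log \lambda }{\alpha w})^{\alpha /(\alpha -1)}}=L\,e^{-\frac{\alpha -1}{\alpha }\nu \log \lambda }=L\,\lambda ^{-\frac{\alpha -1}{\alpha }(\frac{\log \lambda }{\alpha w})^{1/(\alpha -1)}}. \end{aligned}$$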

Next we consider the following type of moment bounds for martingale differences:

Type 2:

$$\begin{aligned} \sup _{j\ge 1} E|y_j|^\nu \le L^\nu e^{\alpha \nu \log \nu } \end{aligned}$$
(8)

\(\forall \nu \ge 2,\) where \(L>0,\; \alpha >1.\) The above condition is implied by

$$\begin{aligned} \sup _{j\ge 1} E\exp (s|y_j|^{1/\alpha })< \infty \end{aligned}$$
(9)

where \(0<s<s_0=\alpha e^{-1}L^{-1/\alpha };\) see (4.34)–(4.35) of Dasgupta (2013) for similar assumptions. Assumption (9) is equivalent to finiteness of the m.g.f. in a neighborhood of zero for the transformed random variable \(|y_j|^{1/\alpha }.\) The assumption is weaker than existence of the m.g.f., as \(\alpha >1.\)
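One direction may be sketched using the elementary inequality \(x^{k}\le (k/(se))^{k}e^{sx}\) for \(x\ge 0,\) obtained by maximizing \(x^{k}e^{-sx}\) at \(x=k/s.\) With \(k=\alpha \nu \) and \(x=|y_j|^{1/\alpha },\)

$$\begin{aligned} E|y_j|^\nu =E\big (|y_j|^{1/\alpha }\big )^{\alpha \nu }\le \Big (\frac{\alpha \nu }{se}\Big )^{\alpha \nu }E\,e^{s|y_j|^{1/\alpha }}=\Big [\Big (\frac{\alpha }{se}\Big )^{\alpha }\Big ]^{\nu }e^{\alpha \nu \log \nu }\,E\,e^{s|y_j|^{1/\alpha }}, \end{aligned}$$

which is a bound of the form (8), with \((\alpha /(se))^{\alpha }\) playing the role of \(L.\)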

Sharper results on excessive deviation may be computed in this setup. Under (8), write

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le \lambda ^{-\nu }c_\nu \beta ^*_{\nu ,n}\le \lambda ^{-\nu }L^\nu e^{(\alpha +r)\nu \log \nu } =P^* \mathrm{{(say)}} \end{aligned}$$
(10)

where \(r=\frac{1}{2} \;\text{ for } \;\nu \ll n, \;\text{ and }\; r=1\) for unrestricted \(\nu .\)

Next write \(\log P^*=\nu \log L+(\alpha +r)\nu \log \nu - \nu \log \lambda .\) Setting \(\frac{d}{d\nu }\log P^*=0\) at the minimum gives \(\log L+(\alpha +r)\log \nu +(\alpha +r)-\log \lambda =0,\) i.e., \(\log \nu =(\log \lambda -\log L)/(\alpha +r) - 1.\) Then \(\log P^*=\nu (\log L+(\alpha +r)\log \nu +(\alpha +r)- \log \lambda )-\nu (\alpha +r)=-\nu (\alpha +r).\) Thus \(P^*=e^{-(\alpha +r)e^{(\log \lambda -\log L)/(\alpha +r) - 1}},\) where \(r=\frac{1}{2}\) if \(n\gg \nu =O(e^{\log \lambda /(\alpha +\frac{1}{2})})\), i.e., \(\lambda =O(n^{\alpha +\frac{1}{2}})\); and \(r=1\) otherwise, i.e., for deviations \(\lambda \) of higher order.

The second derivative \(\frac{d^2}{d\nu ^2}\log P^*=(\alpha +r)/\nu >0.\) Hence, we have the following.

Theorem 2

Under the moment bound given in (8)/(9) with \(\alpha >1,\) the following excessive deviation result holds for the martingale \(\{M_i:1\le i \le n\}\):

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le e^{-(\alpha +r)e^{(\log \lambda -\log L)/(\alpha +r) - 1}}, \lambda \rightarrow \infty \end{aligned}$$
(11)

where \(r=\frac{1}{2},\) if the deviation \(\lambda =O(n^{\alpha +\frac{1}{2}});\) and \(r=1,\) otherwise, i.e., for deviations of higher order, \(\lambda \ggg n^{\alpha +\frac{1}{2}}\).
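A brief numerical illustration of the two regimes in Theorem 2 is given below; the values of \(L\) and \(\alpha \) are assumptions for illustration only.

```python
import numpy as np

# Sketch: evaluate the Theorem 2 bound exp(-(alpha+r)*exp((log(lam)-log(L))/(alpha+r)-1))
# for r = 1/2 (deviations lambda = O(n^{alpha+1/2})) and r = 1 (higher-order deviations).
# L and alpha are assumed values for illustration.

L, alpha = 1.5, 2.0

def theorem2_bound(lam, r):
    return np.exp(-(alpha + r) * np.exp((np.log(lam) - np.log(L)) / (alpha + r) - 1.0))

for lam in (1.0e2, 1.0e4, 1.0e6):
    print(f"lambda = {lam:.0e}:  r=1/2 bound = {theorem2_bound(lam, 0.5):.3e},"
          f"  r=1 bound = {theorem2_bound(lam, 1.0):.3e}")
```

Within its regime of validity, the \(r=\frac{1}{2}\) bound is the sharper of the two.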

We further consider the following type of bound, which is more stringent than the bounds considered above.

Type 3: Bound of Type 2 with a different parametric zone, where \(\alpha \in (0,1].\) That is,

$$\begin{aligned} \sup _{j\ge 1} E\exp (s|y_j|^{1/\alpha })< \infty ,\; \alpha \in (0,1] \end{aligned}$$
(12)

This ensures that the m.g.f. of \(y_j\) exists, but \(y_j,\; j\ge 1,\) may not be bounded. A nonuniform Berry–Esseen bound in the CLT for such cases was considered in Dasgupta (2006) for independent random variables in a triangular array.

For the moment bound of Type 3, take \(\alpha \in (0,1]\) in (9). This assumption ensures the existence of the m.g.f. for the martingale differences \(y_j;\) however, \(y_j\) in the martingale \(M_n\) may not be bounded as \(n\rightarrow \infty .\) We then have

$$\begin{aligned} \sup _{j\ge 1} E|y_j|^\nu \le L^\nu e^{\alpha \nu \log \nu }, \;\forall \nu \ge 2, L>0,\; \alpha \in (0,1] \end{aligned}$$
(13)

The above condition is implied by

$$\begin{aligned} \sup _{j\ge 1} E\exp (s|y_j|^{1/\alpha })< \infty \end{aligned}$$
(14)

where \(0<s<s_0=\alpha e^{-1}L^{-1/\alpha }, \alpha \in (0,1]\).

Calculations for the moment bound of Type 3 with \(\alpha \in (0,1]\) are similar to those for the Type 2 moment bound considered above. Proceeding in a similar fashion we obtain the following result, which includes the case of bounded martingale differences in the limiting case \(\alpha \rightarrow 0.\)

Theorem 3

Under the moment bound of Type 3, i.e., (8)/(9) with \(\alpha \in (0,1],\) the following excessive deviation result holds for the martingale \(\{M_i:1\le i \le n\}\):

$$\begin{aligned} P(\max _{1\le i \le n}|M_i|\ge \lambda n^{1/2})\le e^{-(\alpha +r)e^{(\log \lambda -\log L)/(\alpha +r) - 1}}, \lambda \rightarrow \infty \end{aligned}$$
(15)

where \(r=\frac{1}{2},\) if the deviation \(\lambda =O(n^{\alpha +\frac{1}{2}});\) and \(r=1,\) otherwise, i.e., for deviations of higher order, \(\lambda \ggg n^{\alpha +\frac{1}{2}}\).

Remark 2

The bound obtained above for large deviations of the martingale decays exponentially, and sometimes even faster. For the standardized sum of i.i.d. random variables the CLT holds, and the tail probability of the normal distribution is \(\Phi (-\lambda )\sim \frac{1}{\sqrt{2\pi }}\lambda ^{-1}e^{-\lambda ^2/2}, \lambda \rightarrow \infty .\) The bound in (15) with \(\alpha \rightarrow 0,\; r=1/2\) is \(O(e^{-\frac{\lambda ^2}{2}(1+o(1))});\;\lambda \rightarrow \infty \).

Remark 3

Under the assumption that the scores are bounded variables, the observed residuals, i.e., the deviations of data points from the growth curve, have a probabilistic bound specified by the normal distribution, vide Remark 2. Any large departures of observations from the response curve, not in concordance with the normal deviate, are to be assessed for assignable causes.
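A minimal sketch of the screening rule suggested by Remarks 2 and 3 follows; the function name, the residual input, and the scale \(\sigma \) are hypothetical, and the threshold is an illustrative normal deviate.

```python
import numpy as np

# Sketch of the assignable-cause screen: treat residuals from the fitted response
# curve as martingale differences y_j, form M_i = sum_{j<=i} y_j, and compare
# max_i |M_i| / (sigma * sqrt(n)) with a normal deviate (here 3.0).
# Names and inputs are hypothetical; this is not the paper's own code.

def flag_assignable_cause(residuals, sigma, z_threshold=3.0):
    """Return (flag, statistic); flag is True when the maximal standardized
    partial sum of residuals exceeds z_threshold."""
    residuals = np.asarray(residuals, dtype=float)
    n = residuals.size
    partial_sums = np.cumsum(residuals)                      # M_i
    statistic = np.max(np.abs(partial_sums)) / (sigma * np.sqrt(n))
    return statistic > z_threshold, statistic

# Illustrative use with simulated in-control residuals (not study data):
rng = np.random.default_rng(1)
flag, stat = flag_assignable_cause(rng.normal(0.0, 1.0, size=50), sigma=1.0)
print(f"statistic = {stat:.2f}, assignable cause flagged: {flag}")
```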


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.


Cite this paper

Dasgupta, R. (2018). Longitudinal Studies on Mathematical Aptitude and Intelligence Quotient of North Eastern Tribes in Tripura. In: Dasgupta, R. (eds) Advances in Growth Curve and Structural Equation Modeling. Springer, Singapore. https://doi.org/10.1007/978-981-13-0980-9_1
