VULGARIZED NEIGHBOURING NETWORK OF MULTIVARIATE AUTOREGRESSIVE PROCESSES WITH GAUSSIAN AND STUDENT-T DISTRIBUTED RANDOM NOISES

Rasaki Olawale Olanrewaju1*, Ravi Prakash Ranjan2, Queensley C. Chukwudum3, Sodiq Adejare Olanrewaju4

 1*,2Africa Business School (ABS), Mohammed VI Polytechnic University (UM6P), Morocco.

3Department of Insurance and Risk Management, University of Uyo, Nigeria.

4Department of Statistics, University of Ibadan, Ibadan, Nigeria.


 

   
ABSTRACT

 

This paper introduces the vulgarized network autoregressive process with Gaussian and Student-t random noises. The process relates the time-varying series of a given variable to its own immediate past while incorporating its neighbouring variables and the underlying network structure. The generalized network autoregressive process is fully specified to accommodate the aforementioned random noises with their embedded parameters (the autoregressive coefficients, network nodes, and neighbouring nodes) and is applied to the monthly prices of ten (10) edible cereals. The global-α Generalized Network Autoregressive (GNAR) model of order two, with neighbourhood stages of two at the first time lag and zero at the second, that is GNAR(2, [2,0]), was the ideal generalization for both Gaussian and Student-t random noises for the cereal prices: a model with two autoregressive parameters and network regression parameters on the first two neighbour sets at time lag one. The GNAR model with Student-t random noise produced the smaller BIC of -39.2298, compared with -18.1683 for the Gaussian GNAR. The residual error via the Gaussian was 0.9900, compared with 0.9000 via the Student-t. Additionally, the GNAR forecasting mean squared error (MSE) via the Student-t was 15.105% less than that of the Gaussian. Similarly, the Student-t GNAR MSE for VAR was 1.59% less than the Gaussian GNAR MSE for VAR. Comparing the histograms of residuals from the fitted GNAR models, both the Student-t and Gaussian processes produced symmetric residual estimates, but the residuals via the Student-t were more evenly symmetric than those of the Gaussian. As a contribution to the network autoregressive process, the GNAR process with the Student-t random noise generalization should always be favoured over the Gaussian random noise because of its ability to absorb contamination and spread, and to accommodate time-varying network measurements.
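To make the comparison described above concrete, the following is a minimal illustrative sketch (not the authors' code, and not the R GNAR package used in this literature): it simulates a simplified global-α network autoregressive process of order two on an assumed toy ring network of ten nodes (standing in for the ten cereal price series), once with Gaussian innovations and once with Student-t innovations, then estimates the shared coefficients by least squares and computes a Gaussian-likelihood BIC for each fit. The network, coefficient values, and degrees of freedom are assumptions made purely for illustration.

```python
# Illustrative sketch of a simplified global-alpha network AR process
# with Gaussian vs Student-t innovations (assumed toy setup, not the paper's data).
import numpy as np

rng = np.random.default_rng(42)

n_nodes, n_time = 10, 300
# Assumed ring network: each node's neighbours are its two adjacent nodes.
A = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    A[i, (i - 1) % n_nodes] = A[i, (i + 1) % n_nodes] = 1.0
W = A / A.sum(axis=1, keepdims=True)          # row-normalised neighbour weights

alpha = np.array([0.3, 0.2])                  # global autoregressive coefficients (lags 1, 2)
beta1 = 0.25                                  # neighbour effect at lag 1 (none at lag 2)

def simulate(noise):
    """Generate X[t] = alpha1*X[t-1] + alpha2*X[t-2] + beta1*W@X[t-1] + noise."""
    X = np.zeros((n_time, n_nodes))
    for t in range(2, n_time):
        X[t] = (alpha[0] * X[t - 1] + alpha[1] * X[t - 2]
                + beta1 * W @ X[t - 1] + noise(n_nodes))
    return X

gauss = simulate(lambda k: rng.normal(scale=0.5, size=k))
student = simulate(lambda k: 0.5 * rng.standard_t(df=4, size=k))

def fit_bic(X):
    """Least-squares fit of the shared (global-alpha) coefficients and its BIC."""
    y, Z = [], []
    for t in range(2, n_time):
        y.append(X[t])
        Z.append(np.column_stack([X[t - 1], X[t - 2], W @ X[t - 1]]))
    y, Z = np.concatenate(y), np.vstack(Z)
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ coef
    n, k = len(y), len(coef)
    sigma2 = resid @ resid / n
    bic = n * np.log(sigma2) + k * np.log(n)  # Gaussian-likelihood BIC
    return coef, bic

for name, X in [("Gaussian", gauss), ("Student-t", student)]:
    coef, bic = fit_bic(X)
    print(f"{name:10s} coefficients={np.round(coef, 3)}  BIC={bic:,.1f}")
```

The same comparison logic (fit both noise specifications, then rank them by BIC and by forecast MSE) underlies the results reported in the abstract; the heavier-tailed Student-t innovations are what allow the model to absorb contaminated or outlying observations.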

 


Keywords
Gaussian Generalized Network Autoregressive (GNAR), Global-α, Nodes, Student-t

 

 

Published On: 10 October 2023

 
