Detection of Collinearity Effects on Explanatory Variables and Error Terms in Multiple Regressions
Alhassan Umar Ahmad1, U.V. Balakrishnan2, Prem Shankar Jha3

1Alhassan Umar Ahmad, Department of Mathematics, Basic Science and Research, Sharda University, Greater Noida.

2U.V. Balakrishnan, Department of Mathematics, Basic Science and Research, Sharda University, Greater Noida.

3Prem Shankar Jha, Department of Mathematics, Basic Science and Research, Sharda University, Greater Noida.

Manuscript received on 15 April 2019 | Revised Manuscript received on 22 April 2019 | Manuscript Published on 26 July 2019 | PP: 1584-1591 | Volume-8 Issue-6S4 April 2019 | Retrieval Number: F13190486S419/19©BEIESP | DOI: 10.35940/ijitee.F1319.0486S419

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open-access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: In this work we investigate the effects and consequences of multicollinearity on both the standard error and the explanatory variables in multiple regression. The correlations between X1 to X6 (the independent variables) measure their individual effect and performance on Y (the response variable), and we carefully observe how those explanatory variables are intercorrelated with one another and with the response variable. Many procedures are available in the literature for detecting the presence, degree and severity of multicollinearity in multiple regression analysis; here we use correlation analysis to detect its presence, and variance inflation factors, tolerance levels, condition indices and eigenvalues to assess the fluctuation and influence of the multicollinearity present in the model. Multicollinearity of a severe degree was discovered in this work using an array of correlation analysis procedures; it affects the performance of the explanatory variables in the model by making them less independent and more redundant than they should be. Collinearity inflates the variances of the estimates and changes the direction and signs of the coefficients of the estimates, leading to unrealistic and erroneous inference, wrong interpretation and instability among the predictor variables. The standard error is found to be slightly high, which directly affects the accuracy and precision of the final results of the analysis, introduces Type I error during and after hypothesis testing and ultimately undermines the overall inference of the analysis, in good agreement with earlier findings. Complete elimination of collinearity is not possible, but in this work we reduce its degree of intensity to enhance the performance of the independent variables and the error term in the model.
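The diagnostics named in the abstract can be reproduced with standard numerical tools. The Python sketch below is not taken from the paper; the data for X1–X6 and Y are simulated purely for illustration (X2 and X3 are made deliberately near-collinear with X1), and it shows one common way to compute the correlation matrix, variance inflation factors, tolerance levels, eigenvalues and condition indices.

# Minimal sketch (not the authors' code): collinearity diagnostics for
# hypothetical predictors X1..X6 and a response Y.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100
X1 = rng.normal(size=n)
df = pd.DataFrame({
    "X1": X1,
    "X2": X1 + rng.normal(scale=0.1, size=n),      # nearly collinear with X1
    "X3": 2 * X1 + rng.normal(scale=0.1, size=n),  # nearly collinear with X1
    "X4": rng.normal(size=n),
    "X5": rng.normal(size=n),
    "X6": rng.normal(size=n),
})
df["Y"] = df[["X1", "X4", "X5"]].sum(axis=1) + rng.normal(size=n)

predictors = ["X1", "X2", "X3", "X4", "X5", "X6"]

# 1. Correlation matrix: large off-diagonal entries flag possible collinearity.
print(df[predictors + ["Y"]].corr().round(2))

# 2. VIF and tolerance: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from
#    regressing X_j on the remaining predictors; tolerance = 1 / VIF.
X = df[predictors].to_numpy()
for j, name in enumerate(predictors):
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(n), others])
    beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ beta
    r2 = 1 - resid.var() / X[:, j].var()
    vif = 1.0 / (1.0 - r2)
    print(f"{name}: VIF = {vif:8.2f}, tolerance = {1.0 / vif:.4f}")

# 3. Eigenvalues and condition indices of the scaled cross-product matrix:
#    condition index k_i = sqrt(lambda_max / lambda_i); values above ~30
#    are commonly read as a sign of serious collinearity.
Xs = X / np.linalg.norm(X, axis=0)           # scale columns to unit length
eigvals = np.linalg.eigvalsh(Xs.T @ Xs)[::-1]
cond_idx = np.sqrt(eigvals.max() / eigvals)
print("eigenvalues:", np.round(eigvals, 4))
print("condition indices:", np.round(cond_idx, 2))

With the simulated data above, X1, X2 and X3 show very large VIFs (and correspondingly small tolerances) and one condition index far above 30, which is the pattern the abstract describes as severe multicollinearity; the uncorrelated predictors X4–X6 show VIFs near 1.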

Keywords: Multicollinearity, Predictor Variable, Standard Error and Multiple Regression.
Scope of the Article: Applied Mathematics and Mechanics