An Efficient Trivial Principal Component Regression (TPCR)
To be published on April 2, 2019 by SAE International in the United States
Understanding a system's behavior involves developing an accurate relationship between the explanatory (predictive) variables and the output response. When the observed data are ill-conditioned, with potentially collinear correlations among the measured variables, statistical methods such as the least squares method (LSM) fail to generate good predictive models. In those situations, other methods such as Principal Component Regression (PCR) are generally applicable. Additionally, PCR reduces the dimensionality of the system by making use of the covariance relationships among the variables. In this paper, an improved regression method over PCR is proposed, based on Trivial Principal Components (TPC). TPC regression (TPCR) makes use of the covariance between the output response and the predictive variables while extracting principal components. A new method of selecting potential principal components for variable reduction in TPCR is also proposed and validated. Two example problems with highly collinear data were considered for illustration. Results are also compared with Partial Least Squares regression (PLS1), another widely used statistical method for ill-conditioned data analysis. From these results, it can be concluded that TPC regression has great potential for applications in big-data systems with multicollinearity.
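The abstract contrasts classical PCR, which retains components by explained variance alone, with a response-guided selection that uses the covariance between the response and the extracted components. The sketch below illustrates that distinction on synthetic collinear data; the component-ranking rule is only our reading of the abstract's description, not the paper's actual TPCR algorithm, and all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ill-conditioned data: x2 is nearly collinear with x1.
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)   # near-duplicate column -> multicollinearity
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 + 0.5 * x3 + 0.1 * rng.normal(size=n)

# Center the data, then obtain principal components from the SVD of X.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # component scores T = X V

# Classical PCR keeps the components with the largest variance (singular
# values). The response-guided variant sketched here (an assumption based
# on the abstract) instead ranks components by the magnitude of their
# covariance with the response y.
cov_with_y = np.abs(scores.T @ yc) / (n - 1)
order = np.argsort(cov_with_y)[::-1]
keep = order[:2]                        # retain the two most response-relevant PCs

# Regress y on the retained scores, then map back to original coefficients.
T = scores[:, keep]
gamma = np.linalg.lstsq(T, yc, rcond=None)[0]
beta = Vt.T[:, keep] @ gamma            # coefficients in the original x-space

pred = Xc @ beta
rss = float(np.sum((yc - pred) ** 2))
print("retained components:", keep, "residual SS:", round(rss, 3))
```

Because the collinear pair (x1, x2) collapses into one dominant component, a response-guided rule can discard the near-degenerate direction that ordinary least squares would amplify, which is the failure mode the abstract attributes to LSM on ill-conditioned data.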
Citation: Chinta, B., "An Efficient Trivial Principal Component Regression (TPCR)," SAE Technical Paper 2019-01-0515, 2019.
Data Sets - Support Documents
- [Unnamed Dataset 1]
- [Unnamed Dataset 2]
- [Unnamed Dataset 3]
- [Unnamed Dataset 4]
- [Unnamed Dataset 5]
- [Unnamed Dataset 6]
- [Unnamed Dataset 7]