Extending OLS to multiple measures against time is the next step in analytics. As in single-measure analysis, time is the linear axis against which any number of measures can be analyzed. For each measure, the OLS equation is computed individually and plotted as a trend line that extends into future predicted time periods.
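The per-measure fitting described above can be sketched as follows. This is a minimal illustration using NumPy's `polyfit` for the degree-1 OLS fit; the measure names and data values are hypothetical, not from the original text.

```python
import numpy as np

# Hypothetical data: two measures observed over the same time axis.
time = np.arange(12)  # observed periods 0..11
measures = {
    "revenue": np.array([10, 12, 11, 14, 15, 15, 17, 18, 18, 20, 21, 23], dtype=float),
    "visits":  np.array([5, 5, 6, 6, 7, 8, 8, 9, 10, 10, 11, 12], dtype=float),
}
future = np.arange(12, 15)  # three predicted periods

# Fit an OLS line (degree-1 polynomial) independently for each measure,
# then evaluate the fitted line over observed plus future periods.
trend_lines = {}
for name, y in measures.items():
    slope, intercept = np.polyfit(time, y, 1)
    trend_lines[name] = slope * np.concatenate([time, future]) + intercept

for name, line in trend_lines.items():
    print(name, "predicted periods:", np.round(line[-3:], 2))
```

Each entry in `trend_lines` now covers fifteen periods: the twelve observed ones plus three predictions along the fitted line.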
Visually representing multiple measures must be done with caution because of collinearity: interpreting the results can lead to the fallacy of associativity. Stock market data is a good illustration. Analyzing data for two companies, ACME and DYNACORP, may reveal a positive trend for both stocks. This does not imply that ACME and DYNACORP influence each other in any manner; a multitude of economic factors could cause both organizations to show a positive trend over the same time period. If the total time period changes, the OLS equation and trend line may have a different slope.
Two positive trend lines might give rise to the interpretation that the two measures are interdependent. In such a scenario, an additional measure that tests whether the interdependency actually holds must be factored in alongside the measures under scrutiny.
In statistics parlance, collinearity is measured by the correlation coefficient, which shows the degree of association between two variables in a linear regression context. It is represented as a single numeric value, interpreted as follows:
| Value | Interpretation |
|-------|----------------|
| +1 | Strong association between variables; values influence each other |
| 0 | Neutral; no association, i.e. truly random variables |
| -1 | Strong association, but the influence is opposite in nature |
Calculating the correlation coefficient is simple. In the earlier example, two positive trend lines are noted, but the correlation coefficient is negligible, which explains why one measure has negligible influence on the other.
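The calculation can be sketched with NumPy's `corrcoef`, which returns the Pearson correlation matrix for the two series; the off-diagonal entry is the coefficient discussed in the table above. The price series below are made-up values for illustration only.

```python
import numpy as np

# Hypothetical daily closing prices for the two stocks in the example.
acme     = np.array([10.0, 10.4, 9.8, 10.9, 10.2, 11.1, 10.7, 11.5])
dynacorp = np.array([52.0, 51.2, 53.5, 52.8, 51.9, 54.0, 52.5, 53.9])

# np.corrcoef returns a 2x2 correlation matrix; the [0, 1] entry is the
# Pearson correlation coefficient between the two series.
r = np.corrcoef(acme, dynacorp)[0, 1]
print(f"correlation coefficient: {r:+.2f}")
```

A value near +1 or -1 signals strong association, while a value near 0 suggests the two trends merely coincide over the chosen time period.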
The data science adage "correlation is not causation" should be kept in mind while developing analytics applications that rely on time-series analysis and OLS.