Should You Trust Analytics II: Data Provenance

The process of turning data into information that can be presented simply can be incredibly complex.  I believe this irony exists primarily because most available data is not formatted for analysis.  Building a large, custom data set with the exact features you want to analyze (Design of Experiments) can be very expensive.  If you have pockets as deep as Big Pharma or are ready to dedicate years to a PhD, it’s definitely a great way to go.

Our last blog on trusting data analytics explored how the industry practice of “data cleaning” can spoil the reliability of an entire analysis.  But problems can also occur with perfect, clean, complete, and reliable data.  In this post we will explore the topic of data provenance and how the complexities of data storage can sabotage your data analytics.


The truth is… business data is structured and formatted for business operations and efficient storage.  Observations are usually:

  • Recorded when it is convenient to do so, resulting in time increments that may not represent the events we actually want to measure;
  • Structured efficiently for databases to store and recall, resulting in information on real world events being shattered across multiple tables and systems; and
  • Described according to the IT department’s naming conventions, resulting in the need to translate each coded observation before it can be analyzed (a small sketch of this reassembly follows the list).
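
Below is a minimal, hypothetical pandas sketch of what that reassembly work looks like. The table names, coded columns, timestamps, and decode map are invented for illustration; they are not taken from any particular system.

```python
import pandas as pd

# Two tables that a database stores efficiently but an analyst must stitch back together.
# All names and codes here are made up for illustration.
orders = pd.DataFrame({
    "ORD_ID": [101, 102, 103],
    "CUST_CD": ["C01", "C02", "C01"],
    "ORD_AMT": [250.0, 75.5, 410.0],
})
status_log = pd.DataFrame({
    "ORD_ID": [101, 101, 102, 103],
    "STAT_CD": ["N", "S", "N", "S"],  # coded statuses per the IT naming conventions
    "STAT_TS": pd.to_datetime([
        "2016-03-01 23:59",  # recorded by an end-of-day batch job,
        "2016-03-04 23:59",  # not when the real-world event actually happened
        "2016-03-02 23:59",
        "2016-03-03 23:59",
    ]),
})

# 1. Translate the coded observations into business terms
status_log["status"] = status_log["STAT_CD"].map({"N": "new", "S": "shipped"})

# 2. Reassemble the real-world event from the pieces shattered across tables
events = status_log.merge(orders, on="ORD_ID", how="left")

# 3. Roll the batch timestamps up to the grain we actually care about (daily)
daily_shipped = (
    events[events["status"] == "shipped"]
    .set_index("STAT_TS")
    .resample("D")["ORD_AMT"]
    .sum()
)
print(daily_shipped)
```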

Continue reading “Should You Trust Analytics II: Data Provenance”

Should You Trust Analytics III: Analytics Process

Lack of trust in source data is a common concern with data analytic solutions. A friend of mine is a product manager for a large software company that uses analytics for insights into product sales. He told me the first thing executives and managers do when new analytic products are released in his NYSE-traded, multi-billion-dollar company is… manually recalculate key metrics. Why would a busy manager or executive spend valuable time opening up a spreadsheet to recalculate a metric? Because he or she has been burned before by unreliable calculations.
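
For what it’s worth, that spreadsheet recheck is easy to script. Here is a hypothetical sketch of the idea; the file name, column names, tolerance, and reported figure are invented for illustration, not taken from my friend’s company.

```python
import pandas as pd

# The number shown on the analytics dashboard (made up for this example)
reported_quarterly_sales = 1_234_567.89

# Recalculate the same metric directly from the raw transactions
transactions = pd.read_csv("q1_transactions.csv", parse_dates=["invoice_date"])
recalculated = transactions.loc[
    transactions["invoice_date"].between("2016-01-01", "2016-03-31"), "amount"
].sum()

# Flag a material difference instead of trusting either number blindly
if abs(recalculated - reported_quarterly_sales) > 0.01 * reported_quarterly_sales:
    print(f"Does not reconcile: dashboard {reported_quarterly_sales:,.2f} "
          f"vs recalculated {recalculated:,.2f}")
else:
    print("Dashboard metric reconciles within 1%")
```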

I’ve been exploring the subject of unreliable data since a recent survey of CEOs revealed that only one-third trust their data analytics. I have also been studying for an exam next week to earn a Certified Analytics Professional designation and formalize my knowledge on the subject. While studying each step of the analytics process defined by INFORMS, the organization that sponsors the Certified Analytics Professional exam, I’ve considered how things could go wrong and result in an unreliable outcome. In the spirit of Lean process improvement (an area I specialized in earlier in my career), I pulled those potential pitfalls together in a fishbone diagram:

[Figure: Analytic Errors Fishbone]

Continue reading “Should You Trust Analytics III: Analytics Process”

Statistical Version of 100 Year War

After 100+ years of staying silent on the inadequacies of the statistic behind many “statistically significant” conclusions, the American Statistical Association (ASA) published a statement online last week harshly criticizing p-values. Here’s a link for those who are interested, and a short synopsis follows: http://amstat.tandfonline.com/doi/abs/10.1080/00031305.2016.1154108

The ASA’s actual statement starts on page 8 and includes the following passages:

“Researchers often wish to turn a p-value into a statement about the truth of a null hypothesis, or about the probability that random chance produced the observed data. The p-value is neither.” Ouch.

Continue reading “Statistical Version of 100 Year War”
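
To make the quoted point concrete, here is a small simulation of my own (not from the ASA statement). When the null hypothesis is true by construction, p-values scatter uniformly between 0 and 1, so a single p-value is not the probability that random chance produced your data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
p_values = []
for _ in range(10_000):
    # Two samples drawn from the SAME distribution: the null is true by construction
    a = rng.normal(loc=0, scale=1, size=30)
    b = rng.normal(loc=0, scale=1, size=30)
    _, p = stats.ttest_ind(a, b)
    p_values.append(p)

p_values = np.array(p_values)
# Roughly 5% of experiments come out "significant" at 0.05 even though nothing is going on
print(f"Share of p < 0.05 with a true null: {np.mean(p_values < 0.05):.3f}")
```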

Quit Sampling!

This post may disturb some “old school” auditors.  In fact, I used to be a self-described old school auditor.  If I couldn’t find a work paper in the permanent file, someone was going to get an earful about their ability to keep reliable documentation.  But the business sector has evolved at a frightening rate since those days.  Surprisingly, many audit professionals still consider sampling and testing to be their go-to procedure.

I’m not knocking the old Student’s t-test.  It’s still the right tool for some situations.  But it’s been harder and harder for me to find those situations in recent years.  Most of the subjects I audit now can easily be scrutinized using data mining, or even scripted into an automated monitoring report.

Continue reading “Quit Sampling!”
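
As a hypothetical sketch of what replaces the sample, here is a scripted full-population check. The file name, column names, and control rules are invented for illustration, not drawn from any real audit program.

```python
import pandas as pd

# Load the entire payment population, not a sample of 30 invoices
payments = pd.read_csv("all_payments.csv", parse_dates=["payment_date"])

exceptions = payments[
    (payments["amount"] > payments["approval_limit"])   # paid above the approver's limit
    | (payments["vendor_id"].isna())                     # payment with no vendor on file
    | payments.duplicated(["vendor_id", "invoice_no", "amount"], keep=False)  # possible duplicates
]

# Every exception in the population is listed, not an extrapolation from a sample
exceptions.to_csv("payment_exceptions_report.csv", index=False)
print(f"{len(exceptions)} exceptions out of {len(payments)} payments")
```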