Do you have the technical knowledge to conduct an audit of an analytic dashboard? If not, you should go get it as soon as possible. As of February 2017, all of the most widely used audit standards allow us to bring specialists into our audit teams to train and enhance our capabilities. If your team finds itself in need of some technical knowledge, find an expert and learn from them!
We recently spoke with an experienced auditor about how his organization has made analytics its top audit-related priority. This is a very reasonable decision given that work can be automatically executed, documented, and continuously recycled as a “continuous audit” procedure. The efficiencies are quite appealing to auditors who perform much of their re-work on a quarterly or annual cycle. Ironically, my colleague noted that their primary tool is Microsoft Access (R), which delivers none of these most valuable benefits. Let’s examine. Access (R) cannot natively:
- Produce sharable analytic procedures to import, clean, and transform data;
- Automatically execute analytic procedures to perform work without human action;
- Produce logs of multiple processing steps that serve as audit documentation; and
- Restrict access to private information.
So where is an aspiring analytic auditor to start?
What tools can an experienced decision scientist recommend to developing analysts? We feel SQL is a valuable first step in any analytics career. SQL (Structured Query Language) is so pervasive that the International Organization for Standardization (ISO) has codified it. In today’s digitalized world, with massive amounts of data being gathered every day and stored in databases, knowing how to query and program with SQL is the most useful tool we can imagine for an analytic auditor. Lots of people use it, so it’s a transferable skill. Furthermore, SQL solutions are strong in many performance areas that are key to analytic auditing, including the ability to:
- Connect to multiple SQL data sources, since SQL databases are a popular platform for operational data;
- Produce scripts that perform multiple processing actions and can be shared among different individuals and retained as audit documentation;
- Provide for access controls to databases, tables, and individual records.
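To make the import-clean-transform workflow concrete, here is a minimal sketch using Python's built-in sqlite3 module so the SQL can run anywhere without installing a database server. The table, columns, and cleaning rule are hypothetical examples, not from any real audit.

```python
import sqlite3

# Minimal sketch of a scripted, repeatable analytic procedure.
# Table names, columns, and the cleaning rule are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Import: load raw transaction data.
cur.execute("CREATE TABLE transactions (id INTEGER, amount REAL, dept TEXT)")
cur.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(1, 250.0, "IT"), (2, -75.0, "HR"), (3, 4000.0, "IT"), (4, 90.0, "HR")],
)

# Clean: flag negative amounts for review rather than silently dropping them.
cur.execute("SELECT COUNT(*) FROM transactions WHERE amount < 0")
flagged = cur.fetchone()[0]
print(f"Flagged {flagged} negative-amount record(s) for review")

# Transform: summarize spending by department.
cur.execute(
    "SELECT dept, SUM(amount) FROM transactions "
    "WHERE amount >= 0 GROUP BY dept ORDER BY dept"
)
rows = cur.fetchall()
for dept, total in rows:
    print(dept, total)
```

Because the whole procedure is a script, it can be shared with teammates, re-run on next quarter's data, and retained as audit documentation.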
There are multiple “flavors” of SQL: it is used by Microsoft SQL Server, Oracle, MySQL, Amazon’s Redshift, and many other popular platforms. Each of these solutions uses a slightly different version of the SQL language because each product has custom functions developed to differentiate it. But the good news is, these functions are not necessary to perform the basic steps in the analytic process. If your organization uses a type of SQL, then we suggest you begin using it; almost all of the skills you learn will transfer to the other solutions! The most important decision is the decision to begin using SQL if you are pursuing a career in analytics. Learning is not supposed to be comfortable, so just get started! To help you on this journey, we’ve compiled some useful resources:
- AnalyticAuditor.com’s very own graphic SQL cheat sheet
- SQLzoo.net has an interactive browser-based SQL tutorial (no software to install)
- http://www.w3schools.com/sql/default.asp has a dictionary of SQL functions
For more free and valuable content, subscribe to this blog on the top right.
About four months ago I decided to take my passion for decision science to a new level by pursuing the Certified Analytics Professional (CAP) certification.
Coming from a non-technical background, I found that some people (particularly those with computer science backgrounds) were skeptical of my knowledge and abilities working with large amounts of data and writing predictive models. (Ironically, one of those same data scientists with a heavy CS background inspired a separate post on the pitfalls of common data cleaning procedures.) I feel a relevant certification is a great way to give others confidence in my foundation of knowledge in data analytics.
The CAP seems to be the best branded, most widely recognized, and best sponsored option for data science related certifications. In a July 2014 CIO magazine article titled “16 big data certifications that will pay off,” the CAP exam was listed first. Continue reading “About the Certified Analytics Professional (CAP)”
The process of turning data into information to present it in a simple manner can be incredibly complex. I believe this irony is primarily because most available data is not formatted for analysis. Building a large, custom data set with the exact list of features you desire to analyze (Design of Experiments) can be very expensive. If you have pockets as deep as big Pharma or are ready to dedicate years to a PhD, it’s definitely a great way to go.
Our last blog on trusting data analytics explored how the industry practice of “data cleaning” can spoil the reliability of an entire analysis. But problems can also occur with perfect, clean, complete, and reliable data. In this post we will explore the topic of data provenance and how the complexities of data storage can sabotage your data analytics.
The truth is… business data is structured and formatted for business operations and efficient storage. Observations are usually:
- Recorded when it is convenient to do so, resulting in time increments that may not represent the events we actually want to measure;
- Structured efficiently for databases to store and recall, resulting in information on real world events being shattered across multiple tables and systems; and
- Described according to the IT departments’ naming conventions, resulting in the need to translate each coded observation.
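The second and third points above can be sketched in a few lines of runnable SQL (via Python's built-in sqlite3 module). A single business event, a shipped order, is shattered across three normalized tables and described with an internal status code; reassembling it takes joins plus a code-translation table. All table and column names here are hypothetical.

```python
import sqlite3

# Sketch: one real-world event (a shipped order) shattered across three
# normalized tables, then reassembled with joins. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, status_cd TEXT);
CREATE TABLE customers (cust_id INTEGER, cust_name TEXT);
CREATE TABLE status_codes (status_cd TEXT, status_desc TEXT);
INSERT INTO orders VALUES (1001, 7, 'S2');
INSERT INTO customers VALUES (7, 'Acme Corp');
INSERT INTO status_codes VALUES ('S2', 'Shipped');
""")

# Reassemble the event and translate the coded status into plain language.
cur.execute("""
SELECT o.order_id, c.cust_name, s.status_desc
FROM orders o
JOIN customers c ON c.cust_id = o.cust_id
JOIN status_codes s ON s.status_cd = o.status_cd
""")
row = cur.fetchone()
print(row)  # (1001, 'Acme Corp', 'Shipped')
```

Every join and code lookup like this is a place where an analysis can silently go wrong, which is exactly why data provenance matters.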
Lack of trust in source data is a common concern with data analytic solutions. A friend of mine is a product manager for a large software company that uses analytics for insights into product sales. He told me the first thing executives and managers do when new analytic products are released in his NYSE-traded, multi-billion dollar company is… manually recalculate key metrics. Why would a busy manager or executive spend valuable time opening up a spreadsheet to recalculate a metric? Because he or she has been burned before by unreliable calculations.
I’ve been exploring the subject of unreliable data since a recent survey of CEOs revealed that only one-third trust their data analytics. I have also been studying for an exam next week to earn the Certified Analytics Professional designation and formalize my knowledge on the subject. While studying each step of the analytics process defined by INFORMS, the organization that sponsors the Certified Analytics Professional exam, I’ve considered how things could go wrong and result in an unreliable outcome. In the flavor of Lean process improvement (an area I specialized in earlier in my career), I pulled those potential pitfalls together in a fishbone diagram:
Just one-third of 400 surveyed CEOs responded that they trust their data analytics according to KPMG’s 2016 CEO Outlook article. This is an astonishingly low rate that the decision science industry should take as a wake-up call to shine light on their processes. Auditors should also take note because most of these CEOs are from companies that have invested heavily in data analytics.
Learning is a constant part of decision science and, for those looking to advance their analysis skills, it never hurts to have some extra resources. Microsoft (R) Director of Sales Excellence, Eric Ligman, is offering TONS of free eBooks on Microsoft products.
While I was giving a presentation on Analytics at a recent event, one of the meeting participants asked how the Audit industry felt about data products created using Analytic processes. My first thought was that Analytics is a form of “analytical procedures.” This was my response, but I had to qualify it by acknowledging that I wasn’t sure how different auditing standards addressed the topic. Over the last few days I’ve been able to do some research and pull together a quick synopsis of how the most commonly used Audit standards define the work behind Analytics. In summary, my initial impression was pretty close: several of the major Audit standards define this type of work and emphasize the reliability of the data that underpin Analytic data products.
After 100+ years of being silent on the inadequacies of the statistics behind many “statistically significant” conclusions, the American Statistical Association (ASA) published a new statement online last week harshly criticizing p-values. Here’s a link for those who are interested, but a short synopsis follows: http://amstat.tandfonline.com/doi/abs/10.1080/00031305.2016.1154108
The ASA’s actual statement starts on page 8 and includes the following statements:
“Researchers often wish to turn a p-value into a statement about the truth of a null hypothesis, or about the probability that random chance produced the observed data. The p-value is neither.” Ouch. Continue reading “Statistical Version of 100 Year War”
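The ASA's point can be seen in a small simulation of my own (not from the statement itself): when the null hypothesis is true, p-values are uniformly distributed, so roughly 5% of experiments come out "significant" at the 0.05 level even though there is no real effect. A p-value is therefore not the probability that the null is true. This sketch uses only the Python standard library and assumes a simple two-sided z-test with known standard deviation.

```python
import math
import random

# Illustrative simulation: p-values under a TRUE null hypothesis.
random.seed(42)

def two_sided_p(sample):
    """Two-sided z-test p-value for mean 0, assuming known sd = 1."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)  # standardized sample mean
    return math.erfc(abs(z) / math.sqrt(2))

# 2,000 experiments of 50 observations each, true mean exactly 0.
p_values = [
    two_sided_p([random.gauss(0, 1) for _ in range(50)])
    for _ in range(2000)
]
frac_significant = sum(p < 0.05 for p in p_values) / len(p_values)
print(f"Fraction 'significant' under a true null: {frac_significant:.3f}")
```

About one experiment in twenty clears the 0.05 bar by chance alone, which is exactly why a lone small p-value says nothing about whether the null is actually true.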
This post may disturb some “old school” auditors. In fact, I used to be a self-described old school auditor. If I couldn’t find a work paper in the permanent file, someone was going to get an earful about their ability to keep reliable documentation. But the business sector has evolved at a frightening rate since those days. Surprisingly, many audit professionals still consider sampling and testing to be their go-to procedure.
I’m not knocking the old Student’s t-test. It’s still the right tool for some situations. But it’s been harder and harder for me to find those situations in recent years. Most of the subjects I audit now can easily be scrutinized using data mining, or even scripted into an automated monitoring report. Continue reading “Quit Sampling!”
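The automated-monitoring idea boils down to testing 100% of a population with an exception query instead of pulling a sample. Here is a minimal sketch via Python's built-in sqlite3 module; the table, the 10,000 threshold, and the approval rule are all hypothetical.

```python
import sqlite3

# Sketch: test the FULL population with an exception rule, no sampling.
# Table, threshold, and rule are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE payments (id INTEGER, amount REAL, approved_by TEXT)")
cur.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [(1, 500.0, "mgr1"), (2, 12000.0, None),
     (3, 800.0, "mgr2"), (4, 25000.0, "mgr1")],
)

# Exception rule: payments over 10,000 require an approver on record.
cur.execute(
    "SELECT id, amount FROM payments "
    "WHERE amount > 10000 AND approved_by IS NULL"
)
exceptions = cur.fetchall()
print(exceptions)  # [(2, 12000.0)]
```

Schedule a script like this to run nightly and it becomes the kind of automated monitoring report that replaces quarterly sampling.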