Journalism is struggling. According to Gallup's long-running Honesty and Ethics in Professions survey, trust in journalists has declined steadily over the last forty years and is now at an all-time low. In many instances, it isn't necessarily the fault of the journalists themselves. In some cases, the confusion stems from a network's choice of sources, the sheer variety of sources, and the speed with which people clamor for news. Back when there were only three primary networks and limited methods for transmitting information, seasoned newspaper reporters and face-to-face interactions seemed to keep a tighter rein on journalism's criteria and standards.
Insurance executives are suffering from many of the same issues when trying to rely upon their data and analytics. They may frequently ask themselves, "Where am I getting my news about my business?" and "Can I trust what I'm being told?" Data can be coming from anywhere inside or outside the company. Analytics can be practiced by those who may be reaching across departmental boundaries. Methods may contain errors. Reporting can be suspect. Decisions may be hastily made upon "fake news."
No industry is immune. Google Flu Trends (2008-2013) was supposed to predict flu outbreaks better than the Centers for Disease Control by building a geographic picture of search terms loosely related to the flu. Although the data itself was not suspect, the algorithms consistently overrated correlations and over-predicted outbreaks. After several years of poor results, teams from Northeastern University, the University of Houston and Harvard concluded that one of Google's primary issues was opaque methodology, making the tool "dangerous to rely on."
When Majesco works with its clients, we make sure that they have the right processes in place to certify data, reports and views. We want to make sure that they aren't the ones generating fake news in the form of analyses and visualizations that let people reach the wrong conclusion with confidence. We want to generate confidence by consistently proving that sources, methods and conclusions are verifiable and transparent. Beyond supporting the right conclusions, data quality increases the monetary value of data by improving process quality and lowering the risk of business initiative failure.
Here are a few actions that insurers can take to close data and analytic gaps and create an environment where news reflects reality and can be trusted.
One simple recommendation is to watermark views of data as certified. Certified sources, certified views, and certified analyses could carry a mark that would only be allowed if a series of steps had been taken to maintain source and process purity. This Good Housekeeping Seal of Approval® would provide your organization's information consumers with the confidence that they are looking at real news. Of course, the important part of this process is not the mark itself, but developing the methods that will be certified. We'll talk more about that in a moment.
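As an illustration, the certification idea can be sketched in a few lines of Python. The step names below (`source_vetted`, `method_reviewed`, `output_reconciled`) are hypothetical placeholders, not a prescribed standard; the point is simply that the watermark is computed from recorded steps rather than applied by hand.

```python
from dataclasses import dataclass, field

# Hypothetical checklist of certification steps; the names are
# illustrative assumptions, not an industry standard.
REQUIRED_STEPS = {"source_vetted", "method_reviewed", "output_reconciled"}

@dataclass
class DataView:
    name: str
    completed_steps: set = field(default_factory=set)

    @property
    def certified(self) -> bool:
        # The mark is earned only when every required step
        # has been completed and recorded.
        return REQUIRED_STEPS.issubset(self.completed_steps)

    def watermark(self) -> str:
        # Render the view's name with its certification mark.
        label = "CERTIFIED" if self.certified else "UNCERTIFIED"
        return f"{self.name} [{label}]"

view = DataView("Q3 Loss Ratios", {"source_vetted", "method_reviewed"})
print(view.watermark())   # a step is missing, so the view is UNCERTIFIED

view.completed_steps.add("output_reconciled")
print(view.watermark())   # all steps recorded, so the view is CERTIFIED
```

Because the mark is derived from the recorded steps, it cannot be claimed without the underlying process being documented, which is the whole point of the seal.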
Attributing even ad hoc information to its data source also allows other team members to trust that the source is vetted and that the information presented will be verifiable. In any research project, it is common to add data citations, just as one would add a footnote in an article or paper.
Attributions add one other important layer of security to data and analytics: historical reference. If a team member leaves or is assigned to another project, someone attempting to duplicate the analysis a year from now will know where to look for an updated data set. The results of decisions made upon the data may also not materialize for months or years. If those results are less than optimal, teams may wish to examine the documented data sources and analytic processes.
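A minimal sketch of such an attribution record, again in Python, might look like the following. The field names and the example dataset, system, and owner are invented for illustration; any real schema would reflect your own systems of record.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative attribution record; field names are assumptions,
# not a standard citation schema.
@dataclass(frozen=True)
class DataCitation:
    dataset: str            # what was used
    system_of_record: str   # where it came from
    owner: str              # who is accountable for it
    extracted_on: date      # when it was pulled

    def footnote(self) -> str:
        # Renders a citation line, much like a footnote in a paper,
        # so a later analyst can locate an updated data set.
        return (f"{self.dataset}, {self.system_of_record} "
                f"(owner: {self.owner}), extracted {self.extracted_on.isoformat()}")

cite = DataCitation("Homeowners claims extract", "ClaimsCore",
                    "Data Services", date(2024, 3, 1))
print(cite.footnote())
```

Attaching a record like this to every report or view makes the historical reference automatic: a year later, the citation still says exactly which data set, from which system, was used and when.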
Focusing the organization on the benefits of good data hygiene and creating a culture of data quality will increase your organization's data quality and improve trust levels for information. Governance is the core of safe data usability. Poor practices and fake news arise most easily from a loosely governed data organization.
The concepts of governance should be communicated throughout the organization so that those who have been practicing data analytics without oversight can “come in from out of the cold” and allow their practices to be verified. But governance teams should always act less like data police and more like best practice facilitators. The goal is to enable the organization to make the best decisions in a timely manner, not to promote rigidity at the cost of opportunity.
Finally, when data teams constantly have their ear to the ground and are continuously aligning the information that is available with the needs of the consumers of that information, best practices will happen naturally. This awareness not only keeps fake news to a minimum, but it also prevents new, less reliable reports and views from cropping up under the excuse that necessity is the mother of invention.
It also means that data teams will have their eyes open to new sources with which to assist the business. When data teams and business users are frequently helping each other to attain the best results, a crucial bond is formed where everyone is unified behind the visualization of timely, transparent, usable insights. Data stewards will have confidence that their news is real. Business users will have confidence to act upon it.
Lazer, David, and Ryan Kennedy, "What We Can Learn From the Epic Failure of Google Flu Trends," Wired, October 1, 2015.