Natural Language Technologies Augment Data Visualization

Boris Evelson, VP, Principal Analyst, Forrester Research

In the ongoing effort to derive insights from massive troves of data, businesses are increasingly realizing the benefits of presenting data in a visual format. Indeed, any enterprise BI tool worth its salt comes prebuilt with some degree of data visualization capability, allowing analysts to craft a coherent story from raw data.

But often, simply visualizing data is not enough. Most internal stakeholders who need to grasp the importance of the data are not themselves data visualization experts. Their struggles are understandable: visualizations often represent multiple dimensions through size, color, shape, and spatial proximity. Our capacity to transform data for visual consumption has improved greatly, but interpreting those visuals is often far from effortless. Business users need data to tell them a story, and to do so with the least possible manual drilling, screen staring, and head scratching.

Enter artificial intelligence, which should come as no surprise. Two AI technologies in particular are poised to augment the way we consume data visually by allowing us to have ‘conversations’ with our charts and dashboards. Natural language processing (NLP), by which algorithms parse everyday speech to determine meaning, will let users ask questions in place of conventional querying. Natural language generation (NLG) can then map data to concepts and render them in a colloquial format for easy consumption. The onset of these capabilities will provide:

Faster Time to Insights

NLP technologies will drastically reduce the number of “clicks to insight” required by users. In lieu of dragging and dropping variables and manually drilling down, business analysts will simply be able to type a question in plain language, and the visualization will adapt accordingly. What’s more, users often waste time sifting through massive data sets just to determine what the attribute or variable they’re interested in is even called. After parsing a user’s question, these tools can map inputs to a catalogue of synonyms, determine semantic meaning, and surface the most relevant variable or attribute in the visualization.
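Under the hood, that synonym mapping can start as simply as a lookup table. A minimal sketch in Python; the catalogue and column names here are illustrative, not taken from any particular BI product:

```python
from typing import Optional

# Illustrative synonym catalogue: everyday terms -> dataset columns
SYNONYMS = {
    "revenue": "net_sales_usd",
    "sales": "net_sales_usd",
    "customers": "customer_count",
    "headcount": "employee_count",
}

def resolve_field(user_term: str) -> Optional[str]:
    """Return the dataset attribute a plain-language term refers to,
    or None if the term is not in the catalogue."""
    return SYNONYMS.get(user_term.lower().strip())
```

Real tools go further, using embeddings and usage statistics to rank candidate fields, but the core idea is the same: the user types “revenue” and never needs to know the column is called `net_sales_usd`.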

Reduced Skill Requirements

One of the ultimate goals for an agile BI environment is successful end-user self-service. But obvious barriers exist, as not everyone in your organization can be expected to have proficiency in an enterprise BI tool, or even in the basics of data visualization. NLP capabilities require only that a business user know what question they want answered, with no need to hunt for the exact tool or control needed to manipulate the visualization. Auto-completion features can offer suggestions while the user is mid-sentence, guiding them toward the words and phrases most closely linked to the fields that exist in the visualization’s data set. Perhaps most importantly, NLG allows even the least skilled user to glean meaning from a chart or dashboard, diminishing the chance of misinterpreting data and leading to better decision-making.
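The auto-completion behavior can be sketched as a simple prefix match against the data set’s field names. This is a toy version; production tools also rank suggestions by popularity and semantic relevance:

```python
def suggest(prefix: str, fields: list[str], limit: int = 3) -> list[str]:
    """Suggest field names that start with what the user has typed so far."""
    p = prefix.lower().strip()
    return sorted(f for f in fields if f.lower().startswith(p))[:limit]
```

Typing “sal” against a data set with fields like `salary` and `sales_region` would surface both, steering the user toward terms the visualization can actually answer.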

More Natural Interactions

Users often know, in a conversational sense, what they want to get from a visualization, but lack the skills to “translate” it into the language of the data visualization environment. Much as advanced data visualization capabilities supplanted traditional SQL with visual querying, the conversational query will soon become the norm. “How many apartments were sold in Brooklyn in March 2016?” Imagine that this query updates a map of the NYC metro area to zoom in on Brooklyn and display all apartments sold in the borough during the specified time frame. As these tools develop stronger semantic understanding, users will be able to ask follow-ups exactly as they would in human conversation. “What about in May?” In Tableau’s Eviza prototype, the visualization can “remember” that you’re interested in Brooklyn sales in 2016, and will update only the month to the new desired timeframe.
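That “remembering” amounts to carrying filter slots forward between questions. A toy sketch, assuming (purely for illustration) that each parsed question yields a dictionary of slots, with unmentioned slots left as None:

```python
def interpret(follow_up: dict, context: dict) -> dict:
    """Merge a follow-up question's slots into the remembered context.

    Slots the user didn't mention (None) keep their previous values,
    so "What about in May?" preserves the borough and the year."""
    merged = dict(context)
    merged.update({k: v for k, v in follow_up.items() if v is not None})
    return merged
```

Given a remembered context of `{"borough": "Brooklyn", "year": 2016, "month": "March"}`, the follow-up “What about in May?” parses to a slot dictionary that only sets the month, and the merged query keeps Brooklyn and 2016.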

Deeper Explanations of the Data

On a micro-scale, certain tools allow users to enter queries as simple as “What do the biggest circles represent?” and receive a simple readout explaining the relevance of sizing for that given element. On a larger scale, NLG tools such as Wordsmith (which already has integrations with Tableau, TIBCO Spotfire, MicroStrategy, and even Excel), allow data scientists to create templates whose rules dictate how a given data chart is translated into plain language. Developers can decide exactly which metrics and timeframes matter most, and even customize the language to incorporate lingo specific to an individual business. In effect, this disambiguation of data interpretation means more consistency of insights and decisions across the organization.
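Template-driven NLG of the kind described above can be approximated with plain string templates mapped over data rows. The template text and field names below are invented for illustration, not drawn from Wordsmith or any vendor API:

```python
def render(template: str, row: dict) -> str:
    """Fill a plain-language template with values from one data row."""
    return template.format(**row)

# Hypothetical template; a developer would tailor the wording and
# metrics to the business's own lingo.
TEMPLATE = "{metric} in {region} {direction} {pct:.0%} versus last quarter."
```

Because every reader sees the same sentence generated from the same rule, the interpretation of the chart no longer varies from analyst to analyst, which is the consistency benefit the article points to.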
