The 80% Blind Spot: Are You Ignoring Most of Your Organization’s Data?

January 9, 2019      By Adam Rogers

Ultimate Takeaway
  • Only one-third of senior executives surveyed considered themselves successful in developing a data-driven organization, despite this being a primary goal for 99% of all respondents.
  • Analyzing unstructured data has historically been quite difficult, so organizations focused on structured data. But only 20% of data is structured—meaning many organizations are ignoring 80% of their data.
  • Natural language processing and machine learning are cutting through previous limitations and have made unstructured data insights available to business leaders instantaneously.

Decision-makers love data.

In fact, in the sixth annual “Big Data Executive Survey” representing senior executives from 57 large corporations, 99% of respondents reported transitioning to a more data-driven culture, and a full 97.2% had invested in Big Data and AI initiatives. Yet only one-third considered themselves successful in developing a data-driven organization, a disconnect that’s been apparent since the first survey in 2012. And while nearly half of 2018 respondents cited people challenges as their greatest barrier to becoming data-driven, I think there’s another factor at play here: data variety.

Structured vs. unstructured data

Not all data are created equal, and most organizations aren’t even tapping into some of the most important data they have. Until recently, organizations have relied primarily on structured data—i.e., highly organized data sets that are easy to analyze using predetermined parameters. Relational databases, spreadsheets, and clearly defined workforce metrics like attrition are examples of structured data.

Unstructured data, by contrast, does not follow a predefined data model and does not fit neatly into relational databases. Think emails, videos, social media posts, and document copy. Due to its variability and lack of traditional organization, unstructured data requires specialized software (and occasionally hardware) to organize, analyze, and understand. As a result of these complications, it has been largely ignored by businesses.
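To make the contrast concrete, here’s a minimal sketch (the table, field names, and exit note are hypothetical) of why structured records answer questions directly while free text does not:

```python
import sqlite3

# Structured data: fixed columns, so a standard SQL query answers the
# question ("how many people left Sales?") directly.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE attrition (employee_id INTEGER, dept TEXT, left_company INTEGER)"
)
conn.executemany(
    "INSERT INTO attrition VALUES (?, ?, ?)",
    [(1, "Sales", 1), (2, "Sales", 0), (3, "Support", 1)],
)
sales_departures = conn.execute(
    "SELECT COUNT(*) FROM attrition WHERE dept = 'Sales' AND left_company = 1"
).fetchone()[0]  # -> 1

# Unstructured data: no predefined model. The signal ("why did they leave?")
# is buried in free text, with no column to filter or aggregate on.
exit_note = "Honestly, I loved the team, but the on-call schedule burned me out."
```

Extracting the “why” from that exit note is exactly the problem the natural language processing techniques discussed below are built to solve.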

Gartner researchers estimate that less than 20% of all enterprise data is structured. Can you imagine making crucial business decisions based on just one-fifth of the relevant information? If that thought gives you heart palpitations – it should. The sheer volume of unstructured data alone makes it central to long-term organizational strategy.

The promise of unstructured data

Unstructured data also provides rich, detailed, qualitative insight into what’s truly happening within your organization. Consider traditional employee satisfaction surveys. On a scale of 1-5, how happy are you with your employee benefits? These surveys don’t provide answers – they provide numbers. Which would you prefer: knowing that 67% of your employees are at least somewhat unhappy with their benefits program, or knowing that 35% of your people have a hard time meeting their deductibles and another 55% desperately want more PTO?

Open-ended questions let respondents talk about what truly matters to them, ultimately providing you with much more detailed and actionable insights. When you know exactly what’s on your employees’ minds, you can make strategic decisions to improve morale, retention, and performance.
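As a toy illustration (the theme names and keyword lists here are invented, and real text-analytics systems go well beyond keyword lookup), tagging open-ended responses with themes might look like:

```python
# Hypothetical themes and their trigger keywords, for illustration only.
THEME_KEYWORDS = {
    "benefits_cost": {"deductible", "premium", "copay"},
    "time_off": {"pto", "vacation", "leave"},
}

def tag_themes(response: str) -> set:
    """Return every theme whose keywords appear in a free-text response."""
    words = {w.strip(".,!?").lower() for w in response.split()}
    return {theme for theme, kws in THEME_KEYWORDS.items() if words & kws}

themes = tag_themes("I can never meet my deductible, and I want more PTO.")
# -> {"benefits_cost", "time_off"}
```

A single free-text answer surfaces two distinct, actionable themes – something a 1-to-5 rating can never do.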

Natural Language Processing

In the past, unstructured data necessitated manually analyzing massive amounts of free-text data – a process that’s both lengthy and expensive. But recent advances in natural language processing (NLP) and machine learning are cutting through previous limitations and making unstructured insights available to business leaders instantaneously.

Perception by Ultimate Software was developed by computational linguistics researchers from Stanford University, leveraging data models that can both interpret and intimately understand human language. These paradigm-breaking models can accurately classify free text into more than 100 emotions and 140 different themes, including both workplace topics and performance competencies.

What’s more, these models don’t rely on static keywords, which become clunky and ineffective when put to the “natural language” test. Instead, Perception’s statistical approach uses preprocessing, part-of-speech tagging, and dependency parsing to break text into coherent phrases. These models are backed by years of linguistic research and can actually translate text into vectors, seamlessly identifying themes and emotions and applying machine learning to continuously improve.
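Perception’s actual models are far richer than anything shown here, but the core idea of “translating text into vectors” can be sketched with a deliberately crude bag-of-words count and cosine similarity – purely illustrative, and not the product’s method:

```python
from collections import Counter
from math import sqrt

def to_vector(text):
    """A crude text-to-vector translation: bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Similarity of two vectors; higher means more shared vocabulary."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Two responses about the same theme score higher than two about different themes.
same_theme = cosine(to_vector("more paid time off"), to_vector("more time off please"))
diff_theme = cosine(to_vector("more paid time off"), to_vector("my deductible is huge"))
# same_theme -> 0.75, diff_theme -> 0.0
```

Once responses live in a vector space like this (however sophisticated the real embedding), grouping them by theme and emotion becomes a numerical problem machines can solve at scale.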

The result? Organizations receive honest, detailed answers to their most crucial employee questions – learning what their people are saying as well as how they actually feel. These insights can be applied to holistically improve the employee experience as well as predict employee behaviors and workforce trends.

When you’re finally able to analyze 100% of your workforce data, incredible things begin to happen.

You can take my word on this: employee surveys are just the beginning.
