Not All Data is Created Equal: the Value of Data Granularity

As they evolve into the digital age, many modern organizations have developed an understandable obsession with data. On the plus side, they generously fund initiatives that support the collection and storage of data. On the negative side, there is little recognition that this often amounts to collecting data for data's sake. Executives eager to bring their companies into the 21st century by making them data-driven are part of the way there: they have realized that data is of little value without deriving insights from it. But so much data is being collected that it becomes difficult to see the wood for the trees and to extract the kind of analysis that leads to meaningful shifts in strategy. What is the missing critical element? Simply this: large sets of data can lead to misleading conclusions. The real game changer is the insight that can be pulled out of nuggets and granules of data. The catch is that you have to know where to look.

Valuing data quantity over quality equates to looking for the proverbial needle in a haystack. While it's true that there's a piece of valuable information somewhere in the massive dataset, how much labor and capital will it take to retrieve it? And if the wrong piece of data is retrieved and developed into a plan of action, how long will it take the organization to course-correct?

Companies on the cutting edge are beginning to recognize that the adage "data is the new oil" is out of date. Instead, these companies prioritize refining their algorithms, allowing them to run lean when it comes to data. Facebook, for example, no longer has to worry about encroaching on users' privacy, because its advanced algorithms can use simple, publicly available data to generate the insights necessary to sell ads.

The Level of Detail in Data

A data glut riddled with hidden costs can be avoided by honing in on the correct level of data granularity before the collection process takes place. Data granularity measures the level of detail present in data. For example, data that records yearly transactions across all stores in a country has low granularity, while data that records each individual store's transactions by the second has very high granularity.
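To make the distinction concrete, here is a minimal Python sketch (with invented numbers) that aggregates the same hypothetical transaction log at both extremes: once per year across all stores, and once per store per second.

```python
import pandas as pd

# A hypothetical transaction log; every value is invented for illustration.
tx = pd.DataFrame({
    "store_id": ["S1", "S1", "S2", "S2"],
    "timestamp": pd.to_datetime([
        "2023-03-01 09:15:01", "2023-11-20 14:02:37",
        "2023-03-01 09:15:01", "2023-07-04 18:30:59",
    ]),
    "amount": [19.99, 5.49, 102.00, 7.25],
})

# Low granularity: one total per year across the whole country
yearly = tx.groupby(tx["timestamp"].dt.year)["amount"].sum()

# High granularity: each store's transactions, down to the second
per_store_second = tx.groupby(["store_id", "timestamp"])["amount"].sum()
```

Both views come from the same raw records; the question is which level of aggregation a given business function actually needs.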

However, there is a danger in organizations assuming that increased granularity directly correlates with increased value or applicability. When someone is lost, zooming further into the map does not proportionally improve their chances of finding their way home. There is an optimal level of data granularity for each function within an organization; a uniform level of granularity across the organization might benefit some functions but hinder others.

Consider two examples of organizations using the right and the wrong levels of data granularity. Organization A succeeds by understanding the specific price sensitivity of each of its product and customer combinations. Organization B bleeds margin by pushing a blanket top-down price increase of 5% on every product and customer combination, informed solely by the data point that costs have increased by an average of 5%. Both are informed by data, but the second operates at such low granularity that it will inevitably deliver poor results.
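A minimal sketch of the contrast, with hypothetical prices, costs, and elasticities: organization B applies the same 5% increase everywhere, while organization A reprices each product and customer combination from its own estimated price sensitivity, here using the textbook markup rule for constant-elasticity demand.

```python
# Hypothetical (product, segment) combinations; all figures are invented.
combos = [
    # product, segment, current_price, unit_cost, price_elasticity
    ("widget", "retail",    12.00, 10.00, -3.0),
    ("widget", "wholesale", 11.00, 10.00, -6.0),
    ("gadget", "retail",    55.00, 40.00, -4.0),
]

for product, segment, price, cost, e in combos:
    blanket = price * 1.05          # organization B: +5% on everything
    granular = cost * e / (1 + e)   # organization A: profit-maximizing price
                                    # under constant elasticity e < -1
    print(f"{product}/{segment}: blanket {blanket:.2f}, granular {granular:.2f}")
```

Even in this toy setup, the blanket rule pushes some combinations past their profit-maximizing price while leaving money on the table elsewhere.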

Agility Matters in Uncertain Times

The pandemic, the ensuing supply chain crisis, and geopolitical instability have exposed the holes in traditional pricing models. Traditional models assume that customers are highly price sensitive: that price is the deciding factor when choosing between two comparable items. COVID challenged these assumptions; retailers were surprised when massive discounts did little to remedy overstocks, because inflation and public health concerns governed spending patterns. Linear pricing models could not take these external factors into account. With more granular data, however, pricing models can be developed that factor in location, demographics, seasonality, and countless other variables.
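As an illustration of what such a model might look like, the sketch below fits a demand model on hypothetical sales records that carry region, income, and month alongside price. All column names and figures are invented; the point is only that a flexible learner can pick up interactions, such as seasonality that differs by region, which a single linear price coefficient cannot.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical granular sales records; every column and value is invented.
data = pd.DataFrame({
    "price":         [9.99, 8.49, 9.99, 10.49, 8.99, 9.49, 9.29, 8.79],
    "region":        ["north", "south"] * 4,
    "median_income": [52_000, 48_000] * 4,
    "month":         [1, 1, 4, 4, 7, 7, 12, 12],
    "units_sold":    [120, 180, 95, 150, 80, 140, 210, 260],
})

features = pd.get_dummies(data.drop(columns="units_sold"))  # one-hot the region
target = data["units_sold"]

# A tree ensemble can capture nonlinear effects and interactions that a
# straight-line price/demand relationship misses.
model = GradientBoostingRegressor(random_state=0).fit(features, target)
```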

Another disadvantage of the traditional approach to pricing is that it’s inflexible and unresponsive to rapid changes in spending patterns. With today’s volatile macroeconomic conditions, agility is crucial—alternative trade routes, suppliers, and customer bases need to be established on the fly. Predictive models try to foresee global crises but are ultimately playing an unwinnable game. On the other hand, prescriptive pricing models based on granular data react so quickly that predicting the future becomes unnecessary.

Learn to Frame Complex Problems, Let AI Do the Rest

In the coming years, training and education will place more emphasis on asking the right questions rather than answering the wrong ones. While AI can automate away tedious, manual tasks, it lacks the critical thinking skills and independence necessary to frame complex problems. Data granularity goes hand in hand with this cultural shift—the collection of data will become cheap and accessible, yet granularity issues will require the skills that make humans irreplaceable.

AI isn't automatically added value. Without talented human capital to frame the questions, these tech investments can do more harm than good. For example, asking an AI "How do I sell more inventory?" is the wrong question: the machine will simply suggest massive discounts across the board. One should instead ask, "How can I lift my market share, sales, and margin while preserving my value perception?" because the answer will be a balanced view of the complex outcomes that firms actually optimize for. The approach, then, is to identify the most productive goal for the machine. If you lose sight of what you actually want, the outcome can damage the business even as the machine gets better at doing the wrong thing. The key is setting the right goals and putting rules in place that ensure nothing critical is sacrificed along the way.
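A toy sketch of that goal-setting, with an invented demand curve and invented guardrails: instead of maximizing units sold, which would reward the deepest possible discount, the search maximizes margin subject to rules that protect value perception and unit economics.

```python
COST, LIST_PRICE = 6.00, 10.00

def demand(price: float) -> float:
    # Invented demand curve: lower prices sell more units.
    return 1_000 * (LIST_PRICE / price) ** 2

best = None
for pct in range(0, 51):                # candidate discounts, 0-50%
    price = LIST_PRICE * (1 - pct / 100)
    if price < 0.8 * LIST_PRICE:        # guardrail: protect value perception
        continue
    if price <= COST:                   # guardrail: never sell below cost
        continue
    margin = (price - COST) * demand(price)
    if best is None or margin > best[1]:
        best = (pct, margin)

print(f"best discount: {best[0]}% (margin {best[1]:,.0f})")
```

With this particular curve, the guarded, margin-focused objective selects no discount at all, whereas a units-only objective would simply have chosen the deepest discount allowed.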

For many organizations, transitioning business processes from manual to automated and data-backed has been difficult; the wrong KPIs are often emphasized, and analytical models end up overfit or underfit. By focusing on the right level of granularity, however, organizations can unlock the full potential of artificial intelligence.

To learn how data granularity can help your organization, visit the website here.


About the Author

Fabrizio Fantini, PhD is Vice President of Product Strategy at ToolsGroup, a supply chain planning and optimization firm.