From its debut in the late 1980s, the term “Big Data”—like its lexical cousins Big Oil, Big Tobacco and Big Pharma—was engineered to intimidate. Steve Lohr of the New York Times wrote a wonderful account of the phrase’s origin just last year. One originator was author Erik Larson—of Devil in the White City fame—who lamented the adverts piling up in his mailbox (the original spam, some might say). “The keepers of big data,” Larson wrote in a 1989 Harper’s Magazine article, “say they do it for the consumer’s benefit. But data have a way of being used for purposes other than originally intended.” Another originator—John Mashey of early SFX giant Silicon Graphics—used the term to educate potential customers on the wave of digital information to come. “[What matters is] what you do with it,” Mashey wrote in one presentation, “but if you don’t understand the trends it’s what it will do to you.” In both cases, and probably most others early on, Big Data was a colloquialism meant to convey the magnitude of measurable information that companies would inevitably have to deal with in the coming years.
In the last two decades, the sense of urgency around Big Data, especially as directed toward business decision-makers, has been amplified: recent cover stories warn of the “Data Deluge,” announce that “Data Is Power,” or—in a most dramatic case—ask “Is Data the New God?” Most of these treatments do a great job of impressing upon readers the importance of taking action around Big Data but give little insight into how to do so. Only recently have I seen smart publications raise an equally worthwhile consideration: “You May Not Need Big Data After All.” The most important audiences in the room—the execs, entrepreneurs and business users—need an impartial framework for thinking about leveraging the digital information their organizations have, or may gain, access to. By now, execs should understand the dangers of inaction around emerging tech trends, from loss of market share to competitors to leaving cash on the table. But we should keep in mind that committing prematurely to open-ended and/or cash-intensive “Big Data strategies” can be equally crippling.
Rather than view Big Data as an obstacle to be tackled or a treasure to be won, I suggest we think about it as a natural resource, in the same way we would timber, wind or fossil fuel precursors. This framework helps us distinguish between two very different conversations: (a) what raw signals are accessible that may be useful? and (b) what are the costs and benefits of extracting them into usable information?
The volume of measurable raw signals is growing exponentially thanks to unprecedented levels of web traffic, exploding mobile usage both at home and at work, and the proliferation of a wide range of connected devices. To give you an accessible reference point, the identifiable storage capacity of the human brain is around 2 petabytes. That’s about 4,000 MacBook Airs’ worth of capacity. Facebook, Microsoft, Google and Amazon, amongst others, all boast data warehouses equivalent to hundreds of brains. The amount of digital information generated in the last two years represents the overwhelming majority—likely to the tune of 95%+—of extant data today.
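As a sanity check on that reference point, here is the back-of-envelope arithmetic, assuming a 512 GB drive per machine (a figure not stated in the original, roughly the top configuration available at the time of writing):

```python
# Back-of-envelope: how many MacBook Airs hold one human brain's
# estimated 2 petabytes? Assumes 512 GB of storage per machine.
PETABYTE_GB = 1_000_000              # 1 PB = 1,000,000 GB (decimal units)
brain_capacity_gb = 2 * PETABYTE_GB  # ~2 PB estimated brain capacity
macbook_air_gb = 512                 # assumption: 512 GB SSD per machine

machines = brain_capacity_gb / macbook_air_gb
print(f"{machines:,.0f} MacBook Airs")  # 3,906 -- i.e., roughly 4,000
```

Under a smaller drive assumption (e.g., 256 GB), the count doubles, so "about 4,000" is the right order of magnitude either way.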
All of those parties, however, devote massive resources to “mining” the raw signals that customers generate in their wake, “refining” them into a workable format, and storing the results in an orderly fashion. Depending on the business, this process of extracting usable info from raw signals can be complex and expensive, just as turning crude petroleum into car-ready gasoline is. For example, telecom companies all have some level of access—whether they parse it or not—to customers’ behaviors before purchase, while they are active subscribers, and after cancellation. Solutions exist that can turn those behaviors into insights that will help the telecoms win more customers or avoid losing them. Many software startups have attracted millions in investment dollars to provide such customer analytics; core competencies amongst them vary, but Gainsight, BluenoseAnalytics, Totango and ActionIQ are a few relevant examples. As customers of these and other solutions have found, the potential benefits of a successful business analytics initiative are huge. Many other software vendors are working hard to solve companies’ Big Data issues for them, regardless of industry: Real Estate (SmartZip), Life Sciences (Zephyr Health), Government (Premise), Human Resources (HiQ), Fashion (StyleSage), Agriculture (640Labs), and Transportation (Transfix), to name just a handful. Given the potential expense and management focus required in all cases, however, today’s executives must systematically determine which Big Data resources they can extract in an ROI-positive manner.
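To make the telecom example concrete, here is a minimal illustrative sketch of the "refining" step: turning raw subscriber usage events into a simple churn-risk flag. All field names, dates and the 21-day threshold are hypothetical, and this is not any vendor's actual method—real customer-analytics products use far richer signals and models.

```python
from datetime import date

# Hypothetical raw signals: (customer_id, event_date, minutes_used)
raw_events = [
    ("c1", date(2014, 9, 1), 120),
    ("c1", date(2014, 9, 10), 5),
    ("c2", date(2014, 10, 30), 240),
]

def churn_risk(events, today=date(2014, 11, 5), quiet_days=21):
    """Flag customers whose most recent activity is older than `quiet_days`."""
    last_seen = {}
    for cust, day, _minutes in events:
        if cust not in last_seen or day > last_seen[cust]:
            last_seen[cust] = day
    return {c: (today - d).days > quiet_days for c, d in last_seen.items()}

print(churn_risk(raw_events))  # {'c1': True, 'c2': False}
```

The point is not the toy logic but the shape of the work: raw events are cheap to accumulate, while the refinement rules—what counts as a risk signal, over what window—are where the analytical investment goes.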
1) What raw signals does your business have ready access to? Like landholders who discover oil on their fields, search engines found early on that they could readily collect a wealth of signals on their customers’ behavior (thanks to their volume and high level of product interaction). Ad targeting was just one revenue-generating result. In some cases, it may also make sense to consider acquiring additional data access from third parties to supplement core in-house data resources.
2) How complex will the process be to refine those raw signals into usable information? Identify your low-hanging fruit. Point-of-sale purchase records, for example, are more straightforward to collect than real-time video streams, because the latter usually require more advanced database technologies that enable the querying of massive unstructured datasets. Moreover, the mere act of capturing data incurs some expense, and the data are of little value without the ability to reap actionable business insights from them.
3) What investments will you need to make to execute on this refinement process? A combination of data scientists and front-end business intelligence software will likely be needed to take full advantage of your natural data resource. Unlike most mining or energy companies, many Big Data adopters won’t need as much in the way of CapEx (thanks to as-a-service providers and human capital). The investment sizes, however, can be just as significant and will likely require different business planning (e.g., they may not be amortizable).
4) Once the refined insights are generated, what’s the true end value to the business? Oil and timber are easy to sell on the open market, but Big Data insights—in most cases—are of greatest value to the business internally (and privacy or competitive concerns may arise regardless). Like any natural resource, Big Data may generate 10-figure revenue opportunities for some, while others will find it makes sense to invest on a limited basis or to outsource more of the value chain.
In the modern enterprise, Big Data is a ubiquitous natural resource, completely renewable but only of potential value at the outset. Forward-thinking leaders must now commit to understanding the full complexity of their unique extraction process. ROI will vary by industry and business model. But a dedication to continuously testing new, innovative strategies—Big Data wildcatting, in a sense—is a core competency that all next-generation executives must now embrace.
This post originally appeared on the Square1Bank blog on November 5, 2014.