Economic Value of Data (EvD) Challenges | @BigDataExpo #BigData #Analytics

Data has a direct impact on an organization’s financial investments and monetization capabilities

Well, my recent University of San Francisco research paper “Applying Economic Concepts To Big Data To Determine The Financial Value Of The Organization’s Data And Analytics Research Paper” has fueled some very interesting conversations. Most excellent! That was one of its goals.

It is important for organizations to invest the time and effort to understand the economic value of their data because data has a direct impact on an organization’s financial investments and monetization capabilities. However, calculating economic value of data (EvD) is very difficult because:

  • Data does not have an innate fixed value, especially as compared to traditional assets, and
  • Using traditional accounting practices to calculate EvD doesn’t accurately capture the financial and economic potential of the data asset.

And in light of those points, let me share some thoughts that I probably should have made more evident in the research paper.

Factoid #1:  Data is NOT a Commodity (So Data is NOT the New Oil)
Crude oil is a commodity. West Texas Intermediate (WTI), also known as Texas light sweet, is a grade of crude oil used as a benchmark in oil pricing. This grade is described as light because of its relatively low density, and sweet because of its low sulfur content.  WTI is a light crude oil, with an API gravity of around 39.6, specific gravity of about 0.827 and less than 0.5% sulfur[1].

And here’s the important factoid about a commodity: every barrel of Texas light sweet is exactly like any other barrel of Texas light sweet. One barrel of Texas light sweet is indistinguishable from any other barrel of Texas light sweet. Oil is truly a commodity.

However, data is not a commodity. Data does not have a fixed chemical composition, and pieces of data are NOT indistinguishable from any other piece of data. In fact, data may be more akin to genetic code, in so much as the genetic code defines who we are (see Figure 1).

Figure 1: Genetic Code

Every piece of personal data – every sales transaction, consumer comment, social media post, phone call, text message, credit card transaction, fitness band reading, doctor visit, web browse, keyword search, etc. – comprises another “strand” of one’s “behavioral genetic code” that indicates one’s inclinations, tendencies, propensities, interests, passions, associations and affiliations.

It’s not just the raw data that holds valuable strands of our “behavioral genetic code”; the metadata about our transactional and engagement data is an equally rich source of insights. For example, look at the metadata associated with a 140-character tweet. 140 characters wouldn’t seem to be much data. However, the richness of that 140-character tweet explodes when you start coupling the tweet with all the metadata necessary to understand those 140 characters in the context of the conversation (see Figure 2).

Figure 2: “Importance of Metadata in a Big Data World”
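To make the point concrete, here is a minimal Python sketch of how a short tweet balloons once its metadata is attached. The field names below are hypothetical and purely illustrative – real platforms expose different (and far richer) metadata sets:

```python
# A 140-character tweet is a tiny amount of raw text on its own.
tweet_text = "Loving the new fitness band -- step count way up this week! #quantifiedself"

# Hypothetical metadata fields (illustrative only, not a real platform schema).
tweet_with_metadata = {
    "text": tweet_text,
    "author_id": "user_8675309",
    "timestamp": "2017-05-04T14:22:31Z",
    "geo": {"lat": 37.77, "lon": -122.42},
    "in_reply_to": "tweet_112358",        # conversational context
    "hashtags": ["quantifiedself"],
    "mentions": [],
    "client": "mobile_ios",
    "followers_at_post_time": 1542,
}

raw_bytes = len(tweet_text.encode("utf-8"))
enriched_bytes = len(str(tweet_with_metadata).encode("utf-8"))
print(f"raw text: {raw_bytes} bytes; with metadata: {enriched_bytes} bytes")
```

Even this toy example roughly triples the payload, and the metadata – who, when, where, in reply to what – is precisely what turns the raw text into a strand of behavioral genetic code.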

The Bottom-line:
Data is not a commodity, which makes determining the economic value of data very difficult, and maybe even irrelevant, using traditional accounting techniques. Which brings us to the next point…

Factoid #2: Can’t Use Accounting Techniques to Calculate Economic Value of Data
The challenge with using accounting or GAAP (Generally Accepted Accounting Principles) techniques to determine the economic value of data is that accounting takes a retrospective view of your business: it values assets based upon what the organization paid to acquire them.

Instead of using the retrospective accounting perspective, we want to take a forward-looking, predictive perspective to determine the economic value of data. We want to apply data science concepts and techniques to determine the EvD by looking at how the data will be used to optimize key business processes, uncover new revenue opportunities, reduce compliance and security risks, and create a more compelling customer experience. Think determining the value of data based upon “value in use” (see Table 1).

Accounting Perspective | Data Science Perspective
Historical valuation based upon knowing what has happened | Predictive valuation based upon knowing what is likely to happen and what action one should take
Value determination based upon what the organization paid for the asset in the past | Value determination based upon how the organization will monetize the asset in the future
Valuations are known with 100% confidence based upon what was paid for the asset | Valuations are based on probabilities, with confidence levels dependent upon how the asset will be used and monetized
Value determination based upon acquisition costs (“value in acquisition”) | Value determination based upon how the data will be used (“value in use”)

Table 1: Accounting versus Data Science Perspectives
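One way to operationalize the “value in use” column is to score each prospective use case by its projected financial impact, weighted by the probability that the data actually delivers it. The Python sketch below is a simplified illustration; the use cases, dollar figures and probabilities are invented for the example:

```python
# Illustrative "value in use" calculation: expected value of a data asset
# across the business use cases it is projected to support.
use_cases = [
    # (use case, projected annual impact ($), probability of realization)
    ("Optimize supply chain",     2_000_000, 0.60),
    ("Reduce customer churn",     1_500_000, 0.45),
    ("New subscription offering", 3_000_000, 0.25),
    ("Lower compliance risk",       500_000, 0.80),
]

# EvD as an expectation: sum of impact x probability over all use cases.
evd = sum(impact * prob for _, impact, prob in use_cases)
print(f"Expected economic value of data (EvD): ${evd:,.0f}")
```

Note how this mirrors Table 1: the valuation is probabilistic, forward-looking, and tied to how the data will be used rather than what was paid to acquire it.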

This “value in use” perspective traces its roots to Adam Smith, the pioneer of modern economics. In his book “Wealth of Nations,” Adam Smith[3] defined capital as “that part of a man’s stock which provides him a revenue stream.” Adam Smith’s concept of “revenue streams” is consistent with the data science approach of leveraging data and analytics to create “value in use.”

We have ready examples of how other organizations determine the economic value of assets based upon “value in use,” starting with my favorite data science book – Moneyball. Moneyball describes a strategy of leveraging data and analytics (sabermetrics) to determine how valuable a player might be in the future. One of the biggest challenges for sports teams is determining a player’s future value, since player salaries and salary cap management are among the biggest challenges in running a franchise. Consequently, data science provides the necessary forward-looking, predictive perspective to make those “future value” decisions.

Sports organizations cannot accurately determine a player’s economic value based entirely on past stats. To address this challenge, basketball analysts created Real Plus-Minus (RPM)[4], a predictive metric (score) designed to forecast how well a player will perform in the future.
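RPM itself is a sophisticated regression-based metric, but the underlying idea – projecting future contribution rather than tallying past totals – can be sketched in a few lines of Python. The recency weights and age-decline factor below are invented for illustration and bear no relation to RPM’s actual methodology:

```python
# Toy forward-looking player valuation: weight recent seasons more heavily
# and apply a simple age-based decline factor. All weights are illustrative.
def projected_value(recent_plus_minus, age, peak_age=27, decline_per_year=0.03):
    # Recency-weighted average of per-season plus-minus (most recent season last).
    weights = [0.2, 0.3, 0.5][-len(recent_plus_minus):]
    base = sum(w * pm for w, pm in zip(weights, recent_plus_minus)) / sum(weights)
    # Discount for expected age-related decline beyond the peak.
    decline = max(0, age - peak_age) * decline_per_year
    return base * (1 - decline)

# A 29-year-old whose recent form is trending up: recent seasons dominate,
# lightly discounted for being past the assumed peak age.
print(projected_value([2.1, 3.0, 3.8], age=29))
```

The point is the same as Table 1: the valuation is driven by what the player is likely to contribute going forward, not by a retrospective tally of what has already happened.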

The Bottom-line:
We need to transition the economic value of data conversation away from the accounting retrospective of what we paid to acquire the data, to a data science predictive perspective of how the data is going to be used to deliver “value in use.”

Economic Value of Data Summary
Data is an asset that can’t be treated like a commodity because:

  1. Every piece of data is different and provides unique value based upon the context (metadata) of that data, and
  2. Traditional retrospective (accounting) methods of determining EvD won’t work because the intrinsic value of the data is not what one paid to acquire it; the value lies in how that data will be used to create monetization opportunities (“value in use”).

To exploit the economic value of data, organizations need to transition the conversation from an accounting perspective (of what has happened) to a data science perspective (on what is likely to happen) on their data assets. Once you reframe the conversation, the EvD calculation becomes more manageable, more understandable and ultimately more actionable.

[1] https://en.wikipedia.org/wiki/West_Texas_Intermediate

[2] Edited by Seth Miller User:arapacana, Original file designed and produced by: Kosi Gramatikoff User:Kosigrim, courtesy of Abgent, also available in print (commercial offset one-page: original version of the image) by Abgent – Original file: en:File:GeneticCode21.svg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=4574024

[3] “Wealth of Nations”, http://geolib.com/smith.adam/won1-04.html

[4] https://cornerthreehoops.wordpress.com/2014/04/17/explaining-espns-real-plus-minus/

The post Economic Value of Data (EvD) Challenges appeared first on InFocus Blog | Dell EMC Services.


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power an organization’s key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
