DBTA Data Summit: The Future of Big Data Calls for Compromise Between Old and New Approaches
NEW YORK — At the Database Trends and Applications (DBTA) Data Summit today, Edd Dumbill, vice president of strategy at Silicon Valley Data Science, kicked off the day's presentations with a keynote address introducing a new data value chain. With Big Data still dominating the discussion among IT professionals, that value chain is evolving, changing the dynamics not only between data scientists and the technology they rely on but also between technology and business leaders.
Big Data and the databases that support it have changed the way data scientists approach business processes. "You have different aims and different capabilities from new technologies," Dumbill said. "Tools we have are different and they work differently now. Now there's a much bigger potential for growth." Big Data has had a particularly significant impact on decision-making. Because businesses no longer have to use data immediately and then discard it to free up space, they can afford to retain data for as long as they need and defer decisions until later. "We don't need to throw things away anymore, we can use the same data in many different ways," he said.
While technology still plays a central role in data processes, it no longer has to be the first thing data scientists think about, since it's not as limited as it once was. "If you think tools first, you're doomed. You have to look at the data first," Dumbill said. Furthermore, because frameworks such as Hadoop have made extracting value from unstructured data more attainable and realistic, companies can now focus on the "most important data 'V' of all": variety, according to Dumbill.
To keep up with new capabilities and offerings, companies should evaluate where they fall on the analytics maturity model, urged Brian Squibb, cloud platform sales engineering team lead at Google. For now, there are four primary phases of analytics maturity: descriptive, exploratory, predictive, and prescriptive analytics. Each encompasses its precursors, and each level offers a more mature, far-reaching, and holistic environment. But even as solutions continue to evolve, striking a balance between traditional ecosystems and new Big Data approaches is crucial, according to Squibb.
Traditional data processes rely primarily on organizational, historical data; Big Data calls for a proportionate use of external data sources as well. And, in a traditional data environment, IT or another central organization takes ownership of deliverables; in a Big Data setting, self-service is enabled. The schools of thought couldn't be more different, Squibb said, yet the two need to come together to form a data structure that leverages both historical and external data and "loosens the high conformance to data architecture. Elasticity should be built into the systems."
Elasticity and flexibility will become particularly important as companies prepare to face the next business intelligence challenge: the Internet of Things. A modern smartphone has six to eight sensors, and a car has about 100, but by 2020 those numbers will double, and there will be billions of sensors monitoring a vast web of "connected things," said Gokula Mishra, vice president of big data and advanced analytics at Oracle.
Businesses that aren't typically thought of as business intelligence vendors, including GE and Samsung, are entering the IoT space, building their own solutions for collecting and analyzing the rapid influx of data. Even butter brand Land O'Lakes is "getting in on" the IoT, according to Mishra. "Their product has nothing to do with machines, right?" Mishra said. "But did you know they're also a large supplier of fertilizer and seeds for farmers? They're using sensor data to help farmers optimize their fields for a better harvest next year." But when asked whether companies' ability to collect data is outpacing their ability to make sense of and act on it, Mishra said "absolutely." The challenge now isn't collecting data, but determining which data is valuable and how to operationalize it. Mishra said Oracle is currently working on a cloud-based IoT solution to deliver that capability. It's "coming soon," he said.
As new technologies and data sources emerge, there will be an increased need for agility. "Business changes, everything changes," Dumbill said. "Just look at Spark. Over the last year, Spark has changed so much. There are companies that we understood as being well established, and now they're very slow because they don't have Spark involved." Because so much is changing rapidly, companies have to build "experimental enterprises" that aren't constrained by old thinking and promote collaboration. To move forward, "technology leaders have to become coauthors with business leaders," according to Dumbill.