We have been talking about the Internet of Things (IoT) for many years now. But just how many “things” are we talking about? Last year, Gartner estimated that 20 billion things would be connected by 2018.
These “things” are connected devices that vary widely in form factor, location, and usage pattern. Every day, we create many more devices that connect to the Internet. Gartner also predicted that, by the end of this year, 5.5 million new devices would be connecting to the Internet every single day.
Recently, more and more companies and analysts have been talking about the “Internet of Everything” (IoE). When we move from “things” to “everything,” we include people, processes, and data. That’s a much bigger, all-encompassing ecosystem than the IoT alone. That’s great! We are all very excited to be part of this expanded, connected ecosystem. For consumers, the excitement is about being able to interact with smart cars, smart refrigerators, smart watches, smart beds, smart toothbrushes, and the list goes on. But for enterprises, the excitement is about more than just connectivity and being able to perform new tasks over the Internet. What is this excitement, and why does it matter?
What Lies Beneath
For every enterprise across the globe that embraces the IoE, it’s an opportunity to create economic value from this connected ecosystem. But how does that happen? The answer lies in the data that these connected devices generate. The IoE also includes data about people and processes, captured in applications such as CRM and ERP and stored in the cloud, in big data sources, and in many other enterprise systems that are, in a sense, nothing but data repositories. Enterprises need this data about things, people, and processes for valuable feedback: whether end devices are working well and functioning optimally, whether the workforce is being utilized effectively, and whether each process is laid out and functioning as intended. Leveraging this feedback, enterprises can make better business decisions that generate additional revenue and save cost and time.
With the advent of mobile, cloud, SaaS, social media, and various other web and media formats, the data economy is getting exponentially more complicated. Not all forms of data and data repositories communicate with each other in the way we would like. To make sense of any enterprise data trove, we need clearly defined ways to capture, process, access, and present heterogeneous sets of data which, when combined, can provide organizations with invaluable insights for innovation and progress. Some progress has been made through data integration technologies like ETL, which help build enterprise data warehouses (EDWs) as single sources of truth. Various BI tools use EDWs to try to understand customer behavior, the usefulness of a product, the right segmentation of a certain market, and so on. But ETL processes do not deliver data in real time, and they are quite resource intensive. In many instances, EDW solutions are simply too expensive to be viable. More recently, with the advent of big data analytics on inexpensive commodity servers or hyper-converged clusters, many organizations are gathering insights from their non-transactional or dark data. But big data analytics often lack context, and without the right context, they mean very little. This is where data virtualization becomes instrumental.
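To make the contrast concrete, here is a minimal sketch of the difference between the ETL path and the virtualized path. It uses Python with SQLite purely as a stand-in for real source systems and a federation engine, and every database, table, and view name is hypothetical: the ETL step physically copies rows into a warehouse table, which goes stale until the next batch run, while the virtual view is resolved against the live sources each time it is queried.

```python
import sqlite3

# Simulate two heterogeneous source systems as separate in-memory SQLite
# databases (stand-ins for, say, a CRM system and a billing application).
con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS crm")
con.execute("ATTACH ':memory:' AS billing")
con.execute("CREATE TABLE crm.customers (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE billing.invoices (customer_id INTEGER, amount REAL)")
con.execute("INSERT INTO crm.customers VALUES (1, 'Acme'), (2, 'Globex')")
con.execute("INSERT INTO billing.invoices VALUES (1, 120.0), (1, 80.0), (2, 200.0)")

# ETL approach: copy and transform the data into a warehouse table.
# The copy is frozen at load time and duplicates storage.
con.execute(
    "CREATE TABLE edw_customer_revenue AS "
    "SELECT c.id, c.name, SUM(i.amount) AS revenue "
    "FROM crm.customers c JOIN billing.invoices i ON i.customer_id = c.id "
    "GROUP BY c.id, c.name"
)

# Virtualization approach: define a view over the sources; nothing is copied,
# and each query is resolved against the live sources.
con.execute(
    "CREATE TEMP VIEW v_customer_revenue AS "
    "SELECT c.id, c.name, SUM(i.amount) AS revenue "
    "FROM crm.customers c JOIN billing.invoices i ON i.customer_id = c.id "
    "GROUP BY c.id, c.name"
)

# A new invoice arrives after the ETL batch has already run.
con.execute("INSERT INTO billing.invoices VALUES (2, 50.0)")

print(con.execute("SELECT * FROM edw_customer_revenue WHERE id = 2").fetchall())
# [(2, 'Globex', 200.0)]  <- stale copy
print(con.execute("SELECT * FROM v_customer_revenue WHERE id = 2").fetchall())
# [(2, 'Globex', 250.0)]  <- real-time answer from the sources
```

The same idea carries over to a real virtualization platform: views are published once and consumers query them through standard interfaces, without needing to know where the data physically lives.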
To illustrate, let’s say a telecommunications company carries out big data analytics and learns that each of its Wi-Fi routers functions well for two years but then degrades. To ensure that a new router of the right brand, with the right configuration, reaches the right end customer by the end of that two-year period, the company needs, at a minimum, master data and inventory data, among other information. If the company adopts data virtualization as one of its central strategies, or as part of its enterprise data architecture, it has the right solution; other technologies consume too many resources, and too much time and cost, to be viable. The Denodo Platform establishes a data virtualization layer between data consumers (people or applications) and multiple heterogeneous data sources. Because it integrates data without replicating it, it always provides useful context, in real time.
With the Denodo Platform in its architecture, this telecommunications company can create a logical data warehouse far more quickly than it could implement an EDW. Establishing this logical data warehouse would also require far fewer resources, and it would tell the right person in the company exactly when to ship which router to which customer. At the end of the day, the company would avoid downtime for its customers, reduce the hours its customer service reps spend talking to customers, prevent the wrong router from being shipped to a given customer, and create economic value for both itself and its customers.
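As an illustration only, and not Denodo’s actual API or data model, the logical data warehouse in this scenario boils down to a single virtual view joining three sources: the analytics output about router degradation, customer master data, and inventory. The sketch below again uses Python with SQLite as a stand-in for those sources, with every name hypothetical; the point is that one query against the view answers the business question of which replacement router to ship to which customer, without copying any of the underlying data.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Three stand-in sources behind the virtualization layer:
con.execute("ATTACH ':memory:' AS analytics")  # big data output: router health predictions
con.execute("ATTACH ':memory:' AS crm")        # customer master data
con.execute("ATTACH ':memory:' AS scm")        # inventory system

con.execute("CREATE TABLE analytics.router_health "
            "(router_id TEXT, model TEXT, predicted_eol TEXT)")
con.execute("CREATE TABLE crm.customers "
            "(customer_id INTEGER, name TEXT, router_id TEXT, address TEXT)")
con.execute("CREATE TABLE scm.inventory "
            "(model TEXT, replacement_model TEXT, units_in_stock INTEGER)")

con.execute("INSERT INTO analytics.router_health VALUES "
            "('R-1001', 'AC-1000', '2017-07-01'), ('R-1002', 'AC-1000', '2018-02-15')")
con.execute("INSERT INTO crm.customers VALUES "
            "(1, 'Alice', 'R-1001', '12 Main St'), (2, 'Bob', 'R-1002', '9 Elm Ave')")
con.execute("INSERT INTO scm.inventory VALUES ('AC-1000', 'AC-2000', 340)")

# The "logical data warehouse": one virtual view across all three sources,
# resolved at query time without replicating any data.
con.execute("""
    CREATE TEMP VIEW v_router_replacements AS
    SELECT c.customer_id, c.name, c.address, h.router_id, h.predicted_eol,
           i.replacement_model, i.units_in_stock
    FROM analytics.router_health h
    JOIN crm.customers c ON c.router_id = h.router_id
    JOIN scm.inventory i ON i.model = h.model
""")

# Business question: which customers need a replacement shipped before this cutoff?
rows = con.execute(
    "SELECT name, address, router_id, replacement_model "
    "FROM v_router_replacements WHERE predicted_eol <= ?",
    ("2017-09-30",),
).fetchall()
print(rows)  # [('Alice', '12 Main St', 'R-1001', 'AC-2000')]
```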
The Current State of the IoE
Many companies have jumped onto the IoE bandwagon. But only a few have been even partially successful in realizing or generating economic value from installing sensor-based devices, putting the right networking infrastructure in place, and assembling the right set of people and processes. What’s missing?
The answer lies in the complexity of any enterprise infrastructure, in the data that it generates or transmits, and in the many ways to turn that data medley into meaningful insights that help business teams make decisions. Most enterprises, small to large, have legacy infrastructure and systems in place alongside modern ones. They often also have a variety of old and new software dealing with people, process, and machine information, scattered across the organization. Even though some small IoE projects can be brought to life comparatively easily, an enterprise-wide IoE project quickly becomes daunting; the problem is scalability.
On top of that, enterprises have concerns around privacy and security, which is natural whenever consumer- or company-related sensitive data moves from the sensor, to the data repository, to an analytics or reporting platform. Because of the inherent risk of moving data, every enterprise is coming under greater scrutiny regarding its data governance and privacy measures.
The bottom line is that enterprises are taking slow, measured steps in their journey from the IoT to the IoE, to ensure their business plans are bulletproof. Unfortunately, that caution can stop the journey to the IoE in its tracks.
How Do You Realize Your ROI?
Data virtualization is the right technology to give big data analytics the right context. It can also provide the right context for cloud-based IT modernization and for any project that involves a hybrid data architecture spanning big data, the cloud, and on-premises enterprise systems. Data virtualization significantly reduces the time required to deploy an enterprise data architecture, and it reduces both the turnaround time and the resources needed for any big data or cloud-based project. Any meaningful IoE project involves all three: a big data project, a cloud project, and a hybrid data architecture.
While you are putting together a plan for your enterprise-wide IoE project, make sure you have data virtualization in place when you connect all of your infrastructure, people, processes, and things. The name might make you think of the latest virtual reality trend, but when it comes to data integration, virtualization is reality!