Integrating and actualizing data can save businesses significant time and money, but only if the effort speeds results. The work, however, can take time.
More than half of U.S. companies use data analytics, yet over half of corporate marketing departments say that analytics are disappointing. The disappointment extends to manufacturing, engineering, research and development, finance and customer service: users in my consulting work have told me it simply takes too long to integrate and analyze data for actionable insights.
SEE: Electronic Data Disposal Policy (TechRepublic Premium)
“A lack of data integration and data preparation make it difficult for organizations to parse through enormous amounts of data and pull valuable insights,” said Hitachi Vantara’s Bjorn Andersson, senior director, Global Internet of Things Marketing. “Data quality and integrity is a major challenge no matter what industry you are in. In most enterprises data is spread across multiple silos.” Add to this the need for companies to manage more data at the edge of the enterprise. This data must ultimately be collected, cleaned, prepped and forwarded to central data repositories for comprehensive analytics to be performed.
A prime example of the need for data integration is edge data management. In a factory, edge data might arrive from vibration sensors, video, lidar and audio. All of this data is unstructured. It must be preprocessed so that “noise” is eliminated, then further transformed and integrated with central systems. Integrating all of these data sources is a formidable task for IT, so the usual goal is to find a software integration tool that can do the job.
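The preprocess-then-forward flow described above can be sketched in a few lines. This is a minimal illustration, not a production pipeline: the moving-average smoother stands in for real signal filtering, and the record schema (`sensor_id`, `max`, `mean`, `count`) is invented for the example.

```python
from statistics import mean

def denoise(samples, window=5):
    """Smooth raw vibration samples with a simple moving average.

    A stand-in for the "noise elimination" step; real edge pipelines
    would apply proper signal filters tuned to the sensor.
    """
    smoothed = []
    for i in range(len(samples)):
        start = max(0, i - window + 1)
        smoothed.append(mean(samples[start:i + 1]))
    return smoothed

def to_central_record(sensor_id, samples):
    """Transform smoothed readings into a structured record ready to be
    forwarded to a central repository (hypothetical schema)."""
    cleaned = denoise(samples)
    return {
        "sensor_id": sensor_id,
        "max": max(cleaned),
        "mean": sum(cleaned) / len(cleaned),
        "count": len(cleaned),
    }

record = to_central_record("vib-01", [0.2, 0.3, 5.0, 0.25, 0.31, 0.28])
print(record["sensor_id"])  # vib-01
```

The point of the structured record is that once every edge source emits the same shape of data, central analytics no longer cares whether the raw input was vibration, audio or video telemetry.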
“If an integration platform can tap into operational systems and integrate multiple data types, such as machine data, video, audio and vibration, it can forecast the potential failure of assets and the company can take direct and proactive action so a supply chain doesn’t fail,” Andersson said.
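In the simplest form, forecasting of the kind Andersson describes fuses several signal types into one risk indicator. The sketch below is purely illustrative; the thresholds and weights are made-up examples, not values from any real platform.

```python
def failure_risk(vibration_rms, temp_c, audio_db):
    """Combine normalized machine signals into a failure-risk score.

    All thresholds and weights below are invented for illustration.
    """
    score = 0.0
    if vibration_rms > 1.0:   # assumed vibration baseline
        score += 0.5
    if temp_c > 80:           # assumed safe operating temperature
        score += 0.3
    if audio_db > 90:         # assumed acoustic anomaly threshold
        score += 0.2
    return score

# An asset exceeding all three thresholds scores 1.0 (highest risk),
# flagging it for proactive maintenance before it fails.
print(failure_risk(1.4, 85, 95))  # 1.0
```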
SEE: 3 steps to build a data fabric to integrate all your data tools (TechRepublic)
Andersson said he has seen cases where companies had 60-day lead times to cure potential failure points before a failure ever occurred. That lead time represents substantial savings, considering that 40% of the 1,000 businesses interviewed by research and consulting firm ITIC between March and June 2020 said a single hour of downtime cost their organizations between $1 million and $5 million. It also illustrates the potential for data and analytics to enable preemptive maintenance that avoids downtime altogether.
Across the board, companies need more of these actualized use cases, which deliver tangible business results from data and analytics. Examples of use cases that have paid off include: logistics route optimization, which analyzes shortest routes, road construction areas, weather patterns and more; smart city planning, which analyzes areas of greatest population concentration and places 911 response units closest to them; and consumer purchase anticipation, which analyzes individual buying preferences and patterns.
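The route-optimization use case above boils down to a shortest-path computation once factors like construction and weather are folded into the travel cost of each road segment. A minimal sketch using Dijkstra's algorithm, with an invented toy road network:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a dict-of-dicts road graph whose edge
    weights already include penalties for construction and weather."""
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, weight in graph[node].items():
            nd = d + weight
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Reconstruct the path by walking predecessors back to the start.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

# Toy network: weights are minutes; the A->C leg carries a +15 minute
# construction penalty baked into its weight.
roads = {
    "depot": {"A": 10, "B": 12},
    "A": {"C": 20 + 15},
    "B": {"C": 25},
    "C": {},
}
path, minutes = shortest_route(roads, "depot", "C")
print(path, minutes)  # ['depot', 'B', 'C'] 37
```

Because the penalties live in the edge weights, the same algorithm reroutes automatically as fresh weather or construction data updates the graph.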
SEE: Snowflake data warehouse platform: A cheat sheet (free PDF) (TechRepublic)
“In today’s world, complex environments include multicloud, hybrid cloud and edge environments. All increase the need for data access at the right place with the right performance,” Andersson said. “Data mobility becomes key—that includes intelligent data placement, intelligent caching, automated data mobility and collaboration across clouds.”
Automated data integration platforms and industry-standard APIs can help IT address integration in this diverse data environment. Equally important is an end-to-end IT data architecture that spells out which data transformations and integrations need to happen, where they happen, and how integration tools should be applied in each case.
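At its core, that architecture is a set of per-source adapters mapping each silo's records onto one common schema. The sketch below is hypothetical: the source names (`crm`, `billing`) and their field layouts are invented to show the pattern, not drawn from any real platform.

```python
# Per-source adapters: each maps a silo's native record layout onto
# the unified schema the central analytics store expects.
def from_crm(rec):
    return {"customer_id": rec["id"], "email": rec["contact_email"]}

def from_billing(rec):
    return {"customer_id": rec["cust_no"], "email": rec["bill_to"]}

ADAPTERS = {"crm": from_crm, "billing": from_billing}

def integrate(source, records):
    """Apply the adapter registered for this source so every record
    lands in the common schema, regardless of where it came from."""
    return [ADAPTERS[source](r) for r in records]

unified = integrate("crm", [{"id": 7, "contact_email": "a@x.com"}])
print(unified[0]["customer_id"])  # 7
```

Adding a new silo then means writing one adapter and registering it, rather than reworking every downstream analytics job.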