Inside Broadcom's data simplification strategy, which lets 26 business units use the same data analytics platform

Managing data across many different business units, each with its own applications, is rarely an easy task. Making sense of all that information through data analysis can be even harder.

As Broadcom grew through the acquisition of numerous software and hardware companies, it repeatedly faced the challenge of integrating disparate data sources and systems.


In particular, when the technology giant closed its $61 billion acquisition of VMware at the end of 2023, it faced a monumental data integration challenge. VMware ran some 1,800 different applications and a sprawling product portfolio. Most companies making such a purchase would follow the textbook playbook: implement master data management systems, build complex data lakes and create elaborate integrations to bridge the systems, while migrating them step by step over several years.

Broadcom adopted a completely different approach. Instead of adding a layer of complexity to manage the existing systems, it wiped the slate clean. The company also consolidated its many data analysis tools, standardizing on a single platform from Incorta, which now operates across all Broadcom business units, including VMware.

“VMware had significant application variance,” said Alan Davidson, Broadcom CIO, in an exclusive interview. “We actually worked through each of them to consolidate into our single management plane for the data model.”

Davidson explained that this data model is Broadcom’s source of truth for data. The goal is to consolidate and simplify the data landscape to enable more effective reporting, analysis and decision-making.

Broadcom’s consolidation approach focused on cleansing data at the source rather than transforming it later in the process. This philosophy guided the integration strategy.

The technical process included several key elements:

  • ERP consolidation: VMware had seven enterprise resource planning (ERP) platforms. Davidson joked that they had a different one for each day of the week. Broadcom consolidated them into one.
  • SKU rationalization: VMware had 187,000 SKUs; that number has now fallen to just 500. This radical simplification eliminated countless compatibility problems, pricing discrepancies and support scenarios that would otherwise have required complex mapping.
  • Master data restructuring: Instead of building complex translation layers between different systems, Broadcom enforced a single master data structure. This created one authoritative source of customer records, product information, order data and supplier data (see the sketch after this list).
  • Contract and booking integration: A critical element was the tight integration of contracts and orders with serial numbers and bookings. This ensures that when customers buy software, there is a clear line of sight from contract terms to actual usage and entitlements.
The result was a dramatically simplified architecture, which eliminated the need for complex data transformation.

Standardizing on data analytics to enable enterprise insights at scale

This unified, clean data source is the foundation of Broadcom’s data operations.

Building on this, Davidson said his company also decided to unify and standardize its data analytics on a single platform. To power analytics over a massive volume of data, Broadcom selected Incorta after evaluating many alternatives. Today, Broadcom has more than 17,000 internal users relying on Incorta for analytics, on top of more than 200 terabytes of operational data.

“Look at every competitor in the landscape – Microsoft Power BI, Snowflake, Tableau – there are plenty of user interfaces (UIs), but they all come with baggage,” said Davidson. “There is never really a complete end-to-end where I get a flexible user interface, self-service capability for thousands of end users, and scale in terms of usage and multiple data maps.”

Broadcom’s different business units, each with its own models and data requirements, posed a significant challenge. Davidson emphasized Incorta’s ability to handle the scale and to combine diverse data sources. Simply put, Incorta’s flexibility in segmenting data access and enabling self-service analytics was a game changer.

“I have to enable the segmentation, as well as connect the dots and manage the user scale,” explained Davidson.
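
As an illustration of what segmented, self-service access can look like in practice, the following is a generic Python sketch that filters query results by the requesting user's business unit. The mapping and function names are hypothetical and do not reflect Incorta's actual API or Broadcom's implementation.

```python
from typing import Iterable

# Illustrative user-to-segment mapping; in practice this would come from an
# identity provider or governance layer, not a hard-coded dictionary.
USER_BUSINESS_UNIT = {
    "alice": "vmware",
    "bob": "semiconductors",
}

def segmented_query(user: str, rows: Iterable[dict]) -> list[dict]:
    """Return only the rows belonging to the requesting user's business unit."""
    unit = USER_BUSINESS_UNIT.get(user)
    return [row for row in rows if row.get("business_unit") == unit]

orders = [
    {"order_id": "1", "business_unit": "vmware", "amount": 120_000},
    {"order_id": "2", "business_unit": "semiconductors", "amount": 80_000},
]
print(segmented_query("alice", orders))   # only the VMware row is returned
```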

Incorta’s AI and automation features extend data analytics

Broadcom is now poised to take advantage of a series of AI and automation enhancements included in the latest Incorta product updates.

Incorta CEO and co-founder Osama Elkady explained to VentureBeat that organizations always want to improve access to data. To meet these needs, Incorta introduced Nexus, which provides Broadcom and other customers with generative AI capabilities.

With Nexus, users can apply advanced natural language processing (NLP) to tasks such as data cleansing, model building and combining machine learning (ML) with gen AI. The goal is to empower Broadcom users with fast access to critical data, without the burden of manual data preparation or complex dashboard creation.
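
To illustrate the general idea of natural-language access to clean data, here is a small, hypothetical Python sketch that maps a question to a structured aggregation over order records. It is a toy example under assumed field names and does not represent how Incorta Nexus actually works.

```python
import re

def answer(question: str, orders: list[dict]) -> str:
    """Map a natural-language question to a simple query over master data."""
    q = question.lower()
    match = re.search(r"revenue for (\w+)", q)
    if match:
        unit = match.group(1)
        total = sum(o["amount"] for o in orders if o["business_unit"] == unit)
        return f"Total revenue for {unit}: ${total:,}"
    if "how many orders" in q:
        return f"Order count: {len(orders)}"
    return "Sorry, I can't answer that yet."

orders = [
    {"order_id": "1", "business_unit": "vmware", "amount": 120_000},
    {"order_id": "2", "business_unit": "vmware", "amount": 80_000},
]
print(answer("What is the revenue for vmware?", orders))
```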

Using artificial intelligence with data discipline

While many enterprises rush to deploy gen AI without solving underlying data quality problems, Broadcom is taking a more measured approach, focusing first on data integrity.

“The worst thing we can do is apply AI and get inaccurate data; especially in the case of finance and telemetry, accuracy is very important,” emphasized Davidson. “If everything you put in is garbage, all you are going to do is get the wrong answer quickly.”

With Incorta’s Nexus, Broadcom applies AI where it adds clear value – primarily in democratizing access to data.

“The whole point of AI with Incorta is data transparency and easy access to data, without having to build your own dashboard or work hard to generate the data,” explained Davidson.

What this means for enterprise data leaders

For enterprises facing data complexity challenges – whether from acquisitions, organic growth or digital transformation initiatives – Broadcom’s approach offers several lessons:

  1. Question the need for complex data lakes: Instead of accepting data complexity as inevitable, consider whether radical simplification at the source could be more effective.
  2. Standardization beats flexibility in core systems: Although it requires difficult decisions, having a single system of record creates fewer integration headaches than supporting many platforms.
  3. Self-service analytics requires guardrails: Broadcom’s success comes from giving users freedom within a rigorously managed framework, not unlimited freedom that creates maintenance burdens.
  4. AI implementation requires data discipline first: Before deploying advanced AI, make sure the underlying data is accurate, accessible and well structured.

As Davidson put it succinctly: “It is difficult to simplify everything, and it is easy to make everything really complicated. That is why we try to do the difficult things and make them simple.”

In a technology landscape often cluttered with new tools aimed at solving integration problems, Broadcom’s radical-simplification approach offers a compelling alternative for enterprises undergoing significant mergers and acquisitions. By doing the hard work of data cleansing first, the company created a foundation that enables, rather than inhibits, future growth.
