
The advance of digitalization, growing competitive pressure, and disruptions of various kinds act as catalysts for digital transformation. Companies increasingly face pressure to implement changes ever more quickly. In this environment, centralized data architectures become a limiting factor, even though they were a logical choice for companies until a few years ago (and still are, especially for more traditional business models). These centralized data architectures can no longer keep pace with the dynamic and technical requirements of decentralized business domains.
What is the source of this development? A comparison of traditional and digital business models will shed some light on this.
Digital business models are based on the capacity to generate valuable data products from raw data.
Here, data and analytical capabilities are the primary component of the value chain.
Traditional business models, for example those of manufacturing companies, are geared towards physical products or services.
Data and IT are a means to an end here and thus a secondary component of the value chain. They are centrally organized within IT as an ancillary process and viewed as a cost factor.
On the one hand, the increasing need for the digital transformation of processes and the resulting data volumes present businesses with an opportunity to transform their business models and utilize data as a primary component of their value chain. In this way, data can be processed into data products, which not only create value within the business, but also in the ecosystem when shared. On the other hand, this transformation presents companies based on a traditional business model with challenges that must be overcome. Let’s address some of these challenges.
A lack of data governance and the associated standards means that changes are implemented across functional areas without any overarching standardization or synchronization.
Causes: Structures that have developed over time can be attributed to functional specialization; structuring businesses by department is a good example. Each department uses the tools and standards that apply to its own area. This usually leads to functional silos: collaboration and data exchange with other areas become harder to implement, and there is no overarching body to unify the way data is handled. All of this results in a lack of transparency.
Solution: Centrally defined data architecture governance with metadata management supports and facilitates cross-functional exchange and a shared understanding of data. This in turn increases interoperability across domains.
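To make this more tangible, the sketch below shows what a simplified metadata record for a data product could look like. The Python representation and the field names are illustrative assumptions, not a fixed standard; in practice, such information would be maintained in a central data catalog.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified metadata record for a data product.
# Field names and values are illustrative assumptions, not a standard.
@dataclass
class DataProductMetadata:
    name: str                    # e.g. "customer-orders"
    owning_domain: str           # domain team accountable for the product
    description: str             # business meaning of the data
    schema: dict[str, str]       # column name -> data type
    update_frequency: str        # e.g. "daily" or "streaming"
    classification: str = "internal"   # e.g. "public", "internal", "confidential"
    tags: list[str] = field(default_factory=list)

# Example entry a sales domain might register in the central catalog:
orders_metadata = DataProductMetadata(
    name="customer-orders",
    owning_domain="sales",
    description="Confirmed customer orders, one row per order line",
    schema={"order_id": "string", "customer_id": "string",
            "order_date": "date", "net_amount": "decimal"},
    update_frequency="daily",
    tags=["orders", "revenue"],
)
```

Whichever catalog tool is used, the point is that every domain describes its data products in the same, centrally governed vocabulary, so other domains can find and interpret them without asking around.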
Centralized data management on the basis of data warehouses (and data lakes) creates a bottleneck when handling business use cases.
Causes: The original demand for company-wide data analysis necessitated the consolidation of data from the operational systems of the business domains in an analysis layer (OLAP). The OLAP layer is centrally managed by data specialists, which means that business users lacking the necessary skills cannot access data or create their own specific data-analysis models. They depend on a ticketing system in which the data experts work through issues and answer the requests of business users. Since only a limited number of experts are available to manage the data in the data warehouse, this ultimately creates a bottleneck when serving the needs of the business. The strict separation between data supply and data usage also means that the data experts have no understanding of how the data are used in the business. Consequently, the supplied data often do not fulfill the purpose for which they were intended.
Solution: Provided that central requirements are acknowledged, decentralized management of data products from within the business eliminates the central bottleneck and brings the data experts into closer proximity with the business. In this way, they gain the necessary understanding of the purpose of the data products that they supply.
The separation of the operational and analytical layers, both on a technological and an organizational level, means that operational business experts cannot effectively apply their know-how when creating analytics applications and data products. This results in a great deal of wasted potential.
Causes: Transactional systems (OLTP) display limited performance and are not sufficiently flexible to carry out extensive and complex data analysis and preparatory work.
These functions are outsourced to the data analysis layer (OLAP) as central data warehouses and/or data lakes, and managed centrally. As such, these functions are separate from the direct influence of the business domains.
Solution: In order to unite the operational and analytical layers, the domains should have data management capabilities backed by a cross-functional personnel structure. Domains take on the task of creating data products and configuring them in an interoperable way. This forms the basis of a data architecture that is supplied with operational/transactional data as well as analytical data products. The objective is a bidirectional flow of data products between both layers. This data architecture manifests itself, for example, in the form of data fabric or data mesh concepts.
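As a rough illustration, the following sketch outlines what a domain-owned data product serving both layers might look like. The interface and the method names are purely illustrative assumptions; real data mesh or data fabric platforms define their own contracts.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Iterable, Protocol

# Hypothetical "output port" contract for a domain data product.
class DataProductPort(Protocol):
    def analytical_snapshot(self, as_of: date) -> Iterable[dict]:
        """Batch view for the analytical layer, e.g. a daily extract."""
        ...

    def operational_events(self) -> Iterable[dict]:
        """Record-by-record view for operational consumers."""
        ...

# One owner, two output ports: the same domain team serves both layers.
@dataclass
class OrdersDataProduct:
    orders: list[dict] = field(default_factory=list)

    def analytical_snapshot(self, as_of: date) -> Iterable[dict]:
        return [o for o in self.orders if o["order_date"] <= as_of]

    def operational_events(self) -> Iterable[dict]:
        return iter(self.orders)
```

The design choice illustrated here is that the operational and the analytical view of the data are owned and published by the same domain, instead of being split between the business and a central data warehouse team.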
Centralized, monolithic data architectures impede the democratization of data, a prerequisite of which is that business users are able to work with data.
Causes: Centralized, monolithic data architectures generate bottlenecks, since data can only flow through predefined points that are limited in number. This hinders the free flow of data to scenarios where it could be used to create value. In addition, data communication is also limited between various hubs within the data architecture.
Solution: Federated data architectures enable the decentralized flow and democratization of data. Access to data is possible not only through a scalable technology, but also via the domains, which offer the connection to the data via an API, for example.
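As a minimal sketch, a domain could expose one of its data products over HTTP as shown below. The endpoint path, the sample payload, and the choice of FastAPI are illustrative assumptions, not a prescribed implementation.

```python
# Sketch of a sales domain exposing a data product over an HTTP API.
from fastapi import FastAPI

app = FastAPI(title="Sales domain data products")

# In a real setup this would read from the domain's own storage;
# a small in-memory sample stands in for it here.
CUSTOMER_ORDERS = [
    {"order_id": "A-1001", "customer_id": "C-42", "net_amount": 199.90},
    {"order_id": "A-1002", "customer_id": "C-17", "net_amount": 89.00},
]

@app.get("/data-products/customer-orders")
def get_customer_orders():
    """Return the customer-orders data product to authorized consumers."""
    return CUSTOMER_ORDERS
```

Served with any ASGI server (for example `uvicorn module:app`), such an endpoint lets other domains pull the data product directly instead of waiting for a central team to move the data for them.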
Data warehouses only permit predefined data transformations, ETL in particular.
Causes: For scaled, monolithic data warehouses, it made sense that data transformations also took place within this solution and that data only needed to be ingested for more advanced purposes. The data would subsequently be accessed via business software, for example to visualize it.
Solution: Business users know the use case for the data best and should be given the option of deciding themselves whether, and which, transformations should be carried out on the data. In addition, new low-code and no-code software solutions enable business users to carry out even demanding data transformations through accessible user interfaces.
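For illustration only: the kind of transformation a business user might click together in a low-code tool corresponds roughly to the pandas snippet below; the column names and figures are made up.

```python
import pandas as pd

# Sample order data as it might arrive from a data product.
orders = pd.DataFrame({
    "customer_id": ["C-42", "C-17", "C-42"],
    "order_date": pd.to_datetime(["2024-03-01", "2024-03-02", "2024-03-05"]),
    "net_amount": [199.90, 89.00, 45.50],
})

# Aggregate order value per customer and month, highest spenders first;
# a typical transformation a business user would define for their own use case.
monthly_revenue = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M"))
    .groupby(["customer_id", "month"], as_index=False)["net_amount"]
    .sum()
    .sort_values("net_amount", ascending=False)
)
print(monthly_revenue)
```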
Long processing times for new business use cases make it cumbersome to promptly generate insights from the data.
Causes: Owing to existing monolithic structures and a lack of data democratization, data transformations can only be carried out by data experts. In the event of high demand for data for business use cases, requests can pile up since data experts are in short supply. Most of the time spent on data analysis is used on finding and cleansing data.
Solution: Self-service data solutions enable direct access to data for analytical purposes. Business users no longer need to rely on data experts for the necessary data. This enables quick value creation with the analytical data.
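As a simple sketch of self-service access, a business user could query a published data product directly from a notebook, for example with DuckDB. The file name and the query below are illustrative assumptions.

```python
import duckdb

# Hypothetical self-service query: read a published data product
# (assumed to be a Parquet file) directly, without filing a ticket.
top_customers = duckdb.sql(
    """
    SELECT customer_id, SUM(net_amount) AS total_spend
    FROM read_parquet('customer_orders.parquet')
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
    """
).df()

print(top_customers)
```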
These selected challenges show that effective data architecture management does not come about on its own in the course of digitalizing business processes. It requires a proactive and holistic approach. Businesses should select reliable partners for their transformations.
Businesses profit from creating a uniform data governance structure, decentralizing data management on the basis of business domains, interlinking the operative and analytical layers, democratizing data, equipping business users with suitable skills and tools for handling data, and implementing self-service solutions. To gain insights from information, data must not only be seen as an ancillary product of processes and activities, but as a raw material for the creation of data products.
Modern data architectures bring structure into the digital transformation of businesses. We will look at how exactly data architectures are characterized in a following blog post.
We would like to thank Stefan Morgenweck for his valuable contribution to this article.