It is an understatement to say that over the last decade, cloud technology has changed the business landscape forever. Of all the by-products of this cloud and tech explosion, one of the most critical yet seemingly unheralded trends is decentralization, a phenomenon that has made the classic notion of centralized, four-walled computing a fading memory.
The dilemma of whether IT should centralize or decentralize is becoming more pronounced and is regularly the subject of debates and internal corporate power struggles. The reality is that emerging technology needs within varied workforces have forced businesses to become more decentralized. Recent research by the Everest Group found that upwards of 50% of technology spend in organizations occurs outside the IT department. As an example, the needs of a mobile field representative don’t match the technological needs of a retail employee or a factory employee. Similarly, the data needs of someone in logistics may not match the needs of someone in service or sales.
Technology requirements across business units carry many variables and corresponding data needs. Even if technology integration were to stop cold, the evolution toward decentralization would not. But this movement raises the question of whether a fully decentralized approach makes sense for every business, especially when cyber threats loom so large.
At first glance, major enterprise companies in highly regulated industries such as healthcare, finance, banking, education, and government may not be a natural fit for decentralized IT operations. The ideals of independence that come with decentralized operations and technologies collide head-on with the demands of compliance, scale, and singular leadership. Conversely, small companies, such as startups and growing mid-market organizations, appear on the surface to be naturally suited to certain decentralization principles.
This decentralization trend may give chief information officers and compliance officers pause due to security and compliance risks. CIOs should aim to retain control of overall IT strategy, security, and compliance frameworks while enabling their business units to make independent decisions. As long as checks and balances are in place, fighting the independence trend is futile, as the much-maligned "Shadow IT" movement has proven.
There are numerous potential benefits of embracing this “controlled” decentralized IT strategy, including:
- Rapid Adoption: In a decentralized cloud environment, an operational business unit has the agility to deploy systems and applications quickly, rapidly adapting to emerging IT and application trends. In some instances, the ability to maintain verticalized industry compliance is also greatly improved.
- À La Carte IT: Properly empowered, departments can make decisions about their own IT resources, costs, and specific needs. Operationally decentralized environments allow for rapid access, streamlined approvals, and right-sized cloud governance. If it's a test or trial system, the resources that power it should match that scope.
- Fail-Safe Mania: The business benefits from decentralization as well, through increased resiliency. For example, using the right strategy, the resources of one department can provide a failsafe or backup scenario in the case of a significant event or failure. Many organizations in the early part of their cloud journeys start by using the cloud as a secondary or failover set of resources.
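The failover pattern described in the last bullet can be sketched in a few lines of Python. This is a minimal illustration, not any specific product's API; the names (`Site`, `select_active`, `"on-prem-dc"`, `"cloud-region"`) are hypothetical and stand in for whatever health-checking and routing machinery an organization actually uses.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    healthy: bool
    priority: int  # lower number = preferred

def select_active(sites):
    """Return the highest-priority healthy site, or None if all are down."""
    candidates = [s for s in sites if s.healthy]
    if not candidates:
        return None
    return min(candidates, key=lambda s: s.priority)

# Normal operation: the primary (e.g. on-premises) site serves traffic.
sites = [Site("on-prem-dc", healthy=True, priority=1),
         Site("cloud-region", healthy=True, priority=2)]
print(select_active(sites).name)  # on-prem-dc

# After a significant event takes the primary down, traffic fails over
# to the secondary cloud resources automatically.
sites[0].healthy = False
print(select_active(sites).name)  # cloud-region
```

The design choice worth noting is that the cloud site sits idle (or lightly used) at a lower priority until a health check fails, which matches the "cloud as a secondary or failover set of resources" stage many organizations start from.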
Decentralization is not a destination; it’s a journey and a spectrum for modern business leaders. If embraced, this change can put the CIO in a much more powerful position than a push for absolute control. Through “controlled” decentralization, an organization achieves not only high levels of security and the best that the cloud has to offer, but also technical agility, and the ability to respond to emerging technology needs and developments quickly.