Op-ed: Finding the Goldilocks zone on data sovereignty

Authored by Adam Gale, Field CTO for AI & Cyber Security at NetApp
Data sovereignty has become a geopolitical fault line. Europe is warning businesses about the risks of over-dependence on foreign cloud providers, while Washington is suggesting that its allies keep sensitive data out of certain jurisdictions. On top of this, regulators are tightening rules on how information can be stored and transferred.
For business leaders, the debate is not so abstract. Every decision about where data sits, who manages it, and how it is protected now carries financial, legal, and reputational consequences.
Yet too many still see sovereignty as a binary choice. Either data sits in a fortress-like facility, sealed off from the world, or it flows freely through global public clouds. The reality is much more nuanced. Sovereignty is a spectrum, and organisations should take a leaf out of Goldilocks’ book to find a level that is “just right” for them: an arrangement that gives the right balance of security and control while granting them enough agility and flexibility to innovate. Striking that balance can save millions while protecting against regulatory risk and operational disruption.
The spectrum of sovereignty
At one end of the spectrum sit fully sovereign on-premises facilities, often described as “dark sites”. These are completely closed systems, walled off from the internet and the wider world. Such setups are useful, and at times non-negotiable, for certain parts of the military, intelligence agencies, and a handful of highly sensitive government functions. This level of sovereignty maximises both security and control, and may also include strict citizenship requirements for staff working on the premises. This raises the question: why wouldn’t every organisation want such a high level of control over its data and infrastructure?
Simply put, this is overkill for most businesses. This level of sovereignty makes systems rigid and expensive to maintain. It also reduces agility, and may lock organisations out of the full benefits of the cloud technology they need to scale.
For banks, insurers, healthcare providers and energy companies, the challenge looks different. These are industries where regulators demand strict control over certain types of information. Yet these same industries rely heavily on connectivity, collaboration, and speed. Security and control are still equally essential for these organisations, but they require more flexibility in their systems. As a result, many have adopted hybrid models, keeping their most sensitive data, such as patient records and financial history, local while using the scale and efficiency of the cloud for less critical workloads.
Consider a global bank with customers across Europe, North America, and Asia. Regulators in each region demand different rules for how personal data is stored and processed. Too much centralisation would breach compliance, yet too much localisation would inhibit efficiency. The “just right” level of sovereignty can be found in a hybrid model: sensitive data like customer account details stay within national borders, while less critical analytics workloads run in the cloud for speed and scale.
The key is the data infrastructure that orchestrates this balance. Visibility tools track where data lives and make it easy to implement and enforce policies that keep information from crossing the wrong borders, while lifecycle controls ensure records can be securely deleted everywhere they exist. This is the Goldilocks zone in action – neither overly restrictive nor too lax, but just right for both compliance and growth.
Defining how much is enough
The real challenge for business leaders is identifying the level of sovereignty that is “just right.” The right balance depends on three forces. Regulatory obligations dictate where data can reside and how it must be processed, often leaving little room for compromise. Performance requirements may demand proximity and low latency, especially in sectors like finance or industrial automation.
And then there is vendor dependency. Concentrating critical workloads with a single hyperscaler may simplify operations and be a convenient short-term option. But while these providers are valuable technology partners, they are also subject to the laws and pressures of the jurisdictions in which they are headquartered, so overreliance on them carries strategic risk if circumstances change. Geopolitical tensions could, for example, compel a hyperscaler to make customer data available to its home government on request. That makes data sovereignty a board-level issue, not just a matter for data governance and IT.
By weighing these factors together, organisations can find their “Goldilocks zone” of sovereignty, where they can operate with confidence without curbing innovation or creating unsustainable expenses. Governance and architecture are equally important in reaching this zone. The first priority is clarity on where data resides, how it moves, and who touches it along the way. This level of visibility requires more than a spreadsheet; it demands ongoing mapping of data flows and a readiness to act when something changes, whether that is new regulation or shifting business priorities.
Then there is compliance. Data protection, security, and residency must meet local as well as global requirements. International businesses in particular must prove they can meet the requirements of each jurisdiction they operate in, from the GDPR in Europe to sector-specific rules in the United States or Asia. Let’s also not forget that data residency regulations may well evolve, which could also shift what is considered “sovereign”. As a result, having data infrastructure that could be reconfigured at a later time helps future-proof an organisation’s operations.
Finally, sovereignty must be managed across the entire lifecycle of data. Creation, use, storage, and deletion are all part of the same chain of responsibility. After all, systems that cannot securely delete data when it is no longer needed are not truly sovereign. This makes sovereignty a boardroom issue rather than a narrow technical one.
Sovereignty decisions also affect competitiveness. The wrong position can leave an organisation either paralysed by caution or exposed to regulatory and security risks. The “just right” level of sovereignty, by contrast, enables businesses to adopt new technologies without compromising the trust of customers and partners.
The Goldilocks mindset
Sovereignty is not static or a switch that can be toggled. Regulations change, cyber threats evolve, and business priorities shift. Today’s threshold for sovereignty may fall short tomorrow. For this reason, sovereignty should be treated as an ongoing governance process. It needs regular review, and it must be adaptable.
Boards and executive teams cannot afford to see this as a purely technical concern. Decisions about sovereignty determine how resilient an organisation will be in the face of regulatory change and market disruption. More importantly, these decisions will shape an organisation’s ability to sustain competitiveness, trust, and long-term growth.
The binary view of sovereignty is outdated. The future lies in understanding the spectrum and finding the balance point that is just right.