A UK government Proof of Concept (PoC) led by Hitachi Solutions Europe has shown that Microsoft applications — including Power Platform, Dynamics 365 and Microsoft Copilot — can securely operate on live data that remains resident in Amazon Web Services (AWS) without copying or moving that sensitive information. The PoC used a private, Zero Trust‑aligned connector that links the clouds while preserving compliance and data sovereignty controls. (learn.microsoft.com)
Source: PublicTechnology Hitachi Solutions demonstrates secure use of Microsoft tools with AWS data in major win for government cloud interoperability
Background
The public sector has long struggled with the practical realities of multi‑cloud estates. Agencies often hold data and workloads across Microsoft Azure, AWS, Google Cloud Platform (GCP) and private or sovereign environments; until recently, connecting best‑of‑breed SaaS and platform tooling across those islands has meant either moving data, building brittle synchronization layers, or accepting reduced functionality. The demonstrated PoC from Hitachi Solutions aims to change that equation by exposing AWS‑resident data to Microsoft business applications in real time — without duplication and without traversing the public internet.

This is not a theoretical idea. Microsoft Dataverse supports “virtual tables” (virtual entities) which let Power Platform apps surface external data as if it were native to Dataverse, while leaving the source records in place. That capability explicitly avoids duplication and supports runtime access to external systems via secure data providers. This technical pattern is an established, supported route for integrating external sources into Power Apps and Dynamics‑style experiences. (learn.microsoft.com)
At the networking layer, cloud vendors and neutral interconnect providers have mature patterns for private, high‑performance cross‑cloud connectivity: combinations of AWS Direct Connect, Microsoft ExpressRoute, and carrier or colocation‑hosted fabrics (Equinix Fabric, Megaport and similar) can create private paths between cloud environments that avoid the public internet and reduce egress risk and latency. These architectures are widely recommended for multi‑cloud, low‑latency, and compliance‑sensitive scenarios. (aws.amazon.com, equinix.com)
What the PoC claims it delivered
The Proof of Concept — described in recent reporting by PublicTechnology and presented by Hitachi Solutions Europe — makes several operational and security assertions about the Secure Multi‑Cloud Connector (SMCC):
- Microsoft Power Platform and Copilot used live, production case data that remained stored in AWS; no data duplication was required.
- Connectivity was established over a private link (no public internet exposure), integrated with a Zero Trust‑based architecture and mapped against UK government security frameworks.
- The PoC was completed in eight weeks and enabled use cases such as live case‑load dashboards, workflow automation for caseworkers, virtual assistants and AI‑driven analysis to accelerate decision‑making.
- The solution is described as bi‑directional, scalable, and suitable for other cloud providers such as GCP.
How it works — the technical pattern
Virtualisation at the data layer (no duplication)
The core architectural idea used in the PoC aligns with virtualisation patterns already supported by Microsoft technologies. Dataverse’s virtual tables (virtual entities) allow Dataverse and Power Platform to present external data as if it were native, while the data itself remains in the external system and is retrieved on demand. This eliminates the need for nightly syncs, ETL pipelines or persistent data copies — a key benefit for sensitive case‑handling systems. (learn.microsoft.com)

Virtual tables are not a magical cure; they have feature and performance trade‑offs (for example, limitations around auditing, certain Dataverse features and some client‑side scenarios). However, for read‑heavy dashboards, on‑demand lookups, and workflows that can operate on live records, they provide a proven method to surface external records inside Power Apps and Dynamics‑driven processes. (learn.microsoft.com, microsoft.com)
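The on‑demand retrieval pattern can be illustrated in miniature. The sketch below is a conceptual Python analogue of a virtual table, not Microsoft's implementation: every read is resolved against the external store at query time, so no copy of the data ever persists on the application side. The store and record names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical stand-in for an AWS-resident case store; in the real
# pattern a secure data provider would call the source system's API.
EXTERNAL_STORE = {
    "case-001": {"status": "open", "owner": "team-a"},
    "case-002": {"status": "closed", "owner": "team-b"},
}

@dataclass
class VirtualTable:
    """Presents external records as if local, without duplicating them."""
    source: dict

    def retrieve(self, record_id: str) -> dict:
        # Resolved at query time against the source of truth -
        # nothing is cached or persisted on the application side.
        return self.source[record_id]

    def retrieve_multiple(self, **filters) -> list:
        # Filter evaluated against live records, not a synced copy.
        return [
            rec for rec in self.source.values()
            if all(rec.get(k) == v for k, v in filters.items())
        ]

cases = VirtualTable(EXTERNAL_STORE)
print(cases.retrieve("case-001")["status"])         # open
print(len(cases.retrieve_multiple(status="open")))  # 1
```

Because reads go straight to the source, a change made in the external system is visible on the very next query — the property that makes live dashboards possible without ETL.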
Private, provider‑neutral networking
To avoid public internet exposure, the PoC uses a private networking model between the Microsoft application environment and AWS where the data sits. This pattern mirrors the industry best practice of using Direct Connect and ExpressRoute (or via a neutral colocation provider’s fabric) to build private connectivity between cloud providers. Solutions from neutral interconnect vendors simplify cross‑cloud routing and reduce internet‑facing egress, which is important for both security posture and predictable latency. Published engineering guidance from AWS and Microsoft details the same connectivity patterns and their trade‑offs. (aws.amazon.com, learn.microsoft.com)

Zero Trust and governance controls
The PoC is described as implementing a “scalable Zero Trust architecture” and aligning with UK Government security frameworks. Practically, that should mean:
- Strong identity and conditional access (Entra ID / Azure AD or federated identity).
- Least‑privilege, role‑based access controls and fine‑grained authorization at the application layer.
- Encryption in transit and at rest, with private connectivity preventing exposure to the public internet.
- Comprehensive logging, monitoring and auditability to satisfy compliance and evidentiary needs.
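The controls above can be composed into a single per‑request decision. The following is a minimal, hypothetical Python sketch of that evaluation order — transport, identity context, then least‑privilege role check, with every decision audited. The role and action names are illustrative, not the PoC's actual policy model.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    roles: set
    action: str
    mfa_verified: bool        # conditional access / step-up auth satisfied
    via_private_link: bool    # request arrived over the private path

# Least-privilege mapping: each action is permitted to named roles only.
POLICY = {
    "read_case": {"caseworker", "auditor"},
    "update_case": {"caseworker"},
}

audit_log = []

def authorize(req: Request) -> bool:
    """Zero Trust style check: evaluate every request, trust nothing implicit."""
    decision = (
        req.via_private_link
        and req.mfa_verified
        and bool(POLICY.get(req.action, set()) & req.roles)
    )
    # Every decision, allow or deny, is recorded for auditability.
    audit_log.append((req.user, req.action, decision))
    return decision

ok = authorize(Request("alice", {"caseworker"}, "update_case", True, True))
denied = authorize(Request("bob", {"auditor"}, "update_case", True, True))
```

The key design choice is that no single factor is sufficient: a valid role over a public path, or a private path without step‑up authentication, is still denied.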
Why this matters for government IT
- Choice of best tool for the job: Departments no longer need to standardise all data on a single hyperscaler before they can use the preferred SaaS or platform capability. That reduces political and procurement friction and lets teams adopt specialist capabilities faster.
- Avoiding risky data migrations: Large, sensitive datasets are expensive and risky to migrate. The virtual access model allows governments to keep data where governance, legal or contractual constraints require it, while still enabling modern citizen services on top of that data. (learn.microsoft.com)
- Accelerating AI/low‑code adoption: Exposing live data to Microsoft Copilot and Power Platform—while preserving residency—unlocks AI‑driven insights and automation without the need to create separate data lakes solely for tooling. This can materially speed up use‑case delivery (dashboards, triage automation, virtual assistants) that reduce case processing time.
Cross‑checking and independent verification
To evaluate the claims beyond the PoC announcement, several independent technical facts were verified:
- Dataverse virtual tables are an official Microsoft capability that allows external data to appear in Power Platform without duplication. This confirms the technical feasibility of surfacing AWS‑hosted records into Power Platform apps without copying them into Dataverse. (learn.microsoft.com)
- Private, non‑internet connectivity patterns between AWS and Azure are well‑documented — AWS Direct Connect and Microsoft ExpressRoute (or neutral fabric providers like Equinix) are specifically recommended for private, low‑latency cross‑cloud networking. That validates the networking basis the PoC claims to use. (aws.amazon.com, equinix.com)
- Hitachi as a vendor already maintains strategic partnerships and productisation across Microsoft and AWS ecosystems (multiple joint announcements in 2024–2025), establishing organisational experience and precedent for multi‑cloud integration projects of the kind required for the SMCC. (hitachi.com)
Strengths and opportunities
- Practical multi‑cloud interoperability: The PoC demonstrates a way to end the politics of “pick a cloud” by enabling Microsoft tools to work with AWS data without migration. That’s a pragmatic win for agencies with sprawling estates.
- Reduced data duplication and lower operational overhead: By surfacing data via virtual tables and private connectors, organisations avoid the cost, complexity and security footprint of additional data stores. This is especially important for casework and health records. (learn.microsoft.com)
- Faster delivery and better citizen outcomes: Real‑time dashboards, automated triage and Copilot‑assisted casework can accelerate decision cycles and reduce backlogs. The PoC claims measurable gains in case processing speed and staff productivity.
- Alignment with government interoperability goals: The approach fits naturally with principles such as interoperability, reuse and ‘security by design’ articulated in government technology practice frameworks. That makes it a potential template for cross‑agency reuse and standardisation.
Risks, caveats and what to watch for
- Accreditation and assurance are still necessary. A PoC that “works” technically is not equivalent to formal accreditation under UK government impact levels or other national assurance regimes. Agencies must map the SMCC architecture to their specific controls and pass independent assurance to run case‑handling systems in production. The PoC description does not publish formal accreditation artifacts.
- Operational complexity and skills requirements. Multi‑cloud private networking, identity federation, and secure connector orchestration require cloud networking, identity and security engineering skills that are in short supply. Agencies must budget for training, runbooks and sustained operations. Industry analysis of multi‑cloud patterns consistently flags the operational burden as a core risk. (learn.microsoft.com, megaport.com)
- Performance and feature trade‑offs. Virtual tables and on‑demand integrations are excellent for many scenarios, but they have limitations (e.g., some Dataverse features, reporting, and offline behaviours). For heavy transactional workloads or scenarios that need rich Dataverse features (auditing, change tracking), replication or hybrid models may still be necessary. (learn.microsoft.com)
- Data protection, legal and procurement complexity. Cross‑cloud access does not remove the need to understand contractual terms, data residency rules, third‑party access controls and law‑enforcement or judicial access regimes. Agencies must explicitly document data flows and maintain DPIAs and lawful processing records. This remains a critical governance activity.
- Vendor concentration risk in the tooling layer. While the approach reduces data migration to a single hyperscaler, it still concentrates workflow and AI tooling within Microsoft’s commercial stack. Agencies should plan exit strategies, portability tests and ensure APIs and integration contracts remain robust to avoid new forms of lock‑in. This is a strategic governance decision, not a purely technical one.
Practical checklist for government IT leaders considering the SMCC approach
- Confirm regulatory and accreditation requirements for the workload (e.g., IL levels, FedRAMP equivalence, NCSC guidance). Ensure the SMCC design can be mapped to those controls and that an accreditation plan exists.
- Run a focused pilot with production‑like scale and representative data; test failover, latency, logging and incident response end‑to‑end.
- Evaluate Dataverse virtual table constraints against the intended feature set (dashboards, automation, auditing, offline support). Where features are missing, design compensating controls or hybrid replication. (learn.microsoft.com)
- Design identity and least‑privilege models using Entra ID or federated identity; adopt conditional access and step‑up authentication for sensitive operations.
- Build multicloud network resilience using Direct Connect / ExpressRoute patterns and a neutral fabric provider for redundancy; run egress and latency tests under realistic load. (aws.amazon.com, equinix.com)
- Define a long‑term operational model: runbooks, 24/7 monitoring, incident playbooks, and a skills uplift programme for networking, identity and cloud security teams.
- Document data flow, DPIAs, and legal rationale; ensure procurement contracts cover multi‑cloud interconnect, audit access and vendor responsibilities.
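As one concrete example of the pilot testing called out in the checklist, a team might baseline round‑trip latency over the private path and report percentiles against an agreed service‑level target. This is a generic Python harness sketch; the probe here is a stub, and the endpoint name is hypothetical — a real pilot would call the actual connector under realistic load.

```python
import statistics
import time

def probe(endpoint: str) -> float:
    """Stub for one cross-cloud request; a real test would call the
    connector endpoint and include authentication."""
    start = time.perf_counter()
    time.sleep(0.001)  # simulated round trip
    return (time.perf_counter() - start) * 1000  # milliseconds

def latency_report(endpoint: str, samples: int = 50) -> dict:
    """Collect samples and summarise the percentiles that matter for SLAs."""
    times = sorted(probe(endpoint) for _ in range(samples))
    return {
        "p50_ms": statistics.median(times),
        "p95_ms": times[int(samples * 0.95) - 1],
        "max_ms": times[-1],
    }

# Hypothetical private connector endpoint.
report = latency_report("https://connector.example.internal/health")
# Compare report["p95_ms"] against the agreed target before go-live.
```

Running the same harness during simulated failover (for example, after dropping the primary interconnect) gives the end‑to‑end resilience evidence the checklist asks for.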
Policy and procurement implications
This PoC should shift procurement conversations from “which cloud do we pick?” toward “which capabilities do we need and where must the data remain?” That shift enables a modular procurement model where:
- Data hosting and sovereignty are chosen based on legal and mission requirements.
- Application and productivity tooling are selected for capability and usability.
- Interconnect and security controls are procured as an integral service (including interconnect fees, fabric providers, and ongoing managed networking).
Where this fits in the broader industry movement
The SMCC approach is the practical extension of two converging industry trends: (1) vendors investing in integration points so organisations can mix and match hyperscaler services rather than fully committing to a single provider, and (2) the maturation of private, neutral networking fabrics that make secure, low‑latency cross‑cloud routing operationally feasible.

Hitachi’s public-facing work with Microsoft, AWS and Google Cloud over recent years demonstrates the company’s strategic positioning as a multi‑cloud integrator; those relationships provide the organisational capability needed to build and support cross‑cloud solutions at government scale. (hitachi.com)
Limitations and statements that require further proof
The PoC report contains some assertions that merit further, independent validation before treating them as universally applicable:
- The claim that the initial deployment completed in “eight weeks” is plausible for a focused PoC, but timeline reproducibility will depend on organisational readiness, pre‑existing connectivity, and procurement velocity. This figure should be treated as a PoC milestone, not a guaranteed delivery window for all departments.
- The specific security posture and accreditation status for the PoC were not published in full. Agencies must obtain the architecture’s compliance artifacts (design attestations, penetration test results and accreditation decisions) before rolling the design into production.
- While the PoC references bi‑directional integration, practical implementations may need to restrict write operations or use controlled APIs for updates to avoid accidental policy or provenance gaps; confirmation of how write‑through, transactions and rollback are handled in the SMCC is necessary for high‑assurance casework. This detail was not included in the public announcement and should be validated in any procurement exercise.
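The write‑path concern can be made concrete. One common design routes all updates through a controlled API that restricts writable fields, records provenance, and keeps an undo snapshot per change. The sketch below is a hypothetical Python illustration of that gate — it is not the SMCC's published mechanism, and the field policy is invented for the example.

```python
import copy

class ControlledWriteGate:
    """Gates every update behind validation, provenance capture and a
    rollback snapshot - a hypothetical pattern for controlled
    bi-directional integration."""

    ALLOWED_FIELDS = {"status", "owner"}  # writes restricted by policy

    def __init__(self, store: dict):
        self.store = store
        self.provenance = []

    def update(self, record_id: str, changes: dict, actor: str) -> dict:
        disallowed = set(changes) - self.ALLOWED_FIELDS
        if disallowed:
            raise PermissionError(f"fields not writable: {disallowed}")
        snapshot = copy.deepcopy(self.store[record_id])  # rollback point
        self.store[record_id].update(changes)
        # Provenance record: who changed what, and how to undo it.
        self.provenance.append({"id": record_id, "actor": actor, "undo": snapshot})
        return self.store[record_id]

    def rollback_last(self) -> None:
        entry = self.provenance.pop()
        self.store[entry["id"]] = entry["undo"]

store = {"case-001": {"status": "open", "owner": "team-a"}}
gate = ControlledWriteGate(store)
gate.update("case-001", {"status": "closed"}, actor="alice")
gate.rollback_last()  # record restored to its pre-change state
```

Whether the SMCC handles write‑through, transactions and rollback in anything like this way is exactly the detail that should be validated in procurement.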
Conclusion
The Hitachi Solutions PoC represents a meaningful and pragmatic advancement in government multi‑cloud interoperability: it demonstrates a credible technical pattern to let Microsoft applications use AWS‑resident data in real time, behind private connectivity and with a Zero Trust orientation. The building blocks — Dataverse virtual tables, private Direct Connect/ExpressRoute interconnects and neutral fabric providers — are established and supported by major vendors, and Hitachi’s multi‑cloud experience gives the approach operational credibility. (learn.microsoft.com, aws.amazon.com, hitachi.com)

However, the move from PoC to production requires careful, agency‑specific assurance work: formal security accreditation, performance validation, operational runbooks and procurement changes to codify cross‑vendor responsibilities. When those pieces are put in place, the SMCC pattern can help public sector organisations escape costly migrations, accelerate AI and low‑code adoption, and deliver faster, joined‑up services to citizens — while still protecting the sensitive personal data at the heart of public service delivery.
Quick reference: authoritative technical sources consulted
- Official Microsoft documentation on Dataverse virtual tables and how they enable real‑time access to external data. (learn.microsoft.com)
- AWS engineering guidance on private network connectivity patterns between AWS and Azure (Direct Connect / ExpressRoute guidance). (aws.amazon.com)
- Equinix and neutral fabric provider materials explaining multicloud private routing as an operational practice for cross‑cloud deployments. (equinix.com, deploy.equinix.com)
- Hitachi and Microsoft public releases describing strategic partnerships and multi‑cloud productization efforts that underpin integrator capability. (hitachi.com)
- PublicTechnology coverage of the Hitachi Solutions PoC and its claimed outcomes.