The digital transformation journey in healthcare is marked by complexity, promise, and, often, an unpredictable path toward meaningful innovation. In an era of rapid advances in artificial intelligence (AI), few domains are poised for more profound transformation than healthcare delivery and operations. Conversations with visionary leaders such as Dr. Eric Poon, Chief Health Information Officer at Duke Health, reveal crucial insights into how health systems are navigating the challenges, opportunities, and realities of AI adoption—particularly when it comes to integrating new generations of tools like Copilot into clinical workflows and everyday operations.

Understanding the Realities of AI Implementation in Healthcare

Healthcare organizations are no strangers to technological optimism. However, as Dr. Poon advises, AI—like any disruptive innovation—demands a nuanced and realistic approach. The sobering lesson learned from decades of informatics is that not everything that promises to work will actually deliver results in the complex, high-stakes environment of patient care. Technology may be mature in one setting, yet struggle in another due to organizational readiness, culture, or user preparedness.
Duke Health’s approach underscores the necessity of embracing a “fail fast” mentality, grounded in a culture of experimentation and iteration. Rather than bet the farm on a single vendor or novel application, the organization invests in head-to-head comparisons and phased pilots. Only after careful, data-driven selection is a technology scaled, capitalizing on existing momentum and internal buy-in.
Dr. Poon likens this process to “kissing a lot of frogs to find a prince or princess.” AI solutions are extensively trialed, and only those that prove truly transformative receive the investment required for broad deployment. This pragmatic approach ensures resources are effectively allocated, and the organization remains agile in the face of ongoing technological evolution.

The Pivotal Role of Ambient Technology

Ambient technology—a term referencing AI-powered systems that unobtrusively capture, interpret, and support clinical work—has emerged as a standout success at Duke Health. Over several years, the organization explored a range of vendors and progressively integrated them into clinical workflows, culminating in a direct, head-to-head trial.
The stakes for this initiative were high: clinician burnout, time demands, and documentation overload are endemic problems in the healthcare sector. What differentiated this initiative was not only the systematic comparison of competing tools, but also the engagement strategies that followed. The trial generated considerable excitement, ensuring that when the preferred solution was rolled out to more than 5,000 providers, over a fifth were active daily users within just two months.
This meteoric rate of adoption is unprecedented. Even technological innovations with obvious upsides, such as electronic health records or telehealth, have rarely seen such spontaneous and enthusiastic uptake among clinicians. The key ingredients at Duke included proactive communication, the enlistment of “superusers” or clinical champions, prompt onboarding for interested clinicians, and a robust educational support structure.
The early outcomes have been nothing short of remarkable: feedback from clinical users signals hours reclaimed daily, reduced after-hours charting, and earlier note completion—tangible improvements that directly address the pervasive issue of clinician burnout and administrative overload.
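An adoption curve like the one described above is typically tracked as a daily-active-user (DAU) rate over the rollout cohort. As a minimal illustrative sketch—using an invented toy usage log and the article's cohort size of roughly 5,000 providers, not Duke's actual telemetry—the calculation might look like:

```python
from datetime import date

# Hypothetical usage log of (provider_id, date of use). In practice this
# would come from the ambient tool's telemetry or an EHR audit trail.
usage_log = [
    ("dr_smith", date(2024, 3, 1)),
    ("dr_jones", date(2024, 3, 1)),
    ("dr_smith", date(2024, 3, 2)),
    ("np_patel", date(2024, 3, 2)),
]

TOTAL_PROVIDERS = 5_000  # size of the rollout cohort (from the article)

def daily_active_rate(log, day, total=TOTAL_PROVIDERS):
    """Fraction of the rollout cohort that used the tool on `day`."""
    active = {provider for provider, d in log if d == day}
    return len(active) / total

rate = daily_active_rate(usage_log, date(2024, 3, 2))
print(f"Daily active rate: {rate:.2%}")  # two distinct users in this toy log
```

A rollout team would run this per day and watch whether the rate climbs toward the "over a fifth" threshold the article cites; the same log also supports retention and power-user analyses with minor changes.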

Building the Foundations: Technology, Infrastructure, and Policy

Any discussion of AI in healthcare must move beyond hype to consider foundational requirements. As Dr. Poon articulates, it is critical to have the right decision-making structures—multidisciplinary governance councils with the authority and expertise to ensure AI investments are strategic, effective, and accountable.
AI adoption in a major health system is not simply a matter of purchasing licenses or integrating new applications. It demands clear-eyed evaluation: Does this solution address a real clinical or operational problem? Can it demonstrate measurable benefit? Is the workforce equipped to embrace the inevitable shifts in workflow and expectations?
Duke Health offers instructive examples of best practices in AI democratization and change management. Early and broad exposure to Microsoft Bing Copilot Search, for instance, meant diverse cohorts of healthcare workers could experiment and acclimate to this tool—within carefully considered guardrails.
Similarly, the measured rollout of Microsoft Office Copilot began with a limited group, accompanied by pilot data collection and benefit quantification. This approach ensured leaders outside informatics or IT could make informed decisions about extending the tool’s reach to their own teams, reinforcing a culture of accountability and value-driven investment.
At every step, Duke’s experience suggests that the path to successful AI implementation is paved with careful piloting, inclusive engagement, and rigorous measurement.
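The "benefit quantification" step described above can be as simple as comparing a burden metric before and during a pilot. A minimal sketch with invented numbers (the metric, after-hours charting minutes per clinician per day, is an assumption chosen to match the outcomes the article reports):

```python
from statistics import mean

# Hypothetical minutes of after-hours charting per clinician per day,
# measured for the same five pilot clinicians before and during the pilot.
before = [95, 110, 80, 120, 100]
during = [60, 75, 55, 90, 70]

saved = mean(before) - mean(during)          # average minutes reclaimed per day
pct = saved / mean(before)                   # relative reduction
print(f"After-hours charting down {saved:.0f} min/day ({pct:.0%})")
```

Reporting a concrete figure like this is what lets leaders outside informatics or IT decide whether extending the tool to their own teams is worth the cost, which is the accountability loop the article describes.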

Security, Governance, and Trust

Healthcare is perhaps unique among industries in the gravity of its data privacy and security obligations. Every AI deployment—particularly those involving protected health information (PHI)—must clear a high bar for safety, compliance, and clinical appropriateness.
Duke Health’s governance model is built on interdisciplinary input: technical experts assess the security readiness of proposed tools; clinical leaders scrutinize how AI may be used, by whom, and under what circumstances. The resulting guidance is pragmatic but firm: clinicians must use only vetted, approved tools for patient-related work, and regardless of the AI’s capabilities, take personal and professional responsibility for reviewing its outputs.
Only clinicians with the proper training and clinical expertise are authorized to use these tools, ensuring a layer of human oversight remains front and center. At its core, this is about maintaining trust—among clinicians, for patients, and with the broader healthcare ecosystem.

Copilot in Action: Lessons from the Front Lines

The Copilot family of AI-powered tools from Microsoft has earned a prominent place in Duke Health’s forward-thinking strategy. Bing Copilot Search allows clinicians and support staff to access curated, relevant information rapidly, while Microsoft Office Copilot transforms day-to-day productivity, from drafting documents to managing communications.
Perhaps the most transformative impact, however, is observed with the clinical-grade ambient technology Copilot in direct care environments. As Dr. Poon’s team discovered through rigorous piloting, the right tool can profoundly change how—and how quickly—clinicians document patient encounters, synthesize data, and make informed decisions.
Feedback from Duke Health’s providers echoes a consistent refrain: AI is not merely a theoretical benefit but a concrete workforce multiplier. Providers are reclaiming time, reducing overtime, and experiencing less after-hours administrative burden. In a field where understaffing, physician burnout, and documentation demands are persistent threats to quality and morale, these gains represent a critical inflection point.

Strengths and Success Factors

The Duke Health story illuminates several best practices and notable strengths that can serve as a model for other healthcare organizations considering AI adoption:
  • Fail Fast, Iterate Rapidly: Recognize that not every innovation will succeed, and formalize processes for rapid, low-risk experimentation.
  • Engage and Empower End Users: Clinicians, nurses, and staff should have a voice in evaluating tools and shaping their deployment. Superuser networks foster peer learning.
  • Leverage Internal Buzz: Pilot trials that involve multiple vendors or tools can catalyze excitement, accelerating organization-wide uptake.
  • Democratize AI: Early, broad exposure with proper support and safety guardrails allows stakeholders with varying levels of technical skill to gain comfort and familiarity.
  • Enforce Strong Governance: Multidisciplinary councils provide vital oversight, balancing opportunity with risk to ensure patient safety and regulatory compliance.
  • Support Through Education and Responsiveness: Fast onboarding for eager adopters and continuous support ensure that organizational momentum isn't lost to bureaucratic drag.

Potential Risks and Ongoing Challenges

While the gains are considerable, AI in healthcare presents real—and occasionally underappreciated—risks.
  • Data Privacy and Security: The proliferation of AI tools increases the attack surface for sensitive data. Even with robust governance, malicious actors or accidental exposures remain a significant threat. Continuous vigilance is essential.
  • Bias and Quality Control: Even the best AI is susceptible to the quality of its training data. Without careful monitoring, AI tools can perpetuate or even exacerbate existing clinical biases, potentially impacting patient safety and equity.
  • Overreliance on Automation: Tools that seem infallible can engender a false sense of security. It’s critical that clinicians remain trained and empowered to question or override AI-generated recommendations.
  • Resource and Change Fatigue: While innovative organizations like Duke Health demonstrate how to scale rapidly, sustaining momentum—and avoiding fatigue or backlash—demands careful management.
  • Regulatory and Legal Uncertainty: Authorities are still clarifying the legal and ethical frameworks surrounding AI in healthcare. Adopting organizations must be prepared for evolving requirements, including those concerning transparency, explainability, and liability.

The Broader Industry Implications

Duke Health is not alone in facing the challenges and rewards of AI in healthcare, but its experience is illustrative of a larger trend. Technology is arriving faster than many organizations can safely absorb it. Those that move too slowly risk falling behind, but reckless adoption risks privacy disasters and clinical harm.
What sets leading organizations apart is not technical prowess alone, but a holistic approach that balances innovation with accountability, and technological enthusiasm with caution. Effective governance, inclusive leadership, and a workforce prepared for continuous learning are the differentiators that separate success stories from cautionary tales.

Roadmap for Healthcare AI Success

Healthcare organizations looking to embark on or accelerate their AI journey can draw several lessons from Duke’s experience:
  • Establish Strong, Multidisciplinary Governance: Bring together clinical, technical, legal, and administrative leadership for decision-making and oversight.
  • Develop Robust Onboarding and Education Frameworks: Ensure users not only know how to use the tools, but understand their limitations and best practices.
  • Start Small, Scale Strategically: Pilot, measure, and adapt. Avoid deploying untested solutions at scale.
  • Prioritize Security and Compliance: Integrate privacy, security, and regulatory review into every stage of technology selection and deployment.
  • Foster a Culture of Experimentation: Encourage staff to provide feedback, participate in pilots, and share lessons learned.
  • Measure and Share Impact: Collect real-world data on efficiency, satisfaction, and outcomes, sharing successes and challenges frequently with stakeholders.
  • Continuously Monitor and Adapt: The AI landscape will change rapidly; build agility into your strategy and workflows.

Conclusion: The Future of AI in Healthcare

The accelerating adoption of AI in healthcare is transforming the art and science of medicine. Duke Health’s experience with Copilot and ambient technologies offers a rare blueprint for responsible, effective, and rapid deployment at scale.
The road ahead is far from simple: Data privacy, ethical considerations, and sustained organizational readiness are ongoing challenges. But with transparent governance, an empowered workforce, and a willingness to fail fast—yet scale what works—healthcare organizations can deliver on AI’s transformative promise, driving efficiency and patient-centered care into a new era.
For the industry at large, the lesson is clear: It’s not about jumping on every AI opportunity, but about knowing which “frog” to bet on, preparing the organization for change, and setting a course for measurable, repeatable success. In doing so, the future of healthcare—augmented by AI—becomes not just a vision, but a tangible, achievable reality.

Source: HealthTech Magazine Q&A: Duke Health’s Dr. Eric Poon on AI Adoption and the Organization’s Copilot Implementation
 
