The University of Southern Indiana’s AI Advisory Committee will host an in‑person, beginner‑friendly workshop titled “Getting Started with Microsoft Copilot” on Monday, October 20, from 1:00 to 2:00 p.m. in Orr Center 2018, presented by Kyle Tharp, IT Business Manager. The session promises an overview of AI tools and access, practical reasons for students and staff to use Copilot, coverage of AI agents, and live demonstrations aimed at faculty and staff who are new to AI tools.
Source: University of Southern Indiana | USI AI Advisory Committee workshop October 20 - University of Southern Indiana
Background
Microsoft has rapidly folded large language model capabilities into productivity software under the Microsoft Copilot family—components that include Copilot for Microsoft 365, Copilot Chat, and administrative tools such as Copilot Studio. These tools are positioned to speed routine tasks (drafting, summarizing, lesson planning) while offering integration points into the Microsoft ecosystem that many campuses already use.

Higher‑education institutions and continuing education teams have converged on a common workshop model: short, hands‑on sessions that combine basic AI literacy (how models work and where they fail), practical demos (prompt engineering, co‑authoring workflows), and governance guidance (account context, data handling, and procurement). This blended format quickly moves attendees from awareness to immediate, low‑risk experimentation.
What the USI workshop promises (quick summary)
- Presenter: Kyle Tharp, IT Business Manager
- Title: Getting Started with Microsoft Copilot
- Audience: USI faculty and staff at beginner level with AI tools
- When & where: 1:00–2:00 p.m., Monday, October 20 — Orr Center 2018
- Core coverage:
  - Overview of AI tools and how to access them
  - Use cases for students, educators, and administrators
  - Introduction to AI agents and live demonstrations
Why this matters to faculty, staff, and campus IT
Microsoft Copilot and similar copilots are not novelty toys for tech enthusiasts; they are deliberately integrated into daily workflows in Word, Outlook, Teams, and other Microsoft 365 apps to save staff time and improve consistency. Typical early wins include:
- Faster draft creation (emails, lesson plans, rubrics)
- Rapid summarization of long documents or meeting transcripts
- Template and rubric generation for assessment
- Reusable prompt recipes and co‑authoring workflows that speed iteration
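The “prompt recipe” idea can be made concrete. Below is a minimal, hypothetical sketch of a reusable template a unit might standardize on; the function name and recipe wording are illustrative assumptions, not a Copilot API, since the filled-in prompt is simply pasted into Copilot Chat (or any assistant) by the user.

```python
# Hypothetical reusable "prompt recipe": a parameterized template that
# colleagues can fill in the same way each time, so iterations stay
# consistent across a department. This is plain string templating, not
# a call to any Copilot or Microsoft 365 API.

RUBRIC_RECIPE = (
    "You are helping a university instructor. Draft a grading rubric for "
    "the assignment below. Use {levels} performance levels and {criteria} "
    "criteria. Keep each cell under 25 words.\n\n"
    "Assignment: {assignment}"
)

def build_rubric_prompt(assignment: str, levels: int = 4, criteria: int = 5) -> str:
    """Fill the shared recipe with task-specific details."""
    return RUBRIC_RECIPE.format(levels=levels, criteria=criteria, assignment=assignment)

prompt = build_rubric_prompt("A 500-word reflection on a lab exercise.")
print(prompt)
```

The point of the recipe is consistency: the structure (levels, criteria, length limits) is agreed once, and only the assignment text changes between uses.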
Microsoft Copilot: a concise primer
What Copilot is and where it appears
Microsoft Copilot is a portfolio of AI features woven into the Microsoft 365 ecosystem. Key elements include:
- Copilot for Microsoft 365 — embedded assistance inside Word, Excel, PowerPoint, and Outlook for drafting, analysis, and summarization.
- Copilot Chat — conversational, chat‑style interaction that can synthesize documents and answer multi‑turn queries inside the tenant context.
- Copilot Studio — a no‑/low‑code authoring environment that allows institutions to build agents—customized, tenant‑scoped assistants that connect to internal knowledge sources.
Copilot agents in brief
Agents are preconfigured conversational assistants designed to represent a team, office, or workflow (advising, careers, registrars). Institutions can attach knowledge sources—uploaded documents, SharePoint content, or external connectors—and publish these agents within their tenant for authorized users. Agents can triage routine queries, surface institutional resources, and automate follow‑up tasks when properly designed.

Security, privacy, and data governance — the practical truth
The most consequential technical and policy question for campuses is who sees and may use the data you send to a Copilot interaction. Vendor messaging and enterprise documentation consistently state that when Copilot features are enabled under a properly configured organizational tenant (Entra ID / Azure AD), tenant content and prompts are generally excluded from training Microsoft’s public foundation models. That technical posture is central to why many institutions adopt enterprise Copilot instead of consumer chatbots.

That said, important caveats remain:
- These protections depend on tenant configuration and contractual language; they are not automatic for consumer or personal accounts.
- Administrative misconfiguration or unintended connectors can expose data. Governance and procurement steps are therefore essential.
- Claims that vendor features will never be used for training in perpetuity should be treated with caution; contractual terms and product roadmaps can change.
How peer institutions run these workshops (what works)
Case studies and event outlines from other campuses show recurring patterns you can expect at USI’s session:
- Short, applied labs where participants produce tangible artifacts (rubrics, email templates, lesson plan drafts).
- Explicit demos showing the difference between consumer prompt usage and tenant‑protected Copilot interactions.
- A pedagogical segment that reframes Copilot as a co‑author—the instructor edits and refines outputs rather than accepting them verbatim.
- Clear guidance on syllabus language, assignment design, and academic integrity expectations when AI is used.
Live demos and agent examples: what you’ll likely see
USI’s advertised demos will probably include the following practical items, which other institutions have found useful to show live:
- Co‑authoring in Word: create an assignment prompt from a short brief, iterate tone and requirements, and export to the LMS.
- Meeting summarization from Teams: show how Copilot extracts action items and decisions from a transcript.
- A simple Copilot agent walkthrough: how an advising agent asks guided questions and maps responses to institutional resources.
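The advising-agent pattern (guided questions mapped to institutional resources) can be sketched as simple triage logic. The resource names and topics below are hypothetical placeholders for illustration, not real USI pages, and a production agent built in Copilot Studio would use configured knowledge sources rather than a keyword table.

```python
# Minimal sketch of advising-agent triage: match a student question to a
# known topic and return the mapped resource, or escalate to a human.
# All topics and resource descriptions are hypothetical examples.

RESOURCES = {
    "registration": "Registrar's Office: add/drop deadlines and forms",
    "financial aid": "Student Financial Assistance: FAFSA and scholarships",
    "advising": "Academic Advising: schedule an appointment",
}

def triage(question: str) -> str:
    """Route a question to a resource; unmatched questions go to a person."""
    q = question.lower()
    for topic, resource in RESOURCES.items():
        if topic in q:
            return resource
    return "No match: forward to a staff member for follow-up."

print(triage("How do I check my financial aid status?"))
```

Even this toy version shows the design question a real agent must answer: what happens when no resource matches. Escalation to a human is the safe default.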
Risks, failure modes, and what to watch out for
Understanding where Copilot can mislead or produce liability is essential if an institution wants sustainable, ethical adoption.
- Hallucinations and accuracy gaps. Generative outputs can be plausible but incorrect. Outputs used for clinical, legal, or research purposes should always be verified by domain experts.
- Data exposure through prompts. Users sometimes paste sensitive content into chat to get a cleaner answer—this habit can leak restricted data unless enterprise controls and user training prevent it.
- Overreliance and skill atrophy. If students rely on Copilot to produce final submissions without reflective work, critical thinking and writing skills can atrophy. Pedagogy must be adapted to preserve learning objectives.
- Administrative blind spots. Workshops that teach only user‑level prompts but do not coordinate with IT risk enabling experimentation without guardrails; tenant and procurement tracks must accompany user training.
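The data-exposure risk above is partly a habit problem: users paste raw text into chat. One mitigation sometimes taught in trainings is a pre-prompt redaction pass; the sketch below is an illustrative assumption on my part (patterns are not exhaustive and are no substitute for enterprise DLP controls or user training).

```python
import re

# Illustrative pre-prompt redaction: scrub obvious identifiers before
# pasting text into any chat tool. The patterns catch only common,
# well-formatted cases (email, US SSN, US phone) and are not a complete
# data-loss-prevention solution.

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace likely identifiers with placeholder tags."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Email jane.doe@example.edu or call 812-555-1234."))
```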
Practical recommendations for USI attendees (what to bring and how to prepare)
To get maximal, immediate value from the October 20 session, attendees should:
- Bring a real, small use case: one routine task that consumes time (e.g., preparing an email template, drafting a syllabus blurb, creating a rubric).
- Know which account you will use: personal vs. institutional. If uncertain, check with your unit’s IT or the presenter to confirm whether your USI account provides tenant protections.
- Prepare a short set of questions about governance: who manages tenant settings, how are logs audited, and where to escalate procurement questions.
- Be ready to treat outputs as drafts: plan to review, edit, and localize any Copilot‑generated content before deploying it to students or the public.
Recommendations for USI IT and leadership (governance checklist)
Workshops succeed when paired with institutional clarity. IT and campus leaders should consider:
- Confirming and documenting tenant‑level protections and public guidance that staff can rely on. Put a short FAQ on the university IT site showing how to tell whether a Copilot interaction is protected.
- Publishing an acceptable‑use summary for AI tools that clarifies what must not be put into external models (PII, protected research data, etc.).
- Offering follow‑up office hours and a sandbox environment where faculty can try Copilot with non‑sensitive materials before using it in courses.
- Coordinating procurement and contract review for any third‑party AI tools so legal and data‑protection implications are reviewed before adoption.
Sample session timeline (how an effective 60‑minute workshop is structured)
- 0–10 minutes: Quick primer — what Copilot is, account context (consumer vs. tenant), and core features.
- 10–25 minutes: Live demo — co‑author a syllabus blurb and a rubric in Word; show iteration and export.
- 25–35 minutes: Agent demo — brief walkthrough of a tenant agent that answers student questions or triages advising.
- 35–50 minutes: Hands‑on micro‑lab — attendees try a short prompt recipe on their devices (or follow along) with emphasis on editing outputs.
- 50–60 minutes: Q&A and governance checklist — account guidance, procurement pathways, and next steps.
Metrics and follow‑up: how to know if Copilot adoption is working
Campuses that measure impact typically track a few pragmatic indicators:
- Faculty adoption: number of instructors using Copilot for prep or feedback.
- Time saved: self‑reported reduction in hours for drafting or grading tasks.
- Quality and integrity: faculty and student satisfaction with AI‑augmented assignments and any trend in academic‑integrity incidents.
- Policy compliance: evidence that institutional guidance on data handling is being followed.
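The first two indicators above could be computed from a short follow-up survey. The sketch below is a minimal illustration; the survey fields and numbers are hypothetical, not real USI data.

```python
# Hypothetical follow-up survey responses: whether a respondent uses
# Copilot for prep/feedback, and self-reported weekly hours saved.
responses = [
    {"uses_copilot": True,  "hours_saved_week": 2.0},
    {"uses_copilot": True,  "hours_saved_week": 0.5},
    {"uses_copilot": False, "hours_saved_week": 0.0},
    {"uses_copilot": True,  "hours_saved_week": 1.5},
]

# Adoption: share of respondents using Copilot at all.
adoption_rate = sum(r["uses_copilot"] for r in responses) / len(responses)

# Time saved: average self-reported hours among adopters only.
adopters = [r for r in responses if r["uses_copilot"]]
avg_hours_saved = sum(r["hours_saved_week"] for r in adopters) / len(adopters)

print(f"Adoption: {adoption_rate:.0%}, avg hours saved: {avg_hours_saved:.2f}/week")
```

Self-reported hours are noisy, so trends over several survey rounds matter more than any single number.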
Strengths and opportunities of the USI session
- Timely, practical orientation. A focused one‑hour session gets beginners to the point of experimentation without overwhelming them.
- Live demos build confidence. Short co‑authoring demos let attendees see immediate gains and demystify the tool.
- Opportunity to align pedagogy and policy. If the workshop includes the governance checklist and IT confirmations, it will reduce the common gap between user enthusiasm and institutional controls.
Potential gaps and cautions
- One hour is short. The session should include signposts for follow‑up training (advanced sessions, office hours, sandbox labs) to truly change behavior beyond awareness.
- Vendor specifics may date the materials. Participants should be given links or references to tenant configuration steps and updated vendor documentation so the material remains actionable as product features evolve.
- Avoid overpromising: if workshop slides contain “coming soon” product features or roadmap items, label them clearly and follow up with confirmed dates or vendor documentation where possible.
Final takeaways for USI faculty and staff
The USI workshop on October 20 is a well‑timed, practical primer that mirrors successful approaches used at peer institutions: combine short, hands‑on demos with an explicit governance checklist and clear next steps. Attendees should leave able to:
- Distinguish between consumer and tenant Copilot contexts and why that matters for privacy and training.
- Use Copilot as a co‑authoring tool—generate drafts and iterate, not as an authoritative source.
- Ask the right follow‑up questions of IT and procurement about tenant configuration and contractual protections.
Conclusion
USI’s “Getting Started with Microsoft Copilot” session offers a compact, practical entry point into campus‑level AI adoption. If the workshop balances live demonstrations with explicit guidance about account context, data handling, and instructional design, it can accelerate useful, low‑risk adoption across offices and classrooms. The real institutional value will come when USI couples this beginner training with clear technical configurations, short policy guides for users, and follow‑up practice opportunities so faculty and staff can turn early curiosity into disciplined, productive use.