Penn State this week announced Learning Tools for Teaching: Explore, Engage, Elevate, a focused, weeklong professional-development program that brings campus-supported learning platforms together in one place for hands-on demonstrations, pedagogical conversation, and practical workshops designed for faculty and staff who want to deepen their digital teaching practice. The event runs Sept. 29 through Oct. 3 in the Dreamery (Shields Building) and pairs vendor specialists with Penn State instructional technologists to spotlight one enterprise platform each day: Top Hat, LinkedIn Learning, VoiceThread, and Microsoft 365 (including Copilot and LTI integrations). The week culminates in open virtual office hours for follow-up and artifact development. (psu.edu)

Overview

Penn State’s new Learning Tools for Teaching week is positioned as both an introduction and a practical lab for educators who already use, or are considering adopting, institutional learning technologies. The program emphasizes three outcomes: deeper familiarity with campus-supported tools, hands-on skill development in workshop settings, and peer-led sessions that surface instructional best practices. Each day is structured around expert-led presentations followed by interactive workshops and office hours to let participants test features, collect artifacts, and ask product-specific questions. (psu.edu)
The choice of the Dreamery as the physical hub reinforces the hands-on intent: the Dreamery is a TLT facility on the Shields Building ground floor explicitly designed for experimentation with emerging teaching technologies and active learning configurations. Its flexible furniture, AR/VR assets, and facilitation support create an environment engineered for experimenting with new classroom designs as well as software tools. (tlt.psu.edu)

Background: Why a dedicated week for learning tools matters

Higher-education teaching has shifted from isolated tool adoption toward integrated workflows, where LMS, student-response systems, asynchronous discussion tools, and AI assistants interact to shape assessment, presence, and engagement. When institutions support multiple enterprise platforms, faculty and staff face two major challenges:
  • Fragmentation: Tools overlap in capability (polling, discussion, analytics), making informed choices essential.
  • Adoption fidelity: Instructors often have surface-level familiarity but not workflow-level competence needed to design effective activities.
Penn State’s event addresses both by creating a low-risk, low-pressure space for practice and exchange. The week’s format — platform overview, pedagogy-focused case studies, guided workshops, and drop-in office hours — mirrors adult learning principles that prioritize immediate applicability and scaffolded practice.

Day-by-day breakdown and pedagogical implications

Monday, Sept. 29 — Top Hat: active learning and formative data

Top Hat serves as Penn State’s enterprise student-response and engagement platform; the week opens with a full program focused on in-class interactivity, polls, quizzes, and content that extends learning outside class. Sessions include live lecturing integrations, strategies to reach distracted learners, and practical approaches for extending engagement beyond the classroom. Faculty office hours and product feedback rounds are scheduled to support immediate iteration. (psu.edu, it.psu.edu)
Top Hat’s public product materials confirm the core claims promoted during the session: a range of question types and in-class activities, LMS integration (including grade sync), interactive eText capabilities, and instructor-facing analytics that surface formative gaps. These features support frequent low-stakes assessment — a research-backed practice for improving retention and guiding instruction. For instructors, the immediate takeaway is how to convert lecture moments into evidence-producing interactions. (tophat.com)
Key practical applications highlighted:
  • Use live polling and click-on-target questions to diagnose understanding in real time.
  • Deploy short, low-stakes in-class quizzes to increase retrieval practice and reduce test anxiety.
  • Leverage Top Hat’s content editor and eText features for flipped or blended course design.
Potential risks and considerations:
  • Equity: Polling systems must be used with attention to digital access and device parity.
  • Data privacy and grade integration: Ensure clear communication with students about data use and how Top Hat scores feed the LMS gradebook.
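Because grade sync is a common failure point, instructors can spot-check whether Top Hat scores actually arrived in the LMS gradebook. The sketch below is a minimal illustration, assuming Canvas's public REST API; the base URL, API token, course ID, and assignment ID are hypothetical placeholders, and the filtering helper is kept pure so it can be checked offline with sample data.

```python
"""Spot-check Top Hat -> Canvas grade sync via the Canvas REST API.

The endpoint path comes from Canvas's public API; the base URL, token,
course ID, and assignment ID below are hypothetical placeholders.
"""
import json
import urllib.request

CANVAS_BASE = "https://canvas.example.edu"   # hypothetical Canvas tenant
TOKEN = "REPLACE_WITH_API_TOKEN"             # hypothetical API token
COURSE_ID = 12345                            # hypothetical course ID


def fetch_submissions(assignment_id):
    """Fetch submissions for one assignment from the Canvas REST API."""
    req = urllib.request.Request(
        f"{CANVAS_BASE}/api/v1/courses/{COURSE_ID}"
        f"/assignments/{assignment_id}/submissions?per_page=100",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)


def unsynced_students(submissions):
    """Pure helper: user IDs whose submission has no score recorded yet."""
    return [s["user_id"] for s in submissions if s.get("score") is None]


if __name__ == "__main__":
    subs = fetch_submissions(67890)          # hypothetical assignment ID
    missing = unsynced_students(subs)
    print(f"{len(missing)} students have no synced score: {missing}")
```

Running a check like this before grades are due turns "I enabled grade sync" into verifiable evidence that every student's scores made the trip.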

Tuesday, Sept. 30 — LinkedIn Learning: professional development and curricular supplements

LinkedIn Learning takes center stage for a day split between staff-focused professional-growth programming and faculty-focused applications for teaching. The platform offers enterprise-driven learning paths, certificates of completion, and curated content that institutions often use for staff training, digital-fluency initiatives, and to supplement course modules that require industry-aligned instruction. (psu.edu, linkedin.com)
LinkedIn Learning’s enterprise documentation emphasizes features that matter for universities: single sign-on with organizational credentials, learning-path customization, reporting on engagement and completions, and integration options for recommended content. For faculty, the platform is useful to surface short, skill-based modules to students (for example, software tutorials or career-readiness modules), while staff units often use it for compliance or upskilling. Recent product updates also show a push toward career-centric tools like the Career Hub, which may be attractive for student-services teams. (learning.linkedin.com)
Practical recommendations for attendees:
  • Map LinkedIn Learning content to specific course outcomes (not just “extra credit”).
  • Use organization-curated learning paths to align staff development with institutional goals.
  • Review privacy and reporting policies before using completion certificates for grading.
Limitations to watch:
  • Not every course topic maps neatly to on-demand video lessons; design intentional scaffolding for transfer.
  • Organizational visibility into activity means faculty and staff should know what participation signals to supervisors.

Wednesday, Oct. 1 — VoiceThread: asynchronous, multimodal discussion

VoiceThread is a multimedia discussion and presentation platform designed to expand presence in asynchronous learning environments. The Wednesday agenda mixes pedagogical framing with faculty spotlights and two hands-on workshops: a beginner session and an advanced pro-tips lab. VoiceThread’s value proposition is clear — it allows image/video/slideshow-based prompts to be annotated with voice, text, and video comments, creating a conversational space that mirrors face-to-face discussion in tone and richness. (psu.edu, ed.voicethread.com)
Higher-education case studies and institutional use cases show VoiceThread’s strengths in:
  • Humanizing online courses via student-created audio/video introductions and reflections.
  • Promoting low-stakes spoken practice, particularly in language and presentation-focused disciplines.
  • Enabling peer critique and iterative feedback pathways that are richer than text-only forums. (cei.bd.psu.edu, voicethread.psu.edu)
Workshop takeaways likely to be emphasized:
  • Designing prompts that scaffold reflective thinking rather than expecting polished productions.
  • Using VoiceThread for formative checkpoints that generate artifacts for later assessment.
  • Integrating VoiceThread with the LMS via LTI to preserve participation data in the course gradebook.
Known constraints:
  • Accessibility and bandwidth — audio/video-first approaches require attention to captioning and upload speeds.
  • Instructor time — multimedia feedback can be more time-consuming; consider rubrics and peer-review structures to scale grading.

Thursday, Oct. 2 — Microsoft 365: Copilot, LTI for Canvas, and intelligent agents

Thursday focuses on Microsoft 365 and the emerging institutional layer of AI assistants, with sessions on “Unlocking Productivity with Microsoft Copilot,” a technical demo of Microsoft 365 LTI for Canvas, and a practical workshop on “Building Intelligent Agents with Microsoft Copilot.” Sessions combine productivity-focused guidance (lesson-plan generation, summarization workflows) with developer and no-code paths for institutional agents. (psu.edu)
Microsoft’s published materials describe Copilot for Education as a toolkit that can generate lesson plans, rubrics, quizzes (with alignment to standards), and provide AI-assisted feedback — while also enabling institutions to build Copilot agents that connect to LMS data and institutional knowledge stores. The broader Microsoft developer and community event ecosystem includes hands-on training for building agents in Copilot Studio and emphasizes secure integration, data governance, and the ability to orchestrate tasks via Power Automate and other connectors. These sessions signal a move from static AI assistance (single-response generation) to actionable, workflow-integrated agents that can automate tasks and surface contextual resources for instructors and students. (learn.microsoft.com)
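For readers curious what the Canvas side of an LTI install looks like, the sketch below assembles the request body for Canvas's documented "create an external tool" endpoint. The tool name, launch URL, and credentials are hypothetical, and real Microsoft 365 LTI deployment is handled by LMS administrators (typically through developer keys), so treat this as an illustration of the API shape rather than a deployment recipe.

```python
"""Illustrative sketch: installing an LTI tool in a Canvas course.

POST /api/v1/courses/:id/external_tools is part of the public Canvas
REST API; the tool name, URL, key, and secret here are hypothetical.
"""
import json
import urllib.request


def build_tool_payload(name, launch_url, consumer_key, shared_secret):
    """Pure helper: assemble the fields Canvas expects for an
    LTI 1.1-style course-level tool install."""
    return {
        "name": name,
        "url": launch_url,
        "consumer_key": consumer_key,
        "shared_secret": shared_secret,
        "privacy_level": "public",  # controls what user data the tool receives
    }


def install_tool(base_url, token, course_id, payload):
    """POST the payload to the Canvas external_tools endpoint."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/courses/{course_id}/external_tools",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_tool_payload(
        "Example LTI Tool",                     # hypothetical tool name
        "https://tool.example.com/lti/launch",  # hypothetical launch URL
        "example-key",
        "example-secret",
    )
    # install_tool("https://canvas.example.edu", "API_TOKEN", 12345, payload)
```

The `privacy_level` field is the piece worth pausing on during the demo: it governs how much student identity data the external tool receives at launch, which connects directly to the governance questions raised later in the week.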
Practical implications:
  • Copilot can accelerate content generation, but pedagogical judgment is necessary to adapt AI outputs to course contexts.
  • Building agents (e.g., Canvas Teacher/Student agents) requires planning for data access, authentication, and permissions.
  • Institutions must design guardrails to ensure AI outputs are pedagogically sound and equitable.
Risks and governance considerations:
  • Data privacy and training datasets: Microsoft’s guidance says organizational data is not used to train the underlying models in some enterprise configurations, but administrators must confirm contractual and technical settings before broad rollout.
  • Overreliance: Copilot outputs should be treated as generative aids rather than authoritative content; faculty oversight prevents incorrect or biased outputs from being adopted wholesale. (techcommunity.microsoft.com, developer.microsoft.com)

Friday, Oct. 3 — Virtual office hours and artifact development

The week closes with virtual office hours where attendees can drop in to ask follow-up questions, continue artifact creation (syllabus supplements, VoiceThread prompts, Top Hat activities, Copilot-generated lesson drafts), and get one-on-one help from Penn State professionals. The session design aligns with adult learning best practices by allowing participants to leave with tangible outputs rather than just notes. (psu.edu)

What this week means for faculty and staff — practical benefits

  • Hands-on practice: The workshop structure reduces the gulf between awareness and application, helping instructors move from “I know this tool exists” to “I can design an activity with this tool.”
  • Cross-platform thinking: Seeing Top Hat, VoiceThread, LinkedIn Learning, and Microsoft 365 side by side helps educators choose tools based on pedagogical fit rather than features alone.
  • Professional growth: LinkedIn Learning sessions tie into career-readiness and staff development frameworks; Copilot and Top Hat workshops support productivity and formative assessment skills.
Benefits summarized:
  • Rapid upskilling in enterprise tools
  • Opportunities for interdisciplinary collaboration and shared practices
  • Access to vendor experts and institutional staff for troubleshooting and planning

Critical analysis: strengths, trade-offs, and governance gaps

Strengths

  • Integrated, scaffolded format: The week’s combination of vendor overviews, faculty spotlights, hands-on practice, and office hours follows well-established professional learning models that produce behavior change.
  • Institutional alignment: By centering Penn State-supported platforms, sessions reduce confusion about licensing, data policies, and support channels — a critical step for scalable adoption.
  • Emphasis on pedagogy: Including faculty spotlights and pedagogical framing mitigates the “tools-first” trap and highlights real classroom uses.
These strengths are evident in the event schedule and the Dreamery’s role as an experimentation space. (psu.edu, tlt.psu.edu)

Trade-offs and operational risks

  • Tool proliferation risk: Even with institutional support, having multiple platforms with overlapping capabilities raises the risk that course design becomes tool-driven rather than outcome-driven. Institutions should invest in crosswalks that map a learning outcome to a recommended tool and workflow.
  • Support and sustainability: Intensive workshops can spark adoption, but long-term usage requires accessible help resources, TA training models, and automated integrations (grade sync, roster provisioning) to prevent faculty fatigue.
  • Equity and access: Many interactive tools assume reliable internet and modern devices. Sessions must address equitable implementation strategies — for example, offline alternatives, asynchronous options, or campus-device loan programs.
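The crosswalk idea above can be as lightweight as a lookup table mapping a learning outcome to a recommended tool and workflow. The sketch below is purely illustrative; the outcome names and tool pairings are assumptions for demonstration, not an official Penn State mapping.

```python
"""Minimal outcome-to-tool crosswalk.

Illustrative only: the outcome names and pairings below are assumed
examples, not an official institutional recommendation."""

CROSSWALK = {
    "real-time comprehension checks": (
        "Top Hat", "live polling with LMS grade sync"),
    "asynchronous discussion with presence": (
        "VoiceThread", "audio/video prompts delivered via LTI"),
    "industry-aligned skill modules": (
        "LinkedIn Learning", "curated learning path linked from the LMS"),
    "content drafting and feedback": (
        "Microsoft 365 Copilot", "AI draft, then instructor review and edit"),
}


def recommend(outcome):
    """Return (tool, workflow) for an outcome, or None if unmapped."""
    return CROSSWALK.get(outcome.strip().lower())


if __name__ == "__main__":
    print(recommend("Real-time comprehension checks"))
```

Even a table this small changes the conversation: course design starts from the outcome column rather than from whichever tool was demonstrated most recently.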

Data governance and AI ethics

Copilot and agent functionality deliver high potential, but they also heighten governance requirements. Institutions must:
  • Verify contractual protections and data use language for AI features.
  • Configure tenancy and data-processing settings that prevent unwanted exposure of student records.
  • Train faculty on how to evaluate and edit AI-generated materials to avoid embedding biases or inaccuracies.
Microsoft’s documentation and community guidance highlight that Copilot agents can be built and customized, but they also emphasize secure integration, compliance, and the need to ground agents with authoritative knowledge sources — all important guardrails Penn State should plan for as deployment deepens. (learn.microsoft.com, techcommunity.microsoft.com)

Practical implementation checklist for attendees and department leaders

  • Register for sessions aligned to immediate needs (course redesign, staff training, accessibility).
  • Prepare by identifying a small, concrete deliverable to produce during the week (a Top Hat quiz, a VoiceThread prompt, a Copilot draft).
  • Bring sample course materials (syllabus module, lecture slides, discussion prompts) to iterate with vendor and Penn State staff support.
  • Document integration points: confirm LMS gradebook sync, roster integration, and data-retention policies before enabling auto-grade or auto-sync features.
  • Plan for accessibility: request captioning workflows for multimedia assignments and rehearse alternative assessments for students with limited bandwidth.
These steps reflect pedagogical best practice as well as pragmatic measures to reduce adoption friction.

Opportunities for institutional leaders

  • Align vendor partnerships with center-for-teaching priorities to produce shared playbooks mapping common course types to a recommended toolset.
  • Develop a lightweight governance checklist for AI tool adoption: data flows, training-set disclosures (where applicable), access controls, and evaluation timelines.
  • Commit to post-event support: micro-mentoring, drop-in clinics, and a repository of exemplar assignments that demonstrate how tools map to outcomes.

Cautionary notes and unverifiable claims

Several product claims and vendor marketing points require scrutiny before institutional or curricular reliance:
  • Generative AI outputs must be validated for accuracy and bias; vendors often emphasize capability but institutional procurement should confirm contractual protections and non-training clauses when dealing with student data.
  • Any claim about measured learning gains tied to a particular platform should be evaluated in the context of independent, peer-reviewed research rather than vendor case studies alone.
  • While platform roadmaps (for example, Copilot agents and new LinkedIn Learning Career Hub features) are published by vendors and community channels, availability and feature sets can change rapidly; technical teams should confirm feature availability and timeline for institutional tenants before scheduling large-scale rollouts.
When using institutional events to inform procurement or course redesign, treat vendor demonstrations as starting points for pilot testing rather than definitive evidence of impact.

How to make the most of Learning Tools for Teaching week

  • Come with a hypothesis: Identify one teaching problem you want a tool to solve (e.g., low discussion quality, low timely feedback) and test whether the tool meaningfully addresses it.
  • Use the office hours to produce a tangible asset: a VoiceThread assignment, a Top Hat reading module, or a Copilot-generated draft that you then humanize and adapt.
  • Connect with peers in other colleges — cross-disciplinary perspectives often surface creative uses and reveal pitfalls early.
  • Document time costs: measure the instructor preparation time required to create, assess, and iterate on activities so you can estimate sustainability at scale.

Final assessment

Penn State’s Learning Tools for Teaching: Explore, Engage, Elevate is a well-calibrated, evidence-informed approach to faculty and staff development around enterprise learning technologies. By situating vendor expertise inside a pedagogy-first week, offering hands-on practice, and providing follow-up office hours, the initiative addresses the chronic gap between awareness and adoption that many institutions face.
The program’s success will hinge on two follow-through elements: robust governance for data and AI, and a sustained post-event support structure that turns one-off enthusiasm into scalable instructional change. If Penn State couples this event with clear integration policies, accessibility planning, and ongoing coaching, the week has the potential to raise not just tool competency but overall instructional quality across campus.
Attendance is encouraged for any faculty or staff interested in practical strategies for classroom engagement, professional learning, and the careful, critical adoption of generative AI in teaching. Registration and the official schedule are maintained through Penn State’s Learning Resource Network and related TLT channels. (psu.edu, tlt.psu.edu)


Source: Penn State University TLT to host first Learning Tools for Teaching: Explore, Engage, Elevate event | Penn State University