Medtrics was created to be flexible and to change with the needs of the medical education community.
Edgar Poe | Director, Michigan State University
Michigan State University College of Osteopathic Medicine (MSUCOM) partnered with Medtrics to pilot an AI-assisted approach to MSPE compilation—targeted specifically at typo and grammar correction. Designed with strict oversight, the project maintained evaluator voice, student control, and FERPA compliance while reducing editing time across nearly 7,000 preceptor comments. This initiative offers a scalable, replicable model for improving MSPE accuracy and efficiency without compromising institutional standards.
MSUCOM faced rising administrative pressure during MSPE season, with growing class sizes and a labor-intensive editing process that diverted staff away from high-value advising and slowed finalization timelines.
To streamline editing without losing narrative quality, MSUCOM and Medtrics implemented an AI assistant focused solely on mechanical cleanup—typos, grammar, and formatting—embedded directly into the MSPE workflow and reviewed by both students and staff.
The pilot processed nearly 7,000 comments across 293 students. Students reviewed AI-suggested edits before staff finalized MSPEs, preserving transparency and control. The result: faster document turnaround, higher accuracy, and better alignment between institutional oversight and student experience.
At MSUCOM, staff producing Medical Student Performance Evaluations (MSPEs) were burdened by rising student volume and time-consuming manual edits. Each MSPE included dozens of narrative comments submitted by clinical preceptors—often requiring hours of cleanup for typos, grammar, and formatting inconsistencies. This slowed final review, limited student input, and drained capacity for coaching and support.
MSUCOM’s leadership saw an opportunity to rethink the process. Together with Medtrics, they piloted an AI assistant focused on mechanical accuracy. The tool reviewed nearly 7,000 comments across 293 MSPEs—offering corrections side by side with the original text, which students could review before anything was finalized. The AI never rewrote content or changed tone. Final approval remained fully in human hands.
Every AI-assisted MSPE followed a dual-review process: student input first, then final staff verification. No content left institutional boundaries. No changes were made without visibility. As a result, MSUCOM maintained compliance, built trust, and saved staff hours of clerical editing.
For institutions considering AI, this pilot offers a blueprint for responsible innovation—one that enhances workflows without compromising quality or control.
Medical education leaders are under pressure to modernize without risking compliance or student experience. This guide offers 25 essential questions that help MD programs evaluate what’s working, what’s not, and what a platform truly needs to support LCME alignment, MSPE efficiency, and competency-based education. You’ll see how Medtrics supports streamlined workflows, integrated evaluations, and real-time curriculum tracking across the full learner lifecycle.
9/12/2024
In an engaging and practical webinar, Natasha Brocks, owner of GME Admin Insights, teamed up with Medtrics to share invaluable insights into managing evaluations in Graduate Medical Education (GME). Drawing on more than 18 years of experience in GME, Natasha guided attendees through the complexities of evaluation processes, offering strategies to improve compliance, streamline workflows, and ensure the success of residency programs. Designed for GME coordinators and administrators, this session showcased how evaluation management can shape the future of medical education.