
MSPE Automation at MSU: Faster, Safer, Smarter

Discover how Medtrics automated the MSPE process at MSUCOM, reducing errors and administrative strain while elevating student advising—this case study showcases real-world results for institutions seeking smarter academic operations.
Download Case Study
Program

MSU - College of Osteopathic Medicine

Location

Grand Rapids, MI

Michigan State University College of Osteopathic Medicine (MSUCOM) partnered with Medtrics to pilot an AI-assisted approach to MSPE compilation—targeted specifically at typo and grammar correction. Designed with strict oversight, the project maintained evaluator voice, student control, and FERPA compliance while reducing editing time across nearly 7,000 preceptor comments. This initiative offers a scalable, replicable model for improving MSPE accuracy and efficiency without compromising institutional standards.

Challenge

MSUCOM faced rising administrative pressure during MSPE season, with growing class sizes and a labor-intensive editing process that diverted staff away from high-value advising and slowed finalization timelines.

Solution

To streamline editing without losing narrative quality, MSUCOM and Medtrics implemented an AI assistant focused solely on mechanical cleanup—typos, grammar, and formatting—embedded directly into the MSPE workflow and reviewed by both students and staff.

Results

The pilot processed nearly 7,000 comments across 293 students. Students reviewed AI-suggested edits before staff finalized MSPEs, preserving transparency and control. The result: faster document turnaround, higher accuracy, and better alignment between institutional oversight and student experience.

AI-Assisted MSPE Compilation: A Pilot in Precision and Oversight

At MSUCOM, staff producing Medical Student Performance Evaluations (MSPEs) were burdened by rising student volume and time-consuming manual edits. Each MSPE included dozens of narrative comments submitted by clinical preceptors—often requiring hours of cleanup for typos, grammar, and formatting inconsistencies. This slowed final review, limited student input, and drained capacity for coaching and support.

MSUCOM’s leadership saw an opportunity to rethink the process. Together with Medtrics, they piloted an AI assistant focused on mechanical accuracy. The tool reviewed nearly 7,000 comments across 293 MSPEs—offering corrections side by side with the original text, which students could review before anything was finalized. The AI never rewrote content or changed tone. Final approval remained fully in human hands.

A Smarter, Safer Use of AI in Academic Medicine

Every AI-assisted MSPE followed a dual-review process: student input first, then final staff verification. No content left institutional boundaries. No changes were made without visibility. As a result, MSUCOM maintained compliance, built trust, and saved staff hours of clerical editing.

For institutions considering AI, this pilot offers a blueprint for responsible innovation—one that enhances workflows without compromising quality or control.

Download the Case Study

Submit the form at the top of the page to receive a downloadable PDF. By submitting the form, you agree that Medtrics may contact you via email.

Related Features

Interested in learning more about the solution we provided? Check out the related features below to learn more about how our software can meet the challenges of your organization.

Other Resources

Bookworms & YouTube bingers rejoice!
Check out our content collection.

Interested in Learning More?
