# Learning Platform Evaluation Framework
## Purpose
This framework provides a structured, platform-agnostic set of criteria and questions for evaluating any learning management system (LMS) or online learning platform, whether for a procurement RFP, a competitive analysis, or an internal build-vs-buy decision.
It is distinct from the Open edX Vendor Evaluation Framework in an important way:
| Framework | What it evaluates |
|---|---|
| Open edX Vendor Evaluation | Which vendor/hosting partner to use once Open edX is chosen |
| Learning Platform Evaluation (this framework) | Which platform to choose in the first place |
## Contents
| File | Purpose |
|---|---|
| `01-rfp-sources.md` | Annotated bibliography of real RFPs, evaluation guides, and scoring rubrics used as source material |
| `02-requirements-gathering.md` | Pre-evaluation stakeholder discovery questions that define which criteria matter for a specific context |
| `03-scoring-framework.md` | 13-category, 72-criterion weighted scoring matrix with per-criterion scoring guidance |
## How to Use This Framework
**Step 1 — Define context.** Work through `02-requirements-gathering.md` with key stakeholders. The answers determine which of the 13 categories carry the most weight, and which criteria can be zeroed out for a given evaluation.
**Step 2 — Shortlist platforms.** Apply hard-filter criteria (data residency, accessibility compliance, minimum scalability) to eliminate non-qualifying platforms before scoring. Use the platform profiles in `open-edx-competitor/platform/` for per-platform notes on all 20 documented platforms.
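The hard-filter pass in Step 2 is a simple pass/fail check applied before any weighted scoring. A minimal sketch in Python; the attribute names (`wcag_2_1_aa`, `eu_data_residency`) and platform data are hypothetical illustrations, not values from the framework:

```python
# Hard-filter shortlisting sketch: a platform that fails any must-have
# requirement is eliminated before scoring begins.
# Field names and platform records below are hypothetical examples.
REQUIRED = {
    "wcag_2_1_aa": True,        # accessibility compliance
    "eu_data_residency": True,  # data residency requirement
}

platforms = [
    {"name": "Platform A", "wcag_2_1_aa": True, "eu_data_residency": True},
    {"name": "Platform B", "wcag_2_1_aa": True, "eu_data_residency": False},
    {"name": "Platform C", "wcag_2_1_aa": False, "eu_data_residency": True},
]

shortlist = [
    p["name"]
    for p in platforms
    if all(p.get(field) == value for field, value in REQUIRED.items())
]
print(shortlist)  # ['Platform A']
```

Only platforms that survive this binary gate move on to the 1–5 scoring in Step 3, which keeps the scoring effort focused on genuinely viable candidates.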
**Step 3 — Score.** Use the criteria and default weights in `03-scoring-framework.md`. Adjust weights based on your context from Step 1. Each criterion is scored 1–5; the weighted total enables cross-platform comparison.
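The arithmetic behind Step 3 is a standard weighted average. A minimal sketch, assuming integer weights; the category names and weights shown are illustrative placeholders, not the defaults from `03-scoring-framework.md`:

```python
# Weighted-scoring sketch: each criterion is rated 1-5, multiplied by
# its weight, summed, and normalized by the total weight so platforms
# can be compared on the same scale. Categories/weights are examples.
def weighted_total(scores: dict[str, int], weights: dict[str, int]) -> float:
    for name, score in scores.items():
        if not 1 <= score <= 5:
            raise ValueError(f"{name}: score {score} outside the 1-5 scale")
    raw = sum(scores[c] * weights[c] for c in scores)
    return raw / sum(weights[c] for c in scores)

platform_a = {"Learner Experience": 4, "Integration": 3, "Pricing": 5}
weights    = {"Learner Experience": 5, "Integration": 3, "Pricing": 2}
print(weighted_total(platform_a, weights))  # 3.9
```

Raising the weight on a category your Step 1 context flagged as critical pulls the total toward that category's score, which is exactly the intended effect of context-specific reweighting.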
**Step 4 — Validate.** Request vendor demos or sandbox access targeting your highest-weighted criteria. Use the specific question prompts in `03-scoring-framework.md` as your demo script.
## Platforms This Framework Is Designed to Evaluate
All 20 platforms documented in `open-edx-competitor/platform/` can be scored against this framework:
- **Open-Source LMS:** Canvas, Moodle, Sakai, Chamilo, ILIAS
- **Commercial LMS:** Blackboard/Anthology, D2L Brightspace, Google Classroom
- **MOOC & Course Creator:** Coursera/edX, Thinkific, Teachable, Kajabi
- **Enterprise LXP & Collaborative Learning:** Sana Labs, Uplimit, Valamis, Fuse Universal, 360Learning, Continu
- **Specialized:** OpenSesame (content), Disco (community), Skilljar (customer education)
- **Reference:** Open edX
## Evaluation Categories (Summary)
This framework uses 13 categories covering 72 criteria:
| # | Category | What It Covers |
|---|---|---|
| 1 | Learner Experience | UX quality, navigation, mobile, personalization, accessibility |
| 2 | Content & Curriculum Authoring | Course building tools, content formats, AI authoring, templates |
| 3 | Assessment & Credentialing | Quiz types, grading, certificates, badges, proctoring |
| 4 | Administration & User Management | Provisioning, roles, enrollment automation, multi-tenancy |
| 5 | Collaboration & Community | Discussions, cohorts, live sessions, social/UGC features |
| 6 | Analytics & Reporting | Dashboards, skill tracking, compliance reporting, data export |
| 7 | Integration & Interoperability | LTI, SCORM/xAPI, SSO, HRIS/CRM, API, marketplace |
| 8 | Security, Compliance & Accessibility | WCAG, data residency, FERPA/GDPR, SOC 2, encryption |
| 9 | Infrastructure & Scalability | Deployment model, uptime, scale capacity, disaster recovery |
| 10 | Implementation & Onboarding | Setup timeline, migration, documentation, sandbox |
| 11 | Support & Service | Channels, SLAs, account management, community |
| 12 | Pricing & Commercial Terms | Model transparency, TCO, contract terms, exit provisions |
| 13 | Vendor & Platform Viability | Company health, roadmap, customer base, open-source posture |
## Framework Origin
This framework was built from a synthesis of publicly available LMS RFP documents, evaluation rubrics, and vendor evaluation guides (see `01-rfp-sources.md` for full citations). It was designed to reflect the actual questions that appear in real procurement processes, not theoretical best practices alone.
Sources include: University of Missouri System LMS RFP (2023), Mohave Community College LMS RFP (2021), World Bank LMS Evaluation Rubric, ListEdTech analysis of 69 North American institution RFPs, and evaluation checklists from Docebo, LearnWorlds, and Rippling.