Open edX Vendor Evaluation — Requirements Gathering Questions

Use this question bank in discovery sessions with institutional stakeholders before issuing an RFP or beginning vendor outreach. The answers define which evaluation criteria matter most and which can be deprioritized or zeroed out.

Questions are tagged by priority: High = answer is needed before shortlisting; Medium = answer shapes scoring weights; Low = nice-to-have context.


Migration Scope & Timeline

  • [High] Is the institution planning a full migration from an existing LMS to Open edX, or will the two platforms run in parallel for a period?
  • [High] What is the target go-live date or desired timeline for the Open edX instance to be production-ready?
  • [High] Is there existing course content in the current LMS that needs to be migrated? If so, approximately how many courses and in what formats (SCORM, QTI, LTI, etc.)?
  • [High] Are there hard deadlines driven by an existing LMS contract renewal or academic calendar?

Hosting & Infrastructure

  • [High] What is the expected enrollment count in Year 1? Year 3? (Monthly Active Users / MAU is the most useful unit for vendor scoping.)
  • [High] What is the expected peak concurrent user count? (Enrollment counts are helpful, but peak concurrency during exams or registration periods drives infrastructure sizing.)
  • [High] How many courses does the institution expect to run concurrently on the platform in Year 1? Year 3?
  • [Medium] Does the institution have a preferred cloud provider (AWS, Azure, GCP) or any requirement for on-premise hosting?
  • [Medium] Are there data residency requirements (e.g., data must stay in the US)?
  • [Medium] What uptime SLA expectations does the institution have?
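The sizing questions above can be turned into a first-pass concurrency figure for vendor conversations. The sketch below is illustrative only: every ratio (daily-active share, peak-hour concentration, average session length) is a placeholder assumption, not an Open edX benchmark, and should be replaced with the institution's own usage data.

```python
# Rough peak-concurrency estimate from MAU, for vendor scoping conversations.
# All ratios below are illustrative placeholders, NOT Open edX benchmarks --
# substitute the institution's own LMS usage data where available.

def estimate_peak_concurrency(mau: int,
                              daily_active_share: float = 0.20,
                              peak_hour_share: float = 0.25,
                              avg_session_minutes: float = 30.0) -> int:
    """Return a rough peak concurrent-user estimate from Monthly Active Users."""
    daily_active = mau * daily_active_share           # users active on a given day
    peak_hour_users = daily_active * peak_hour_share  # users arriving in the busiest hour
    # Fraction of the peak hour each of those users is actually connected
    concurrent = peak_hour_users * (avg_session_minutes / 60.0)
    return round(concurrent)

# Example: 10,000 MAU with the placeholder ratios above
print(estimate_peak_concurrency(10_000))  # -> 250
```

Exam and registration spikes can exceed steady-state peaks by a wide margin, so treat any such estimate as a floor and ask vendors how they handle burst load.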

Authentication & Integration

  • [High] If an existing LMS will run alongside Open edX, will there be a shared authentication service (SSO/SAML/LDAP)?
  • [High] Does the institution use a Student Information System (SIS) that needs to integrate with Open edX (e.g., Banner, PeopleSoft, Colleague)?
  • [Medium] Are there other third-party tools that need to integrate (proctoring, video conferencing, library systems, accessibility tools)?
  • [Medium] Is LTI interoperability a requirement for connecting external tools or content?

Analytics & Reporting

  • [High] What specific analytics capabilities has the institution expressed interest in? (e.g., learner progress tracking, engagement metrics, course completion rates, predictive analytics, accreditation reporting)
  • [High] Would the native Open edX analytics solution (Aspects — built on ClickHouse and Apache Superset) meet their needs, or are custom analytics/reporting features anticipated?
  • [Medium] Does the institution need analytics data to feed into external systems (e.g., institutional data warehouse, accreditation reporting tools)?
  • [Medium] Are there specific compliance or accreditation reporting requirements the analytics must support?

Customization & Features

  • [Medium] Beyond branding/theming, are there specific UI/UX customizations the institution requires?
  • [Medium] Are there custom XBlock types or course components needed (e.g., specialized assessment types, interactive simulations)?
  • [Medium] Is a native mobile app a requirement, or is mobile-responsive web sufficient?
  • [Low] Are there e-commerce or payment requirements (course fees, certificate purchases)?
  • [Low] Does the institution need multi-tenant capability or a single dedicated instance?

Support & Operations

  • [High] What level of ongoing support does the institution expect? (Options: business hours only, 24/7, dedicated account manager, SLA-backed incident response)
  • [High] Does the institution have internal technical staff who will manage any part of the platform, or is fully managed hosting preferred?
  • [Medium] What is the expected cadence for platform upgrades? (Stay current with each Open edX named release, or controlled upgrade cycles with extended testing?)
  • [Medium] Is staff and instructor training on Open edX Studio part of the scope?

Budget & Evaluation Preferences

  • [High] Is there an established budget range for the hosting/vendor engagement (annual cost, or total contract value)?
  • [High] Are there any pre-existing vendor preferences or existing relationships that should factor into the evaluation?
  • [Medium] When vendors are closely matched in scoring, what secondary criteria matter most? (e.g., geographic proximity, open-source contribution track record, specific technology stack, implementation timeline)
  • [Medium] Are there any vendors to exclude from consideration, or any past experiences that should inform this process?

Using These Answers in the Evaluation

Once answers are gathered, map them to the scoring framework in 02-scoring-framework.md:

  • Migration is out of scope → set "Migration experience" weight to 0
  • US data residency required → treat "Data residency" as a hard filter (non-negotiable)
  • Phase 1 only needs standard admin analytics → set "Custom dashboards" and "Predictive analytics/AI" to 0
  • Budget is the primary constraint → increase all Pricing & Terms weights to 1.0
  • Institution has no internal technical staff → increase "Fully managed hosting" and "Training & onboarding" weights
  • Mobile app required → increase "Mobile experience" weight to 1.0
  • LTI integrations required → increase "LTI integration" weight to 0.8–1.0
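The mapping above can be expressed mechanically when the scoring framework lives in code or a spreadsheet. The sketch below is a minimal illustration: the criterion names, starting weights, and answer keys are hypothetical and should be aligned with the actual criteria in 02-scoring-framework.md.

```python
# Illustrative sketch: applying discovery answers to scoring weights.
# Criterion names, starting weights, and answer keys are hypothetical --
# align them with the real criteria in 02-scoring-framework.md.

weights = {
    "Migration experience": 0.6,
    "Custom dashboards": 0.5,
    "Predictive analytics/AI": 0.4,
    "Mobile experience": 0.5,
    "LTI integration": 0.5,
}
hard_filters: set[str] = set()  # non-negotiables: filter vendors, don't score them

# Example answers gathered in discovery sessions
answers = {
    "migration_in_scope": False,
    "us_data_residency": True,
    "mobile_app_required": True,
}

if not answers["migration_in_scope"]:
    weights["Migration experience"] = 0.0   # criterion no longer scored
if answers["us_data_residency"]:
    hard_filters.add("Data residency")      # hard filter, not a weight change
if answers["mobile_app_required"]:
    weights["Mobile experience"] = 1.0      # elevate to maximum weight

print(weights["Migration experience"], weights["Mobile experience"], sorted(hard_filters))
# -> 0.0 1.0 ['Data residency']
```

Keeping hard filters separate from weights matters: a vendor failing a non-negotiable requirement should be excluded outright, not merely penalized in a weighted total.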

Schema Education — Internal Research