Learning Analytics Toolkit

Downloadable Resources, Instant Access

Create and maintain procedures for identifying areas of risk and opportunity in your organization's quality performance by identifying emerging trends with supporting analysis (identify and develop key metrics for performance management).

More Uses of the Learning Analytics Toolkit:

  • Work with organizational leadership at all levels to develop solutions that align directly with your organization's strategic plan and unlock new business opportunities.

  • Devise: partner with departments on the implementation of designs for a variety of modalities, platforms, and scales, often leveraging emerging educational technologies, learning science research, and learning analytics.

  • Identify, develop, and maintain relationships with prospective external partners to adopt content aligned with their organization's learning and development goals.

  • Apply project management, communication, relationship management, design, quality assurance, and consultative skills to collaborate with departments, course teams, administrators, vendors, and other stakeholders.

  • Foster a culture of operational excellence to drive higher levels of system reliability, feature quality, and resiliency through modern delivery practices that eliminate operational risks and impacts to service levels.

  • Evaluate: own the strategy and execution as the subject matter expert for end-to-end quality data as it ties to the quality management system, working closely with quality engineers and quality assurance teams.

  • Develop protocols and program data interfaces across information platforms used for analytics and business/process intelligence as part of a digital ecosystem.

  • Provide engineering and technical leadership for project planning and implementation activities with internal and external teams as they relate to analytics, platform integration, the digital ecosystem, AI, and machine learning.

  • Audit: leverage statistical, econometric, stochastic, operations research, predictive modeling, simulation, optimization (linear, mixed integer, constraint programming), and/or machine learning analytics techniques.

  • Ensure your organization prioritizes customer requests based on technology product vision, strategy, and end-user value, and schedules them appropriately as projects or tasks consistent with established standards and/or guidelines.

  • Develop strategies to create and maintain insightful automated dashboards and data visualizations that track core metrics and extract useful insights for your organization.

  • Ensure you lead the design of dashboards and reports for yourself and others to provide insights to the business, ensuring execution and delivery of weekly, monthly, and quarterly management quality reporting.

  • Contribute to technology innovation and research, technology transfer from research to development, algorithm development, simulation, and modeling.

  • Ensure you lead projects as the analytics and data science subject matter expert in support of end-to-end quality data and analytics, working closely with Quality Engineers and Quality Assurance teams.

  • Steer: work across business areas to influence and support the creation of a data feedback loop as part of the product development life cycle, in support of data-driven decision making.


Save time, empower your teams, and effectively upgrade your processes with access to this practical Learning Analytics Toolkit and guide. Address common challenges with best-practice templates, step-by-step work plans, and maturity diagnostics for any Learning Analytics-related project.

Download the Toolkit and, in three steps, you will be guided from idea to implementation results.

The Toolkit contains the following practical and powerful enablers with new and updated Learning Analytics-specific requirements:

STEP 1: Get your bearings

Start with...

  • The latest quick edition of the Learning Analytics Self-Assessment book in PDF, containing 49 requirements to perform a quick scan, get an overview, and share with stakeholders.

Organized in a data-driven improvement cycle, RDMAICS (Recognize, Define, Measure, Analyze, Improve, Control, and Sustain), check the…

  • Example pre-filled Self-Assessment Excel Dashboard to get familiar with results generation
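To illustrate how such a quickscan might be tallied into a per-phase overview like the one the Dashboard charts (the 0-5 scoring scale and the example answers below are hypothetical, not the Toolkit's actual scoring scheme):

```python
# Hypothetical quickscan tally: average answers (scored 0-5) per RDMAICS phase.
# The phase names come from the Toolkit's cycle; the scores are invented.
from statistics import mean

PHASES = ["Recognize", "Define", "Measure", "Analyze",
          "Improve", "Control", "Sustain"]

def phase_scores(answers):
    """answers: list of (phase, score) pairs, score in 0..5.
    Returns {phase: mean score}, with 0.0 for unanswered phases."""
    by_phase = {p: [] for p in PHASES}
    for phase, score in answers:
        by_phase[phase].append(score)
    return {p: round(mean(s), 2) if s else 0.0 for p, s in by_phase.items()}

# Example: four answered requirements across three phases
answers = [("Recognize", 4), ("Recognize", 2), ("Measure", 3), ("Control", 5)]
print(phase_scores(answers))
```

Averaging per phase like this is what makes a radar-chart-style maturity overview possible: each phase becomes one axis, and unanswered phases stand out as gaps.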

Then find your goals...

STEP 2: Set concrete goals, tasks, dates and numbers you can track

Featuring 991 new and updated case-based questions, organized into seven core areas of process design, this Self-Assessment will help you identify areas in which Learning Analytics improvements can be made.

Examples: 10 of the 991 standard requirements:

  1. What motivates the sharing? What specific benefits is data sharing believed to deliver that would be difficult or impossible to achieve within a single educational establishment?

  2. Does anonymization of data become more difficult as multiple data sources are aggregated, potentially leading to reidentification of an individual?

  3. Can it confirm in writing that it will only process data in accordance with your instructions and will maintain an appropriate level of security?

  4. How effective is a concept-mapping model as a strategy that supports employee directed learning in a virtual, self-directed learning environment?

  5. Is it unethical for administrators to do whatever possible to help ensure employee success, even if it means stretching the meaning of privacy?

  6. How effective is the inclusion of learning analytics when embedded in a virtual, self-directed learning environment modeled on concept-mapping?

  7. Do you adapt informal programs to learning pace, to the knowledge and skills acquired elsewhere, to the way people learn and to new technology?

  8. How do learning designers find the right content and deliver it to learners in a personalized and prioritized way at the moment of need?

  9. What obligation does your organization have to intervene when there is evidence that an employee could benefit from additional support?

  10. Which learning pathways do employees follow through the learning activities, and are certain pathways more effective than others?

Complete the self-assessment on your own or with a team in a workshop setting. Use the workbook together with the self-assessment requirements spreadsheet:

  • The workbook is the latest in-depth, complete edition of the Learning Analytics book in PDF, containing 991 requirements whose criteria correspond to the criteria in...

Your Learning Analytics self-assessment dashboard gives you a dynamically prioritized, projects-ready tool and shows your organization exactly what to do next:

  • The Self-Assessment Excel Dashboard: with the Learning Analytics Self-Assessment and Scorecard you will develop a clear picture of which Learning Analytics areas need attention, which requirements you should focus on, and who will be responsible for them:

    • Shows your organization instant insight into areas for improvement: auto-generates reports, a radar chart for maturity assessment, insights per process and participant, and a bespoke, ready-to-use RACI Matrix
    • Gives you a professional Dashboard to guide and perform a thorough Learning Analytics Self-Assessment
    • Is secure: ensures offline data protection of your Self-Assessment results
    • Dynamically prioritized, projects-ready RACI Matrix shows your organization exactly what to do next
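One way such a prioritized, ownership-tagged to-do list could be derived from assessment scores (a minimal sketch with an invented target score, example areas, and role assignments, not the Dashboard's actual logic):

```python
# Hypothetical prioritization: rank assessed areas by gap to a target
# maturity score and attach the responsible party from a RACI-style map.
# The target, areas, and owners below are illustrative only.
TARGET = 5  # assumed top maturity score

def prioritize(scores, responsible):
    """scores: {area: maturity 0..5}; responsible: {area: owner}.
    Returns (area, gap, owner) tuples, largest gap first."""
    ranked = sorted(scores.items(), key=lambda kv: TARGET - kv[1], reverse=True)
    return [(area, TARGET - score, responsible.get(area, "unassigned"))
            for area, score in ranked]

scores = {"Data quality": 2, "Dashboards": 4, "Privacy": 1}
responsible = {"Data quality": "Quality Engineer", "Privacy": "DPO"}
for area, gap, owner in prioritize(scores, responsible):
    print(f"{area}: gap {gap}, responsible: {owner}")
```

Sorting by gap-to-target is what turns a flat scorecard into a "what to do next" list: the weakest area surfaces first, and any area without an owner is flagged immediately.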


STEP 3: Implement, track, follow up, and revise strategy

The outcomes of STEP 2, the self-assessment, are the inputs for STEP 3: start and manage Learning Analytics projects with the 62 implementation resources:

  • 62 step-by-step Learning Analytics Project Management Form Templates covering over 1500 Learning Analytics project requirements and success criteria:

Examples: 10 of the check-box criteria:

  1. Change Request: Have SCM procedures for noting the change, recording it, and reporting it been followed?

  2. WBS Dictionary: Are software specification, development, integration, testing, and licenses accounted for?

  3. Procurement Audit: Does the procurement unit have sound commercial awareness and knowledge of suppliers and the market?

  4. Cost Baseline: Has the actual cost of the Learning Analytics project (or Learning Analytics project phase) been tallied and compared to the approved budget?

  5. Decision Log: In an adversarial environment, is your opponent open to a non-traditional workflow, or will they likely challenge anything you do?

  6. Procurement Management Plan: Are the schedule estimates reasonable given the Learning Analytics project?

  7. Roles and Responsibilities: Once the responsibilities are defined for the Learning Analytics project, have the deliverables, roles and responsibilities been clearly communicated to every participant?

  8. Stakeholder Management Plan: Are actuals compared against estimates to analyze and correct variances?

  9. Procurement Audit: Is there a policy covering the relationship of other departments with vendors?

  10. Closing Process Group: What is the overall risk of the Learning Analytics project to your organization?

Step-by-step and complete Learning Analytics Project Management Forms and Templates, including check-box criteria:

1.0 Initiating Process Group:

  • 1.1 Learning Analytics project Charter
  • 1.2 Stakeholder Register
  • 1.3 Stakeholder Analysis Matrix

2.0 Planning Process Group:

  • 2.1 Learning Analytics project Management Plan
  • 2.2 Scope Management Plan
  • 2.3 Requirements Management Plan
  • 2.4 Requirements Documentation
  • 2.5 Requirements Traceability Matrix
  • 2.6 Learning Analytics project Scope Statement
  • 2.7 Assumption and Constraint Log
  • 2.8 Work Breakdown Structure
  • 2.9 WBS Dictionary
  • 2.10 Schedule Management Plan
  • 2.11 Activity List
  • 2.12 Activity Attributes
  • 2.13 Milestone List
  • 2.14 Network Diagram
  • 2.15 Activity Resource Requirements
  • 2.16 Resource Breakdown Structure
  • 2.17 Activity Duration Estimates
  • 2.18 Duration Estimating Worksheet
  • 2.19 Learning Analytics project Schedule
  • 2.20 Cost Management Plan
  • 2.21 Activity Cost Estimates
  • 2.22 Cost Estimating Worksheet
  • 2.23 Cost Baseline
  • 2.24 Quality Management Plan
  • 2.25 Quality Metrics
  • 2.26 Process Improvement Plan
  • 2.27 Responsibility Assignment Matrix
  • 2.28 Roles and Responsibilities
  • 2.29 Human Resource Management Plan
  • 2.30 Communications Management Plan
  • 2.31 Risk Management Plan
  • 2.32 Risk Register
  • 2.33 Probability and Impact Assessment
  • 2.34 Probability and Impact Matrix
  • 2.35 Risk Data Sheet
  • 2.36 Procurement Management Plan
  • 2.37 Source Selection Criteria
  • 2.38 Stakeholder Management Plan
  • 2.39 Change Management Plan

3.0 Executing Process Group:

  • 3.1 Team Member Status Report
  • 3.2 Change Request
  • 3.3 Change Log
  • 3.4 Decision Log
  • 3.5 Quality Audit
  • 3.6 Team Directory
  • 3.7 Team Operating Agreement
  • 3.8 Team Performance Assessment
  • 3.9 Team Member Performance Assessment
  • 3.10 Issue Log

4.0 Monitoring and Controlling Process Group:

  • 4.1 Learning Analytics project Performance Report
  • 4.2 Variance Analysis
  • 4.3 Earned Value Status
  • 4.4 Risk Audit
  • 4.5 Contractor Status Report
  • 4.6 Formal Acceptance

5.0 Closing Process Group:

  • 5.1 Procurement Audit
  • 5.2 Contract Close-Out
  • 5.3 Learning Analytics project or Phase Close-Out
  • 5.4 Lessons Learned



With this three-step process, the in-depth Learning Analytics Toolkit gives you all the tools you need for any Learning Analytics project.

By using the Toolkit you will be better able to:

  • Diagnose Learning Analytics projects, initiatives, organizations, businesses, and processes using accepted diagnostic standards and practices
  • Implement evidence-based best practice strategies aligned with overall goals
  • Integrate recent advances in Learning Analytics and put process design strategies into practice according to best practice guidelines

Defining, designing, creating, and implementing a process to solve a business challenge or meet a business objective is the most valuable role in EVERY company, organization, and department.

Unless you are talking about a one-time, single-use project within a business, there should be a process. Whether that process is managed and implemented by humans, AI, or a combination of the two, it needs to be designed by someone with a perspective complex enough to ask the right questions: someone capable of stepping back and saying, 'What are we really trying to accomplish here? And is there a different way to look at it?'

This Toolkit empowers people to do just that - whether their title is entrepreneur, manager, consultant, (Vice-)President, CxO, or anything else - they are the people who rule the future: the ones who ask the right questions to make Learning Analytics investments work better.

This Learning Analytics All-Inclusive Toolkit enables You to be that person.


Includes lifetime updates

Every self-assessment comes with Lifetime Updates and Lifetime Free Updated Books. Lifetime Updates is an industry-first feature that allows you to receive verified self-assessment updates, ensuring you always have the most accurate information at your fingertips.