
Product Analytics Overview

YeboLearn's product analytics framework provides deep insights into how schools, teachers, and students interact with our platform. This data drives product decisions, feature prioritization, and user experience improvements.

Product Analytics Philosophy

Our product analytics approach is built on three core principles:

  1. User-Centric: Understand behavior, not just metrics
  2. Actionable: Every insight should inform product decisions
  3. Holistic: Track the complete user journey from signup to power user

Analytics Framework

Data Collection Strategy

Event Tracking Architecture:

User Actions → Segment → Data Warehouse (BigQuery) → Analytics Tools

  • Mixpanel (Real-time Product Analytics)
  • Metabase (Business Intelligence)
  • Custom Dashboards (Executive Views)
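
Below is a minimal sketch of the instrumentation step in this pipeline, assuming Segment's analytics-python library on the backend. The event name, property keys, and write key are illustrative, not YeboLearn's actual tracking schema.

```python
# Sketch: emitting a product event to Segment from a backend service.
# Event name and properties are illustrative, not the real event schema.
import analytics  # Segment's analytics-python library

analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"  # placeholder

def track_feature_opened(user_id: str, school_id: str, feature: str) -> None:
    """Record a Feature Usage event; Segment forwards it to BigQuery and Mixpanel."""
    analytics.track(
        user_id,
        "Feature Opened",            # hypothetical event name
        {
            "school_id": school_id,  # enables school-level segmentation downstream
            "feature": feature,      # e.g. "lesson_planner"
            "category": "feature_usage",
        },
    )

# Example call
track_feature_opened(user_id="teacher_123", school_id="school_42", feature="lesson_planner")
```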

Event Categories:

| Category | Events Tracked | Purpose | Volume/Day |
| --- | --- | --- | --- |
| Authentication | Login, logout, session | User access patterns | 28,000 |
| Feature Usage | Feature opens, interactions, completions | Adoption tracking | 145,000 |
| Content Creation | Lessons, quizzes, resources created | Value delivery | 38,000 |
| AI Interactions | AI requests, completions, ratings | AI usage patterns | 52,000 |
| Engagement | Page views, clicks, time on page | Platform navigation | 285,000 |
| Collaboration | Shares, comments, invites | Social features | 12,000 |
| Administrative | Settings changes, permissions | Platform management | 8,500 |

Total Events Tracked Daily: ~568,500 events

User Segmentation

Primary Segments:

| Segment | Definition | Size | Key Metrics |
| --- | --- | --- | --- |
| Power Users | 8+ features, daily usage, high AI adoption | 18 schools | 98% retention, $3,200 ARPU |
| High Engagement | 5-7 features, 4+ days/week usage | 45 schools | 94% retention, $2,100 ARPU |
| Medium Engagement | 3-4 features, 2-3 days/week usage | 52 schools | 88% retention, $1,650 ARPU |
| Low Engagement | 1-2 features, <2 days/week usage | 22 schools | 68% retention, $1,100 ARPU |
| At-Risk | Declining usage, <1 feature/week | 8 schools | 42% retention, $850 ARPU |

Engagement Correlation: Direct relationship between engagement level and retention/ARPU
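
The sketch below shows how these segment definitions could be applied programmatically. The thresholds mirror the table above; the SchoolUsage record, its field names, and the -15% decline cutoff for At-Risk are assumptions for illustration.

```python
# Sketch: bucketing a school into an engagement segment using the thresholds above.
# SchoolUsage and its fields are hypothetical; the decline cutoff is an assumption.
from dataclasses import dataclass

@dataclass
class SchoolUsage:
    features_used: int           # distinct features used in the last 30 days
    active_days_per_week: float  # average active days per week
    usage_trend: float           # week-over-week change in events, e.g. -0.2 = down 20%

def classify_segment(u: SchoolUsage) -> str:
    if u.usage_trend < -0.15 or u.features_used < 1:
        return "At-Risk"            # declining usage, <1 feature/week
    if u.features_used >= 8 and u.active_days_per_week >= 5:
        return "Power Users"        # 8+ features, daily usage
    if u.features_used >= 5 and u.active_days_per_week >= 4:
        return "High Engagement"    # 5-7 features, 4+ days/week
    if u.features_used >= 3 and u.active_days_per_week >= 2:
        return "Medium Engagement"  # 3-4 features, 2-3 days/week
    return "Low Engagement"         # 1-2 features, <2 days/week

print(classify_segment(SchoolUsage(features_used=9, active_days_per_week=5, usage_trend=0.05)))
# -> "Power Users"
```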

Secondary Segments:

  • By Role: Teachers, Students, Administrators, Parents
  • By School Type: Private, Public, Charter, Religious
  • By Region: Gauteng, Western Cape, KZN, Eastern Cape, Other
  • By Tier: Enterprise, Professional, Essentials
  • By Tenure: New (<3 months), Growing (3-12 months), Mature (12+ months)

Behavior Tracking

User Journey Tracking (cumulative share of signups reaching each stage):

Signup → Onboarding → First Value → Regular Usage → Power User
   ↓         ↓            ↓              ↓              ↓
 100%      95%          80%            65%            12%

Key Conversion Points:

  1. Signup → Onboarding: 95% (5% never complete setup)
  2. Onboarding → First Value: 84% (reach 5+ features in 14 days)
  3. First Value → Regular Usage: 81% (become weekly active)
  4. Regular Usage → Power User: 18% (adopt 8+ features, daily usage)

Journey Optimization Focus: Improve "First Value → Regular Usage" (currently 81%, target 85%)
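
As a worked example, the cumulative journey percentages above follow directly from the step conversions: multiplying the step rates gives 100% → 95% → 80% → 65% → 12%. All numbers come from this page.

```python
# Worked example: deriving the cumulative journey funnel from the step conversions above.
stages = ["Signup", "Onboarding", "First Value", "Regular Usage", "Power User"]
step_conversion = [0.95, 0.84, 0.81, 0.18]  # conversion from each stage to the next

cumulative = [1.0]
for rate in step_conversion:
    cumulative.append(cumulative[-1] * rate)

for stage, share in zip(stages, cumulative):
    print(f"{stage:<14} {share:.0%}")
# Signup 100%, Onboarding 95%, First Value 80%, Regular Usage 65%, Power User 12%
```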

Feature Performance Tracking

Feature Health Metrics:

| Metric | Definition | Measurement |
| --- | --- | --- |
| Adoption Rate | % of schools using feature monthly | Monthly calculation |
| Engagement Rate | % of adopters using feature weekly | Weekly calculation |
| Stickiness | DAU/MAU ratio for feature | Daily tracking |
| Retention Impact | Retention difference with/without feature | Cohort analysis |
| Time to Adoption | Days from signup to first use | User journey tracking |
| Depth of Use | Actions per session with feature | Session analysis |
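
A minimal sketch of the stickiness metric (DAU/MAU per feature) computed from a raw events table is shown below. The column names (user_id, feature, event_date) are assumptions about the event schema, not the actual warehouse layout.

```python
# Sketch: computing per-feature stickiness (DAU/MAU) from a raw events table.
# Column names (user_id, feature, event_date) are assumed, not the real schema.
import pandas as pd

def feature_stickiness(events: pd.DataFrame, as_of: pd.Timestamp) -> pd.Series:
    """Return DAU/MAU per feature: actives on `as_of` over 30-day monthly actives."""
    month = events[(events["event_date"] > as_of - pd.Timedelta(days=30))
                   & (events["event_date"] <= as_of)]
    day = month[month["event_date"] == as_of]

    mau = month.groupby("feature")["user_id"].nunique()
    dau = day.groupby("feature")["user_id"].nunique()
    return (dau / mau).fillna(0.0).round(3)

# Example with a tiny in-memory events table
events = pd.DataFrame({
    "user_id": ["t1", "t2", "t1", "t3"],
    "feature": ["quiz_generator"] * 4,
    "event_date": pd.to_datetime(["2025-03-01", "2025-03-10", "2025-03-15", "2025-03-15"]),
})
print(feature_stickiness(events, pd.Timestamp("2025-03-15")))
# quiz_generator: 2 daily actives / 3 monthly actives ≈ 0.667
```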

Feature Lifecycle Stages:

  1. Launch (0-30 days): Track initial adoption, identify early issues
  2. Growth (30-90 days): Monitor adoption curve, optimize onboarding
  3. Maturity (90+ days): Measure steady-state usage, plan enhancements
  4. Decline (usage decreasing): Identify causes, plan refresh or sunset

Analytics Tools and Platforms

Mixpanel (Product Analytics)

Primary Use Cases:

  • Real-time feature usage tracking
  • User funnel analysis
  • Cohort retention analysis
  • A/B test result tracking
  • User path analysis

Key Dashboards:

  • Daily Product Health (engagement, adoption, errors)
  • Feature Performance (individual feature deep-dives)
  • User Cohort Analysis (retention, behavior patterns)
  • Conversion Funnels (signup, onboarding, feature adoption)

Data Retention: 5 years of event data

Metabase (Business Intelligence)

Primary Use Cases:

  • Custom SQL queries on data warehouse
  • Business metric dashboards
  • Cross-functional reporting
  • Executive summaries
  • Financial analytics integration

Key Dashboards:

  • Executive Dashboard (high-level KPIs)
  • Revenue Analytics (MRR, churn, expansion)
  • School Health Scores (engagement + financial metrics)
  • Product Roadmap Impact (feature ROI analysis)

Update Frequency: Real-time for critical metrics, hourly for standard reports

Google Analytics (Website Analytics)

Primary Use Cases:

  • Marketing website traffic
  • Lead generation tracking
  • Content performance
  • SEO monitoring
  • Conversion rate optimization

Not Used For: In-app product analytics (handled by Mixpanel)

Custom Dashboards (Internal Tools)

Built For:

  • Real-time operational monitoring
  • Customer Success team workflows
  • Sales pipeline integration
  • Support ticket correlation with usage
  • Engineering performance monitoring

A/B Testing Framework

Testing Philosophy

When to A/B Test:

  • New feature releases (test adoption approaches)
  • UI/UX changes (test interface variations)
  • Onboarding flows (test completion rates)
  • Pricing changes (test conversion impact)
  • Marketing copy (test messaging effectiveness)

When Not to Test:

  • Critical bug fixes (ship immediately)
  • Obvious improvements (don't delay value)
  • Insufficient traffic (need statistical significance)
  • Legal/compliance requirements (no choice)

Testing Process

Standard A/B Test Workflow:

  1. Hypothesis Definition (Day 0)

    • Problem statement
    • Proposed solution
    • Success metrics
    • Minimum detectable effect
  2. Test Design (Days 1-2)

    • Control vs variant definition
    • Sample size calculation
    • Test duration estimate
    • Randomization strategy
  3. Implementation (Days 3-5)

    • Build variant
    • Implement tracking
    • QA test both versions
    • Set up monitoring
  4. Test Execution (Days 6-20)

    • Launch to percentage of users
    • Monitor for errors
    • Track metrics daily
    • Ensure sample size reached
  5. Analysis (Days 21-22)

    • Statistical significance check
    • Secondary metric review
    • Segment analysis
    • Decision recommendation
  6. Rollout (Days 23-30)

    • Winner to 100% of users
    • Monitor for issues
    • Document learnings
    • Archive test results

Statistical Requirements:

  • Minimum Sample Size: 1,000 users per variant
  • Confidence Level: 95%
  • Minimum Test Duration: 14 days (capture weekly patterns)
  • Maximum Test Duration: 30 days (avoid metric drift)
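
The sketch below shows how a per-variant sample size relates to the 95% confidence requirement above, using statsmodels. The baseline conversion rate, minimum detectable effect, and 80% power target are illustrative assumptions; the 1,000-user minimum above acts as a floor on top of whatever the calculation yields.

```python
# Sketch: per-variant sample size for a two-proportion A/B test at the 95% confidence
# level required above. Baseline rate, MDE, and 80% power are illustrative assumptions.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.30  # assumed baseline conversion (e.g. onboarding completion)
mde = 0.05       # assumed minimum detectable effect (absolute lift to 35%)

effect = proportion_effectsize(baseline + mde, baseline)  # Cohen's h
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,   # 95% confidence level
    power=0.80,   # assumed power target
    ratio=1.0,    # equal split between control and variant
)
print(f"~{n_per_variant:.0f} users per variant")  # roughly 690 for these inputs
```

Smaller detectable effects push the required sample size up quickly, which is why low-traffic situations are listed above as a reason not to test.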

Active Tests and Results

Recent Test Results (Last 90 Days):

| Test | Variants | Winner | Improvement | Impact |
| --- | --- | --- | --- | --- |
| Onboarding Flow | 3-step vs 5-step | 3-step | +15% completion | Launched |
| AI Feature Prompts | In-context vs Modal | In-context | +22% adoption | Launched |
| Dashboard Layout | Cards vs List | Cards | +8% engagement | Launched |
| Quiz Generator UI | Wizard vs Form | Wizard | +18% completion | Launched |
| Pricing Page | Feature table vs Comparison | Comparison | +12% signups | Launched |

Current Active Tests:

  1. Parent Portal Onboarding: Email invite vs In-app notification
  2. Lesson Planner Templates: 10 templates vs 25 templates
  3. Upgrade CTA Placement: Dashboard banner vs Feature limit modal

User Behavior Patterns

Session Analysis

Average Session Metrics:

| User Type | Sessions/Week | Avg Duration | Pages/Session | Features/Session |
| --- | --- | --- | --- | --- |
| Teachers | 8.5 | 24 min | 12.4 | 3.2 |
| Students | 4.2 | 18 min | 6.8 | 1.8 |
| Admins | 3.1 | 32 min | 18.6 | 5.4 |
| Parents | 1.8 | 8 min | 4.2 | 1.2 |

Session Depth Distribution:

| Depth Level | % of Sessions | Avg Duration | Conversion to Next Session |
| --- | --- | --- | --- |
| Shallow (1-2 pages) | 28% | 3 min | 42% |
| Medium (3-5 pages) | 38% | 12 min | 68% |
| Deep (6-10 pages) | 24% | 24 min | 85% |
| Very Deep (11+ pages) | 10% | 42 min | 92% |

Insight: Deeper sessions have higher return rates. Focus on encouraging exploration.

Feature Navigation Patterns

Most Common Feature Paths (Top 5):

  1. Dashboard → Lesson Planner → Quiz Generator (28% of sessions)

    • Use case: Complete lesson planning workflow
    • Avg completion time: 32 minutes
    • Satisfaction: 4.5/5
  2. Dashboard → Auto-Grading → Progress Tracking (18% of sessions)

    • Use case: Grading and student monitoring
    • Avg completion time: 22 minutes
    • Satisfaction: 4.6/5
  3. Dashboard → Resource Library → Lesson Planner (15% of sessions)

    • Use case: Resource discovery and lesson creation
    • Avg completion time: 28 minutes
    • Satisfaction: 4.3/5
  4. Dashboard → Student Analytics → Parent Portal (8% of sessions)

    • Use case: Student performance review and parent communication
    • Avg completion time: 18 minutes
    • Satisfaction: 4.2/5
  5. Dashboard → AI Lesson Planner → Curriculum Alignment (7% of sessions)

    • Use case: Standards-aligned lesson creation
    • Avg completion time: 26 minutes
    • Satisfaction: 4.7/5

Drop-off Points:

  • 22% drop-off at Quiz Generator (complexity barrier)
  • 18% drop-off at Parent Portal (unclear value)
  • 12% drop-off at Student Analytics (information overload)

Time-Based Usage Patterns

Peak Usage Times (UTC+2):

| Time Block | Teacher Usage | Student Usage | Total Events |
| --- | --- | --- | --- |
| 6am-8am | High (prep) | Low | 38,000 |
| 8am-10am | Peak (teaching) | Medium | 95,000 |
| 10am-12pm | High (teaching) | Medium | 82,000 |
| 12pm-2pm | Medium (lunch) | Peak (learning) | 68,000 |
| 2pm-4pm | Medium (teaching) | Peak (learning) | 112,000 |
| 4pm-6pm | High (grading) | Medium | 78,000 |
| 6pm-8pm | Medium (planning) | Low | 42,000 |
| 8pm-10pm | Low | Low | 18,000 |

Infrastructure Implications:

  • Scale up servers 1:30pm-4:30pm (peak student usage)
  • Optimize for mobile during 6am-8am (teachers at home)
  • Schedule maintenance 10pm-6am (minimal impact)
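
As an illustration only, the peak-usage table above could feed a simple capacity schedule like the sketch below. The multipliers and hour boundaries are assumptions; the actual autoscaling policy is not documented here.

```python
# Illustrative sketch: mapping the peak-usage table above to a relative capacity target
# that an autoscaling policy could consume. Multipliers are assumptions.
def capacity_multiplier(hour_utc2: int) -> float:
    """Return a relative capacity target for a given hour (UTC+2)."""
    if 13 <= hour_utc2 < 17:               # ~1:30pm-4:30pm peak student usage
        return 2.0
    if 8 <= hour_utc2 < 13:                # morning teaching block
        return 1.5
    if 17 <= hour_utc2 < 20:               # grading and planning
        return 1.2
    if hour_utc2 >= 22 or hour_utc2 < 6:   # maintenance window, minimal traffic
        return 0.5
    return 1.0

print(capacity_multiplier(15))  # -> 2.0 during the afternoon peak
```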

Device and Platform Distribution

Access Methods:

| Platform | % of Sessions | Avg Session Duration | Primary User |
| --- | --- | --- | --- |
| Desktop Web | 58% | 28 min | Teachers |
| Mobile Web | 28% | 12 min | Teachers (planning) |
| Tablet | 12% | 22 min | Students |
| Mobile App | 2% | 8 min | Parents |

Browser Distribution (Desktop):

  • Chrome: 72%
  • Safari: 18%
  • Firefox: 7%
  • Edge: 3%

OS Distribution (Mobile):

  • Android: 78%
  • iOS: 22%

Platform Optimization Priorities:

  1. Desktop web (highest usage, teachers)
  2. Mobile web (growing, important for access)
  3. Tablet (student learning experience)
  4. Mobile app (lower priority, limited use case)

User Feedback Integration

In-App Feedback Collection

Feedback Mechanisms:

| Mechanism | Response Rate | Monthly Responses | Quality Score |
| --- | --- | --- | --- |
| Feature rating (1-5 stars) | 12% | 4,200 | High |
| NPS survey (quarterly) | 38% | 1,850 | High |
| Bug report button | 2% | 680 | Medium |
| Feature request form | 5% | 1,420 | High |
| Live chat feedback | 8% | 2,240 | Very High |

Feedback Analysis:

  • Automated sentiment analysis on all text feedback
  • Weekly product team review of themes
  • Monthly executive summary of user voice
  • Quarterly deep-dive on feature requests
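
A minimal sketch of the automated sentiment step is shown below, using NLTK's VADER analyzer. This illustrates the idea, not necessarily the tooling used in production; the feedback strings and score thresholds are examples.

```python
# Minimal sketch of automated sentiment scoring on free-text feedback using NLTK's
# VADER analyzer. Illustrates the step, not necessarily the production tooling.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

feedback = [
    "The quiz generator saved me hours this week!",
    "Parent portal setup was confusing and slow.",
]
for text in feedback:
    score = sia.polarity_scores(text)["compound"]  # -1 (negative) .. +1 (positive)
    label = "positive" if score >= 0.05 else "negative" if score <= -0.05 else "neutral"
    print(f"{label:>8}  {score:+.2f}  {text}")
```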

User Research Program

Research Activities:

| Activity | Frequency | Participants | Purpose |
| --- | --- | --- | --- |
| User interviews | Weekly | 4-6 users | Deep problem understanding |
| Usability testing | Bi-weekly | 6-8 users | Feature validation |
| Survey campaigns | Monthly | 200+ users | Quantitative insights |
| Beta testing | Per feature | 15-20 schools | Early feedback |
| Advisory board | Quarterly | 12 schools | Strategic input |

Research Impact: 75% of major features informed by user research

Data Privacy and Security

Data Collection Principles

What We Track:

  • Feature usage and interaction patterns
  • Performance metrics (load times, errors)
  • Aggregate behavior (anonymized where possible)
  • User-initiated feedback

What We Don't Track:

  • Student content or work (POPIA/GDPR compliant)
  • Personal student information beyond usage
  • Third-party tool usage outside our platform
  • Teacher grading decisions or comments

Data Retention:

  • Usage events: 5 years
  • Personal data: Retained while customer active + 90 days
  • Aggregated analytics: Indefinite (anonymized)

Compliance

Regulatory Compliance:

  • POPIA (Protection of Personal Information Act, South Africa)
  • GDPR (for any European users)
  • FERPA principles (student privacy, US standard)
  • ISO 27001 (information security)

User Controls:

  • Opt-out of non-essential analytics
  • Data export on request
  • Account deletion with data purge
  • Usage report access for administrators

Analytics Team Structure

Roles and Responsibilities

Product Analytics Lead:

  • Define analytics strategy
  • Ensure data quality and governance
  • Train team on analytics tools
  • Present insights to executive team

Data Analysts (2):

  • Build dashboards and reports
  • Conduct ad-hoc analysis
  • Support A/B test design and analysis
  • Create weekly/monthly analytics summaries

Data Engineers (2):

  • Maintain data pipeline
  • Ensure event tracking accuracy
  • Optimize data warehouse performance
  • Build custom integrations

Product Managers (All):

  • Define feature success metrics
  • Review analytics weekly
  • Use data to inform roadmap
  • Share insights with stakeholders

Success Metrics for Analytics Function

Analytics Team Goals:

| Metric | Current | Target |
| --- | --- | --- |
| Dashboard uptime | 99.8% | 99.9% |
| Data freshness | <15 min | <5 min |
| Metric accuracy | 99.2% | 99.5% |
| Self-service usage | 68% | 80% |
| Time to insight | 2.5 days | 1 day |
| PM satisfaction | 8.2/10 | 9/10 |

Impact Metrics:

  • Product decisions backed by data: 85% (target: 90%)
  • Features validated with A/B tests: 62% (target: 75%)
  • User research participants recruited via analytics: 78%
  • Revenue attributed to data-driven decisions: $1.2M ARR

Next Steps

For detailed analytics deep-dives:

YeboLearn - Empowering African Education