Feature Analytics
Feature analytics provide deep insights into how schools adopt, use, and derive value from YeboLearn's 15+ features. This data drives product roadmap decisions, feature optimization, and go-to-market strategy.
Feature Portfolio Overview
All Features Summary
Active Features: 15 core features + 6 add-ons
| Feature Category | Features | Avg Adoption | Avg Usage/Week | Revenue Impact |
|---|---|---|---|---|
| AI-Powered (6) | Lesson Planner, Quiz Gen, Auto-Grade, etc | 70% | 23.4 uses | High |
| Core Platform (5) | Dashboard, Progress, Resources, etc | 82% | 18.6 uses | Essential |
| Collaboration (2) | Parent Portal, Live Collab | 51% | 2.9 uses | Medium |
| Analytics (2) | Student Analytics, Curriculum Align | 62% | 7.1 uses | Medium |
Feature Adoption Distribution:
Features Used per School (Monthly Average)
1-2 features: 18 schools (12%) - Low engagement
3-4 features: 40 schools (28%) - Moderate
5-7 features: 52 schools (36%) - Good
8-10 features: 25 schools (17%) - High
11+ features: 10 schools (7%) - Power users
Top Features Deep-Dive
1. AI Lesson Planner
Adoption: 128 schools (88% of active schools)
Usage Metrics:
- Weekly active users: 1,245 teachers
- Avg uses per teacher/week: 3.8 lessons
- Total lessons generated/month: 18,960
- Avg time saved per lesson: 28 minutes
- Total time saved: 8,848 hours/month
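The time-saved total above follows directly from the volume and per-lesson figures; a minimal sketch of the arithmetic (numbers taken from this section):

```python
def monthly_time_saved_hours(items_per_month: int, minutes_saved_per_item: float) -> float:
    """Convert per-item minutes saved into total hours saved per month."""
    return items_per_month * minutes_saved_per_item / 60

# Lesson Planner: 18,960 lessons/month x 28 minutes saved per lesson
hours = monthly_time_saved_hours(18_960, 28)  # 8,848.0 hours/month
```

The same helper applies to any feature reporting a per-item time saving.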
Engagement Patterns:
| Day of Week | Lessons Created | % of Weekly Total |
|---|---|---|
| Sunday | 3,420 | 18% (weekend planning) |
| Monday | 4,180 | 22% (week prep) |
| Tuesday | 2,850 | 15% |
| Wednesday | 2,470 | 13% |
| Thursday | 2,090 | 11% |
| Friday | 1,710 | 9% |
| Saturday | 2,240 | 12% |
Peak Usage Times: Sunday 6pm-10pm, Monday 6am-9am
Feature Flow:
Start Lesson (100%) → AI Generation (95%) → Review/Edit (88%) → Save (85%) → Share (42%)
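The stage-to-stage losses implied by these funnel percentages can be derived mechanically; a minimal sketch:

```python
def funnel_dropoffs(stages: list[str], pct: list[float]) -> dict[str, float]:
    """Percentage-point loss between each consecutive funnel stage."""
    return {
        f"{a} -> {b}": p1 - p2
        for (a, p1), (b, p2) in zip(zip(stages, pct), zip(stages[1:], pct[1:]))
    }

stages = ["Start", "Generate", "Review", "Save", "Share"]
drops = funnel_dropoffs(stages, [100, 95, 88, 85, 42])
# {'Start -> Generate': 5, 'Generate -> Review': 7,
#  'Review -> Save': 3, 'Save -> Share': 43}
```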
Drop-off Analysis:
- 5% abandon during generation (timeout or dissatisfaction)
- 7% abandon after review (not meeting needs)
- 3% don't save (just exploring)
- 43% save but don't share (personal use only)
Customization Usage:
- Use templates: 68%
- Customize AI output: 82%
- Add own content: 55%
- Align to curriculum: 73%
Quality Ratings (User Feedback):
- 5 stars: 58%
- 4 stars: 28%
- 3 stars: 10%
- 2 stars: 3%
- 1 star: 1%
- Average: 4.7/5.0
Impact on Retention: Schools using Lesson Planner weekly have 94% annual retention (vs 88% baseline)
Revenue Correlation: 85% of Enterprise tier schools are heavy Lesson Planner users
2. Smart Quiz Generator
Adoption: 118 schools (81% of active schools)
Usage Metrics:
- Weekly active users: 1,080 teachers
- Avg quizzes per teacher/week: 2.4
- Total quizzes generated/month: 10,368
- Avg questions per quiz: 15.2
- Total questions generated/month: 157,594
Quiz Types Generated:
| Quiz Type | % of Total | Avg Questions | Popularity Trend |
|---|---|---|---|
| Multiple Choice | 45% | 12.5 | Stable |
| True/False | 22% | 18.0 | Growing |
| Short Answer | 18% | 8.0 | Growing |
| Fill-in-Blank | 10% | 10.5 | Declining |
| Mixed Format | 5% | 20.0 | Growing |
Generation Time:
- <10 questions: 8 seconds avg
- 10-20 questions: 15 seconds avg
- 20+ questions: 28 seconds avg
- Target: <3 seconds per question (currently 1.5s avg)
Edit Rate: 76% of quizzes are edited after generation
- Minor edits (1-2 changes): 42%
- Moderate edits (3-5 changes): 28%
- Major edits (6+ changes): 6%
- Complete regeneration: 18%
Feature Performance:
Quiz Generation Success Rate: 96.8%
Failures by Reason:
- API timeout: 1.8%
- Invalid parameters: 0.9%
- Content policy violation: 0.3%
- System error: 0.2%
User Satisfaction: 4.5/5.0 average rating
Improvement Requests (Top 3):
- Difficulty level control (requested by 42% of users)
- More question types (35% of users)
- Faster generation (28% of users)
3. Auto-Grading
Adoption: 112 schools (77% of active schools)
Usage Metrics:
- Weekly active users: 980 teachers
- Avg grading sessions/week: 8.2 per teacher
- Total assignments graded/month: 32,032
- Total student submissions graded: 284,850
- Avg time saved per assignment: 37 minutes
- Total time saved: 19,753 hours/month
Grading Types:
| Type | % of Grading | Accuracy | Teacher Override Rate |
|---|---|---|---|
| Multiple Choice | 52% | 99.8% | 0.2% |
| True/False | 24% | 99.9% | 0.1% |
| Fill-in-Blank | 12% | 94.5% | 5.5% |
| Short Answer (AI) | 8% | 87.2% | 12.8% |
| Essay (AI) | 4% | 78.5% | 21.5% |
AI Grading Performance:
- Objective questions: 99.8% accuracy
- Short answer: 87.2% accuracy (teacher review recommended)
- Essays: 78.5% accuracy (teacher review required)
Teacher Review Patterns:
- Review all grades: 32%
- Spot check (10-20%): 48%
- Only review flagged: 18%
- Trust fully: 2%
Time Savings:
Traditional Grading Time: 45 min/assignment
With Auto-Grading: 8 min/assignment
Time Saved: 37 min (82% reduction)
Monthly Impact per Teacher:
Assignments graded: 35
Time saved: 1,295 minutes (21.6 hours)
Value: $540/month at $25/hour teacher time
Feature Satisfaction: 4.6/5.0 (highest rated feature)
Retention Impact: 96% retention for schools using Auto-Grading weekly
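The per-teacher dollar value quoted above is straightforward to reproduce; a quick sketch using the figures from this section (35 assignments, 37 minutes saved each, $25/hour):

```python
def monthly_value(assignments: int, minutes_saved: float, hourly_rate: float) -> float:
    """Dollar value of grading time saved per teacher per month."""
    return assignments * minutes_saved / 60 * hourly_rate

value = monthly_value(35, 37, 25.0)  # 1,295 min = ~21.6 h -> ~$540/month
```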
4. Curriculum Alignment
Adoption: 106 schools (73% of active schools)
Usage Metrics:
- Weekly active users: 850 teachers
- Avg alignment checks/week: 6.3
- Total lessons aligned/month: 21,420
- Standards databases: 3 (South African CAPS, IEB, Cambridge)
Standards Usage:
| Curriculum | Schools Using | % of Total | Lessons Aligned/Month |
|---|---|---|---|
| CAPS (South African) | 92 | 87% | 16,200 |
| IEB | 18 | 17% | 3,840 |
| Cambridge (IGCSE) | 12 | 11% | 1,380 |
Alignment Workflow:
Create Lesson (100%) → Auto-Suggest Standards (78%) → Review/Select (88%) → Save Alignment (82%)
78% of lessons receive auto-suggested standards
88% of those suggestions are accepted with no changes
12% are modified or custom standards added
Impact on Lesson Quality:
- Lessons with alignment: 4.5/5.0 avg rating
- Lessons without: 3.8/5.0 avg rating
- Improvement: +18% quality score
Use Cases:
- Lesson planning: 58%
- Reporting to administrators: 28%
- Parent communication: 10%
- Compliance documentation: 4%
Feature Requests:
- Add more curricula (US Common Core, UK National Curriculum)
- Multi-standard alignment (align to 2+ frameworks)
- Curriculum gap analysis
5. Resource Library
Adoption: 98 schools (68% of active schools)
Usage Metrics:
- Weekly active users: 745 teachers
- Avg resources accessed/week: 4.8
- Total resource downloads/month: 14,352
- User-uploaded resources: 2,840
- Platform resources: 8,500
Resource Types:
| Type | Count | Downloads/Month | Avg Rating |
|---|---|---|---|
| Worksheets | 3,200 | 5,280 | 4.3/5 |
| Lesson Plans | 2,850 | 4,420 | 4.5/5 |
| Presentations | 1,680 | 2,140 | 4.2/5 |
| Videos | 920 | 1,580 | 4.6/5 |
| Assessments | 640 | 932 | 4.4/5 |
Search Performance:
- Avg search results: 24 resources
- Click-through rate: 38%
- Download rate after click: 65%
- Resource not found rate: 8%
User-Generated Content:
Upload Activity:
Total uploads/month: 240 resources
Avg uploads/school: 2.4/month
Top uploaders: 5 schools (50+ resources each)
Quality Control:
Automated review: 100% (scan for policy violations)
Manual review: 15% (flagged for quality)
Approval rate: 94%
Rejection rate: 6% (inappropriate, low quality)
Sharing Behavior:
- Keep private: 32%
- Share with own school: 45%
- Share with all schools: 23%
Monetization Opportunity: Premium resource marketplace (future consideration)
AI Feature Analytics
AI Usage Summary
AI-Powered Features: 6 features
| Feature | Adoption | Weekly Uses | AI Cost/Month | Revenue/Month |
|---|---|---|---|---|
| AI Lesson Planner | 88% | 18,960 | $7,450 | $138,240 |
| Smart Quiz Generator | 81% | 10,368 | $5,280 | $130,680 |
| Auto-Grading (AI portion) | 77% | 3,840 | $3,420 | $120,960 |
| Plagiarism Checker | 61% | 2,080 | $730 | $95,760 |
| Writing Assistant | 50% | 3,280 | $1,580 | $85,500 |
| Content Recommendations | 47% | 2,120 | $740 | $80,360 |
| Total | 70% | 40,648 | $19,200 | $651,500 |
AI Economics:
- Total AI cost: $19,200/month
- Revenue from AI users: $178,500/month (102 schools × $1,750 avg)
- AI cost as % of AI revenue: 10.8%
- Profit margin on AI features: 89.2% (excellent)
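The cost-ratio and margin figures above come from a single division; a minimal sketch using the numbers from this section:

```python
def ai_margin(monthly_ai_cost: float, monthly_ai_revenue: float) -> tuple[float, float]:
    """Return (AI cost as % of AI revenue, profit margin %) for the AI feature set."""
    cost_pct = monthly_ai_cost / monthly_ai_revenue * 100
    return cost_pct, 100 - cost_pct

cost_pct, margin = ai_margin(19_200, 178_500)  # ~10.8% cost ratio, ~89.2% margin
```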
AI Quality Metrics
Response Quality (User Ratings):
| Feature | Avg Rating | 5-Star % | 1-Star % | Retry Rate |
|---|---|---|---|---|
| Lesson Planner | 4.7/5 | 58% | 1% | 6.2% |
| Quiz Generator | 4.5/5 | 48% | 2% | 8.4% |
| Auto-Grading | 4.6/5 | 52% | 1% | 3.8% |
| Plagiarism Checker | 4.3/5 | 38% | 3% | 12.5% |
| Writing Assistant | 4.4/5 | 42% | 2% | 9.2% |
| Content Recommendations | 4.1/5 | 32% | 4% | 15.8% |
Retry Rate: Percentage of users regenerating output (indicates initial dissatisfaction)
AI Performance Targets:
- Average rating: >4.5/5 (currently 4.4/5)
- 5-star percentage: >50% (currently 45%)
- Retry rate: <8% (currently 9.3%)
AI Cost Optimization
Cost Trends (Last 6 Months):
| Month | AI Uses | Total Cost | Cost/Use | Optimization Impact |
|---|---|---|---|---|
| June | 62,400 | $14,200 | $0.228 | Baseline |
| July | 68,800 | $14,850 | $0.216 | -5.3% (caching) |
| Aug | 78,400 | $16,400 | $0.209 | -8.3% |
| Sep | 85,200 | $17,600 | $0.207 | -9.2% |
| Oct | 91,800 | $18,500 | $0.202 | -11.4% |
| Nov | 98,500 | $19,200 | $0.195 | -14.5% |
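The cost-per-use and optimization-impact columns above are derived from the raw uses and totals; a sketch of the calculation (the table rounds per-use costs to three decimals before computing the impact, so the last digit can differ slightly):

```python
def cost_per_use(total_cost: float, uses: int) -> float:
    """Average AI cost per use for a month."""
    return total_cost / uses

def pct_change(current: float, baseline: float) -> float:
    """Percent change of current vs a baseline value."""
    return (current - baseline) / baseline * 100

baseline = cost_per_use(14_200, 62_400)   # June: ~$0.228/use
november = cost_per_use(19_200, 98_500)   # Nov:  ~$0.195/use
change = pct_change(november, baseline)   # ~ -14.3% vs June baseline
```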
Optimization Strategies:
- Response Caching: 28% cache hit rate saves $2,400/month
- Prompt Optimization: Shorter prompts save $800/month
- Batch Processing: Group requests save $600/month
- Model Selection: Use smaller models where appropriate
Future Optimizations:
- Implement semantic caching (expected +15% hit rate)
- Fine-tune custom model (expected -25% cost)
- Compress prompts further (expected -8% cost)
Feature Engagement Metrics
Feature Stickiness (DAU/MAU)
Feature Stickiness Rankings:
| Feature | DAU | MAU | Stickiness | Category |
|---|---|---|---|---|
| Progress Tracking | 1,850 | 3,200 | 58% | Highly sticky |
| Auto-Grading | 1,680 | 2,980 | 56% | Highly sticky |
| AI Lesson Planner | 1,245 | 2,560 | 49% | Sticky |
| Dashboard | 2,840 | 5,800 | 49% | Sticky |
| Quiz Generator | 1,080 | 2,360 | 46% | Moderately sticky |
| Resource Library | 745 | 1,960 | 38% | Moderately sticky |
| Student Analytics | 520 | 1,440 | 36% | Moderately sticky |
| Curriculum Alignment | 425 | 1,700 | 25% | Low stickiness |
| Parent Portal | 280 | 1,080 | 26% | Low stickiness |
| Live Collaboration | 185 | 940 | 20% | Low stickiness |
Insights:
- High stickiness (>45%): Daily habit features (grading, tracking)
- Medium stickiness (30-45%): Regular use features (planning, quizzes)
- Low stickiness (<30%): Occasional use features (alignment, collaboration)
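Stickiness is just the DAU/MAU ratio; a minimal sketch that computes it and applies the thresholds from the Insights list above (the table's per-feature labels use slightly finer categories):

```python
def stickiness(dau: int, mau: int) -> float:
    """DAU/MAU ratio as a percentage."""
    return dau / mau * 100

def classify(pct: float) -> str:
    """Bucket a stickiness percentage using the thresholds in the Insights list."""
    if pct > 45:
        return "high"
    if pct >= 30:
        return "medium"
    return "low"

s = classify(stickiness(1_850, 3_200))  # Progress Tracking: ~57.8% -> "high"
```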
Improvement Opportunities:
- Parent Portal: Add weekly summaries to drive daily parent engagement
- Live Collaboration: Promote for daily teacher planning meetings
- Curriculum Alignment: Integrate into lesson planner workflow
Feature Retention Impact
90-Day Retention by Feature Usage:
| Feature | Users | 90-Day Retention | vs Baseline | Impact |
|---|---|---|---|---|
| Auto-Grading | 112 schools | 96% | +8% | Very High |
| AI Lesson Planner | 128 schools | 94% | +6% | High |
| Progress Tracking | 95 schools | 93% | +5% | High |
| Quiz Generator | 118 schools | 92% | +4% | High |
| Curriculum Alignment | 106 schools | 91% | +3% | Medium |
| Student Analytics | 72 schools | 90% | +2% | Medium |
| Resource Library | 98 schools | 89% | +1% | Low |
| Parent Portal | 78 schools | 88% | 0% | Neutral |
| Live Collaboration | 68 schools | 87% | -1% | Neutral |
| Baseline (no features) | - | 88% | - | - |
Strategic Implication: Focus on driving adoption of high-retention-impact features (Auto-Grading, AI Lesson Planner, Progress Tracking)
Feature Combination Analysis
Most Common Feature Bundles:
| Bundle | Schools | Retention | Avg ARPU | Tier Distribution |
|---|---|---|---|---|
| Planner + Quiz + Grading | 85 | 98% | $2,400 | 70% Pro, 30% Ent |
| Full AI Suite (6 features) | 42 | 99% | $3,100 | 80% Ent, 20% Pro |
| Teaching Core (Planner + Resources) | 68 | 91% | $1,650 | 60% Pro, 40% Ess |
| Analytics Bundle (Student + Progress) | 52 | 93% | $1,900 | 75% Pro, 25% Ent |
Feature Synergies:
- Lesson Planner + Quiz Generator: 82% who use one adopt the other
- Auto-Grading + Progress Tracking: 75% co-adoption
- Student Analytics + Parent Portal: 68% co-adoption
Feature ROI Analysis
Development Cost vs Impact
| Feature | Dev Cost | Ongoing Cost/Month | Adoption | Revenue Impact | ROI |
|---|---|---|---|---|---|
| AI Lesson Planner | $45,000 | $7,450 | 88% | +$138,240/mo | 3,075% |
| Auto-Grading | $62,000 | $3,420 | 77% | +$120,960/mo | 1,951% |
| Quiz Generator | $38,000 | $5,280 | 81% | +$130,680/mo | 3,439% |
| Progress Tracking | $28,000 | $800 | 66% | +$95,760/mo | 3,420% |
| Student Analytics | $52,000 | $1,200 | 50% | +$62,400/mo | 1,200% |
| Parent Portal | $42,000 | $600 | 54% | +$68,400/mo | 1,629% |
| Live Collaboration | $75,000 | $2,400 | 47% | +$58,320/mo | 778% |
ROI Calculation: (Annual Revenue Impact - Annual Ongoing Cost) / Development Cost
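The headline formula above can be sketched directly; note that applying it naively to the table's inputs gives somewhat higher figures than the published column, so the table values may include adjustments (e.g., attribution discounts) not documented here:

```python
def feature_roi(dev_cost: float, monthly_ongoing: float, monthly_revenue: float) -> float:
    """ROI % = (annual revenue impact - annual ongoing cost) / development cost."""
    return (monthly_revenue * 12 - monthly_ongoing * 12) / dev_cost * 100

# AI Lesson Planner inputs from the table: $45K dev, $7,450/mo ongoing, $138,240/mo revenue
roi = feature_roi(45_000, 7_450, 138_240)  # ~3,488% under this headline formula
```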
Insights:
- Quiz Generator has highest ROI (low dev cost, high adoption)
- Live Collaboration has lowest ROI (high dev cost, medium adoption)
- All features have positive ROI >700%
Future Investment Priorities:
- High adoption, high impact: Invest in improvements
- Low adoption, high potential: Invest in marketing/onboarding
- Low adoption, low impact: Consider sunset or pivot
Feature Roadmap Impact
Recently Launched Features (Last 6 Months)
Live Collaboration (Launched Sep 1, 2025):
Day 30 Adoption: 35% (target: 30%) 🟢
Day 60 Adoption: 47% (target: 40%) 🟢
Day 90 Adoption: 47% (target: 50%) 🟡
Adoption Curve:
Week 1: 12% ███░░░░░░░
Week 4: 25% ███████░░░
Week 8: 35% ██████████
Week 12: 47% █████████████
Status: Plateauing, need activation campaign
Student Analytics (Launched Jul 15, 2025):
Day 30 Adoption: 28%
Day 60 Adoption: 42%
Day 90 Adoption: 50% (target achieved) 🟢
Exceeded Expectations: Yes
Strong word-of-mouth driving adoption
Parent Portal (Launched Jun 1, 2025):
Day 30 Adoption: 42%
Day 60 Adoption: 52%
Day 120 Adoption: 54% (target: 50%) 🟢
Mature Feature: Growth slowing, focus on engagement
Planned Features (Next 6 Months)
| Feature | Priority | Target Launch | Expected Adoption | Revenue Impact |
|---|---|---|---|---|
| Assessment Builder | High | Jan 2026 | 70% | +$85K MRR |
| Video Lessons | High | Feb 2026 | 65% | +$72K MRR |
| Attendance Tracking | Medium | Mar 2026 | 80% | +$45K MRR |
| Mobile App (iOS/Android) | Medium | Apr 2026 | 55% | +$38K MRR |
| Advanced Reporting | Low | May 2026 | 45% | +$28K MRR |
| API Access (Enterprise) | Low | Jun 2026 | 15% | +$42K MRR |
Feature Health Monitoring
Weekly Feature Health Scorecard
Scoring Criteria:
- Adoption (30%): % of schools using feature monthly
- Engagement (25%): DAU/MAU stickiness
- Satisfaction (20%): User ratings
- Retention Impact (15%): Effect on customer retention
- Revenue Impact (10%): MRR contribution
Current Health Scores (Max 100):
| Feature | Health Score | Status | Action Needed |
|---|---|---|---|
| AI Lesson Planner | 94 | 🟢 Excellent | Maintain |
| Auto-Grading | 92 | 🟢 Excellent | Maintain |
| Smart Quiz Generator | 89 | 🟢 Good | Minor improvements |
| Progress Tracking | 86 | 🟢 Good | Maintain |
| Curriculum Alignment | 78 | 🟡 Acceptable | Improve engagement |
| Resource Library | 75 | 🟡 Acceptable | Improve discovery |
| Student Analytics | 72 | 🟡 Acceptable | Drive adoption |
| Plagiarism Checker | 68 | 🟡 Needs attention | Improve quality |
| Parent Portal | 65 | 🟠 Concerning | Activation campaign |
| Live Collaboration | 58 | 🟠 Concerning | Major improvements needed |
Alert Thresholds:
- <50: Critical, consider major overhaul or sunset
- 50-70: Concerning, needs improvement
- 70-85: Acceptable, minor optimizations
- 85+: Good/Excellent, maintain or enhance
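The scorecard's weighting and thresholds can be sketched as a small scoring helper; the sub-scores passed in are illustrative (each assumed to be on a 0-100 scale), not YeboLearn's actual inputs:

```python
# Weights from the scoring criteria above
WEIGHTS = {
    "adoption": 0.30,
    "engagement": 0.25,
    "satisfaction": 0.20,
    "retention_impact": 0.15,
    "revenue_impact": 0.10,
}

def health_score(subscores: dict[str, float]) -> float:
    """Weighted 0-100 health score from the five 0-100 sub-scores."""
    return sum(subscores[k] * w for k, w in WEIGHTS.items())

def status(score: float) -> str:
    """Map a health score to the alert thresholds above."""
    if score < 50:
        return "critical"
    if score < 70:
        return "concerning"
    if score < 85:
        return "acceptable"
    return "good/excellent"
```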
Feature Analytics Best Practices
Metrics to Track for Every Feature
Core Metrics:
- Adoption rate (% of schools using monthly)
- Engagement rate (% of adopters using weekly)
- Feature DAU/MAU stickiness
- User satisfaction rating
- Retention impact
Advanced Metrics:
6. Time to first use (after signup)
7. Feature depth (actions per session)
8. Cross-feature usage patterns
9. Error rate and performance
10. Revenue contribution
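All of these metrics depend on consistent event instrumentation; a minimal sketch of a feature-usage event payload (field names are illustrative, not YeboLearn's actual tracking schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FeatureEvent:
    """Illustrative feature-usage event backing adoption/engagement metrics."""
    school_id: str
    user_id: str
    feature: str          # e.g. "lesson_planner"
    action: str           # e.g. "generate", "save", "share"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    properties: dict = field(default_factory=dict)  # free-form context

event = FeatureEvent("school_123", "teacher_456", "lesson_planner", "generate")
```

Emitting one such event per user action is enough to derive adoption, DAU/MAU stickiness, time to first use, and feature depth downstream.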
Feature Launch Checklist
Pre-Launch (-7 days):
- [ ] Define success metrics and targets
- [ ] Implement tracking (events, properties)
- [ ] Set up dashboards
- [ ] Create A/B test for adoption approaches
- [ ] Prepare feedback collection mechanism
Launch (Day 0):
- [ ] Monitor adoption hourly
- [ ] Track errors in real-time
- [ ] Gather initial user feedback
- [ ] Communicate with sales/CS teams
Post-Launch:
- [ ] Day 7: Review early adoption, fix critical issues
- [ ] Day 30: Analyze 30-day adoption, compare to target
- [ ] Day 60: Deep-dive into engagement patterns
- [ ] Day 90: Full feature review, ROI analysis
Next Steps
- Product Analytics Overview - Full product analytics framework
- User Analytics - User segmentation and behavior
- Product Dashboards - Real-time feature monitoring
- Product Metrics - Platform-wide engagement metrics