Feedback System
Understanding and managing the user feedback system
Version: 1.0 | Last Updated: 2026-02-04 | For: Super Admins managing the CritForge Multi-Feature Feedback System
Table of Contents
- Overview
- Accessing the Admin Dashboard
- Feature Management
- Reviewing User Feedback
- Understanding Analytics
- Reward System Management
- System Controls
- Troubleshooting
Overview
The Multi-Feature Feedback System collects structured user feedback about CritForge features with a gamified reward system. As an admin, you can:
- Create and manage feedback features
- Review user ratings and comments
- Analyze feedback trends
- Flag critical issues (Tier 4 rewards)
- Export data for analysis
- Control system availability
Key Concept: Features are admin-configured topics users can rate. Think of them as survey questions that can be added, edited, or disabled without code changes.
Accessing the Admin Dashboard
Prerequisites
- Super Admin account (`is_super_admin = true` in the database)
- Authenticated session
Navigation
- Sign in to CritForge
- Navigate to `/admin/feedback` for the feedback dashboard
- Navigate to `/admin/feedback/features` for feature management
Dashboard Sections
- Feedback Dashboard (`/admin/feedback`): View all user submissions, filter by feature/rating, export CSV
- Feature Management (`/admin/feedback/features`): Add/edit/reorder/deactivate features
Feature Management
What is a Feature?
A feature is a specific aspect of CritForge users can rate (1-5 stars) and comment on. Examples:
- "NPC Generator"
- "Plot Generator"
- "Encounter Builder"
- "UI/UX Design"
- "Mobile App"
Creating a New Feature
Step-by-Step:
- Navigate to `/admin/feedback/features`
- Click the "Create Feature" button (top-right)
- Fill out the form:
  - Feature Key (required): Unique identifier; lowercase, alphanumeric, hyphens only
    - ✅ Good: `npc-generator`, `plot-quality`, `mobile-app-usability`
    - ❌ Bad: `NPC_Generator`, `Plot Quality!`, `mobile app`
  - Display Name (required): User-facing name (e.g., "NPC Generator")
  - Description (optional): Brief explanation (e.g., "Generate unique NPCs with stats and personality")
  - Icon (optional): Single emoji representing the feature (e.g., 🎭, 📖, ⚔️)
  - Category (optional): Grouping (e.g., "generation", "ui", "performance")
  - Phase (optional): Development phase (e.g., "Phase 0", "Phase 1", "Beta")
  - Display Order (number): Position in list (0 = first, higher = later)
  - Active (toggle): Whether users can see and rate this feature
- Click "Create Feature"
Validation Rules:
- Feature key must be unique (case-sensitive)
- Feature key format: `/^[a-z0-9-]+$/`
- Display name: max 100 characters
- Description max 500 characters
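The key-format and length rules above can be sketched as a small validation helper (TypeScript; the function and interface names are illustrative, not from the CritForge codebase):

```typescript
// Illustrative validation of the rules above: key format, name and
// description length caps. Names are hypothetical, not actual app code.
const FEATURE_KEY_RE = /^[a-z0-9-]+$/;

interface FeatureInput {
  featureKey: string;
  displayName: string;
  description?: string;
}

function validateFeature(input: FeatureInput): string[] {
  const errors: string[] = [];
  if (!FEATURE_KEY_RE.test(input.featureKey)) {
    errors.push("Feature key must be lowercase alphanumeric with hyphens only");
  }
  if (input.displayName.length === 0 || input.displayName.length > 100) {
    errors.push("Display name must be 1-100 characters");
  }
  if ((input.description ?? "").length > 500) {
    errors.push("Description must be at most 500 characters");
  }
  return errors; // empty array = valid
}
```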
Example:
Feature Key: mobile-companion
Display Name: Mobile Companion App
Description: On-the-go access to your NPCs and campaigns via iOS app
Icon: 📱
Category: mobile
Phase: Phase 1
Display Order: 10
Active: ✓ (checked)
Editing a Feature
Step-by-Step:
- Find the feature in the Feature Management list
- Click the Edit icon (pencil) on the feature card
- Update any field except Feature Key (immutable after creation)
- Click "Save Changes"
What You Can Edit:
- Display Name
- Description
- Icon
- Category
- Phase
- Display Order
- Active status
What You Cannot Edit:
- Feature Key (permanent identifier)
Reordering Features
Features are displayed to users in ascending `display_order`. Use drag-and-drop to reorder:
Step-by-Step:
- Navigate to `/admin/feedback/features`
- Hover over a feature card
- Click and hold the grip icon (⋮⋮) on the left side
- Drag the feature up or down
- Release to drop in new position
- Automatic save: Display orders update immediately
Keyboard Alternative:
- Tab to grip handle
- Press Space to grab
- Arrow Up/Down to move
- Space to drop
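The reorder-and-save behavior can be sketched as follows: move an item within the sorted list, then reassign sequential `display_order` values so they match the new visual positions (TypeScript; names are illustrative, not the actual client code):

```typescript
// Illustrative reorder logic: sort by current display_order, move one
// item, then renumber everything 0..n-1 to match the new positions.
interface Feature {
  featureKey: string;
  displayOrder: number;
}

function reorder(features: Feature[], from: number, to: number): Feature[] {
  const next = [...features].sort((a, b) => a.displayOrder - b.displayOrder);
  const [moved] = next.splice(from, 1); // remove the dragged item
  next.splice(to, 0, moved);            // drop it at its new index
  // Reassign display_order so it matches the new visual order
  return next.map((f, i) => ({ ...f, displayOrder: i }));
}
```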
Activating/Deactivating Features
Toggle Active Status:
- Find the feature in the list
- Click the toggle switch on the feature card
- Confirm if prompted (especially for features with existing feedback)
Effects of Deactivation:
- ✓ Feature hidden from user-facing feedback form
- ✓ Existing feedback preserved (not deleted)
- ✓ Feature still visible in admin dashboard
- ✓ Users cannot submit new ratings for this feature
When to Deactivate:
- Feature deprecated or removed from CritForge
- Pausing collection for a specific area
- Feature still in development (not ready for feedback)
⚠️ Warning: Deactivating a feature with existing feedback preserves all historical data but prevents new submissions.
Deleting a Feature
Note: Deletion is a soft delete (sets `is_active = false`). Data is not permanently removed.
Step-by-Step:
- Click the trash icon on the feature card
- Confirm deletion in the prompt
- Result: Feature marked inactive, removed from user view
Edge Case: Features with existing feedback show a warning before deletion.
Reviewing User Feedback
Feedback Dashboard Overview
Navigate to `/admin/feedback` to see:
- Aggregate Statistics Cards (top): Average ratings, total responses per feature
- Filters (middle): Feature, rating range, status, date range
- Feedback Table (bottom): Individual submissions with pseudonymized user IDs
Understanding the Table
Columns:
- User ID: First 8 characters of the user UUID (e.g., `abc123de`)
  - GDPR Compliance: Full IDs not shown by default
  - View Email: Click "View Details" for more info (access logged)
- Feature: Which feature was rated
- Rating: 1-5 stars (color-coded)
  - Green (4-5 stars): Positive
  - Yellow (3 stars): Neutral
  - Red (1-2 stars): Negative
- Comment: User-provided text (truncated in table, full in details)
- Status: `pending`, `acknowledged`, `resolved`
- Submitted: Relative time (e.g., "2 hours ago")
- Actions: View details, flag as critical
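The color-coding rule for the Rating column can be expressed as a one-line mapping (illustrative TypeScript helper, not the actual UI code):

```typescript
// Illustrative mapping of the color-coding rule: green 4-5, yellow 3,
// red 1-2. The function name is hypothetical.
type RatingColor = "green" | "yellow" | "red";

function ratingColor(rating: number): RatingColor {
  if (rating >= 4) return "green";  // positive
  if (rating === 3) return "yellow"; // neutral
  return "red";                      // negative (1-2)
}
```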
Filtering Feedback
Available Filters:
- Feature: Show feedback for specific feature only
- Min Rating: Filter by minimum star rating (e.g., only 4-5 stars)
- Max Rating: Filter by maximum star rating (e.g., only 1-2 stars)
- Status: Filter by review status (pending/acknowledged/resolved)
Use Case Examples:
- Find all critical issues: Set Max Rating = 2
- See positive feedback: Set Min Rating = 4
- Review specific feature: Select feature from dropdown
- Unresolved feedback: Set Status = Pending
Viewing Full Feedback Details
Step-by-Step:
- Click the eye icon on a feedback row
- Modal opens with:
- Full comment text
- User ID (pseudonymized)
- Submission timestamp
- Rating visualization
- Optional: Add admin notes, change status
GDPR Note: Viewing details does NOT expose email addresses. CSV export includes emails but is logged.
Exporting Feedback to CSV
Step-by-Step:
- Apply filters (optional) to narrow export scope
- Click "Export CSV" button (top-right)
- File downloads with a timestamped name: `feedback-export-YYYYMMDD-HHmmss.csv`
CSV Contents:
- Full user emails (⚠️ GDPR: access logged to `admin_audit_logs`)
- Feature name and key
- Rating and comment
- Submission timestamp
- Metadata header (export time, admin, filters applied)
Metadata Header Example:
```
# Export Date: 2026-02-04 15:30:00
# Exported By: [email protected]
# Filters: feature=npc-generator, min_rating=4
```
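A sketch of how the timestamped filename and metadata header might be assembled (TypeScript; both function names are hypothetical, not the actual export code):

```typescript
// Illustrative helpers for the export filename and metadata header
// formats shown above. Names are hypothetical.
function exportFilename(now: Date): string {
  const p = (n: number) => String(n).padStart(2, "0");
  return (
    `feedback-export-${now.getFullYear()}${p(now.getMonth() + 1)}${p(now.getDate())}` +
    `-${p(now.getHours())}${p(now.getMinutes())}${p(now.getSeconds())}.csv`
  );
}

function metadataHeader(
  admin: string,
  filters: Record<string, string>,
  exportedAt: string
): string {
  const filterStr = Object.entries(filters)
    .map(([k, v]) => `${k}=${v}`)
    .join(", ");
  return [
    `# Export Date: ${exportedAt}`,
    `# Exported By: ${admin}`,
    `# Filters: ${filterStr || "none"}`,
  ].join("\n");
}
```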
Use Cases:
- Quarterly feedback reports
- Sentiment analysis in external tools
- Sharing feedback with development team
- Compliance audits
Understanding Analytics
Aggregate Statistics Cards
Top of Dashboard shows 4 feature cards with:
- Feature Name: E.g., "NPC Generator"
- Total Responses: Number of feedback submissions
- Average Rating: Mean of all ratings (1.0 - 5.0)
- Sentiment: Positive (≥4.0), Neutral (3.0-3.9), Negative (under 3.0)
- Comment Count: How many users left comments
Interpretation:
- High avg rating + many responses = Feature performing well
- Low avg rating + many comments = Pain points identified, read comments!
- Few responses = Low engagement, consider promoting feature
Rating Distribution
How to interpret:
- Skewed toward 5 stars: Users love this feature
- Bimodal distribution (peaks at 1 and 5): Polarizing feature, investigate why
- Many 3 stars: Feature is "meh", lacks wow factor
Sentiment Analysis
Auto-calculated:
- Positive: Average rating ≥ 4.0 (Green indicator)
- Neutral: Average rating 3.0 - 3.9 (Yellow indicator)
- Negative: Average rating < 3.0 (Red indicator)
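The thresholds above can be written as a small helper (illustrative TypeScript, not the actual analytics code):

```typescript
// Illustrative sentiment classification matching the thresholds above:
// positive >= 4.0, neutral 3.0-3.9, negative < 3.0.
type Sentiment = "positive" | "neutral" | "negative";

function sentiment(avgRating: number): Sentiment {
  if (avgRating >= 4.0) return "positive";
  if (avgRating >= 3.0) return "neutral";
  return "negative";
}
```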
Action Items:
- Negative sentiment: Prioritize reading comments for improvement ideas
- Neutral sentiment: Identify what's missing to push to positive
- Positive sentiment: Highlight in marketing, understand what's working
Reward System Management
Reward Tiers Overview
Users earn bonus generation credits for providing feedback:
| Tier | Trigger | Reward | Eligibility |
|---|---|---|---|
| Tier 1 | First-time feedback | +2 credits | All users (including trial) |
| Tier 2 | ≥5 features rated + ≥3 comments | +5 credits + Early Adopter badge | Premium/Free (immediate), Trial (deferred until upgrade) |
| Tier 3 | Monthly check-in (30+ days since last) | +3 credits | Premium/Free |
| Tier 4 | Admin-flagged critical bug | Variable credits | Admin discretion |
Max Reward Cap: 10 credits total from feedback (excludes Tier 4)
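The cap can be sketched like this: Tiers 1-3 count toward the 10-credit limit, while Tier 4 is exempt (illustrative TypeScript; names are not from the codebase):

```typescript
// Illustrative credit-cap logic: Tiers 1-3 share a 10-credit cap;
// Tier 4 (admin-flagged) is exempt and granted at admin discretion.
const FEEDBACK_CREDIT_CAP = 10;

function grantableCredits(
  tier: number,
  requested: number,
  alreadyGrantedTiers1to3: number
): number {
  if (tier === 4) return requested; // exempt from the cap
  const remaining = Math.max(0, FEEDBACK_CREDIT_CAP - alreadyGrantedTiers1to3);
  return Math.min(requested, remaining); // clamp to whatever cap room is left
}
```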
Tier 4: Flagging Critical Issues
When to use:
- User reports a serious bug with reproduction steps
- User provides exceptionally detailed feedback leading to major improvement
- User identifies security vulnerability
How to flag:
- Review feedback in admin dashboard
- Verify issue is critical/valuable
- Flag as Tier 4 (manual process, requires database update)
- Set credit amount based on severity (5-20 credits typical)
Process (manual for now):
```sql
-- Insert the Tier 4 reward
INSERT INTO feedback_rewards (user_id, tier, credits_granted, trigger_reason, status)
VALUES ('[user_id]', 4, 10, 'tier_4_critical_bug_npc_stat_calculation', 'granted');

-- Update the user's profile credits
UPDATE profiles
SET reward_bonus_generations = reward_bonus_generations + 10
WHERE id = '[user_id]';
```
Best Practices:
- Document the reason in the `trigger_reason` field
- Be consistent with credit amounts (5/10/15/20 based on impact)
- Notify user via email (manual for now)
Monitoring Reward Grants
Check the `feedback_rewards` table:

```sql
SELECT
  tier,
  COUNT(*) AS grants,
  SUM(credits_granted) AS total_credits
FROM feedback_rewards
WHERE status = 'granted'
GROUP BY tier
ORDER BY tier;
```
Expected Distribution:
- Tier 1: Highest count (everyone's first feedback)
- Tier 2: ~20-30% of users (comprehensive feedback)
- Tier 3: Monthly users who return
- Tier 4: Very rare (admin-flagged only)
Trial User Reward Deferrals
How it works:
- Trial user submits comprehensive feedback (≥5 rated + ≥3 comments)
- System creates a `feedback_rewards` record with `status='pending_upgrade'`
- User sees message: "Upgrade to Premium to claim 5 credits + badge"
- When user upgrades (Stripe webhook), rewards auto-granted
Admin View:
- Dashboard: See deferred rewards with "Pending Upgrade" badge
- Database: Query `WHERE status='pending_upgrade'`
Edge Case: User never upgrades → Reward remains `pending_upgrade` indefinitely (harmless)
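A sketch of what the upgrade-time grant might look like: flip all of the user's `pending_upgrade` rewards to `granted` and sum the credits (TypeScript; the `Reward` shape and function name are hypothetical, not the actual webhook handler):

```typescript
// Illustrative deferred-reward grant, as might run from a Stripe
// upgrade webhook. Types and names are hypothetical.
interface Reward {
  userId: string;
  credits: number;
  status: "pending_upgrade" | "granted";
}

function grantDeferredRewards(
  rewards: Reward[],
  upgradedUserId: string
): { rewards: Reward[]; creditsGranted: number } {
  let creditsGranted = 0;
  const updated = rewards.map((r) => {
    if (r.userId === upgradedUserId && r.status === "pending_upgrade") {
      creditsGranted += r.credits; // total to add to the user's profile
      return { ...r, status: "granted" as const };
    }
    return r;
  });
  return { rewards: updated, creditsGranted };
}
```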
System Controls
Disabling Feedback Collection
Use Cases:
- Sufficient data collected (10,000+ submissions)
- System maintenance
- Redesigning feedback approach
- Seasonal pause (e.g., holidays, major feature releases)
Method 1: Environment Variable (Immediate Global Kill Switch)
This is the primary kill switch for the entire feedback system:
- Edit `.env` or environment config (Vercel/Render dashboard)
- Set `FEEDBACK_SYSTEM_ENABLED=false`
- Restart the application (or deploy)
- Result: Banner hidden, submissions blocked, eligibility returns false
Environment Variable:
```bash
# In .env or deployment platform (Vercel/Render)
FEEDBACK_SYSTEM_ENABLED=false
```
How It Works:
- Checked first in `FeedbackEligibilityService.isUserEligible()` (line 60)
- Short-circuits all other eligibility checks (performance optimization)
- Returns `{ eligible: false, reasons: ['Feedback system is currently disabled'] }`
- Default behavior: `true` (enabled) if the variable is unset
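A minimal sketch of the kill-switch short-circuit, assuming the flag is read as a string and defaults to enabled when unset (illustrative TypeScript, not the actual `FeedbackEligibilityService` source):

```typescript
// Illustrative kill-switch check: only an explicit "false" disables the
// system; an unset variable means enabled. Names are hypothetical.
interface EligibilityResult {
  eligible: boolean;
  reasons: string[];
}

function checkKillSwitch(
  env: Record<string, string | undefined>
): EligibilityResult | null {
  const enabled = env.FEEDBACK_SYSTEM_ENABLED !== "false"; // unset => enabled
  if (!enabled) {
    return { eligible: false, reasons: ["Feedback system is currently disabled"] };
  }
  return null; // continue with the remaining eligibility checks
}
```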
Method 2: Date Window (Scheduled Time-Boxed Campaigns)
For time-boxed feedback collection (e.g., Q1 2026 only):
- Edit `.env` or environment config
- Set `FEEDBACK_COLLECTION_START_DATE=2026-01-01`
- Set `FEEDBACK_COLLECTION_END_DATE=2026-03-31`
- Restart the application
- Result: Collection only active during date range
Environment Variables:
```bash
# In .env or deployment platform
FEEDBACK_COLLECTION_START_DATE=2026-01-01
FEEDBACK_COLLECTION_END_DATE=2026-03-31
```
How It Works:
- Checked second in `FeedbackEligibilityService.isUserEligible()` (line 69)
- Before start date: Returns `{ eligible: false, reasons: ['Feedback collection is not currently active'], nextEligibleDate: '2026-01-01' }`
- After end date: Same message, `nextEligibleDate: null`
- No date restriction if both variables are unset (default behavior)
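The date-window check can be sketched similarly (illustrative TypeScript; date parsing is simplified to UTC-midnight boundaries, not the actual service code):

```typescript
// Illustrative date-window check: before the start date users get a
// nextEligibleDate; after the end date they get null. Names hypothetical.
interface WindowResult {
  eligible: boolean;
  reasons: string[];
  nextEligibleDate: string | null;
}

function checkDateWindow(
  now: Date,
  start?: string,
  end?: string
): WindowResult | null {
  const notActive = "Feedback collection is not currently active";
  if (start && now < new Date(start)) {
    return { eligible: false, reasons: [notActive], nextEligibleDate: start };
  }
  if (end && now > new Date(end)) {
    return { eligible: false, reasons: [notActive], nextEligibleDate: null };
  }
  return null; // inside the window, or no window configured
}
```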
What Happens When Disabled:
- ✓ Banner hidden from all users (client-side check on mount)
- ✓ Submissions blocked (API returns 403 Forbidden)
- ✓ Existing feedback preserved (not deleted from database)
- ✓ Admin dashboard still accessible (view historical data)
- ✓ Eligibility checks cached for 1 hour (clear cache with service restart)
Re-enabling After Kill Switch:
- Set `FEEDBACK_SYSTEM_ENABLED=true` (or remove the variable)
- Restart the application (or redeploy)
- Banner reappears for eligible users (eligibility cache clears)
- Verify by checking banner visibility as test user
Re-enabling After Date Window Ends:
- Update `FEEDBACK_COLLECTION_END_DATE` to a future date
- Or remove both date variables for open-ended collection
- Restart application
- Banner reappears for eligible users
⚠️ Important Notes:
- Disabling does NOT delete data - all feedback preserved in database
- Users can still view their previously submitted feedback via profile
- Kill switch affects global eligibility but does NOT override user-specific dismissals (permanent "Don't show again" remains honored)
- Admin dashboard remains accessible for data export and analysis even when system disabled
- Eligibility results are cached for 1 hour - may take up to 60 minutes for changes to propagate to all users (restart clears cache immediately)
Deployment Platform Configuration:
Vercel:
- Dashboard → Project → Settings → Environment Variables
- Add `FEEDBACK_SYSTEM_ENABLED=false`
- Redeploy (automatic on environment variable change)
Render.com:
- Dashboard → Service → Environment → Environment Variables
- Add `FEEDBACK_SYSTEM_ENABLED=false`
- Manual deploy or wait for auto-deploy trigger
Collection Window (Date-Based)
Scenario: Run feedback campaign for Q1 2026 only
Configuration:
```bash
FEEDBACK_COLLECTION_START_DATE=2026-01-01
FEEDBACK_COLLECTION_END_DATE=2026-03-31
```
Behavior:
- Before 2026-01-01: Banner hidden
- During window: Banner shown (if eligible)
- After 2026-03-31: Banner hidden automatically
Null/Unset = No Restriction
Troubleshooting
Common Issues
1. User can't see feedback banner
Possible Causes:
- User not eligible (check eligibility criteria):
- Needs ≥5 generations
- Must have active subscription (trial/premium/free with grace period)
- Banner not dismissed permanently
- Not snoozed (7-day window)
- System disabled (`FEEDBACK_SYSTEM_ENABLED=false`)
- Outside collection window
Debug Steps:
- Check user profile: `SELECT * FROM profiles WHERE id = '[user_id]'`
- Check preferences: `SELECT * FROM feedback_user_preferences WHERE user_id = '[user_id]'`
- Verify eligibility: Call `is_user_feedback_eligible('[user_id]')`
2. Feature not appearing in user form
Possible Causes:
- Feature inactive (`is_active = false`)
- Feature created recently (cache not yet refreshed)
Fix:
- Set the feature to Active in Feature Management
- Wait 30 seconds for the cache to refresh
3. Reward not granted
Possible Causes:
- User already claimed this tier (check `feedback_rewards`)
- Trial user (Tier 2+ deferred until upgrade)
- 10-credit cap reached
Debug Steps:
```sql
-- Check existing rewards
SELECT * FROM feedback_rewards WHERE user_id = '[user_id]';

-- Check profile credits
SELECT reward_bonus_generations FROM profiles WHERE id = '[user_id]';
```
4. CSV export empty
Possible Causes:
- Filters too restrictive (no matches)
- No feedback submitted yet
Fix:
- Clear all filters
- Check the feedback table: `SELECT COUNT(*) FROM user_feedback`
5. Drag-and-drop not working
Possible Causes:
- Browser compatibility (needs modern browser)
- JavaScript error (check console)
Fix:
- Use Chrome, Firefox, or Safari (latest versions)
- Check browser console for errors
- Fallback: Manually edit `display_order` in the database
Best Practices
Feature Management
- Start with core features users actually use (NPC Gen, Plot Gen, etc.)
- Use clear, user-friendly names (not internal jargon)
- Add icons for visual appeal (emojis work great)
- Group related features with categories
- Order by importance (most used = lowest `display_order`)
Feedback Review
- Review weekly to catch trends early
- Respond to critical issues (≤2 stars) within 48 hours
- Look for patterns across features (recurring pain points)
- Export quarterly for long-term analysis
Reward System
- Be generous with Tier 4 for truly valuable feedback
- Monitor reward distribution to prevent gaming
- Communicate value of feedback to users (show impact)
System Health
- Check analytics monthly for engagement trends
- Disable when sufficient data collected (10k+ submissions)
- Archive data before major system changes
Support
Need Help?
- Technical Issues: Contact dev team via Slack #admin-support
- Feature Requests: File issue in GitHub repo
- Database Access: Use Supabase dashboard (admin access required)
Useful Queries
Total feedback by feature:
```sql
SELECT
  ff.display_name,
  COUNT(uf.id) AS total_feedback,
  AVG(uf.rating) AS avg_rating
FROM user_feedback uf
JOIN feedback_features ff ON uf.feature_id = ff.id
GROUP BY ff.display_name
ORDER BY total_feedback DESC;
```
Recent critical feedback (≤2 stars):
```sql
SELECT * FROM user_feedback
WHERE rating <= 2
ORDER BY submitted_at DESC
LIMIT 20;
```
Pending trial rewards:
```sql
SELECT * FROM feedback_rewards
WHERE status = 'pending_upgrade'
ORDER BY created_at DESC;
```
End of Guide
Version 1.0 - Last updated 2026-02-04