How We Review

Our complete testing methodology, scoring criteria, and editorial standards. The process behind every recommendation we publish.

Introduction

Most affiliate sites won't tell you how they "test" products because they don't actually test them. They sign up for an affiliate program, write 800 words based on the marketing page, and publish.

We've taken a different approach since 2017. Every product we recommend goes through a structured testing process before we publish a review. This page explains exactly how that process works, what criteria we use, and the standards we hold ourselves to.

If you're going to trust our recommendations with your money and your voice, you deserve to know how we make those recommendations.

How We Test Singing Courses

Reviewing a singing course properly takes weeks. Here's what our process looks like:

Step 1: Purchase and Enroll

We purchase every course we review using our own funds.

Step 2: Complete the Curriculum

A team member completes the entire course from start to finish. For a 30-day program, that means 30 days of practice. For a self-paced program, we work through every module, lesson, and exercise. We don't skim. We don't skip the "boring" parts. We do the actual work in the course.

Step 3: Track Progress with Measurements

Where possible, we measure improvement objectively:

- Vocal range expansion: We record before-and-after range tests
- Pitch accuracy: We use pitch-tracking software to measure improvement
- Breath support: We measure sustained note duration before and after

This isn't perfect (singing improvement is subjective in many ways), but it gives us a baseline to compare programs.
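The pitch-accuracy measurement above boils down to comparing detected frequencies against target notes. As a rough illustration (assuming the pitch tracker outputs fundamental frequencies in Hz; the readings below are invented, not real test data), deviation can be expressed in cents:

```python
import math

def cents_off(measured_hz: float, target_hz: float) -> float:
    """Deviation from a target pitch in cents (100 cents = one semitone)."""
    return 1200 * math.log2(measured_hz / target_hz)

def mean_abs_cents(readings, target_hz=440.0):
    """Average absolute deviation across several takes at the same note."""
    return sum(abs(cents_off(f, target_hz)) for f in readings) / len(readings)

# Invented before/after readings against A4 = 440 Hz:
before = [452.0, 431.0, 446.5]
after = [441.2, 438.9, 440.6]

print(round(mean_abs_cents(before), 1), round(mean_abs_cents(after), 1))
```

A shrinking average deviation over the testing period is one concrete signal of improvement, alongside the range and breath measurements.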

Step 4: Evaluate the Platform

A great curriculum on a terrible platform is a frustrating experience. We evaluate:

- Video and audio quality
- Mobile responsiveness
- Lesson navigation and structure
- Practice tools (pitch trainers, recording features, ear training)
- Community features
- Customer support

Step 5: Scoring Criteria

Every course is scored on six factors, each weighted by importance:

1. Curriculum Quality (25%) - How comprehensive and well-structured are the lessons?
2. Value for Money (25%) - Does the price match what's delivered?
3. Production Value (15%) - Are the videos, audio, and platform polished?
4. Beginner Accessibility (15%) - Can someone with no experience follow along?
5. Instructor Credentials (10%) - Does the teacher have verifiable experience and qualifications?
6. Real Results (10%) - Did our testers measurably improve over the testing period?

Each factor is scored from 1 to 5, and the weighted average, rounded to one decimal place, becomes our overall rating. A course scoring 4.8/5 represents excellence across nearly all criteria.
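As a concrete sketch of the math (the factor scores below are hypothetical, not taken from any real review):

```python
# Weights from the six scoring factors above; they sum to 1.0.
WEIGHTS = {
    "curriculum_quality": 0.25,
    "value_for_money": 0.25,
    "production_value": 0.15,
    "beginner_accessibility": 0.15,
    "instructor_credentials": 0.10,
    "real_results": 0.10,
}

def overall_rating(scores: dict) -> float:
    """Weighted average of the six 1-5 factor scores, rounded to one decimal."""
    total = sum(WEIGHTS[factor] * score for factor, score in scores.items())
    return round(total, 1)

# Hypothetical example: strong across the board, slightly weaker production
# and instructor-credential scores.
example = {
    "curriculum_quality": 5,
    "value_for_money": 5,
    "production_value": 4,
    "beginner_accessibility": 5,
    "instructor_credentials": 4,
    "real_results": 5,
}
print(overall_rating(example))  # → 4.8
```

Because the two heaviest factors (curriculum and value) carry half the weight, a course can't reach the top of our rankings on production polish alone.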

Step 6: Gather Community Feedback

We don't just trust our own testing. We also gather feedback from singers in our community, asking:

- Have you taken this course?
- What did you think?
- Did you experience anything we didn't?
- What would you score it?

This crowd-sourced feedback either validates our findings or pushes us to test more thoroughly. Several of our published reviews have been delayed because we needed to investigate further.

Step 7: Write the Review

Only after all of the above do we write the actual review. Our reviews include:

- An honest verdict
- What we loved (the genuine pros)
- What we didn't love (the real cons)
- Who the course is best for (and who should skip it)
- A direct comparison to alternatives in the same price range

How We Test Gear

Microphones, headphones, audio interfaces, and accessories follow a similar but adapted process:

Sourcing the Equipment

We purchase most gear ourselves. When manufacturers send review samples, we always disclose this in the relevant review. Sample status never affects our scoring, and we send all samples back after testing unless explicitly told to keep them.

Real-World Recording Sessions

We don't just plug in a mic and record one note. Our gear gets tested in actual recording sessions:

- Multiple vocal genres (pop, rock, R&B, classical, folk)
- Different voice types (tenor, baritone, alto, soprano)
- Different recording environments (treated room, untreated bedroom, live setting)
- Side-by-side comparisons with reference equipment
- Long-duration testing (4+ hours of continuous use for headphones, multiple sessions for mics)

Objective Specifications

We verify manufacturer specifications independently. If a microphone claims a frequency response of 20Hz-20kHz, we test it. If headphones claim a certain impedance, we measure it. Marketing copy and reality don't always match.

Subjective Quality Assessment

Some things can't be measured by spec sheets. We evaluate:

- How natural the sound feels for a singer
- How comfortable the gear is for extended use
- Build quality and durability
- Value compared to similar-priced alternatives

Long-Term Reliability

When possible, we report on how gear holds up over months or years of use. Our reviews are updated when products fail prematurely or prove more durable than expected.

Our Editorial Standards

Independence

No company can pay us for a positive review. Our rankings are determined by testing results, not affiliate commission rates.

Disclosure

Every affiliate relationship is disclosed.

Updates

We update reviews quarterly to reflect price changes, course updates, new features, and shifting recommendations.

Corrections

When we get something wrong, we fix it publicly and note the correction.

No Sponsored Content

We don't accept paid product placements, sponsored articles, or "guest posts" from companies trying to promote products.

Honest Cons

Every review includes drawbacks. If a product has serious flaws, we say so.

When We Update Reviews

A review published in 2020 might be wildly out of date by 2026. Prices change. Courses get new modules. Companies launch updated versions. We keep our content current through scheduled and triggered updates.

Quarterly reviews: Every 90 days, we audit our top 20 most-trafficked review pages. We verify pricing, check for product updates, refresh screenshots if needed, and update the published date.

Triggered updates: When something significant happens (a course launches a new version, a product gets discontinued, a major price change occurs), we update the affected reviews immediately.

Annual deep refresh: Once a year, we take our highest-traffic articles through a complete refresh. This includes re-testing the products if they've been substantially updated, gathering fresh community feedback, and rewriting sections that have aged poorly.

What You Can Do

Trust is a two-way street. Here's how you can help us maintain the high standards we're committed to:

Let us know when we're wrong. If you bought a course based on our recommendation and your experience differed wildly from our review, email us. We'll investigate, retest if necessary, and update the review.

Share your experience. Real-world feedback from singers using these products makes our reviews stronger. We read every email and feedback form.

Suggest products to review. Is there a course or piece of gear we haven't covered? Let us know. We prioritize testing based partly on community demand.

Read our Affiliate Disclosure for full transparency about our relationships with vendors.

We can't promise to test every product or implement every suggestion. But we read everything, and your input genuinely shapes our editorial decisions.

Questions About Our Process

If you have specific questions about our review methodology, our editorial policies, or how we evaluate a particular product category, we'd love to hear from you.

Reach us through our Contact page with "Editorial Question" in the subject line. We typically respond within 2-3 business days.

Otherwise, browse our latest reviews and guides to see this methodology in action:

- Best Online Singing Courses - our flagship comparison article.
- Best Microphones for Singers - example of a gear roundup.
- Singorama 2.0 Review or Singing Carrots Review - examples of in-depth course reviews.