
April 14, 2026 · 4 min read · rabbit.reviews

Fake Reviews Are Everywhere. Here's How We Filter Them Out.

In 2019, BrightLocal reported that consumer trust in online reviews dropped from 89% to 81% in a single year. Among 18- to 34-year-olds, the decline was even steeper. That was before the fake review industry really took off.

Today, estimates suggest 30-40% of reviews on major e-commerce sites are fake — paid placements, bot-generated five-star floods, or competitors leaving one-star sabotage. The FTC has started cracking down, but the problem is growing faster than enforcement.

If you've ever bought something based on glowing Amazon reviews and been disappointed, you already know the feeling.

The problem isn't just fake reviews

Even legitimate reviews have issues:

Anonymous user reviews are emotionally driven. Someone has a bad delivery experience and leaves a one-star review on a great product. Someone else is in the honeymoon phase with a new purchase and writes a glowing review before they've lived with it for a month. Neither tells you much about whether you should buy it.

Expert reviews have their own blind spots. Review units are provided by brands. Ad revenue comes from the companies being reviewed. An expert might spend a week with a product in a testing lab — thorough, but different from six months of daily use in a real kitchen, living room, or gym.

Influencer reviews are often sponsored. Even when disclosed, the financial relationship changes the incentive. You're watching an ad, not getting advice.

No single source of reviews can be fully trusted on its own.

What if you could check all of them at once?

That's the idea behind rabbit.reviews.

When you search for something, we don't show you one reviewer's opinion. We cross-reference dozens of sources — Reddit threads, expert review sites like Wirecutter and RTINGS, community forums, user review aggregators — and find where they agree.

A single fake review doesn't matter when you're looking at the pattern across hundreds of real opinions. A biased expert pick gets corrected when the community overwhelmingly recommends something different. An emotional one-star review disappears into the noise when thousands of owners say otherwise.

The signal isn't in any single review. It's in the consensus.
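
To make that concrete, here's a simplified sketch of what consensus scoring could look like. The names, the Opinion shape, and the scoring formula are illustrative assumptions for this post, not our actual pipeline.

interface Opinion {
  productId: string;
  source: string;    // e.g. "reddit", "wirecutter", "forum"
  sentiment: number; // -1 (hated it) .. +1 (loved it)
}

// Score each product by average sentiment, boosted by how many
// independent sources agree. A lone fake five-star or spiteful
// one-star barely moves the average once hundreds of real
// opinions are in the pool.
function consensusScore(opinions: Opinion[]): Map<string, number> {
  const byProduct = new Map<string, Opinion[]>();
  for (const op of opinions) {
    const group = byProduct.get(op.productId) ?? [];
    group.push(op);
    byProduct.set(op.productId, group);
  }

  const scores = new Map<string, number>();
  for (const [productId, group] of byProduct) {
    const sources = new Set(group.map((o) => o.source)).size;
    const avg = group.reduce((s, o) => s + o.sentiment, 0) / group.length;
    // Agreement across independent sources counts for more than
    // volume from any single one.
    scores.set(productId, avg * Math.log1p(sources));
  }
  return scores;
}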

Why we weight community sources more heavily

We give more weight to community sources like Reddit, forums, and user review platforms than to expert review sites. Not because experts are wrong (their technical testing is genuinely valuable), but for a few reasons, with a rough sketch of the weighting after the list:

  • Real owners spent their own money. They have no incentive to lie about a product they chose and paid for.
  • Real owners have lived with the product. Six months of daily use reveals things a one-week test can't — durability, quirks, long-term satisfaction.
  • Communities self-moderate. Reddit downvotes bad advice. Forum regulars call out shills. The crowd is harder to manipulate than a single review.
  • Volume is its own filter. When thousands of people independently recommend the same product, it's hard to fake that signal.

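Here's roughly how that weighting might look, reusing the Opinion shape from the sketch above. The weight values are made up for illustration; treat the whole thing as a sketch of the idea, not our real tuning.

// Hypothetical weights, for illustration only.
const SOURCE_WEIGHTS: Record<string, number> = {
  reddit: 1.5,      // real owners, self-moderating communities
  forum: 1.5,
  userReviews: 1.2,
  expert: 1.0,      // valuable testing, but short exposure to the product
};

// Weighted average sentiment per product: community opinions
// pull the score harder than expert picks.
function weightedConsensus(opinions: Opinion[]): Map<string, number> {
  const totals = new Map<string, { sum: number; weight: number }>();
  for (const op of opinions) {
    const w = SOURCE_WEIGHTS[op.source] ?? 1.0;
    const t = totals.get(op.productId) ?? { sum: 0, weight: 0 };
    t.sum += w * op.sentiment;
    t.weight += w;
    totals.set(op.productId, t);
  }

  const result = new Map<string, number>();
  for (const [productId, t] of totals) {
    result.set(productId, t.sum / t.weight);
  }
  return result;
}
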
When Reddit unanimously loves a product that an expert site ranked third, we take that seriously. The crowd isn't always right — but when it speaks with one voice, it usually is.

What we don't do

We don't write our own reviews. We don't test products in a lab. We don't accept review units or sponsored placements.

Our rankings are generated by AI that has no idea which products have affiliate links. It reads the sources, counts the consensus, and ranks accordingly. The same search returns the same results whether a product is monetized or not.
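
One way to picture that separation (again, a sketch with made-up names, not our codebase): the ranking step simply never sees monetization data.

// The ranking input deliberately carries no affiliate or
// commission fields, so monetization can't influence the order.
interface RankedCandidate {
  productId: string;
  consensus: number; // e.g. output of the sketches above
}

function rank(candidates: RankedCandidate[]): string[] {
  return [...candidates]
    .sort((a, b) => b.consensus - a.consensus)
    .map((c) => c.productId);
}

// Affiliate links, where they exist, get attached in a separate
// layer only after this ordering is final.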

We're not trying to replace Wirecutter or Reddit. We're trying to save you the three hours it takes to read both and figure out where they agree.

Trust in reviews is broken. We think the fix isn't better individual reviews — it's better synthesis of all of them. That's what we're building.

Try it at rabbit.reviews.