You're comparing two products on Amazon. One has 4.5 stars with 12,000 reviews. The other has 4.3 stars with 800 reviews. You go with the 4.5. Obvious choice, right?
Not necessarily.
The rating system is broken
Amazon's star rating is a single aggregate number. (Amazon says it weights reviews by factors like recency and verified-purchase status rather than taking a plain average, but the result is still only as good as the pool of reviews feeding it.) That sounds fair until you realize what goes into that pool.
Fake reviews inflate ratings. The FTC has been cracking down, but the fake review industry is massive. Sellers pay for five-star reviews, offer free products in exchange for ratings, or use review farms to boost their numbers. A 2021 study estimated that about 42% of Amazon reviews were fake. The number hasn't gotten better.
Review manipulation is sophisticated. Some sellers use "insert cards" in packaging asking for five-star reviews in exchange for gift cards. Others systematically report negative reviews to get them removed. Some create variations of the same product listing to merge review pools from different items.
Early reviews skew everything. A product's first 20 reviews have an outsized impact on its trajectory. Sellers know this and focus their manipulation efforts on the launch period. By the time real reviews come in, the rating is already anchored high.
Most real buyers don't review. The people who leave reviews are disproportionately very happy or very angry. The large middle — people who think the product is fine but not exceptional — rarely bother. This creates a bimodal distribution that the star average hides.
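To make the "average hides the shape" point concrete, here's a sketch with invented counts: two completely different review distributions that round to the same star average. One is a solid product with a few duds; the other is polarized, with roughly one buyer in five giving it one star.

```python
# Illustrative only: two very different review populations can
# produce the same headline star average.

def star_average(counts):
    """counts: dict mapping star value (1-5) to number of reviews."""
    total = sum(counts.values())
    return sum(stars * n for stars, n in counts.items()) / total

# Most buyers genuinely like it; a small number got defective units.
solid = {5: 500, 4: 300, 3: 100, 2: 50, 1: 50}

# Polarized: lovers and haters, almost nobody in between.
bimodal = {5: 700, 1: 189}

print(round(star_average(solid), 2))    # 4.15
print(round(star_average(bimodal), 2))  # 4.15

# But in the second case, about 21% of reviewers hated it.
print(round(bimodal[1] / sum(bimodal.values()), 2))  # 0.21
```

Both products show the same number on the listing page. Only reading the distribution (or the reviews themselves) tells you which one you're buying.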
What 4.5 stars actually means
A 4.5-star product on Amazon could be:
- A genuinely excellent product that most people love
- A mediocre product with hundreds of fake five-star reviews pulling the average up
- A good product with a vocal minority of one-star reviews from people who received defective units
- A product that was great two years ago, before the manufacturer quietly changed materials
The star rating doesn't tell you which one you're looking at.
The reviews that actually matter
If you're going to use Amazon reviews, ignore the star rating entirely. Instead:
Read the three-star reviews. These are written by people who have mixed feelings. They liked some things and didn't like others. They're specific. They're usually honest. They'll tell you the real tradeoffs.
Sort by recent. Product quality changes over time. A product that was excellent in 2024 might have switched manufacturers or cut costs since then. Recent reviews reflect what you'll actually receive.
Look for verified purchases. Amazon marks reviews from people who actually bought the product through their platform. It's not foolproof, but it filters out some of the noise.
Search for your specific concern. If you care about durability, search the reviews for "broke" or "months." If noise matters, search for "loud" or "quiet." Amazon's review search is actually useful for this.
Check the review velocity. If a product suddenly gets 200 five-star reviews in a week after months of steady 3-4 reviews per week, something happened. Usually that something is money.
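You can eyeball this from the review dates, but the check itself is simple. Here's a crude sketch with made-up weekly counts (a real version would scrape dates from the review pages; the `factor=10` threshold is an arbitrary assumption, not anything Amazon publishes):

```python
# Crude review-velocity check: flag the latest week if it dwarfs
# the recent baseline. Counts below are invented for illustration.

def looks_like_a_spike(weekly_counts, factor=10):
    """Return True if the latest week's review count exceeds
    `factor` times the average of the preceding weeks."""
    *baseline, latest = weekly_counts
    typical = sum(baseline) / len(baseline)
    return latest > factor * max(typical, 1)

steady = [3, 4, 3, 5, 4, 3]        # the normal trickle
suspicious = [3, 4, 3, 5, 4, 200]  # 200 reviews out of nowhere

print(looks_like_a_spike(steady))      # False
print(looks_like_a_spike(suspicious))  # True
```

The exact threshold doesn't matter much; a genuine 50x jump in weekly reviews with no sale, no press coverage, and no seasonal reason is the tell.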
The bigger picture
Amazon's rating system was designed for a simpler time. It assumes good faith from reviewers and sellers. That assumption no longer holds.
This doesn't mean Amazon reviews are useless. Real owners share real experiences there every day. But the aggregate rating — that single number everyone uses to make decisions — is unreliable.
The solution isn't to stop reading reviews. It's to stop relying on any single source. When a product is recommended by Amazon reviewers, Reddit communities, and professional review sites independently, that convergence is meaningful. No amount of fake reviews can manufacture consensus across multiple unrelated platforms.
The star rating tells you what Amazon's algorithm calculated. The consensus across independent sources tells you what real people actually think. There's often a meaningful difference.
