I'm Tommy, I built ReviewMeta - a site that detects "fake" reviews on Amazon. AMA!

Hello Reddit, I'm Tommy Noonan. In 2015, I spent an entire day reading ALL 580 reviews for a product on Amazon. To my surprise, many reviewers admitted they had not used the product, or they got one for free, but still left 5 stars. I noticed dozens of other extremely suspicious patterns after spending the day analyzing the data.

The gears in my head started turning and I realized I could write a computer program to scrape all the reviews and perform a deep analysis in seconds rather than spending all day doing it manually. I could then point it at ANY product on Amazon and generate the same report. This is when the idea for ReviewMeta was conceived.

I launched ReviewMeta in 2016 - you may remember our video hitting the front page of /r/all - the site got the Reddit Hug-o-Death. (Oh, and 3 weeks after the video, Amazon changed their TOS and banned incentivized reviews.)

Or you may have listened to NPR's Planet Money podcast titled "The Fake Review Hunter" (that's me!)


You can use ReviewMeta by copying and pasting any Amazon product URL into the search bar at (Example report:

I'll be answering your questions about fake review detection, review hijacking and other scams from 9:30am to noon (Eastern Time), but will likely stick around and answer some more Q's if they're still trickling in.


Edit: Answering questions as fast as I can! I apologize in advance: many of the answers might have typos, might not be proofread, or might pull info from the "top of my head" (because I don't have time to run queries or look up info).

Edit #2: Wow, the time has flown by! I've been answering every new question for a few hours, but need to slow down. I'll be scanning through the top unanswered questions, but might not be able to get to every last one.

Edit #3: I'm going to focus on some other things for the moment, but will be casually responding to anything interesting/highly upvoted the rest of the afternoon. Thanks for the great questions Reddit!

Interview date: November 6th, 2019

I just tested a handful of products on ReviewMeta. On one of the products I tested, it said that it removed 28% of the reviews (because it thought they might be fake/un-natural reviews) but the rating stayed the same. The Amazon rating was 4.7 and the adjusted rating was still 4.7. So it appears that the adjusted ReviewMeta rating isn't getting dinged because of the fake reviews. What's going on with products like this? Are there instances where ReviewMeta's adjusted rating is actually BETTER than the Amazon rating? Thanks.


Great question!

Yes, this happens A LOT. I was surprised by it at first, but there are two reasons why this happens:

First: the Average Rating that is given by Amazon is already a weighted average. So it could be the case that the average rating is already adjusted by Amazon to begin with. Here's their official language on it:

Amazon calculates a product's star rating using a machine-learned model instead of a raw data average. The machine-learned model takes into account factors including: the age of a review, helpfulness votes by customers, and whether the reviews are from verified purchases.

Pro tip: if you click the "Show rating distribution" on the RM report, you can see the change in raw star-rating distribution.
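Amazon's actual model and coefficients aren't public, but the "weighted average" idea can be sketched roughly like this. All of the weights below (the verified boost, age decay, and helpfulness bump) are invented purely for illustration:

```python
# Hypothetical sketch of a weighted star average. Amazon's real model is a
# machine-learned black box; these weights are made up to show the concept.
def weighted_rating(reviews):
    total, weight_sum = 0.0, 0.0
    for r in reviews:
        w = 1.0
        if r["verified"]:
            w *= 1.5                                  # assumed verified-purchase boost
        w *= 1.0 / (1.0 + r["age_days"] / 365)        # assumed decay for older reviews
        w *= 1.0 + 0.1 * r["helpful_votes"]           # assumed helpfulness bump
        total += r["stars"] * w
        weight_sum += w
    return round(total / weight_sum, 1)

reviews = [
    {"stars": 5, "verified": True,  "age_days": 30,  "helpful_votes": 4},
    {"stars": 1, "verified": False, "age_days": 700, "helpful_votes": 0},
]
# The raw average here is 3.0, but the recent, verified, upvoted 5-star
# review dominates the weighted result.
print(weighted_rating(reviews))
```

This is why the displayed star rating and the raw distribution can disagree: the weighting has already discounted some reviews before you ever see the number.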

Second: Just because a product has fake reviews does not mean the product is garbage. The Amazon marketplace can be extremely competitive and difficult to break into, so sometimes sellers resort to "seeding" their new products with fake reviews until the honest ones can come in and take over. Sellers know that propping a garbage product up with fake reviews isn't a long-term strategy, because eventually the honest reviews will drown them out. They know the only feasible long-term strategy is to create a quality product that will continue to get positive reviews on its own.

That said, there are still sellers who are NOT in it for the long term and are just looking for that quick scam. So sometimes you will see the adjusted rating drop significantly. Check out to see some examples of when the adjusted rating drops.


Hello Tommy: how does ReviewMeta differentiate itself from FakeSpot? I use that site regularly for Amazon purchases and it serves me well.

I subscribe to Planet Money, but I haven’t heard your podcast yet. If I had, I might have already checked out your site. Thanks and good luck.


There are a few key differences - the main one is transparency. Fakespot shares practically nothing about their analysis. I discovered them while I was still building ReviewMeta, and decided that I wanted ReviewMeta to show everything I possibly could about our analysis.

Both sites are estimates - and that's why I think it's important for visitors to sorta check over our work and make sure that we're on the right track.

We've got a bunch of extra tools and tricks on our site that the common visitor might be unaware of:

Here's an article I wrote a few years back going into more detail about the differences between the two sites:


What would be a better dirty play for an ecommerce seller: increase its own rating with 5 star fake reviews or sabotage its competitors' rating with 1 star fake reviews?


Ah, glad you asked. I always tell people to imagine you are an Amazon seller (with a lack of morals) and you have a budget to "buy" 100 fake reviews on Amazon. What's the better investment? Boosting your own product with 100 5-star reviews, or giving each of your (dozen?) competitors a few 1-star reviews? Keep in mind that every one of those competitors has a massive incentive to challenge those 1-star reviews and will likely complain to Amazon until they get removed. They won't have the same incentive with your 100 fake 5-star reviews.

That's not to say that fake 1-star reviews don't exist. There are a lot of niche categories on Amazon with only a few sellers, or where two top dogs in the field are duking it out. It definitely happens, but I think it's much less common than you'd imagine.

The more common cause for bogus 1-star reviews (in my opinion) are review brigades. You see it happen a lot to political books - a bunch of people who disagree with the person flood Amazon to bash the book without even reading it. Here's a bit more on brigades:


What algorithms do you use to accomplish this? Do you detect suspicious patterns in the text of a review, or does the algorithm detect suspicious behaviors of users? Both?


Yes and yes. We have 15 tests that look for "unnatural" patterns in the review data. We look at things like the dates that the reviews were posted, whether they are from verified or unverified purchasers, if all the reviews are repeating the same language, if the reviewers have similar reviewing habits, etc.

We then look to see if the ratings are vastly different between the reviews we think are "unnatural" and those that are not. So, for example, if a product gets 50% of its reviews in one day, and those reviews give it 5 stars on average, but all other reviews give it 4.3 stars on average, we know something's up. This helps us throttle back our suspicion-meter in case there's a natural spike in reviews (eg. could be a holiday item).

We show our work in as much detail as possible on every report page. It's a lot of information at first, but if you're a data nerd like myself, it's fun to go through it. Here's an example report to look at:
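The spike example above can be sketched in a few lines of Python. This is a toy illustration only - the 50% threshold and the date-based flagging are stand-ins for ReviewMeta's real 15 tests:

```python
# Toy version of the "compare suspect vs. trusted reviews" idea: flag a
# one-day spike of reviews, then check whether the flagged group's average
# rating diverges from everyone else's. Thresholds here are invented.
from collections import Counter

def spike_report(reviews):
    # reviews: list of (date_string, stars)
    by_day = Counter(day for day, _ in reviews)
    spike_day, spike_count = by_day.most_common(1)[0]
    if spike_count < len(reviews) * 0.5:
        return None  # no single day holds half the reviews -> no spike flagged
    flagged = [s for d, s in reviews if d == spike_day]
    rest = [s for d, s in reviews if d != spike_day]
    if not rest:
        return None
    avg = lambda xs: sum(xs) / len(xs)
    return {"spike_day": spike_day,
            "flagged_avg": round(avg(flagged), 1),
            "rest_avg": round(avg(rest), 1)}

# 50% of the reviews land on one day at 5 stars; the rest average 4.3.
reviews = ([("2019-11-01", 5)] * 10
           + [("2019-10-0%d" % d, 4) for d in range(1, 8)]
           + [("2019-09-15", 5)] * 3)
print(spike_report(reviews))
```

A large gap between `flagged_avg` and `rest_avg` is the "something's up" signal; if the two averages match, the spike may just be natural (e.g. a holiday item).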


Any advice on dealing with a company that's astroturfing reviews on Google? A company I unfortunately have to deal with in my personal life has sent out mass emails saying that if you write a 5-star review on Google, you're entered into a raffle to win prizes amounting to hundreds of dollars in cash. I've reported them to Google via telephone, but I'm not sure what else to do.

They're currently sitting at a 3.6-star cumulative rating, with most reviews from the past two months being 5 stars. They're a terrible company, and I hate standing back and watching this.


Man, that sounds frustrating. My friend Jason Brown over at specializes in Google reviews, so maybe he can help you.


Have you ever noticed that some products listed on Amazon have a high review volume and rating, only to scroll down and see that the reviews given are for a completely different and unrelated product?

I'm wondering how these 3rd party sellers are able to manipulate their product listing with popular products to make theirs seem popular and highly purchased. I've witnessed about 4 different cases of this on various items.


Yup. This is called Review Hijacking, and we display a warning when it's detected:


Often when something becomes too popular, it is UTTERLY RUINED (haha) by its own popularity. See ;) Or

The ReviewMeta site itself is of course shielded because it has no user-generated content, but what do you think will happen to the way unscrupulous sellers try to game the system?


Ha! I feel like people literally say that about EVERYTHING. "This [city/country/game/company/brand/website] has gone downhill. It used to be cool 10 years ago!"

When I was building ReviewMeta, I thought about what would happen if sellers tried to "game" ReviewMeta (since we are so transparent about our algorithm). The answer is that yes, it would be possible, but it would take a lot more coordination and planning of the fake reviews to make sure they fly under the radar of all the tests.

Also, consider that probably less than 1% of Amazon shoppers use ReviewMeta (which is still A LOT of traffic). Is it still worth it for sellers to "game" ReviewMeta?

PS - For those of you that don't know about SupplementReviews - that was a site I started and ran for about 12 years. I think it was going well until I stepped away last year, and then the new owner basically killed it. I also think that whole industry (fitness supplements) is just toxic.


What unlikely products have fake reviews?


Hmm, a few thoughts come to mind.

First, I would say that pretty much every "natural" mosquito repeller product I've seen is propped up entirely by fake reviews. Mainly because they simply don't work.

Second, I've noticed that a lot of self-published e-books are also completely full of fake reviews. It's a category that makes sense in retrospect, but I guess I didn't expect it at first.


Let's do a little wet dreaming this morning...

  1. How could you improve ReviewMeta if you had access to ALL the data Amazon has? I mean literally everything, including exact time reviews are submitted, full user profile info, full purchase history, even IP addresses.
  2. What are some data points that would be especially useful in spotting fake/unnatural reviews?


Oh man, if I just had access to ALL the data from Amazon. We could do things like have a live meter showing the current rate of reviews flowing in, what % of those reviews look unnatural, which products received the most reviews today, which received the most unnatural reviews today, etc.

I think that the IP addresses, shipping addresses, credit cards and event dates (when the product was ordered, shipped, received, reviewed) would help me come up with additional tests that would strengthen the analysis.


What will it take to clean up Amazon and the fake review eco-system?


Transparency! I think Amazon (and all the large review platforms) need to dramatically increase their transparency. I'm actually working on an argument for why Amazon (and all major review platforms) should be required to open up their review data to the public. Will be posting that next week.

Last year, I mocked up what I think Amazon reviews should look like:


Is there any way to get the results of your analysis while I browse Amazon, or do I have to read the full report on the website each time?


We have browser extensions for Chrome, Edge and Firefox:

These will show you the adjusted rating and PASS/FAIL/WARN color in the extension icon itself. I'm planning on overhauling the extension in the coming months to show you some of the RM details directly on the Amazon page itself!


Hi Tommy. Thanks for your work!

Which product category has the most fake reviews on Amazon?


I get asked this a lot, and my usual answer is "cheap electronic gizmos", but really anything that gets a lot of search volume, is easy to manufacture and has high margins - so the most competitive categories on Amazon. I came from a background in fitness supplement reviews, and must admit that the entire industry is what made me skeptical of reviews in the first place.


How much does it cost to maintain a website of ReviewMeta's size? Do you rely on it to make a living?


It's thousands a month to run, but we sell advertising, and now the traffic is to the point where it's finally covering all the costs and putting a little extra in my pocket on top. I had a fair amount of success with other projects online before ReviewMeta, and made some good (aka lucky) investments, so I'm not really relying on ReviewMeta's income to pay the bills. It didn't start out as a way to "get rich", more just a project I thought would be fun to hack together.


What would you most like to tell us that no one ever asks about?


People can be MEAN! Before I updated the TOS and made it clear that we can publish harassment, I would get a few sellers a month sending us very hateful and threatening messages. I even had one guy call my parents' house. I have no idea how he got that phone number, but it was kinda creepy.


This might fall outside of the scope of your project, but one thing I find particularly annoying is when Amazon seems to arbitrarily combine different products/versions/sizes/etc. into a single ASIN. Many of those reviews then seem to no longer apply, or only apply to a specific marketplace seller who is no longer listed, or have other issues that make them as worthless as fake reviews. Have you looked into this problem at all?

What about the "Amazon's Choice" badge? This isn't a review, but it always seems completely fake and meaningless, trying to mislead people into thinking a product has actually been reviewed personally by Amazon.


The arbitrary grouping of products (called Review Hijacking) isn't done by Amazon, but by sellers. We have a warning in place when it's detected:

I also agree about the "Amazon's Choice" badge. Congress recently sent Jeff Bezos a letter asking about the badge. We did a blog post about it a few months ago:


What is the average amount (ballpark) of fake reviews you've found on Amazon since you started? Is it bad?


We estimate anywhere from about 7% to 11%, but our data might be slightly skewed because we only analyze products that our visitors check on ReviewMeta. That said, we still have hundreds of millions of reviews in our DB, so we shouldn't be too far off.

Amazon's PR team always says "less than 1%", to which my response is always:

  1. If you know it's 1%, why not delete them?
  2. With 1 billion reviews on Amazon, 1% fake still means 10 MILLION fake reviews on their own platform, by their own admission.

Why does Amazon let this happen? It seems pretty obvious when something is fake. Typically the sentences are all disjointed, with bad wording and grammar. "Is great flashlight. Use with waking dog bright! Great value!"


I think Amazon is doing more behind the scenes than people realize, though I agree that there are often times when I see things and just have to shake my head in disbelief.

However, I have to mention that not all Amazon shoppers are native English speakers. Just because someone doesn't sound like English is their first language does NOT mean the review is fake. Yes, it can be a red flag if EVERY review for a product reads like this, but a single review with grammar issues doesn't immediately mean it's fake.


Hi! Thank you so much for your work, I love it! I work on a data science and research engineering team. We have a weekly journal club where we read an academic paper and discuss it as a team. I was thinking it would be fun to read something related to your work (e.g. algorithms or detection methods for fake reviews) - can you recommend anything?


Thanks for the kind words!

I don't know if anything I've written meets the standard of an "academic paper" for the journal club. There's a lot of stuff on the blog, but I think the top three posts that would spark a good debate would be:


Why are businesses allowed to have reviews for a completely different product on a product page? Oftentimes this boosts the rating, and I see it as deceptive.


Ah yes, Review Hijacking. They aren't "allowed" to, but they do it anyway. It has to do with the open marketplace nature of Amazon. It's been well over a year since we first reported on this and I can't believe Amazon hasn't fixed it yet:


I think we can all agree that Amazon could do A LOT more to filter out worthless reviews, especially since they have access to much more internal data. Do you suspect that Amazon simply doesn't care THAT much about unnatural reviews, or rather that their calculations show that cracking down harder would affect sales too much?


They are definitely doing something. I think they do a lot more than the average person realizes, because the average person doesn't get to see what's going on behind the scenes.

Our data shows that they've deleted millions of reviews, and we don't even get to see how many reviews never saw the light of day!

That said, there are times when you look at the reviews and you just have to smack yourself on the forehead because you have no idea how they made it through their system.


Hi Tommy, thanks for your site, I use it a lot.

How often have you had to adjust your algorithms since you originally went live? Have you had to make additional changes as scammers became savvy to how your site works?


Thanks! Glad you are using it!

I've had to tweak some things here and there and added several features (including one that helps detect review hijacking), but nothing too major. Here are a few changes we've made:

As far as scammers "gaming" RM, here's an answer


I recently bought an item that came with a card asking for me to leave a good review on their product in return for a free Headphone stand. I then emailed my review and order number to an outlook email and received the item.

How would your algorithm detect this if my review makes no mention of being compensated (which is against TOS and will get you banned from leaving further reviews)?


It might not detect this. What we do is an estimate, and it would be impossible for anyone to catch every single inauthentic review.

However, if everyone that reviewed their headphones ALSO reviewed their headphone stand, that would certainly get flagged in our system. There's a lot of other ways your review might get flagged, but it could definitely slip through the cracks.


Hi Tommy. ReviewMeta looks really helpful to people who shop on Amazon, which is pretty much everyone. Great work!

My question is... Is it necessary to read through the entire review analysis provided on ReviewMeta? Or can I basically just look at the "adjusted rating" at the top of the page?



I think that everyone uses ReviewMeta differently. I'll share with you how I personally use it when I shop on Amazon:

First, I check the adjusted rating and analysis result (PASS/FAIL/WARN). If it's a PASS with 0% of reviews removed, I pretty much stop using ReviewMeta for that product.

If some reviews were removed, I scroll through the tests and see which ones failed, trying to get a sense of whether or not I want to gamble on the product.

There are some categories where pretty much EVERY PRODUCT gets a fail. I was looking to buy a projector (a cheap one, under $200), and pretty much every product I looked at failed. So in that instance I was looking to see which ones failed worse.


Hey Tommy. Your website is great - very helpful in theory, but it can be misleading in practice. I'm the author of a self-published book on Amazon and have 73 5-star reviews that admittedly surpassed my greatest expectations in their positivity. But my book's reviews completely fail your tests, even though not one review was solicited, nor one reviewer known by me.

As a result, my reviews are measured as fake (70% removed) even though they're all genuine. Is there anything I can do about this? Or what precautions can you take to avoid "penalising" true reviews?


One thing that's important to keep in mind is that not everyone has the same notion of what constitutes a "fake" review. We don't actually use the term "fake" - we use the term "unnatural", which encompasses much more than just fakes.

So things that may be standard industry practice for self-published authors might not be acceptable on ReviewMeta - for example, giving away free copies in exchange for a review, or reviewing friends'/fellow writers' books.


I was enjoying the free products 'in exchange for an unbiased review', which was obviously just all 5-star reviews, since you would not get much free stuff if you trashed everything. So it's good that this practice came to an end. However, as most people know, companies now contact users off Amazon, have the user buy the product, and then reimburse them once the review is posted. Obviously they get a lot of 5-star reviews, because the company would likely stop responding if someone posted a 2-star review and then asked for their money.

I noticed that a lot of the reviews that seem 'fake' now, at least to me, are the ones where the user posts some photos and writes a long, descriptive, overly positive review. I think these are related to the payback scheme outlined above. The real ones are generally less professional-looking and more anecdotal. However, there are some people who go all out and are still legitimate, including photos and long-winded descriptions. What are your thoughts on this, and is there a way to determine which are legit and which are bogus among those lengthy reviews?


I've seen similar patterns as well. I think you can tell which products have the paid reviewers because EVERY SINGLE REVIEW is like the ones you described above.

I've answered a similar question about detecting the underground Facebook incentivized-review groups:


How much knowledge do you need to create a website such as yours?


I had spent about 10 years developing websites on my own prior to launching ReviewMeta, but I would by no means consider myself an expert in the field.

One thing I always tell people is that I didn't study Computer Science - I studied Construction Management. It has almost nothing to do with what I'm doing now, but that fact hasn't held me back at all. There are a gazillion tutorials online to learn any programming language for free, so if you want to start a project, I encourage you to just dive in and start playing around with things.

That said, ReviewMeta took way longer and was much harder to build than I originally thought. I came up with the idea in maybe October 2015, didn't launch until maybe May 2016, and didn't even work out most of the bugs until 2017. It was a LONG process. I had a few contractors help with some of the stuff, but I did most of the coding myself.

Edit: wrong year


I see that ReviewMeta uses an "Adjusted Rating" system, rather than assigning a letter grade (A - F) like FakeSpot does. Why did you decide to use the system you did?


I think that the most important thing to know is what the honest reviews are saying - not whether there are fake reviews present, or some completely arbitrary letter grade. We calculate the adjusted rating based ONLY on the existing reviews - we never "punish" a product for having "fake" reviews. So, for example, if a product ONLY has 5-star reviews, the adjusted rating is either going to be 5 stars or "insufficient reviews".
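The "no punishment" idea described above can be sketched in a few lines. The field names and the minimum-review cutoff here are assumptions for illustration, not ReviewMeta's actual code:

```python
# Minimal sketch of the "adjusted rating" idea: average ONLY the reviews
# that weren't flagged as unnatural, and never apply a penalty on top.
def adjusted_rating(reviews, min_reviews=3):
    kept = [r["stars"] for r in reviews if not r["flagged"]]
    if len(kept) < min_reviews:
        return "insufficient reviews"
    return round(sum(kept) / len(kept), 1)

reviews = [{"stars": 5, "flagged": True}] * 4 + \
          [{"stars": 5, "flagged": False}] * 3 + \
          [{"stars": 4, "flagged": False}] * 3
# Only the 6 unflagged reviews (three 5s and three 4s) count here.
print(adjusted_rating(reviews))
```

Note that a product whose unflagged reviews are all 5-star would still come out at 5.0, no matter how many flagged reviews were removed, which is exactly why the adjusted rating can match the Amazon rating even after removals.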


Hi Tommy,

You seem to use medians/averages to detect outliers. Considering the millions of products available on Amazon, a whole lot of them are never bought and even more are never reviewed.

How does your algorithm proceed when it's the first review on a product?


That's a great question. With one review it's almost impossible to run the analysis. There's actually a disclaimer that shows up on our report if you analyze a product with just one review. It says:

With only 1 review for this product, it's difficult for us to analyze review trends. We'll try our best, but keep in mind that there isn't a lot of data for us to go off of.

With two reviews, it's still hard to perform the analysis. Our tool works best when there are 20+ reviews.


Hi Tommy, how do you plan to generate revenue? Are you in discussions with Amazon to license your tech directly? If so what do they think about the problem and your solution?


We sell advertising on ReviewMeta, so that's how we generate revenue. I'm not in talks with Amazon to license the tech directly. I'd be happy to help them, but they don't seem receptive to my help.

As far as what Amazon thinks - in my opinion, they are a very reactive company. Look at the incentivized review problem from 2016: it wasn't until there was massive public outcry that they actually solved it. This is a pattern with them: sellers exploit a loophole, the public gets mad, Amazon reacts.

I don't think I'm aligned with their values and goals. First, they are a publicly traded, for-profit company, looking to increase profits and share price. From what I've seen, they are just trying to get you to click the "add to cart" button faster.

My goals and values are around bringing more information and transparency to the process. That will slow down the process of the consumers hitting the "add to cart" button. So I don't think they'd want my help.

I wrote this a few years back, thought it would be an interesting read for you:


Alexa skills are free to download, so they will never have verified purchases. Does that skew the results?


Great question! The "Unverified Purchasers" test is just one of 15 tests that we run on the data. Reviews that are verified obviously do better in that test, but the badge doesn't give them any benefit at all in the other 14 tests.


I wonder if you could do this for sites like Airbnb or VRBO?

I had a situation over the summer where I had rented an Airbnb for a week that, in retrospect, absolutely had to have been populated with fake reviews by the owner, his friends, or whatever scam organization owned the property. There were people who had supposedly stayed at the property just the prior week extolling its cleanliness. When we got there, there was grease dripping out of a vent hood onto the stove and then down onto the floor, and that was merely the most egregious of the issues with the cleanliness of the property.


I've touched on why we don't support other platforms in a few other questions - mainly because it's a massive undertaking from a programming standpoint, and other platforms don't share key data we would need.

As far as Airbnb goes, I think their review system is much tighter than Amazon's. You have to have actually stayed in a place to leave a review. Yes, you could have friends pay you through the platform and leave fake reviews, but it's much more of a process to get those reviews through. That's not to say there aren't fakes - just that, in my opinion, the problem isn't nearly as bad.


how much money are you raking in from the affiliate links?


None. Amazon kicked us off the affiliate program after like a week or something.


Are there plans for other web stores?

I was watching a YouTuber buying random stuff from Wish, and he pointed out that a lot of the reviews had 5 stars even though the user hadn't even tried the product yet. I've heard buying from Wish can be a gamble, and something tells me fake reviews are a part of the shadiness.


I've never really looked at "wish" until now!

We actually get asked this a lot (about supporting other platforms). The answer is that it's a massive undertaking to program in support for other platforms, and Amazon is taking up all of my time at the moment.

Also, there are some technical reasons we can't analyze reviews on a few other platforms I've looked at. Walmart, for example, does not show any unique identifier for the reviewer. They just show the reviewer's "name" (eg. "fred"). There could be a million reviewers named "fred", and I'm sure reviewers can change their display name as well. This makes it impossible for us to collect a reviewer's history, which is a huge part of our analysis.
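To illustrate why a display name isn't enough, here's a tiny sketch with made-up data showing how keying review histories by name merges distinct reviewers into one bogus history, while a stable ID keeps them apart:

```python
# Why display names alone break history collection: two different shoppers
# who both display as "fred" collapse into a single merged history.
from collections import defaultdict

reviews = [
    {"reviewer_id": "u1001", "name": "fred", "product": "A"},
    {"reviewer_id": "u2002", "name": "fred", "product": "B"},  # a different fred
    {"reviewer_id": "u1001", "name": "fred", "product": "C"},
]

def histories(reviews, key):
    # Group the products each "reviewer" has reviewed, keyed by the given field.
    out = defaultdict(set)
    for r in reviews:
        out[r[key]].add(r["product"])
    return dict(out)

print(histories(reviews, "name"))         # one merged "fred" covering A, B, C
print(histories(reviews, "reviewer_id"))  # u1001 and u2002 kept separate
```

Without a stable `reviewer_id` (which Walmart doesn't expose), only the first, wrong grouping is possible.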

Another reason why these large review platforms need to become more transparent and open their review data up to the public!

Edit: Heard of Wish, never really looked at the reviews on the platform.


This may be a pretty obvious one, but can you put in words why fake reviews are bad/damaging to customers and their shopping experience? I think some people would just say "don't read the reviews" but obviously a lot of people rely on reviews to make purchase decisions. Could just comment generally on how fake reviews are used/abused and how that impacts consumers?


I think that the ones hurt the most are the honest sellers. The ones that play by the rules and lose all their business to the fakers.


Thanks for this excellent site, will be using it from now on.

Are there any sites you can use this on that you didn't expect it to be used on?


Just every Amazon site - .com, .ca, .fr, .it, .cn, etc. Here's why we only support Amazon:


Love your site and extension!

What could Amazon do to its review system that would make ReviewMeta obsolete in a year?


Who is asking? Is this Amazon???

I think there's a lot that Amazon could do, but people will always be skeptical of the self-policed platform and look for third-party analysis to double-check. So I don't know that Amazon could make us obsolete (aside from deleting all the reviews entirely).

Here's our suggestions on what Amazon reviews should look like:


I’ve noticed the new trend with fake reviews are private Facebook groups, where the seller organizers reviewers to purchase their product themselves, then later receiving a reimbursement via PayPal. This leaves no money trail that amazon could track, and gives the reviewer a “Verified Purchaser” tag. Is your website able to detect this type of fake review?


Yup, this is definitely something we see as well. There are 15 tests on ReviewMeta, and the "Verified Purchaser" badge only comes into account on one of them (called the "Unverified Purchases" test). So the badge doesn't give the reviews any extra protection on the other 14 tests.

We can still pick up the patterns in other tests. For example, our "Overlapping Review History" test will look at the product review history for all the reviewers of a product, and see if there's a lot of overlap. Basically it checks to see if the reviewers are all reviewing the same products. This happens to be the case for a lot of facebook groups - the cluster of reviewers participating in the group will all end up having similar review histories.
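The overlapping-history idea can be sketched with a simple pairwise overlap score. This uses Jaccard similarity as one plausible measure - the metric and the data are illustrative, not ReviewMeta's actual test:

```python
# Rough sketch of an "overlapping review history" check: if the reviewers
# of a product have all reviewed the same *other* products, that's a red
# flag suggesting a coordinated review group.
from itertools import combinations

def overlap_score(histories):
    # histories: dict of reviewer -> set of other product IDs they reviewed
    pairs = list(combinations(histories.values(), 2))
    if not pairs:
        return 0.0
    jaccard = lambda a, b: len(a & b) / len(a | b) if a | b else 0.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# A suspected review ring: everyone has reviewed nearly the same products.
ring = {"rev1": {"A", "B", "C"}, "rev2": {"A", "B", "C"}, "rev3": {"A", "B", "D"}}
# Organic reviewers: no shared history at all.
organic = {"rev4": {"X"}, "rev5": {"Y", "Z"}, "rev6": {"Q"}}

print(overlap_score(ring))     # high -- reviewers share most of their history
print(overlap_score(organic))  # zero -- no shared products
```

In practice a score well above what random shoppers would produce is what gets a cluster of verified-purchase reviews flagged despite the badge.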


Lately I've been seeing products whose reviews are for completely different products.

Clearly the seller just swaps out the picture and description and price, and Amazon allows that.

Does your algorithm detect those real, previous reviews which are for unrelated products?


Yup, this is called Review Hijacking and we display a warning when detected:


Any way this could be ported to a chrome or Firefox extension that flags posts live as you're browsing?


It looks like the main function of ReviewMeta is to copy & paste in an Amazon product URL to see the analysis of its reviews. Don't get me wrong... that's an awesome function... but it seems like it would be cool if there was a SEARCH function that allows me to actually search for a product I'm looking to buy. This way, I could actually START my shopping at ReviewMeta & then go to Amazon to buy the product I want that has good, real reviews (rather than starting my shopping on Amazon & then using ReviewMeta to check it before buying it).

Do others agree that a SEARCH function would be useful?


I agree, a search function would be nice, but it's much harder to build than you'd think. Amazon has a large team of well-paid engineers dedicated to nothing but the search engine. I don't have the luxury of that resource. It would be really hard for me to make a decent search engine that could find the best results across hundreds of millions of products, many of which may be out of stock, unavailable or discontinued.

We do have free browser extensions available: - and we're working on some enhancements that will streamline the process even more.


Has Amazon itself ever reached out to you directly about ReviewMeta? Do you know how they feel about your site?


Someone from Amazon's PR team and I had a chat a few years back. It was about a recent article that was written by a journalist who misquoted me. We were both annoyed about the misquote and they were just sharing tips on how to better work with journalists in the future. Other than that, nothing!


Thank you, @ReviewMeta, for what you've done.

Have you considered tackling services like Glass Door and Indeed for their employer reviews? It's not uncommon for employers who've had bad reviews from departing/current staff to flood their reviews with doctored/manufactured testimonies which attempt to shine a better light upon the company. In many cases, these attempts are pretty transparent, but some are clever enough to pay firms to handle this for them.


To be honest, I haven't spent much time poking around Glass Door or Indeed, but just from what I understand about the nature of these reviews, it would be a mess to analyze.

Reviewing a pair of headphones is one thing - it's very objective. But reviewing an organization that you were (allegedly) part of, with people that you were friends (or enemies) with, is going to create a whole different set of challenges to analyze the reviews.

But like I've said in some other responses, it's a massive undertaking to integrate new platforms, which is the main reason we're only looking at Amazon right now.


Since you've already done the work of scraping Amazon why not add a price Tracker?


I guess I never really thought to do it since there are so many other ones available. It's a bit more complicated because every product has multiple different prices and shipping costs associated with it.


I'm a little late to the party but I noticed one shortcoming of reviewmeta, and it has to do with my biggest qualm with amazon--and why I avoid using Amazon these days.

ReviewMeta seems to do a good job at finding misleading reviews but, imo, a much bigger issue is misleading product descriptions. As a climber, one glaring example for me is what comes up when you search "climbing rope." The top results are all wholly unsafe for use as life-saving devices, yet they are marketed as climbing ropes, parachute rope (da fuq?), fire rescue, escape rope, etc. Most reviewers don't realize that while the rope description lists a weight limit, that is a far cry from certification by the independent rating organizations that rate real climbing ropes and other safety equipment. Fortunately some people point this out in the reviews, strongly advising people not to trust this rope with their lives. Those reviews get flagged by ReviewMeta as illegitimate because they're unverified or one-hit wonders, but they're right, and they could save lives. And there are obviously plenty of other products that have misleading descriptions; electronics are a huge one, books too.

So my question, if you get this far down Tommy, is how can we know when a product is legitimate or not? ReviewMeta will remove reviews deemed overly critical even when that criticism is entirely valid. The rope I mentioned above is 4.6 stars before and after analysis; the positive reviews it removes all sound like 5-star bot reviews, while the negative reviews removed are legitimate complaints about the safety of the product. ReviewMeta is basically making this illegitimate product appear more legitimate by removing the very real complaint reviews. (Edit to note: I blame Amazon for this issue, not ReviewMeta. People who know a product is fake are not going to buy it and therefore will not be verified purchasers, making it hard for them to warn people of a fake or misleading product.) Is this a problem you can address?


I totally feel you. Amazon's got a big problem with counterfeit and dangerous products, and as far as I can see, ReviewMeta's not in a position to solve it with an algorithm. We're not even looking at product descriptions at all - but even if we did, I'm not sure I could figure out a way to grade them.


Please stay away from the 'three wolves and a full moon' shirt reviews! How do you rate the 'Haribo' gummies?


How do you rate the 'Haribo' gummies?

Depends which flavor...


I don’t know if this is something you can determine, but one of my biggest issues with Amazon is all the fake knockoff products being sold as real. I don’t know if the reviews are accurate, but many seem to state that the quality isn’t up to par with what you can buy at a store, or that they received a completely different product in the same or similar packaging. A large amount of health and beauty products have this issue. Is there any way to determine fake products as well as fake reviews?


Totally agree. Unfortunately, I don't see ReviewMeta being in a position to determine this automatically.


Does ReviewMeta use Big Data technology? If so, I would like to know what I should learn to understand it, since they say the future (and present) will be built on Big Data.


The analysis is custom-built in PHP and MySQL. There are billions of rows of data in our DB. Does that count as Big Data technology?

Everything I do has been self-taught. I'd probably learn Python and R if you're looking into a career in the field of data science, but I'm not the best person to be giving career advice.


I understand how your service works, respect such a noble cause, and am very impressed by its performance. However, I can't help but feel like the goal itself is "turtles all the way down," so to speak. A conclusion reached from a series of assumptions relying on one another. One can reasonably assume that shill Amazon reviewers use repeated phrases - that's just common sense. But do you know that such behaviors are indicative? If so, how?


I'm not sure I understand the question. There are 15 tests - no single test is perfect on its own, and there's a lot more that goes into each test than meets the eye.


2 of my coworkers had the same cheap bluetooth headphones on their desk yesterday, I asked them how much they paid and if they're any good, they said they got em free for leaving a review on amazon. I asked, so you had to leave a 5 star positive review in order to get them free right? Yup. I shamed them, they're part of the problem.


Honestly, I don't blame them too much. There's always going to be plenty of people wanting to get something for free. It's really Amazon's responsibility to better police and protect their own platform.


Amazon could probably buy you, or Fakespot, or any of your competitors with loose change from Jeff Bezos' sofa, yet they do a piss-poor job of policing reviews. At some point we have to wonder if there is an Upton Sinclair reverse Hanlon's Razor at work here, "do not ascribe to incompetence what can be better explained by willful ignorance of something that makes you money".

What's your take on the issue, are they deliberately not taking action because they make money when people make a purchase, even if it is motivated by deceptive reviews?


As I've mentioned a few times, Amazon is a publicly traded, for-profit company. I see them as very reactive. They seem to be good at cleaning up a mess once they get enough bad press. That said, I think that it's hard for us to see what Amazon is doing behind the scenes. My data shows they are deleting millions of reviews, but we don't have any idea how many reviews never see the light of day.


I do a lot of shopping on the Amazon app now (Android). Copy pasting a link sounds cumbersome. Any workarounds?


Yup! There's an app. When you're on the Amazon Shopping app, just click the three little dots -> share -> ReviewMeta, and we pull up the report!


I've seen instances on Amazon where the reviews (especially the older reviews) appear to be for a completely different product than the product I'm looking at. For example, the other day I was looking at the reviews for an iPhone screen protector and a lot of the reviews kept talking about a phone case... NOT the product I'm looking at.

Why would Amazon allow sellers to completely change the product they're selling? Can ReviewMeta detect this kind of thing?


Yup - this is called Review Hijacking and we have a warning that comes up when detected:


What’s the worst fake review you’ve seen?


Earlier this year, there was this flood of millions of unverified 5-star reviews on Amazon. A product would literally have exactly 1,000 reviews written in 2 days, all unverified, all 5-star, all just a few words. It was crazy to see that getting through the filter.

Here's a post about what we saw at the time. Amazon seems to have closed this loophole up though:


Would it be possible for your own site to have a way to view adjusted ratings for a list of search items, instead of one item at a time specifically?

I'd use it much more often. Think of it this way, if I'm trying to find an item that's the best reviewed item, I have to do the original search but check each item individually with reviewmeta. It'd be great to have a custom search that filters by adjusted reviews.


We're going to be rolling out an enhanced (still FREE) browser extension that will have this functionality in the coming months!


Do you store scanned reviews (content, not just review ID) on ReviewMeta servers for future analysis? Have you analyzed any reviews that Amazon deleted and what did you find if so? For example were they especially egregious?


Yes, we store the review data on our servers. Yes, we look at deleted reviews. Here's an analysis we did of deleted reviews on Amazon a year or so ago:


Awesome contribution. I’m curious to know - what happens to vendors once it’s proven out they are pushing fake reviews?


Do you mean if they are called out on ReviewMeta? Nothing. We just show our findings and that's all.

If Amazon catches them, they could either have some/all reviews deleted or be banned (temporarily or permanently) from selling on Amazon. Some say that sellers can sign right back up under a different name though, but others say it's impossible to get back on.


Why do you suppose Amazon, which sells AI via AWS, doesn't use any to combat the fake reviews? Are they that uncontaminated with integrity?


I'm sure they DO use AI to combat fake reviews - we just never see what is going on behind the scenes. Our data shows that Amazon has deleted millions of reviews, and that doesn't include any that might have never seen the light of day in the first place.


I run a fake review program with Chinese sellers. We frequently vet new accounts and newly approved reviews through your tool. My question is, how has Amazon itself not caught onto fake reviews yet?


Interesting. Our data shows that Amazon is deleting millions of reviews, and it's impossible for us to know what more they are doing behind the scenes. They are definitely not doing nothing.


It's hard to draw comparisons between a brick-and-mortar local business and a Chinese iPhone charger, and you probably don't know as much about Yelp as you do about Amazon, so this is just asking for your opinion:

Do you think Yelp and their mysterious "recommendation engine" does a better job filtering out unnatural reviews? Some business owners complain that this engine is overly zealous and removes fully legit reviews, while Yelp says they prefer to err on the side of caution. Do you think Yelp has a better incentive to have high-quality reviews?


Yes - I think the main draw to Yelp is the reviews, so they have a bigger incentive to keep the reviews legit. The fact that they have a recommendation engine is a good start. Obviously it can't always be perfect, and business owners will complain no matter what.

The only experience I have with Yelp is when a previous real estate agent of mine asked for a review. I created an account, posted an honest 5-star review, and the review was filtered out by the recommendation engine. I figured that made sense, as it was the ONLY review I had ever written on Yelp, so of course I'm not going to look very trustworthy.


I see a trend where some sellers are listing items that are different from what the reviewers are reviewing. For example, I might be looking at a listing for a kitchen knife, but the reviews talk about a knife sharpener. Does your algorithm recognize this? I realize these are harder for bots to spot.

Also, I used to get free/discounted items to review. I wrote honest reviews, took pictures and took the retail price into consideration. I had a bunch of "helpful" votes. How would my reviews be interpreted, considering that I review the same subset of items that might often get fake reviews? I've given plenty of negative reviews (i.e. a 1080p dashcam that only records at 720p).


Yes, you're talking about Review Hijacking, and we have a warning that shows up if we detect it:

Your reviews are analyzed on a review-by-review basis, so it really depends on the product you are reviewing, and how your review and profile fits into the collective picture we're seeing for that product.


Tommy - I love your product but I am a Safari user and the browser extension no longer works. What are the plans to get that back working?

Edit: And what is the rationale for Amazon not doing what you are doing? I think it would make a lot of sense for them to buy your company and have you do what you do in house. Integrity in shopping would make way more sense and I would assume translate into more sales/profits.


Ugh, I'm so annoyed with Apple on that. They completely changed the way they handle extensions and I just haven't had time to jump through all their hoops again. I'm planning a big overhaul of the extensions soon, and part of that will be fixing (and improving) the Safari extension.

I doubt Amazon has any interest in buying me. They already have a department dedicated to stopping review fraud, and I think that we're more powerful as an unaffiliated third party. People would probably trust us less if we were owned by Amazon.


How long did it take to make ReviewMeta?


Oh man, I was working on it like a madman for like 6 months before we launched. I felt like JP from Grandma's Boy trying to finish his game at many different points:


How quickly are things changing in the world of fake reviews? How are sketchy Amazon sellers changing their techniques as time goes on & as Amazon plugs the loopholes in their review system?


Things do change, but from my standpoint, it's not happening that fast. It seems like every few months, sellers find a new loophole, exploit it for a while, and then Amazon cracks down. Here are some examples of the loopholes we've seen in the last year:


What is the REVIEWER EASE test your site does all about?


Could you turn your site into a browser extension? Detect lies on the fly while I buy useless crap i don't really need?



while I buy useless crap i don't really need

Too true. I honestly think we (as a society) buy too much junk we don't really need. I'm guilty of this myself. I try to shop at thrift stores when I can. Buy used junk, not new junk.


I am wondering: what are your thoughts on how fake reviews affect Amazon's business?


Amazon's a publicly traded for-profit company. I think they are only concerned about a bottom-line effect. In my opinion, they've been very reactive to the problem. They don't act unless there's something so egregious that they must take action. So I think they are ok with some fake reviews as long as everybody keeps buying stuff.


Did you accomplish this with machine learning/neural networks? If so, how did you build the model?


Each report shows a detailed breakdown of the analysis, so that might partially help answer your question.

But in terms of the model, we're comparing much of the data to the averages we'd expect to see within the category - averages which have been built by collecting millions of reviews from Amazon.


I swear you need to make this analyze reviews on Wish. I've never understood all these "positive" reviews which say "arrived earlier than expected... haven't opened it yet but looks great." BS! Who the hell reviews something without trying it!?


Yeah, I've never spent much time looking at Wish reviews, but from what I've heard in this thread, they are absolute trash.




It seems you don't fully understand how our site works.

First off, Reviewer Ease - we're not looking at how the reviewers rated your products, we are looking at how they rated all the products they have ever reviewed. Read more here:

Deleted reviews don't actually affect the adjusted rating, it's just there for the shopper's information. Read more here:

All the other tests actually check to see if the suspect group is giving a rating that's higher on average than the non-suspect group. So, for example, One-Hit Wonders are only flagged if they are rating the product statistically significantly more positively than those with more than one review.

There's also a reason why we have so many tests: no single test is perfect on its own. There's a lot more going on with our algorithm than your criticisms suggest. Every test has documentation behind it (see the Reviewer Ease link above), and I encourage you to read it, look at other reports, and spend some time fully understanding the site before you reject the entire concept based on a few cherry-picked examples.
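The suspect-vs-non-suspect comparison described above can be sketched as a simple two-sample test. This is purely my own illustration (the function name, z-test and threshold are made up, not ReviewMeta's actual statistics): a suspect group such as one-hit wonders is only flagged when its mean rating exceeds everyone else's by a statistically meaningful margin.

```python
from statistics import mean, pvariance
from math import sqrt

def suspicious(suspect, others, z_threshold=2.0):
    """Flag the suspect group only if its mean rating is higher than
    the non-suspect group's by a statistically meaningful margin
    (a crude two-sample z-test)."""
    if len(suspect) < 2 or len(others) < 2:
        return False  # not enough data to say anything
    diff = mean(suspect) - mean(others)
    se = sqrt(pvariance(suspect) / len(suspect) +
              pvariance(others) / len(others))
    return se > 0 and diff / se > z_threshold

# One-hit wonders rating noticeably higher than everyone else -> flagged
one_hit_wonders = [5, 5, 5, 5, 5, 4, 5, 5]
multi_reviewers = [4, 3, 5, 2, 4, 3, 4, 5]
print(suspicious(one_hit_wonders, multi_reviewers))  # True
```

Note the asymmetry: a suspect group that rates the product the same as (or lower than) everyone else is never flagged, which matches the "only flagged if rating statistically significantly more positively" behavior described above.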


I'm looking at the results for a product I just bought the other day. I'm having a hard time understanding how the "Reviewer Participation" test works & what it's really looking for. Can you please explain? Thanks.


Which of your 15 tests that you run to detect fake reviews are you most proud of?


The "One-Hit Wonders" test. I wanted to call it the "One-Hitter-Quitter" test but my friend convinced me to call it "One Hit Wonder" instead.


Has ReviewMeta gotten much national press or publicity?


Tons. Over the years we've been in the WaPo, WSJ, PBS, Good Morning America, etc. I haven't updated this page in over a year:


The system is a bit flawed. I checked an example I knew had been artificially inflated, and it said, instead, that it had been artificially deflated. (Probably because negative reviews are clustered after the point that Amazon stopped deleting them.)

The day after it came out, a promoted news story about it being "the most highly rated political memoir ever sold on Amazon" was at the top of Facebook and Google News with no mention of the deletions, so it was obviously coordinated to some extent.

Is there any way to correct for this kind of rigging on the company's side in your system, or can the heuristics only support their current level of evaluation?


I checked an example I knew had been artificially inflated

How do you know this?

There are thousands of reviews for this product, fake positives and fake negatives. The product is a subjective, polarizing mess and very untypical of the average product on Amazon.


I notice that there are only 3 products from .NL. Do the Dutch not trust you lol?


I think it's that there are so few products on, and maybe we aren't so popular there.


Do you still do bodybuilding?


I still love going to the gym! I wouldn't call myself a bodybuilder really.


Hey Tommy, I remember the Planet Money episode and tried out my boardgame Secret Hitler right away just for kicks, and I THINK you were giving us a poor rating back then, and fakespot is currently giving us an F. It looks like your rating for us is fine now, though.

I'm pretty sure we've never bought reviews since I'm the person who would have written the check, do you know if this is just a false signal or is it possible other agents are buying false positive reviews on our page? I've looked through some of the flagged reviews and they do seem suspicious, but obviously I know that we've never paid for them.


There's A LOT that could be going on. It could be the nature of a product that has a loyal fan base causing a false positive. Did you see the review about it being a knock-off?

We're also showing that at least 58 reviews have been deleted, 95% of which were 5-star.

So a lot of possible reasons why the result is what it is.


Do you only check for product reviews, or also seller reviews?


Only products.


Who killed Epstein?


The better question is who didn't kill Epstein. And the answer to that question is Epstein.


Does ReviewMeta work for the UK and Amazon UK?


What’s your favorite fruit?




Sorry if this isn’t a good question, but have you ever thought of making a Chrome extension for your site?


How often is the data on ReviewMeta updated?


On every report, we show the date it was last updated. If it's more than a week old, you'll be able to refresh it.


So basically fakespot which already exists?


We're actually quite different from FakeSpot, and have been around almost as long. Here's a bit more about the differences:


It'll be hard to make this comment without linking it to my business on Amazon but I'll try.

As an Amazon seller, I was personally involved in gaining reviews on my products. I find ReviewMeta is inaccurate at distinguishing fake reviews from real reviews.

I've both purchased reviews, and gained legitimate reviews and used your site to see if it can tell the difference.

At the time we were buying reviews, my score was an "A" with most of the reviews determined to be real. However, when Amazon caught on due to competition reporting us, they wiped a portion of our reviews and we started over getting legitimate reviews at a much slower rate. During that time, we had a C even though all of the reviews were 100% real.

I'm not sure, but it seems to me the site is only good at picking up the really obvious fake reviews like ones that read something along the lines of "Good product. Fast shipping. Nice. Excellent service. Quality"

Can you explain why your site might be ignoring the reviews that I know I personally paid for, while flagging the reviews left by real customers when I was rebuilding feedback, without me linking specific examples?


my score was an "A"

we had a C

Sorry, we don't give a score of "A" or "C". Please make sure you are looking at the correct website.


I’ve been noticing reviews of items that are completely unrelated to the actual item.

Example: I was looking for cheap lightning cables, and I noticed the reviews were odd. Talking about lights. Turns out some of the reviews, including some pictures, were for a lamp.

I think these were real reviews, but the seller changed the item somehow, but kept the reviews.

Can your site identify these real reviews for the wrong item?


Thanks for this.

I have seen that if you try to rate an Amazon product 1 or 2 stars, it is not posted or is quickly removed, giving even more 5-star bias. Do you do anything about this as well?

I personally have tried to give bad reviews to products that deserved it, but as they are connected to Amazon Prime, they are basically untouchable.


There's definitely a lot more incentive for sellers to challenge and complain about their 1 or 2 star reviews, so this could potentially lead to even more bias. There are also a lot of different rules in the TOS about what you can and can't complain about in a review. For example, you're not allowed to complain about fake reviews in your review - that will lead to your review not being accepted, and it really makes people angry.

Happened to me once. I bought a tape-to-aux adapter for the car, it sucked, I went to review it, and the listing was hijacked and now for a teeth whitening product (although they kept the reviews from the tape adapter). My review said "I BOUGHT THIS PRODUCT WHEN IT IS A TAPE ADAPTER AND NOW IT'S A TEETH WHITENING KIT", and Amazon rejected the review. It can be infuriating!


Would it be possible to create a browser extension that displays the adjusted rating on any product page you view? I feel like that would be the easiest way to compare ratings.


What do you think about making a chrome extension that automatically displays the reviewMeta score for amazon products?


I am an Amazon seller and just ran about 10 of my products through your site and am impressed. One of my products has reviews from a few years ago, when “unbiased giveaway” reviews were acceptable, and your site sniffed them out. Whereas Fakespot identified a number of my products as having fraudulent reviews when I know for a fact that they don't.

My question as a seller is whether you could add features to notify the user of bad faith product variations?

To explain: a common tactic for black hat sellers is to incorporate an abandoned product with good reviews (related or unrelated) as a variation, to absorb those reviews. The result is a product with hundreds of 5-star reviews, but they are of completely different products.


Yup - it's called Review Hijacking and we have a warning that comes up when we detect it:


I agree that fake reviews on Amazon is a major problem, and I'm thrilled to see people like you doing something about it. So kudos to you, Tommy!

But my question is this... It seems like it's nearly impossible to sell something on Amazon today if it doesn't have at least a handful of 4-5 star reviews. So if I'm a new seller on Amazon just trying to get my business off the ground, how would you suggest doing it honestly & legitimately (i.e. without asking friends/family for fake reviews)? It's a chicken-egg problem... without a handful of good reviews, you'll never sell anything. But you can't get any legitimate reviews until you first make some sales!

I'd appreciate your thoughts. Thanks!


Honestly, I think this is a big problem on Amazon. They have the "Vine" program but I hear there are problems with that (can be very expensive). Other than that, I don't know what to suggest. I've never tried to sell anything on Amazon.


Man, so many questions. I didn't (couldn't) read through them all so hope I am not duplicating a question.

The one thing that really got my attention is that you are also rating the reviews and by extension the reviewers. Do you plan to add a section to the site so we can see a reviewers profile? I ask because I would be curious to see my profile. I have done quite a few reviews on Amazon over the years, most of them in depth and verified. A few are for products that I felt strongly about one way or another, but didn't buy from Amazon (mostly negative in these cases).


Yup, just copy and paste the URL to your Amazon profile into the search bar at ReviewMeta.


I've seen a lot of products on Amazon that get really good reviews in the early going (right after they launch)... but then the reviews tend to be lower as time goes on. Every time I see this, I can't help but assume that the early good reviews are fake... and the later bad reviews are legitimate. Does ReviewMeta detect this trend and penalize the product for it?


Somewhat. We look at that in the Rating Trend test - to see if the reviews are created evenly over time or if they appear all at once:

We also show a graph of the rating trend on every report, so you can see which direction the trend is going.


Is it possible to make this into a browser extension?


I love the one-hit wonders test your site does. Brilliant!!! Where did you get the idea for that?

Is there an "inverse" test that checks to see if the same Amazon profiles are reviewing a) other products from the same brand, and b) other "unrelated" products (not from the same brand... but just an unusually high amount of crossover - higher than you'd expect to occur naturally)?


Yup. Look at the Brand Repeaters test and the Overlapping Review History test.


Since the days you personally devised the detection models and personally wrote the computer code, have you enlisted other contributors to the project? E.g. seasoned statisticians, senior programmers, etc.?


Nope, just me.


Do you guys have any plans to develop a mobile app? (I see several posts about browser extensions, but I do most of my Amazon shopping on my iPhone. I'm guessing most other people do as well.)


How do you think Amazon should handle fake reviews?

Have you ever been threatened over the algorithm giving bad scores? (If so, have you ever felt in danger?)


I have a blog post about what I think their reviews should look like:

Yes, we've received quite a few threats before we changed our TOS to state we can publish any harassment publicly. Someone actually called my parents' home phone. I haven't lived there in 15 years; I don't know how they got that info.


How does ReviewMeta account for a spike in reviews around times when a product has been more aggressively promoted (e.g. Amazon paid promotions)?


We compare the average rating given by reviewers on high volume days vs. the rating given by the rest of the reviewers to see if there's a statistically significant difference.
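That high-volume-day comparison can be sketched roughly as follows. This is my own illustration (the spike threshold, function name and data are made up, not ReviewMeta's actual logic): it splits ratings into those posted on unusually busy days versus the rest, so the two averages can be compared.

```python
from collections import Counter
from statistics import mean

def high_volume_day_ratings(reviews, spike_factor=2):
    """Split ratings into those posted on unusually busy days vs. the
    rest. `reviews` is a list of (date, stars) tuples; a day counts as
    a 'spike' if it has at least spike_factor times the typical volume."""
    per_day = Counter(day for day, _ in reviews)
    typical = mean(per_day.values())
    spike_days = {d for d, n in per_day.items() if n >= spike_factor * typical}
    spiked = [s for d, s in reviews if d in spike_days]
    normal = [s for d, s in reviews if d not in spike_days]
    return (mean(spiked) if spiked else None,
            mean(normal) if normal else None)

# Four 5-star reviews land on one day amid a trickle of mixed ratings:
reviews = [("06-01", 3), ("06-02", 4), ("06-03", 2),
           ("06-10", 5), ("06-10", 5), ("06-10", 5), ("06-10", 5)]
print(high_volume_day_ratings(reviews))  # spiked-day avg 5 vs. normal avg 3
```

A large gap between the two averages on a product (as in the sample data) is the kind of statistically significant difference the answer above refers to.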


Hi! Your work looks amazing! One question I’m having: does the software work only with English reviews, or with other languages as well?


It works with all international versions of Amazon: .com, .ca,, .fr, .it, .de, .es,, etc


Is there any way to incorporate this tech into a Chrome extension, and if so, are there any plans to do so? If not, why?


Very nice, but it doesn't work on Firefox for me. What security settings need to be opened to allow it to work?


Are you talking about the entire website or the extension? Can you send a screenshot?


Hi Tommy, which country would you say generates the most fake reviews? Do you have any statistics?


I'd have to run some queries which would take a while, but off the top of my head, I see a lot of stuff going on over at (Germany)


So what percentage of reviews would you say are fake? 5 out of 10? 7 out of 10?


Around 7-11% on Amazon is our estimate.


Has anyone made a Apple shortcut for your great site? Thanks in advance ;/)


I thought I saw one floating around. We have a bookmarklet, an iOS App, a bunch of browser extensions too.


How can our eyes reading reviews be real if reviews of mirrors aren't real?




Hi Tommy, any response to this random critique I found on Google?


Don't want to stir the pot, but the guy tweeted at us dozens of times when our site first launched, so we responded to his tweets in a blog post. He seems to be a little upset that our algo questioned (and rightfully so) some of the reviews on his books.

There's a lot of misinformation in that post and misleading oversimplification of how our site works. I've decided to just leave it and not stir the pot anymore.


I'm meta, Tommy. Can you review me?


5-Stars. Best Meta I've ever met-a.
