Every time you rate a book, 1 to 5 stars, it goes into a cumulative rating for that product. Amazon considers a 3-star rating unfavorable. I've had a number of 3-star reviews with titles like "great read," but this balances out over the span of dozens of reviews. The number-one value customers judge your product by (apart from your cover) is the cumulative rating. How does Amazon compute this all-important number?
If you hover over the cumulative rating of a book, a tooltip explains that the value isn't an average of all the reviews but a sophisticated, proprietary result of machine learning based on review age and verified-purchase status. I explored, and I found a bug in their algorithm.
For 21 of my 23 books, over a span of five years, the rating was exactly the straight average. The two exceptions are the interesting part.
1) My book "The Redemption of Mata Hari" has a straight average of 4 across 6 reviews, yet Amazon rated the book a 3.7. The verified reviews averaged 3 and the unverified were all 5s. The primary driver was a 2-star rating that complained the book talked about sex too much...when the main female character was a succubus. So there is a clear discrepancy between the two rating groups. To compensate, the formula for the overall star value is weighted:
0.65 * verified + 0.35 * unverified.
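The numbers from my book check out under this weighting. Here's a minimal sketch; keep in mind the 0.65/0.35 split is inferred from my own data, not anything Amazon documents:

```python
# Hypothetical reconstruction of Amazon's weighted rating.
# The 0.65/0.35 split is my inference from observed ratings, not a documented formula.

def weighted_rating(verified_avg, unverified_avg,
                    w_verified=0.65, w_unverified=0.35):
    """Blend the two review-group averages into one displayed star value."""
    return round(w_verified * verified_avg + w_unverified * unverified_avg, 1)

# "The Redemption of Mata Hari": verified reviews averaged 3, unverified all 5s.
print(weighted_rating(3.0, 5.0))  # 3.7 -- matches the displayed rating
```

Note that when the two groups agree, the weighting is invisible: `weighted_rating(4.5, 4.5)` is just 4.5, which is why my other 21 books showed a plain average.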
On its face, this is fair. It does seem to be a way to thwart those who are "cheating" the system. Unfortunately, Amazon doesn't automatically mark a review as verified just because you bought the book, and even after the fact, Amazon provides no way to change that flag. Customer service doesn't have that capability. When asked how one can get the designator, they said the only certain way is to start the review from the recent-orders page under My Account. If Amazon sends you a "rate this" email, that also connects the dots. Why not when you're logged in normally? At the moment of submission from your account, the code could easily check the purchase history, or a periodic system-wide update could refresh the status. The "we can't get there from here" excuse doesn't hold water for a company that employs so many experts in data mining. The book's own page tells you, in a blue banner across the top, when you purchased it. That means the information is already in an active variable for the JavaScript, no trouble to access when you hit the review button.
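The check I'm describing is a one-liner against purchase history. This sketch uses data structures and names of my own invention, not Amazon's actual internals:

```python
# Sketch of the verified-purchase check Amazon could run at review submission.
# All structures here (a set of purchases, a list of review dicts) are hypothetical.

def submit_review(reviews_db, purchase_history, customer_id, book_id, stars):
    """Record a review, flagging it verified if this customer bought this book."""
    verified = (customer_id, book_id) in purchase_history
    reviews_db.append({"customer": customer_id, "book": book_id,
                       "stars": stars, "verified": verified})

purchases = {("alice", "mata-hari")}                  # alice bought the book
db = []
submit_review(db, purchases, "alice", "mata-hari", 5)  # flagged verified
submit_review(db, purchases, "bob", "mata-hari", 2)    # bob never bought it
print([r["verified"] for r in db])  # [True, False]
```

A nightly batch job could run the same membership test over existing reviews to backfill the flag, which is the "periodic system-wide update" I mean above.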
2) My book "Foundation for the Lost" had a correct average of 4.5 after 14 reviews, just like every other book. Then, on 9/1/17, I earned another 5-star review from a verified purchase. Excited, I computed that this would bump my rating, by either metric, to 4.7. After a week, nothing had changed. The computation wasn't updating from the database, so I called support. The representative agreed that this wasn't reasonable behavior but said a special group is dedicated to just this issue. He sent in a ticket and told me I would have an explanation by email. A couple of days later, I got an email that was literally a paste of the info text you see when you hover over the star rating, stressing that the number comes from machine learning--nothing more, nothing less. I completed everything but my dissertation for a PhD in computer science with a minor in math. I have 12 software patents and 30 years of commercial programming experience. While machine learning might phase out the emphasis on the oldest data, it would never throw out the latest and most reliable category. This response is stonewalling for a bug. Amazon needs to be accountable, like any company taking 30 percent of my sales in exchange for these services.
I should probably just keep trying until they give me an honest answer, right? For my first audiobook bounty, it took over 10 months for their bureaucracy to give me a response. That was the best case. In the worst case, Amazon support people gave me another phone number to call for help:
1-888-280-4331. This number offers you three too-good-to-be-true deals that require your credit-card number. If you don't fall for any of them after five minutes, the recording demands that you hang up. It shouts the demand three times and then plays a loud, annoying tone. This, too, is stonewalling. When asked to rate the interaction, I gave them the lowest rating possible and used words like "unacceptable." At any company, this would have merited an apology. Either they don't read these objections, or one-star customer service is so common that they can't reply to all of them.
I'm not saying this "machine learning" emperor has no clothes, but the fig leaf is pretty small.