Club ratings and motivation to write reviews
rickdugan
Verified and Certifiable Super-Reviewer
Now I suspect that I did this to myself by giving one guy crap for abusing that privilege, which led to a brouhaha of sorts. But overall I think that most seasoned members use the privilege responsibly. Also, I think it's a privilege that most of them earned legitimately through years of contributions to the site, which for most came at no small cost in terms of club spending.
More importantly, I think it made for better overall club ratings: first, because seasoned members generally have more comparative intel to judge with, and second, because it mitigated shill manipulation. Without this sort of system, you effectively hand the controls over to the shills and make the ratings even more useless than some already believe them to be. I use these ratings when sorting through clubs in large urban areas, and I'm sure I'm not alone.
Founder, I hope you will reconsider.
For example, I know I don't pay too much attention to ratings when it comes to restaurant reviews, and instead glance at the overall message of what people actually say.
I’ve willingly chosen to go to 2.5 star restaurants where people specifically said the food is great but service sucks, for example.
And that’s just somewhere to find food at the $10-$30 price point. I’d imagine if it’s something like a strip club visit, I’d definitely be even more stringent on checking the qualitative feedback vs quantitative.
One other thing, if ratings were the most fair and least susceptible to manipulation, at what level would Baby Dolls in Dallas be rated?
Then there's the issue of preference and/or bias - for one experienced SCer a particular club may be up his alley and for another experienced SCer the club may suck according to his preference.
It's best not to weight things and force the #s - let the totality of the reviews shake things out.
@Nice: In an area with a small number of clubs, the ratings aren't that meaningful. But in larger metro areas with 30+ clubs, one needs a way to sort the clubs unless one has endless free time to read reviews for all 30 clubs. Also, like it or not, some people will be influenced by them whether they are truly meaningful or not, so it's not fair to the clubs in question if the ratings are easily manipulated.
Unless the scoring has changed in the past day or so, it is now a simple average of all reviews posted over the past year. Watch a club with a review dated Feb 27, 2017 (a year ago tomorrow). By March 1 the club's rating will budge a bit as that review drops out of the average.
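If that description is right, the mechanics can be sketched in a few lines. This is a guess at how the site computes it, not Founder's actual code; the 365-day cutoff and the (date, score) tuple format are my assumptions:

```python
from datetime import date, timedelta

def club_rating(reviews, today):
    """Simple average of all review scores dated within the past year.
    `reviews` is a list of (review_date, score) tuples."""
    cutoff = today - timedelta(days=365)
    recent = [score for d, score in reviews if d > cutoff]
    if not recent:
        return None  # no reviews in a year: the club ceases to be rated
    return round(sum(recent) / len(recent), 2)

reviews = [(date(2017, 2, 27), 8.0), (date(2018, 1, 15), 6.0)]
print(club_rating(reviews, date(2018, 2, 26)))  # both reviews count: 7.0
print(club_rating(reviews, date(2018, 3, 1)))   # Feb 2017 review aged out: 6.0
```

Which is exactly why a rating can "budge" overnight with no new review: an old one simply falls outside the window.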
The # of reviews is somewhat useful, but the historical total isn't as meaningful as, say, the # of reviews in the past 12 months or past 6 months. Beyond that you have to read (and write) reviews to get a sense of what to expect.
Being overly concerned about the numerical score is fairly ridiculous. Considering the broad base of TUSCL members and their varied tastes, the score is of little value compared to the comments.
Regarding fake reviews, who cares? They have little impact.
What the fuck is this "I've been a member xx years, my opinion is more valuable"? I have been a member less than a year and have probably been clubbing longer than you have been an adult.
But while it is still there, there are going to be people who use those ratings to help narrow down their club selections. IMHO anything you can do to make those ratings more meaningful will be a benefit to users, which includes using a system that allows seasoned users to have some influence while mitigating the shill issues.
And as always, thank you for everything you do for us.
You cannot make ratings any more meaningful by tweaking weights based upon the # of reviews. If you have any basic understanding of statistics and modeling, you would understand that the net difference is small.
If you are using the rating to decide between the 6.8 and the 7.2, more research is needed to differentiate. But even if you are choosing between a 7.2 and a 5.0 in the same market, you would still need to do more research. Comparing by rating alone could keep you out of Bandaids in Phx, even though there are a few PL's in Phx that love it.
Besides, any bias in the current system is basically equally applied to all ratings.
You are trying to fine-tune quasi-random events. I bet in your business, you spend way too much time trying to fine-tune guesses.
A big problem with Metacritic are the numbers; people judging products based on the numbers rather than bothering to read reviews. Why not get rid of them? Maybe replace it with a thumbs up or thumbs down per each review. Calculate a running total of the thumbs up and down over the course of 12 months or X number of reviews to give a percentage of how many positive reviews the club has. All people really need to know is if you had a good time or not and they can read the review for specifics. Do it more like Rotten Tomatoes.
There's also a strong element of personal preference that's hard to account for. Going back to my previous example, I'm cool with the "vibe" at BT. It's mellow and laid back. Others looking for a party atmosphere may find it dull, lifeless, and boring. Again, contrast that with DD, which is usually much more upbeat and party style, and those looking for that will rate it higher. The same applies to girls. To use another local example, Papi doesn't appreciate the move from the thicker, meatier, probably older talent at the Body Club to the thinner, firmer, probably younger talent. I'm exactly the opposite. If we both reviewed it, our ratings would offset each other.
There's probably another dozen points I could make that are similar. I'd rather see no ratings, or if we have to have ratings I'd like to see it as just a simple 1-10 or whatever for how I liked it overall. I also like the idea of rating reviews as useful or not. I know there's some potential there for abuse, but I think overall that would be a useful feature - to be able to see only reviews with higher ratings.
To me, how pretty a dancer is facially and how fit she may be is not what I look for, but I know most SCers are into that, so I will usually rate from the POV of facial attractiveness and body fitness so as not to mislead the avg custy; then I will explain how I personally feel in the review itself.
e.g. I will rate the dancers at Baby Dolls Dallas higher than the ones at a black dive even though I much prefer the black-dive dancers.
My general use of the ratings is for comparisons among one set of clubs that are geographically near each other and that are reviewed by a similar set of people. For example, if I visit a new city, I can scan the ratings of all the clubs there to get a sense of which places I'll start reading about. But I'll generally end up reading nearly all the reviews (back about a year or more in time) for all the clubs I'm considering, so the current version of the ratings-averages won't do much to impact my ultimate total information accretion; it will simply rearrange the order in which I accrete that information, letting me keep things in my head a bit more straight than if I'd just read everything in club-name alphabetical order or another similar arbitrary arrangement.
People rate without good sense. Inexperienced people give their first strip club a 10 out of 10 in every category; aggressive anti-club reviewers (paid by competitor clubs or not) give 1 out of 10 in every category, and both tendencies make for inaccurate ratings. Few people have both a geographically wide-ranging ("national") perspective and a temporally long-duration perspective adequate to rate clubs accurately on all the proper factors, such that the rating is useful to TUSCL's entire audience of readers, coming as we do from all the world's geographical regions (though primarily the USA), and comparing as we do across varied timelines, from as recent as only the last few months to as far back as some of the old farts (myself included) who can remember prostitution, or at least strip-clubbing, in the Nixon Era.
I'm sure there are mathematically rigorous ways to crunch these statistics, such that someone who has never reviewed before automatically gets less weight than someone with a proven track record; such that extreme outlying review numbers are discounted and therefore impact the average less heavily than numbers near the norm; and such that statisticians are not up in arms about the validity of the sample. But I'm not a statistician and wouldn't pretend to know what kinds of automated stats packages could do that kind of crunching, and anyway, I don't think that kind of statistical control would really cause me to use the reviews any differently. It is all still based on someone else's opinion, and is only as reliable as each individual's sobriety at the time of typing :) the review.
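For what it's worth, a toy version of that kind of scheme might look like the following. This is purely illustrative, not anything TUSCL does: the trim rule, the weight formula, and the cap are all invented for the sketch.

```python
def weighted_rating(scores):
    """Hypothetical weighted average: each score is weighted by the
    reviewer's track record, after trimming the extremes.
    `scores` is a list of (score, prior_review_count) tuples."""
    # Drop the single highest and lowest scores to blunt the
    # 10-out-of-10 first-timers and the 1-out-of-10 shills
    trimmed = sorted(scores)[1:-1] if len(scores) > 4 else scores
    # A first-time reviewer gets weight 1; a veteran tops out at 5
    weight = lambda n: min(1 + n / 10, 5)
    total_w = sum(weight(n) for _, n in trimmed)
    return round(sum(s * weight(n) for s, n in trimmed) / total_w, 2)

# One first-timer's 10 barely moves two veterans' 6 and 7
print(weighted_rating([(10, 0), (6, 40), (7, 25)]))  # 6.79
```

The point of the example is only that the math is easy; whether the result is any more "valid" than a plain average is exactly the open question.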
Consequently, it's flawed but still important, IMO. I think the "national top 10" lists and so forth, which used to be more prominently displayed in the TUSCL menus, are rightly downgraded to less important information. I think the current submission fields -- girls, prices, vibe, etc. -- are helpful and, since they have a history and therefore have a rather large backlog of data entered into them, should be somehow perpetuated. I suspect there could be a rather large statistical project arranged (by someone other than me!) to start to create more rigorous and more "valid" (whatever that means) ratings systems, but I suspect Founder doesn't want to take on that much of a web project. A strip-club-only Alexa type service, where click-hits per viewer, and costs, and opening hours, and number of girls on shift, and general attractiveness of the girls (by valid standards, not inflated due to beer-goggles or millennial snowflake inexperience), would all be fed into an amazing penis-teasing meat-grinder, the handle would be cranked, and out would squirt loads of TUSCL VIP spermaceti type pasta which would keep young attractive vibrant vivacious girls slavering for more of our swelling meat ...
Here is a simple example: a club with one review over the past year (September 2018). It was rated 6, 7, 6, which averages out to 6.33, and that is its overall rating. If nobody reviews this club again by September of 2019, it will cease being rated at all.
https://www.tuscl.net/app/listing-review…