Fiverr Forum

Pros and Cons of the new Fiverr level metrics


#26

Oh, and one last thing before I get off the soapbox. I’ve noticed that sellers in my category aren’t listed as professionals. There are others who use computer-generated products and reach Top Rated Seller with inferior work. And then there are those like me who actually do the work and end up with lower levels.

Like I said, I don’t think they care as long as they get their money.


#27

This is a very interesting post.
Most of these issues have been discussed previously, but seeing them all together makes it look like one big mess.

Do you have any ideas for addressing the issues mentioned above?
Or any proposal on how to avoid them?

Whether I agree with them or not, it would be interesting to know your approach to solving them.


#28

the reviews are monthly, giving you the chance to climb as quickly as possible (again, in theory)

Except that if they reduce your level and you’re then shown roughly zero Buyer Requests, and that was what you relied on for orders, it doesn’t make it easy to climb back up the levels quickly.

If they also prevent you from creating a new gig without first deleting an existing one (instead of just letting you pause one), that doesn’t help you climb the levels or get orders either, especially if you had a good idea for a gig or buyers were asking for a specific new one.


#29

What they could do is show an information page when you click something like “more info” next to the response rate stat. That page could show how the response rate was calculated, i.e. the total number of messages you received in the last 60 days, how many were replied to and not replied to, and a scrollable list of the messages not replied to (and maybe the ones you did reply to).

Doing the above would inform the user and mean they have the answer without having to go to CS (assuming the stats/info shown are correct).
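For illustration, here is a minimal sketch of the kind of breakdown such a page could show. It’s written in Python with made-up message records and an assumed 60-day window; the field names and calculation are my guess, not Fiverr’s actual schema.

```python
from datetime import datetime, timedelta

# Hypothetical message records: (received_at, was_replied_to).
messages = [
    (datetime(2018, 7, 1), True),
    (datetime(2018, 7, 15), False),
    (datetime(2018, 8, 2), True),
]

def response_rate_breakdown(messages, now, window_days=60):
    """Summarise the response rate the way a 'more info' page might."""
    cutoff = now - timedelta(days=window_days)
    recent = [m for m in messages if m[0] >= cutoff]
    replied = [m for m in recent if m[1]]
    unreplied = [m for m in recent if not m[1]]
    rate = 100 * len(replied) / len(recent) if recent else 100.0
    return {
        "total_messages": len(recent),
        "replied": len(replied),
        "not_replied": len(unreplied),
        "response_rate_pct": round(rate, 1),
        "unreplied_dates": [m[0].date().isoformat() for m in unreplied],
    }

print(response_rate_breakdown(messages, now=datetime(2018, 8, 10)))
# {'total_messages': 3, 'replied': 2, 'not_replied': 1,
#  'response_rate_pct': 66.7, 'unreplied_dates': ['2018-07-15']}
```

Showing the unreplied messages themselves is the part that would actually settle disputes, since the seller could see exactly which conversations the system thinks they ignored.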


#30

I agree with your idea, though I would change it to 3.5 and 4.5, as I find 3 way too low, and 4.8 is not bad, just a little bit high.

I didn’t quite understand this. Can you please explain?


#31

This is what kills it for me. I already complained about it back in November 2017, and the fact that, 9 months later, nothing has been done to solve it… is a big, big disappointment.


#32

Sure - have a look here: Which One's Correct? and Problems with ratings

Sorry for quoting myself BTW.


#33

From what I read in your links, I guess the problem lies only in the display of the values (there must be a bug in how the numbers are handled and shown in different places) and not, or at least it shouldn’t be, in the calculation process, because that HAS to be one and only one process (centralized data collection).


#34

That’s pretty damning coming from you @torrelles. In my opinion, you are No. 1 for VOs on Fiverr.

I did reply at more length to this thread but my post is still pending moderation. My bad for mentioning a few alt Fiverr sites. However, my beef with St. Levels and the other changes is that they do not benefit good buyers or good sellers in any way.

I was scraping 4.8 a month ago. Now I am considered a 5-star seller. My quality of service has not changed in any way. All that has changed is that fewer orders mean I have received fewer buggy 4.7-star reviews and fewer orders for things like website design, which I do not offer and have to cancel because of this.

The only people St. Levels benefits are scam sellers who can deliver questionable SEO services etc. in seconds and who should have been shown the door years ago.


#35

There could be multiple numeric fields in multiple tables. Some could be calculated incorrectly (or rounded in different ways) from other fields. I don’t think it has to be just an error in rounding for display. It could be differences in rounding when calculating then storing a particular field’s value.


#36

That’s why I said that, at least, it shouldn’t be a problem with the calculation process itself, since there HAS to be just one calculation process, and that one is the one Fiverr uses for whatever Fiverr needs it for (excuse my tongue twister :flushed:). The problem must lie somewhere else, in how the numbers taken from the main data-gathering and collection table are handled.


#37

There can be one calculation process to determine whether a person’s level will be increased or decreased, but it could be based on tables/fields that were calculated and stored earlier (e.g. at the time a review was entered, at the time an order was completed, etc.) and were stored or rounded incorrectly. So although there may be one set of calculations for checking whether to increase or decrease levels, that calculation may be affected by any incorrectly stored (such as incorrectly rounded and then stored) fields already in the system.
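As a toy illustration, here is a minimal sketch in Python of how rounding a value at storage time can flip a later check, even though the check itself is a single, consistent calculation. The review values and the 4.8 threshold are invented for the example; they are not Fiverr’s actual numbers or logic.

```python
# Invented reviews for the evaluation period.
reviews = [5, 5, 5, 4.3, 4.5]

# Exact average, as the level check would ideally see it.
exact_avg = sum(reviews) / len(reviews)   # ~4.76

# The same average as it might sit in another table,
# rounded to one decimal when it was written.
stored_avg = round(exact_avg, 1)          # 4.8

THRESHOLD = 4.8  # assumed level threshold, for illustration only

def meets_threshold(avg, threshold=THRESHOLD):
    """One consistent level check, applied to whichever value it is given."""
    return avg >= threshold

print(round(exact_avg, 2), meets_threshold(exact_avg))   # 4.76 False
print(stored_avg, meets_threshold(stored_avg))           # 4.8 True
```

The single check behaves identically in both calls; the discrepancy comes entirely from which stored representation of the rating it is fed.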


#38

Yes, that could also be a reason. :slightly_smiling_face:


#39

As for the new Fiverr level metrics, I think they are best for all the designers. They give sellers at every level the same chance of being rated. Before these level metrics, only Top Rated Sellers had the good potential buyers, and a Level 2 or Level 1 seller with great professional skills couldn’t reach out to those buyers… After these metrics, the ones with good skills will reach the potential buyers and have lots of work…


#40

Getting back on topic to the actual Fiverr level metrics, these have just turned Fiverr into a trainwreck.

A month ago, I was scraping 4.8 for customer satisfaction. This was due to a run of buggy 4.7 reviews left by people rating my services via the Fiverr app. Today I’m up to 5 stars. Did my quality of service improve over the past 60 days? No. I just received fewer orders and turned away all buyers who seemed the slightest bit iffy.
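To show how that works, here is a minimal sketch in Python of a 60-day review window; the review values are invented and the simple rounded average is an assumption about how the displayed rating is derived.

```python
# Invented review data, purely to illustrate the 60-day window effect.
previous_window = [5, 5, 4.7, 4.7, 4.7, 4.7]  # busier period, several buggy 4.7-star app reviews
current_window = [5, 5, 5]                    # quieter period: the 4.7s have aged out of the window

def displayed_rating(reviews):
    """Average rounded to one decimal, as a rating badge might show it."""
    return round(sum(reviews) / len(reviews), 1)

print(displayed_rating(previous_window))  # 4.8
print(displayed_rating(current_window))   # 5.0
```

Same seller, same service; only the number and timing of orders changed.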

Given the above, and the fact that sellers suffer thanks to impossible-to-counter cancellations and the like, what do current ratings actually tell buyers? Answer: absolutely nothing.

Monthly levels, and now the new 60-day review counts, make it harder for buyers to find reputable sellers and more difficult for reputable sellers to compete against fly-by-night scammers. Worse, they leave the Fiverr search looking like a search on a bottom-of-the-barrel Fiverr copycat site like Gigbucks.

I am serious about that last point.

Fiverr has driven through so many schizophrenic changes that using the site (as a buyer or seller) for the first time now looks and feels like using a Fiverr copycat site.

First, there are the lists of gigs. Then there are all the gigs listed with 1-20 reviews if they’re lucky. What does that tell buyers and prospective sellers? Put simply, it says: “Well, this looks a bit naff. No one is selling anything. This must just be that cheap and nasty site I heard the guys from PPH talking about.”

FiverrUp:

At FiverrUp, freelancers have hardly any reviews and using the search is like looking for a needle in a haystack. But guess what? The price you see is the price you pay! What a crazy idea.

Gigbucks:

Using Gigbucks, aka the inspiration behind Fiverr’s new category search, is also like trying to find a needle in a haystack. More importantly, on both platforms I can safely say that buyers don’t rush to buy services for three main reasons:

  • The level systems seem to mean nothing
  • Apparently, long-established sellers have a maximum of 1-20 reviews
  • The searches seem to give preference to new sellers with few to zero reviews

Thankfully, the real Fiverr offers… Oh, wait a minute…

The level system, the bugs, the poor CS support, and the current experiment with only showing reviews from the past 60 days do not benefit buyers in any way. If anything, all these changes deter people from spending a dime by completely removing social proof from established sellers’ scorecards.

On the new Fiverr, everyone is a newbie and now has to struggle just to maintain their status as a newbie, unless they flog dodgy SEO services which can be delivered in under an hour.

Think about that for a second. Fiverr has literally gone out of its way to emulate the look, feel, and perceived trust level of its most bottom-of-the-barrel competitors.

It’s just blooming nasty.


#41

That was a mistake in a previous post, which was meant to say “60 days”.


#42

I have asked them, in my complaints and via their suggestion box, to add a better way to view the numbers behind response rates and times. We will see…


#43

You are absolutely right.
I hope Fiverr looks into this and finds a better way of rating.


#44

I agree.
60 days is not enough to judge a seller, because business may be slow for a certain period; that doesn’t make the seller incompetent or lazy.


#45

Yes, but you see, “delivering great service” only counts completed orders (read: orders where Fiverr makes some money).

Agreeing to a buyer’s request to cancel is counted against a Seller, when it should not be counted at all. Or, if it is, it should be a good thing, because when a Seller accepts the Buyer’s request to cancel, the Seller IS satisfying the Buyer by delivering what the Buyer wants in the end… which is to cancel the order. And also, if you as a Seller refuse to accept a cancellation request, the Buyer can give you a negative rating, which can lower your level, which will potentially hurt much, much more than losing the money on a gig that the Buyer wants to cancel.

So Sellers are in a catch-22. They’re hurt if a Buyer cancels, even for no good reason, and they accept, even if it was done by a sneaky, intentional attacker bent on harming the Seller’s score. And they’re hurt if they refuse to accept the cancellation and the Buyer gives them a bad review, even if they did an extraordinary job.

That Fiverr counts cancellations against the Seller with a cold, context-less, robotic system demonstrates a fairly greedy motivation for using that metric, even if it is in fact used for other reasons.
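To make the cancellation-counting point concrete, here is a minimal sketch in Python. The order counts and the 90% threshold are invented for illustration; Fiverr’s exact formula isn’t published, so this is only an assumption about how such a metric could work.

```python
# Invented 60-day order history.
completed = 28
buyer_requested_cancellations = 3  # Seller agreed to cancel at the Buyer's request
other_cancellations = 1            # e.g. an order the Seller could not fulfil

total = completed + buyer_requested_cancellations + other_cancellations

# As described above: every cancellation counts against the Seller.
rate_counting_all = 100 * completed / total

# Alternative: buyer-requested cancellations are left out of the calculation.
rate_excluding_buyer_requests = 100 * completed / (total - buyer_requested_cancellations)

THRESHOLD = 90  # assumed completion-rate requirement, for illustration only

print(round(rate_counting_all, 1), rate_counting_all >= THRESHOLD)                          # 87.5 False
print(round(rate_excluding_buyer_requests, 1), rate_excluding_buyer_requests >= THRESHOLD)  # 96.6 True
```

Three cancellations the Seller agreed to in good faith are enough to drag the same 28 completed orders below the bar in the first version and leave them comfortably above it in the second.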

Also annoying: the very low posts-per-day limit on the forum here. If a post gets engagement, why limit daily activity on it when the content is good? There’s just no good reason for it. There are other ways to filter content so that the bad-quality stuff does not get through, and, last I knew, they had a great team in place doing that. So it’s not clear why the crude, definitely not properly data-driven, messages-per-day limit exists here.