The problem with download-based rankings: downloads do not always mean quality
Many people rely on rankings for app discovery. Knowing what most people download is helpful when deciding what to get. But is that enough? Far from it. Not only because rankings do not answer your specific needs, but mainly because a high position in the charts does not guarantee the quality of an app.
Here are a few cases where this happens:
- More and more developers use price shifting to accelerate their download rate: an app originally priced at $X drops to $X/2, or sometimes to zero. Then, through services like FreeAppAday and many others, users download the app on impulse. There is nothing to lose. But that does not mean the app is great; it just means it was free, and free drives downloads.
- If this technique is used repeatedly, it can create an artificial ranking that does not reflect the natural popularity of an app.
- A few services help apps collect reviews, sometimes with the goal of climbing the rankings: they pay users to leave better reviews. This is not really ethical, but this is the world we live in, and it is impossible to tell the difference in the App Store. All you see is a rating, not the source that drove it.
- Other services pay early users to download an app. This is of course not sustainable, and you could spot these apps by looking closely at their release date and ranking history (a simple check is sketched just below this list). But again, most people don't do it.
- Featured vs. non-featured: if you get lucky, Apple picks your app, it appears in the featured/favorite/hot sections, and that sends it into the app stratosphere. This reflects the taste of an editorial team, but not necessarily your taste, and you will not really understand why those apps are supposed to be so great.
- Today TheNextWeb also showed how some developers gamed the system and pushed crappy apps high in the rankings by hacking iTunes accounts. The result? Poor apps appear near the top, while quality apps remain hidden. Here is TheNextWeb's conclusion:

"When some apps are left waiting weeks for approval, only to be rejected by Apple for minor objections, how does a company with no website, no description and apps that are literally swarming iTunes escape punishment? More importantly, how has someone managed to hack users' accounts and left many, we can only assume, unaware they've been robbed?"

Download-based rankings are not an indication of quality or of a good match for you; they are an indication of global download activity. Which is very different.
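To make the point about spotting paid or price-driven download spikes concrete, here is a minimal sketch of the kind of check a curious user or tool could run on an app's rank history. It is not a real detection system; the data format and the threshold are made-up assumptions.

```python
from datetime import date

def suspicious_jumps(rank_history, min_jump=50):
    """Flag days where an app's chart position improved by more than
    `min_jump` places overnight (lower rank = better)."""
    flagged = []
    for (d1, r1), (d2, r2) in zip(rank_history, rank_history[1:]):
        if r1 - r2 >= min_jump:  # e.g. rank 135 -> 25 in one day
            flagged.append((d2, r1, r2))
    return flagged

# Hypothetical rank history: the jump coincides with a one-day price drop to free.
history = [
    (date(2010, 7, 1), 140),
    (date(2010, 7, 2), 135),
    (date(2010, 7, 3), 25),
    (date(2010, 7, 4), 22),
]
print(suspicious_jumps(history))  # [(datetime.date(2010, 7, 3), 135, 25)]
```

Such a jump is not proof of gaming, but it is exactly the kind of pattern a download-only ranking happily rewards.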
What would really make a difference?
A ranking based on usage: which apps are actually used. Apple could release one if it wanted; it has that data but uses it poorly in Genius for Apps. Flurry, the leading app analytics company, could have built it too, but I doubt it will now, since it got in trouble with Apple.
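As a rough illustration of the idea (the metric and the numbers below are invented, not anyone's actual data), a usage-based ranking could sort apps by how much they are actually opened relative to how often they are downloaded:

```python
# Hypothetical per-app metrics; a usage-based ranking sorts by real activity,
# not by raw download counts.
apps = [
    {"name": "App A", "downloads": 500000, "weekly_sessions": 20000},
    {"name": "App B", "downloads": 60000,  "weekly_sessions": 45000},
    {"name": "App C", "downloads": 250000, "weekly_sessions": 5000},
]

def usage_score(app):
    # Sessions per download: a crude proxy for "people keep using it".
    return app["weekly_sessions"] / app["downloads"]

for app in sorted(apps, key=usage_score, reverse=True):
    print(f'{app["name"]}: {usage_score(app):.2f} weekly sessions per download')
```

In this toy example the least-downloaded app comes out on top, because it is the one people keep coming back to.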
What else could make a difference?
A ranking based on your social graph (for which we need Apple to approve Appsfire 2.0, which has been waiting for over 80 days...). Knowing what your friends are using is a fantastic filter.
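To show why a social filter is so powerful, here is a minimal sketch that ranks apps by how many of your friends use them. The friends and app lists are made up, and this is not Appsfire's actual implementation.

```python
from collections import Counter

# Hypothetical social graph: which apps each friend uses.
friends_apps = {
    "alice":   {"Angry Birds", "Evernote", "Shazam"},
    "bob":     {"Evernote", "Shazam"},
    "charlie": {"Shazam", "Some Spam App"},
}

def social_ranking(friends_apps):
    # Count how many friends use each app and sort by that count.
    counts = Counter(app for apps in friends_apps.values() for app in apps)
    return counts.most_common()

for app, n in social_ranking(friends_apps):
    print(f"{app}: used by {n} of your friends")
```

An app your whole social graph keeps on its home screen is a far stronger signal than an app a million strangers downloaded once because it was free for a day.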
This is why we created AppTrends: a live ranking based on people's excitement about an app, not on downloads. It is the closest thing you'll find to a quality indicator for apps people really love. For any app, you can see what people are actually saying about it, and who those people are (we filter them to avoid noise and irrelevance).
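To give a feel for what "excitement" means here (this is only a sketch under assumed inputs, not AppTrends' real pipeline), such a score could be the share of positive mentions coming from accounts that pass a noise filter:

```python
def excitement_score(mentions, noisy_accounts):
    """`mentions` is a list of (author, is_positive) pairs."""
    # Drop mentions from accounts flagged as noise (spam, promo bots, etc.).
    filtered = [(a, pos) for a, pos in mentions if a not in noisy_accounts]
    positives = sum(1 for _, pos in filtered if pos)
    return positives / max(len(filtered), 1)  # share of positive buzz

mentions = [("user1", True), ("user2", True), ("promo_bot", True), ("user3", False)]
print(excitement_score(mentions, noisy_accounts={"promo_bot"}))  # ~0.67
```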
As we grow, we'll release more indicators.