UPDATED JANUARY 2013: CHANGES TO METHODOLOGY. We’ve now updated our rankings for 2013. In the process we made two notable changes to our ranking methodology.
- Missing Data. The first is how we handle missing data points. In the past, we simply allowed final scores to be averaged without the missing data; in effect, missing data had NO EFFECT on a hosting company's ranking. For 2013, we've decided that missing data should count against a company, either because it inhibits one's ability to judge the company, or because the company in question is too small or too poorly run to produce the data. In either case, we think missing data should lower a company's score. So where specific data is missing, we now set those data points to "0." Including zeros in the averages brings average scores down (see the first sketch after this list).
- 5th Highest/Lowest as Baseline Rather than 3rd Highest/Lowest. For each data series (e.g., Twitter Followers), there are often a few hosting companies that dramatically outperform the others. If these outliers are allowed to set the curve, the scores of all other hosting companies are skewed downward. That is not justified: once domains registered or Twitter followers cross some threshold, companies are effectively equal. In the past, we adopted the 3rd highest (or lowest) value as the baseline, which means three companies were scaled to 100% for every data series. For 2013, we've taken that logic a bit further and set the 5th highest (or lowest) as the baseline, so the top five in any category receive 100%. This reduces the variance between scores and makes all scores slightly stronger; the second sketch below compares the two baselines.
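To make the missing-data change concrete, here is a minimal sketch in Python (the function and variable names are our own illustration, not code from the study):

```python
def scores_2013(factor_scores):
    """2013 rule: missing data points (None) are set to 0, so a
    missing factor drags the company's average score down."""
    return [0.0 if s is None else s for s in factor_scores]

def average(scores):
    return sum(scores) / len(scores)

# A company missing one of four factor scores:
raw = [0.80, 0.65, None, 0.90]
print(average(scores_2013(raw)))  # 0.5875 -- the zero pulls the average down
```

And a sketch of the baseline change, comparing the old 3rd-highest baseline with the new 5th-highest one (again, the names and sample numbers are ours):

```python
def scale(values, baseline_rank):
    """Scale a 'higher is better' series to 0-100%, using the Nth
    highest value as the ceiling; scores above it are capped at 100%."""
    baseline = sorted(values, reverse=True)[baseline_rank - 1]
    return [round(min(v / baseline, 1.0), 2) for v in values]

followers = [250_000, 90_000, 40_000, 30_000, 20_000, 6_000]
print(scale(followers, 3))  # [1.0, 1.0, 1.0, 0.75, 0.5, 0.15]
print(scale(followers, 5))  # [1.0, 1.0, 1.0, 1.0, 1.0, 0.3]
```

As the output shows, the 5th-highest baseline pulls more companies up to 100% and lifts everyone else's scores, which is exactly the variance reduction described above.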
We took a fairly standard approach to developing the rankings:
- identify what people care about
- find data that measures or approximates those things
- use multiple measures where possible to increase confidence
- scale data so it’s comparable across each item being compared
- average and/or weight individual ranking factors to calculate a single overall Ranking
Of course, the devil is in the details. The main problem is that many of the things we care about are not easily measurable or not publicly available. We must instead find data that is publicly available and that acts as an effective proxy. It's critical that any ranking factor we use be available for the vast majority of hosting companies; otherwise it can't be used for comparison.
Ultimately, we decided on a list of 19 ranking factors for the first-ever version of our Rankings. We recognize this is an imperfect list, but we also think these factors provide a more comprehensive and objective view of each hosting company than almost any other study, ranking, or review site you'll find.
Below we've listed the ranking factors used in the first version of the study, broken down by category. For each category and/or Ranking Factor, we briefly describe the real issue we're trying to measure and why we chose the ranking factors we did.
VIEW THE DATA SOURCES
For each of our Ranking factors, we’ve provided links to every data source for every hosting company on our Data page. For that reason, links are not provided below. We focus instead on explaining why each factor was chosen and what kind of considerations surround each one.
THE RANKING FACTORS
POPULARITY
By popularity, we mean two things. The first is how many customers a hosting company has. Here the assumption is that, all things being equal, more customers means the company is doing more things right. In effect, customers are voting with their business. The second is how popular the company is among its customers, based on things like social followings.
For the first type of popularity, the ideal metric would be the number of customers. However, this data is not currently available. We may consider surveying hosting companies for this data in the 2013 rankings, but there's no reason to expect that most, especially smaller ones, will volunteer it. As a proxy for this kind of popularity, we've used Number of Domains Registered, Total Linking Root Domains, Total Linking CBlocks, and Alexa Ratings. The rationale and potential issues of each are described below.
As a proxy for the second type of popularity, we’ve used social media followings, both Twitter and Facebook. In addition, User Reviews and ratings are a viable proxy for popularity in this sense. We have looked at user review data, but for current purposes have categorized it under User Experience below.
1. Number of Domains Registered
This is the most direct proxy for number of customers. We used data publicly available at webhosting.info, one of the few sites that provides fairly comprehensive data on domain registrations by each hosting company. Unlike sources such as ICANN, this site uses a methodology that allows it to show registrations by individual hosting companies, even when they are subsidiaries or resellers. A few hosting companies were still missing from their data.
2. Total Linking Root Domains
We gathered this data from opensiteexplorer.org by SEOMoz. The logic was that Total Linking Root Domains actually approximates both kinds of popularity, partly because it reflects domains controlled and/or parked by the companies themselves (domain and customer volume), and partly because it reflects popularity in the sense of other sites linking back.
3. Total Linking CBlocks
Again, we gathered this data from opensiteexplorer.org by SEOMoz. The logic is the same as for #2, Total Linking Root Domains, but this provides an alternative measure that reduces the weight of domains likely to be owned/controlled by the hosting company itself, since such domains are likely to share CBlocks.
4. Alexa Traffic Rating
We collected this data by searching for each company’s website on Alexa.com. Anyone who has been in the web space for any length of time knows Alexa traffic rankings are unreliable indicators of actual traffic. They do, however, have some value as a comparative measure. We can reasonably trust that a site with an Alexa of 400 gets more traffic than a site with an Alexa of 40,000. The logic here is that more traffic volume to the site can act as yet another rough indicator of the overall popularity of a company.
5. Twitter Followers
Each company's current number of Twitter followers was obtained simply by looking at the appropriate Twitter account. On balance, we think Twitter and Facebook followings are reasonably solid indicators of popularity, almost by definition. The main caveat is that a company's social following is only partly determined by how much people like it; it's also determined by the extent to which the company has made building its social media presence a priority. Those that don't prioritize social media are thus penalized. That said, we think companies that invest in social media are intrinsically more customer focused, so that's a worthwhile tradeoff.
6. Facebook Likes
Each company's current number of Facebook likes was obtained simply by looking at the appropriate Facebook page. The logic is the same as for Twitter, so see just above.
7. Growth in Domains Registered
To calculate this, we used data from the same source as #1, webhosting.info. In addition to showing total domains registered, they show historical data for domain gains and losses. The very significant caveat is that only recent historical data is available: 5 weeks. So this Ranking Factor shows the five-week gain or loss in domains. Clearly, this creates the possibility that the data show a short-term variation rather than a medium-to-long-term trend. Alas, it is the best data we could find on changes in domains or customers over time.
8. Growth in Twitter Followers
We gathered data for three-month growth (or decline) in Twitter followers using Twittercounter.com. The logic is similar to that for using Twitter and Facebook followings more generally: if Twitter followers indicate popularity, growth or decline in followers indicates growth or decline in popularity. A sketch of the calculation follows.
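The study doesn't spell out the exact growth formula; a simple percentage change over the three-month window would look like this (a sketch under that assumption):

```python
def three_month_growth(followers_now, followers_3mo_ago):
    """Fractional growth in Twitter followers over three months;
    negative values indicate a decline (later floored at 0% in the
    scaling step described further below)."""
    return (followers_now - followers_3mo_ago) / followers_3mo_ago

print(three_month_growth(12_600, 10_500))  # 0.2 -> 20% growth
print(three_month_growth(9_000, 10_000))   # -0.1 -> a decline
```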
9. Growth in Google Trends Brand Trust
For each hosting company, we looked at Google Trends data for the period from October 2010 to September 2011 by checking search volume for each company's brand: "godaddy," "bluehost," "ipage," etc. This data shows how search volume for each brand changed over that period. It's important to note that Google Trends data are "normalized" so that the highest month of search volume equals 100. All graphs top out at 100, whether the brand gets 100 searches or 10,000. That means you can't use Google Trends data to get a sense of how many searches one brand gets relative to another; you can only use it to see how search volume for a particular brand has changed over time, as the sketch below illustrates.
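Here is a short sketch that mimics the per-series normalization, showing why normalized Trends data can't be compared across brands:

```python
def normalize_like_trends(monthly_volume):
    """Scale a search-volume series so its peak month equals 100,
    mimicking Google Trends' per-series normalization."""
    peak = max(monthly_volume)
    return [round(100 * v / peak) for v in monthly_volume]

# Two brands with wildly different absolute volume produce
# identical normalized series:
print(normalize_like_trends([100, 50, 100]))           # [100, 50, 100]
print(normalize_like_trends([10_000, 5_000, 10_000]))  # [100, 50, 100]
```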
CREDIBILITY AND PROMINENCE
10. SEOMoz Domain Authority
SEOMoz Domain Authority is a highly aggregated metric in its own right, one that takes into account dozens of factors that indicate popularity, trust, and credibility. SEOMoz uses Domain Authority to rate any website's potential to rank well in search engines; keep in mind that search engines, as a goal, want to rank the highest quality, most trustworthy, and most credible sites for any search. For that reason, we think this is a great overall metric.
11. SEOMoz MozTrust
SEOMoz MozTrust is designed to measure the extent to which a website is well connected to a core set of highly trusted websites. To some extent, it acts as a counterweight to situations where a website might have large numbers of links and popularity among smaller or less trustworthy sites. We think this metric is a reasonably solid indicator of credibility and trust.
12. WebHostingGeeks User Reviews
WebHostingGeeks is, in our opinion, the best site for reading user reviews of hosting companies. As we've noted elsewhere, most hosting review sites are, without exaggeration, terrible. Even with WebHostingGeeks, we have serious concerns about what percentage of user reviews are fake. If reviews were all real, you would expect the most popular hosting companies to have the largest number of reviews. What you actually see is that a certain set of hosting companies tends to have the largest number of reviews on almost all hosting review sites. There's just no reason that smaller companies like InMotion, iPage, FatCow, and WebHostingPad should have as many as 4 times more reviews than a company like GoDaddy, which is actually 25 times larger than all four of those companies combined in terms of domains registered. At the very least, this can only occur if those companies are actively encouraging users to go to review sites. WebHostingGeeks is the most trustworthy of the review sites, but this is still a major concern. If we knew, without a doubt, that reviews represented the honest opinions of real users, we would give this single metric a huge percentage of the total weight in the rankings. As it is, our concerns about inaccuracy and fake reviews compel us to give it no more weight than any other factor.
13. 24/7 Support?
This is a simple, objective metric that says a lot about customer service. Hosting companies are global, and sites can have problems at any time. In our view, that makes 24/7 support all but necessary. Offering it not only means users can get support at any point, it also indicates a strong overall service orientation.
14. Phone Support?
Our rankings generally target shared hosting, and most shared hosting customers are not highly technically skilled. For such users, the ability to connect with someone on the phone to talk through an issue is enormously important and valued. So Phone Support is important in its own right. Like 24/7 support, it also indicates a strong overall focus on customer service and support.
15. Live Chat Support?
We think Phone Support is a more important Ranking Factor, and it is generally in higher demand among consumers. That said, a sizeable minority of users strongly prefer live chat support, so its availability is important. As with 24/7 Support and Phone Support, offering Live Chat also means a company has a strong service orientation and is committed to making many different support channels available. A sketch of the presumed scoring for these yes/no factors follows.
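The study doesn't state how these yes/no factors are scored; the natural mapping would be offered = 100%, not offered = 0% (our assumption):

```python
def binary_factor(offered: bool) -> float:
    """Presumed scoring for yes/no factors such as 24/7 Support,
    Phone Support, and Live Chat Support (our assumption; the study
    doesn't spell out the mapping)."""
    return 1.0 if offered else 0.0

print(binary_factor(True))   # 1.0 -> full credit for offering the channel
print(binary_factor(False))  # 0.0
```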
16. Better Business Bureau Rating
The Better Business Bureau is a US-based agency that tracks consumer complaints and responsiveness, and rates each company on a scale from A+ to F. We translated their letter grades as follows: A+ (100%), A (96%), A- (94%), B+ (92%), B (88%), B- (84%), C+ (80%), C (76%), C- (72%), D+ (68%), D (64%), D- (62%), F (58%). For companies that have BBB ratings, this is a great metric for flagging any serious customer service problems. The main caveat is that this is a US-centric metric: several hosting companies based elsewhere do not have BBB pages, including 1and1, 123-reg, and OVH.
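The letter-grade translation above, expressed as a lookup table (the helper name is our own):

```python
# Letter-grade translation from the paragraph above.
BBB_GRADE_PCT = {
    "A+": 100, "A": 96, "A-": 94,
    "B+": 92, "B": 88, "B-": 84,
    "C+": 80, "C": 76, "C-": 72,
    "D+": 68, "D": 64, "D-": 62,
    "F": 58,
}

def bbb_score(grade):
    """Return the ranking percentage for a BBB letter grade, or None
    for companies without a BBB page (treated as missing data)."""
    return BBB_GRADE_PCT.get(grade)

print(bbb_score("A-"))  # 94
print(bbb_score(None))  # None -> missing data point
```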
17. Number of Better Business Bureau Complaints Per 10,000 Users.
Two main factors seem to shape BBB ratings: how many complaints a company gets via the BBB, and how much the company works with the BBB to resolve them. We think BBB ratings give too much weight to the latter. The fact is that if a company gets a lot of consumer complaints, that's a problem whether it works with the BBB or not. Conversely, a company that gets few complaints, but doesn't work with the BBB as a matter of policy, would be graded unfairly low. For this reason, we've also included the number of complaints, scaled to take the size of the company into account, as sketched below.
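A sketch of the per-capita complaint rate; we assume registered domains as the size proxy, since that is the study's customer proxy throughout:

```python
def complaints_per_10k(total_complaints, registered_domains):
    """BBB complaints scaled to company size (assumed proxy:
    registered domains). Lower is better for this factor."""
    return total_complaints * 10_000 / registered_domains

# A large host with more total complaints can still do better per capita:
print(complaints_per_10k(1_200, 30_000_000))  # 0.4 complaints per 10k domains
print(complaints_per_10k(90, 200_000))        # 4.5 complaints per 10k domains
```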
PRICE & VALUE
18. Price
Price is an obvious ranking factor, though including it was a matter of much debate. On one side, people care about price, and if a company offers the same quality and service for a lower price, then it should be more attractive and ranked higher. On the other side, the price difference between many of these companies is marginal, and there is some argument that pricing too low is actually itself an indicator of lower quality. Despite these concerns, we've decided to include price in our first version of the Rankings.
19. Includes “Unlimited” Domains, Storage, and Bandwidth?
This is a very crude overall measure of what's technically included at a given price point. A large percentage of the companies in our study offer so-called unlimited plans. Everyone recognizes that these are not, in reality, unlimited. There is some level of storage or bandwidth usage past which it's simply uneconomical for a hosting company to support a single customer for $6.95 per month. Despite this obvious limitation on the meaning of "unlimited," there is still a difference between an unlimited plan and one that allows a specific number of domains or a limited amount of storage, and then adds charges when users go beyond those limits. Whenever a hosting company offered an unlimited plan, we used that plan's pricing, and the company received full credit for "unlimited." For companies that do not offer "unlimited," we used the most comparable price, and then penalized them to some extent for having more add-on prices than most other companies in our study.
HOW THE FINAL RANKING IS CALCULATED
Once we collected the raw data for each of the Ranking Factors above for each hosting company, there was still the issue of how to scale that data and average it. For example, if two companies have Alexa Ratings of 490 and 12,435, how do you translate those into numbers that can be compared and scaled?
Our goal was to convert the raw data for every ranking factor into a percentage on a scale from 0% to 100%, much like a school grade. Once that was done, it was simple enough to average those scores to get a final ranking score on the same scale for each company.
In order to convert each raw data series to a 0-100% scale, we had to come up with a consistent and fair logic.
For any data set in which larger numbers are better (Number of Domains Registered, Linking Root Domains, CBlocks, social followers, growth rates, etc.), we adopted a simple rule. We took the third highest value, set it as the reasonable ceiling or high point, and calculated all other percentages against it. So the calculation for Company A would be (Company A's Data)/(Third Highest Company's Data). Since it's the third highest, at least three companies will score 100% on any Ranking Factor. The reason for using the "third" highest rather than the first highest is to avoid basing all percentages on outliers. For example, GoDaddy has more domains registered than all the other sites in this study combined. If we used that as the base for calculating percentages, GoDaddy would score 100% and every other company in the study would be at less than 10%. A sketch follows.
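In code, the v1 "higher is better" rule looks like this (illustrative names and numbers; note that the 2013 update described at the top moves the baseline to the 5th highest):

```python
def pct_higher_is_better(value, data):
    """v1 rule: percentage = value / third-highest value in the
    series, capped at 100% so at least three companies score 100%."""
    third_highest = sorted(data, reverse=True)[2]
    return min(value / third_highest, 1.0)

# One dominant outlier no longer flattens everyone else's scores:
domains = [50_000_000, 4_000_000, 1_200_000, 800_000, 300_000]
print([round(pct_higher_is_better(v, domains), 2) for v in domains])
# [1.0, 1.0, 1.0, 0.67, 0.25]
```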
For data where lower scores are better (Alexa Ratings, BBB Complaints, etc.), we had to calculate the third lowest as well as the third highest for each data set. In this case, the three LOWEST values receive 100%, since "low" for these items means highest performing. All other data points were then scaled in a similar way against the third highest value (the third worst performer), inverted so that lower raw values yield higher percentages.
Finally, we set 0% as the floor for any percentage. We did not allow any company to have a negative percentage on any data set. This might have happened, for example, if a company was experiencing negative growth rates and we used those negative rates to calculate its percentage. So any percentage that came out below 0% was set to 0%. One plausible implementation of the "lower is better" scaling, floor included, is sketched below.
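The prose above doesn't pin down the exact inversion formula. One plausible reading, consistent with "the three lowest get 100%" and the 0% floor, maps the third lowest value to 100% and the third highest to 0% (this formula is our assumption, not one stated in the study):

```python
def pct_lower_is_better(value, data):
    """One plausible 'lower is better' rule (our assumption): the
    third lowest value maps to 100%, the third highest to 0%, with
    results clamped to the 0-100% range."""
    s = sorted(data)
    third_lowest, third_highest = s[2], s[-3]
    pct = (third_highest - value) / (third_highest - third_lowest)
    return min(max(pct, 0.0), 1.0)

alexa = [400, 900, 1_500, 5_000, 8_000, 40_000, 120_000]
print([round(pct_lower_is_better(v, alexa), 2) for v in alexa])
# [1.0, 1.0, 1.0, 0.46, 0.0, 0.0, 0.0]
```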
Using that logic, we scaled every data set to a 0-100% scale.
We then used a simple average to calculate the overall ranking. This also requires some comment. Instead of a simple average, one could argue it would be better to weight certain factors more heavily, either because they're more important or because their data is more reliable. We did run a few versions of the rankings with weighted factors, but there were two issues. First, none of the individual ranking factors is a stand-out that obviously deserves dramatically more weight, so the differences in weights ended up being quite small, generally 5-10% each. That means the difference between using weights and simple averages was not large. The second consideration was simplicity, which came into play when dealing with missing data points. For averages, we just dropped missing data points from the average, so a company was neither helped nor hurt. With a weighted formula, the weights would need to be recalculated for each company that is missing a data point. This is quite possible to do, but it adds a lot of complexity for almost no noticeable difference in the end result.
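A sketch of the v1 averaging rule, dropping missing data points (contrast this with the 2013 zero-fill sketch near the top of this page):

```python
def overall_score_v1(factor_scores):
    """v1 final score: a simple, unweighted average over available
    factors; missing data points (None) are dropped, so a missing
    factor neither helps nor hurts the company."""
    present = [s for s in factor_scores if s is not None]
    return sum(present) / len(present)

print(overall_score_v1([0.80, 0.65, None, 0.90]))  # ~0.783
```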
LIMITATIONS AND CAVEATS
This is the first time we’ve done our rankings, and as we’ve noted before, one of the big challenges is that most data we’d really like to have is unavailable, either because things just aren’t measurable or because data is confidential to the hosting companies themselves. As a consequence, we must rely on proxies. Most of the limitations and caveats relate either to data that is missing, or to concerns we have about the proxies that were available.
- Heavily weighted toward popularity, harder for new or small companies. The first caveat is that our rankings are weighted heavily toward popular hosting companies. Many of the Ranking Factors either directly measure popularity or are likely to correlate with it. We think this is a reasonable outcome, since the most popular hosting companies have obviously done things right in the past. However, we recognize that the rankings are potentially biased against outstanding but smaller hosting companies. We tried to offset this by including measures of growth, and by including objective measures of customer and user experience.
- Focused on simple shared hosting. This study and the resulting rankings focus on simple shared hosting plans. We may consider ranking companies based on their VPS, Dedicated, Cloud, and other hosting offerings, but those involve many more complex needs, and what customers are looking for often varies from company to company. Our sense is also that anyone with such advanced needs would be less likely to consult this kind of ranking and would end up doing their own detailed research regardless. So these considerations were left out of the current study.
- No data on number of users. The data we would most like to have is the number of customers each hosting company has, and how that number has changed over the last 1-3 years. This would be the most direct way to measure popularity and growth, and if we had it, this single measure could reliably replace many of the other metrics above. We are considering surveying hosting companies for this data in next year's version.
- Missing data points, excluded from average Ranking. Even though we tried to use ranking factors that were available for almost all hosting companies, some data points were missing. Where data was missing, you'll see "Not Available" on that hosting company's Ranking Report. When data was missing, rather than penalize the hosting company in question, we simply removed that factor from the final ranking. (As noted at the top, this changed for 2013.)
- Heavy reliance on SEOMoz. We used several metrics from SEOMoz, including Domain Authority, MozTrust, Linking CBlocks, and Total Linking Root Domains. We did this largely because SEOMoz data is available for every company's website, and also because the data itself is among the most reliable. But many of these measures are highly correlated (like popularity measures in general), so a company that does well on any one of them is likely to do well on the others.