A client recently attended a webinar where the presenter shared a list of the major search engine ranking factors. The client asked me to confirm the list or provide my own. I replied that I don't waste time preparing such generic lists. Besides, the presenter's list omitted an obvious factor for searches from mobile devices: physical proximity to nearby businesses.
If you ask five reputable SEO practitioners for their top 10 ranking factors, you will probably get five different answers. Everyone speaks honestly from their own experience, but that experience may not apply directly to your situation. A fair amount of SEO work is guesswork.
In fact, checklists of generic ranking factors have long been irrelevant. But rather than debate their merits, in this article I will lay out a solid, data-driven framework for identifying which ranking factors and initiatives apply to your site, and what you need to do to systematically improve your organic search traffic and sales.
A popular approach in SEO is to learn by studying top-ranking competitors. The disadvantage of this approach is that you never have clear visibility into your competitors' strategies and tactics. In addition, the metrics from competitive-analysis tools are imprecise, in my experience. (You can easily confirm this by comparing their numbers for your own site with your analytics package.)
When you look closely at your own site, you will probably find groups of pages that rank better than others. You can compare the SEO factors of those pages against the less successful ones and use what you learn to shape your best SEO strategy.
Optimal page length
For example, a common question I get from clients is: "What is the optimal number of words for my pages?"
The simple answer is that your content should be as long as necessary to help your audience. Generally, though, more thorough pages tend to perform better. In fact, we can group a site's pages to see whether the best performers gravitate toward a specific content length.
On the Y axis, above, pages are grouped by word count: more than 0, more than 1,000, more than 3,000, and so on. On the X axis is the average number of new organic visitors.
Most pages on this site do not hit the optimal word count (about 3,000 words) as measured by actual performance: the average number of new organic visitors. That gives us a good reason to experiment with adding more content to the underperforming pages.
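This kind of bucketing is easy to reproduce outside a BI tool. Below is a minimal sketch in pandas; the URLs, word counts, and visitor figures are illustrative stand-ins, not data from the site above:

```python
import pandas as pd

# Hypothetical page-level data: word count and new organic users per page.
pages = pd.DataFrame({
    "url": ["/a", "/b", "/c", "/d", "/e", "/f"],
    "word_count": [250, 900, 1500, 2800, 3100, 5200],
    "new_users": [10, 25, 60, 140, 150, 90],
})

# Bucket pages into word-count bins, then average new users per bin.
bins = [0, 1000, 3000, 5000, float("inf")]
labels = ["0-1k", "1k-3k", "3k-5k", "5k+"]
pages["wc_bin"] = pd.cut(pages["word_count"], bins=bins, labels=labels)
avg_new_users = pages.groupby("wc_bin", observed=True)["new_users"].mean()
print(avg_new_users)
```

The same groupby-and-average pattern drives every visualization in this article; only the binned attribute changes.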
Another frequent question concerns the optimal length of meta tags, such as titles and meta descriptions.
On the Y axis, I've grouped the pages by meta description length. The X axis shows the average number of new organic visitors.
In this case, we can see that the optimal meta description length for attracting new visitors is 152.6 characters.
These analyses do not necessarily mean that increasing word counts and meta description lengths will improve search rankings. They simply mean that the pages attracting the most new visitors share these attributes. That is useful because it points to clear SEO experiments to try.
Let's walk through one last, slightly more sophisticated example. After that, I'll show you how to build these visualizations.
I will be using data from the new and useful Google Index Coverage report, which will be included in an upcoming upgrade to Search Console. The report is not yet available to everyone, but Google promises to roll it out soon. The Index Coverage report finally lets us see which pages Google has indexed, as well as the reasons other pages are not indexed.
Google has a detailed help document explaining all the reasons pages are indexed – and why they are not. However, the report does not tell you whether pages are left out because they lack inbound links or content.
It is interesting to see that the pages Google labels "indexed, low interest" have fewer words than the rest of the indexed pages. But when we look at inbound internal links, below, a clearer picture emerges.
On the Y axis we have the average number of inbound internal links to pages, and the X axis splits them in two: indexed (left column) or not indexed (right column). The colors break down in more detail the reasons pages are or are not indexed.
It turns out that the number of inbound internal links to a page is a major factor in whether Google keeps it in the index (for this site) or drops it. This is a very powerful insight. If this site wants its most valuable pages indexed, it needs to link to them aggressively.
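To reproduce this analysis outside a BI tool, you can group pages by indexation status and reason and average their inbound internal links. A minimal pandas sketch follows; the URLs, statuses, reason labels, and link counts are fabricated for illustration:

```python
import pandas as pd

# Hypothetical merge of an Index Coverage export with internal-link counts.
pages = pd.DataFrame({
    "url": ["/p1", "/p2", "/p3", "/p4", "/p5", "/p6"],
    "status": ["indexed", "indexed", "indexed",
               "excluded", "excluded", "excluded"],
    "reason": ["submitted and indexed", "submitted and indexed",
               "indexed, not submitted in sitemap",
               "crawled - currently not indexed",
               "crawled - currently not indexed",
               "discovered - currently not indexed"],
    "internal_links": [120, 85, 40, 3, 1, 0],
})

# Average inbound internal links, split by indexation status and reason.
summary = pages.groupby(["status", "reason"])["internal_links"].mean()
print(summary)
```

A gap like the one here (dozens of links to indexed pages, near zero to excluded ones) is the pattern the visualization surfaces.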
Visualization of the data
Now I'll walk through my process for building these visualizations in a business intelligence tool – I'm using Tableau.
Step 1: Extract performance data from Google Analytics to get outcome metrics such as traffic, conversions, engagement, and revenue.
I will use a handy Google Sheets add-on that makes it easy to query the Google Analytics API and overcomes the limitations of the user interface.
Create a blank Google spreadsheet and navigate to Add-ons > Get add-ons > Google Analytics. After completing the authorization step, you will see a pop-up window like this:
Enter the metrics (New Users, Pages/Session, Avg. Session Duration, Page Load Time (ms), Avg. Order Value, and Transactions) and dimensions (Source/Medium, Landing Page) that I have included above. I like to add Source/Medium so I can confirm that I'm looking only at organic search traffic.
After creating the report, filter the traffic to organic search only, and set the date range to analyze. Use "Max Results" and "Start Index" to page through large volumes of data and pull everything you need beyond the 5,000-row limit of Google Analytics reports.
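The Max Results / Start Index paging can also be scripted. Here is a minimal sketch of the loop, with the actual Analytics API request replaced by a stub; `fetch_page` and the 12,345-row total are stand-ins, not a real client:

```python
# Page past the 5,000-row report limit using max-results / start-index.
PAGE_SIZE = 5000

def fetch_page(start_index, max_results):
    # Stand-in for the Google Analytics API request; returns a slice of rows.
    total_rows = 12345  # pretend the full report has this many rows
    end = min(start_index - 1 + max_results, total_rows)
    return list(range(start_index, end + 1))

def fetch_all():
    rows, start = [], 1
    while True:
        page = fetch_page(start, PAGE_SIZE)
        rows.extend(page)
        if len(page) < PAGE_SIZE:  # short page means we reached the end
            break
        start += PAGE_SIZE
    return rows

all_rows = fetch_all()
print(len(all_rows))
```

The stopping condition (a page shorter than the page size) is the same logic you apply manually when advancing Start Index in the add-on.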
Then go to Add-ons > Google Analytics > Run reports to get the data.
Step 2: Next, do a basic cleanup of the data to prepare it for analysis.
First, delete the report-information rows 1-14. The values under ga:landingPagePath must be absolute URLs. You can do this in a separate sheet and copy the results back.
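If you prefer scripting the cleanup, here is a minimal sketch of the path-to-URL conversion; the domain is a placeholder you would replace with your own:

```python
# Convert relative ga:landingPagePath values into absolute URLs so they
# match the URLs in the crawler export. DOMAIN is an assumed placeholder.
DOMAIN = "https://www.example.com"

def to_absolute(path):
    if path.startswith("http://") or path.startswith("https://"):
        return path  # already absolute, leave untouched
    return DOMAIN + path

paths = ["/", "/blog/post-1", "https://www.example.com/about"]
urls = [to_absolute(p) for p in paths]
print(urls)
```

In a spreadsheet, a simple concatenation formula accomplishes the same thing.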
Step 3: Run an SEO spider, such as Screaming Frog, on the pages we pulled in Step 2 to get their SEO metadata.
Copy the updated ga:landingPagePath column, now containing absolute URLs, to the clipboard.
In Screaming Frog's list mode, paste the URLs you just copied and let the spider run to retrieve the relevant SEO metadata.
Once the crawl is complete, click the Export button. Load the CSV file into your Google sheet as a separate tab.
Step 4: Next, we connect our datasets in our business intelligence tool. Again, I use Tableau, but Google Data Studio or Microsoft Power BI work as well.
I join the two datasets on their common page URLs. In the Google Analytics dataset, the column is ga:landingPagePath. In the Screaming Frog crawl, it is the Canonical Link Element 1 column. If your site does not have canonicals (it should), you can use the Address column instead.
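The same join can be sketched in pandas before anything reaches a BI tool. The column names follow the two exports described above; the data itself is made up:

```python
import pandas as pd

# Hypothetical rows from the Google Analytics export.
ga = pd.DataFrame({
    "ga:landingPagePath": ["https://ex.com/a", "https://ex.com/b"],
    "new_users": [120, 45],
})

# Hypothetical rows from the Screaming Frog export.
crawl = pd.DataFrame({
    "Canonical Link Element 1": ["https://ex.com/a", "https://ex.com/b"],
    "Word Count": [2400, 800],
    "Status Code": [200, 200],
})

# Inner join on the page URL, mirroring the relationship defined in Tableau.
joined = ga.merge(
    crawl,
    left_on="ga:landingPagePath",
    right_on="Canonical Link Element 1",
    how="inner",
)
print(joined[["ga:landingPagePath", "new_users", "Word Count"]])
```

An inner join drops pages present in only one export, which is usually what you want here: every row carries both performance data and crawl metadata.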
Step 5: Finally, create the visualization.
For this article, the first visualization (above) is "New users by number of words".
To replicate this in Tableau, drag the New Users metric (called a "measure" in Tableau) and drop it on Columns. Then use its drop-down menu to switch the default aggregation from sum to average.
Next, right-click the "Number of words" measure and select Create > Bins…. This creates a new dimension called "Number of words (bin)". Drag it to Rows.
Then right-click the "Canonical Link Element" dimension and select "Convert to Measure"; this gives a count of distinct canonicals. Drag this measure to Color and use the "Temperature Diverging" palette.
Finally, drag the Status Code dimension to Filters, and check only "200" to filter out errors and redirects.
Follow these same steps to replicate the other visualizations in this article. The last one, "Pages Indexed by Internal Links," requires access to the new Index Coverage report, which Google is rolling out slowly.