Usurp The SERP – Planning The Takeover

The example I have provided is very small, to the extent that the nitty-gritty analysis can be done manually without taking an excessive amount of time.

However, if you needed to identify tens of thousands of keywords to invest in as part of an enterprise SEO strategy, this would require a more scalable approach to evaluating individual keyword opportunities.

This can be achieved by designing an opportunity evaluation model.

Let me preface this by saying it is not a complete model; it is just an approach to analyzing competitive SEO drivers at a large scale. However, if refined to project revenues within an order of magnitude, it can become a powerful business driver.

Due to intellectual property rights, I can only share part of the model, but I believe it is enough to get you thinking and moving in the right direction.

The heuristics I use in the model are as follows:

  • Domain Authority (DA) - a logarithmic measure of a domain's authority and ranking potential.
  • Page Authority (PA) - a linear measure of an individual URL's link and relevancy strength.
  • Number of competing pages with target keyword in title (CP) - relative measure of SERP competitive landscape.
  • Monthly Search Volume (MSV) - average exact local monthly search volume for individual keyword.
  • Click-Through Rate (CTR) - average SERP click-through rate for URLs ranking in position 'y'.
  • Conversion Rate (CR) - average conversion rate for the target goal.
  • Monetization Metric (M) - measure of approximate revenue per conversion; for eCommerce this may be average order size, for display advertising this would be eCPM, or earnings per 1,000 pageviews.
  • Discount Rate (dr) - rate at which a heuristic is discounted to adjust its relative significance within the model.
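
To make the relationships concrete, here is a minimal sketch in Python of how these heuristics might combine into a per-keyword revenue projection. The function name and the sample CTR curve are illustrative assumptions on my part, not values from the original model.

```python
# Hypothetical sketch: projecting monthly revenue for a keyword at a
# given SERP position. The CTR curve below is a placeholder assumption;
# substitute your own observed click-through data.

CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def projected_monthly_revenue(msv, position, cr, m):
    """MSV x CTR(position) x CR x M = expected revenue per month.

    msv      -- monthly search volume for the keyword
    position -- assumed ranking position (drives CTR)
    cr       -- conversion rate for the target goal
    m        -- monetization metric (revenue per conversion)
    """
    ctr = CTR_BY_POSITION.get(position, 0.01)  # long-tail fallback
    return msv * ctr * cr * m

# Example: 10,000 searches/month, position 3, 2% CR, $40 per order
print(projected_monthly_revenue(10_000, 3, 0.02, 40.0))  # 800.0
```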

For DA and PA, I used the average of the top n sites; n will vary in size based on your confidence interval requirements.
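
As a sketch of that top-n averaging, assuming the DA/PA scores for the ranking URLs have already been pulled from a source such as Moz's API (the tuple structure here is hypothetical):

```python
def average_authority(results, n):
    """Average DA and PA across the top n ranking results.

    results -- list of (da, pa) tuples ordered by SERP position;
               in practice these scores would come from an
               authority-metrics API, which is not called here.
    """
    top = results[:n]
    avg_da = sum(da for da, _ in top) / len(top)
    avg_pa = sum(pa for _, pa in top) / len(top)
    return avg_da, avg_pa

# Example: widen n for a tighter confidence interval on the SERP average
print(average_authority([(72, 55), (64, 48), (58, 61), (40, 35)], n=3))
```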

The model is then built in a spreadsheet so that each of the heuristics can be driven by assumptions (e.g. cost per piece of content, average cost per link, SERP CTR, average revenue per page or per conversion), and the assumptions can be adjusted as historical data is collected.
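
In code form, that assumptions layer might look like a single block of inputs that every downstream calculation reads from, so that updating one figure as real performance data accumulates reprices the whole keyword set. These particular keys and values are illustrative only:

```python
# Illustrative assumptions block -- edit these as historical data is
# collected, and every keyword projection updates with them.
ASSUMPTIONS = {
    "cost_per_content": 150.0,   # avg cost to produce one page
    "cost_per_link": 75.0,       # avg cost to acquire one link
    "conversion_rate": 0.02,     # target-goal CR
    "revenue_per_conversion": 40.0,
}

def projected_cost(pages_needed, links_needed, a=ASSUMPTIONS):
    """Total investment implied by the current assumptions."""
    return (pages_needed * a["cost_per_content"]
            + links_needed * a["cost_per_link"])

print(projected_cost(pages_needed=1, links_needed=10))  # 900.0
```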

What this allows you to do is build a large-scale model for evaluating each keyword, projecting the cost, time, and potential return, and using discount rates to adjust the relative importance of the varying competitive factors: DA, PA, and links.
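
One way to read "discounting each heuristic" is as a weighted difficulty score, where each discount rate scales its factor's contribution. The specific weights and normalizations below are stand-ins I chose for illustration, not the original model's values:

```python
import math

def difficulty_score(avg_da, avg_pa, competing_pages,
                     dr_da=0.5, dr_pa=0.3, dr_cp=0.2):
    """Blend the competitive factors into a single 0-1 difficulty figure.

    Each dr_* discount rate scales its heuristic's relative
    significance; the three rates should sum to 1. Competing
    pages (CP) is normalized on a log scale because it spans
    orders of magnitude.
    """
    cp_norm = min(math.log10(max(competing_pages, 1)) / 8, 1.0)
    return dr_da * (avg_da / 100) + dr_pa * (avg_pa / 100) + dr_cp * cp_norm

# Example: strong incumbents (avg DA 65, avg PA 55), 120k competing titles
print(round(difficulty_score(65, 55, 120_000), 3))  # ~0.617
```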

I realize this may be hard to visualize, as it is not common in the SEO industry to talk about automating keyword analysis for thousands of keywords.