Reverse Engineering Google’s Search Engine Result Pages

In my youth I was always breaking things. I loved breaking things; anything I could take apart, I would take apart. I was the kid in the elevator who pushed every button, and my teachers told my parents I had ADD. Perhaps I do, but nonetheless… putting the parts back together was never really my thing, at least until I got old enough to comprehend how disturbing it was to those around me.

[Image: Reverse Engineering and Search – Exploding Watermelon Analogy]

 

In the search world, reverse engineering websites helps us break down the elements that fuel organic results. It gives us insight into what's causing results to shift around, who did what, and how. Every page that's indexed and ranking in Google has digital footprints leading all the way back to the origins, to the foundations of the project.

Over the past 8 years of optimizing websites to rank in Google's search engine, I've learned a lot of lessons and uncovered a lot of manipulative strategies; some of these strategies worked in the past, and in many cases they are still working today. Now, I'm not going to get into specifics about who is doing what or how. Instead, I'd like to explain the process of reverse engineering Google SERPs and identifying what you're up against in your market. Is your competition doing an amazing job, or are they heavily cheating? Can you even compete?

The first step – identify who is ranking for your most important search queries. If you're a family dentist in Miami, your targeted terms are likely (a quick sketch for generating variants like these follows the list):

  • Miami Family Dentist
  • Family Dentist Miami
  • Family dentist in Miami
  • Family dentist near Miami
  • etc.
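
Purely as an illustration (the service, location, and modifier templates below are placeholder assumptions, not a real keyword list), here's a tiny Python sketch for spinning up that kind of variant list:

```python
# Hypothetical example: build seed query variants for a local service.
# The service, location, and templates are placeholder assumptions.
service = "family dentist"
location = "Miami"

templates = [
    "{location} {service}",
    "{service} {location}",
    "{service} in {location}",
    "{service} near {location}",
]

queries = [t.format(service=service, location=location) for t in templates]
for q in queries:
    print(q)
```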

The second step – copy the URLs of the top ranking sites for these terms and paste them into a link research tool such as www.majestic.com. Majestic will give you a report of who links to each site you run through its search field. After you run a search, click on the tab labeled “Ref Domain”; this will help you quickly identify whether a site has:

  • Links coming from real sources (recent blog posts; built for real people; good data) – Good Links
  • Links coming from sites that are built for robots (crap content; no interaction; foreign; outdated; built for ranking purposes only) – Bad Links

Once you complete your free search on Majestic and hit the “Ref Domain” tab, the URLs will be organized with the most-linked sources on top, which should help you estimate what you're up against. Other good tools for identifying backlinks (digital footprints): Open Site Explorer and Ahrefs.
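If you'd rather sort through the export than eyeball it in the browser, here's a minimal Python sketch that summarizes a referring-domains CSV. The file name and column headers ("Domain", "ExtBackLinks", "TrustFlow") are assumptions about what your particular export contains, so adjust them to match your data:

```python
import csv

# Minimal sketch: summarize a referring-domains CSV export (e.g. from the
# "Ref Domain" view). The column names "Domain", "ExtBackLinks", and
# "TrustFlow" are assumptions; match them to your export's header row.
def summarize_ref_domains(path, trust_threshold=10):
    good, suspect = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = row["Domain"]
            links = int(row.get("ExtBackLinks", 0) or 0)
            trust = int(row.get("TrustFlow", 0) or 0)
            # Crude heuristic: low-trust domains deserve a manual look.
            (suspect if trust < trust_threshold else good).append((domain, links, trust))
    return good, suspect

good, suspect = summarize_ref_domains("competitor_ref_domains.csv")
print(f"{len(good)} likely-legitimate referring domains")
print(f"{len(suspect)} referring domains worth a manual review")
for domain, links, trust in sorted(suspect, key=lambda x: -x[1])[:20]:
    print(f"{domain}: {links} links, trust flow {trust}")
```

The threshold is only a starting point; the point is to get the raw list into a form you can scan quickly, not to automate the judgment call.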

From the data we extract, we can correlate useful information about what's ranking each page so favorably within Google's index.

What to look for?

The final steps – with the data extracted, we look for opportunities and calculate a realistic estimate of the work needed to get similar exposure, if it's even possible… In some cases your competition has been active in the space for a very long time and has a large list of good links (authority and trust) pointing to their domain. However, if you don't dig and look, you'll never know.

In some cases your biggest competitor is ranking with manipulative tactics, and if you don't inspect thoroughly, who will?

In other words, your competitors can get away with violating Google's Webmaster Guidelines and may have already been getting away with it for years.

What to do when you find manipulative tactics that violate Google’s Webmaster Guidelines?

Where there's smoke, there's fire! When you find one manipulative tactic, there are usually more… Run a WHOIS report on the manipulative sites that link to your competitor and see if the competitor owns them; in many cases you will find that these sites built for Google's robots are owned by your competitor or the marketing/hosting company they hired.
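As a rough sketch of how you might compare WHOIS records in bulk, the snippet below uses the python-whois package. The domains are hypothetical, WHOIS fields vary by registrar, and privacy protection hides many of them, so treat any overlap as a lead to verify by hand rather than proof of ownership:

```python
# Rough sketch using the python-whois package (pip install python-whois).
# All domains below are hypothetical. WHOIS fields vary by registrar and
# many records are privacy-protected, so treat matches as leads to verify
# by hand, not as proof of ownership.
import whois

FIELDS = ("org", "name", "emails", "registrar", "name_servers")

def whois_fingerprint(domain):
    record = whois.whois(domain)  # returns a dict-like WhoisEntry
    return {field: record.get(field) for field in FIELDS}

competitor = whois_fingerprint("competitor-example.com")
for linking_site in ("spammy-directory-example.net", "blog-network-example.org"):
    candidate = whois_fingerprint(linking_site)
    overlap = {f for f in FIELDS if candidate.get(f) and candidate[f] == competitor[f]}
    print(linking_site, "shared WHOIS fields:", overlap or "none")
```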

Run another Majestic report on the manipulative site built for robots and see if it's part of any network. If so, you have a very low chance of competing with them organically: a manipulative network can build 1,000 fake links for every one of your real links. Identifying these tactics will only strengthen your case if you choose to report them to Google and level the playing field.
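One way to spot a possible network, sketched below under the assumption that your list of linking domains came from a backlink export: resolve each domain and group them by the /24 subnet they sit on. Shared hosting alone isn't proof, but dozens of "unrelated" sites on one subnet is exactly the kind of footprint worth noting in a report:

```python
import socket
from collections import defaultdict

# Illustrative sketch: group linking domains by the /24 subnet their IPs
# resolve to. Many "unrelated" sites on one subnet (or one name server) is
# a common footprint of a link network; treat it as a signal to
# investigate, not proof on its own.
def subnet_of(domain):
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        return None
    return ".".join(ip.split(".")[:3]) + ".0/24"

def group_by_subnet(domains):
    groups = defaultdict(list)
    for domain in domains:
        key = subnet_of(domain)
        if key:
            groups[key].append(domain)
    return groups

# 'linking_domains' would come from your Majestic / Ahrefs export;
# these entries are placeholders.
linking_domains = ["spammy-directory-example.net", "blog-network-example.org"]
for subnet, members in group_by_subnet(linking_domains).items():
    if len(members) > 1:
        print(subnet, "->", members)
```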

https://www.google.com/webmasters/tools/spamreport

In my experience, Google may not respond to your request; in fact, they may not do anything about your findings. Whichever way it turns out, you're still better off reporting your cheating competitor than doing nothing about it.

This guide may help you catch yourself before you fall…

SEO Preliminary Measures

Get a better understanding of what challenges you will face and who you're up against before investing time and money, and before you discover you're up against a real competitor or a spammy cheater that will be very challenging to beat. In my experience cheaters are still out there, even after all the animal updates Google has released (Penguin, Panda, Pigeon, yada yada).

Have any interesting reverse engineering stories? I’d love to hear them in the comments below.

Dear Google,

  • Market Share of Search
  • Market Share of Mobile OS (IDC: Smartphone OS Market Share 2015, 2014, 2013, and 2012 chart)
  • Market Share of Browsers (US) – 43%

We don't really have a choice, do we?

As long as links are manipulated, many businesses will fail.

When a website has bad links and gets punished for it, people can get mad.

When a website has natural links and their competitors win with bad links, people can get mad.

Website Punishment is Not The Answer

It makes no sense to punish a website for links; there has to be a better way. Punish the webmaster, not the website.

You preach that webmasters shouldn't worry about links; however, if one can be punished by them, and one can inherit a site that already has them (knowingly or not), falling in the rankings can seriously make people worry. Links are a webmaster's responsibility, not a business's. The business should have no tie to a link profile; the webmaster should.

The way things are today, when an uneducated webmaster doesn't check a backlink profile, he risks wasting a lot of time and money.

Suggestion for Google Search

License webmasters; make webmasters take responsibility for their own links. How? Make every business that wants to rank in Google get a license from Google (current webmasters could be given a timeframe to comply).

A license from Google should require some solid form of identity: a social security number, a copy of a passport, a license number, an EIN, IDK, something. Once webmasters are accountable for their own links, they would refrain from building manipulative links and would think carefully before linking to someone. Unlicensed users' links shouldn't count. Google can help the search community without punishment. Isn't the goal to create a better user experience?

The top 5 ranking sites should receive a congratulations letter with a Google Webmaster License request:


 

“Congratulations, your website has made it into Google's top 5 for several awesome terms in your market.

[You rank for your brand, more terms!]

Keep up the great work!

Notice: In order to maintain your credibility, Google asks that you add your Google License number to your Webmaster Tools account. Don't have a Google license? Get one here.”


 

Encourage us to do better and reward us for good work; don't punish us.

In the real world, people incorporate their businesses; it helps keep consumers from being cheated and protects people. In this case, it would encourage a better user experience for all.

Transparency – we're transparent with our personal information. Can you imagine working on a project for an entire year only to find out the previous webmaster created a crappy link profile? It's a bummer.

Punish the culprit, not everyone else!

Perhaps my suggestions are not the right solutions, but I can tell you that punishing people isn’t right either.

 

Yours Truly.