Understanding The Primary Ranking Factors For SEO
November 28th, 2023
Posted: April 22nd, 2023
You might be wondering: how does a search engine 'read' my website? If you're a developer, this will be familiar territory, but for the sake of anyone who doesn't dabble in the dark arts of web technologies, I'll attempt to explain it in simple terms.
Essentially, a website is built from components that each perform a different task. Some of these components shape what you see on the page, but many others are never visible to visitors and are instead read by search engine bots, otherwise known as crawlers. These components fit together in a structure called the Document Object Model (DOM), and this is what crawlers read in order to understand your content and work out which topics it is relevant to.
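To make this concrete, here is a minimal sketch, using only Python's standard library, of how a crawler might read a page's DOM and pull out the parts search engines care about: the title, the meta description, headings, and links. The HTML snippet and class name are purely illustrative, and real crawlers are far more sophisticated than this.

from html.parser import HTMLParser

class SimpleCrawlerParser(HTMLParser):
    """Collects SEO-relevant parts of a page: title, description, headings, links."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headings = []
        self.links = []
        self._current_tag = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        self._current_tag = tag
        # The meta description is invisible to visitors but read by crawlers.
        if tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        # Outbound links tell the crawler where to go next.
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self._current_tag == "title":
            self.title = text
        elif self._current_tag in ("h1", "h2", "h3"):
            self.headings.append(text)

page = """
<html>
  <head>
    <title>Understanding SEO Ranking Factors</title>
    <meta name="description" content="A plain-English look at how search engines read your site.">
  </head>
  <body>
    <h1>How crawlers see your content</h1>
    <p>Visible text that visitors actually read.</p>
    <a href="https://example.com/blog">Related article</a>
  </body>
</html>
"""

parser = SimpleCrawlerParser()
parser.feed(page)
print("Title:", parser.title)
print("Description:", parser.meta_description)
print("Headings:", parser.headings)
print("Links:", parser.links)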
A search engine works by performing the following actions:
1. Crawling: bots follow links across the web and fetch the pages they discover.
2. Indexing: the content of each fetched page is parsed and stored in the search engine's index.
3. Ranking: when someone searches, the engine scores indexed pages for relevance and quality and orders the results accordingly.
A history lesson? Seriously? You'd be surprised how important this history is to understanding why SEO works the way it does. The full story is long and fascinating, and perhaps one day I'll write it up in greater detail, but for now I'll keep it brief in the interest of relevancy!
To summarize: early in Google's infancy, websites were indexed based on how they used keywords rather than on the complicated algorithm we know and love today. Understandably, this method gave rise to a problem: spam sites that simply posted an endless stream of keywords in order to rise to the top of search engine listings. It was a major problem that needed to be addressed, and Google responded by penalizing keyword stuffing (see below) and introducing backlinks (also below).
We could probably write an entire article on keyword stuffing, as it is a fascinating part of search engine history that helps explain current practice. Simply put, the keywords on your website drive traffic to it. Spam websites learned this early on and built pages that were nothing but strings of keywords in order to rank higher in searches. They did this not only with visible content, but with hidden paragraphs that only crawlers could see, deceitfully inflating their rankings.
Over time, Google made changes to detect when keywords were used too frequently in content. As a rough guideline, keywords should make up around 1-2% of your content and appear organically; anything beyond that is scrutinized or penalized depending on the context. Google also introduced a link ranking system, now commonly known as backlinking, to help weed disreputable websites out of listings and give weight to higher-quality content.
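Before moving on to backlinks, here is a small Python sketch of the keyword-density idea above. This is not an official Google formula; the 1-2% figure is simply the rough guideline mentioned in this article, and the sample copy is made up for illustration.

import re

def keyword_density(text, keyword):
    # Density = occurrences of the keyword divided by the total word count.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words) * 100

page_copy = (
    "Our bakery sells fresh bread daily. Visit the bakery for sourdough, "
    "rye and baguettes, all baked in our bakery ovens each morning."
)

print(f"{keyword_density(page_copy, 'bakery'):.1f}%")  # roughly 14%, far above the 1-2% guideline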
Backlinks were introduced as a way to establish a website's authority based on how many other sites linked to yours. Rolled out just before Christmas in 2003, the change destroyed a lot of small retailers in its wake, and it didn't prevent spam either, as spam sites learned that they could simply link to each other to boost their apparent relevancy. This gave rise to modern backlinking, which places greater weight on links from trusted websites.
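As a purely illustrative sketch of that weighting idea (not Google's actual algorithm; the domain names and trust values below are hypothetical), a couple of links from reputable sites can be worth more than dozens of links from a spam directory:

def weighted_backlink_score(inbound_links, trust_scores):
    # Each inbound link contributes the linking domain's assumed trust value,
    # rather than counting as a flat +1, so reputable sources matter more.
    return sum(trust_scores.get(domain, 0.01) for domain in inbound_links)

# Hypothetical trust values for a few linking domains.
trust_scores = {
    "news-site.example": 0.9,
    "industry-blog.example": 0.6,
    "spam-directory.example": 0.01,
}

quality_page = ["news-site.example", "industry-blog.example"]  # two reputable links
spammy_page = ["spam-directory.example"] * 40                  # forty spam links

print(f"{weighted_backlink_score(quality_page, trust_scores):.2f}")  # 1.50
print(f"{weighted_backlink_score(spammy_page, trust_scores):.2f}")   # 0.40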
There are a number of ways to improve your backlinks, but one of the best lies in content creation. Fortunately, our article, 8 Benefits of Blogging for Your Small Business, goes into greater detail on how blogging and content creation can raise this ranking.