Some websites sit at the top of the organic search results even without active SEO. But just because a page has not been deliberately optimized according to SEO criteria does not mean that no unconscious optimization has taken place. What does that mean? Is SEO perhaps not that important after all? To answer this, we first need to be clear about what Google pays attention to when indexing websites and assigning rankings.
Which factors are decisive in the Google ranking?
By its own account, Google takes more than 200 different ranking factors into account. Each is weighted differently, and they can also be influenced by indirect criteria such as usability. In other words: if a website lands in first place in the Google search results for a certain search term, it has scored better across all ranking factors than the other pages competing for that term.
Above all, Google wants to place websites that match the search intent as closely as possible. This can also apply to websites that have not been actively optimized for this – for example, if the search engine registers positive user signals.
So does deliberate SEO optimization still have advantages? An example:
Imagine a website that contains nothing but a travel cost calculator for cars: enter the distance, fuel consumption and fuel price, and it calculates the travel costs for a given route. There are several ways such a page could be structured, both technically and in terms of content. Below, we look at three simplified possibilities:
Example 1: the missing topic reference
An otherwise blank page with no text or headings – only the input fields and the result field for the travel costs. In addition, the site is named after its operator/owner (e.g. mustermann.de).
-> The user’s search intent is covered here – the travel costs can be calculated quickly and easily – but Google gets no indication of which topic the page is relevant to or which search term it should be assigned to. In all likelihood, the page will therefore not achieve a relevant placement in the search results.
Example 2: the correct labeling
This page is structured as in example 1 – with the difference that it has a descriptive text and headings. It also has a topic-related name (e.g. fahrtkosten-berechnung.de).
-> Here, Google can assess much better how relevant the page is to the corresponding search term. As a result, it is entirely possible that the page will achieve a good ranking – provided it is used frequently.
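For illustration, a minimal sketch of what the HTML skeleton of such a page might look like – the title, description, wording and field names here are purely hypothetical:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- Title and meta description signal the page's topic to Google -->
  <title>Travel Cost Calculator – Calculate Fuel Costs per Route</title>
  <meta name="description" content="Calculate the travel costs for a route from distance, fuel consumption and fuel price.">
</head>
<body>
  <!-- A topic-related heading makes the relevance explicit -->
  <h1>Travel Cost Calculator</h1>
  <p>Enter distance, fuel consumption and fuel price to calculate the costs of your trip.</p>
  <!-- The calculator itself: input fields plus a result field -->
  <form>
    <label>Distance (km) <input type="number" name="distance"></label>
    <label>Consumption (l/100 km) <input type="number" name="consumption" step="0.1"></label>
    <label>Fuel price (per litre) <input type="number" name="price" step="0.01"></label>
    <output name="result"></output>
  </form>
</body>
</html>
```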
Example 3: user-oriented optimization
Building on example 2, this version contains – in addition to the travel cost calculator – other functions that visitors appreciate: for example, the option to enter a specific route, select a fuel type, or factor in the number of passengers, parking fees, etc. In addition, the page is attractively designed and loads extremely quickly.
-> Beyond search intent and topic relevance, further (albeit by far not all) ranking factors have been considered here. Functionality, added value and design lead to longer visits and more interaction, signalling to Google that the website is genuinely popular with users. In the best case, visitors stay longer or even come back – which in turn significantly increases the probability of a good ranking.
SEO & technical stumbling blocks
So we have seen that the content of a website – whether consciously or unconsciously optimized – can have a major influence on the ranking. The technical SEO criteria, on the other hand, are rarely optimized by accident. Technical optimization instructs Google’s crawler how to index a website, including its subpages.
Please note: even the smallest technical stumbling blocks can lead to an entire website (or important subpages) being completely de-indexed. The following problems are particularly common:
1. Incorrect meta tags
Meta tags are located in the HTML code and are invisible to users. In addition to the meta tags for title and description, they include robots directives such as “nofollow” or “noindex”. Set carelessly, a nofollow directive can in exceptional cases prevent the Google crawler from doing its job properly – with the result that certain subpages cannot be found. A noindex directive can even be responsible for pages being removed from the index entirely. Depending on the content management system and plugins used, both can be set automatically. Special care is therefore required here!
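For illustration, this is roughly what the relevant tags look like in a page’s `<head>` – a single stray robots directive like the one below is enough to keep the page out of the index:

```html
<head>
  <!-- Harmless: title and description, shown in the search results -->
  <title>Travel Cost Calculator</title>
  <meta name="description" content="Calculate travel costs from distance, consumption and fuel price.">

  <!-- Dangerous if set unintentionally: tells crawlers not to index this page
       and not to follow any of its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```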
2. Incorrect canonical attributes
Canonical attributes are also often set automatically. This can have advantages – but it can also lead to pages whose content appears on several subpages being dropped from the Google index.
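A canonical attribute is set as a `link` element in the page’s `<head>`. A sketch with hypothetical URLs – if an auto-generated canonical points at the wrong address, Google treats the page itself as the duplicate and drops it in favor of the canonical target:

```html
<!-- On https://example.com/calculator?ref=newsletter -->

<!-- Correct: points at the preferred version of the same content -->
<link rel="canonical" href="https://example.com/calculator">

<!-- Risky: an automatically generated canonical pointing at a different page
     can cause this page to disappear from the index -->
<link rel="canonical" href="https://example.com/">
```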
3. Faulty robots.txt
With a few simple directives, the robots.txt file can block entire subdirectories of a website from being crawled – or shut certain crawlers out completely. Here, too, incorrect entries can lead to unwanted de-indexing.
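A few lines are enough – which is exactly why mistakes are so costly. A sketch (the directory name is purely illustrative):

```
# Allow all crawlers, but keep them out of /internal/
User-agent: *
Disallow: /internal/

# Dangerous: a single slash would block the ENTIRE website for all crawlers
# User-agent: *
# Disallow: /
```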
4. Doorway pages
Doorway pages are also a common reason for Google penalties. Anyone who uses this type of page – often without realizing that Google classifies them as spam in many cases – runs the risk of the respective subpages, or even the entire website, being temporarily removed from the Google index. Such a penalty can only be lifted with appropriate countermeasures.
5. HTTPS encryption
If a website does not use HTTPS encryption, it is flagged as “not secure” in the user’s browser. Google, too, takes a dim view of insecure content. While such pages are not necessarily de-indexed, the missing encryption has a negative effect on the ranking.
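If the site runs on Apache, for example, a site-wide redirect from HTTP to HTTPS can be set up in the `.htaccess` file – a common sketch, assuming `mod_rewrite` is enabled and a valid certificate is installed:

```apache
# Redirect all HTTP requests permanently (301) to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```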
6. Duplicate content
For reasons of user guidance, it can make sense to reuse larger content elements on several subpages. However, Google does not like this at all: duplicate content was often used in the past to artificially “inflate” the content of a website, which is why it is penalized. To counteract this, the duplicated content must either be reduced or marked with a canonical tag (as shown above) pointing at the original.
7. Poor loading speed
The importance of page loading speed is often underestimated. It influences the ranking both directly and indirectly: the longer a page takes to load, the higher the bounce rate – which in turn hurts visibility on Google.
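Many of the usual levers are server- and build-related (caching, compression, image optimization), but some are plain HTML. Two common examples – the file names here are hypothetical:

```html
<!-- Defer off-screen images until the user scrolls near them -->
<img src="chart.png" alt="Fuel cost chart" loading="lazy" width="640" height="400">

<!-- Load the script without blocking HTML parsing -->
<script src="calculator.js" defer></script>
```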
Conclusion: do I need targeted SEO optimization or not?
To come back to the car example: you could get into your car, simply drive off and possibly arrive at your destination. However, you did not check the fuel level beforehand, adjust the rear-view mirrors, set up the navigation system or choose the fastest route. In the worst case, you crash and the car is totaled.
It’s the same with SEO. Those who do not invest in sustainable, targeted search engine optimization may get (very) lucky and succeed anyway – but they risk wasting potential and neglecting their own visibility. And given the technical stumbling blocks described above, penalties or the complete de-indexing of a website are anything but unlikely.
In plain language: if you want to rank successfully in the long term and on a solid footing, in most cases you cannot avoid targeted search engine optimization. Only then can you ensure that existing rankings are retained and continue to improve on the basis of positive user signals.