The SEO dictionary will be expanded in the near future with new terms, but also with links to knowledge pages.
On these knowledge pages we take a real deep dive into a specific topic, so that you can gain extensive knowledge about specific SEO topics.
The Alt text is the alternative description of an image and is important for SEO. It is metadata of the image.
The alt text is originally designed for blind and partially sighted people.
They use a screen reader to have a web page read aloud.
The screen reader uses the alt text of an image to read what an image is about.
For many years, Google’s bots have also used the alt text to determine what an image is about.
It is therefore very important to fill this meta text with descriptive keywords.
Make sure you avoid over-optimization, though. In addition, the alt text is also used when the browser is unable to load the image.
For example, because the location of the image has been moved, or because the image is no longer on the web server.
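As an illustration, a descriptive alt text could look like this in HTML (the file name and wording are invented examples):

```html
<!-- The alt attribute holds the alternative description of the image -->
<img src="red-running-shoes.jpg"
     alt="Pair of red running shoes on a wooden floor">
```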
The anchor text of a hyperlink is the text that the visitor sees on the front end of your website. For example, the word “over-optimization” in the paragraph below is the anchor text of a link on this page. It is important that the anchor texts of links to your web pages are natural.
When over-optimizing links, you see that most links have exactly the same anchor text. Google can easily review anchor texts on the web and punish over-optimized usage where necessary.
For internal links this is less important than for external links. Google understands that you may want to be consistent in linking on your own website. But the advice is to vary the anchor texts every now and then.
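In HTML, the anchor text is simply the visible text between the opening and closing tags of a link; the URL below is an invented example:

```html
<!-- "over-optimization" is the anchor text the visitor sees -->
<a href="https://example.com/seo-dictionary/over-optimization/">over-optimization</a>
```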
Black hat SEO
Black hat SEO refers to all SEO techniques that Google (and other search engines) disapprove of. These are often so-called tricks. At Pro SEO Expert we strongly advise you not to apply these techniques. They may (just maybe) work in the short term, but in the long term they will hurt your web traffic. Examples of black hat techniques are cloaking, keyword stuffing and the use of hidden text and/or links. You can read more about this on Google’s quality guidelines page.
(Search Engine) Bot
A search engine bot, also called a spider, searches the internet for new and/or changed content. This is called crawling. The goal is to index (map) the internet. The search engine bot follows billions of links to find new or changed content. Once the bot has found this content, parts of the code are saved. These parts are used in the search results. Some of this information is shown to the searcher in the SERP.
Think of the meta title of the page, the URL and a description of the page: the meta description. More information about how Google works can be found on the Google explanation page.
The canonical tag, or rel="canonical", is used to tell search engines that one page is a copy (or almost a copy) of another. The canonical tag is important for solving duplicate content problems. These problems are often caused by your CMS or website platform. With the canonical tag you indicate that a page (or several pages) is a copy of an original. This tells Google which page to index, namely (only) the original. You also apply a canonical tag when a page is too similar to another. The choice of which is the original and which is the copy is yours.
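As a sketch, the canonical tag is placed in the head of the copy page and points to the original (the URL is an invented example):

```html
<!-- On the duplicate page: tell search engines which URL is the original -->
<link rel="canonical" href="https://example.com/original-page/">
```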
CTR – Click Through Rate
CTR stands for Click Through Rate. The CTR is calculated by dividing the number of clicks by the number of impressions. When you link Google Analytics with Google Search Console, Google brings two forms of CTR into the picture:
- The CTR of keywords (referred to by Google as searches).
- Landing page CTR.
The CTR says something about the popularity and/or attractiveness of your content and search result (mainly formed by your meta title and description).
But perhaps more importantly, the CTR is an important signal for Google. The higher the CTR, the higher the chance that your web page will meet the search intent.
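The calculation itself is simple; here is a minimal sketch (the numbers are invented examples):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: clicks divided by impressions."""
    if impressions == 0:
        return 0.0  # no impressions, so no measurable CTR
    return 100.0 * clicks / impressions

# A search result shown 1,000 times and clicked 40 times:
print(click_through_rate(40, 1000))  # 4.0 (percent)
```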
CSS – Cascading Style Sheets
CSS stands for Cascading Style Sheets. This language is used for the design (styling) of web pages. CSS has no direct relationship with SEO , but it is ultimately an influencing factor. One important determining factor of search engine optimization (SEO) is the user experience, or the user-friendliness of websites. The styling of a website influences the user experience and is therefore also an influencing factor of search engine optimization.
Search engines, including Google, use so-called bots to map (index) the internet. This activity of the search engine bots is called crawling. The bots do this quite simply by following hyperlinks. The links on the internet can be seen as highways: they connect the more than 1.5 billion websites that exist on the internet. When crawling, the bots follow the links and store the most important information from the web pages they find in a database. You will see some of this information when you search in Google. Think of the meta title of the page and a description: the meta description. More information about how Google works can be found on the Google explanation page.
Crawlability refers to the extent to which Google is able to index your website. You want to block some pages from Google; that’s normal. Think of the back end of your website (the administrator pages). You do that with a robots.txt file.
Google reads this file and respects it when you indicate that certain pages may not be indexed. But for most pages of your website, you want Google to index them. However, crawl errors can make the Google bot unable to index your website.
That is of course a shame! So always make sure that your website is as crawlable as possible. One way to discover and fix crawl errors is with Google Search Console .
The crawl budget refers to the time Google’s bots or web spiders spend crawling your entire website. The more time they spend crawling your site, the better. This makes it more likely that all your content will be indexed in the search engine … something essential for web positioning.
The best thing you can do to increase the crawl budget is to work on link building to increase the authority of the site. It is also very important that the content is not static. You avoid this by scheduling an editorial calendar to update texts frequently and add new content.
Can you guess? Yes, Trends is another free tool from Google. This one, in particular, is used to discover and compare search trends. You just have to enter a keyword and Trends will show you its popularity over time. You can even compare up to 5 searches to find out which one is most popular.
If you’re not looking for a specific word, you can use Google Trends to discover the latest trends or most popular news, by country. It also shows the most popular searches of previous years. As you can see, this tool is a gold mine for ideas to generate content on your website.
Gray Hat SEO
Gray hat SEO refers to SEO strategies that are not as aggressive as black hat SEO, but that also do not take Google’s guidelines very seriously; those guidelines essentially prohibit any unnatural, forced way of getting links. But, of course … if we rely only on creating good content and having a well-optimized website, it is very difficult to beat other sites, especially in a sector with a lot of competition.
Gray hat SEO is the most common profile among industry professionals, who attach great importance to on-page optimization but also devote a lot of effort to getting “natural” links.
Google Pigeon is the algorithm focused on optimizing local search results. Just like the other algorithms, this one is in charge of showing the best Google My Business (GMB) results based on your location and your search.
For Pigeon to give your local business a helping hand in the rankings, you must take care of the SEO of your website, but you must also register it in GMB and complete your company profile with all the data you can: address, hours, description, images, etc. The more complete, the better.
Guest posting is the art of getting posts published as a guest on other blogs, usually related to the theme of your site.
They are usually articles of value to the users of the blog where you write and, in exchange, the owner or webmaster allows you to place one or more links to your website in the text of the article.
This gets you a link to your website, which will improve your positioning. But it also serves to make yourself known to a wider audience and improve your personal brand or that of your company.
Duplicate content means that there are multiple web pages that are identical or nearly identical to each other.
This can be duplicate web pages on your own website, but it can also be duplicate content across multiple websites.
In particular, solving duplicate content problems on your own website is an important SEO discipline.
Because the search engines need to know which pages to index.
They obviously don’t want to index multiple pages offering the same thing.
With a canonical tag you can solve any problems and tell Google which is the original and which is the copy.
Google will penalize a website that has a lot of duplicate content problems. Therefore, make sure you solve these problems in time.
The authority of a domain is determined by search engines through links from other websites to your website.
The number of links is important, but certainly not everything. Google also takes the quality of links and the relevance of links into account.
Google assesses the quality of the websites that link to you. Where relevant, Google assesses to what extent the website that links to you has something to do with your work or field.
In this respect, a link from a travel company’s website is not very valuable to an IT company. A link from HP or Dell could be.
The MozBar is a good way to easily visualize the domain authority (DA) and page authority (PA) of your website, and also of your competition.
Since you already know what a domain is, now it’s time to discover the potential of an expired domain.
After buying a domain, you have everything ready in a matter of hours to show your website to the world. You design the site, generate content and start working on SEO. The web is gaining authority and is positioned well in Google (watch out! After a lot of work, effort and investment).
But when you buy a domain, you are actually paying “rent” for it. So what happens if it’s time to renew and you don’t? It becomes an expired domain and you lose the right to continue using it. At that point, SEOs are on the lookout for domains like these that still have authority. They register them and use them to redirect to their own website, or to set up a new site and take advantage of all the SEO work and the authority of the domain. And you can get a lot out of it.
The Domain Rating (DR) is a metric invented and patented by the Ahrefs company that is used to estimate the authority or strength of a domain. It is measured from 0 to 100 and the higher the DR, the more difficult it will be to increase, because it is a logarithmic scale.
Maybe it reminds you of Moz’s DA, because they look alike and pursue the same goal. But the truth is that Ahrefs has greatly improved its formula for calculating DR. It is more difficult to falsify and a little closer to reality. Still, I recommend that you not be guided only by this metric when analyzing a domain. Always keep the rest of the factors and metrics in mind.
Dwell time is the time between the moment the user clicks through from the search engine to a website and the moment they return to the search engine.
Dwell time is therefore the time that a user has spent on a website before returning to Google.
Many SEO experts believe that dwell time is a ranking factor and that Google measures dwell time to evaluate user experience on websites.
Exact match domain
An exact match domain is a domain name that exactly matches a keyword.
The translated term is hardly used in the SEO world, which is why this dictionary uses the English variant.
For a long time, these types of domain names have had an advantage in search engine optimization (SEO).
This advantage was very large 10 years ago, but today this advantage is getting smaller. Google has also actively communicated this.
In 2012, Google even implemented an EMD update in its algorithms, which penalizes exact match domains with low-quality content with a lower ranking.
Nowadays the advantage of having a brand name that you can build on and that people remember is greater than having an exact match domain. That is why the SEO advice is not to focus on this.
An exact keyword means that the keyword you are targeting has also been entered exactly into Google to reach your website.
This means that no extra words were added (before or after) and that the order was exactly the same.
The term exact keyword (or exact match keyword in English) actually comes from the SEA world. In Google Ads you have the option to set a keyword to exact.
This means that your ad will only be shown if the keyword is entered exactly like this. Or, nowadays, unfortunately(?), also with a very similar variant of it.
Google Analytics is a web analysis program from Google that provides insight into the traffic on your website.
Google Analytics is a must-have for all marketers and / or entrepreneurs who take online marketing seriously.
The key to improving online marketing (and therefore SEO) is measurement. The saying “to measure is to know” exists for a reason.
Google Analytics offers you so many insights that any overview falls short. But here are a number of important insights that you can gain with GA:
- The number of people visiting your website.
- When people visit your website.
- Important characteristics of people who visit your website.
- The channels people use to land on your website.
- The specific pages viewed (and how often they are viewed).
- The rating of your web pages.
Create a free Google Analytics profile on the official Google Analytics website.
A .htaccess file controls what happens to incoming traffic on a website.
The file has many functions, but for SEO the redirect is the most important. With a redirect, traffic is diverted from one page to another. Some relevant examples are:
- Redirecting visitors who try to reach your site via an insecure connection (http) to a secure one (https).
- Redirecting visitors who reach the wrong version of your website to the right version. For example from www.uwbedrijf.nl to uwbedrijf.nl (or vice versa).
- Last but not least: forwarding visitors who want to reach a page that no longer exists or has been moved. You will of course redirect these visitors to the correct page. You do the latter with a permanent redirect. This is called a 301 redirect.
Automatically rerouting visitors to pages that no longer exist or have been moved is an important SEO activity, but also a general best practice on the web for providing a good user experience (and that too is important for SEO).
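As a hedged sketch, an .htaccess file covering the three cases above could look like this (this assumes an Apache server with mod_rewrite enabled; the domains and paths are invented examples):

```apache
RewriteEngine On

# 1. Redirect insecure http traffic to https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# 2. Redirect the www version to the non-www version
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1%{REQUEST_URI} [L,R=301]

# 3. Permanently redirect a moved page (301 redirect)
Redirect 301 /old-page/ https://example.com/new-page/
```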
HTML stands for HyperText Markup Language. HTML is a so-called markup language and provides structure to web pages. This structure is very important for search engines such as Google. It helps the machines determine what the page is about and how it is structured. For SEO it is crucial that a web page has the correct HTML layout. For example, with HTML you indicate what a title is, or what a subtitle or paragraph is. This way, you help Google determine what your page is about and how valuable it is. HTML is not a programming language, as many people think. It is much simpler and easier to learn than a programming language.
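For example, a minimal sketch of the structure described above, with titles, subtitles and paragraphs marked by HTML tags:

```html
<article>
  <h1>Main title: what the page is about</h1>
  <p>An introductory paragraph about the topic.</p>

  <h2>A subtitle for a subtopic</h2>
  <p>A paragraph supporting that subtopic.</p>
</article>
```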
With the Google Hummingbird update, Google provides the searcher with the most relevant results. To do this, Google tries to figure out the search intent by putting the search query in a certain context. This is a complicated concept, so we’ll explain it with an example:
- The SERP is adapted to your location. So if you search for a restaurant in Amsterdam with your mobile, you will see local restaurants. Google estimates that your search intent is of a local nature.
When a search engine indexes a web page, important information about this web page is stored in the database. This data can then be displayed in the search engine.
When indexing, information is stored about keywords, so that the search engine knows which pages to display for which query. But it also stores important metadata of the page itself, such as the title, URL and description of the page.
The process required to index a website is called crawling. Crawling is performed by search engine bots. More information about how Google works can be found on the Google explanation page.
Another word that does not actually have a common Dutch variant.
Google defines Keyword stuffing as the overuse of keywords on a page.
You have probably come across a web page yourself that clearly uses keyword stuffing. Sometimes it is so bad that the page does not even read well anymore.
One of the many reasons for Google to punish it. The correct use of keywords is therefore not easy.
It is important that keywords appear correctly on the page, but don’t overdo it! Some tools, such as Yoast SEO, use a certain percentage, but more and more SEO specialists are largely abandoning this.
The reason is that the Google algorithms are getting smarter and even apply AI to analyze pages.
The long tail means that the vast majority of searches consist of long keywords. This means at least two words, but usually even three or more.
An individual long-tail keyword is searched much less often, but there are so many of them that together they account for 70% of total search traffic.
Another important characteristic is that they are more focused with a stronger intention .
Combine this with lower competition and you know why it is important not to ignore the long tail in search engine optimization!
Local SEO is a discipline of SEO that focuses exclusively on geolocated results, shown both in the SERPs and on Google Maps. It is mainly aimed at getting businesses with physical premises to appear in these results.
Local SEO goes hand in hand with the on-page SEO of the website, but adds the optimization of company data such as address, phone number and hours. In addition, the Google My Business profile is optimized with extra details: photos, category, hours, customer reviews, etc. The advantage is that local SEO results tend to appear higher in the SERPs.
Getting natural links from authoritative domains on the same theme is a challenge. Many companies, agencies and SEOs spend hours and hours looking for places to put or buy a link. But one technique that helps a lot is link baiting.
Link baiting is the technique of earning natural links, rather than buying or ordering them. These links are obtained when you manage to create valuable content for the user. Sure, it takes a lot of planning and strategy to produce content that is useful, interesting and easily shared. That way, it will reach the ears (and eyes) of many people who will link to you from their blog or website.
How could we talk about links and not explain what link building is? For many, it is the activity they spend the most time on within SEO. It is basically about “building” or getting external links that point to your website. The “how” is already such a broad topic that there are blogs and courses dedicated solely to link building.
Within link building there are many techniques, each more ingenious than the last: from leaving a comment on a blog to exploiting web security breaches on some sites. It is said that to do good link building you have to invest a lot of time or money, or both.
We speak of link juice when the authority of a site is distributed through its links. Typically, a site’s home page has the most authority.
If you link to ten internal pages, the authority is transmitted to these as if you spilled juice in equal parts.
Link juice is taken into account, together with the web architecture and PageRank, to give more authority to the most important pages of a site. In this way, a good foundation is laid for positioning well in Google.
Microformats are simple portions of HTML code that add meaning to the content of your website, making it understandable to web spiders as well.
When you add these microformats to your website, you allow Google to read them and display your page in the search results with additional information, such as the author, a photo or video, a physical address, the ingredients of a recipe, etc.
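As a sketch, this is what schema.org microdata for a recipe page could look like (one common way of marking up content; all names and values are invented examples):

```html
<div itemscope itemtype="https://schema.org/Recipe">
  <h1 itemprop="name">Apple Pie</h1>
  <span itemprop="author">Jane Doe</span>
  <img itemprop="image" src="apple-pie.jpg" alt="Freshly baked apple pie">
  <ul>
    <li itemprop="recipeIngredient">3 apples</li>
    <li itemprop="recipeIngredient">200 g flour</li>
  </ul>
</div>
```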
The MozBar is an online tool created by Moz to display its metrics (DA and PA) in the browser. You can see Moz’s proprietary SEO metrics next to each of the search results. But you can also have them visible at the top of the browser when visiting a specific page. You can even easily detect nofollow and dofollow links (the normal ones).
You can use the MozBar for free and without registering. If you use it, remember to always look at other indicators as well, such as the visibility index or the TF.
In the context of SEO, Metadata is data about a web page or part of it. However, metadata is also widely used outside the SEO world and it is sometimes referred to as data about data.
You will (almost) never see metadata on the front end of a website. But this data is in the HTML code, where Google’s bots can read it. The most important metadata for search engine optimization are:
- meta title of a web page
- meta description of a web page
- Alt text of images
The meta description is a brief summary of what the web page is about. But you can also see the meta description as a business card of your web page.
Because Google often shows the meta description prominently in the SERP.
Therefore, make sure that it is short but powerful and encourages the searcher to click through to your website, because the CTR is also an important signal for Google.
The above paragraph contains the word ‘often’ because Google does not always show your meta description in the SERP. Google will only do this if your description is relevant enough for the search query being used.
Sometimes Google finds that your page is relevant to the search query, but your meta description is not. In such a case, Google itself makes a description of (sentence) parts of your web page.
However, this is no reason not to use the meta description. The advice is to keep the meta description below 300 characters.
The meta title indicates the title of a web page. An alternative name for the meta title is the title tag. This last name refers to the HTML tag containing the meta title.
The meta title is almost invisible on the front of your website. The only place where a visitor can see this title is in the browser tab.
The meta title is one of the most important SEO elements of a page.
Therefore, a few pieces of advice (more or less in order of importance):
- The meta title may be invisible on your website, but this is certainly not the case with Google. Here your meta title is prominently present in the SERP . Therefore, make sure that the title is attractive so that people click through.
- Include the main keyword here.
- Where possible, also include a second keyword (provided this can be done in a natural way).
- Always end with the name of your website, with a separator in front of it.
- Make sure your meta title is no longer than 60 characters.
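Putting the advice for the meta title and the meta description together, the head of a page could be sketched like this (the brand name, separator and texts are invented examples):

```html
<head>
  <!-- Meta title: main keyword first, brand name last, under 60 characters -->
  <title>SEO Dictionary: 50+ Terms Explained | Pro SEO Expert</title>
  <!-- Meta description: short but powerful, inviting a click -->
  <meta name="description" content="From alt text to white hat SEO: every important SEO term explained in plain language.">
</head>
```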
Natural search result or traffic
Google displays search results on the search results page ( SERP ). This results page consists of paid results and natural results.
For the paid results, as the name says, you pay.
This is not the case for the natural results.
SEO focuses on the effort required to rank as high as possible in Google’s natural results with the right keywords.
When we talk about natural traffic, we are talking about the visitors who end up on your website via the natural search results.
On-page SEO means that you apply search engine optimization on a single web page.
With on-page SEO you focus on optimizing content. Some important factors for on-page SEO are:
- The content itself: demand-driven, good quality, unique
- The meta title
- The URL
- The meta description
- Alt text of images
- Links (incoming and outgoing)
Organic search result
An organic search result is synonymous with a natural search result.
We talk about over-optimization when a website or web page makes too much use of SEO optimization.
The most common form is the overuse of keywords on a page, or keyword stuffing .
This ensures that the text is no longer legible and is therefore at the expense of the user experience.
Google penalizes this with worse rankings , or in an extreme case with a penalty.
PageRank is Google’s method of determining how much authority a web page has. The PageRank of a specific page is determined by links from other web pages to that page.
In the early days of Google, only the number of links was important.
As the Google algorithm evolved, the quality of links and the relevance of links were added to this, among other things.
Until 2013, Google provided insight into the PageRank of web pages.
Now this is no longer the case. They also stopped using the word PageRank, which is why many people say that PageRank no longer exists.
However, having quality and relevant links to your website is still one of the most important factors for Google to determine your rankings.
Google sees every good link as a vote for your website. The more good votes, the better.
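To make the idea of links as votes concrete, here is a toy sketch of the original PageRank iteration (not Google’s current algorithm; the three-page link graph is invented for illustration):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal authority
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # each outgoing link passes on an equal share: a "vote"
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# Invented example: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
# C collects votes from both A and B, so it ends up with the highest rank.
```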
The Google Panda Update is a Google Algorithm Update that addresses the quality of website content.
Websites with high-quality content are rewarded and websites with low-quality content are penalized. The first Google Panda update came out in February 2011 and had far-reaching consequences for millions of websites. Some quality issues the Panda update targets are:
- Thin content, for example pages with only a few sentences.
- Low quality content, for example pages with automatically generated or translated texts.
- Duplicate content, for example websites with many pages that are very similar.
The Google Penguin update is an algorithm update from Google that addresses the quality of links to a website. The first Google Penguin update took place in April 2012. Where previously only the number of links was considered, the Google algorithms (partly thanks to Penguin) are now much more complex. Google now assesses, among other things:
- Link quality: does the website linking to you have a good reputation? Does this website contain good content?
- The relevance of links: does the website that links to you have anything to do with your field or industry?
- Has a link been paid for? (this is not allowed by Google)
- Is the link profile natural? Or, for example, is a link scheme active in which large numbers of websites link to each other?
- Are the anchor texts natural?
PHP stands for Hypertext Preprocessor. PHP is a scripting/programming language for web pages. Most websites use PHP to create (dynamic) web pages. The PHP code of your website is difficult for marketers or entrepreneurs to influence.
Nevertheless, it is worthwhile to have your website checked for possible PHP errors.
Because Google likes professional websites that contain as few errors as possible.
One tool you can use to determine if your website has PHP errors is Google Search Console .
Pogo-sticking means that someone clicks through to your webpage from a search engine and almost immediately returns to Google to choose a different search result.
Many SEO experts believe that Pogo sticking is a ranking factor and that Google uses it to determine if a search result meets search intent.
When a lot of people pogo-stick, the ranking in the search engine drops.
Rank
A rank is a certain position in the search results page ( SERP ) of Google after typing in a keyword.
A rank from #1 to #10 in many cases means that you are on page 1 of Google.
It is important to keep track of your rank on keywords that are important to your business. To set out a keyword strategy , you need to know where you are now.
Only then will it be possible to determine what is needed to achieve your goals.
With a redirect, your traffic on your webpage will automatically flow to another webpage.
You do this, for example, when you have deleted a web page. Then you redirect to the most suitable page of your website.
Or when you have moved a web page (then you redirect to the new location of the page). In almost all cases you use a 301 redirect.
This means that the redirect is permanent. There is also a temporary redirect (302), but the general SEO advice is not to use this redirect.
With a robots.txt file you determine which pages cannot be crawled and indexed by search engine bots.
In the file you can grant access (allow) or deny access to (disallow) search engine bots per website, per website folder, or per web page.
A good example of a set of pages that you do not want to be indexed by search engines are your administrator pages of a CMS such as WordPress. In this case, WordPress will of course arrange for you that the necessary disallows are in the robots.txt.
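As a sketch, a robots.txt for a WordPress site could look like this (the paths and the sitemap URL are common examples, not a universal recipe):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```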
SEA stands for Search Engine Advertising and actually doesn’t have much to do with SEO . Since both terms are often confused with each other, we have included it in this dictionary.
SEA can also be a good testing ground for SEO. If, for example, you achieve a good conversion rate with a certain keyword in a SEA campaign, then you will probably also achieve that when your web page ranks well in the organic search results.
Google’s SEA program is called Google Ads (formerly AdWords). Google Ads not only allows you to advertise in the Google search engine, but also outside of it, in the Google Display Network. More information on the official Google Ads website.
Do you need help setting up and/or managing your SEA campaigns? SEA is also one of Pro SEO Expert’s areas of expertise. We are happy to provide you with advice, and that does not always have to cost you anything!
Google Search Console is a website tool from Google. Google Search Console used to be called Google Webmaster Tools.
The goal of Google Search Console is to give you tools to improve your website. Usually on technical points such as crawlability , indexing status , blocked sources, etc.
But Google Search Console also provides important information about keywords and the findability of your website.
You can link Search Console with Google Analytics so that this data can also be analyzed in your Analytics environment.
Create a free Google Search Console profile .
SEO stands for Search Engine Optimization. SEO is the discipline where you increase the amount of relevant visitors to your website via search engines.
With “search engines” you can almost just say Google. The word ‘relevant’ is mentioned in the above paragraph because, when doing SEO, you should always keep in mind that the traffic you attract to your website must be relevant traffic.
This traffic must fit within your communication target group.
An SEO analysis can refer to various analyses that you make with the help of SEO research. Examples of an SEO analysis are:
- Competition link analysis
- Keyword analysis
An SEO tool is a software program that helps you in search engine optimization. There are various SEO tools on the market.
Some focus on a specific discipline within SEO, such as mapping rankings.
Such a program is also called a rank tracker or rank checker.
SEO research is research that you conduct with the aim of increasing relevant search traffic to your website. A good example is keyword research.
With good keyword research you can do various keyword analyses that identify opportunities for search engine optimization.
In an SEO strategy you indicate where you are now, where you want to go and what it takes to achieve your goals.
When writing an SEO strategy, you can create structure by focusing on improving the four pillars of search engine optimization, namely:
- User experience
SERP stands for Search Engine Results Page. The SERP is the Google search results page that you see after you have entered a search query in Google.
For a long time, the SERP was reasonably predictable in terms of layout. At the top you often found three sponsored results (SEA ads).
Then came ten natural search results. On the right side of the page (on a PC) you also found a number of advertisements.
However, nowadays Google regularly enriches the SERP with images, Knowledge Graph items, featured snippets, local search results (local pack), reviews, etc.
This makes it more difficult to determine whether a natural search result will be on the first page or not.
A spider is synonymous with a (search engine) bot.
The term spam is used in different areas, such as email. Spam refers to unwanted messages, comments or actions. People who spam comments on social networks or blogs do so only with the intention of promoting something or leaving a link to their website.
In SEO terms, all the junk links that point to your website from low-quality sites, or links that the competition places to do negative SEO on you, can also be considered spam.
In SEO terms, spinning is creating variations of an already written text, using synonyms and similar semantics. Sometimes the order of sentences and even paragraphs is also changed. The goal is to generate content that looks original while taking advantage of what someone (or you yourself) has already created.
There are very advanced tools that generate spun text thanks to their databases of synonyms. And, although it seems like a very useful resource, it is not recommended for serious projects. If Google detects the text as duplicate content, your website can be penalized. And, above all, when a user reads it, they will notice something strange, because very forced expressions or phrases always slip through.
A snippet is a fragment or extract shown in the SERPs that helps the user get an idea of what they will find on a web page before clicking. The parts Google always shows are the title, the description and the URL, although it makes changes every so often.
There are other types, as you saw in the definition of rich snippets, and if you optimize them well they will attract a lot of users’ attention and you can improve your CTR, and eventually your SEO.
A subdomain is like having a domain within another. In other words, just as you can have directories within the domain (e.g. http://domain.com/directory), you can also create divisions at the domain level, like this: http://subdomain.domain.com.
Subdomains are very useful. For example, they are used when you want the URLs of your website to appear with the “www” in front of the domain name (www.yourdomain.com). They are sometimes used in SEO to separate the blog from an online store. And note: Google values subdomains as domains separate from the main one, with their own metrics and all.
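The difference between a directory and a subdomain can be read directly from the URL. A minimal sketch in Python, assuming a simple domain.tld ending (it would misread multi-part TLDs such as .co.uk):

```python
from urllib.parse import urlparse

def split_host(url):
    """Split a URL's hostname into (subdomain, registered domain).

    Naive sketch: assumes a plain 'domain.tld' ending, so it would
    misread multi-part TLDs such as '.co.uk'.
    """
    host = urlparse(url).hostname
    parts = host.split(".")
    domain = ".".join(parts[-2:])              # e.g. 'domain.com'
    subdomain = ".".join(parts[:-2]) or None   # e.g. 'blog', 'www', or None
    return subdomain, domain

print(split_host("http://domain.com/directory"))   # (None, 'domain.com')
print(split_host("http://blog.domain.com/post"))   # ('blog', 'domain.com')
```

A directory lives after the hostname, a subdomain before the domain name; that is why the first URL has no subdomain at all.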
The Google sandbox is a filter to prevent new sites from ranking better than old ones. You could say that it is a trial period that newly created sites go through to see how they react over time.
Google has never confirmed its existence. But, after many years, most SEOs have confirmed that many of the new websites they create spend the first weeks or months with almost no visits or movement in positions. And then, all of a sudden, they start to grow, as if they hit an SEO growth spurt.
Scraping is another English term that is used a lot in SEO. It is the technique of extracting information from other web pages, for example from the competition. You can extract all their content, or very specific parts of it, and analyze it to understand what strategy they are following.
You can also scrape your own website to detect problems and discover ways to improve SEO. You can extract specific data such as the title, description or even portions of code from a website, and there are different methods, techniques, tools and programming languages to perform scraping. Price comparison sites are a good example of the benefit that scraping can deliver.
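A minimal sketch of what scraping a title and meta description looks like, using only Python’s standard library. The HTML here is a local example string; in practice you would fetch the page first (respecting its robots.txt):

```python
from html.parser import HTMLParser

class SEOScraper(HTMLParser):
    """Toy scraper: collects the <title> text and the meta description."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name") == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Local example document standing in for a fetched page.
html = """<html><head><title>Winter coats</title>
<meta name="description" content="Buy waterproof winter coats."></head>
<body><h1>Shop</h1></body></html>"""

scraper = SEOScraper()
scraper.feed(html)
print(scraper.title)        # Winter coats
print(scraper.description)  # Buy waterproof winter coats.
```

The same callbacks can be extended to collect H tags, links or any other on-page element you want to audit.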
SEM (Search Engine Marketing) is a broad concept encompassing any technique related to search engine marketing. What has happened is that this concept has been distorted until many people understand it only as the management of paid advertising in search engines.
Within SEM we find activities such as SEO and PPC (Pay Per Click), which makes it a very broad branch of marketing. People usually specialize in either SEO or search engine advertising (which is wrongly called SEM), and that is why this term has colloquially changed its meaning.
Semrush is a very useful tool to help you with the online marketing strategy of your website. It has many options to analyze the SEO of your website and that of the competition. With its advanced functions you can do keyword research like a professional and keep track of your positions in Google.
And if you have clients, you can send them very complete reports on their website analytics, new keyword opportunities, competitor’s Google Ads campaigns, etc.
The title tag is the HTML tag for the meta title of a web page, for example: <title>This is the meta title</title>.
Ubersuggest is a valuable and widely used SEO tool . What Ubersuggest is good at is offering new keyword ideas. The tool uses the keyword suggestions that Google itself offers.
You can also see this yourself by typing in a keyword but not pressing enter. Also at the bottom of the SERP are keyword suggestions from Google.
In addition, Ubersuggest also gives you keyword volumes. These also come from Google, namely from the keyword planner.
The authority of a web page, like the authority of a domain, is determined by inbound links.
However, not only external links are important here, but also internal links. Search engines also analyze how many links from other web pages of your website point to a particular page.
The logic behind this is clear: if you don’t or hardly link to a certain page yourself, why should it be important?
The Mozbar is a good way to measure Page Authority and an excellent alternative to PageRank, which is no longer being updated.
With an XML sitemap you give Google an overview of your entire website. The XML sitemap shows the structure of your website and shows the URLs of all the pages you want indexed.
The sitemap is called an XML sitemap because the file is in .xml format and uses the XML standard. XML, like HTML, is a markup language and therefore serves to structure data.
Don’t forget to exclude pages from your sitemap that you don’t want indexed.
Think of thank you pages or test pages. Submit an XML Sitemap to Google via Search Console.
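A minimal sketch of what such a file contains, generated here with Python’s standard library (the example.com URLs are placeholders; real sitemaps often also carry lastmod and other optional fields):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Placeholder URLs: only the pages you want indexed belong here.
urls = ["https://www.example.com/", "https://www.example.com/contact"]

urlset = ET.Element("urlset", xmlns=NS)
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u   # <loc> holds the page URL

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

The resulting string is what you would save as sitemap.xml and submit via Search Console.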
Yoast SEO plugin
The Yoast SEO Plugin is the most popular SEO plugin for WordPress.
The tool mainly focuses on on-page SEO . It gives writers advice on keyword usage on the page and the readability of texts. In addition, the tool has many other functions, such as an XML sitemap , breadcrumbs and social media metadata.
Search intent is extremely important to SEO. You must ensure that you meet the expectations of searchers with your web pages. There are roughly three different types of search intent. These categories were created by Andrei Broder in 2001, when he was still working for AltaVista (a popular search engine in the 1990s).
- Transactional – here the search is focused on taking action. This can be a purchase, but it can also happen that someone wants to download something or contact you. An example of a search is: ‘buy winter coat’
- Information-oriented – these searches are aimed at gathering / retrieving information. You could say that people here want to become wiser, or want to learn something. A search query could be “what is SEO?”
- Navigational – these searches are aimed at navigating to a website or a specific part of it as quickly as possible. A search can be: ‘Pro SEO Expert’ or ‘Pro SEO Expert contact’.
If your webpage does not meet the search intent, many people will start pogosticking.
That is an important signal for Google not to rank you or to rank you lower .
Search intent is also important because it can say something about the stage of the purchase process the searcher is in.
If someone searches for ‘what makes a winter jacket waterproof’, then he or she is still in the exploratory phase. But if the same person later searches for ‘buy waterproof winter jacket Hugo Boss’, then he or she is probably ready to make the purchase.
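As a toy sketch of how Broder’s three categories could be applied to queries automatically (the trigger-word lists and the brand name are illustrative assumptions, not a real taxonomy):

```python
# Illustrative trigger words only; a production classifier would use
# far richer signals than simple word matching.
TRANSACTIONAL = {"buy", "order", "download", "price"}
INFORMATIONAL = {"what", "how", "why", "who"}
BRANDS = {"pro seo expert"}  # known brand names imply navigational intent

def classify_intent(query):
    q = query.lower()
    if any(brand in q for brand in BRANDS):
        return "navigational"
    words = set(q.split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & INFORMATIONAL:
        return "informational"
    return "unknown"

print(classify_intent("buy winter coat"))         # transactional
print(classify_intent("what is SEO?"))            # informational
print(classify_intent("Pro SEO Expert contact"))  # navigational
```

Even this crude heuristic shows why intent matters: the same topic (winter coats) lands in different categories depending on the words around it.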
You already know what long tail keywords are, so guessing the meaning of short tail will not be difficult. This term refers to short searches of one, two or three words. They are characterized by high monthly search volumes and by usually being difficult to rank for, due to the amount of competition.
It is not bad to try to rank for this type of keyword. The problem is that they are very competitive words and do not carry a very specific search intent. That means that even if you manage to rank, no one can assure you that the users who find you in the SERPs for that search will have much value for the business behind your website.
Search volume is an important factor in keyword research. It provides an estimate of the average number of searches per month for the given keyword. When combined with a keyword’s difficulty factor, search volume gives you insight into how valuable it is to focus your SEO energy on a particular keyword. You can retrieve the search volume of keywords with the Keyword Planner, Ubersuggest or other SEO tools.
Ironically, the term “search volume” does not itself have a very high search volume.
A keyword is a single word or a combination of several words that are typed or spoken in a search engine to find content on a web page.
The American SEO company Moz, which has its own bots/spiders to analyze the internet, divides keywords into three categories:
- Fat head – 18.5% of all searches
- Chunky middle – 11.5% of all searches
- Long tail – 70% of all searches
Keyword research makes several keyword analyses possible. For example, with keyword research you can analyze which pages rank reasonably well, but could still be improved.
Another example of a keyword analysis is identifying pages that rank in Google, but not high enough to bring in traffic.
The results of these analyses form the basis for a keyword strategy or plan.
Keyword research produces an extensive list of keywords that are relevant to the field in which your company is active.
These keywords are enriched with valuable information such as search volume, difficulty factor, current rankings, page performance and much more. Extensive keyword research combines information from various sources, such as the keyword planner, SEO tools, Google Analytics and Search Console.
Such keyword research forms the basis for making various keyword analyses.
Learn how to conduct keyword research on our keyword research knowledge page.
In the keyword strategy you visualize for your organization where you currently stand in terms of keywords , but also where you want to go and how you will get there.
You describe which keywords are your ambition, why and how you will rank higher on these keywords.
You set out all of this in a timeline. A keyword strategy does not have to be a grandiose plan, but must be concrete and practical in nature.
The plan serves as a guideline for your SEO work in the field of keywords in the coming period.
Keyword difficulty is a factor used in keyword research. The figure indicates how high the organic competition in the search engine is for a specific keyword.
This factor generally runs from 1 to 100, where 1 means that there is little or no organic competition and 100 means that the keyword has the toughest form of competition possible.
The keyword difficulty factor in most SEO tools is based on the domain authority of the top 10 results in Google for the given keyword.
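As a rough illustration of that idea (the DA numbers below are made up, and real tools weight in more signals than a plain average), a difficulty score could simply average the domain authority of the current top 10:

```python
# Hypothetical domain-authority scores (1-100) for the top 10 results
# of some keyword; real tools pull these from their own link indexes.
top10_da = [91, 88, 85, 80, 78, 75, 70, 66, 60, 55]

def keyword_difficulty(authorities):
    """Toy difficulty score: the rounded average authority of the top 10."""
    return round(sum(authorities) / len(authorities))

print(keyword_difficulty(top10_da))  # 75
```

A SERP full of high-authority domains pushes the score toward 100; a SERP of weak domains pulls it toward 1, signaling an opportunity.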
The keyword planner is a tool from Google Ads (formerly AdWords). It is actually intended as an SEA tool, but it is also very valuable for SEO. The keyword planner is especially valuable for expanding your keyword research.
So it helps you get new ideas for keywords. In addition, the tool also displays search volume for one or more keywords, but a caveat is in order here.
If you do not actively advertise with Google Ads, Google nowadays only gives a (too) rough estimate of the volumes.
For example: 1K – 10K. Don’t want to advertise, but are still looking for a tool that displays free search volumes? Then use Ubersuggest.
The term keyword stuffing is used by Google itself to define the practice of overloading a website with highly repeated words. It is a practice that used to be widely applied to improve web positioning, and believe me, it worked very well. But besides carrying the risk of a penalty, it is no longer effective.
Google claims that this does not help the user, and it is right: reading a text in which a keyword is annoyingly repeated takes away the desire to continue reading. I do not recommend using it under any circumstances. Better to write for users, with an SEO twist, of course.
You already know that SEO is about getting visibility, visits and, in the end, more sales. But what usually happens when you achieve that? The competition becomes fierce.
And sadly, some people play dirty and use negative SEO to knock their competition out of the top positions.
There are some well-known negative SEO techniques. One is to create a lot of toxic backlinks with the intention of causing a penalty from Google.
Another is to generate DDoS attacks to overload the server’s network. Other attacks are made using pogosticking or lowering the CTR.
And these are not the only ones; there is a lot of misused ingenuity devoted to harming others. It pays to know these techniques so you can detect and deal with them.
We are not going to talk about anything funereal here. In the world of SEO, a niche refers to a small portion of the market made up of people with similar interests. Because their tastes and needs are well defined, they may be willing to buy certain types of products or services.
Many SEOs detect these niche markets to create websites with content aimed at these audiences in order to show them advertising or affiliate products related to that topic. Such a project usually has a higher profitability than a generalist one.
If you are interested in the subject, you can read the post we dedicated to SEO for niches.
A PBN is a private network of blogs, created in order to help position a web page. It is a strategy to get links with authority, without having to ask anyone or buy them directly. But that does not mean that you do not have to invest time and money.
The usual approach is to buy expired domains, or domains at auction, that already have age and authority. SEO experts in PBNs register them with different companies and host them on different servers, all with the aim that Google does not detect the PBN or a link pattern that reveals it. There is a whole science behind creating a PBN, but this brief definition can give you an idea of how useful one can be for improving web positioning.
Google puts a lot of effort into making its search engine the best. It wants to give good results because that way we will continue to use it. To that end, it dedicates many resources to removing “bad” results from its search engine, making use of SEO penalties.
Some penalties are carried out manually by a Google quality rater who detects a violation of its rules on a website. Other penalties are algorithmic, that is, a consequence of Google’s algorithms.
Pogosticking occurs when a user clicks on a Google result and leaves immediately, returning to the results page to look for another result that answers their search. This effect is negative if it happens on your website.
Makes sense, right? Google wants to offer good results in its search engine. It wants them to be relevant. But if it detects that your website, even if it ranks first, does nothing but cause users to pogostick, how do you think Google will value it?
Indeed, it will not help you stay in that position, because Google understands that your website is not relevant to that search.
You can detect or intuit that your website is being a victim of pogosticking if you analyze your organic visits and see a very low dwell time or a high bounce rate.
In SEO, the term query refers to the keywords or phrases that we type into Google.
The search engine makes suggestions as soon as you start typing, which causes most search queries to be ones suggested by Google itself. These queries set Google’s machinery in motion and you instantly have the results on screen.
A 301 redirect is an instruction executed on the server where your website is hosted to send users (and web spiders) from one URL to another. It is a very practical resource in SEO, for example to avoid duplicate content or to remove 404 error messages.
Although someone with technical knowledge would tell you that there are many ways to redirect, the best for Google are the 301 redirect (and the 302) because they transmit PageRank to the URL they point to.
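As a sketch of what happens at the HTTP level, here is a toy server in Python’s standard library (the paths /old-page and /new-page are hypothetical) that answers a request for the old URL with status 301 and a Location header. In practice you would configure redirects in your web server (e.g. Apache or nginx) or CMS rather than in application code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    """Answers /old-page with a 301 pointing to /new-page."""
    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)                 # Moved Permanently
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

# To try it locally, uncomment the next line and visit /old-page:
# HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```

The browser (or Googlebot) sees the 301 status, follows the Location header to the new URL, and over time search engines transfer the old URL’s signals to it.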
A dofollow link is a normal link. It is a text or image hyperlink that conveys PageRank and authority to the linked site.
When Google’s web spiders crawl a page and find a dofollow link, they follow it and convey part of the authority of that page to the page it links to.
In short, it is the kind of link we all like, the kind we want when someone links to us.
But if these are the good links, does that mean there are bad links? Not quite, but there are other types of links that are not as strong as far as SEO is concerned. I’ll tell you about them below.
Nofollow links are those that are tagged with rel="nofollow" in the HTML code. This tag is used to tell Google not to take the link into account, that is, not to transmit PageRank to the linked domain. PageRank is “lost” through nofollow links; let’s say you prefer to lose it rather than give it to others.
It is advisable to use them in your link building strategy to give your link profile naturalness. Google says that it “generally” does not take them into account, which means it may do so on some occasions. Here’s a good topic for discussion.
In September 2019 Google released two new link attributes. The sponsored attribute is written as rel="sponsored" and is recommended for advertising or sponsored links.
The other attribute that Google published was UGC, which stands for User Generated Content. It is marked in the HTML code as rel="ugc".
The use of this attribute is recommended for backlinks in blog comments, in forums and in all links generated by users themselves.
Link
We have already seen dofollow and nofollow links, but many people do not fully understand the concept of a link itself. It is basically a reference that allows the user to access another web page with a single click.
The HTML format for a link is as follows:
<a rel="nofollow" href="http://www.google.com/">Google</a>
This is the code for a link, where you can see terms that you already know like the anchor text and the nofollow attribute .
The 404 error is a status code in the HTTP protocol. This error message appears in your browser when you try to access a URL that does not exist. It happens by default on web servers when they receive a request for a page that is not among their files.
It is a common error that appears when you change the URL of a page while it is still linked under the old URL from somewhere else, whether from your own website or another. It is very important to regularly check your website for 404 errors, because they negatively affect web positioning.
H tags are HTML tags used to define headings and subheadings in a text. Like other tags, such as those for articles, paragraphs and italics, H tags help structure the text and make it easier to read.
Here is an example of an H tag: <h1>This is an H1</h1>
The <h1> is used only once per URL, because it identifies the main heading. The rest of the headings are labeled according to their importance, in order: <h1>, <h2>, <h3>… down to <h6>. Headings let the user know what to expect in the text below. That is why it is logical that using keywords in the H tags helps you rank for them in the search engines.
Footprints are commands used when searching on Google to filter the results. Some are operators that Google makes available to us to locate content in a more specific way.
For example, the “site:” operator is used to find results from a single domain, optionally combined with a keyword. If you want to search for content related to SEO on the Publisuites website, you can do so by typing this into Google: site:publisuites.com SEO.
There are also other footprints for finding sites where you can leave a link. This works because many websites follow patterns in their source code or text, which can be found via Google.