SEO (Search Engine Optimization) is the set of Digital Marketing actions and strategies that aim to increase the traffic and performance of a website through organic results in search engines such as Google, Bing and even YouTube.
What is SEO?
Every time a page is published on the internet, Google (and other search engines) work to index it so that it can be found by those searching for it.
However, millions of pages are published on the internet every day, which means there is quite a bit of competition. So, how do you put a page in front of the others?
That’s where SEO (Search Engine Optimization) comes in. As the name itself indicates, SEO is optimization for search engines: a set of techniques that influence how search engine algorithms rank a page for a certain keyword that was searched.
On this page, you will be able to study everything about SEO, from the basics to the advanced. At the top there is an index to find the desired chapter.
But, if you want to read from the beginning, nothing better than starting with its history.
SEO History
In 1993 Architext emerged, considered the first internet search engine (which became Excite). With its success, similar new sites emerged such as Yahoo! (1994) and Google (1997).
Founded by Larry Page and Sergey Brin, Google was created to be a large-scale search tool and “organize the internet”, using the link structure to determine the relevance of pages according to the user’s search.
The idea of using the links received by a page is inspired by the academic environment: an article or scientific research that receives citations from journals and articles by other authors, mainly those with a better reputation, are considered more reliable.
Following this logic, the revolutionary PageRank was developed: a metric from 0 to 10, created by Larry Page and calculated by the quantity and quality of links received.
In December 1997 PC Magazine wrote that Google “has an unusual ability to return extremely relevant results” and ranked it as the top search site in the “Top 100 Web Sites.”
According to this Search Engine Land post, it was also in 1997 that the term SEO was first mentioned, in the book Net Results, written by Bob Heyman, Leland Harden, and Rick Bruner. According to them, the term arose in a discussion about the positioning of the Jefferson Starship band’s site on search sites.
By including more keywords with the band’s name in the site’s content, they noticed that the site returned to the first position. With this, Bob and Leland named that technique Search Engine Optimization.
Until the popularization of Google, SEO actions were limited to submitting the site to search engines and on-page optimizations, such as the inclusion (and repetition) of keywords in the content.
Once Google became popular, SEO professionals began to look more for the link metric, which is very important for the search engine.
This is how link building strategies emerged, exploring both legitimate techniques for obtaining links and darker practices, focused solely on improving the evaluation of the site, regardless of its quality.
These ranking manipulation techniques are known as Black Hat SEO.
In 2000, Google Toolbar was released for Internet Explorer, which presented the PageRank of the sites, from 0 to 10, which made link building techniques more measurable and popular.
In the same year, Google’s organic results received company: Google AdWords was launched, including sponsored results, which remain in search results to this day.
After years of site optimizations, link generation and a lot of ranking manipulation with Black Hat techniques, in 2003 the first major update of its algorithm was released, called Florida, which changed SEO forever.
According to an article written at the time by Gord Hotchkiss, Florida was a filter applied to commercial-based searches, identified by the use of specific keywords. It cleaned up many of the previously ranked sites (in various tests, the tool removed 50-98% of previously listed sites).
The targets were affiliate sites with domains containing keywords and a network of links pointing to the site’s home page.
When it was released, the update caused a stir among merchants who relied on those sites as their main source of traffic (and sales).
Despite the impact of the update, the results were positive: higher quality sites were launched, companies invested more in their own websites, and search results improved.
That was just the first Google update. During the following years, new updates were released, always with the aim of reducing the non-legitimate results presented by the search engine and improving the quality of the searches.
Since then, with each update released by Google, various speculations about the death of SEO are also released.
However, site optimizations for search mechanisms go far beyond questionable techniques that seek to manipulate the results displayed by Google, which are penalized and become extinct with updates.
Optimizing sites for search engines is done to reach users by delivering the answer they are looking for in the ideal format, offering the best possible experience within the brand’s environment and following the search engines’ guidelines.
The most popular search engine in the world is Google, and this is the next chapter.
How Google Works
Have you thought about everything that happens between typing your search and clicking on Google results?
What happens in that interval is the secret of the search giant’s success. The quality and speed of its ranking made the company the largest search engine in the world, surpassing its competitors and even absorbing their market share.
To get an idea, its dominance is so great that in the United States “to google” exists as a verb, used in phrases like “He googled you”.
Learn more about the work that exists behind the most accessed results page worldwide.
Crawling, Indexing And Display Of Results
These are the 3 main processes involved in returning search results.
Crawling is the process in which Google’s robots (called Googlebot) identify pages to submit to the search engine’s index. For this, the robots use algorithms that define the prioritization and crawl frequency of the pages.
The process starts with URLs generated from previous crawls, enriched with sitemaps. As it visits the pages, Googlebot identifies existing links and adds them to its crawl list. During the process, new sites, changes and removals are detected and updated.
After this, indexing is done, in which Googlebot processes each of the crawled pages to include them in its index. At this time, information such as page content, publication date, title, description, as well as structured data are indexed.
In this way, when a query is made, Google searches its index for matching pages and displays the most relevant results. This relevance is not based on assumptions: it is determined by ranking factors.
The search process also includes Google autocomplete and the well-known “Did you mean…”, made to save time, correct errors and facilitate searches.
Algorithm And Updates
Even if you’ve never worked with SEO, you’ve probably heard of the Google algorithm, or those of social networks like Facebook and Instagram.
These algorithms are responsible for filtering what is most relevant to the user and not simply leaving all the content available on the page, without any classification criteria.
Google uses more than 200 ranking factors to define the order of the pages presented to the user for each search carried out.
In order to increasingly improve the information presented to the user and its relevance, this algorithm is constantly updated. Know the main ones and their impacts:
Florida (2003)
Florida was Google’s first big update and is considered the update that put SEO on the map.
When it was launched, it removed between 50 and 98% of the previously listed sites. The targets were low-quality sites (mainly affiliates) that practiced keyword stuffing, with domains containing exact keywords and with a network of links leading to the site’s main page.
Panda (2011)
Panda was another big update, affecting about 12% of search results. The objective was to penalize low-quality sites, sites with many ads and sites hosted on content farms. Since then, its updates have always focused on the quality of the sites’ content.
After 27 updates impacting search results, the last one came in 2015. Panda 4.2 was just a database update, but it ended up impacting many sites that were still producing very poor quality content.
Penguin (2012)
Better known at the time as the WebSpam Update, Penguin was the update responsible for curbing excessive content optimization. With its launch, 3.1% of search results in English were hit.
Its objective is to identify and penalize sites that practice keyword stuffing and that participate in schemes to generate links (techniques considered black hat).
In the same way as Panda, this algorithm update went through a series of improvements and releases, until it reached version 4.0 (2016), when it officially became part of the Google algorithm and began to work in real time.
Hummingbird (2013)
Unlike the previous ones, the Hummingbird update was not just a supplement to Google’s algorithm, but a complete overhaul of it.
With the update, search results go beyond the keyword: the search engine considers not only the terms searched for, but also their entire semantic universe, such as the meaning of the search (including synonyms and the context in which the terms appear on the pages) and even more complex factors, such as the user’s location and previous searches.
All this is done to make the presented results more and more related to the user’s true search intention and not just the exact search words.
HTTPS/SSL Update (2014)
After a warning and strongly encouraging webmasters to invest in security, in 2014 Google announced that HTTPS was becoming a ranking factor, as a way to encourage the migration of the online community and thus make the web more secure.
This incentive is due to the fact that the sites that have an SSL certificate (and thus their migration to HTTPS) use encrypted information, which prevents the data from being identified along the way, in case it is intercepted.
Mobile Friendly Update – Mobilegeddon (2015)
Google’s update for mobile devices is known as Mobilegeddon (referring to the movie Armageddon) due to the impact that specialists believed it would cause. In practice, however, the impact was not that great.
Long story short, the update began prioritizing mobile-friendly sites. There were no degrees of adaptation: either a site was mobile-friendly or it wasn’t.
In 2016 Google released a new mobile friendly update, which had a lower impact on rankings than the first, according to webmasters (the main reason was that most sites were already adapted).
Rankbrain (2015)
Google announced in 2015 the official incorporation into the algorithm of a system that used machine learning and artificial intelligence, with the purpose of helping the interpretation and presentation of search results: RankBrain.
According to Google itself, the system became one of the top 3 ranking factors, along with links and content. However, unlike the other 2, it was harder to optimize sites for this intelligence. What could be done was to explore the words that are part of the content’s semantics and make the entire context of the content clear.
Fred (2017)
The Fred update was released to identify sites with low-quality content and lots of banner ads.
According to John Mueller, Google’s Webmaster Trends Analyst at the time, “If you’re following good SEO practices, the only reason your site will be penalized is poor quality content.”
2018: YMYL (Medic Update)
This algorithm update was applied on August 1, 2018, and mainly affected health- and money-themed websites. Many websites that had good traffic were seriously affected by this change.
2019: Neural Matching
In November 2019, a new update was confirmed: Neural Matching. Neural matching allows Google to better understand when users’ queries have a local search intention, even when the name or description of a particular company is not included in the search.
2020: January and May Core Updates
In the long history of Google algorithm updates, the main updates, known as Core Updates, are the ones that can generally affect the organic ranking of many websites. In 2020, Google released two powerful core updates, one in January and one in May.
In the case of the first, websites with content related to pets, motor vehicles, health and other specific niches were the ones that saw proportionally greater changes than others since the beginning of the core update process.
In the case of the May 2020 core update, the impact was greater: the most affected categories were travel, real estate, health, pets and animals, as well as sites related to people and society.
Google Ranking Factors
As we mentioned in the previous point, Google defines the prioritization and relevance of the results based on more than 200 ranking factors.
Several studies are done each year to unravel all the factors used. The company Backlinko, for example, compiled a list of the 200 ranking factors, highlighting the 10 main ones.
Additionally, at the end of 2017 SEMrush carried out a study on the positioning factors, with 600 thousand keywords from its global base and the first 100 results pages for each of them.
The 5 most important factors that were identified in this study were direct visits to the site, visit time, number of pages accessed, bounce rate and total referring domains.
Some curiosities that were identified in the study:
- The average difference in content size between the first 3 positions and the 20th position is 45%;
- The average bounce rate for domains in the top 3 is 49%;
- 3% of backlinks contain a keyword in the anchor text.
Mobile-First Index
Announced at the end of 2016, this new form of Google indexing has been carried out gradually since 2017.
In practice, what changes is that Google’s indexing always started from how sites were viewed on desktop. With this change, the crawler uses the mobile version of the site as its base.
This can greatly harm the organic positioning (and consequently the traffic) of sites that are not adapted for mobile devices. However, according to Google, sites that only have a desktop version will continue to be indexed and, if they have quality content, they may not be harmed.
If you already have a responsive site, you do not need to make any alterations beyond the traditional ones. If you have a mobile version of your site (m.example.com), it is necessary to adjust the canonical tag of the pages, as in the following example:
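A minimal sketch of that markup, following the same pattern detailed in the “Alternate Tag” chapter later on (the URLs are illustrative): the desktop page declares the mobile alternate, and the mobile page points its canonical tag back to the desktop URL.
<!-- On the desktop page (https://www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">
<!-- On the mobile page (https://m.example.com/page) -->
<link rel="canonical" href="https://www.example.com/page">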
5 Reasons To Invest In SEO Today
Invest In What Is Really Yours
Have you ever wondered what your greatest internet wealth is? I would say the best answer is: your site. Your site will be with you for as long as your company is active, and it is on it that people will know what you do and what the advantages of choosing your business are.
And what about social networks? They are extremely important at several points of the Digital Marketing strategy; however, they are not your property.
Imagine if, 15 years ago, we had invested a very high amount in Myspace. Without a doubt, what seemed like a very clear bet could have been disastrous.
Investing in SEO for your site is investing in something that is really yours; it is betting on your greatest asset on the internet and reaping the fruits of hard, but very valuable, work.
Conquer Space On The 1st Page Of Google
Being on the first page of Google does not mean that your company provides a better service or has a much higher quality product than the competition. Being among the first means that your SEO work is being done well and that your content is relevant.
However, for the general public that uses the network for research purposes, the feeling is that Google indicates you as one of the best in the market.
Here, small and large companies compete on the same level. One more reason to invest in SEO.
Be The Reference Your User Is Looking For
The greater the number of searches that show you on the first pages, the greater your authority in the segment in which you operate.
People search for relevant data, problem solutions, tips to improve their routine and much more information. And if, in these cases, the searches were for topics in which your company operates, without a doubt the brand that people will see as a market reference will be yours.
It is not enough just to produce content, it is necessary to be aligned with ranking techniques so that your site is always ahead of the competition, that is, you need to invest in SEO.
Increase Visits To Your Site (And Sell More)
Being on the first pages of search engines means more visits. Optimizing your site will make your navigation easier and your content, in addition to being valuable, can be even more attractive. Result: Increased search volume, new site conversions, more Leads to engage with, and many more new customer opportunities.
A stagnant site, without updates and without the use of SEO techniques, will not generate recurring visits and will lose relevance over time, going from a site with the potential to generate a lot of business to a simple static virtual business card with very few visitors.
Increase The Quality Of Your Ads
If you plan to invest in Google Ads and improve your campaigns, it is essential that you focus on SEO.
Google Ads measures the position of your ad through the Quality Score. Google Ads itself estimates (along with the bid provided) what the position of your ad will be in the auction. The Quality Score takes into account CTR (click-through rate), the relevance of the keywords and the quality of the page that will be accessed. It is right at this point that the role of SEO strategies begins.
Optimizing the page for the user (among other factors) includes adding original content, relevant data, and other information that, along with various SEO techniques, can help improve the browsing experience. There’s no use clicking on an ad looking for one thing and finding another, right?
Another important aspect is that, by having your page optimized, you can make your offer cost less than expected. One more point for SEO!
SEO Techniques: Black Hat x White Hat
There are some unethical techniques that, according to Google, are done with the sole purpose of boosting organic positioning, without taking the user experience into account.
Examples include repeating the content’s keyword several times, buying or generating a large number of links from other sites, and including words on the page hidden from users so that only the search engine robots read them.
All this offers a bad experience to the user who, when accessing a site in the first position for their search, expects quality content that answers what they are looking for. It also goes totally against Google’s mission: “to organize the world’s information and make it universally accessible and useful”.
To penalize the sites that use these techniques and guarantee that the best options on the internet are the ones that appear in the first places, Google has been making constant updates since 2000 in its algorithm (as we mentioned before) and also created the Guidelines for Webmasters, with the rules and best practices to position sites organically.
Currently SEO techniques are divided into 2 groups: Black Hat and White Hat.
Black Hat SEO
Black Hat techniques seek only organic positioning and violate Google’s guidelines.
These techniques generate short-term results, but they are considered unethical and very susceptible to sanctions and even exclusion from the search engine.
The 10 most popular Black Hat techniques are:
- Keyword Stuffing;
- Hidden Content;
- Duplicate Content;
- Cloaking;
- Doorway Page (Or Gateway Page);
- Link Farm;
- Private Blog Networks (PBN);
- Paid Links;
- Blog SPAM;
- Negative SEO.
White Hat SEO
We can consider as White Hat SEO the techniques that do not go against the guidelines of Google and other search engines. They work in the medium and long term and do not generate penalties for the site that uses them.
According to Google’s guidelines, it is basically recommended to create original and quality content, always thinking about the user and avoiding other types of tricks to improve the organic position of the site.
Google recently updated its SEO Guide, in which it recommends various SEO techniques that are in accordance with its guidelines.
SEO On Page
Some specific parts of a page are more relevant in search and deserve special attention.
Working on them means having more chances for Google to consider you as a result for a search carried out. This in the industry is called “SEO on-page”.
More important than knowing these techniques is not putting them above good content. With each algorithm update, Googlebot gets smarter at identifying the best answer to a user’s question.
Therefore, what works here is to have on-page SEO techniques together with excellent content. The two things must be integrated to work.
Speaking specifically of on-page SEO, the main attributes that stand out are:
Contents
As we mentioned before, content is the most important asset for a good SEO job.
Using a certain keyword within the content increases the chances of Google displaying your page as a result to those searching for that word.
Do not exaggerate the use of the keyword, distributing it through the text in a forced way. Excessive use of the same word can confuse users.
As this is not a behavior approved by Google, it is very likely that your site will suffer some consequences, such as a reduction in the display of your page in search results.
If you are looking for an answer to know how many times a keyword should appear in a text to have a better position, we must inform you that such an answer simply does not exist.
If in doubt, develop content naturally and ask yourself the following questions:
- Is the keyword cited in the content?
- Is it a reasonable amount?
- Would users be satisfied with the number of appearances?
- Were partial variations used?
- Were synonyms used?
If all responses are positive, the page is ready to be published.
Scannable Content
How is the content you produce for your company? In addition to the ideas themselves, do you also think about how to structure it?
Contrary to what some people think, the internet is not like a book. This means that if you write a long, monotonous block of text, people will simply ignore your message.
On the internet, people tend to multitask. They may be browsing your site while chatting with their friends on Facebook, responding to emails, or other activities.
As a result, they need to filter and scan information on your site easily. Otherwise, they will go to any other page.
Some of the elements you can include to make your content scannable are:
- Headings (Or Headers)
- Bullet Points
- Bold Font
- Italics
- Images
- Videos
- Short Paragraphs
Semantics
As we’ve mentioned, it’s important to optimize your page for a keyword to perform better in search results.
However, there are other factors that help Google understand the context of your content besides the keyword.
These factors can even rank your site for terms for which the content is not optimized, which proves that Google places more and more importance on the quality of the content.
Duplicate Content
You have probably already seen, in some search results, that Google reports that some results were omitted:
“To show you the most relevant results, we have omitted some entries very similar to the 3 already shown.
If you want, you can repeat the search and include the results that have been omitted.”
On many occasions, the reason for this omission is the existence of duplicate content, that is, more of the same.
That’s why replicating the same content from another site to yours probably won’t bring you organic traffic, and this is why it’s important to invest in original content.
Title And Description (Title And Meta Description)
The title of the page (the “title”) is not necessarily the same as the headline of the article. Actually, the title refers to a property of the HTML code (<title>), identified by the phrase that appears in the browser tab or on the Google results page.
This is a very important element for SEO, which often becomes a space occupied by slogans or phrases that do not describe the page well or that do not include the keywords that really matter for the business (for example “home” on the home page of the site).
Our advice here is to try to accurately describe the topic that the page deals with and pay attention to the order of the words: the first ones have more relevance than the last ones.
Do not forget that there is a size limit for the title that appears in the search results. It is recommended to use a maximum size of 65 characters for the title.
Google can read a larger amount, but this is the limit that usually appears when the search engine displays the results on the page.
An example of the importance of the title for SEO is in the following graph, which presents the clicks on a page. After we altered the page title (period highlighted in the graph), the number of clicks jumped significantly.
And the description of the link: is it important for SEO?
When you search for a certain article, the result shows a small summary below the title. This field corresponds to the meta description, a code within the page that has no weight as a ranking factor, but directly influences the click rate. And this rate is a factor that will influence the ranking of your page.
In a simple way, the intention of the meta description is to convince the user to click on the link. For this reason, it is a very important field to be worked on.
The tip here is to make a striking, interesting meta description that informs, sparks curiosity and also has a Call-to-Action touch.
It should be remembered that the included text will not always be displayed on Google. The search engine prioritizes the text most relevant to the user: if a part of the content is considered more appropriate than the configured description, that is what will be displayed.
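Both elements live inside the <head> of the page’s HTML. A minimal sketch, with illustrative values:
<head>
  <!-- Shown in the browser tab and as the clickable headline on Google -->
  <title>Digital Marketing: What It Is and How to Do It</title>
  <!-- Shown as the summary below the title in search results -->
  <meta name="description" content="Learn what Digital Marketing is, how to do it and its main strategies.">
</head>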
URLs
Another element that Google scans for keywords is the page address itself.
Consequently, it is essential that your URL is descriptive and contains the desired keyword, something like “http://site.com/post-name”.
Avoid creating URLs with codes like in the model “http://site.com/cfr35643dr3465” or with parameters like “http://site.com/?p=243”.
Additionally, it is more friendly, reliable and easy for people to share the links of your site with their respective contacts.
Look at a good example of content with optimized title, description and url:
- Keyword: digital marketing
- Title: Digital Marketing: What it is, how to do it and everything about Online Marketing
- URL: https://example.com/digital-marketing
- Description: Learn EVERYTHING about Digital Marketing: what it is, how to do it, its advantages, strategies, tools and much more.
If the site is on WordPress, install Yoast SEO to easily edit this information.
Other CMSs generally allow easy editing of this information. Otherwise, you will need to edit those tags in the code.
To learn more, go to the post “What is a friendly URL for Google and other search engines”.
Internal Links
The entire internet is made up of links. Users browse all the time through links, as well as the robots of each search engine.
Taking this into account, the reasons for your site to have internal links are very simple. Some of them are:
- Improve user navigation.
- Make it easy for search engines to discover new pages.
- Link pages that address similar issues.
When including links, it is important to pay attention to the anchor text and not work with generic phrases such as “in this post” or “click here”.
The ideal is to use an anchor related to the subject of the destination page, indicating to the robots that, when it comes to a certain subject, that is the main page.
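As an illustration (the URL is hypothetical), compare a generic anchor with a descriptive one:
<!-- Generic anchor: tells robots nothing about the destination -->
To learn more, <a href="https://site.com/seo-guide">click here</a>.
<!-- Descriptive anchor: signals the subject of the linked page -->
To learn more, read our <a href="https://site.com/seo-guide">complete SEO guide</a>.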
Alternative Text
All images on a page must have an alternative text (alt text) in case the image is not displayed. It is by filling in this field that you help screen readers and Google understand what the image represents.
Using the keyword in the alternative text (or alt text) also helps the organic positioning of the content, reinforcing to search engines the context for which the image was added.
Google itself recommends that the alternative text be descriptive in relation to the image it represents and that it not be too long.
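In the code, the alternative text is simply the alt attribute of the img tag; a quick sketch with an illustrative file name:
<!-- A short, accurate description of what the image shows -->
<img src="digital-marketing-funnel.png" alt="Diagram of a digital marketing funnel with four stages">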
To find out where to find images for your pages, check out the post “57 free and paid image banks you should know about”.
Featured Snippet (Position 0)
Have you noticed that for some Google searches, a block with a snippet of the content you’re looking for is displayed?
This result is the featured snippet, also known as Google’s “position 0”, and generally shows recipes, tables, detailed definitions, etc.
This is an example of a paragraph snippet. This model displays a small excerpt of content with the intention of answering the user’s question on the results page itself.
It is important to highlight that there is no relationship between position 0 and position 1. There are even studies that identified sites on the 8th page of the search engine presenting the result in position 0.
Now you must be wondering how to get to Google’s position 0, right?
In reality, there is no specific markup or guarantee of getting it. It is something relatively new, and many are still studying the subject.
It is recommended to work with content that displays a direct and objective answer to a specific question or a step-by-step tutorial to which the search is applied.
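As a sketch of that idea (the content here is illustrative), a question in a heading followed immediately by a short, direct answer gives the search engine an easy block to lift into the snippet:
<h2>What is SEO?</h2>
<p>SEO (Search Engine Optimization) is the set of techniques used to improve a page’s positioning in the organic results of search engines.</p>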
SEO Off Page
We can consider off-page SEO to be all the activities carried out outside the domain of the site that directly impact its organic positioning.
It is generally linked to link building, but there is another factor beyond links that influences the results of a site: brand presence.
Brand Presence
Improving your brand presence on the internet is not the same as generating links (or link building). The presence of the brand will not increase the authority of your domain, but it will increase the trust of Google and other search engines in your brand. And if your brand is trustworthy, it will hardly be penalized.
Some factors that can influence your brand presence:
- Mentions of the brand name without links on sites and portals;
- Consolidated presence on YouTube;
- Positive user rating on Google;
- A strong and engaged fan base on social media.
Receiving a mention of your brand with a link to your site (mainly from relevant and reliable sites), in addition to contributing to your brand’s presence and trust, increases the authority of your site, and is already part of link building actions.
Link Building
As we mentioned in Google’s ranking factors, page and domain authority are among the most important criteria. After all, classifying sites by taking into account the links that point to them was Google’s main differentiator when it entered the search engine market.
In addition to the search giant announcing that it is one of the main ranking factors (as we said before), a study of 1 million Google results carried out by Backlinko pointed to the number of referring domains as the factor with the most impact on positions.
However, since the Penguin update it is not only the number of links that Google takes into account when rating a site, now the focus is on the quality of those links. To understand it better, let’s compare links with prizes:
- Until 2015, Leonardo DiCaprio had 2 Golden Globes, 1 Berlin Film Festival award, 1 People’s Choice Award and 1 AACTA Award. A total of 5 prizes;
- In 2016 he won 1 Oscar and 1 Golden Globe. A total of 2 awards.
And what had the most impact on the actor’s career (and fee)? Surely winning his first Oscar had more impact on his life and career than all the other awards combined.
The same applies when we talk about links. It is more important to receive a link from a reliable, authoritative site that has relevance in its content, than to receive several from small sites, without any authority or relevance.
To generate links for your site, there are some tactics used by SEO professionals. Learn about 6 popular link building techniques:
Guest Post
Production of articles for other blogs as a guest (with a link to your site), exploring content related to the company’s universe on a relevant site. It should not be done just to generate links; it is important to maintain a quality standard and include links that make sense for the user. Learn more about guest posts.
Brand Mentions
Not all sites that talk about you or your brand include a link to your site. Therefore, it is important to monitor these mentions and, if it is done without a link, contact the person in charge to try an inclusion. You can use Google Alerts for that.
Broken Links
It is very likely that sites relevant to your business have generated links to other sites at some point and that today they are broken (the other site disabled the page or simply does not exist anymore). If you have relevant content to replace the broken link, just contact the author or owner of the site reporting the error and recommending the change for your link.
Interviews With Specialists
Interviewing specialists in your area of activity brings many advantages, since it generates quality content and also offers the possibility of generating links from the specialists themselves. To encourage it, you can contact the interviewee after publication, thanking them for their participation and sending the link to the interview. You can also take advantage of the contact and indicate content from the specialist’s site that fits the interview.
Studies
They are not easy to do and, depending on the chosen methodology, can even be unfeasible. However, this resource places you as a reference when other sites use the statistics raised in the study, generating links.
Press Office
Working with a press office brings several advantages to your company. Among these, there is the possibility of getting news or articles on various sites and portals. If the agency already works with link building, or has agreed on that with you, the chances of generating links with each story obtained are much greater.
In theory it may seem simple, but the larger the site, the more difficult contact and link generation will be. Another problem is that many large sites, mainly news portals, only include external links using “NoFollow”.
What Are “NoFollow” Links?
When you include a link on your site, in an image or anchor text, and regardless of whether it is an internal link or one to another site, the search engine robots, when passing through the page that contains it, will identify the link and follow it to the page you are indicating.
Those links are known as DoFollow links and no alteration is necessary to keep them that way.
However, there is a way to prevent robots from following a link and attributing authority to the page receiving it: the “NoFollow” instruction. As a meta tag, it is included as follows, inside the <head> section:
So that robots do not follow the links:
<meta name="robots" content="NoFollow" />
To stop robots from indexing the content:
<meta name="robots" content="NoIndex" />
So that the robots do not index the content and do not follow the links:
<meta name="robots" content="NoIndex, NoFollow" />
This information can be inserted inside the head of the site’s code, applying to all the links on the page, or only to some specific links. Many sites include it only in external links, as follows:
<a href="https://mentionedlink.com" rel="nofollow">
Despite not passing authority to the site that receives this type of link, that does not mean they are worthless. In addition to contributing to brand presence, they can generate free, often qualified, traffic, depending on the site where they appear.
Because it is a reflection of actions that occur on other sites, it may seem difficult to measure the results of a link building strategy or the reputation of a site. For this, there are some metrics that can help you.
PageRank
A metric from 0 to 10 created by Larry Page (one of the founders of Google) to measure the authority of a page. PageRank calculates the quantity, quality and relevance of the links received by a page. Since 2016 it is no longer publicly disclosed.
Page Authority and Domain Authority (MOZ)
Metrics from 1 to 100 created by MOZ to measure the authority of a page (Page Authority or PA) or of the entire domain (Domain Authority or DA). They are calculated taking into account the links received, MOZRank, MOZTrust and other metrics. They are considered good PageRank alternatives and can be viewed in the MOZBar (free) and Open Site Explorer.
URL Rating and Domain Rating (Ahrefs)
They are logarithmic metrics from 1 to 100 to measure the strength of backlinks of a URL (URL Rating or UR) and of an entire domain (Domain Rating or DR). They are calculated in the Ahrefs tool and it is necessary to have an account to access it.
On-Site SEO (or technical SEO)
Many professionals group these techniques together with on-page SEO, since they are improvements made within the site itself. However, we believe that on-site SEO, also known as technical SEO, is a separate category, which does not involve content production so much as the programming, performance and usability characteristics of the site.
With Google updates giving more and more attention to content quality, these techniques are often forgotten by professionals. But it should be remembered that factors such as UX (user experience), loading speed and security are becoming more important every day and require your attention.
Learn below the main points of attention when it comes to on-site SEO.
UX
UX or User Experience in SEO refers to the entire experience and interaction of users with your site or blog.
Providing an incredible experience when receiving a user in your online environment will positively influence metrics such as bounce rate, pages viewed per session and user time on the site. It is important to remember that metrics like this directly interfere with the organic positioning of your site.
When it comes to the content of your site, mainly blog articles, keep the text scannable, with short paragraphs, images, illustrations, as well as other means and everything that can contribute to make the content easier, more accessible and more pleasant for the user.
When we talk about access via mobile devices, this becomes even more relevant, since the experience offered on desktops cannot be the same on cell phones. According to Google, 61% of users tend to leave a page if the mobile experience was not positive.
And how do you know if the user experience on the site is not positive?
Through the metrics presented in Google Analytics you can already recognize if the experience is positive or not.
Other tools like Hotjar (it has a free plan), Clicktale and Crazy Egg also help in the diagnosis and analysis of the experience, with heat maps, user session recording and other features.
Loading Speed
In 2010 Google announced that page load time became a ranking factor.
In early 2018, the search giant disclosed to the public that the speed of sites on mobile devices would begin to affect organic rankings in mobile searches. This update was known as “The Speed Update”.
Do you think that the loading time of your site is not important? Then consult some data that will make you change your mind:
- 53% of mobile users leave the site if the page takes more than 3 seconds to load
- 83% of users expect the site to load in 3 seconds or less
- Obama’s fundraising campaign saw a 14% increase in donations after reducing load time from 5 to 2 seconds
- Every second of improvement in page speed increases conversions by 7%
Now that you know that improving the page load time of your site affects not only the organic ranking of your site, but also the user experience and conversions, how about starting to optimize?
Learn now how to test and how to optimize the speed of your site in the article: Website speed: learn how to test and make your page faster.
HTTPS
Security is a priority for Google. So much so that, as we said in the “Algorithm and updates” section, in 2014 the search engine announced that HTTPS had become a ranking factor for sites.
“What changes with a site on HTTPS?” you may be thinking.
Basically, the data sent is authenticated, encrypted and cannot be modified during the transfer.
This is so important that Google Chrome flags non-HTTPS pages that request any user information as “not secure”. In some cases, this alert is already displayed in Google search results, before the user even enters the site.
It’s important to note that if you decide to migrate your site to HTTPS, Google and all search engines will consider it a new site and therefore duplicate content.
To avoid this situation, it is necessary to redirect all the old URLs in HTTP to the new ones in HTTPS, in addition to taking other care during the migration.
Ex-Googler Pedro Dias developed a checklist for HTTPS migration, with the main aspects of the process. The recommended steps are:
- Get an SSL certificate;
- Validate HTTPS support of external resources;
- Plan and prepare for protocol migration;
- Enable the HTTPS protocol and install the SSL certificate;
- Update resources and internal links for HTTPS;
- Update external resources for HTTPS;
- Include the HTTPS version in Google Search Console;
- Enable HTTP Redirects for HTTPS;
- Enable HTTP/2 support;
- Enable HSTS.
Sitemap
A sitemap is nothing more than a map of your site. Its function is to facilitate the work of search engine robots to understand the structure of the site.
If your site uses WordPress, you can automatically generate the sitemap via plugins, such as Yoast. In case you don’t use CMS, there are online tools that can generate the file, such as XML-Sitemaps.
The most common format for the file is XML, but it is also possible to generate a sitemap as a TXT file or as Atom-RSS (the site’s RSS feed, although its use is not recommended).
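A minimal XML sitemap follows the sitemaps.org protocol; a sketch with an illustrative URL and date:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/post-example/</loc>
    <lastmod>2020-05-10</lastmod>
  </url>
</urlset>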
After being generated, the sitemap link should be sent to search engine webmaster tools, such as Google Search Console and Bing Webmasters.
To learn more about the subject, be sure to see the full article: XML Sitemap: everything you need to know.
Robots.txt
Have you heard of (or has it even happened to you) sites that were created and launched a while ago, but no page is displayed in Google, even when searching for the company name?
Usually this happens because the search engine robots were told in robots.txt not to index any pages.
Robots.txt is a text file that lives on the site’s server (such as https://www.rdstation.com/robots.txt, for example) and tells search engine robots whether they can (or can’t) crawl parts of the site.
In practice, the file should contain the user-agent and the disallow and/or allow statements. Another common instruction in the file is indicating where the sitemap is.
- User-agent: indicates which robot should read the instructions in the file. If you want to list specific configurations for one robot, you can check the list of the main internet user-agents. If the instructions apply to all robots, simply include a “*”.
- Disallow: the subfolders or categories of the site that you don’t want search engines to crawl.
- Allow: by default, pages that are not directly indicated or are not part of a category listed in the disallow command will be crawled. The allow command is generally used for specific pages or subcategories that are within a category indicated in the disallow command, but that should be crawled.
- Sitemap: simply include the links to the site’s sitemaps.
Generally, a robots.txt file has the following structure:
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml
In the case of sites that are not indexed by Google, the file may have been configured indicating that no pages should be crawled, as in the following example (which should not be used):
User-agent: *
Disallow: /
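A more complete example is Google’s own robots.txt, which combines the directives described above; a simplified excerpt:
User-agent: *
Disallow: /search
Allow: /search/about
Allow: /search/howsearchworks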
Note that this is configured for all search engines (User-agent: *), has a blocked category (Disallow: /search), but indicates that 2 pages of that category must be crawled (Allow: /search/about and Allow: /search/howsearchworks).
After creating your robots.txt and publishing to the site’s server, you can use a Google tool to test if the file is correct.
It is recommended that every site have a robots.txt configured; however, not all situations should be handled with robots.txt.
It is possible to view the settings made by any site that has the file. Therefore, for pages that should not be accessed by the public in any way, such as thank-you pages or login pages, it is best to use the tag <meta name="robots" content="noindex"> within the <head> section.
In addition to being safer, this is the recommended and most guaranteed way to prevent content from being indexed in search engines.
Heading Tags
Heading tags used to carry more weight in SEO. Today they serve to present robots (and users) with a logical structure of the information on the pages.
Logically, the text in H1 is more important than the one in H2, which is more important than the text in H3, and so on.
According to Matt Cutts, ex-Googler and an SEO reference, it is not necessary to follow exactly that order; the important thing is that the page is organized for the user.
Another common question from webmasters is if there is a problem with having more than one H1 tag. Matt Cutts also responds to this, stating that it’s okay to have more than one H1 tag, as long as the page isn’t full of them.
Because they are considered highlighted text on the page, it is important to try to fit the keywords being explored into these tags (mainly the H1, which is usually also the title of the page).
With this, you will also help the search engine robots to understand that the main subject of the page is the chosen keyword, since it is present in an important part of the text, in addition to being found together with its synonyms in the paragraphs.
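In practice, the hierarchy looks like this sketch (the headings are illustrative), with the keyword in the H1 and subtopics below it:
<h1>Digital Marketing: The Complete Guide</h1>
<h2>What Is Digital Marketing?</h2>
<h2>Main Digital Marketing Strategies</h2>
<h3>SEO</h3>
<h3>Content Marketing</h3>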
Rich Snippets
Have you seen that certain search results presented by Google are more complete than others?
Some have a search bar, rating stars, links within the site, etc. Look at an example of “SEMrush reviews” on Google:
Several of the data are presented along with the site, such as company information on the right side of the screen and sitelinks below the first result.
Did you notice how the search results that present this information stand out on the results page?
According to studies, results using rich snippets have on average a 30% higher click-through rate, bringing more traffic and even improving organic page rankings, since click-through rate (CTR) is also a ranking factor.
To display this differentiated information, these sites must apply some settings on their site. If you’re not using WordPress, the Schema.org site has all the documentation you need to apply the settings.
In case you use WordPress, Yoast SEO itself already applies the most common settings, such as company and article snippets. To apply other settings, you can use the All in One Schema Rich Snippets plugin.
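As an illustration of the Schema.org vocabulary, rating stars can be declared in a JSON-LD block inside the page; a minimal sketch with hypothetical values:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example SEO Tool",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>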
Error 404
“The page you are looking for does not exist.”
Have you ever seen this type of message on a site? This occurs when trying to access a page on an existing domain that was deleted, had its URL changed, or never existed.
The first step is to ensure your 404 error page offers a positive user experience, so that users stay on your site and find what they want.
In addition to offering navigation back to the previous page or to the site’s main page, it is worth including a search field so that the user can look for the content they wanted to reach.
Even if you have a 404 error page that offers a good user experience, it is important to avoid displaying it as much as possible, ensuring that, if you need to remove or unpublish content, there is a 301 redirect to similar content.
In addition to improving navigation and presenting more appropriate content, you also redirect the authority that the old URL received to the new URL.
Redirects
Speaking of redirects, do you know what they are?
Basically, a redirect is a configuration on the server so that access to URL A is directed to URL B. For that, you can use the 302 or 301 redirect, which are the most used (and recommended).
The 302 redirect is temporary. By using this format, you mainly show search engines that you are applying some change or improvement to URL A and that is why you are currently redirecting users to URL B, but that will no longer be done in the near future.
For its part, the 301 redirect is permanent. It tells search engines to consider only the new page, which receives the attention, and even the authority of the old URL passes to the new one as well.
In addition to migrating pages that no longer exist and are showing 404 errors, the 301 redirect should also be used in other situations, such as site domain migration (be it just a migration from HTTP to HTTPS or a name change), redirecting the old URLs for the new ones.
Likewise, it is important to apply redirects from other versions of your site, such as the version with www at the beginning, home with /index at the end, etc. Redirect all to the official version.
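As a sketch, assuming an Apache server, these redirects usually live in the .htaccess file (the URLs are hypothetical):
# Permanent (301) redirect from a removed page to similar content
Redirect 301 /old-page https://www.example.com/new-page
# Send the non-www version to the official www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]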
To test which type of redirect (and how many) a given page has, you can use the free HTTP Status Code Checker tool.
Remember to be careful with redirect chains whenever you are going to perform redirects. Avoid redirecting to a URL that redirects to another.
Canonical Tags
The name may imply that it is something complicated, but the objective of a canonical tag is simple: in case of duplicate content, the canonical tag indicates the main one to search engines.
At first it may seem similar to redirects. However, in the case of the canonical tag, the search mechanisms only attribute authority to the preferred page if they really understand that it is a similar version and if the use of the tag makes sense.
The code that should be inserted in the minor versions of the page, inside the <head> section, is:
<link rel="canonical" href="http://www.mysite.com.mx" />
Another difference is that the pages pointing to the main one remain accessible to the user, unlike a redirect, where anyone (user or robot) trying to access the URL ends up on the destination page.
A common use case is if you use a landing page creation software, such as RD Station Marketing, in which you can work with a landing page for each campaign, be it in paid media or social networks.
Since the pages are probably very similar, but with different URLs, you can define one as the main one, and use the canonical tag to point to all the others.
Many developers or plugins configure the site to use a self-canonical: each page, when it does not include a canonical tag pointing to a main version, points to itself, as on the RD Station blog.
This practice is not mandatory, but it is recommended by Google, as it makes it clearer to search engines that this is the page you really want to index.
Google already interprets canonical tags between different domains. Therefore, it is possible to point example.com to site.com.
Alternate Tag
If you have an exclusive version of your site for mobile devices (m.example.com), search engines will understand it as another site, with content very similar to that of the desktop version (example.com), which can lead to duplicate content problems.
You can see it in the following example:
On the desktop page (https://www.site.com/page), the following tag must be added:
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.site.com/page">
On the page for mobile devices (https://m.site.com/page), the canonical tag must be included:
<link rel="canonical" href="https://www.site.com/page">
To learn more about the implementation of this markup, access the official Google documentation on the subject.
Another application of the alternate tag is on sites with different languages, indicating to search engines which is the ideal version of content for each language (or country). To apply the tags correctly, all the pages involved must have the configured code (otherwise, that indication will not be valid).
An example is a site that has a Spanish version (with no specific country as focus), another version for Brazilian Portuguese, and another for English in the United States. Pages that exist in all 3 languages must have the following markup in the code, inside the <head>:
<link rel="alternate" href="https://site.com/es/page/" hreflang="es" />
<link rel="alternate" href="https://site.com.br/page/" hreflang="pt-BR" />
<link rel="alternate" href="https://site.com/en/page/" hreflang="en-US" />
<link rel="alternate" href="https://site.com/page/" hreflang="x-default" />
Thus:
- For users whose main language is Spanish (regardless of the country), the version presented is “https://site.com/es/page/”;
- For users in Brazil, the version presented in searches will be “https://site.com.br/page/”;
- For users in the United States with English as their language, the indicated version is “https://site.com/en/page/”;
- For users who are not from the indicated regions and languages (or for whom that cannot be determined), the indicated version is “https://site.com/page/”, using hreflang=”x-default”.
It should be noted that these settings serve to indicate to Google which are the ideal versions of the content according to the country and language.
Including these markings will not guarantee that your positioning in one country will be replicated in others, nor that the pages will not be considered duplicate content (to prevent that, use a canonical tag).
SEO For Mobile Devices (Mobile SEO)
You may even think that mobile devices are not affecting the results of your business, but you are probably wrong.
There are many things that are changing. According to Maryna Hradovich of SEMrush, currently more than 60% of internet searches are done on mobile devices in the United States.
It is in this scenario that SEO for mobile devices comes in, which aims to ensure that users accessing content from cell phones and tablets have a good experience.
If you already have your site optimized for search engines, you only need to take a few steps to optimize it for mobile devices as well.
Google released a document (in Portuguese) with several techniques for building a better mobile site. See the top 12 below:
- Highlight your Calls-to-Action
- Keep menus short
- Facilitate the return to the site’s home
- Make the search field visible
- Offer better search results
- Allow browsing without registration
- Use click-to-call buttons
- Simplify and facilitate the entry of information (such as forms)
- Optimize the entire site for mobile devices
- Do not force zoom on the screen (display the information at the appropriate size)
- Keep the user in a single window
- Don’t push users to the “full site” version
Responsive Site vs Mobile Site
Do you know those sites that you open from your cell phone and the screen is adjusted but with such small letters that you have to use the zoom to read? Well, those are not responsive sites.
Responsive sites are intelligently projected so that they are adapted to any type of resolution, without distortions. A responsive design identifies the width of each device and, in this way, manages to determine how much space is available and how the page will be displayed so that these spaces are fully used.
It also adjusts the dimensions of images, fonts, and other elements on a page, so they don’t look disproportionate.
In practice, it is that site that, with the same layout, adjusts perfectly to any resolution in a harmonious way, giving the same experience to the user, regardless of the device through which they access.
It is important not to confuse responsive sites with mobile sites, as they have different characteristics:
- Responsive: the entire site is designed to adapt to any type of screen. It is a single code base that works at different resolutions.
- Mobile: a second, separate site, made exclusively to be opened on certain types of devices.
It is always recommended that the site be responsive. However, if this is not possible, or if a project for this adaptation is underway, a mobile version or a plugin that adapts the site for mobile devices is an interesting option as well.
To learn more, go to the post: “9 reasons to use responsive design on your site”.
AMP (Accelerated Mobile Pages)
AMP is the acronym for Accelerated Mobile Pages. These are site pages optimized for simplified, almost instant loading when accessed via mobile devices.
The project is an open-source initiative by large content publishers and technology companies, with the aim of improving the entire content ecosystem for mobile devices.
Basically, an AMP page has an architecture that prioritizes page load speed. That architecture is divided into 3 components (a minimal page sketch follows the list):
- AMP HTML: a different HTML code, with restrictions and extensions, going beyond basic HTML. Most of its tags are normal HTML tags, but some are replaced with AMP-specific tags;
- AMP JS: responsible for ensuring fast rendering of AMP pages. Its main function is to make everything external asynchronous, so that no element on the page can block the rendering of another;
- Google AMP Cache: optional; it caches all AMP HTML pages on Google servers and automatically improves their performance. Other companies may develop their own AMP cache.
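As a rough illustration, a minimal AMP page is a special HTML document like the sketch below (URLs hypothetical; the mandatory <style amp-boilerplate> block required by the AMP spec is omitted here for brevity):
<!doctype html>
<html amp> <!-- the "amp" (or ⚡) attribute marks the document as AMP HTML -->
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, minimum-scale=1">
  <!-- AMP JS: the runtime that renders the page -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Points to the original (canonical) version of the content -->
  <link rel="canonical" href="http://www.example.com/post-example/">
</head>
<body>
  <h1>An AMP page</h1>
</body>
</html>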
When performing a Google search on a mobile device, pages with AMP configured are marked with the acronym. When clicking on a result marked as AMP, the simplified version of the page loads almost instantly.
Once configured, an AMP page ends up becoming a second version of the page, with the same content as the original, generally identified with “/amp” at the end of the link, which makes it easier to identify its performance in isolation in web analytics reports. This also raises another issue that can become a problem: duplicate content.
To avoid this, it is necessary to include a canonical tag in the version of the AMP page, indicating to Google which is the original version of the content. And, for Google to identify that a certain page has an AMP version, a markup must be included in the code of the original page, as in the example:
<link rel="amphtml" href="http://www.example.com/post-example/amp/">
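And on the AMP version itself, the canonical tag described above points back to the original page (same hypothetical URL):
<link rel="canonical" href="http://www.example.com/post-example/">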
Local SEO
If appearing among the first results in search engines is already good for brands, it is even better for physical businesses located near the user doing the search.
Discovering a bakery open next door, or a dentist in the neighborhood to treat an emergency, is valuable for both customers and professionals.
This is the proposal of a Local SEO strategy: to make the business emerge organically as one of the main solutions, even better if the establishment appears marked on the map at the top of the page.
Making use of local ranking factors, such as the company’s name, address and phone number and customer reviews, helps in achieving those desired positions.
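One common way to expose that name, address and phone data to search engines (an illustration; this markup is not mentioned above) is a schema.org LocalBusiness snippet embedded in the page. All values below are hypothetical:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "telephone": "+1-555-555-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL"
  }
}
</script>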
Below, you will see some resources to attract potential customers for your business.
Google Business Profile
This free tool makes it easy for a business to be found by users. Through it, data such as address, directions, telephone and website are displayed in Google Search and Google Maps.
This functionality is even more important when it comes to local businesses. According to the user’s search and location, the search engine shows the closest companies that can solve the problem.
Posting photos, responding to comments and discovering how people searched for the business are other advantages offered by the platform.
In the post “Google Business Profile: How To Set Up Optimize And Rank”, we include a step-by-step guide to creating your account in the tool.
Local Keywords
Keyword tools rely on search volume to suggest the best terms for each type of business. That reliance can be an obstacle when you want to find the right expressions to reach an audience in a specific region.
In that case, little information is available, as monthly or yearly search volumes are generally minimal. That makes it harder to know what the best terms are.
Even so, it is possible to discover the key expressions for the public you want to reach. There are 3 tactics that help in that task.
Based On Keywords From Nearby And Similar Regions
Test the most-searched terms by filtering for cities that are close to, and larger than, the place you want to reach (population size is a good indicator for this).
Since the search volume there is more abundant, you can look at the numbers generated and infer the key expressions for your audience, as the search patterns of the two populations are probably similar.
Take Advantage Of Google’s Automatic Suggestions
When you do a search, Google automatically offers further search suggestions, displaying terms in order of popularity. This is a good indication of the keywords in which you should invest. Other platforms, such as Bing and YouTube, also offer this feature.
Pay Attention To Related Search Suggestions
There is another way by which you can discover the keywords for your niche: look at the related searches, listed at the bottom of the results page.
These can serve as inspiration, since the suggestions listed there are terms and expressions related to your current search.
Keyword Research: What It Is And How To Do One In 6 Steps
Considered the basis of an SEO strategy, keyword research is an essential stage for the success of content production, since it is through the keywords used by the user in the search that Google identifies and presents relevant results.
To tap into that demand and drive qualified traffic to your site, you need to identify what are the most common searches in your niche to produce the right content.
1. Start By Mapping The Issues
Do you already know your business’s buyer persona and the purchase process they go through, from the learning stage to the purchase decision?
This information, together with the main issues of your company’s universe, should be documented, preferably in a mental map (or mindmap), to facilitate the next step, which is the research itself.
The themes of your company’s universe can be mapped through a brainstorm with your team and by evaluating the content categories of your site (and of your direct competitors).
2. Look For The Words That Already Generate Traffic On Your Site
Before reaching for tool after tool, it is important to remember that some words are probably already generating organic traffic for your site.
To identify them, you can use the information presented in the “Search Analytics” report, within the Google Search Console.
If you run ads in Google AdWords, also take advantage of the words you are using there, mainly those that generate conversions.
You can also use Google Analytics itself to identify the pages with the most organic hits and find out which term users searched for to access them.
3. Look For New Word Ideas
Now it’s time to do your own keyword research. To do so, there are different tools available in the market.
Some free and widely used tools for keyword research:
- Google Keyword Planner
- Ubersuggest
- Keywordtool
- AnswerThePublic
Some paid tools that have keyword search functionality:
- SEMrush
- Ahrefs
- Moz
Use the items checked in step 1 and the words you identified in step 2 as the basis for your search.
The recommended tools will help you create hundreds of new ideas, also identifying other valuable information, such as search volume, cost per click and difficulty, for example. This information will be valuable for the selection of the most important words and also to make a prioritization.
4. Find Long-Tail Keywords
More specific subjects and terms will naturally come up during the earlier stages of the research. Because they have a smaller search volume than broader, more generic words, these terms are often left aside.
However, they are generally easier to rank for on Google and, combined, they end up surpassing the volume of the generic words, which is why they are called long-tail keywords.
Likewise, because they are more detailed terms within relevant matters of your business, these words have the potential to bring more qualified traffic than generic searches.
5. Document Everything And Prioritize Production
All the keywords raised in the previous steps, along with your traffic, difficulty and cost per click data, should be documented (preferably in a spreadsheet).
After completing this survey, the next steps are to eliminate the words that do not make sense and to prioritize the remaining ones. This way you can determine which terms are most important and which should get content with the greatest urgency.
Don’t just consider traffic volume. Many in SEO analysis ignore cost per click as a sponsored-link metric, but a higher cost per click indicates a more contested, and therefore more valuable, term in paid search.
For that reason, ranking organically for words with higher costs in paid search is very valuable for your organic results.
6. Monitor Ranking And Results
After selecting the most important terms, it is advisable to monitor the organic positioning of your site for these, to closely follow the performance of the content, as it is produced.
Even if you don’t have dedicated content on a subject, include as many important words as possible in that monitoring, since in many cases you may already have some content ranking for those terms.
In this way, instead of creating content from scratch, you can update, optimize and republish what you already have, increasing the chances of a good organic positioning.
The 3 Most Used SEO Tools
As companies invest in SEO, the need arises to make the processes carried out in the strategy more agile.
It is in this scenario that demand for tools grows: they automate various manual tasks that take a long time to execute and make it difficult to prioritize more strategic activities.
Get to know some of the most used SEO tools:
Google Search Console
Search Console is a free tool that shows how Google sees your site.
If Google Analytics is a powerful tool for analyzing the behavior of users on your site, Search Console (formerly Webmaster Tools) is a key piece for understanding how users get to it.
With the democratization of this information, even the old Webmaster Tools name was changed, since the public that uses it is no longer limited to technical users.
The main functionalities of the Search Console are:
- Search Appearance: how Google reads each part of the page (titles, descriptions, images, etc.);
- Search Traffic: what people are searching for on Google that leads them to click (or not) through to your site;
- Google Index: how Google is performing when indexing the pages of your site, and which keywords appear the most;
- Crawl: what difficulties robots encounter when inspecting the pages of your site in search of relevant content;
- Security Issues: notifications of security problems detected on your site.
SEMrush
SEMrush is a Search Engine Marketing (SEM) tool with several functionalities, but it is famous for one in particular: competition analysis, which shows the keywords of any site or domain.
Through this it is possible to analyze which keywords bring more traffic to a site, both through organic traffic and through AdWords.
It also offers other possibilities, such as the generation of analytical reports, positioning monitoring, link analysis, error reports on the site, social network monitoring, project management, etc.
Yoast SEO
Yoast SEO is one of the most popular plugins for WordPress and the most downloaded one for SEO.
With a free version available, the plugin allows the creation of sitemaps, the configuration of templates that standardize page titles, descriptions and URLs, the individual optimization of that information, the configuration of canonical tags, Open Graph markup and other functionalities.
Because it is very complete, even in its free version, its use on WordPress sites and blogs is highly recommended.
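For reference, the kind of head markup that a plugin like this manages looks roughly like the sketch below (all values hypothetical):
<title>Post Title | Site Name</title>
<meta name="description" content="A short summary of the page for search results.">
<link rel="canonical" href="https://site.com/post-title/">
<meta property="og:title" content="Post Title">
<meta property="og:description" content="A short summary of the page.">
<meta property="og:url" content="https://site.com/post-title/">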
Going Beyond Google: How To Do SEO In Other Search Engines
When we talk about SEO, the first thing that comes to mind is optimizing pages for Google search. However, it is also possible to optimize for other search engines.
Now, we will talk a little more about some of them.
SEO for YouTube
Did you know that YouTube is the second largest search site in the world (behind only Google)? This is reason enough to consider SEO for YouTube in your strategy.
According to a study carried out by Backlinko with more than 1.3 million YouTube videos, comments are a factor that influences the ranking, that is, there is a strong relationship between the number of comments and a good ranking on YouTube.
Therefore, keep the comment option open, and stay vigilant to avoid out-of-context discussions, spammers and negative or malicious comments. Seek to converse with your audience, include videos as answers or link other videos related to the topic. The more interaction, the better.
The number of likes, in this case, also matters: there is a significant correlation between them and ranking.
Another point is that long videos tend to perform better than shorter videos. On average, videos on the first page are 14 minutes and 50 seconds long.
See at what point in the video your audience loses interest and keep that average in mind for new productions. Check what the drop-off points have in common and work on your content to make it even better.
Google also considers the number of views of a video to be increasingly important. Additionally, the more views, the stronger the impression for the user that the video is high quality or relevant.
In the same way, the greater the dissemination of the video and the more it is shared by users, the greater the chances that your video will be viewed and the greater the possibility of positioning yourself on YouTube. Therefore, share your material as much as possible, so that it is increasingly viewed and shared, improving its positioning.
There is also a strong correlation between the number of subscribers a video generates and that same video’s positioning. However, when it comes to the channel’s total number of subscribers and the ranking of a given video, the correlation is moderate. That means that even with a small channel, you can still rank well on YouTube.
So don’t get discouraged: keep up constant production so that your audience learns the frequency of your uploads and channel news. That makes many users subscribe and automatically receive your updates.
Although the use of keywords is important for SEO, in the case of YouTube, the study pointed out that there is no correlation between optimizing video descriptions for keywords and good rankings.
Meanwhile, the correlation between the use of keyword-rich tags and ranking is moderate. This may indicate that YouTube is already able to “read” video content even without the help of metadata.
Another aspect shown is that the use of the keyword in the video title is more important: the study showed a moderate correlation between using the keyword in the title and having a good ranking.
Finally, if you want to reach the first positions on YouTube, it is best to invest in HD (High Definition) videos: they account for 68.2% of the videos present on the first page of YouTube.
SEO For Bing
There are good reasons for Digital Marketing professionals to pay attention to the Microsoft search engine. One of them is that much of what is already done in terms of SEO for Google also applies to Bing.
This search engine’s market share is growing and, given Google’s constant algorithm updates, it is good to know that you can maintain your audience while adapting to the changes of the world’s number one.
Inspired by Google Search Console (formerly Google Webmaster Tools), Bing created its own tool for webmasters: Bing Webmaster Tools. It stands out for its complete SEO analysis reports for sites and for its keyword research feature. Registration and sitemap submission also facilitate the indexing of pages in the search engine.
ASO – App Store Optimization
ASO is a set of techniques used to improve the positioning of a mobile application in app stores, such as Google Play and Apple’s App Store. By comparison, ASO strategies are the equivalent of SEO for sites and blogs.
With simple steps, like choosing an appropriate name, having a suitable icon and using keywords, it is possible to make an app stand out in the sea of apps. Once the basic tasks are done, pay attention to all the feedback received, as well as bugs, and always release new versions.
SEO In Practice: How To Start Doing It For Your Company In 5 Steps
If you got here, you already know a lot about SEO: what it is, important factors for search engines, positive (and negative) techniques and much other information. It is so much that you may now be asking yourself: “where should I start?”
With that in mind, we have separated the first steps to start optimizing your site for search engines. But it is worth mentioning that there is no perfect recipe and that these steps are not rules, but rather a simple way to guide your start.
1. Put The House In Order (Site Diagnosis And Improvement Checklist)
First of all, it is important to get everything fixed, according to the best SEO practices.
You can start an audit of your site with a complete performance and SEO analysis, comparing it to your main competitor, in the Website Analysis and Diagnostics tool.
In addition to the analysis, you can do a complete audit on your site, verifying items such as architecture, URLs, titles, descriptions, robots and everything that we have already talked about here.
The full checklist is available in this SEO spreadsheet.
2. Register Your Site With Google Search Console, Bing Webmasters And Google Business Profile
As we said before, registering your site in Google Search Console, Bing Webmasters and Google Business Profile is very important for your strategy.
Google Search Console will help you monitor and maintain your site’s presence in search results. Registering is not necessary for your site to be included in the results, but doing so will help optimize its performance. The same goes for Bing Webmasters.
For its part, Google Business Profile is one of the tools that can help potential customers to have more complete access to your company’s information. According to Google, “validated businesses are twice as likely to be considered reputable by users.”
When it comes to local businesses, this functionality is even more important. If you are a local entrepreneur, think of this situation: a customer is a few minutes away from your store (for example, a restaurant) and is looking for a place to have lunch. If they type “restaurant” into Google, would you be found? Google Business Profile will certainly help your business get found.
3. Map Buyer Personas And Purchase Process
Mapping your buyer personas and purchase process, and defining what should be done for each stage, helps balance content production (even more so if production is done in-house with limited resources) and keeps it focused on the potential customers that interest your company.
To create your buyer personas, you can use the Fantastic Persona Generator and, to develop the processes, the Purchase Process creation tool.
4. Do A Keyword Search Based On Buyer Personas And Their Processes
Based on the buyer personas and the buying process, do your keyword research.
This information, together with the main issues of your company’s universe, will help you produce more consistent content that focuses on each stage of the process.
Before doing the research, we recommend going back a bit and reviewing our “Keyword search” section, where you will find all the necessary steps.
5. Optimize What You Already Have And Create What You Don’t Have
Creating content from scratch requires a lot of effort. So, if you already have content that was produced before you knew much about SEO, you can optimize it to improve its ranking in Google.
To do so, take the content that makes sense, review it according to the keywords found in your research, and optimize the meta description, title, URL and internal and external links, among the other attributes we have already mentioned.