Our goal: to optimize our web presence for search engines.
Ergonomics: the basis of a website
The ergonomics of a site can be defined as its ease of use for the greatest number of people, across different configurations, with maximum comfort and efficiency.
The 3-click rule: useful or not?
The 3-click rule states that web users should be able to find the information they're looking for on your site in a maximum of 3 clicks. The main advantage is speed: Internet users are in a hurry, and if they don't find what they're looking for quickly, they'll leave and look elsewhere.
This rule, while useful, is not an absolute requirement: the aim is not to make every single page accessible in 3 clicks, but simply to be clear and direct.
Page loading speed
Look after your HTML code
- Avoid the <img> tag when it isn't required, by using the background-image property in your CSS file instead.
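For instance, a purely decorative banner can be served from the stylesheet rather than an <img> tag; the class name and file path below are illustrative:

```html
<style>
  /* Decorative image moved out of the HTML and into CSS */
  .banner {
    background-image: url("images/banner.jpg");
    background-size: cover;
  }
</style>
<div class="banner"></div>
```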
Reduce image size
- Photoshop offers a "Save for Web" feature for exactly this. Use it.
- There are tools available to minify your JS files. For example: JCompress.
- For CSS, let's take Clean CSS as an example. It removes all unnecessary spaces from the code.
- Group your CSS and JS files into a single file each, to reduce the number of HTTP requests.
- Call your scripts at the bottom of the page, just before the closing </body> tag.
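Putting these tips together, a minimal page skeleton might look like this (the file names are illustrative):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Example page</title>
    <!-- Single grouped, minified stylesheet loads in the head -->
    <link rel="stylesheet" href="styles.min.css">
  </head>
  <body>
    <h1>Content renders before the scripts load</h1>
    <!-- Single grouped, minified script, just before </body> -->
    <script src="scripts.min.js"></script>
  </body>
</html>
```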
On-Page Site Optimization
The <title> tag
This is one of the most important criteria in the internal optimization of your site. It's what search engine users will see first.
The title of a page is specific to each page.
We recommend inserting a maximum of 10 words, bearing in mind that stop words ("the", "of", "and", ...) don't count.
Your title should describe what the visitor will find on your page.
The <meta> meta tags
Meta tags are used to send information (metadata). This information can be processed by: search engines, browsers and SEO tools.
The meta description tag: <meta name="description" content="your description" />
As a general rule, keep the description under 160 characters to avoid it being truncated in search results.
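Combining the title and meta description tags in a page's <head>, the wording being illustrative:

```html
<head>
  <!-- Page-specific title, under ~10 words -->
  <title>Handmade Leather Wallets - Free Shipping | ExampleShop</title>
  <!-- Description kept under 160 characters to avoid truncation -->
  <meta name="description" content="Shop handmade full-grain leather wallets with free shipping and a lifetime guarantee." />
</head>
```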
The <h1>, <h2>, <h3>, <h4>, <h5>, <h6> heading tags
The most important heading tag is the <h1>; Google gives it the most weight.
Take care of your <h1> tag by incorporating your most important keywords.
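A typical heading hierarchy, with one <h1> carrying the main keywords and lower levels structuring the content (the wording is illustrative):

```html
<!-- One <h1> per page, containing the most important keywords -->
<h1>Handmade Leather Wallets</h1>
<h2>Why choose full-grain leather?</h2>
<h3>Care and maintenance</h3>
```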
<strong>, <em> tags
These tags indicate that the text is important; use them to emphasize your key terms.
Optimize your links
Use a keyword related to the content of the linked page, rather than something like "click here".
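For example (the URL is illustrative):

```html
<!-- Vague: tells search engines nothing about the target page -->
<a href="/seo-guide">Click here</a>

<!-- Descriptive: the anchor text matches the linked content -->
<a href="/seo-guide">beginner's guide to SEO</a>
```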
Title and alt attributes for images and links
- The alt attribute is taken into account by Google.
- The title attribute is not taken into account by search engines, but it is useful for users: if a link is broken, its title still gives the user a description of the destination.
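Both attributes in use (the file names and text are illustrative):

```html
<!-- alt describes the image for Google and for users who can't see it -->
<img src="wallet.jpg" alt="Brown handmade leather wallet">

<!-- title gives users a description of the link -->
<a href="/contact" title="Reach our support team">Contact us</a>
```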
Rich snippets and structured data
- Rich snippets - An essential part of your SEO strategy. Structured data comes in 3 formats: Microdata, RDFa, and JSON-LD.
* Microdata (recommended format)
- Microdata uses simple attributes in HTML tags (often <span> or <div>) to name elements and properties concisely and descriptively.
For example, a business listing marked up with Microdata (contact details omitted):

<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Nixa - Développement web</span>
  <span itemprop="address">465 rue Saint-Jean</span>
  Phone: <span itemprop="telephone">...</span>
  Website: <a itemprop="url" href="...">...</a>
</div>
Other examples: Music, Reviews, Products, Recipes, Events, ...
Rel="canonical"
The larger and more complex your site, the more you should use the canonical tag to prevent duplicate page content. Duplication can happen when the same content is reachable at several slightly different URLs. Using rel="canonical" avoids this problem, which would otherwise dilute the weight of your pages as well as their SEO juice.
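The tag is placed in the <head> of every variant of the page, pointing at the version to index (the domain is illustrative):

```html
<!-- All URL variants, e.g. with tracking parameters, declare
     the same canonical version for search engines to index -->
<link rel="canonical" href="https://www.example.com/products/wallet">
```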
Shorter URLs tend to perform better in search results. They're also more likely to be copied/pasted, shared and linked to by other websites.
The closer your keywords are to your domain name, the better: "site.com/keyword" will perform better than "site.com/folder/subfolder/keyword" and is the recommended structure for web page SEO.
Using hyphens to separate keywords in your URL is always the best approach.
Instructions for robots
To improve how your content is indexed, or to block indexing entirely, you can tell the robots not to index certain pages.
When robots arrive on your site, the first thing they read is the robots.txt file. This will enable you to give robots instructions on how to index your pages. The role of robots.txt is to prevent robots from exploring certain pages.
- User-Agent: Designates the robots addressed. It is followed either by a * to designate all robots, or by a specific robot's name.
- Disallow: Followed by a relative link to the directory or file that must not be crawled.
- Allow: By default, all directories are set to "Allow". Its role is to authorize exceptions to the prohibitions.
- The meta robot, an alternative to robots.txt <meta name="robots" content="noindex" />
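Putting these directives together, a minimal robots.txt placed at the site root might look like this (the paths are illustrative):

```txt
# Block all robots from /admin/, with one authorized exception
User-Agent: *
Disallow: /admin/
Allow: /admin/help.html
```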
The purpose of the sitemap (a .txt or .xml file) is to help search engine spiders index your site's pages.
The sitemap file does not help to improve your site's ranking, but only to index your pages.
Simply place it at the root of your site, so that the robots can read it.
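A minimal XML sitemap, with illustrative URLs, lists one <url> entry per page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/products/wallet</loc>
  </url>
</urlset>
```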
Spy on your competitors
- If you want your site to achieve excellent results, you need to get ahead of your competitors in the search results. If you're just starting out, it can be a bit tricky to find out how your competitors work and compete with them, but once you've got a good SEO strategy, you can easily rank higher than them.
- To spy on your competitors, use tools to check your competitors' keywords. This way you'll not only be able to analyze their most profitable keywords, but you'll also be able to analyze their best-selling products and most-viewed pages.
- Once you've optimized your site for these keywords, you'll also be able to get more traffic and make more sales. So start spying on your competitors to position yourself on their best keywords and find the directories and links that point to their sites.
What can we learn from SEO best practices?
- Focus on building a website that's useful to your visitors and search engine spiders.
- Optimize your page load times, keywords, and focus on improving On-Page SEO performance to increase your traffic.
- Don't forget to spy on your top-ranked competitors to find better keywords and discover their backlinks, so you can rank higher in search results.