From a development point of view, SEO is the concern of how well a robot can read and understand your content. As we will see, a robot being able to read your content easily is normally a good thing for humans too.
The most relevant information can be found in:

Regarding punctuation in URLs, it is useful to know that Googlebot, when indexing content, splits words based on punctuation: so, by using punctuation, you give the robot an important piece of information. Note that hyphens (dashes) are treated by Googlebot as punctuation, while underscores are not. In any case, since our technology is constantly evolving, this could change in the future.

To avoid potential problems with your URL structure, we recommend proceeding as follows:

The Google guide contains information on how to:
http://mysite/blog/Tom-Cruise-Is-Crazy instead of http://mysite/blog/post?id=1033
To a human, '1033' is meaningless, whereas 'Tom Cruise Is Crazy' is teeming with usable keywords. This does not improve page rank, but it can be helpful for users who return to your site via their browser's History/AwesomeBar or who use inurl: search parameters.
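The difference can be sketched with a hypothetical punctuation-based tokenizer (an illustration only, not Googlebot's actual algorithm); conveniently, the regex class \w treats underscores as word characters, which mirrors the hyphen/underscore distinction noted above:

```python
import re

# Hypothetical sketch of splitting a URL path into keywords on
# punctuation -- an illustration, not Googlebot's real tokenizer.
# Note that \w counts underscores as word characters, so hyphens
# split words while underscores do not.
def tokenize(url_path):
    return [token.lower() for token in re.findall(r"\w+", url_path)]

print(tokenize("/blog/Tom-Cruise-Is-Crazy"))  # ['blog', 'tom', 'cruise', 'is', 'crazy']
print(tokenize("/blog/post?id=1033"))         # ['blog', 'post', 'id', '1033']
```

The keyword-rich path yields five searchable terms; the ID-based one yields almost nothing a human would search for.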
It allows you to tell search engines not to list certain URLs (admin pages and the like).

Site update notification: ping search engines when your site updates so they can check it again if they have the time. This is on top of the check frequency/priority you can suggest in your sitemap.
You can update your robots.txt file to include the following line:

Sitemap: http://www.example.com/name-of-sitemap-file.xml
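The sitemap file itself follows the sitemaps.org XML protocol; a minimal sketch of generating one with Python's standard library (the URLs and dates are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a minimal sitemaps.org-style sitemap from (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # W3C date format, e.g. 2024-01-15
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages for illustration:
print(build_sitemap([
    ("http://www.example.com/", "2024-01-01"),
    ("http://www.example.com/blog/tom-cruise-is-crazy", "2024-01-15"),
]))
```

A CMS or static-site generator will usually do this for you; the point is just that the file is simple enough to produce by hand when one doesn't.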
Then there are the things you can't control. The most important is external linking. Search engines want authoritative sources, and they determine authority by how many external sites link to yours. You can't legitimately force this to happen. You can, illegitimately, set up thousands of fake sites to link to yours, which is sometimes referred to as Google bombing. Before you even ask: yes, there are 'companies' that offer this service.
The choice is yours, but SEO isn't a make-or-break reason to change platform to WordPress. It helps automate sitemaps, but it doesn't help you choose good content that is being searched for and isn't heavily competed over. WordPress is great, but remember it also has security concerns that have to be taken into account: it is so widely used that it is constantly under attack by hackers.
How fast your site loads and is perceived to have loaded is a highly technical challenge.
The order in which things come down the network is important. A global internet means not everyone is accessing your site on a broadband connection. Mobile internet means you can't guarantee the transmission of data will even complete if it takes several cycles.

Why Site Speed is good for SEO: Site speed has been listed as one of Google's ranking factors. Naturally, the faster the site, the higher the potential score you will get for this one part of their algorithm. According to Moz's breakdown of website speed and ranking, the key factor is the time it takes for the first byte of information to come across the pipes.
If a search engine's crawlers can download the contents of your page quickly, they are going to do it more often than if it takes seconds per request.

When people are researching an article they are writing, they are more likely to stick around and read a page that responded quickly. This means your content is being absorbed by more people and has a greater chance of being linked to by someone.
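A rough way to check the time-to-first-byte figure mentioned above, sketched with Python's standard library (dedicated tools such as WebPageTest or Lighthouse report it far more accurately):

```python
import time
import http.client

def time_to_first_byte(host, path="/"):
    """Return (status, seconds) measured until the first byte of the body arrives."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    start = time.monotonic()
    conn.request("GET", path)
    response = conn.getresponse()
    response.read(1)  # block until the first byte of the body is received
    elapsed = time.monotonic() - start
    conn.close()
    return response.status, elapsed

# Example (requires network access):
# status, seconds = time_to_first_byte("example.com")
# print(f"{status} in {seconds:.3f}s to first byte")
```

Numbers from a single request are noisy; averaging several runs from the locations your users are actually in gives a more honest picture.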
Why we should care about Site Speed anyway: Even if you don't care about SEO, you can't argue that slower is better; there are several studies showing that faster page loads are better for everyone.

Slow speeds can be an indicator that a query is taking too long; if so, your site may not be using the resources on your server efficiently, and you may be spending money on a package you don't actually need.
Redirects are the hoops your server jumps through when a browser asks for a page at a particular URL but the server knows it lives at a different location. There are several things that need to be considered:
- You can do redirects at various levels; each one comes with maintainability issues.
- If done wrong, they can have a negative effect on your site.
- They can be broken for months before someone notices.
- Each redirect has an implied latency.

Why Redirects are good for SEO: Search engines like there to be one canonical place for everything, so if you have two paths that lead to the same content, this is confusing for them.
Why we should care about Redirects anyway: Nobody likes dead links, and they can easily happen when something major about the structure of your site changes (domain name, internal structure).

If a user goes to your site and gets a 404, they are not going to try subtle variations of the URL in order to get to the content; they will move on to the next site.

Even if the link isn't dead, people don't like jumping between five different URLs before getting to the content. If done poorly, redirects can result in multiple network requests, which is inefficient.
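One way to audit chains like that is to follow redirects one hop at a time. A sketch using Python's standard library (which normally follows redirects silently, so the handler below disables that):

```python
import urllib.error
import urllib.parse
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Make urllib surface 3xx responses as errors instead of silently following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def redirect_chain(url, max_hops=5):
    """Return the list of (status, url) hops until a non-3xx response."""
    opener = urllib.request.build_opener(NoRedirect)
    hops = []
    for _ in range(max_hops):
        try:
            response = opener.open(url, timeout=10)
            hops.append((response.getcode(), url))
            return hops  # reached the final destination
        except urllib.error.HTTPError as err:
            hops.append((err.code, url))
            if 300 <= err.code < 400 and "Location" in err.headers:
                url = urllib.parse.urljoin(url, err.headers["Location"])
            else:
                return hops  # a real error, not a redirect
    return hops  # gave up: too many hops

# Example (requires network access):
# for status, url in redirect_chain("http://example.com/old-page"):
#     print(status, url)
```

If the output shows more than one 3xx hop before a 200, collapsing the chain into a single redirect saves a round trip for every visitor.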
Status Codes are the codes returned from your server after a request has been made; as a developer, you need to make sure you are returning the correct code at any given moment.

If you return a status code of 500 but meaningful content is still returned, will a search engine index it? Will other services? Search engines care a lot about the 3xx redirection status codes. If you have used a CMS to build your site, it sometimes isn't apparent which codes are being used where.

Why Status Codes are good for SEO: The status code returned is one of the primary things a search engine uses to know what to do next. If it gets a 3xx redirect notice, it knows it needs to follow that path; if it gets a 200, it knows the page has been returned fine; and so on.
Making sure all your content is returned with a 200 code and all your redirects appropriately use a 301 code means search engines will be able to efficiently spider and rank your content.
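As a sketch of what that looks like at the application level, here is a bare WSGI app with made-up routes that returns 200, 301, and 404 in the right places:

```python
from wsgiref.simple_server import make_server

# Hypothetical routes for illustration: one live page, one moved page,
# everything else a 404.
ROUTES = {"/purple-beans/": "All about purple beans."}
MOVED = {"/old-beans/": "/purple-beans/"}

def app(environ, start_response):
    path = environ["PATH_INFO"]
    if path in ROUTES:
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [ROUTES[path].encode()]
    if path in MOVED:
        # 301 tells crawlers the move is permanent, so they update their index.
        start_response("301 Moved Permanently", [("Location", MOVED[path])])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]

# To try it locally:
# make_server("", 8000, app).serve_forever()
```

A real site would do this in the framework or web server configuration, but the mapping from situation to status code is the same.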
Why we should care about Status Codes anyway: Search engines are not the only thing that might care about the content on your site; browsers, plugins, and other sites (if you have built an API) could all potentially care about what code is returned.
They will behave in ways you might not expect if you return invalid or incorrect codes.
URL Structures are what you see when you look in the address bar: they could be something like domain_name/my-awesome-page/ or domain_name/?p=233432. Getting these structures right requires some thought and some technical knowledge. Do I want to have a deep structure like site.com/category/theme/page.html?
Why URL Structures are good for SEO: A good URL structure is good for SEO because it is used as part of the ranking algorithm on most search engines. If you want a page to rank for "purple beans" and your URL is domain_name/purple-beans/, then search engines will see that as a good sign that the page is going to be dedicated to the discussion of purple beans.

The URL will appear in search results; if it makes sense, people are more likely to click on it than if it is a jumble of IDs and keywords.

A good URL will serve as its own anchor text. When people share the link, they will often just dump it out onto the page; if the structure makes sense, it will allow your page to rank for those terms even without someone setting up the link correctly.
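Generating that kind of slug from a page title can be sketched as follows (a simple illustration, not any particular CMS's implementation):

```python
import re

def slugify(title):
    """Lowercase the title, drop punctuation, and join words with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

print(slugify("Why Purple Beans Are Great!"))  # why-purple-beans-are-great
```

Hyphens are the safe separator here, since (as noted earlier) search engines treat them as word boundaries, while underscores may not be.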
Why we should care about URL Structures anyway: Outside of the context of search engines, we encounter URLs all the time, and as users of the web we appreciate it when they are kept simple.

Your users will appreciate being able to look at a URL coming off your site that just makes sense; if they can look at a URL and remember why they have it in a list without needing to click into it, that is a big win.
Over the lifetime of a website you will be surprised how much of your own admin you will need to do that will involve you looking at the structure of the URLs. If you have taken the time to do them right it will make your life much easier.
It used to be that search engines couldn't even follow a link that used a JavaScript onClick function; they have come a long way since then and can do an excellent job of ranking sites that are built entirely in JavaScript.

That being said, search engines are not perfect at this task yet, so the current advice still has to be that if you want something to be seen by search engines, you should make sure there are as few things as possible blocking them from seeing it.