Following the major update to Google’s algorithm on 24 April 2012, your rankings may have been affected. Techniques that effectively boosted your site have, for some, been made less effective.
On 24 April 2012 Google rolled out a major update to its search engine. The official goal was to reduce the effectiveness of two SEO techniques: “keyword stuffing” (filling a page with keyword phrases) and “webspam” (having too many over-optimized links pointing to your site).
Find out how to take advantage!
Keyword stuffing had already been largely devalued by Google, and SEO agencies (including Refer) were advising clients to go easy on adding keyword phrases to web pages. This part of the update has only a moderate impact on corporate sites: it mainly hits deliberately abusive sites.
For the part concerning inbound links (backlinks, or “net linking”), the impact was very strong on many sites. Our records show two patterns: a decline for some fairly large sites and a slight rise for others. Note that an upheaval of the ranking algorithm on this scale plays out over several days, so other sites may yet be affected (positively or negatively). We attribute the rises to declines among competitors that were previously ranked ahead.
Today many SEOs are seeing pages of irrelevant results. If you want to convince yourself, type “search engine” into Google and look at what appears at the bottom of the page!
As for the consequences: for the sites available to us, here is a 26/04/2012 inventory versus the previous month:
– Sharp decline: 11% of sites
– Moderate decline: 12% of sites
– No significant movement: 11% of sites
– Improvement: 66% of sites
What happens next?
Given the drift in its search results, Google is likely to adjust the settings of this update to a greater or lesser degree. However, do not expect a full rollback. A page has been turned, and as with every major update (Panda…), the consequences are here to stay and we must all adapt. Link-building techniques must evolve: what worked (very well) a few weeks ago is now less effective.
“On-site” optimizations: what needs to change?
We have long recommended going easy and playing with synonyms. Keep writing unique titles, but without repeating any word inside them; the same goes for the meta description and, more generally, for the traditionally optimized tags (h1, h2, h3, image alt attributes, image file names, etc.).
We are considering, though not yet definitively, removing the meta keywords tag entirely: already useless, it is now potentially harmful (it shows Google exactly which keywords interest you!). We must continue to highlight key phrases, of course, but I think we should surround them with as much plain text content as possible.
Reinforcing earlier changes, there must be text content above the fold (the area visible to the visitor without scrolling).
In the same vein, delete strings of external links in the footer; they attract every kind of scrutiny (internal utility links such as FAQ, terms of service, etc. can stay). If you have links that merely duplicate your navigation menu, now may be the time to review them!
“Off-site” optimizations (link building): what should change?
Based on our preliminary analysis, here are a few before/after points to see things more clearly:
Good practice BEFORE the update
Quantity of links: as many as possible, from as many different domains as possible
Quality of links: links from good pages (sites that are themselves well indexed, with PageRank if possible)
Pace of creation: steady or increasing
Consistency between the link anchor and the content of the target page: low importance
Thematic consistency between the page sending the link and the page receiving it: very low importance
Use of synonyms in link anchors: low importance
Generic anchors (“Site”, “click here”, the bare URL, etc.): reduce the effectiveness of link building
Proportion of nofollow links in the incoming link profile: low
Good practice AFTER the update
Quantity of links: as many as possible, from as many different domains as possible [unchanged]
Quality of links: links from good pages (sites that are themselves well indexed, with PageRank if possible) [unchanged]
Pace of creation: steady or increasing [unchanged]
Consistency between the link anchor and the content of the target page: high importance
Thematic consistency between the page sending the link and the page receiving it: high importance
Use of synonyms in link anchors: high importance
Generic anchors (“Site”, “click here”, the bare URL, etc.): maintain a sufficient proportion (% to be determined)
Proportion of nofollow links in the incoming link profile: maintain a sufficient proportion (% to be determined)
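A quick way to see where your own profile stands on these points is to measure the share of exact-match, generic, and nofollow anchors in your backlink list. Below is a minimal sketch: the `audit_anchors` function, the anchor categories, and the sample data are hypothetical illustrations (in practice the list would come from a backlink export of your SEO toolset), not an official Google metric.

```python
# Sketch: auditing the anchor-text profile of a backlink list.
# All names, categories, and data here are illustrative assumptions.

GENERIC_ANCHORS = {"site", "click here", "here", "website"}

def audit_anchors(backlinks, target_keyword):
    """Classify each backlink anchor and return proportions (0..1)."""
    total = len(backlinks)
    stats = {"exact_match": 0, "generic_or_url": 0, "other": 0, "nofollow": 0}
    for link in backlinks:
        anchor = link["anchor"].strip().lower()
        if link.get("nofollow"):
            stats["nofollow"] += 1  # counted separately from anchor type
        if anchor == target_keyword.lower():
            stats["exact_match"] += 1
        elif anchor in GENERIC_ANCHORS or anchor.startswith("http"):
            stats["generic_or_url"] += 1
        else:
            stats["other"] += 1
    return {k: v / total for k, v in stats.items()}

# Hypothetical mini-export of incoming links:
links = [
    {"anchor": "cheap flights", "nofollow": False},
    {"anchor": "cheap flights", "nofollow": False},
    {"anchor": "click here", "nofollow": True},
    {"anchor": "http://example.com", "nofollow": False},
]
print(audit_anchors(links, "cheap flights"))
```

A profile where exact-match anchors dominate is exactly the “optimized” pattern this update targets; the proportions that are actually safe remain, as noted above, to be determined.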
In summary, bombarding a site with optimized links (i.e. dofollow links with a keyword as anchor) no longer works as well. The backlink remains an essential building block of Google’s algorithm (proof: Google fights the techniques used to create them). One must build a more natural link profile: still good links, certainly, but less optimized. We will have to use bare URLs or neutral phrases (“click here”…) as anchors.
Should you blame your SEO?
No. Our job is to use techniques that work while they work. When they stop working, we find others. That is the specificity of our business: what is true one day may be false the next; nothing can be taken for granted. Google’s business is to make these methods obsolete. On one side, Goliath, with its firepower and peerless engineering; on the other, thousands of Davids (including me) who struggle to bypass its filters and develop new methods. The quality of an SEO is also judged by the ability to respond effectively to these changes.
The name of the game is to analyze quickly in order to develop new processes. On our side, after spending several dozen hours producing ranking reports on a hundred sites and combing through every web publication on the subject, we are starting to see much more clearly.
Why the falls, and what about negative SEO?
Negative SEO is the act of demoting sites you do not own. Knock out your competitors, and up you go. Until now this was doable but hard to do effectively, since in case of doubt Google tended simply to discount a link rather than treat it as harmful to the linked site. Some experiments have, it seems, revealed a few small weaknesses, though not necessarily exploitable ones.
Does this update open the door to negative SEO? No, not necessarily. The falls may simply be due to over-optimized links no longer being counted, shrinking the size of the link profile. Fewer links = less visibility. The links that remain are what keep you from sinking to the bottom of the rankings.
This view is my own, and not everyone shares it. Negative SEO is a major risk to the quality of Google’s results, and Google must implement countermeasures. But that does not stop us from testing it for ourselves…
In summary: How to benefit?
Whether or not you were hit by the Penguin update, your competitors will not all know how to react. Some will carry on without changing their SEO approach; (bad) habits die hard. Now is the time to act judiciously and quickly, without going overboard!
“On site”:
– Lighten keyword density: keep the best spots for your keywords and give priority to synonyms (or other terms) elsewhere.
– Severely reduce footer links pointing out to other, off-topic sites.
– Limit footer links to utility pages. As always since Panda, add (original) content, and if possible above the fold (that part is new).
– Not yet confirmed: delete your meta keywords (we are testing this point).
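To act on the first point (lightening keyword density), it helps to measure density before and after editing a page. Here is a minimal sketch, assuming you already have the page text extracted as a plain string; the function name and the sample sentence are illustrative, and no particular “safe” percentage is implied.

```python
# Sketch: measuring the keyword density of a page's extracted text.
# The sample page text below is a deliberately stuffed illustration.
import re

def keyword_density(text, keyword):
    """Share of words (0..1) accounted for by occurrences of `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    # Count each position where the keyword phrase appears in sequence.
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return hits * len(kw) / len(words)

page = "Cheap flights here. Find cheap flights and book cheap flights today."
density = keyword_density(page, "cheap flights")
print(f"keyword density: {density:.0%}")  # well above any reasonable level
```

Run it on your key pages, lighten the densest ones, and swap synonyms into the remaining spots, as recommended above.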
“Off site”:
Should you modify or delete incoming links? No! A deleted backlink is a negative signal (deleting them is itself one of the negative SEO techniques). You could simply change the anchors of existing backlinks, certainly, but that confesses your sin and wastes valuable time. Better to spend that time building new, more consistent links; the expected dilution effect, plus freshness, should help within a few weeks or months. So what does good “post-Penguin” link building look like? A more natural profile: anchors that are not all optimized, some nofollow links too, and stronger thematic relevance of the pages sending the links. If those pages also link to sites other than yours that are related to your business, even better!
A few other off-page factors to consider:
Cloaking: serving one version of a site to users and another to search engines. Some webmasters show perfect content to Googlebot, but when users arrive at the page in question they find a heap of junk: pop-ups, hidden links, etc. You may not even be doing it on purpose, but Google applies this criterion strictly and without exception.
Keyword stuffing: another way you could lose rankings. It basically consists of repeating keywords throughout your website where they are not needed. Such pages add no value for users; they exist only to inflate the keyword count on your site.
Link farms: if your link-building strategy (the links you receive from external pages) includes many low-quality links (low PageRank) on topics totally different from your website’s, you most likely fall into this group.
Copied content: copying content from other web pages and passing it off as your own.
Hidden content: placing content and disguising it with font colors similar to the background, for the sole purpose of pleasing search engines, with no regard for user experience.
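The hidden-content pattern can be spotted mechanically by comparing the declared text color with the background color. Below is a minimal sketch, assuming the two colors have already been extracted from the CSS as hex strings; the 30-point distance threshold is an arbitrary illustration, not a value Google publishes.

```python
# Sketch: flagging near-invisible text by comparing foreground and
# background colors. Hex values and the threshold are illustrative.

def hex_to_rgb(color):
    """'#rrggbb' -> (r, g, b) tuple of ints."""
    c = color.lstrip("#")
    return tuple(int(c[i:i + 2], 16) for i in (0, 2, 4))

def looks_hidden(fg, bg, threshold=30):
    """True if the text color is nearly identical to the background."""
    distance = sum(abs(a - b) for a, b in zip(hex_to_rgb(fg), hex_to_rgb(bg)))
    return distance < threshold

print(looks_hidden("#fefefe", "#ffffff"))  # True: white-on-white text
print(looks_hidden("#000000", "#ffffff"))  # False: normal contrast
```

Running a check like this over your own templates is a cheap way to make sure no legacy styling still looks like hidden text.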
Keyword density used to be a ranking factor, but this is changing, and if you have not updated your pages in a long time, you may have paid the price.
Generally speaking: keep your cool! The best strategy for now is to measure the changes without making any drastic moves. As with every major update, we must look for ways to take advantage, but with a step back.