Position your site without worrying about Google algorithm updates


Everyone wants to earn more by working less. But first we must work hard to make a company profitable; that is the key to success. The same goes for websites. There is still no update that will correct your rankings automatically, so you must “work hard to connect with customers.” Recent algorithm updates punish low-quality SEO strategies, so we should stop using them. Let us review the history of SEO:

Step # 1. It started with “cloaking”

Anyone doing SEO before 2000 relied on “cloaking”: it allowed them to create a page full of bad content, stuff it with keywords in bulk, and get their company’s website onto the first pages of the search engines.
Then Google introduced PageRank, having realized that its search engine would be more successful if websites had to “earn” their positions through links from high-quality sources with descriptive anchor text. Google decided that too much was at stake to simply “trust” sites that stuffed in keywords, used white text on a white background, or cloaked their content.

Step # 2. “Scalable” SEO moved on to reciprocal links

Reciprocal link schemes quickly arrived to serve “scalable” SEO as cloaking lost effectiveness.
Reciprocal linking was not about finding high-quality sites to interact with; the strategy was simply to move up the rankings. Tools and scripts were created just to automate link exchanges. SEOs were fascinated by this kind of positioning because it required no hard work: run the script, receive the desired rankings and, consequently, the money.
In the end Google updated its algorithm to punish sites that obtained most of their links through such “scalable” schemes. And that leads to the next “scalable” operation: directory links.

Step # 3. Directory links on a tragic scale


Shortly thereafter, “scalable” SEO turned to directory links. Websites bought directory listings to improve their rankings, but many directories did not control the quality of their links.

Then came services offering to place a website in hundreds or thousands of directories for a very low cost. This suited SEOs who just wanted to work four hours a week: they simply paid companies to handle the directory submissions.

“Scalable” SEO does not care about putting clients at risk. When the anchor text it relied on stopped working, many websites surely suffered. If you have used many directories, now is the time to start balancing your link profile.

Step # 4. Paid link networks

Once directory links began to work less well (though they still worked more than they should have), link networks arrived. Websites hired SEOs to place their links on high-quality websites. Instead of reaching out to and building relationships with bloggers, authors, website owners, and journalists, they simply paid. “Scalable” SEO came back into fashion: creating relationships with bloggers was difficult, and buying links from relevant blogs was much easier.

Step # 5. Google Panda

Sites that had spent a lot of money creating low-quality pages were punished. The “scalable” strategy worked for a while, but it eventually caused serious problems, and many sites saw large drops in traffic. SEOs who had done the hard work of adding quality content were not affected. It is always important to work hard and produce quality content; a page must provide value.

SEO that does not “scale”

Positioning that involves research and capital investment means working hard. It is proven that sales increase when videos are displayed on product pages; a video can be a barrier to entry for competitors or a ranking factor in search engines. Moreover, we should cultivate real connections with our customers on social media and in real life: follow the leading writers on industry boards, monitor the core members of the community, and look to your LinkedIn connections to build links.

Final Thought

Many years have passed since the first search engines, such as Excite, AltaVista, and Lycos, ceased to operate. Although they were the major search engines of their day, they all worked toward the same objective as Google and Bing today: rewarding companies that create genuine websites with good content, connect with their community, and earn links to reach high positions.

SEO is a necessity



Why is SEO a necessity?

Owning a business website can be compared to having a single car to travel to work each day. Even if you drive it constantly, it is never ready for the race track (the competitive market).

But to put it on the business racetrack, we must find suitable developers, builders, and optimizers who not only claim to know everything about SEO, but whose work shows results on the track.

With so many fake SEO professionals around, customers seeking these services are not sure where to find an SEO agency tailored to their requirements. Genuine SEO agencies, in turn, often must do more than provide page-optimization solutions: they must also advise their clients on the need to build pages suited to search engines before optimizing them for the next level of marketing and promotional campaigns.

A site structure adapted to the search engines is also suitable for users.

Website owners should understand the importance of creating websites tailored to both search engines and users: refined keyword lists, an optimized site structure, a minimal number of clicks to reach any piece of content, a page architecture that search engines can crawl, easy navigation, and clear marketing channels through which customers can be reached and can interact. SEO professionals have an ethical responsibility to educate website owners about the “what” and the “how” when clients come looking for solutions.
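The “minimum number of clicks” criterion above can be checked mechanically. Here is a minimal sketch (the site map is hypothetical; a real audit would crawl the live site and parse its links) that computes each page’s click depth from the home page with a breadth-first search:

```python
from collections import deque

def click_depth(links, start="/"):
    """Breadth-first search over an internal link graph.

    `links` maps each page URL to the pages it links to; the result maps
    each reachable page to the minimum number of clicks from `start`.
    """
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site: the widget page is two clicks from the home page.
site = {
    "/": ["/services", "/products"],
    "/products": ["/products/widget"],
}
print(click_depth(site))
```

Pages that come back with a high depth, or that never appear in the result at all, are the ones a restructuring should bring closer to the home page.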

Google evaluates the URLs of the pages that appear in its search results. Websites that wish to position themselves correctly should therefore optimize their pages so that they appear at the top of the results, which gives customers a far stronger impulse to click through to the site.
Moreover, the more coherent a website’s outline is, the easier it becomes to navigate. Navigability, whether easy or defective, is directly related to the acceptance or rejection of the website. A positive experience with the website and its navigation will surely carry over to a new product or service offered on it.

It is not easy to find an SEO professional who can fix a website and report on the results achieved. Many agencies boast of their skills but make little effort to get websites the results they want.

Search results and Panda penalties

2011 was the Year of the Panda: the time to rethink strategy from the home page onward. You need to reorganize URLs, structure, meta-data, content, site architecture, page speed, schema elements, social media signals, and everything else that search engines weigh when ranking pages.

2012 has brought no appreciable change, so we can expect more of the same in SEO structure. If you have not yet adapted to Panda, it is time to learn how it works.

The 2011 Panda modification of the algorithm is only an indication that there may be more to come as search engine algorithms evolve to display the best results. As long as we are willing to fix our websites, we need not worry about future Panda updates. It would be foolish to blame the new algorithms when we can fix things ourselves.

Google implemented some 20 amendments to its ranking system in the last month of 2011, justifying these changes by its determination to help users find the best answers. Amendments made by Google should be taken into account, since it dominates all other search engines with roughly 80% of the global market.

Optimization beyond the home page

There are points to consider for favorable search engine positioning, such as optimizing every page instead of only the home page. The URL ranking scheme Google implemented in 2011 indicates the best way to manage a site, but that is not all. Optimizing the home page is not optional for website owners, but each page must also earn better rankings on its own: a duplicate-content page, or one with an incorrect or inappropriate schema architecture, can drag down the rest of the pages on a website and cause its rankings to decay.

Duplicated content in subdomains, and defects in design, code, site structure, and architecture, can degrade a site quickly while Panda remains in force.

The best way to avoid future damage is to optimize every page of a website, while planning marketing and SEO in parallel to achieve top organic positioning and traffic.
Then, to let SEO, SEM, and SMP work in your favor, begin by optimizing the home page and allow the effect to flow through to the linked pages. Your home page will succeed only if all its elements work together to persuade visitors. The task is never easy when working with a single page, because it requires deep analysis of potential customers and what they are seeking, and a message that interests everyone yet seems custom-made for each individual visitor.

Unless your website is tailored to the search engines, you will gain no benefit from any campaigns you run, and they will do nothing to improve your rankings. Creating pages tailored to the search engines is not the same as optimizing a website for a high ranking on a particular search engine.

Find the correct procedure

Any owner of a website who has been in the business of selling a service or product for a year or more should know that SEO, SEM, SMM, and SMP are vital for attracting customers within a marketing budget.

According to research statistics, more than 72% of companies are expected to expand their SEO budgets in 2012.

Optimizing and marketing with an ever-growing list of keyword phrases is useless on its own, unless the phrases are used correctly and respectfully to attract customers and therefore profits. This is far more beneficial when the products and services on a website are highly sought after, sensibly priced, and something customers want to buy.
To create or restructure a site and adapt it to search engines, you should analyze its existing defects and rebuild the website with an understanding of its users, giving it a sound architecture that directs visitors to the point where we want to take them. The site must also be adapted to the indexing robots of the search engines.

Once your website has been made search engine friendly (SEF), you will find better strategies in SEO, SEM, and SMM/P.

You can apply these strategies continuously until you gain the visibility and positioning that generate more organic traffic and purchases on your website. Although SEM and SEO represent two different approaches, their efforts need to be coordinated to maximize the website’s effectiveness.

Google Plus will impact your SEO


Google’s announcement of the launch of a social search feature nicknamed “Search Plus Your World” was common knowledge, as was its presumed impact on SEO. Much speculation has been raised in this regard, especially about Google’s motives for the release. Among the reasons noted were the following:
- Earning a share of the social networking market.
- Improving the quality of search results through greater reliability and personalization (Nielsen research suggests that only 42% of users trust search results, while 90% trust recommendations from friends).

In fact, all of these reasons contributed to Google launching social search. This article, however, does not intend to discuss policy, but rather how this development will inevitably impact SEO, and our rankings specifically.
What is Google’s “Search, Plus Your World” strategy?
To really understand the SEO impact of this new feature, we need to understand Google’s strategy regarding social search. Here are my considerations:

Google is creating a “map” similar to Facebook’s Open Graph, which connects people to their content, their websites, and their readers, in order to understand who produces what content, what happens to that content, and how people react to it. Here the “rel=author” and “rel=publisher” labels are very important.
In this way, Google gathers background information on readers and content producers, taken from the activity on their Google accounts (for example, which videos they watch on YouTube and which searches they run while signed in), and uses this data, in combination with existing ranking factors and sharing data, to position Google+ content better in search results.
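The “rel=author” and “rel=publisher” labels mentioned above are plain link elements in a page’s head. A minimal sketch follows; the profile and page IDs are placeholders you would replace with your own Google+ URLs:

```html
<!-- On an article page: point to the author's Google+ profile -->
<link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID">

<!-- On the home page: identify the brand's Google+ page -->
<link rel="publisher" href="https://plus.google.com/YOUR_PAGE_ID">
```

With this markup in place, Google can tie the page’s content back to a verified author or publisher identity.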

If my conclusions are right, then we will see a shift in the search results pages toward producers who have built a public audience, but only in niches where communities exist.
For example, in the social media niche, websites such as Mashable, Social Media Today, and Social Media Explorer will receive extra prevalence in search results because they have an audience that includes not only influencers but also the remaining mass of social media enthusiasts who share their content. Social media websites without an active public or community will lose prevalence.
In niches that lack a community, social search will change nothing, unless a company builds a public audience from scratch.

Here are five ways I think this change to social search will push practitioners to think more like online marketers.

1. Email marketing will become increasingly important because of its power to encourage desired actions, such as social shares.
2. Social strategies will split into two sub-strategies: a “personalized SEO strategy” and a “non-personalized SEO strategy.” The second will be almost identical to what is done today, but the first will have much more to do with winning social approval and engaging your audience.
3. Building low-quality links will become obsolete in niches where communities and public audiences exist. Building high-quality links will continue to play an important role in the rankings of all websites, and low-quality links will still work in non-competitive niches without an audience.
4. Companies will have to find ways to win the “social approval” of the influencers in their niche (if any exist); +1s from celebrities in your niche will be very valuable.
5. Google will work to increase the percentage of signed-in users: this is pretty obvious.

Facebook EdgeRank


Have you noticed the stories at the top of your Facebook news feed? Were they really published recently?

These are what Facebook calls “Top Stories,” and they are determined by a (secret) formula called EdgeRank.

EdgeRank is the algorithm Facebook uses to decide which publications to show to other users, both on pages and on personal profiles.

Did you know that only between 7.5% and 9.4% of a page’s posts are seen by its fans each day? Moreover, the greater the number of fans, the lower this percentage. According to a Webtrends whitepaper, only 2.79% of the followers of a page with more than one million fans see its posts.

Knowing how this algorithm works is essential for every Community Manager. Facebook’s effort to remain an interesting platform, with content relevant to each person according to their interests, means this algorithm is constantly changing.

So it is not only users who filter, deciding whether to view a publication, whether to interact with it, or whether to hide a story from their timeline; Facebook itself also assesses and prioritizes your previous publications in your fans’ news feeds (or timelines).

EdgeRank consists of three factors that must be understood to optimize your posts:

1. Recency
2. Affinity
3. Weight

Recency

The recency factor simply refers to the date and time of publication: the most recent publications are sorted before the others, so the newest post gets a higher score.

Until relatively recently, Facebook did not allow posts to be scheduled on a fan page, which meant that many Community Managers scheduled their publications through external applications. Facebook “punished” this by not showing those publications as top stories.

Affinity

This is the most important factor. It has to do with user interaction, the “relationship” you have with them. Just as with your Facebook friends: if you never visit someone’s profile, never comment, and never click “like” on their posts, their stories disappear from your timeline.

That is why it is so important to hold an interesting conversation with a call to action. Suggesting to users what to do, or what you expect of them with your message, is the best way to drive interaction with your fan page.

Weight

This variable is determined by the type of interaction (share, comment, like, …) and the type of publication, together with how active the users engaging with your post are.

Photos carry more weight than videos, external links less than photos and videos, and plain text least of all. In addition, a share is worth more than a like: a “share” has a high weight, a “comment” a medium-high weight, a “like” an average weight, and a “click” a low weight.

Finally, if many users are active on a publication, its weight increases. If one or more of the viewer’s friends are among those active users, the weight of the publication increases further.
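Since the real EdgeRank formula is secret, the three factors can only be illustrated. The sketch below is a toy scoring function, not Facebook’s actual algorithm: the interaction weights and the 24-hour half-life are invented for illustration. It combines affinity, interaction weight, and time decay in the way the factors above describe:

```python
import time

# Illustrative weights -- the real EdgeRank values are secret.
TYPE_WEIGHT = {"share": 4.0, "comment": 3.0, "like": 2.0, "click": 1.0}

def edgerank_score(interactions, affinity, posted_at, now=None, half_life_hours=24):
    """Toy EdgeRank-style score: affinity x weight x time decay.

    `interactions` is a list of interaction types on the post, `affinity`
    a 0..1 measure of how often the viewer engages with the publisher,
    and recency decays the score with a configurable half-life.
    """
    now = time.time() if now is None else now
    age_hours = max(0.0, (now - posted_at) / 3600)
    decay = 0.5 ** (age_hours / half_life_hours)
    weight = sum(TYPE_WEIGHT.get(kind, 0.0) for kind in interactions)
    return affinity * weight * decay

# Under a 24-hour half-life, a day-old post scores half a fresh one.
fresh = edgerank_score(["share", "like"], 0.8, posted_at=0, now=0)
stale = edgerank_score(["share", "like"], 0.8, posted_at=0, now=24 * 3600)
print(fresh, stale)
```

The point of the sketch is the shape of the formula: a shared post from a publisher you engage with often stays visible far longer than a merely liked post from one you ignore.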

As you can see, all these factors reward truly relevant, good content with viral potential, as against self-serving publications or fan pages that add no value for the user.

Do not forget that this can be a good way to differentiate yourself from the competition: it is easy to be on Facebook, but are you relevant?

Google Panda update


Google has confirmed the launch of Panda 3.7 and reported that the algorithm was released on June 8.
This update affects less than 1% of US search query results and about 1% of results worldwide. It comes six weeks after the launch of Panda 3.6, which was released on April 27. That rollout capped a hectic period of eight days that also saw the deployment of Google Panda 3.5 on April 19 and the first “Penguin” algorithm on April 24.
Google said these changes are just more of the updates performed regularly to ensure that users receive the best results for their searches. As with other Panda updates, 3.6 was quite small and likely affected very few websites.


Panda and Penguin focus on different elements of site quality, but both algorithms have signaled that SEO strategies should focus on user experience. Developing content that provides relevant information is crucial for both Panda and Penguin, highlighting the need for natural keywords within the site that genuinely relate to what the site provides.

Develop your SEO through social media


There is no longer any doubt that your presence on social media affects your ranking in search engine results pages. What I want to talk about today, however, are the specific mechanisms by which social media influence your SEO.

Instead of waiting for a final answer from Google, we have compiled the existing research on the role of social media in SEO into three general recommendations. These measures should help your website harness the power of social marketing, no matter how these signals are weighted in the algorithms.

Step 1 – Increase your social shares

According to Eric Enge, speaking on behalf of Search Engine Watch, social signals are important:

“Likes, +1s, and social links are also signals that search engines use.”

By “important,” Enge means that social signals such as Facebook “Likes,” Twitter “tweets,” and Google “+1s” count, although they still have serious weaknesses.

First, they can be “gamed.” A glance at sites like Fiverr shows the growth of providers offering to send these social signals automatically in exchange for a small fee. If rankings in the SERPs were determined exclusively by the number of social signals, the highest positions in the search results would all be held by whoever invested the largest budget in social influence.

At the same time, these social signals are not representative of behavior as a whole. Although nearly half of people in the U.S. use Facebook, only 4.8% of the population can be considered “active users” of social networks (according to a study cited by Enge). Basing the SERPs on the behavior of such a small percentage of the population is not something Google would consider, given its objective of providing the best possible search results for everyone (not just for a small group of active users).

Despite these limitations, there is no doubt that social signals play a small role in SERP rankings; both Google and Bing confirmed the practice in December 2010. In an interview with Danny Sullivan of Search Engine Land, representatives of both engines gave the following responses to a question about whether their algorithms assign a value to the authority of Twitter users and the links they share in their tweets:

Bing:

“We pay attention to the social reputation of a user. We look at how many people you follow and how many follow you, and this can add some weight to a listing in the search results.”

And for Google:

“Yes, we use [tweeted links and retweets] as social signals. They are used in our organic rankings. We also use them to improve our news results by noting how many people have shared an article.”

Social signals matter for SEO, but they are not strong enough to justify artificially “forcing” your Facebook “Likes,” Twitter “tweets,” and Google “+1s.” Instead, giving your readers social tools that make sharing your content easier, and encouraging them to share articles on community sites, should be enough to grow your social profile in an effective and sustainable manner.


Step 2 – Expand your audience to build your reputation

When looking at how your social presence influences your website’s SEO, increasing your total number of social shares is only part of the picture. Using the power of social networking sites to build your brand and increase the perception of your authority is another way to take advantage of these community sites.

As the Google and Bing statements above suggest, the relative authority of social network users is reflected in the ranking algorithms along with the total number of shares each site receives.

Of course, the value of a well-established brand is not limited to how it can influence search rankings. From a business perspective, you will increase your sales and influence more people by positioning yourself as a leader in your field rather than remaining an anonymous blogger.

So how do you build your reputation? The well-known blogger Chris Garrett offers this idea to consider when deciding how your digital presence should be built:

In this digital age, it is important to remember that everything you say and do contributes to the overall perception people have of you.

Imagine you have created a corporate blog and actively participated in community discussions on Twitter and Facebook, where you are considered a respected figure. Now imagine that these community members run a Google search on your name, and that alongside these carefully cultivated resources an unfortunate “what happens in the locker room, stays in the locker room” photo album on Flickr appears. What do you think that will do to your overall image and reputation?

To cultivate a truly effective online presence that improves both your business results and your natural search rankings, you must invest time in all the items shown above, so as to present the image you want associated with your brand.

Step 3 – Get shared by well-known users

Finally, while social networking can be extremely useful for building your own reputation within your industry, an even more powerful way to use social networks, from an SEO point of view, is to “piggyback” on the reputation of others.

Social networking sites provide an unprecedented opportunity to connect with opinion leaders in your industry, something that could never have happened even a few years ago, when industry-specific celebrities were considered remote and inaccessible.

Connecting with these users can have significant value for your business and your personal growth, but getting these “power figures” to associate with you and share your content on social networking sites can provide an even larger advantage in your site’s rankings. In the Eric Enge article noted earlier, the author goes on to say:

“If you want to be an important actor in your industry, social media should be an important part of your mix. However, getting an authoritative person in your space to share your content will probably be a great victory, and this is a great goal for your social campaigns.”

For this reason, it is a great idea to use social sites to reach out and connect with renowned figures in your business. Do not ask them to share your content right away; rather, allow these relationships to develop over time. Some of these users may eventually place a link to your site, which would represent a huge impact for SEO.

A penguin in the elevator


For more than a month, Penguin (Google Penguin) has been bombarding Google’s results. “What if I have been penalized?” “I have lost more than half my visits.” “I am no longer ranked for my keywords.” And so on, to infinity and beyond. We have dissected Penguin down to the last hair. Last Friday came update 1.1, and once again the SEO “professionals” of the world are talking about it, crying and demanding that Google forgive them their losses.

And I ask all those SEOs now holding their heads in their hands, having ruined their clients’ businesses: did you ever read the Google Quality Guidelines? Because they are as old as the hills.

It is very simple. You have been the victim of unethical behavior by the people who manage your website, and they should be held responsible. Cheaters have wrecked your online business.

Relevance cannot be won with tricks and traps. Those sites hold positions they do not deserve.

I waited a month to write about this new Google algorithm change, and I do not care what it is called. The change is absolutely necessary because it punishes those who have been using cheats and tricks to rank in the search results above others with more merit than they have. It is a change to eliminate spam, to eliminate the garbage from search engine results; in short, to eliminate the cheaters, of whom there are many. I hope it becomes more finely tuned each time. The penguin has taken control of the elevators, up and down, sending the spammers to the top floor of hell. Good.

There is plenty to say about positioning, not because the penguin is out now, but because ranking is not easy, regardless of the nonsense preached by the smoke-sellers who promise results overnight (ignore them, or sooner or later the tears will come). Many factors affect the proper positioning of your website. And it is not easy.

As we all know, Google has implemented several quality-related changes in recent years, but none have been fatal to SEO done right.

I will quote the Google Webmaster Guidelines on website quality, in case some “professional” is still clueless:

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other fraudulent practices not mentioned in this document (e.g., tricking users by registering misspellings of well-known websites). Do not assume that Google approves of a specific deceptive technique simply because it is not included on this page. Webmasters who endeavor to respect the spirit of the basic principles mentioned above will provide users a higher-quality service and achieve a better position than those looking for loopholes to exploit.

If you believe that another site is abusing Google’s quality guidelines, report it via the page https://www.google.com/webmasters/tools/spamreport . Google prefers to develop scalable, automated solutions to problems, and therefore tries to minimize the direct fight against scam sites. The reports received about these sites are used to create scalable algorithms that recognize and block future attempts of the same kind.


Basic principles of the quality guidelines:

-Make pages primarily for users, not for search engines. Do not deceive your users or show search engines different content from what you offer to users (a practice known as “cloaking”).

-Avoid tricks intended to improve your search engine rankings. A good rule of thumb is whether you would feel comfortable explaining what you have done to a competing website. Another useful test is to ask, “Does this help my users? Would I do this if search engines did not exist?”

-Do not participate in link schemes designed to improve your site’s position or to manipulate PageRank in your favor. In particular, avoid links to spammers or “bad neighborhoods” on the Web, as these links can adversely affect your own position.

-Do not use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate the Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send queries to Google automatically or programmatically.

Specific quality guidelines:

-Avoid using hidden text or hidden links.
-Do not use deceptive redirects or cloaking techniques.
-Do not stuff pages with irrelevant keywords.
-Do not create domains or subdomains with duplicate content.
-Do not create pages with malicious behavior, such as phishing sites or pages that install viruses, Trojans, or other malicious software.
-Avoid “doorway” pages created just for search engines, and other approaches such as affiliate programs with little or no original content.
-If your site participates in an affiliate program, make sure it adds value. Provide unique and relevant content that gives users a reason to visit the site.
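The duplicate-content rule above can be checked with a rough script. This sketch is illustrative only (real duplicate detection uses shingling or similarity hashing, and the URLs are hypothetical): it flags pages whose normalized text hashes to the same fingerprint.

```python
import hashlib
import re

def content_fingerprint(html):
    """Normalize page text and hash it, so near-identical pages collide.

    Strips tags and collapses whitespace and case before hashing; only
    exact matches after normalization are caught by this crude approach.
    """
    text = re.sub(r"<[^>]+>", " ", html)            # drop markup tags
    text = re.sub(r"\s+", " ", text).strip().lower()  # collapse whitespace
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hypothetical pages: the subdomain page repeats the main page's text.
pages = {
    "http://example.com/page": "<p>Same   text</p>",
    "http://sub.example.com/page": "<p>same text</p>",
}
seen = {}
for url, html in pages.items():
    fp = content_fingerprint(html)
    if fp in seen:
        print(f"duplicate: {url} matches {seen[fp]}")
    seen.setdefault(fp, url)
```

Running a check like this across a site’s URLs before Panda does is far cheaper than recovering from a penalty afterwards.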

It could not be clearer. If your website has violated any of these rules, the penguin will shake you.

A more local Google


Since last Wednesday, Google has integrated a tab in the sidebar where it has begun to show Google Places results focused primarily on local businesses.

Google had been saying for some time that its strategy would be based on integrating all its services and products with Google+ as the main support.

From now on, the comments shown on Google Maps will come from Google+ Local, which will force users to register with the social network. So far, a business presence on Google+ Local is free, and it will be an indispensable tool for SMEs in terms of local positioning.

What Google seeks, among other things, is more control over the spam comments posted on Maps, for which until now you only needed a Gmail account.

Do we write for humans?


When we write content optimized for SEO, are we thinking of our visitors or of the robot?

It is very difficult to create content for our site that pleases both types of visitor to our website equally: robots and humans.

One theory is that Google wants straightforward content, requiring no creativity in communication, in order to make the message more attractive to our users and thus increase the chances of conversion.

Google wants simplicity and structure: a direct message without frills or insinuations. We could say that the communicator has to choose between “enamoring” the user and “pleasing” the dear GoogleBot so that it rewards you.


Balance is the key. We can please both: properly structure our content using tags so that the search engine knows, and is informed of, what we are “telling” it, while on the other side “communicating” our message so that it works as a “hook.”

We must be optimistic and hope that Google continues to improve its SERPs and keeps factoring bounce rates into its results. Even if you are on the first pages, a high bounce rate usually means that your visitors did not like your content, although I must say that is not true in every case.

WPMU Recovery From Google Penguin


Google announced that it has updated its Penguin filter, which fights web spam. Matt Cutts himself, the head of Google's anti-spam team, made the announcement. But did this update hit more sites, or did it restore a chance to the sites penalized by the filter?

According to opinions expressed around the web and findings reported by some site owners, a site hit by the evil penguin can escape the penalty fairly quickly. Indeed, following the rollout of Google Panda, plenty of penalized sites got back on track and even positioned better than before the penalty. There will be other updates, leaving a sanctioned site the chance to return to its normal state.

Take the concrete case of the site “WPMU”, which was penalized by the Google Penguin filter and has since fully recovered its positions, and thus its previous traffic. What happened to this site? Why was it penalized? Users could create a blog on a subdomain of the main site, but plenty of them were using a theme containing a footer link that redirected to “WPMU” itself. Poisonous links for the main site.

Many of us have used a free WordPress template distributed by a generous designer. Often, a link is placed in the footer as the signature of the work done; with the new Penguin update, these links are caught by the new filter. But are footer links always to be avoided? This case clearly shows that the large mass of identical links placed across the blogs dragged down the main domain name.

With the total removal of those links and the new update came the complete recovery of the site's positions and its traffic.

Footer linking: poisonous?

Google's Penguin filter penalizes sites that do not follow the rules. It is important to understand why your site was penalized: which links may cause issues? Have you engaged in off-topic link trading? Is your content there for the web or for Google? Are there places on your site that violate the filter? Cleaning up links is essential if you feel the filter has punished you.

Some tips to repair your site:

Major hard work was therefore deployed by the owners of “WPMU” to analyze and understand the penalty and to implement the actions necessary to restore the site's previous cruising traffic. Unnecessary links were deleted, canonical URLs were put in place, the broken sitemap was repaired, and duplicate title tags were removed. So there are solutions for getting back on track after a penalty on your own website.
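One of those cleanup steps, removing duplicate title tags, can be sketched in a few lines. The page data below is invented for illustration; in practice the URL-to-title mapping would come from a crawl of your own site.

```python
# Minimal sketch: find <title> values shared by more than one URL,
# a WPMU-style duplicate-title cleanup step.
from collections import defaultdict

def duplicate_titles(pages):
    """pages: dict of URL -> title. Return titles used by more than one URL."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/": "WPMU - WordPress tips",
    "/about": "WPMU - WordPress tips",
    "/contact": "Contact us",
}
print(duplicate_titles(pages))
# {'wpmu - wordpress tips': ['/', '/about']}
```

Each duplicate group is then a candidate for a rewritten, unique title (or a canonical URL if the pages really are the same).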


Understand in order to solve the issues

It is therefore recommended to improve your popularity: create a blog to communicate with more users and offer interesting, relevant content. Publish regular updates, in the form of press releases, on your website. Visit blogs on similar themes and leave relevant comments with constructive feedback.

Perform link exchanges with other thematically similar sites that could offer more to visiting surfers. Share all your news on social networks to make them work for you in the best way possible.

So there are ways to get back on track if your site has been severely affected. A good analysis and a thorough cleanup will normally bring you back into the results.

But keep in mind: Google Penguin is now in place, yet plenty of people have forgotten our good old mate Google Panda, which still penalizes sites that abuse duplicate content and mass-produced marketing content.

Take Advantage of Google Penguin update


After the major update to Google's algorithm on 24/04/2012, your rankings could be affected. Techniques that effectively inflated your site's rankings have, for some, been made less effective.

Background:

On 24 April 2012, Google rolled out a major upgrade to its search engine. The official goal was to reduce the effectiveness of two unnatural SEO techniques: “keyword stuffing” (filling a page with key phrases) and “webspam” (having too many over-optimized links pointing to a site).

Find out how to take advantage!

The consequences:

Keyword stuffing had already been largely devalued by Google, and in fact SEO agencies (including Refer) advised going easy when adding keyword phrases to web pages. This part of the update impacted corporate sites only moderately (the key point being that it is hard to hit only the deliberately abusive sites).
For the part on inbound links (backlinks, or netlinking), the impact was very strong on many sites. Our records show two cases: a fairly large decline for some sites and a slight increase for others. Beware: such an upheaval of the ranking algorithm plays out over several days, and other sites may yet be affected (positively or negatively). We interpret a site's progression as the decline of competitors previously ranked ahead of it.
Today many SEOs are looking at pages of irrelevant results. If you want to convince yourself:
type “search engine” into Google and see what shows up at the bottom of the page!

The consequences are below. For the sites we have access to, here is an inventory as of 26/4/2012 vs. the previous month:

– Sharp decline: 11% of sites
– Moderate decline: 12% of sites
– Without significant movement: 11% of sites
– Progress: 66% of sites

What will happen?

Given the drift in its search results, Google is likely to adjust, more or less markedly, the settings of this update. However, do not expect it to roll fully back. A page has been turned, and as with every major update (Panda…), the consequences are here and we must all adjust. Netlinking techniques must evolve. What worked (very well) a few weeks ago is now less effective.

“On-site” optimizations: what needs to change?

We have long recommended going easy and playing with synonyms. Keep writing unique titles, but without repeating a word within them; ditto for the meta description and, in general, for the traditionally optimized tags (h1, h2, h3, image alt attributes, image file names, etc.).
We are considering dropping the meta keywords tag completely: already useless, it is now potentially harmful (Google can see which keywords interest you!). We must continue to highlight key phrases, of course, but I think we should surround them with as much plain text content as possible.
Reinforcing previous advice: there must be text content above the fold (the area visible to the visitor without scrolling).
Remove the series of external links from the footer; they attract every kind of scrutiny (internal utility links such as FAQ, TOS, etc. can stay). If you have links that duplicate your navigation menu, it is perhaps time to review them!
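To make the footer advice concrete, here is an illustrative sketch that lists the external links sitting in a page's footer, the ones suggested for pruning, while leaving internal utility links alone. It assumes the footer is marked with a `<footer>` tag; the sample HTML and host names are made up.

```python
# Sketch: collect external links found inside <footer>...</footer>.
from html.parser import HTMLParser
from urllib.parse import urlparse

class FooterLinks(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.in_footer = False
        self.external = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "footer":
            self.in_footer = True
        elif tag == "a" and self.in_footer:
            host = urlparse(attrs.get("href", "")).netloc
            # Relative links (empty host) and same-host links are internal.
            if host and host != self.own_host:
                self.external.append(attrs["href"])

    def handle_endtag(self, tag):
        if tag == "footer":
            self.in_footer = False

def external_footer_links(html, own_host):
    parser = FooterLinks(own_host)
    parser.feed(html)
    return parser.external

html = """<body><p>Content</p><footer>
<a href="/faq">FAQ</a>
<a href="http://example-seo.net/">SEO services</a>
</footer></body>"""
print(external_footer_links(html, "mysite.com"))  # ['http://example-seo.net/']
```

The FAQ link survives because it is internal; only the outbound footer link is flagged for review.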


“Off-site” optimizations (netlinking): what should we change?

According to our preliminary analysis, here are some before/after points to see things more clearly:

Good practice BEFORE the update

Quantity of links: as many as possible, from as many different domains as possible
Quality of links: links from good pages (sites that are themselves well referenced, with PageRank if possible)
Rhythm of creation: regular or increasing
Coherence between the link anchor and the content of the targeted page: low importance
Thematic coherence between the page emitting the link and the page receiving it: very low importance
Use of synonyms in link anchors: low
Generic anchors (“Site”, “click here”, the bare URL, etc.): reduce the effectiveness of netlinking
Proportion of nofollow links in the incoming link profile: small

Good practice AFTER the update

Quantity of links: as many as possible, from as many different domains as possible [unchanged]
Quality of links: links from good pages (sites that are themselves well referenced, with PageRank if possible) [unchanged]
Rhythm of creation: regular or increasing [unchanged]
Coherence between the link anchor and the content of the targeted page: strong importance
Thematic coherence between the page emitting the link and the page receiving it: high importance
Use of synonyms in link anchors: high importance
Generic anchors (“Site”, “click here”, the bare URL, etc.): to be provided in a sufficient proportion (% TBD)
Proportion of nofollow links in the incoming link profile: to be provided in a sufficient proportion (% TBD)
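The "after" checklist above can be turned into a quick profile check over a backlink list: what share of anchors is the exact money keyword, what share is generic or a bare URL, and what share is nofollow. The generic-anchor set and the sample data are invented for illustration; the article itself leaves the healthy percentages as "% TBD".

```python
# Sketch of a post-Penguin backlink-profile summary.
GENERIC = {"click here", "site", "here", "website"}

def profile(backlinks, keyword):
    """backlinks: list of (anchor_text, is_nofollow) tuples.
    Return exact-match, generic, and nofollow shares as percentages."""
    n = len(backlinks)
    exact = sum(1 for a, _ in backlinks if a.lower() == keyword.lower())
    generic = sum(1 for a, _ in backlinks
                  if a.lower() in GENERIC or a.lower().startswith("http"))
    nofollow = sum(1 for _, nf in backlinks if nf)
    return {"exact_pct": 100 * exact / n,
            "generic_pct": 100 * generic / n,
            "nofollow_pct": 100 * nofollow / n}

links = [
    ("cheap widgets", False),
    ("cheap widgets", False),
    ("click here", False),
    ("http://mysite.com", True),
]
print(profile(links, "cheap widgets"))
# {'exact_pct': 50.0, 'generic_pct': 50.0, 'nofollow_pct': 25.0}
```

A profile where `exact_pct` dominates and `generic_pct`/`nofollow_pct` sit near zero is exactly the over-optimized pattern the update targets.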

In summary, bombarding a site with optimized links (i.e. dofollow links with keyword anchors) no longer works as well. The backlink remains an essential building block of the Google algorithm (proof: Google fights the techniques used to create them). One must work on a more natural netlinking profile, still providing good links, certainly, but less optimized ones. We will have to use URLs as anchors, or phrases with no keyword value (“click here”…).

Should you blame your SEO?

No. Our job is to use techniques that work while they work. When they stop working, we find others. That is the specificity of our business: what is true one day may be wrong the next; nothing is taken for granted. Google's business is to make these methods obsolete. On one side, Goliath, with its firepower and peerless engineering; on the other, thousands of Davids (including me) who struggle to bypass the filters and develop new methods. The quality of an SEO is also judged by the ability to respond effectively to these changes.
The point of the game is to analyze fast and develop new processes. On our side, after producing position reports on a hundred sites over several dozen hours and combing through every web publication on the subject, we are starting to see much more clearly.

Why the drops, and what about negative SEO?

Negative SEO means demoting sites that you do not own: knock out your competitors, and up you go. Until now it was doable but hard to do effectively, because when in doubt Google tended to treat a bad link as simply invalid rather than as a harmful signal against the target. Some experiments seem to have shown a few small flaws, not necessarily exploitable.
Does this update open the doors to negative SEO? No, not necessarily. The drops are perhaps simply due to over-optimized links no longer being counted, shrinking the netlinking footprint. Fewer links = less visibility. The links that remain keep you from sinking to the bottom of the rankings.
This view is my own, and not everyone shares it. Negative SEO is a major risk to the quality of Google's results, and Google must implement counter-measures. But we may well run our own tests…

In summary: How to benefit?

Whether or not you have been hit by the Penguin update, your competitors do not all necessarily know how to react. Some will continue without changing their SEO approach; (bad) habits die hard. This is the time to act judiciously and quickly, without going overboard!

“On site”:

Lighten the keyword density: keep the best spots for your keywords and preferably use synonyms (or other terms) elsewhere.
Severely reduce footer links going out to off-topic external sites.
Limit footer links to utility pages. As always since Panda, add (original) content, if possible above the fold (that part is somewhat new).
Not confirmed yet: delete your meta keywords (we are testing this point).

“Off site”:

Should we modify or delete existing incoming links? No! A deleted backlink is a negative signal (that is one of the negative SEO techniques). You could simply change the anchors of the backlinks; certainly, but you would be confessing your sin and wasting valuable time. Better to spend that time building new, more consistent links: the dilution effect plus the novelty should help within a few weeks or months. So what does good “post-Penguin” netlinking look like? More natural netlinking, with non-optimized anchors, some nofollow links too, and stronger thematic relevance of the pages emitting the links. If those pages also link to sites other than yours that are related to your business, even better!

Some other off-page factors to consider:

Cloaking: serving one site to users and another to search engines. Some webmasters serve perfect content to the Googlebot, but when users arrive at the page in question they find a heap of junk such as pop-ups, hidden links, etc. You may not even have done it on purpose, but Google applies this criterion strictly and without exception.

Keyword stuffing: another way you could lose rank. It basically consists of repeating keywords throughout your website where they are not needed. Such pages add no value for users; they only inflate the number of keywords on your site.

Link farms: if your link-building strategy (the links you receive from external pages) includes a lot of low-quality links (low PR) from topics totally different from your website's, you most likely fall into this group.

Copied content: copying content from other web pages and passing it off as your own.

Hidden content: placing content and disguising it with font colors similar to the background, for the sole purpose of being favored by search engines, without regard for user experience.

Keyword density used to be a ranking factor, but this is changing, and if you have not updated your site in a long time, it may have taken its toll.
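If you want to measure keyword density on your own pages rather than guess, a rough calculation is enough to spot stuffing. The formula (keyword occurrences times keyword length, over total words) and the sample text are my own illustration; Google has never published a density threshold.

```python
# Rough keyword-density calculation for spotting stuffed pages.
import re

def keyword_density(text, keyword):
    """Percentage of the page's words taken up by the key phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return 100 * hits * len(kw) / len(words)

text = ("Buy widgets here. Widgets are great widgets. "
        "Our widgets beat all widgets.")
print(round(keyword_density(text, "widgets"), 1))  # 41.7 -- clearly stuffed
```

A figure like this would stand out against natural writing, which typically keeps any single phrase to a few percent at most.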

Generally speaking: keep your cool! The best strategy for now is to measure the changes without making drastic moves. As with every major update, we must see how to take advantage while taking a step back.