Here are ten solid tips on how to beat Google Penguin updates, along with an infographic … (keep reading).
SEO Tips: Google Penguin Updates!
Do me a favor and read my earlier article on Google Quality Points. I warned you in last year's piece that you are constantly at risk unless you operate on logic. But I hate people who say "I told you so", so below are my ten ways to recover from Penguin and prevent future penalties.
Google Penguin Update — 1-2-3-4-5-6-7-8-9 and 10!
I expect there to be numerous Penguin updates over time, much like Google's Panda updates. Based on what we learned from Panda, this update behaves very similarly: it is an incremental algorithm change.
Here are my tips on how to beat the "Penguin":
- Eliminate paid text links using exact-match anchor text: avoid sponsored or paid links that use exact-match keyword anchor text.
- Remove comment spam: another easy footprint for Google to spot. Avoid automated link-building tools.
- Remove guest posts on questionable websites: most "private" blog networks that let you publish a guest post or article have already been identified by Google.
- Remove links from article marketing sites: Google has classified links from article marketing sites as unnatural links.
- Avoid links from dangerous sites: sites flagged for malware, aggressive pop-ups, link farms, or other spammy problems are another signal that causes sites to lose rankings in the Google SERPs.
- Concentrate on social media links: social media links have grown over 25% in estimated value to search engines this year alone, because search engines know they come from real people, for the most part.
- Concentrate on relevant directory links: directories are a great way to get links, especially when they relate to your website. How do you know which directories to use? Simple: type "[your field] directories" into Google and you will get a long list of good directories to get listed in.
- Focus on press releases: adding a picture or image increases views of the release by 14%.
- Concentrate on shareable content in your press release: adding a video gives you a 28% boost in views.
- Make press releases fully shareable: go all the way and include a photo, a video, an infographic, and a download in each, and you'll see a 78% jump in the number of views!
Google Penguin Update: How To Avoid Future Penalties
So how do you know the next Google update won't crash your SEO rankings?
Answer: You don't. But I can offer a hint: users and Google alike are chasing user experience, quality, and logic.
One of my favorite quotes, from Wayne Gretzky, goes something like: "I don't skate after the hockey puck; I skate to where the puck WILL BE."
I quote another article:
“Every year there are optimization techniques that become generally accepted best practices. However – before an idea becomes accepted – it sits in this intermediate state where enough evidence of its value in increasing rankings is lacking. Many of these items are treated as signals of quality.”
“So what is a signal of quality? It’s some aspect of a web site whose existence indicates that the site is less likely to be engaged in search engine black hat techniques. Enough signals should add up to provide a boost to a site when it is compared to other sites that don’t provide the same signals. These signals aren’t proof – of course – just another piece of the puzzle.”
“Certain Signals of quality exist everywhere & in many cases we subconsciously respond to them. For instance – if you’re looking for a good restaurant and you’re particularly nervous about cleanliness you’ll look for signals that the kitchen is clean. There’s no way for you to know the true state of the kitchen – but you can infer it by answering questions like is the table clean – are the glasses spotless – & is the waiter’s uniform professional looking. Restaurant staff that take care of these things are signaling to you that they are also taking care of things you can’t see.”
Here is a list of the quality points I would look for:
(Now a target for Penguin) Numerous Site-Wide and Irrelevant Links:
Corporate sites are often the worst when it comes to site-wide links. Politics and "too many cooks in the kitchen" syndrome commonly produce header and footer links that point to every division and subsidiary, regardless of whether those other sites are related from a content perspective. The existence of these links suggests that the site's visitor isn't important enough to trump corporate egos. Conversely, the absence of such links can signal that the user is king, which is a viewpoint Google encourages.
Legible and Clear Navigation Hierarchy and Links:
Build a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
A Mobile-Friendly Site:
Can your site be viewed easily on a smartphone? Is it scalable? There are ways to strip out elements for mobile devices, and I believe mobile-friendly websites are the wave of the future, so start working on yours now. However, make sure you don't duplicate your content by creating a duplicate mobile site. The best approach is to make your existing site mobile-friendly, or to build a new one that is.
Site Speed:
The faster your site is, the more visitors it can handle, the more it can do at once, and the quicker navigation feels. Slow websites drive customers insane (particularly customers on slow connections) and will kill sales on e-commerce sites. There are many ways to speed up a site; the most common are caching, shrinking file sizes, and content delivery networks.
Short, Clean URLs:
Search engines are moving toward shorter, cleaner URLs (no extra parameters or special characters in them).
Uniformity of URLs:
Crawlers have no way to tell that an uppercase URL and its lowercase twin are the same page, so mixed-case links can create duplicate content. Pick one case (lowercase is conventional) and use it consistently.
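One low-tech way to enforce this is to normalize every internal URL to a single canonical form at publish time. A minimal Python sketch, assuming your server serves the same page regardless of path case (the `normalize_url` helper is illustrative, not a standard function):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Lowercase the scheme, host, and path so each page has one canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),   # hostnames are case-insensitive anyway
        parts.path.lower(),     # assumes the server treats /About and /about alike
        parts.query,            # query values may be case-sensitive, so leave them
        parts.fragment,
    ))

print(normalize_url("HTTP://Example.COM/About-Us/?ref=Home"))
# http://example.com/about-us/?ref=Home
```

Run this over every link you emit and the uppercase/lowercase duplication problem disappears at the source.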
Sitemap for Users:
Offer a sitemap to your users with links that point to the important parts of your site. If the sitemap has a very large number of links, you may want to break it into multiple pages.
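Splitting an oversized sitemap is purely mechanical. A small Python sketch of the idea (the 100-links-per-page figure is my assumption, not a published Google rule):

```python
def paginate_sitemap(links, per_page=100):
    """Split a long list of sitemap links into pages of at most per_page links."""
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

pages = paginate_sitemap([f"/post-{n}" for n in range(250)], per_page=100)
print(len(pages))  # 3  (100 + 100 + 50 links)
```

Each sub-list then becomes its own "Sitemap, page N" HTML page.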
Limit the Number of Links on Each Page:
Keep the links on a given page to a reasonable number (under 200 is good; around 100 is ideal).
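You can audit this with Python's standard-library HTML parser; a rough sketch that counts anchor tags carrying an `href`:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href=...> links so a page can be kept under the ~100-200 mark."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href are real links; named anchors don't count.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

counter = LinkCounter()
counter.feed('<p><a href="/a">one</a> <a href="/b">two</a> <a name="x">anchor</a></p>')
print(counter.count)  # 2
```

Feed it a full page's HTML and compare the count against your budget.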
Relevancy: Don't Talk About Irrelevant Topics on Your Website:
More information is better, but irrelevant information is worse. Build a useful, information-rich site and write pages that clearly and accurately describe your content. Separate different subjects into different folders: if you are a rehab center writing about movie stars, keep it in a "Celebrity-Rehab" category folder off the main content of your site. Don't mix "Cats" and "Dogs" links on the same page; separate your site by subject.
Keywords and Top Searches in the Title, H1, and Domain Name if Possible:
Consider the words users would type to find your pages and make sure that your site actually has those words on it.
Try to Use Text Links and Always Include "alt" Attributes for Each Image:
Use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text embedded in images. If you must use images for textual content, use the "alt" attribute to describe what the image shows. Not only does alt text help SEO; in many jurisdictions accessibility law requires it.
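A quick audit for missing alt text can again lean on the standard-library parser; a sketch:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):  # alt absent or empty string
                self.missing.append(attr_map.get("src", "<no src>"))

    def handle_startendtag(self, tag, attrs):
        # Treat self-closing <img ... /> the same way.
        self.handle_starttag(tag, attrs)

finder = MissingAltFinder()
finder.feed('<img src="logo.png" alt="Company logo"><img src="banner.jpg">')
print(finder.missing)  # ['banner.jpg']
```

Every src in the resulting list is an image Google (and screen readers) can't describe.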
Accurate Titles and Alt Text:
Make sure your title and "alt" attributes are descriptive and accurate.
Fix Broken Links and Broken HTML:
Check for broken links and correct invalid HTML.
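A broken-link check boils down to fetching each URL and flagging error statuses. A minimal sketch where the HTTP fetcher is injected (in real use you would pass a wrapper around `urllib.request`; here a stub stands in so there's no network dependency):

```python
def find_broken_links(urls, fetch_status):
    """Return the URLs whose status indicates a broken link.

    fetch_status(url) -> int HTTP status; injected so the checker is testable
    without network access.
    """
    broken = []
    for url in urls:
        try:
            status = fetch_status(url)
        except OSError:          # DNS failure, refused connection, timeout, ...
            broken.append(url)
            continue
        if status >= 400:        # 4xx/5xx count as broken; redirects do not
            broken.append(url)
    return broken

# Stubbed fetcher standing in for real HTTP requests:
statuses = {"/ok": 200, "/moved": 301, "/gone": 404}
print(find_broken_links(statuses, statuses.get))  # ['/gone']
```

Dedicated crawlers do this at scale, but the logic is no more than the loop above.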
Try to Keep Every Page You Want Indexed as a Static Page:
If you use dynamic pages (i.e. the URL contains a "?" character), be aware that not every search engine spider crawls them. It helps to keep the parameters short and few in number.
Domain Age:
A truly old domain is one that existed before anyone cared about SEO. If it existed before anyone thought about tricking the search engines, it is less likely to be trying to trick them now. Keep in mind that domain age is said to reset whenever ownership changes, so a ten-year-old domain you bought last month won't send as strong a signal as it did before it changed owners.
Shared IP Addresses:
If an IP has numerous websites associated with it, one can infer that the site owner isn't paying much for hosting. Spammers typically take this route to keep their costs low, so a dedicated IP signals that the owner is invested in a long-term, successful web presence.
Code-to-Text Ratio:
A site that ships 100 KB of HTML code with just 2 KB of content signals a lack of sophistication, and perhaps a lack of interest in doing what's right for the user (i.e. producing pages that load quickly and feel responsive). Since search engines want their users to come back, they want to send them to sites that will be well received and deliver a good search experience.
Note that Rand Fishkin of SEOMoz cites Vanessa Fox (formerly of Google) and suggests that markup is ignored by Google, implying this ratio may play no role at all.
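Whatever weight Google actually gives it, the ratio is easy to measure yourself. A rough standard-library sketch that treats script and style contents as code rather than text (edge cases like comments and CDATA are glossed over):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulate a page's visible text, skipping <script> and <style> bodies."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def text_ratio(html: str) -> float:
    """Fraction of the page's characters that are visible text (0.0 to 1.0)."""
    parser = TextExtractor()
    parser.feed(html)
    return len("".join(parser.chunks)) / max(len(html), 1)

page = "<html><head><style>body{color:red}</style></head><body><p>Hello</p></body></html>"
print(round(text_ratio(page), 2))  # roughly 0.06: mostly markup, little text
```

A page scoring a few percent, like the example, is all scaffolding; content-heavy pages land much higher.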
All CSS vs. Tables:
There is plenty of debate about the SEO benefits of CSS. For me there are two signals here. The first is that a redesign from tables to CSS represents a site-wide investment: a site that is maintained and updated signals that somebody cares about it, which in turn signals that the content is current and relevant. The second is that CSS can improve the code-to-text ratio (see the previous item).
Valid HTML / XHTML:
The W3C makes it easy to validate a web page and ensure it complies with standards. Since valid pages almost never happen without a conscious effort to make them error-free, having them signals that someone behind the site is careful with their work.
The Existence of a robots.txt File:
This file, which sits in the root folder of a website, gives web crawlers instructions about what they should and shouldn't index. Without it, search engines are left to assume that all content is fair game. Therefore, one might argue that if the file exists and explicitly allows search engines to crawl the site, then, all other things being equal, the site that explicitly allowed it will beat the one that didn't specify.
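Python's standard library can sanity-check a robots.txt before you publish it. A small sketch (the `/admin/` path is a hypothetical example of a section you might keep out of the index):

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt that explicitly welcomes crawlers to the public
# pages while keeping a hypothetical admin area out of the index.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

Running exactly this kind of check catches the classic disaster of an accidental site-wide `Disallow: /` before a crawler ever sees it.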
Bounce Rate:
Google Webmaster Tools shows how long people stay on your site and how many bounce. Google has not officially stated that this is a ranking factor (imagine what would happen if they had: people gaming the system, bots sitting on pages for hundreds of hours just to inflate the numbers). Still, I think it plays a big part in your site's SEO performance and paints a clear picture of how relevant your site is to searchers. Anything you can do to improve it and increase time on site is a good idea.
Can you think of other essential Quality Points?