What’s new about Google’s latest algorithm in 2010 / 2011?
There is much talk in the SEO world about what Google is going to focus on in 2010. Matt Cutts, head of Google’s Webspam team, has mentioned and hinted in various forums, on YouTube and on his blog at what SEO professionals’ focus should be in 2010. As part of the Caffeine project, in 2010 the loading speed of your website and web pages will apparently become a major factor in the algorithm.
To achieve faster speeds, your website needs to be hosted with a fast hosting provider and the overall size of your web pages needs to be reduced. In practice this means moving to a better host and serving leaner pages so that download speed increases.
This will mean less content on a page, making use of CSS (Cascading Style Sheets) and ensuring images load quickly. A page download time of 3 seconds or less is pretty good, but in some cases 1 second or lower may have to be achieved. You can now measure your site’s speed with a tool in Google’s Webmaster Tools.
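To see why page weight matters so much, here is a rough back-of-the-envelope sketch (the page sizes and connection speed are illustrative assumptions, not Google’s actual measurements) of how page size translates into download time:

```python
# Rough estimate of download time from page weight and connection speed.
# All figures below are illustrative assumptions for this sketch.

def estimated_download_seconds(page_bytes: int, bandwidth_bits_per_sec: float) -> float:
    """Time to transfer the page payload alone (ignores latency, DNS and rendering)."""
    return page_bytes * 8 / bandwidth_bits_per_sec

TWO_MBIT = 2_000_000  # a typical 2010-era broadband line (assumption)

# A bloated 1.5 MB page vs a lean 150 KB page:
heavy = estimated_download_seconds(1_500_000, TWO_MBIT)  # 6.0 seconds
lean = estimated_download_seconds(150_000, TWO_MBIT)     # 0.6 seconds

print(f"heavy page: {heavy:.1f}s, lean page: {lean:.1f}s")
```

Even this crude arithmetic shows that a tenfold cut in page weight is the difference between missing and comfortably beating the 3-second mark mentioned above.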
Does this mean Google is, in a way, pushing developers to create better websites that load faster, and content writers to write quality content instead of quantity? And the digital marketing guys, will they have to be creative while keeping their artwork short, sweet and fast loading? Do Flash animations go out the window? Is it wise to host video clips on your own website, or better to host them on YouTube instead?

To all these questions, the answer you are looking for is probably yes. What could happen is Google favouring certain websites while penalising others. Website owners who keep stuffing their pages with content, customer reviews, endless blogs and articles, videos, links and endless images will be penalised.
The good news is websites that are clean, focused, compatible and fast will benefit.
Many SEO professionals are saying that this is not fair on SMEs (small to medium-sized enterprises) that cannot afford super fast hosting and do not have large teams that can restructure their websites to make them faster. But the question is how fast the corporate companies can react to the ever growing demands of the Internet. Probably not very fast: it will take them months before they even decide what to do. On the other hand, SMEs will probably be in a better position to react much faster thanks to quick decision-making, compensating for all the other factors.
The SEO e-marketers also raise the question of places with slow internet access, such as rural Scotland and other areas where you cannot get broadband. Are they going to get penalised? Probably not, as it is the Google bots measuring the speed, not the speed at which the website loads on an end user’s computer in these areas. In fact, by “forcing” developers to increase their websites’ overall speed, Google means these areas without broadband will benefit in the long run. Mobile phone users with web-enabled devices like the iPhone and Google’s Nexus will be able to load more and more websites, especially the leaner, faster-loading ones.
White Hat Works Conclusion for 2010 and 2011:
Host your website with a super fast hosting provider
Use CSS as much as possible and comply with website usability
Write quality content, not quantity content
Convert your Flash website to an HTML website
Export your Flash animations where possible
Keep images and their file sizes to a minimum without losing quality
Test your website in various browsers and mobile phone devices
Make your website Google Friendly
Get free guidance from a reputable SEO E-marketing Company
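To make the CSS point in the checklist above concrete, here is a small illustrative sketch (the markup and row count are invented for this example): moving a style that is repeated inline on every element into a single shared stylesheet rule shrinks the HTML that every visitor has to download.

```python
# Illustrative only: compare the size of repeated inline styles
# against one shared CSS class (markup invented for this sketch).

INLINE_ITEM = '<p style="color:#333;font-size:12px;margin:0 0 8px">item</p>'
CLASSED_ITEM = '<p class="row">item</p>'
SHARED_RULE = '.row{color:#333;font-size:12px;margin:0 0 8px}'

def page_bytes(item: str, rows: int, stylesheet: str = '') -> int:
    """Total bytes for `rows` repeated items plus any one-off stylesheet."""
    return len(item) * rows + len(stylesheet)

inline_total = page_bytes(INLINE_ITEM, 200)                 # style repeated 200 times
css_total = page_bytes(CLASSED_ITEM, 200, SHARED_RULE)      # style written once

print(inline_total, css_total)  # the CSS version is markedly smaller
```

The stylesheet is also cached by the browser after the first visit, so the saving on repeat page views is even larger than this one-page comparison suggests.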
Faster websites mean Google can keep up with the growth of the Internet without needing to keep buying and installing new servers. Many have asked how many servers Google has. This is a well-kept secret, but if you do the maths the number could be around 700,000 and growing. These servers are spread over 100 data centres around the globe, which makes Google the largest IT producer of carbon (CO2) emissions on Earth.
Google is in the business of providing a top notch service and encouraging growth, while at the same time remaining profitable by keeping its costs and carbon emissions down.
Hence the need for this new factor in the updated 2010 Google algorithm. Maybe the main reason Google is focusing on speed is that what it is really trying to do is reduce its carbon footprint, and we (SEO professionals and website owners) all need to help it achieve this.
One affected company said:
We are an ecommerce site that has been running for almost 5 months and have seen steadily growing traffic. We have made no major changes to our website recently but are continually adding content and products to the site. Our site has been severely affected (almost 50% down on traffic) since last Tuesday (07/12/10) which coincides with the latest 2011 algorithm change which was the response to the New York Times article regarding Black Hat SEO techniques.
I have struggled to find much information about it and it doesn’t seem to be widely talked about at the moment. As far as I can tell it will specifically target site reviews and penalise people with negative feedback. Because we are a relatively young site we don’t have any feedback (negative or otherwise). Is it possible that this could have such a dramatic effect on a site that has essentially done nothing wrong, and has anyone had the same problem?
Any advice would be greatly appreciated because we are being hit particularly hard in the lead up to Christmas which should be our busiest weeks.
Source: SEARCH ENGINEERS – SEO WEBLOG. Author: Denishverma
Please share your contributions: thoughts, questions and comments are very much welcome.