
Google Panda for Low Quality Web Pages

"Google's Panda update 4.0 is aimed at low quality sites like content farms."

Google introduced a series of algorithmic updates before Penguin to filter out low-quality web pages and to improve the quality of Google's SERPs by demoting pages with thin and inauthentic content.

According to Matt Cutts, content that is developed or rewritten should be of high quality: Panda should not be able to match even two passages of it against existing content on the same topic, and the whole exercise is about offering content that adds significant value to its users.

Many classifieds and property portals rely heavily on Google for traffic, with or without adequate listing details, so visitors may be misguided by the partial information that users supply with a listing. Classifieds and listings websites such as yellowpages.com and yelp.com should gather adequate information from visitors for business, property, job, service and other listings, and should follow up with those visitors periodically to make sure the content is complete, fresh and up to date, and to find out whether a listing is still active or has expired. There are several scenarios in which the search quality team at Google might rate the quality of such pages and suggest amendments to the algorithms.

Scenario 1: Multiple pages from the same classifieds/reference website, or from several portals, may rank in the top 10 positions of the SERPs. If the information any of those pages provides for a specific business or service, such as a telephone number or email ID, is partial or wrong, it gives the end user a very bad experience and creates no significant value. Google continuously monitors visitor activity on the SERPs: a visitor types a keyword, lands on the SERP and selects a link, assuming it will contain adequate, relevant information. If the visitor comes straight back to the SERP by hitting the back button, Google can treat that activity as a negative impression, and repeated instances across several URLs of the same portal add up to a bigger offence that is punishable by Panda. This is one scenario in which Google might keep an eye on low-quality pages from a portal.
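To make this concrete, here is a minimal sketch of how such a "pogo-sticking" signal could be computed from click logs. The log format, field names and thresholds are all assumptions for illustration, not Google's actual implementation.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical click log: (query, clicked_url, returned_to_serp, dwell_seconds)
click_log = [
    ("2bhk flats for rent", "http://example-classifieds.com/listing/101", True, 8),
    ("2bhk flats for rent", "http://example-classifieds.com/listing/102", True, 5),
    ("2bhk flats for rent", "http://quality-portal.com/listing/7", False, 240),
]

BOUNCE_DWELL_SECONDS = 30    # assumed cutoff for a "pogo-stick" return
BOUNCE_RATE_THRESHOLD = 0.5  # assumed share of bounced clicks that flags a domain

stats = defaultdict(lambda: [0, 0])  # domain -> [bounced clicks, total clicks]
for _query, url, returned, dwell in click_log:
    domain = urlparse(url).netloc
    stats[domain][1] += 1
    if returned and dwell < BOUNCE_DWELL_SECONDS:
        stats[domain][0] += 1

for domain, (bounced, total) in stats.items():
    rate = bounced / total
    if rate >= BOUNCE_RATE_THRESHOLD:
        print(f"{domain}: {rate:.0%} pogo-stick rate -> candidate low-quality signal")
```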

Scenario 2: Google's crawlers have text/content-matching capabilities that can identify areas where the corresponding fields have been left blank. They can also use structured data formats to find missing pieces within the data and classify a listing as incomplete, in which case its pages can be rated as low quality.
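As an illustration, a completeness check over a structured-data listing might look like the sketch below. The required-field set and the JSON-LD sample are hypothetical; this only shows the idea of flagging blank or missing fields, not how Google's crawlers actually work.

```python
import json

# Assumed set of fields a complete business listing should carry
REQUIRED_FIELDS = {"name", "telephone", "email", "address", "description"}

def missing_fields(jsonld_text):
    """Return the listing's missing fields, treating blank values as missing."""
    data = json.loads(jsonld_text)
    return {f for f in REQUIRED_FIELDS if not str(data.get(f, "")).strip()}

listing = '{"@type": "LocalBusiness", "name": "Acme Plumbing", "telephone": "", "address": "12 Main St"}'
missing = missing_fields(listing)
if missing:
    # Prints: Incomplete listing, missing: ['description', 'email', 'telephone']
    print(f"Incomplete listing, missing: {sorted(missing)}")
```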

Scenario 3: Keyword relevance and usage of key terms across the page. While constructing any page, the webmaster gathers the relevant keyword(s) and related search terms. Ideally, the page content should balance how often the keyword(s) and related semantic terms are used, so webmasters should set usage criteria and maintain uniformity in how keywords and related terms appear. If keyword usage crosses a certain percentage of the total content on the page (say 3-5%), or if keywords are used excessively in URLs, titles, meta content, heading tags, alt tags, anchor text and so on, the page can also be treated as low quality.
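A rough keyword-density check in this spirit is sketched below. The 3-5% band, the tokenizer and the sample text are assumptions for illustration; real ranking systems are far more nuanced than a raw percentage.

```python
import re

def keyword_density(text, keyword):
    """Share of the words in `text` taken up by `keyword` (may be a phrase)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for i in range(len(words) - len(kw_words) + 1)
               if words[i:i + len(kw_words)] == kw_words)
    return hits * len(kw_words) / len(words)

page_text = ("Cheap flats in Austin. Our cheap flats listing "
             "covers cheap flats across Austin.")
density = keyword_density(page_text, "cheap flats")
print(f"{density:.1%}")  # flag the page if this crosses the chosen 3-5% band
```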

Scenario 4: The amount of internal links and external links. Webmasters should maintain content quality in a way that compels other websites to link to them; external links built through any other linking activity can be treated as a negative signal for the website, and the links gained should not participate in any spammy link techniques. a) Pages without internal links may not do well in search engines, since they are not well connected to other relevant pages of the website through relevant anchor text (which may include a related keyword). b) At times Googlebot may identify an unusual number of internal links pointing to a page. Either (a) or (b), together or independently, can mark a page as low quality.
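For illustration, the sketch below counts internal versus external links on a single page using only the standard library. The page URL and HTML snippet are made up, and a real audit would crawl the whole site before deciding what counts as an "unusual" number of inbound internal links.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCounter(HTMLParser):
    """Counts internal vs. external links found on one page of a site."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.host = urlparse(page_url).netloc
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative links against the page URL before comparing hosts
        if urlparse(urljoin(self.page_url, href)).netloc == self.host:
            self.internal += 1
        else:
            self.external += 1

html = '<a href="/about">About</a> <a href="http://other-site.com">Partner</a>'
counter = LinkCounter("http://example-classifieds.com/listing/101")
counter.feed(html)
print(counter.internal, counter.external)  # -> 1 1
# Zero internal links (case a), or an unusually high count of inbound internal
# links found elsewhere on the site (case b), would both be low-quality hints.
```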

These are four major scenarios in which a web page can be treated as low quality. More such scenarios will be explained on this page soon, so you may want to bookmark it and come back later.

Good Luck!

