Electrosmart Digital Publishing
Negative Factors Affecting Google's Page Ranking Algorithm

 

SEO factors that have a negative impact on Google's page ranking algorithm include:

  1. Poor Site Accessibility
  2. Low Quality Content
  3. Low Quality Links
  4. Poor Visitor Support

 

Poor Site Accessibility

Server is Often Inaccessible to Bots - this was a major problem for my sites when I first started, while my hosting provider was having technical issues. Frequent downtime makes a site less attractive for search engines to send visitors to. In addition, if crawlers cannot reach your content promptly, it may be indexed first on another site and treated as the duplicate. Rankings dive quickly when a site is unavailable for longer than 48 hours.

Very Slow Server Response Times - crawlers operate within milliseconds, so overall site availability matters more than raw server response times - unless the server is so slow that requests time out. Concentrate instead on ensuring your site can be crawled as often as needed.
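As a rough sketch of how you might spot-check availability and response time yourself (this script and its URL parameter are illustrative, not part of the original article), a simple fetch with the standard library is enough:

```python
import time
import urllib.request

def check_availability(url, timeout=10):
    """Fetch a URL; return (HTTP status, response time in ms),
    or (None, error message) if the request fails or times out."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            elapsed_ms = (time.monotonic() - start) * 1000
            return resp.status, round(elapsed_ms)
    except Exception as exc:
        # Downtime or a timeout shows up here - the cases the article warns about.
        return None, str(exc)
```

Running this on a schedule (e.g. from cron) gives an early warning when your host is down before rankings start to suffer.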

 

Low Quality Content

Content Very Similar to or a Duplicate of Existing Content in the Index - I'm not sure how they manage this one - I have a number of well-researched, totally unique pages relegated to Supplementary Results!  Rather than penalize you, I think Google will simply not show the content in the results.

Duplicate Title/Meta Tags on Many Pages - having the same titles and meta tags across the entire site is not going to inhibit crawling, but the site won't rank as well because it's not optimized correctly.  I'm not sure how sensitive this is - I prefer a page title architecture of Site Title | Section | Page Title [which is either the same as or very close to my H1 tag]. From the above, I may need to change this; a few tests will tell me the answer.
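To audit a site for this, you could scan your pages' HTML for repeated titles. A minimal sketch (the regex-based title extraction here is an assumption for illustration - a real crawler would use a proper HTML parser):

```python
import re
from collections import Counter

def find_duplicate_titles(pages):
    """pages: dict mapping URL -> raw HTML.
    Returns {title: [urls]} for every title used on more than one page."""
    titles = {}
    for url, html in pages.items():
        m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        titles[url] = m.group(1).strip() if m else ""
    counts = Counter(titles.values())
    return {t: [u for u, pt in titles.items() if pt == t]
            for t, c in counts.items() if c > 1 and t}
```

Any title that comes back with more than one URL is a candidate for rewriting under the Site Title | Section | Page Title scheme described above.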

Overuse of Targeted Keywords (Stuffing/Spamming) - any page over 10% keyword density is going to be dicey. Keyword density creeps up when the total word count is low, so aim for pages of at least 200 words.
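For a single-word keyword, density is simply occurrences of the keyword divided by total words on the page. A quick sketch for checking your own copy against the 10% rule of thumb above (the tokenizing regex is a simplifying assumption):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)
```

A short 100-word page mentioning a keyword ten times already hits 10%, which is exactly why padding pages out to 200+ words keeps density in a safer range.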

 

Low Quality Links

Participation in Link Schemes or Actively Selling Links - there is a wide variety of link farms, link exchange sites and link lists. Avoid reciprocal links with low-quality, spammy sites that use identical-looking reciprocal link directories.

Inbound Links from Spam Sites - if a majority of your inbound links are from spam sites, it can be detrimental. It is more about the ratio of good links to bad links than the exact number of bad links.

External Links to Low Quality/Spam Sites - if you partake in link exchange sites such as Link Metro, check your link invitations carefully. I reject more than I accept.

 

Poor Visitor Support

Low Levels of Visitors to the Site (Measured via Toolbar, Clicks in SERPs, etc.) - Relative importance to other similar sites is likely to become more important in the future.

 
