PageRank – Want To Know How Google Ranks Your Webpage?

Google uses two factors to rank your webpage: relevance and authority.


Relevance:


Google is a search engine, and as such must provide a relevant search result for the search query that is presented to it.  In order for Google to properly rank your site, you need to make Google understand what your website is all about.

Because the search engines are not human, they cannot reliably tell that two different words mean the same thing, like trolley and tram, or that one word has two meanings, like plane (an airplane) and plane (a tool for shaving wood).

For the search engines to correctly identify your site, you need to write your content in the language that the search engines understand.  This is called keyword identification and is nothing more than providing the search engines with the language that your customers use to find your website.

The process of discovering the language that your customers use to locate your site is called keyword research and is something we spend a great deal of time on.

Keyword research is extremely important for effective Search Engine Optimization.  The kicker is that Google has improved to the point that they now use Latent Semantic Indexing, which could make keyword research moot.  By the way, Latent Semantic Indexing simply means that Google understands synonyms.
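
To make the idea concrete, here is a minimal sketch of Latent Semantic Indexing in Python with scikit-learn (both assumed to be available; this illustrates the classic LSI technique, not Google’s actual implementation).  Documents about a “trolley” and a “tram” can land close together in concept space even though they never share the exact keyword:

```python
# Minimal LSI sketch: TF-IDF the documents, then project them into a
# low-dimensional "concept" space with truncated SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the trolley runs on rails through the city",    # synonym pair...
    "a tram carries passengers on rails downtown",   # ...of the first doc
    "a wood plane shaves thin strips from a board",  # unrelated topic
]

tfidf = TfidfVectorizer().fit_transform(docs)  # term-document matrix
concepts = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# The trolley and tram documents should score far more similar to each
# other than either does to the woodworking document.
print(cosine_similarity(concepts))
```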

If and when this occurs, it could be huge, but for now we need to continue researching keywords and using them to create good content.  Then Google will recognize that your site is relevant and rank it accordingly.

Good keyword research gives Google a sense of the terms that people are using and those that are directly relevant to your business.  When you use a keyword research tool to discover a list of keywords, pick the ones that most closely identify your site and build your content around those keywords.

If you create a logical structure with keyword based navigation, Google will rank your site for those keywords.  If your home page is about “fishing equipment”, you could have category pages for “fresh water” and another for “salt water” with subcategories under each for “fishing rods”, “fishing reels”, “hard baits”, “soft baits”, etc.  By logically creating your site in this manner, Google will better understand the content and is more likely to direct traffic in your direction.
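
As a quick illustration, here is a hypothetical Python sketch of that keyword-based hierarchy (the slugs and paths are made up for the fishing example above), printing the URL structure such a site might use:

```python
# Hypothetical keyword-based hierarchy for the fishing-equipment example;
# category and subcategory keywords double as URL slugs.
site = {
    "fresh-water": ["fishing-rods", "fishing-reels", "hard-baits", "soft-baits"],
    "salt-water":  ["fishing-rods", "fishing-reels", "hard-baits", "soft-baits"],
}

for category, subcategories in site.items():
    print(f"/fishing-equipment/{category}/")
    for sub in subcategories:
        print(f"/fishing-equipment/{category}/{sub}/")
```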

The more quality content you provide for the search engines, the more keywords you will rank for and the more traffic you will receive.


Authority:


Google’s search engine business depends on how reliable and accurate their answers are to search engine queries.  They must return a correct search result that directs the user to a reliable website.

Humans instinctively understand authority.  When we read something on the front page of the Wall Street Journal, we instinctively believe that it is more truthful than an article posted on a random blog.

Google does not have this innate ability and must assume that people will link to the websites they trust, that offer them something of value, are newsworthy, or are interesting.

This is where PageRank plays a role in identifying an authority site.

Google assigns each site a public PageRank score between 0 and 10.  A site rated 0 has the lowest authority, and a 10 is the highest and most difficult to attain.  Only 11 sites have a PageRank of 10 as of this posting.

Several factors are used to determine PageRank but generally the more quality links a site has, the higher the site’s PageRank will be.  This is why webmasters are constantly trying to get more people to link to their site.
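
For the curious, here is a toy Python sketch (numpy assumed) of the published PageRank formula, PR(p) = (1−d)/N + d·Σ PR(q)/outdegree(q), with the usual damping factor d = 0.85.  The 0–10 toolbar number is a log-scaled public summary of a score computed like this across the whole web, so treat this purely as an illustration:

```python
import numpy as np

def pagerank(links, d=0.85, iterations=50):
    """Power iteration over a tiny link graph.

    links[i] is the list of pages that page i links out to.
    """
    n = len(links)
    pr = np.full(n, 1.0 / n)  # start with rank spread evenly
    for _ in range(iterations):
        new = np.full(n, (1.0 - d) / n)  # the (1-d)/N base term
        for page, outlinks in enumerate(links):
            if outlinks:
                share = d * pr[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                new += d * pr[page] / n  # dangling page: spread rank evenly
        pr = new
    return pr

# Four pages; pages 0 and 3 attract the most link weight and rank highest.
print(pagerank([[1, 3], [3], [0, 3], [0]]))
```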

Link building is important to PageRank, but not all links are equal.  Although any link is better than no link, quality links are what really improve a website’s PageRank.

What is a quality link?  It is a link from a site that people already trust and visit to find what they want online.  Even quality links are not all equal: a quality link will improve your PageRank, but it may not help you rank for the specific term being searched for.

For example, Home Depot is an authoritative site if you are looking for building materials but not if you are searching for tropical fish.  A link from Pet Smart would be more relevant and authoritative.

Google looks at the links that websites attract in order to determine what the site is all about.

When people link to Pet Smart, the clickable text of those links will probably contain terms like tropical fish, pet supplies, dog and cat food, or aquariums.  This visible, clickable text of a link is called its “anchor text”.

Every inbound link will benefit your site to some extent, but to rank for specific terms like the ones above, you need those exact terms reflected in the anchor text of the links coming into your site. Anchor text links have more authority with Google.
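
If you want to see the anchor text Google sees, here is a small sketch (assuming the third-party requests and beautifulsoup4 packages, with example.com as a stand-in URL) that lists the clickable text of every link on a page:

```python
import requests
from bs4 import BeautifulSoup

# Fetch a page and parse its HTML.
html = requests.get("https://example.com", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Print the anchor text and destination of every link on the page.
for link in soup.find_all("a", href=True):
    text = link.get_text(strip=True)
    if text:  # skip image-only links with no visible anchor text
        print(f'{text!r} -> {link["href"]}')
```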

Where your quality links come from also matters in how Google ranks your web pages.

Google understands that authoritative websites do not normally link to non-authoritative sites.  Quality sites link to other quality sites in their “league”, so to speak.  This generally holds true online, which is why a link from an authoritative website will greatly improve the authority of your site when you manage to acquire one.

By association, the more links you get from quality sites, the more authority your site will have in the eyes of the search engines.  Conversely, the more low-quality links you receive from sites that are inferior to yours, the lower the perceived quality of your site.  In fact, links from “bad” sites can actually do a great deal of damage to your PageRank.

So what exactly is a “bad site”?

Bad sites are those you wouldn’t like to show your wife: porn sites, gambling sites, sites selling erectile enhancement products, some pharmaceutical sites, etc.  You do not want inbound links from these sites; in fact, you don’t want inbound links from websites that even link to those sites.  Those links are referred to as links from “bad neighborhoods”.

Unfortunately you have no control over sites that link to you, so you need to make sure that you have a strong link profile to offset any bad links that may be coming into your site.  For every “bad” link you get from such a site or from a site in a “bad neighborhood”, you need at least one quality link to offset it.

Social media interaction also has some bearing on PageRank but is too involved to get into here.  Suffice it to say that if your website is “shared” a lot on Facebook or gets a lot of “Tweets” on Twitter, it will help your PageRank.

Although there are other factors that Google uses to rank your webpage, the gist of their decision making is all about relevance and authority.

Be guided accordingly when creating new posts.


Does Your Website Need A Content Audit?

How do you know if your website needs a content audit?


Well, if your site has been online for any length of time and has a few hundred pages or more, it is probably a good candidate for a content audit.

The purpose of a content audit is to improve the overall performance, ranking, and look of the site.

It always helps to purge a website of unnecessary, unimportant, irrelevant, and low-quality web pages when they can be identified.  In many instances these pages will slow down your site, serve outdated information to your visitors, and occasionally even affect your site’s ranking.

Almost all websites have pages or sections that can be deleted, updated, or improved, and a content audit can identify these problem areas.

Websites with hundreds of web pages can often see improved rankings after a content audit.  By interlinking older content and making more effective use of your link equity and internal anchor text, you can in many cases revitalize your website and improve your overall site ranking.

The first thing you need to know is where you stand with your website.  You will need to compile some data about it, and depending on how many pages your site has, this could take some time to complete.

At a minimum you will need a complete list of all the pages on your website, the number of visitors each page receives, and the number of inbound links each page has.

If you are using Google Webmaster Central, you can export a spreadsheet that contains all the pages of your site with the number of inbound links.

To determine the number of visitors each page receives, you need page views.  Using Webmaster Central, select a time frame to analyze (a year to two years is reasonable).  The purpose here is to identify pages for deletion or modification.

The primary factors to look at are how many links a post or page has vs. the amount of traffic that it generated over a set period of time (12 to 24 months).
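
As a sketch of how you might assemble that data, the following assumes pandas plus two hypothetical CSV exports, inbound_links.csv and pageviews.csv, each keyed by URL; the file names and columns are illustrative, not what Webmaster Central actually emits:

```python
import pandas as pd

# Two hypothetical exports: pages with inbound-link counts, pages with views.
links = pd.read_csv("inbound_links.csv")  # columns: url, inbound_links
views = pd.read_csv("pageviews.csv")      # columns: url, pageviews

# Join them into one audit table; pages missing from either file get zeros.
audit = links.merge(views, on="url", how="outer").fillna(0)

# Pages with the fewest links and the least traffic float to the top
# of the worklist for review.
audit = audit.sort_values(["inbound_links", "pageviews"])
print(audit.head(20))
```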

Pages that generate very few page views or a minimal number of links should be either deleted or rewritten.  This is where you need to use some discretion.

Some pages may generate a lot of links but no traffic. You may want to keep these pages “as is”.  Other pages may generate large amounts of traffic, but no links or a low number of links.  These pages you will also probably want to keep.

The real decision making occurs when you have pages with zero or low links and little to no traffic.  If the page is deemed important and is still relevant, you may want to keep it and rewrite or modify it.  However, if the page was important at the time but is now outdated, you may just want to delete it.
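
Those rules of thumb can be written down as a simple function.  This is just the logic above in Python form; the zero thresholds are illustrative assumptions, and the real judgment calls still belong to you:

```python
def triage(inbound_links: int, pageviews: int) -> str:
    """Apply the audit rules of thumb; zero thresholds are illustrative."""
    if inbound_links > 0 and pageviews == 0:
        return "keep as is (links but no traffic)"
    if pageviews > 0 and inbound_links == 0:
        return "keep (traffic but no links)"
    if inbound_links == 0 and pageviews == 0:
        return "rewrite if still relevant, delete if outdated"
    return "keep (links and traffic)"

print(triage(inbound_links=0, pageviews=0))
```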

Personally, I believe in rewriting and improving existing pages instead of deleting them.  It usually takes less time to update an outdated page than it does to create a new one.  On the other hand, when a page needs its information completely reworked, modifying it can take more time than writing a brand new page.

Rewriting a page also gives you an opportunity to maximize your internal anchor text and links.

If your website is on the WordPress platform, you might be interested in trying out the Scribe SEO plugin.

It provides content analysis and helps you create search engine and social media friendly content for your site.  It also provides suggestions for improving your web pages.

Although I lean towards rewriting marginal posts, there are good reasons for deleting old posts and pages.

Link equity has everything to do with deleting unproductive webpages on your site.

The concept of link equity suggests that every website has a given amount of authority, trust, and links associated with it, and that this equity can support only a certain number of pages.  On sites with an inordinate number of unproductive web pages, link equity is spread thin.

On newer websites with few links, the search engines don’t have enough quality signals to support anything more than superficial crawling, so thousands of pages never make it into the index.  At the same time, the days when websites got high rankings simply for having thousands of pages of content have also come to an end.

The focus should be on creating productive pages.

When you decide to delete a web page from your site, be sure to back up the post in case you change your mind or delete a post accidentally.  If a page has inbound links you don’t want to lose, redirect its URL to another post with a similar topic, or to the home page, the archives page, or the sitemap.  You do not want any 404 errors.

If you are using WordPress, the Redirection plugin takes care of these redirects and tracks any 404 errors for you.
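
Outside of WordPress, you can spot-check the results yourself.  Here is a minimal sketch (assuming the requests package and a made-up list of deleted URLs) that confirms each old address now redirects somewhere instead of returning a 404:

```python
import requests

# Hypothetical list of URLs you deleted during the audit.
old_urls = [
    "https://example.com/old-post/",
    "https://example.com/outdated-page/",
]

for url in old_urls:
    # HEAD request follows any redirect chain to its final destination.
    response = requests.head(url, allow_redirects=True, timeout=10)
    status = "OK" if response.status_code != 404 else "NEEDS REDIRECT"
    print(f"{status}: {url} -> {response.url} ({response.status_code})")
```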

If you decide that your website needs a content audit, you may be pleasantly surprised at the results you get when it is completed.


What Is The Truth About Duplicate Content?

People always seem to get hysterical about duplicate content on their websites and blogs.

It seems that before you even get the P in PLR (private label rights) out of your mouth, somebody hysterically points out that “Google hates duplicate content” and that “your website is going to get banned if you use PLR in your content”.

When you stop to think about it, a plethora of sites build their content around the exact same content found on other websites or blogs.

For example, the Huffington Post is loaded with articles that are obtained from other news sources.

Yahoo’s home page is also a prime example of a notable website that uses articles obtained from other news sources.

Although you could successfully argue that your site is not the Huffington Post or Yahoo’s home page, let us use these two examples to understand the truth about duplicate content and the so called “duplicate content penalty”.

First and foremost, the truth about duplicate content is that in itself, duplicate content is not necessarily malevolent.  In today’s real world, duplicate content can be nothing more than content syndication.

Regardless of what they are promoting, internet marketers want to get their content syndicated in as many places as possible to increase exposure for their product or service.

Say you write a “How To” ebook and want to promote it as quickly as possible.  You would first write an article about it on your blog or website in the hope that social media or a higher-ranking website with more traffic than yours would pick it up and republish it.

If an established site like Yahoo or the Huffington Post republished your content, I’m sure you would not object to the additional flood of prospects you would receive from the so called “duplicate content”.

The truth about duplicate content in this instance is that the only penalty incurred (if you could call it a penalty) would be that Yahoo’s copy of your post would probably show up in the search engine results for the same query before the copy on your own site did.

This is obviously because Yahoo’s domain most likely has a higher authority rank than your site.

In this situation your blog or website will not be de-indexed by Google because you decided to syndicate some content.   In fact, your goal to promote your ebook in as many venues as possible was partially achieved by the syndication.

If you wanted to target different search queries, you could make your post “more original” by changing your content to reflect the keywords you are going after.

The truth about duplicate content is that people who intentionally create multiple URLs with the exact same content will eventually discover their sites de-indexed by Google.

If your website is Surffishing101.com and you have everything on your page duplicated on Surffishing101.com/1html, Surffishing101.com/home, Surffishing101.com/2html, etc. you can count on having serious problems with your search engine ranking.
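
One way to catch this on your own site is to hash the visible text of each URL and flag matches.  Here is a hedged sketch, assuming the requests and beautifulsoup4 packages and using example.com stand-in URLs:

```python
import hashlib

import requests
from bs4 import BeautifulSoup

# Hypothetical URLs on your own site to compare against each other.
urls = [
    "https://example.com/",
    "https://example.com/home",
    "https://example.com/about",
]

seen = {}
for url in urls:
    # Extract just the visible text, then fingerprint it.
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    digest = hashlib.sha256(text.encode()).hexdigest()
    if digest in seen:
        print(f"Duplicate content: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```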

Marketers who intentionally clone entire websites word for word will also obviously have duplicate content issues.  These rip-off artists do not last long online.

The truth about duplicate content is that most people misunderstand what the search engines mean by “duplicate content”.

When we talk about using PLR for content on your site, it does not necessarily mean that your website will automatically be de-listed.

PLR content is intended to be used as a starting point to help you get off the ground with your topic and to fill in the gaps in your original content.

As long as you add your own flair to the purchased PLR content, and not just copy and paste it word for word to your site without any modification, you should have no concerns about using duplicate content on your website or blog.

The truth about duplicate content that you should take away from this is that:

  • Your website or blog will not be banned from Google just because you and another person publish the same content.
  • Most people misunderstand the term “duplicate content penalties” and apply it to a broader range of situations than it actually covers.
  • You should never use PLR “as is”.  Even if the common erroneous understanding of “duplicate content penalties” were true, you would never have the exact same content as another person once you modify the PLR that you use in your posts.