What Would Happen If Google Removed The Nominal PageRank?

Gone are the days when I used to obsess over Google PageRank. Especially over the nominal PageRank, which is the 0 to 10 scale we see on the various toolbars around the web. Why is that? Because as the name implies, the nominal PageRank is just an indicator of how much trust Google has in a certain website. It doesn't have a direct impact on your organic traffic, and it certainly doesn't have a direct impact on your profits, which is the most important metric for any online entrepreneur. Even the real PageRank, which influences your search rankings, is only one out of hundreds of factors that Google's algorithm takes into consideration.

That being said, I still find myself curious to check on nominal PageRank updates as they roll out. I like to track the frequency of the updates, as well as the PR fluctuations on my own and other people's websites. Out of vanity, perhaps.

The last PageRank update happened early in April, and the next one was expected between late July and September, but so far nothing has happened.

Thinking about this issue, one question came to mind: What if Google completely removed the nominal PageRank? That is, what if all the toolbars stopped working, and no one could see the indicator of how much trust (or how many backlinks) any website has?

What impacts would such a change have upon the webmaster/blogging/SEO industry?

Some people argued in the past that removing the nominal PageRank would kill the market for paid links. I don’t think so. As long as backlinks play a role in the search ranking algorithm, there will still be people buying them.

But without the PageRank indicator, the link-buying process would change a bit. I believe that paid backlink analysis services would gain many more clients, as this would become the best way to evaluate the link authority of any website.

I think that more important than that, however, is the effect such a change would have on the minds of most website owners. Most of them would probably realize (as most experienced webmasters do sooner or later) that it is better to worry about more tangible metrics like traffic and profits. As a result, they would focus more on producing quality content.

Another interesting aspect to consider is linking. I believe that if the nominal PageRank were gone, bloggers and website owners would become less paranoid about linking to external websites for fear of leaking PageRank.

But what do you think? Would this change be positive or negative? What other aspects would be influenced?

Google Released The Most Searched Terms in 2010

Every year Google releases the so-called Google Zeitgeist, a compilation of the most searched queries of the year. The 2010 edition is already out, with some interesting points.

For example, here are the 10 search queries that grew fastest in 2010:

  1. chatroulette
  2. ipad
  3. justin bieber
  4. nicki minaj
  5. friv
  6. myxer
  7. katy perry
  8. twitter
  9. gamezer
  10. facebook

And here are the 10 search queries that fell fastest in 2010:

  1. swine flu
  2. wamu
  3. new moon
  4. mininova
  5. susan boyle
  6. slumdog millionaire
  7. circuit city
  8. myspace layouts
  9. michael jackson
  10. national city bank

On the official page you'll also find lists for specific niches (e.g., health, entertainment, sports), so check it out.

5 Tips for Using Google Webmaster Tools

Google Webmaster Tools is a free toolset that's absolutely invaluable for SEO troubleshooting.

It's pretty simple to set up: you just need to verify that you're the site owner (there are a number of ways to do this, so just use the one that is best for you) and you'll have instant access to an abundance of useful information that will help you to improve your website and your search engine optimisation (SEO).

Here are five tips that will get you started:

1. Crawl Stats

Crawl Stats give you information on Google's crawling activity over the last 90 days. When you click into this report, located under Diagnostics, you'll see three graphs:

Pages crawled per day: Overall, it’s a good sign to see this graph going up. Whilst there are peaks and troughs, you’ll be able to see if there is a steady incline, decline or no change at all. Spikes in this report are often due to the introduction of new pages or an increase in inbound links.

Kilobytes crawled per day: This graph should bear some resemblance to the Pages crawled per day graph in terms of the peaks and troughs in the graph.

Time spent downloading a page: This graph will be different from the above two and is likely (hopefully) not to show as many peaks. Peaks on this graph could indicate a server problem, since normally Google should not take very long to download your pages.

These stats are useful for diagnosing problems and gauging performance issues.

2. Not Found Errors

Not found crawl errors are very useful for usability and SEO. If customers are browsing around your site and finding that links are not taking them anywhere, they're likely to get annoyed and go elsewhere. This tool (accessed on the top right of the dashboard) will identify all not found URLs on your site. Be aware that this data can sometimes be slightly outdated, and Google states:

If you don’t recognize these URLs and/or don’t think they should be accessible, you can safely ignore these errors. If, however, you see URLs listed in the ‘Not found’ section that you recognize and would like crawled, we hope you find the ‘Details’ column helpful in identifying and fixing the errors.

So don’t dwell too much on getting this down to 0 errors in GWT, just use the information to improve site usability.

As well as links from within your site that lead to a 404, this will also show you links from outside sites that lead to a 404. This aspect is particularly valuable for SEO. Use this feature in GWT to identify the linked-to pages within your site that no longer exist, and redirect those URLs to a real page within your site. This tactic will lead to increased link juice and increased visitors.
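To illustrate the redirect step, here is a minimal sketch, assuming a Flask-based site and hypothetical URLs (the same idea applies to an .htaccess rule or any other server-side redirect):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical example: an old article URL that external sites still link to.
@app.route("/old-article")
def old_article():
    # A 301 (permanent) redirect tells search engines the page has moved for
    # good, so the value of those inbound links is passed to the new page.
    return redirect("/new-article", code=301)
```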

3. Meta Descriptions and Title Tags

Google Webmaster Tools will provide you with a list of URLs that have problems in their title tags or meta descriptions. This list includes duplicates, as well as titles or meta descriptions that are too long or too short. Go to Diagnostics and then HTML Suggestions to find this information. Duplicate title tags in particular can affect your rankings within Google, and meta descriptions should be snappy and targeted to each specific page to help the CTR of each page on your site.
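As a quick illustration, here is a minimal sketch of how you might run a similar duplicate-title check on your own pages, assuming the requests and beautifulsoup4 packages and a hypothetical list of URLs:

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Hypothetical list of pages to check.
urls = ["http://www.example.com/", "http://www.example.com/about"]

titles = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    titles[title].append(url)
    if len(title) > 70:
        print(f"Title may be too long ({len(title)} chars): {url}")

# Any title shared by more than one URL is a duplicate worth fixing.
for title, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title {title!r}: {pages}")
```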

4. Top Search Queries

Whilst you can get your top search queries out of Google Analytics or whatever analytics tool you use, I particularly like the Webmaster Tools version for the simple reason that it shows your average position within Google as part of the data. This enables you to look at your top search terms by position. The reason this is helpful is that, when deciding which keywords to push, I particularly like to focus on the keywords that are currently in positions 2-4, as ranking gains at this level will have the biggest impact on traffic.
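For example, here is a minimal sketch of that filtering step, assuming you have exported the report to a CSV file with hypothetical column names query, impressions and avg_position:

```python
import csv

# Hypothetical CSV export of the Top Search Queries report.
with open("top_queries.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Keep keywords currently sitting just below the top spots (positions 2-4),
# where a small ranking gain tends to bring the biggest traffic increase.
candidates = [r for r in rows if 2 <= float(r["avg_position"]) <= 4]
candidates.sort(key=lambda r: int(r["impressions"]), reverse=True)

for r in candidates:
    print(r["query"], r["avg_position"], r["impressions"])
```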

5. Site Links

If your site has a list of links below its Google listing, you can use the Sitelinks section within Site Configuration to control the links that are shown. You can't actually tell Google which links to show, but you can block links that you don't want shown.

These are just a few of the many tools available in Google Webmaster Tools, and Google often adds new features to this great toolset. If you're not a regular user of GWT, try these features out for size and look around to get used to the other features on offer. If you are a regular user of GWT, let us know your favourite features and why.

20 SEO Terms You Should Know

If you have a website or blog, or if you work with anything related to the Internet, you’ll certainly need to know a bit about search engine optimization (SEO). A good way to get started is to familiarize yourself with the most common terms of the trade, and below you’ll find 20 of them. (For those who already know SEO, consider this post as a refresher!).

1. SEM: Stands for Search Engine Marketing, and as the name implies it involves marketing services or products via search engines. SEM is divided into two main pillars: SEO and PPC. SEO stands for Search Engine Optimization, and it is the practice of optimizing websites to make their pages appear in the organic search results. PPC stands for Pay-Per-Click, and it is the practice of purchasing clicks from search engines. The clicks come from sponsored listings in the search results.

2. Backlink: Also called an inlink or simply a link, it is a hyperlink on another website pointing back to your own website. Backlinks are important for SEO because they directly affect the PageRank of any web page, influencing its search rankings.

3. PageRank: PageRank is an algorithm that Google uses to estimate the relative importance of pages around the web. The basic idea behind the algorithm is that a link from page A to page B can be seen as a vote of trust from page A to page B. The higher the number of links (weighted by their value) pointing to a page, therefore, the higher the probability that the page is important.
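To make the idea concrete, here is a minimal sketch of the textbook PageRank iteration over a tiny hypothetical link graph (Google's production algorithm is far more elaborate, but the core idea is the same):

```python
# Hypothetical toy link graph: each page maps to the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

# Repeatedly redistribute each page's rank across its outgoing links.
for _ in range(50):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

# Pages with more (and better) inbound links end up with higher scores.
print(rank)
```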

4. Linkbait: A linkbait is a piece of web content published on a website or blog with the goal of attracting as many backlinks as possible (in order to improve one's search rankings). Usually it's a written piece, but it can also be a video, a picture, a quiz or anything else. A classic example of linkbait is the "Top 10" lists that tend to become popular on social bookmarking sites.

5. Link Farm: A link farm is a group of websites where every website links to every other website, with the purpose of artificially increasing the PageRank of all the sites in the farm. This practice was effective in the early days of search engines, but today it is seen as a spamming technique (and thus can get you penalized).

6. Anchor Text: The anchor text of a backlink is the text that is clickable on the web page. Having keyword-rich anchor text helps with SEO because Google will associate those keywords with the content of your website. If you have a weight loss blog, for instance, it would help your search rankings if some of your backlinks had "weight loss" as their anchor text.

7. NoFollow: Nofollow is a link attribute used by website owners to signal to Google that they don't endorse the website they are linking to. This can happen either when the link is created by the users themselves (e.g., blog comments), or when the link was paid for (e.g., sponsors and advertisers). When Google sees the nofollow attribute it will basically not count that link in its PageRank and search algorithms.
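As an illustration, here is a minimal sketch, assuming the beautifulsoup4 package and a hypothetical HTML snippet, of listing which links on a page carry the nofollow attribute:

```python
from bs4 import BeautifulSoup

# Hypothetical HTML snippet with one followed and one nofollowed link.
html = """
<p><a href="http://example.com/partner">Partner</a>
<a href="http://example.com/ad" rel="nofollow">Sponsored ad</a></p>
"""

soup = BeautifulSoup(html, "html.parser")
for a in soup.find_all("a"):
    rel = a.get("rel") or []  # BeautifulSoup returns rel as a list of values
    status = "nofollow" if "nofollow" in rel else "followed"
    print(status, a["href"])
```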

8. Link Sculpting: By using the nofollow attribute strategically, webmasters were able to channel the flow of PageRank within their websites, thus increasing the search rankings of desired pages. This practice is no longer effective, as Google recently changed how it handles the nofollow attribute.

9. Title Tag: The title tag is literally the title of a web page, and it's one of the most important factors in Google's search algorithm. Ideally your title tag should be unique and contain the main keywords of your page. You can see the title tag of any web page at the top of the browser window while navigating it.

10. Meta Tags: Like the title tag, meta tags are used to give search engines more information regarding the content of your pages. The meta tags are placed inside the HEAD section of your HTML code, and thus are not visible to human visitors.

11. Search Algorithm: Google’s search algorithm is used to find the most relevant web pages for any search query. The algorithm considers over 200 factors (according to Google itself), including the PageRank value, the title tag, the meta tags, the content of the website, the age of the domain and so on.

12. SERP: Stands for Search Engine Results Page. It’s basically the page you’ll get when you search for a specific keyword on Google or on other search engines. The amount of search traffic your website will receive depends on the rankings it will have inside the SERPs.

13. Sandbox: Google basically has a separate index, the sandbox, where it places all newly discovered websites. When websites are on the sandbox, they won’t appear in the search results for normal search queries. Once Google verifies that the website is legitimate, it will move it out of the sandbox and into the main index.

14. Keyword Density: To find the keyword density of any particular page you just need to divide the number of times that keyword is used by the total number of words in the page. Keyword density used to be an important SEO factor, as the early algorithms placed a heavy emphasis on it. This is not the case anymore.
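Since this is just a simple ratio, here is a minimal sketch of the calculation (exact counting rules, e.g. how multi-word phrases are handled, vary between tools):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences divided by total word count, as a percentage."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# Hypothetical example: "seo" appears twice in a ten-word passage -> 20.0
sample = "SEO basics: good SEO starts with content people actually want"
print(keyword_density(sample, "seo"))
```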

15. Keyword Stuffing: Since keyword density was an important factor in the early search algorithms, webmasters started to game the system by artificially inflating the keyword density inside their websites. This is called keyword stuffing. These days this practice won't help you, and it can also get you penalized.

16. Cloaking: This technique involves making the same web page show different content to search engines and to human visitors. The purpose is to get the page ranked for specific keywords, and then use the incoming traffic to promote unrelated products or services. This practice is considered spamming and can get you penalized (if not banned) by most search engines.

17. Web Crawler: Also called a search bot or spider, it's a computer program that browses the web on behalf of search engines, trying to discover new links and new pages. This is the first step in the indexing process.

18. Duplicate Content: Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. You should avoid having duplicate content on your website because it can get you penalized.

19. Canonical URL: Canonicalization is a process for converting data that has more than one possible representation into a “standard” canonical representation. A canonical URL, therefore, is the standard URL for accessing a specific page within your website. For instance, the canonical version of your domain might be http://www.domain.com instead of http://domain.com.
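One common way a page declares its canonical version is the rel="canonical" link element in the HEAD section. Here is a minimal sketch, assuming the beautifulsoup4 package and a hypothetical snippet, of reading that declaration:

```python
from bs4 import BeautifulSoup

# Hypothetical page that declares its canonical URL in the HEAD section.
html = """
<head>
  <link rel="canonical" href="http://www.domain.com/page" />
</head>
"""

soup = BeautifulSoup(html, "html.parser")
canonical = soup.find("link", rel="canonical")
if canonical:
    print("Canonical URL:", canonical["href"])
```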

20. Robots.txt: This is nothing more than a file, placed in the root of the domain, that is used to inform search bots about the structure of the website. For instance, via the robots.txt file it's possible to block specific search robots and to restrict access to specific folders or sections of the website.
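For example, here is a minimal sketch using Python's standard urllib.robotparser, with a hypothetical domain, of checking whether a crawler is allowed to fetch a given URL according to robots.txt:

```python
from urllib.robotparser import RobotFileParser

# The file always lives at the root of the (hypothetical) domain.
rp = RobotFileParser()
rp.set_url("http://www.domain.com/robots.txt")
rp.read()

# Ask whether a generic crawler ("*") may fetch a specific page.
print(rp.can_fetch("*", "http://www.domain.com/private/report.html"))
```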

Take It One Step Further and Use PPC for Keyword Research

By now, you may already be familiar with the foundation of keyword research to search engine optimize your company’s website. Still, if you truly want to get a leg up on your competition, it really pays to take it one step further and use PPC for your keyword research.

PPC As A Reality Check

PPC, or pay-per-click, advertising allows you to create ad campaigns that will display on search engines like Google, Yahoo, and Bing. Here, however, we are going to look at these same campaigns from the point of view of understanding your market and doing keyword research. The reason this is so effective for keyword research is that through a PPC ad campaign you get actual statistics on the impressions, click-through rates, and keywords that actually bring sales to your website. In short, the data you receive from your PPC campaign will give you a legitimate report of what customers are actually searching for to get to your website, instead of just what you think or hope they may be looking for within your niche market. Remember, most keyword tools give only approximations. A PPC campaign is like a reality check.

Testing, Testing, Testing

This is so vital because SEO takes a lot of work and time. It can often take months to get to the top of a niche for your chosen keywords, and if at that stage you realize that those keywords are not that profitable, there is little you can do about it. A PPC campaign gives you a shorter, faster way to do that testing.

The other thing is that this kind of research can reveal the inner desires of your market that cannot be uncovered by any keyword tool. A well-known example is the book "The Four Hour Work Week" by Tim Ferriss. He came to that title through split testing with Google AdWords campaigns: he created several campaigns with all the potential titles he had, and tracked which one got the highest CTR. Clever, huh?

Finding Nuances And Optimizing for Them

A PPC campaign also allows you to fine-tune certain parameters on your keywords. E.g., when we were looking to optimize our campaign for the term "virtual assistant," the Google keyword tool showed us almost equal statistics for "virtual assistant" and "virtual assistants". However, an exact match PPC campaign showed that the first one is searched almost four times more than the latter. If we hadn't done that test and had gone ahead with the latter, we would have been getting a quarter of the traffic and hence a quarter of the clients. So you see what a difference it can make.

Once you start to run your ads with broad match, you will receive data detailing the long tail variations of your keywords that often go unreported in commonly used keyword research tools. Some variations in the example above might include "offshore virtual assistants" and "remote virtual assistants". These long tail keywords were unreported by regular keyword research tools, yet they were uncovered through the results we received in our PPC ad campaign. These kinds of keywords are easy SEO targets. It is often to your advantage to target your SEO campaign toward the narrower, long tail keywords early on, because there is normally less competition and they are easier to rank for right away.

Let us take it to the next level.

You can use tools like KeywordSpy to uncover what keywords your competitors are bidding on and find other relevant keywords for your SEO campaign. Many times, organic keyword analysis will not give you these overlooked keywords that bring in real sales. A tool like KeywordSpy can provide you with a list of organic Google keywords and AdWords keywords for your competitors. Remember, if your competitors are bidding on those terms then they must be profitable.

Remember, this is one of the best ways to do lateral keyword research. If you are in a highly competitive niche, then this kind of tool can give you all the data you need early on. Importantly, this kind of tool can reveal keywords that you might not even guess. (E.g., we found a lot of our competitors bidding on book names like "Four Hour Work Week"; now that is something we were not expecting at all.)

The truth of the matter is that it is very important to go beyond standard keyword research if you want to create a profitable SEO campaign. Yes, much of your SEO campaign has to do with building authority, ranking for multiple terms, and getting the best backlinks, but with the right keyword research you can improve on the most important metric: ROI.

Google Finally Rolls Out A New PageRank Update

For those of you who thought this was the end of nominal PageRank updates, well, it was not. Google is rolling out a new update right now. Some people started noticing it yesterday, and I believe the ups and downs will continue over the weekend.

The last update had been in April 2010, so around nine months ago. That is quite a long time, especially if you consider that at some points in the past Google would roll out a new PageRank update every three months or so.

The PR on most of my sites hasn’t changed a lot. Some of my niche sites gained 1 or 2 points, but that was it.

Judging by the comments on online forums I believe this update has an upward trend, as most people are seeing increases in their PageRanks.

What about you, did you see any fluctuations?

5 Unique Ways To Get Backlinks

Getting a ton of quality backlinks can take forever, and most people just don't have the time or patience to wait that long. There are many great ways to get backlinks, such as blog commenting, forum posting, article submissions, social bookmarking and guest posting, but the problem with these link building methods is that you will end up spending too much time away from your site and, more importantly, your readers. Wouldn't it be nice if getting backlinks could be set on autopilot?

Great news: there is a way you can set your link building on autopilot that only requires a minimal amount of upfront work. Before I get into the methods that will put your link building on autopilot, I want to make sure you understand that this entire approach revolves around good quality content.

Remember, “Content is king”!

1. Utilize Yahoo Answers

If you have been building backlinks to your site then I bet you know that Yahoo Answers is really a nofollow site, right? Well, to tell you the truth, it is, but that isn't the reason you want to post answers on Yahoo Answers. The reason you need to post on Yahoo Answers is that there is a piece of software out there called WP Robot which, when installed on a blog, has a tool that pulls answers off Yahoo Answers and republishes them on that site.

Do you get why this is beneficial for you?

Getting links is not that tough. All you have to do is answer questions that have a descriptive title such as "How do I make money online?" or "How can I lower my car insurance premiums?". The reason you need to answer questions with a large niche in the title, such as making money online, insurance or weight loss, is that more people will be pulling those answers from Yahoo onto their sites.

2. Create A Multiple Series List

There are a couple of reasons to create a multi-part series, and combined they work very well to get backlinks and traffic. The first reason multi-part series work is that readers can follow along better with a list; lists are just easier to scan and make learning less complicated. The other reason multi-part series get more backlinks is that people love to link to lists, and the more list posts you have, the more backlinks you will get.

Example: if you write a 3-part series titled "15 Ways to Get More Traffic Through Facebook", I would be a fool to link to only one part from my blog, so instead I would need to link to all three. I don't know about you, but I would much rather have three links pointing to my site as opposed to just one.

3. Add Information To Wikipedia

This is something I tested a while back, and it works beautifully. Wikipedia is the biggest online encyclopedia and has over 3 million articles in the English language alone. This is why finding a page to edit is very easy to do.

Here is how it works: you first need to find a page that could use some editing (I recommend finding a page that doesn't have a lot of content but has a decent foundation to build on). Once you have found the page you want to edit, and that relates to your site, you need to make sure that you have a matching topic on your own site so that you have a reason to link to it.

Example of matching topics: if you had a site about blogging, then a great page to edit would be the page of ProBlogger's Darren Rowse (assuming you had something new to add). To make it even more closely related, I would try to get an interview with Darren; that way I could fill in some missing parts of his Wikipedia page. After I have done that I can edit the page, insert my website link and wait.

Something you must know is that, like Yahoo Answers, Wikipedia is a nofollow site, so the entire point of using Wikipedia is to get people to see the article on your site and link to that instead of the actual Wikipedia page.

4. Exchange Site Links For Content

Something I have done quite a bit for my niche websites is exchange site links for written content. This isn't guest posting; this method involves a site owner placing a link back to your site in exchange for you writing a specified number of articles.

The reason this works is that site owners want to get more content, whether it is to submit to article directories, to sell, or just to put on their site. I know this may not make a lot of sense to you now, but it will. If you spend enough time on forums you will notice that not everyone wants to pay a decent price for content, so all you have to do is tell the person wanting to buy the article that you will write one article each month in exchange for one link on their site. Trust me, people like to do this.

5. Give Away A Gift

My favorite way to get backlinks is to actually put out some link bait (a.k.a. give away a gift). The link bait I always put out is a $25 gift card for Amazon.com. Everybody wants the free gift card, so I tell my readers that all they have to do is link to my site and they will be entered into the drawing for the $25 gift card.

How I set it up: since I want to get traffic and not only links, I make it a requirement that, in order to be entered into the drawing, the linking site needs to send at least 5 visitors with different IP addresses, and anything over 5 counts as another entry into the drawing. Think about it: if you only got 3 links from quality sites it would probably cost you around $200 per month for those links, but instead you only pay $25, with the hope that the site owner keeps your link up for the chance to win another gift in the coming months.
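If you want to verify that requirement yourself, here is a minimal sketch, assuming a standard combined-format access log and a hypothetical file name, that counts unique visitor IPs per referring site:

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical access log in the common "combined" format:
# IP - - [date] "GET /path HTTP/1.1" 200 1234 "http://referrer/" "user agent"
unique_ips = defaultdict(set)

with open("access.log") as f:
    for line in f:
        parts = line.split('"')
        if len(parts) < 4:
            continue
        ip = parts[0].split()[0]
        referrer_domain = urlparse(parts[3]).netloc
        if referrer_domain:
            unique_ips[referrer_domain].add(ip)

# A referring site qualifies once it has sent at least 5 distinct IPs.
for domain, ips in sorted(unique_ips.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(domain, len(ips))
```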

The biggest difference between these 5 unique link building methods and the usual link building methods is that these are more of a set-and-forget approach, while the others have to be tended to constantly.

Using Linkbait to Gain Dozens of Targeted Links to Your Site

Wouldn’t it be nice if dozens of bloggers in your niche gave you backlinks to a targeted post that could drive affiliate sales if you ranked well in Google? Well it is possible with a bit of creativity and by creating a linkbait post that boosts the egos of other bloggers in your niche.

I recently received dozens of backlinks and tweets on a post which was simply a list of personal finance books. The topic itself wasn’t anything special, but the way I created the post was beneficial because it drove enough attention that it now ranks on the first page of Google for my desired search term.

The idea is simple: poll the bloggers. I've seen it done before on several blogs about blogging, so I decided to try the same thing in the personal finance niche. Seeing as there are hundreds of blogs in my niche, it wasn't that hard to reach out to other blogs and ask them a quick question.

I created a list of about 100 personal finance bloggers and sent them all an email which said:

Hi (blogger's name),

Ryan here from plantingdollars.com. I’m doing a post about the best personal finance books according to 100+ personal finance bloggers and I’m wondering if you could simply reply by telling me your favorite personal finance book. I’ll let you know when the post goes live and give you a backlink from it if you decide to participate.

Cheers,

Ryan

Granted I didn’t have a mailing list established beforehand so I literally emailed each one individually which took a good chunk of time, but following up was easy once the original email was set.

I waited about 2 weeks to see who responded. In total 38 bloggers did end up responding out of 100 which was more than I expected. I then created the post based on their responses and in the end gained an insightful post and a piece of linkbait simply by asking others questions.

After the post was written I also added affiliate links to Amazon for each book that was mentioned, and, as promised, gave each blogger a backlink to their site.

When the post went live I then emailed each one back with a personal message about their pick and let them know where the post was located so that they could easily link to it. This is what I sent the second time around:

Hey (bloggers name),

I just wanted to send you a quick note to say thanks for taking the time to respond to my question about your favorite personal finance book. "Art of the Deal" was a book I also enjoyed quite a bit and found pretty inspiring.

I just posted the results from other bloggers here: (inserted link)

Hope you’re doing well and thanks again,

Ryan

In total I spent about five hours emailing the bloggers, compiling their answers, adding backlinks, and then following up with them, but the result was dozens of backlinks, and a post that ranks well for the term “best personal finance books” which should drive affiliate sales indefinitely.

I would consider the post a success, considering it would take significantly longer to gain as many backlinks via guest posting. From this experience I hope to do a few more "poll the bloggers" type posts, producing content that's insightful for readers and a win-win on backlinks for myself and the bloggers who participate.

Have you had any luck compiling information from other bloggers in your niche and turning it into a post?

Consistency Is A Very Important SEO Factor

If you ask people what they think is necessary to get a website ranking high for a particular keyword, most of them will mention things like an optimized title tag, unique content, backlinks and so on.

While those factors are indeed important for SEO, they will produce very small results if you don’t combine them with consistency.

That is, you could launch a new website today with a lot of unique content and many backlinks, but unless you keep adding new content and getting new links consistently over time, your website would not rank high for any competitive keyword.

The opposite is also true. Even if you launch a website with little unique content and no backlinks, you can still get it ranking high for competitive keywords if you consistently keep adding new content and getting new backlinks.

Let me illustrate this point with some numbers. Suppose we launch two websites targeting the same keyword, website A and website B. On website A we publish 50 articles right away and manage to attract 500 backlinks on the first day thanks to a viral campaign on social media sites. On website B, on the other hand, we work our way up gradually, publishing one article every other day and getting 10 new backlinks per week. After 6 months I would be willing to bet that website B is ranking higher than website A, and that is because of the consistency factor.

In fact Google confirmed in the past that the pace at which a website publishes new content and gains new backlinks is indeed used inside its algorithm.

Whenever you plan an SEO campaign for one of your sites in the future, remember that slow and steady can win the race, even on the Internet!

SEO Strategies For 2011 And Moving On From Algorithm Changes

I got a chance to attend Affiliate Summit West 2011 in January and to meet some heavyweights of the SEO world, such as Todd Friesen of Performics, Greg Boser of Blueglass Interactive, and Stephen Spencer of NetConcepts. I took away a lot of information from the seminar; one of the most useful sessions was the "Ask the SEO Pros" talk, panelled by the aforementioned gurus.

I've noticed a lot of talk in forums about many sites getting deranked or even completely removed from the SERPs altogether. Let's not forget the recent content farm debacle, and all the talk around that, either. SEO has changed, and black hat methods are fast becoming redundant; if you haven't switched to more long-term white hat methods, now is the time. Here are some great takeaway SEO tips I got from the summit.

1. Article Spinning

Article spinning is the act of rewriting articles by generating alternative words and sentence structures. Most spun articles tend to read very poorly, and thus end up on unmoderated, low quality sites, so the links they produce tend to be low quality too. Additionally, the return on investment is poor: the time spent spinning articles would be better spent on high quality content that sticks around and builds link juice over time. And even if it works, it still leaves room for your competitors to do a backlink analysis on your site and report you to Google.

The days of article spinning are numbered. It may work to an extent, but Google recently tightened the noose around the content farms that accept low quality articles, which means the link juice they pass has been reduced as well.

2. Scalable Whitehat Solutions

Focus on quality not quantity; guest posts on high quality blogs with high RSS readerships and daily visitors are valuable. Aside from creating high quality content, you should build relationships with other bloggers and influential social media players and leverage your contacts to get your content promoted across the social media networks.

There are many high PR homepages that do not trickle PageRank over to subpages, so much of the work lies in promoting the individual blog post, which can be sped up with high quality content and good social media contacts.

Outsource

The most scalable method is to outsource article writing. You may need to split tasks up between researchers and writers to make better use of people's skill sets, and focus your own attention on building relationships with influential figures in the blogosphere.

Linkbait

Viral marketing is a fantastic way to build links organically. Giveaways, personality tests, top 10 lists, articles and widgets can all help massively. One approach is to place linkbait/viral content on your site, or to produce widgets that other bloggers can install on their sites which link back to you. You then write a few press releases about the linkbait content, place them on high traffic blogs and websites, and let the content rise in popularity through social media.

Authority over low quality

The future lies in being the authority in your niche, not just another cookie-cutter player in the field. Aim to have the absolute best knowledge in your field; as you do, you will naturally attract more organic links over time, rather than having to work hard to get people to link to you.

3. Backlink Anchor Spam

More and more websites are being penalised for unnaturally high proportions of keyword-rich anchor text in their backlinks. Most organic sites tend to have the URL as the anchor in the vast majority of cases, but many manipulative SEOs focus solely on keyword-rich anchor text at unnaturally high rates.

Google has evolved away from relying on anchor text to index sites for appropriate keywords. While anchor text still counts for a lot, it's important not to overdo it. Google will look more closely at your page content and the content of the linking page; if the two pages are semantically similar, Google will classify the link appropriately regardless of the anchor text. A high authority, domain-name-anchored link is worth a lot more than a keyword-rich, low quality link.

Additionally, on organic sites anchor texts are more diverse: a page about cheap boots will attract a variety of different keywords and synonyms, while black hatters tend to overdo it with the same anchor. To see why, run a Google search for ~cheap art; Google will substitute lots of different synonyms for "cheap" in the query, such as affordable, budget and low cost, and you will be overwhelmed with results for inexpensive art. A site aiming for just "cheap art" as an anchor may trigger a penalty, because it looks unnatural for organic backlinks to use the exact same adjectives to describe a site.
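To check your own profile for this, here is a minimal sketch, assuming a hypothetical CSV export of backlinks with an "anchor" column, that shows how concentrated your anchor text really is:

```python
import csv
from collections import Counter

# Hypothetical backlink export with one row per backlink.
with open("backlinks.csv", newline="") as f:
    anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]

counts = Counter(anchors)
total = len(anchors)

# If a single keyword-rich anchor dominates the profile, that is exactly the
# unnatural pattern described above.
for anchor, n in counts.most_common(10):
    print(f"{anchor!r}: {n} links ({100 * n / total:.1f}% of profile)")
```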

Exact match domain names are an anomaly: it seems they can't be penalised for having too many keyword-rich anchors, because the keyword IS their brand name. However, this may get rectified soon, as there are many inferior exact match domain websites ranking against high quality content websites.

Google now has much better insight into real user behaviour from the Google Toolbar and Google Chrome. It can use these metrics to evaluate the quality of sites and links far better, and can use actual click-through traffic data as a sign of quality, nullifying the effect of dormant black hat links that nobody ever clicks.

4. Link Networks

Sitewide links are rarely organic and are more often than not paid links, in particular footer links across a large website. It doesn't hurt to have them, but their worth is largely discounted; for cross-linking related web properties it's far better to have a dedicated links page with links and a brief description of each site.

Also, with regard to paying money for insertion in link networks: certain link network owners tend to have the same clients, so across multiple websites the same recurring links to unrelated sites keep appearing throughout the network, which is easy to spot algorithmically and therefore easy to penalise or discount.
