Thursday, October 17, 2013

Ultimate Guide to Google Penalty Removal

Introduction

A few months back, I wrote an article on Moz all about a penalty our web agency received for unnatural links pointing to our website. At first, this was a bit of a shock to the system, but since then, we've learned so much about Google's webmaster guidelines and we've helped lots of companies get their businesses back on track and remove manual penalties associated with their websites.

What did we get hit for?

Cutting a long story short, the main reason we were hit with a manual penalty was for adding followed anchor text to the bottom of clients' websites that said 'Web Design Yorkshire by Pinpoint Designs' (both 'web design yorkshire' and 'Pinpoint Designs' linked to our website). At the time, we were doing this out of habit: we never varied the anchor text, always used followed links and were, in short, violating Google's quality guidelines.

After a lot of work and research, we managed to get the penalty removed from our website, and since then we've worked on lots of other clients' websites to help them lift penalties. We've worked with clients who've had unnatural link penalties as well as on-site penalties for low quality content, cloaking and malware. We have a great success rate and are constantly trying to improve our processes to become even better!

What are we doing to improve?

Over the past few months, we've been trying out different tools to aid link recovery, speeding up the process of getting in touch with webmasters, finding new ways to contact them and generally streamlining the process of getting spammy links removed. In this guide, I'm going to review a few of the different link removal tools we've tried out and also give you some ideas of how you can carry out link removal work manually to get your website's penalty removed.

This guide is mainly for people who've received a manual penalty specifically for unnatural links. To find out if that applies to you, simply log in to Google Webmaster Tools and use Google's new 'Manual actions' tab to see what type of penalty you have. If it's for 'Unnatural links' with a yellow warning symbol next to it, then you're reading the right guide!

My aim over the next few months is to write guides on different types of penalties in order to help people out further. We're also working on a tool of our own called Peel, which we're building from the ground up to deliver exceptional analysis of links. You can sign up to our newsletter for tips and information about the launch of the product using the link above.

Let's get started!

Step 1: collecting backlink data

First of all, we need to pull a list of all the links pointing to your website from a few different sources. A Google employee on the Webmaster Central Forums recently stated that they recommend you focus on the links reported in Webmaster Tools.

Whether you choose to believe that is another question; there's an interesting discussion on this available at the Webmaster Central Forums:


Matt Cutts has also responded recently saying the following:

"It's certainly the case that we endeavour to show examples from the set of links returned by Webmaster Tools, and likewise we prefer to assess reconsideration requests on that basis of those links. However, if there's a really good example link that illustrates a problem, we do leave enough room to share that link, especially because it can help point the webmaster in a better direction to diagnose and fix issues."

Whilst it may be that Google just want you to remove the majority of bad links and not necessarily every single one of them, I would personally recommend doing the job properly and starting by collecting as much data as possible from various different link sources:
  • Google Webmaster Tools - Log in to Google Webmaster Tools and click into the website with the issues, go to 'Search Traffic' > 'Links To Your Site', then click the 'more' link under the 'Who links the most' tab. Once in there, click 'Download more sample links' at the top.
    • Open Site Explorer - Visit http://www.opensiteexplorer.org and type in your domain name (make sure you get this absolutely correct, as a slight variation of your domain, such as missing out the www., can return a different set of results). Once loaded, click the 'Advanced Reports' tab, select 'Links that come from: External linking page', leave all the other settings as they are, then click export.
    • Ahrefs - Visit http://www.ahrefs.com and enter your domain. Click the 'CSV' tab and export the list of 'Backlinks/Ref pages'. Ahrefs is an absolutely brilliant tool that's helped us many times with link removal campaigns. It lets you quickly narrow down site-wide links, followed/nofollowed links and more.
    • Majestic SEO - Visit http://www.majesticseo.com, enter your domain name and select 'Historic Index'. This will show all your previous links instead of just the most recent. Click on the backlinks tab, scroll to the bottom of the page and click 'download data'.
Most of the above sites will require a subscription in order to gather the data. Sites such as Majestic SEO will let you use the tool free of charge (or with limited data) if you have the domain verified in your Webmaster Tools account. That being said, for the sake of one or two months' membership, it's worth paying for the wealth of data you'll get.

Note: Google have previously recommended you add both the www. and non-www. versions of the domain to Webmaster Tools and gather links from both sources. It's worth searching different variations of your domain to get as much data as possible.

Step 2: documenting your work

Start by creating yourself a Google Docs spreadsheet where you can track all of your work and see exactly where you are in the link removal process. This will be extremely useful when you come to submit a reconsideration request to Google, as they'll be able to see exactly what you've done to sort out the issues with your site.

I should also point out that Google say they don't tend to trust documents from sources they don't know. For that reason, I recommend using a Google Spreadsheet to document your work, as it's a Google product and trusted by them. You can also include a link to the document in your reconsideration request very easily, and it's free!

We would usually start by creating tabs at the bottom of the spreadsheet for each of our data sources: for example, Google Webmaster Tools, Ahrefs, Majestic SEO and Open Site Explorer. You don't have to separate your data into different sheets, but I personally think it shows more work being carried out and allows more in-depth analysis.

The only disadvantage to doing this is that you'll have lots of repeat links - we do however have this covered!

If you visit the Pinpoint Designs blog, we've created a Google Docs spreadsheet you can use which helps to combine multiple sheets together and remove duplicates. This is a brilliant asset to use in your link removal campaign.

Google Docs only allows you to paste in around 50,000 characters at a time. This may sound a lot, but it's surprising how quickly you hit the limit. Instead, import data from a CSV by going to your Google Docs file, then clicking 'File > Import'. You can do this for each document you export from OSE, Ahrefs, Majestic and Google Webmaster Tools and import them into separate sheets within the same document. There's still a limit, but it's higher than 50,000 characters and will speed up the process.
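
If you'd rather combine the exports before anything goes near Google Docs, a short script can do the merging and de-duplication for you. Below is a minimal Python sketch; the file names and column headings are assumptions, so adjust them to match what your exports actually contain:

    import csv

    # File names and column headings here are assumptions - rename them
    # to match your actual GWT / OSE / Ahrefs / Majestic exports.
    SOURCES = [
        ('gwt.csv', 'Links'),
        ('ose.csv', 'URL'),
        ('ahrefs.csv', 'Referring Page URL'),
        ('majestic.csv', 'SourceURL'),
    ]

    seen = set()
    merged = []
    for path, column in SOURCES:
        with open(path, newline='', encoding='utf-8') as f:
            for row in csv.DictReader(f):
                url = (row.get(column) or '').strip()
                key = url.lower().rstrip('/')
                if url and key not in seen:  # drop duplicates across exports
                    seen.add(key)
                    merged.append(url)

    with open('master_sheet.csv', 'w', newline='', encoding='utf-8') as f:
        writer = csv.writer(f)
        writer.writerow(['Link URL'])
        writer.writerows([u] for u in merged)

The resulting master_sheet.csv imports into a single Google Docs sheet in one go, and the per-tool exports can still live in their own tabs for the deeper analysis mentioned above.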

IMPORTANT: Once you've got all your data into the spreadsheet, make sure you add the following columns to the end of each of the sheets (or alternatively your 'Master Sheet' if you're combining all the data into one sheet):
  • Contact Name
  • Contact Email Address
  • Website Contact Form
  • Twitter URL
  • Facebook URL
  • Google+ URL
  • LinkedIn URL
  • Date of 1st Contact
  • Date of 2nd Contact
  • Date of 3rd Contact
  • Link Status
  • Notes
This may seem like a lot of data, but it's the best way to document your work. That way, when you're carrying out research on your domains, you'll be able to start populating the spreadsheet, making your life easier in the long term. You'll also be able to show Google all the work you've been doing to get the links removed, making it very easy for them to see everything you've done.

If you don't want to go to the effort of populating all the data above, you could combine the different contact methods into one cell and just populate that. The chances are you're not going to need Twitter, Facebook, Google+, LinkedIn, email and a website contact form for every single URL, so it's down to preference. The above recommendation is simply the best way we've found of documenting the data.

Finally, add an additional sheet to your Google Docs file called 'Contact Examples'. In here, you can upload images of a few examples of the emails you've sent out to the webmasters you've been working with. Be careful what you put in here: Google will be reading these messages, so make sure you're not badmouthing them.

Don't threaten webmasters by saying Google will block their site or that they'll be harmed if they don't remove your links. Instead, say that you're trying to clear things up so that you're fully compliant with Google's webmaster guidelines. You can apologise for the inconvenience to the webmaster and thank them for their help (there are examples further down in this article). That way, when Google reads the messages, they'll understand you're genuinely trying to sort things out and hopefully be a little more forgiving under the circumstances.

Tip: If you're on a Mac, you can press 'Cmd + Shift + 4'. This command lets you quickly and easily take a screenshot of a specific section of your monitor - perfect for snapping contact forms, emails you're sending, etc. and uploading them to the 'Contact Examples' sheet in your Google Docs file.

Step 3: spotting low-quality links

This is a hugely important section of link removal. It sounds simple, but you have to be extremely careful with what links you attempt to remove. Good links generally take a long time to build and if you ask for them to be removed thinking they're potentially spammy, that hard work may all be for nothing.

Over the past year, we've learnt that the best way to identify spammy links is to manually review each and every one of them. If a link is good, mark it in your spreadsheet so you know not to remove it. Don't delete any links from the sheet, as it's all research to show Google what you've been doing. Either highlight the cell in a colour, or add a note in one of the columns so you know it's safe / genuine.

So, how do we spot spammy links?

Some links are easy to identify. For example, if you've been featured on BBC News, the Guardian or another high quality, authoritative website, you can be fairly sure it's a strong link that doesn't need removing.

The definition of a natural link is fairly hard to summarise. I would class natural links as ones that have appeared off the back of content you've written. For example, let's say you were involved in the automotive industry and wrote an article all about the history of cars and the different manufacturers, going into great detail about every aspect. This type of article is obviously going to end up very long and, hopefully, interesting and a brilliant read. If you've done a good job and shared the article in the right places, you should acquire links naturally. People will link to your guide/post without you having to ask for it, and those types of links are fine.

Throughout this article, I'll link to other articles I've read on the internet that I believe are helpful for link removal. All the people I link to have created good quality content that I believe will help you in removing your penalties with Google. They haven't written an article for the sake of it; they've written it with an aim in mind, so that it's beneficial.

By the same token, it's easy to spot some spam links. For example, if you're listed on a page with 10,000 other links in a list, or if you've commented on a blog with xx,xxx other comments just for the backlink. I realise these are extreme examples, but hopefully my point is made: if you know the link has been placed on the site purely for SEO purposes, then it's most likely unnatural.

Some links however are harder to spot, so here are my top tips for identifying lower quality links:

Whether the URL is indexed in Google or not:

If the page or domain isn't indexed, remove the links, as the site has most likely received a penalty itself. You can see whether a domain is indexed by searching Google for 'site:yourdomain.com'.
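
If you have a lot of linking domains to check, you can at least prepare the 'site:' queries in bulk. Querying Google automatically is against their terms of service, so this minimal sketch (assuming a domains.txt input file, one domain per line) just prints ready-made search URLs for you to open and eyeball manually:

    from urllib.parse import quote_plus

    # 'domains.txt' (one linking domain per line) is an assumed input file.
    with open('domains.txt', encoding='utf-8') as f:
        for domain in (line.strip() for line in f):
            if domain:
                # A ready-made 'site:' query to open and check in a browser.
                print('https://www.google.com/search?q=' + quote_plus('site:' + domain))
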
Page Authority and Domain Authority:

Don't pay too much attention to this metric. A new domain name can still be a very strong link even though it has low page and domain authority. A poor quality site can also still have good page authority and domain authority, so just be careful when looking at this.

Site-wide Links:

Site-wide links are generally associated with websites you have an affiliation with (friendly links), or links you've paid for. Because of this, it's better to make sure you don't have links across every page of someone's site. In this case, either nofollow the links, or remove them and only place them on one page. Ahrefs does a brilliant job of spotting site-wide links quickly and easily.
That being said, the same rules still apply: if the links look spammy, remove all of them.
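
If you've merged your exports into a master sheet, a quick count of links per referring domain will also surface site-wide link candidates. A minimal sketch, assuming the master_sheet.csv and 'Link URL' column from step 2:

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    # Count how many linking pages each referring domain contributes.
    counts = Counter()
    with open('master_sheet.csv', newline='', encoding='utf-8') as f:
        for row in csv.DictReader(f):
            counts[urlparse(row['Link URL']).netloc.lower()] += 1

    THRESHOLD = 50  # arbitrary cut-off; tune it to the size of your profile
    for domain, n in counts.most_common():
        if n >= THRESHOLD:
            print(f'{domain}: {n} linking pages - possible site-wide link')
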
Link Directories:
Link directories are fairly easy to spot: if they contain words like 'backlinks', 'links', 'seo' etc. in the URL, then the chances are they're low quality and need removing. If the directory lists every category under the sun, from Website Developers to Sunglasses, then it's most likely one you need to remove yourself from!
There are some good link directories on the internet; generally, these are targeted at a particular niche, are manually reviewed and are sometimes locally targeted. When you look at a link directory, ask yourself 'Will I ever receive any traffic from this site?' or 'Is this genuinely a valuable link?' If the answer is likely to be no, then the link should be removed. Be honest with yourself on this one: if it looks like spam, it most likely is.
If you want some more tips, here are a few bullet points:
Remove a link if:
  • The site is not indexed in Google (this would indicate a penalty).
  • The site automatically accepts links without any manual review.
  • The site has lots of spam links pointing to it (type the URL of the directory into Open Site Explorer and see what you can find!).
  • The site has categories for everything imaginable (cars, holidays, sunglasses, websites, hosting, dresses etc.).
  • The site is obviously part of a network where there are lots of related directories with similar names, similar looking websites etc.
  • The site contains keywords like 'best', 'links', 'seo' etc. in the name.
If the site shows 'Powered by phpLD' or 'PHP Link Directory' in the footer, it's most likely going to be a fairly spammy directory. That's not always the case, but nine times out of ten it's true.
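
To speed up the triage, you can flag directory-style domains in your master sheet by keyword. This is only a rough heuristic (expect false positives), so treat everything it flags as 'review manually', never 'remove automatically':

    import csv
    from urllib.parse import urlparse

    # Keyword list is a starting point only; extend it as you spot patterns.
    KEYWORDS = ('links', 'directory', 'seo', 'backlink', 'best')

    with open('master_sheet.csv', newline='', encoding='utf-8') as f:
        for row in csv.DictReader(f):
            url = row['Link URL']
            domain = urlparse(url).netloc.lower()
            if any(k in domain for k in KEYWORDS):
                print('Review:', url)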

During a recent link removal campaign, we managed to get a webmaster to take down a set of 20 link directories that were pointing to our client's website. They couldn't be bothered to remove the links from each site individually, so instead they took every site offline so everyone's links disappeared!

Forum Profiles:

These are usually very spammy. If you have the odd profile link on a forum that you're very active on, then this is generally fine. However, if you're listed on a forum in Romania (an example only!) and have never posted before, get the link removed.

If you're a member of the forum purely to get a link back to your website, then the link should be removed. It's a spam technique that's very easy to identify, so stay safe and remove them.

Blog Comments:

Similar to Forum Profiles, blog comments are an easy one to spot. If you've commented every now and again on a blog that you generally feel has helped you, or the blog is in your industry and you've added value to the discussion with your comment (rather than just posting a boring comment like 'Good work dude'), then it's probably ok.

That being said, if you've got a large number of blog comments with very little substance, you should remove all the links. If the site has hundreds and hundreds of comments and you're one of a huge list of spam comments, you should remove the link too.

Social Bookmarks:

Very similar to both blog comments and forum profiles, social bookmarks are ok if they're genuine. Remember that the penalties you've received are manual actions, and when you put in a reconsideration request, the chances are that a Google employee will manually be looking through some of your links. If your social bookmarks look spammy, remove them.

Paid Links:

If you've been paying for links, make sure you remove them or add a rel="nofollow" attribute to them. When you're writing your reconsideration request, mention the fact that you previously purchased links and have now rectified the issue by ensuring they've all been nofollowed or removed.

Google is getting much smarter at detecting advertorial links, so don't try to trick the system.

Blog Posts:

A slightly trickier one to detect straight off. You can usually spot spammy blog posts by looking at the URLs: if they're dynamic URLs ending in something like 'index.php?id=99', that's often a sign of a site that's been thrown up very quickly (there's a sketch after this list for flagging these in bulk). The best way to identify spam blog posts, though, is to load up each blog. Use these tell-tale signs to spot the low quality posts:
  • Does the article make sense? - Is it written in proper English, and do the sentences make sense? Or is it spun, low quality rubbish that benefits nobody?
  • Is the site nicely designed? - Does the website look like a genuine blog that's been looked after, cared for and regularly updated? Or is it using a standard template with an awful layout and content?
  • Is the website content unique? - You can use a website such as http://www.copyscape.com to find duplicate / spun content.
  • Is the article high quality? - Again, this is down to interpretation, but does the article provide any value, or is it there for the sake of being there? If it's high quality but just linked in the wrong way, ask the webmaster to add a nofollow attribute to the link.
  • Is the blog focused? - Does the blog post about shoes one day, software the next and medicine the day after? Does every post contain an outgoing link to a different website, run to roughly the same length and sit stuffed with anchor text for 'money keywords'? If so, the blog isn't focused and is most likely either automated or built purely for SEO purposes. Avoid these sites and remove the links.
  • Is there an easy way to contact the owners? - Lots of lower quality blogs will remove the contact form, or completely hide who is behind them. If you can't get in touch with the owner, it's likely to be low quality and the link should be removed.
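
The dynamic-URL pattern mentioned above ('index.php?id=99' and friends) is easy to flag in bulk. Here's a minimal sketch against the assumed master_sheet.csv; the regular expression is a rough heuristic, so use it to prioritise manual review rather than to make removal decisions:

    import csv
    import re

    # Flags dynamic-looking URLs such as 'index.php?id=99'.
    DYNAMIC = re.compile(r'\.(php|asp|aspx)\?(id|p|page)=\d+$', re.IGNORECASE)

    with open('master_sheet.csv', newline='', encoding='utf-8') as f:
        for row in csv.DictReader(f):
            if DYNAMIC.search(row['Link URL']):
                print('Possible low quality blog post:', row['Link URL'])
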
If the answer to any of the above questions is no, then have the links removed. Other things you can look at are as follows:
  • Are you using money terms as anchor text in the article? - This is ok as long as it's not overdone and as long as it's genuine. You have to use common sense when it comes to looking at blog posts. If it could be considered a spam article, remove it to stay on the safe side.
  • Are there lots of keywords stuffed into the article? - If your article reads like something that's been written for a search engine, remove it.
If the answers to the above are yes, then remove the articles.

Link Networks:


Link networks are bad news, but many 'SEO' agencies still use them. You can use some of the metrics gathered in the reports above to detect link networks. For example:
  • IP Address - If the IP address is the same across multiple different domains, it's a sign that you could be looking at a link network. You can run a quick whois search on the various domains to see who the owner is. We've carried out link removal campaigns on clients' websites where over 150 linking domains sat on the same IP address.
  • Whois Searches - By running a whois search on a domain name, you can see who the registered owner of the domain is (in most cases). If you find that a large number of domains pointing to your site are registered to the same owner, this should send danger signals.
  • Google Analytics IDs - Look at the source code of the sites and search for the Google analytics code. These follow the format UA-XXXXXXXX-X (where the X's are replaced with numbers). If you find the same UA- code being used on multiple sites, this shows signs of a link network.
  • Site Design - Does the site look exactly the same as other sites pointing to your domain? This can sometimes be the same links in the footer, a 'sponsored company' or even just the same look and layout. If so, it's possibly built by the same person and an example of a network.
Some of the tools listed further down in this article will help speed up the locating of link networks by automatically pulling in whois results, IP addresses and analytics details for each domain. You can then use their system, or software such as Excel, to filter through the data manually and spot offending links straight away.
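
Before those tools enter the picture, you can gather the raw signals yourself. This minimal sketch (assuming a domains.txt file of suspect domains, one per line) resolves each domain's IP address and pulls any Google Analytics UA- code from its homepage; the same IP or the same UA- code recurring across many domains is a classic link network footprint:

    import re
    import socket
    import urllib.request

    UA_RE = re.compile(r'UA-\d{4,10}-\d{1,4}')  # Google Analytics ID format

    with open('domains.txt', encoding='utf-8') as f:
        for domain in (line.strip() for line in f):
            if not domain:
                continue
            try:
                ip = socket.gethostbyname(domain)
                html = (urllib.request.urlopen('http://' + domain, timeout=10)
                        .read().decode('utf-8', 'ignore'))
                ua_codes = ', '.join(set(UA_RE.findall(html))) or '-'
                print(domain, ip, ua_codes)
            except Exception as exc:  # dead hosts are common in spam networks
                print(domain, 'error:', exc)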

Over-Optimised Anchor Text:

If you have articles out on the internet which just contain anchor text pointing at keywords you want to rank for in Google, then these need removing. There has been a lot of talk online about the correct 'anchor text ratio' of brand vs commercial anchor text terms, but I don't think you should approach SEO in this way.

Instead, focus on building your brand. If it's appropriate to link to your money keywords, or keywords surrounding the money terms, link to them. If you're trying to build your brand, you'll find that you link to your brand name and its URL more regularly, and you'll end up with a much more natural and organic link profile.

Tip: Use Eppie Vojt's Link Detective tool to see what your current link profile looks like. You'll be able to see very quickly which keywords you've been targeting too regularly.

If your anchor text only targets money terms, remove the links. Chances are, it's probably not the best article/content anyway. If it is, it's probably worthy of a money term link.
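
You can get a quick read on your anchor text distribution straight from one of your backlink exports. A minimal sketch, assuming an Ahrefs-style CSV with an 'Anchor Text' column (the file name and heading are assumptions):

    import csv
    from collections import Counter

    anchors = Counter()
    with open('ahrefs.csv', newline='', encoding='utf-8') as f:
        for row in csv.DictReader(f):
            anchors[(row.get('Anchor Text') or '').strip().lower()] += 1

    # Show the 20 most common anchors and their share of the profile.
    total = sum(anchors.values())
    for text, n in anchors.most_common(20):
        print(f'{text or "(empty)"}: {n} links ({n / total:.1%})')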

Link Exchanges:

This is a fairly old tactic these days, but many people still do it. Don't create a page on your site called 'links' and swap links with other people's 'links' pages. This creates a huge footprint of spammy link building and should be avoided.

Press Releases:

I still believe press releases are a great way to carry out SEO, but only if they're done correctly. If your press release is simply put out there and contains 3-5 links back to your site with a money keyword as the anchor text, you're doing it all wrong.

Press releases should be used to build your brand, to shout out to the world about what your company is doing, and only when you have something relevant to tell people. If you've just received investment, helped out your local community or had your best-ever year's profits, then write a really good quality press release that people find engaging and interesting. You can then include links to your website under its brand name or the website URL.

If you are adding money terms, I'd recommend adding a nofollow attribute in order to stay on the safe side, as you can bet this is an area Google will be targeting very soon. If you've got press releases out there which are blatantly spam with no real value, have them removed.

Malware:


A simple one. If the site contains malware, have your link removed. Alternatively, if it's a site that you know is hugely reputable, contact the owner and have them fix the problem as soon as possible.
The above section hopefully gives you an idea of how to spot poor quality links. It's not a definitive guide, but it covers some of the more common issues we come across when carrying out work for clients.

The biggest tip I can give is to use common sense. Deep down, you'll know if a link is worth having or not. Sometimes links are good, but they may need a nofollow attribute adding to them. So just work through the list systematically and put yourself in Google's shoes.

All you need to do is go through each link manually and make a note of whether it needs to be removed or not. If you consider a link safe, mark it so you know not to remove it. You could do this either with a colour marker, or just by adding a piece of text next to each link. If you've got a lot of colours going on, make sure to add a sheet at the bottom called 'Key' so that the Google employees can see what work you've carried out.

Tools for making your life easier...

There are a lot of tools available online that say they'll help to identify spammy links in bulk. Some of these tools are brilliant for speeding up analysis, but I wouldn't rely on them 100%. I've put some comments below on systems we've used in the past and my thoughts on using them.

Websites / Tools available for full link removal:


This is probably one of the better tools we've used over the past year. The detox tool assigns a risk score to each of your backlinks which it pulls from 22 different link sources. In addition to this, it will pull in lots of statistics about the links and will try to retrieve an email address / contact name for the owner of the offending website. Personally, I've found the contact finder to be brilliant and some of the additional metrics (such as IP address of the website and domain registrant) have helped hugely in discovering link networks. They still have a lot of work to do as far as the link scores are concerned. Sometimes, the 'Very Low Risk' links are obviously spammy and should be removed, so if you use this tool, you need to be very careful and double-check the links that you're removing. Overall, a very good system though.


I've never actually used Link Risk, but I've heard some good things about it. The team behind it are certainly very talented and know their stuff, but as I haven't used it myself, I can't really comment on the results.


A new tool on the block - Linkquidator offers a 30 day free trial with a limited number of backlinks. I have signed up to look at the tool, but haven't had a chance to give it a full test drive yet.


I've used Remove'em on a couple of campaigns we've done for clients as a tool to help us identify bad links. It did a fairly good job of uncovering bad links, but it didn't show me the safe links, so it's hard to see how many it may have missed. Personally, I found the interface clunky and the outreach to webmasters a very slow process; I'd opt to use email instead for speed.


Rmoov deliberately doesn't try to identify which links are classed as spammy. Instead, it helps to speed up the process of contacting webmasters by pulling contact information for each domain, allowing you to create and send emails, then following up each of the emails with reminders. The idea behind the system is brilliant: it will check the status of your links periodically and record each step of the process in a Google Doc.

Barry Schwartz from Search Engine Roundtable has done a brilliant write-up on link tools identifying toxic links. It really highlights that manual review is still necessary when using tools such as the above and you should never rely on them 100%.

Our agency, Pinpoint Designs, is currently working on a link removal system called Peel App, which we're hoping to launch in 2014. Whilst we still stress that manual research is required, we'll be trying to make our system identify spam links in bulk, as we feel there are still algorithms that can detect this better than some of the tools already on the market.

For More Info about Google Penalty Removal


Wednesday, October 16, 2013

How to Do Guest Blogging for SEO Purposes

Search engine optimization is evolving, with many changes following Google's Panda and Penguin updates. In simple terms, SEO is optimizing your website for search engines.

Google is the search engine that brings the most web traffic to your website. Google recognised that many websites practise black hat (illegitimate) SEO by means of techniques like keyword stuffing, doorway pages and link spamming. Thus Google introduced the Panda update to tackle content spam and the Penguin update to tackle link spam.

Link building is one of the most sought-after and effective off-page SEO techniques. One of the effective ways to build quality links is guest blogging. Guest blogging, and getting links from quality guest blogging sites, is a white hat SEO technique that does not lead to your site being penalized.

This article explains the step-by-step procedure for guest blogging.

Steps:

1. Determining your guest post domain: Firstly, you need to identify your web domain's niche topics. For example, if your website is about cars, you can post guest blogs covering the latest car models, car brands, car accessories and so on. List out the keywords or content areas around which you will pen your blogs.

2. Research: Now that you have your list of topics, search for popular blogs that are relevant to your domain space and allow guest blogging. This will give you a shortlist of relevant guest blogging sites.

3. Qualitative analysis: You need to determine the quality of the guest blogging sites you have collected with respect to their Google PageRank, traffic, activity status, member count and visibility in search engines.

4. Page rank analysis: Google rates websites based on their performance, backlinks and relevance, then ranks each website from 0 to 10, 10 being the highest PageRank. You can check the PageRank of a guest blogging site with tools like http://www.prchecker.info

5. Traffic analysis: Only if there is considerable traffic to the guest site will you be able to generate traffic to your website through your backlink. The traffic statistics for a guest post site can be checked using sites like www.compete.com

6. Activity status: Go for guest blogging sites that have recent activity and good readership. This can be determined by checking the dates of the most recent posts.

7. Member count: Select blogging sites that have a good member count. You can then follow relevant members or share your posts with them and create more visibility for your blog posts.

8. Posting guest blogs: After all the research, you should have zeroed in on a few guest blogging sites. First, register with these sites using the sign-up form. These sites offer a contact form or an email address through which you can send your content; after review and moderation, your post may be published.

Tips:

Always go for high PageRank sites, PR4 and above.

Keep your content informative: if you want to include some advertising or marketing material, keep the ratio at around 80% informational content to 20% marketing content.

Link back to a relevant page on your website, which need not necessarily be your index page: link to a page on your site that adds value to your blog topic.

Warnings:

Do not rewrite existing content: post only original content if you really wish to promote your brand name.

Do not include too many links to your website: use only one or two contextual links back to your site.

For More Info about Guest Blogging for SEO Purposes


Monday, October 14, 2013

How to Build a Website

How to Build a Website Step 1 - Hosting:

Hosting is where you put your website and all the Web pages. While it's possible to build a website on your personal computer and never move it online, it's somewhat pointless. No one but you will ever be able to see it. So the first thing you'll want to do is find a Web hosting provider.

There are several types of Web hosting options you can choose from.

Most people gravitate to free Web hosting without too much thought, but there can be drawbacks to free hosting: you don't always get as much space, you might be required to run the host's ads on your site, and there may be bandwidth limits. Be sure to read all the fine print before you put your website on a free Web host. I recommend using free hosting providers for testing Web pages and for personal pages.

How to Build a Website Step 2 - Do You Need a Domain Name?

You don't need a domain name to put up a website. You can put up a site on free hosting or even paid hosting plans without a domain name. A domain name provides extra branding for your site and makes it easier for people to remember the URL. But domain names cost money, typically between $8 and $35 a year.

How to Build a Website Step 3 - Plan Your Website:

Once you've gotten a domain and decided on your URL, you can start planning your site. You need to decide:
  • Type of site - Most websites are either news/information, product, or reference sites. As such they each have a slightly different focus.
  • Navigation - The navigation affects the information architecture of your site.
  • Content - Content is the actual pages you'll be building.
If you can recognize page types, you'll be able to recognize what types of pages you need for your site. Play the Web Page Types game.

How to Build a Website Step 4 - Build Your Website Page by Page:

Building a website requires that you work on one page at a time. To build your site you should be familiar with:
  • Design Basics - The elements of good design and how to use them on websites.
  • Learning HTML - HTML is the building block of a Web page. While it's not absolutely required, you'll do better if you learn HTML than if you don't.
  • Learning CSS - CSS is the building block of how pages look. And learning CSS will make it easier for you to change your site's look when you need to.
  • Web Page Editors - Finding the perfect editor for your needs will help you learn design, HTML, and CSS.

How to Build a Website Step 5 - Publish Your Website:

Publishing your website is a matter of getting the pages you created in step 4 up to the hosting provider you set up in step 1. You can do this with either the tools that come with your hosting service or with FTP clients. Knowing which you can use depends upon your hosting provider. Contact them if you are not sure.
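
If your host supports plain FTP and you'd rather script the upload than use a graphical client, Python's standard ftplib can push a page up for you. A minimal sketch; the hostname, credentials and directory are placeholders for whatever your provider gives you:

    from ftplib import FTP

    # Host, credentials and web root are placeholders - use the details
    # your hosting provider gives you.
    with FTP('ftp.example.com') as ftp:
        ftp.login(user='username', passwd='password')
        ftp.cwd('public_html')  # the web root; the name varies by host
        with open('index.html', 'rb') as f:
            ftp.storbinary('STOR index.html', f)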

How to Build a Website Step 6 - Promote Your Website:

The easiest way to promote your website is through search engine optimization, or SEO. You build your Web content so that it ranks well in search engines. This can be very difficult, but it is inexpensive and can deliver good results if you work at it.
Other ways to promote your site include word of mouth, email, and advertising. You should include your URL on all professional correspondence and, whenever it makes sense, in personal messages. I put my URL in my email signature along with my email address.

How to Build a Website Step 7 - Maintain Your Website:

Maintenance can be the most boring part of website design, but in order to keep your site going well and looking good, you need to do it. Testing your site as you're building it and then after it's been live for a while is important. And you should also work on content development on a regular basis.

For More Info about How to Build a Website


Friday, October 11, 2013

Facebook privacy : Users should check these settings as new changes roll out

Facebook users may be wary after hearing that the social network is following up on a promise to cut a privacy setting that kept user names out of the social network’s graph search.

Here’s a quick guide to what changes are coming and which settings to review as they hit users’ accounts.

What’s disappearing?  - The social network announced in December that it was retiring the option for users to control whether they show up when others type their name into the search bar, but said Thursday that it would finally be notifying those who use it that the setting will be removed. The option, which shows up in privacy settings as “Who can look up your timeline by name?” was already cut for people who weren’t using it.

How was that setting useful? -  It wasn’t perfect. It would not have stopped, for example, Facebook users from being able to access those protected profiles if those users had been tagged in a public post or picture. Still, it did help those users keep a lower profile on the social network, such as people trying to hide their profiles from abusive ex-partners or harassers.

For those who chose that option — a “small percentage” of its users, Facebook says — the change now allows anyone to type those names into Facebook’s search bar and see their profiles.

Can I replicate that option with other tools? -  No. The main feature of the setting was that people wouldn’t be able to find a user by name in the search bar, and there appears to be no way to keep that function. But there are still some settings that can lower a profile on the site.

Facebook is trying to encourage people to control their privacy on an item-by-item basis. So, whenever and however you post, you should be checking to see if what you’re putting up is for public view or just for friends or specific lists of friends. Also, consider turning on Timeline approval, which shows you what your friends may be posting about your location or whom you’re with. You can ask them to remove your name from those posts. Facebook has settings that let you review posts and photo tags before they’re posted to your Timeline. If privacy is a major concern, use these tools and don’t hesitate to ask other users to remove posts about you that make you uncomfortable.

It’s also a good idea to create group lists of friends so that you can share some posts exclusively with certain lists and exclude other groups from seeing those posts. That’s particularly important if you spend a lot of time on Facebook’s mobile app, where privacy tools can be harder to finesse.

What else can I do for privacy? -  Another key option in the privacy settings menu is one that lets users stop search engines from linking to their timelines. That will at least cut down on the chance that someone looking for you outside the social network will be able to find your profile.

If your whereabouts or similar information are sensitive, particularly if it’s a safety issue, you should be very aware of locations on your posts — no check-ins — and be careful about writing posts that give clues about where you are.

Users should also remember that they can always block specific users from seeing their Facebook page or from contacting them, but this is more of a reactive step than a proactive one. Plus, just as you could alter your name (yes, in violation of Facebook’s guidelines) to hide your identity, so could anyone who is looking for you.

If you’re concerned about past posts, Facebook has a setting that lets you limit the audience for posts and information that are already on your profile. You can also go to the “Activity Log” on your timeline to get an action-by-action view of how your activity shows up on the site.

And finally, as Facebook itself makes clear, remember that “things you hide from your timeline still appear in news feed, search and other places on Facebook.” These include some things you just can’t hide, namely profile pictures and cover photos, but also some news feed activity.

There’s been much discussion over how much social networks should expect users to fine-tune their settings in order to reach the level of privacy they want. Facebook has been evolving its settings to be more granular, which can be good for thinking about privacy at a more real-time pace. But it also means that those for whom privacy is a major concern have to set broader controls themselves.

For More Info about Facebook privacy


Monday, October 07, 2013

No More PageRank Updates This Year

Google’s head of search spam, Matt Cutts, said on Twitter yesterday that Google won’t be pushing out a new Google Toolbar PageRank update this year.

Niels Bosch asked on Twitter if we should expect an update to PageRank before 2014. In response to that, Matt Cutts said, “I would be surprised if that happened.”

It has now been 8 months since the last Google Toolbar PageRank update. In fact, the last update was on February 4, 2013 and honestly, I think PageRank is finally dead – at least the Toolbar PageRank.

I’d be surprised if there was another Toolbar PageRank update ever. Maybe Google will do one next year and then let it quietly go away forever.

Over the years, support for PageRank has dropped. Google never offered a Google Toolbar for Chrome or any add-on to show PageRank values. Google dropped the Google Toolbar for Firefox in June 2011. Internet Explorer is the last browser to still have a PageRank display offered by Google, but the data that flows into that display hasn’t been updated for over six months.

When we asked Google two months ago about when the next update would come, Google had no comment.

Earlier this year, Google said that PageRank in the toolbar wouldn’t be going away:



Update: This afternoon, Matt Cutts posted this timely video on PageRank:



For more on this topic, check out our Google PageRank Guide.

For more info about No More PageRank Updates This Year


The Penguin 2.1 Spam-Filtering Algorithm


The fifth confirmed release of Google’s “Penguin” spam fighting algorithm is live. That makes it Penguin 5 by our count. But since this Penguin update is using a slightly improved version of Google’s “Penguin 2” second-generation technology, Google itself is calling it “Penguin 2.1.” Don’t worry. We’ll explain the numbering nonsense below, as well as what this all means for publishers.

Previous Updates

Here are all the confirmed releases of Penguin to date:
  • Penguin 1 on April 24, 2012 (impacting around 3.1% of queries)
  • Penguin 2 on May 26, 2012 (impacting less than 0.1%)
  • Penguin 3 on October 5, 2012 (impacting around 0.3% of queries)
  • Penguin 4 (AKA Penguin 2.0) on May 22, 2013 (impacting 2.3% of queries)
  • Penguin 5 (AKA Penguin 2.1) on Oct. 4, 2013 (impacting around 1% of queries)

Why Penguin 2.1 AND Penguin 5?

If us talking about Penguin 5 in reference to something Google is calling Penguin 2.1 hurts your head, believe us, it hurts ours, too. But you can pin that blame back on Google. Here’s why.

When Google started releasing its “Panda” algorithm designed to fight low-quality content, it called the first one simply “Panda.” So when the second came out, people referred to that as “Panda 2.” When the third came out, people called that Panda 3 — causing Google to say that the third release, because it was relatively minor, really only should be called Panda 2.1 — the “point” being used to indicate how minor a change it was.

Google eventually — and belatedly — indicated that a Panda 3 release happened, causing the numbering to move into Panda 3.0, Panda 3.1 and so on, until there had been so many “minor” updates that we had to resort to going further out in decimal places, to things like Panda 3.92.

That caused us here at Search Engine Land to decide it would be easier all around if we just numbered any confirmed update sequentially, in order of when it came. No matter how “big” or “small” an update might be, we’d just give it the next number on the list: Penguin 1, Penguin 2, Penguin 3 and so on.

That worked out fine until Penguin 4, because Google typically didn’t give these updates numbers itself. It just said there was an update, and left it to us or others to attach a number to it.

But when Penguin 4 arrived, Google really wanted to stress that it was using what it deemed to be a major, next-generation change in how Penguin works. So, Google called it Penguin 2, despite all the references to a Penguin 2 already being out there, and despite the fact it hadn’t really numbered many of these various updates before.

Today’s update, as can be seen above, has been dubbed Penguin 2.1 — so supposedly, it’s a relatively minor change to the previous Penguin filter. However, if it’s impacting around 1 percent of queries as Google says, it is more significant than what Google considered to be the similarly “minor” updates Penguin 1.1 and Penguin 1.2.
What Is Penguin Again? And How Do I Deal With It?

For those new to the whole “Penguin” concept, Penguin is a part of Google’s overall search algorithm that periodically looks for sites that are deemed to be spamming Google’s search results but somehow still ranking well. In particular, it goes after sites that may have purchased paid links.

If you were hit by Penguin, you’ll likely know if you see a marked drop in traffic that begins today or tomorrow. To recover, you’ll need to do things like disavow bad links or manually have those removed. Filing a reconsideration request doesn’t help, because Penguin is an automated process. Until it sees that what it considers to be bad has been removed, you don’t recover.

If you were previously hit by Penguin and have taken actions meant to fix that, today and tomorrow are the days to watch. If you see an improvement in traffic, that’s a sign that you’ve escaped Penguin.

Here are previous articles with more on Penguin recovery and how it and other filters work as part of the ranking system.

What About Hummingbird?

If you’re wondering about how Penguin fits into that new Google Hummingbird algorithm  you may have heard about, think of Penguin as a part of Hummingbird, not as a replacement for it.

Hummingbird is like Google’s entire ranking engine, whereas Penguin is like a small part of that engine, a filter that is removed and periodically replaced with what Google considers to be a better filter to help keep out bad stuff.



Tags: Penguin 2.1, Penguin 2.1 Spam-Filtering Algorithm, Hummingbird algorithm, Google algorithm updates, Google algorithm, Nilesh Patel, SEO services provider, best information of the world, Nilesh Patel forum, fans of photography, Nilesh Patel SEO

Thursday, October 03, 2013

DEFINITIONS

What is SEO?
SEO (Search Engine Optimization) is an internet marketing concept whereby you improve your Web site's design and content to achieve higher rankings in search engines. Search engine optimization is also known as search engine positioning and search engine placement.

What is Satisficing? 
Satisficing is a cross between 'satisfying' and 'sufficing.' It refers to the fact that when human beings are presented with numerous choices, we usually select the first reasonable option, rather than the best option available.

What is Scalability?
Scalability is the ability of a computer application or product (hardware or software) to continue to function well as it (or its context) is changed in size or volume in order to meet a user need.

What is a Screensaver?
A screensaver is a program or animated image that is activated on a personal computer display when no user activity has been sensed for a certain time.

What is a Search Engine?
A search engine is a set of programs that includes:
  • a spider (crawler, bot or robot) that crawls the Internet retrieving Web documents whose owners want them to be searchable, then follows hypertext links to retrieve other Web documents;
  • a program that creates an index from the documents retrieved by the spider; and
  • a program that receives your search request, compares it to the entries in the index, and returns results to you.
What is Search Engine Ranking?
Search engine ranking is a measure of a Web site's visibility in a search engine.

What is Search Engine Submission?
Search engine submission or registration is the act of registering a site with the search engines, so that it is searchable by search engine users.

What is a Search Term?
A search term or search phrase is a word or phrase that people enter into search engine and Web directory search forms.

What is Section 508?
Section 508 is a federal mandate requiring that information technology be made accessible to people with disabilities. Much of Section 508 compliance concerns making Web sites, intranets, and web-enabled applications accessible.

What is a Secure Server?
A secure server is a web server that uses encryption technology to prevent non-authorized users from intercepting and reading sensitive messages sent via the Internet.

What is Sell Through Rate
Sell through rate is the percentage of ad inventory sold, excluding traded or bartered inventory.

What is SERPs?
SERPs (Search Engine Results Pages) are the pages of search results returned by a search engine or web directory in response to a search query.

What is a Server?
A server is a computer program that provides services to other computer programs in the same or other computers. For example, email, FTP, Usenet, and HTTP connections.

What is a Session Cookie?
A session cookie is a temporary cookie placed into a computer's RAM for use during the user's visit to the Web site. When the user exits the site, the cookie is removed.

What is Settlement?
Settlement is the process by which a merchant and a cardholder exchange financial data resulting from a transaction.

What is SFA?
SFA (Sales Force Automation) is a means of increasing your sales team's efficiency and effectiveness using technology to help automate, organize and track the sales process.

What is Shareware?
Shareware is software that is distributed free, on a trial basis, with the understanding that the user pays a fee if they decide to keep using it. Sometimes shareware programs are 'lite' versions of a program with certain functions disabled as an enticement to buy the complete version of the program.

What is Shockwave?
Macromedia's Shockwave is a program for viewing interactive media on the Web.

What is a Shopping Cart?
A shopping cart is an interactive online catalog where a user can view, add and remove items to and from a cart. 

What is a Signature?
A signature or sig file is a short block of text at the end of emails, often used as virtual business cards. 

What is Signed Volume?
Signed volume is projected annual volume in sales for a new merchant. This figure is used to track the risk of a merchant account. 

What is a Site Map?
A site map or sitemap is a visual model of a Web site's content that allows the users to navigate through the site to find the information they are looking for. Typically, site maps are organized hierarchically, breaking down the Web site's information into increasingly specific subject areas. 

What is Site Popularity?
Site popularity refers to click through popularity, and how long visitors remain at the site after getting there.

What is Site Search?
Site search is a search engine or application that searches a single web site, as opposed to the whole World Wide Web.

What is a Skyscraper Ad?
A skyscraper ad is a tall online banner ad, usually displayed at the side of a page. Dimensions include skyscraper (120 x 600 pixels) and wide skyscraper (160 x 600 pixels).

What is a SLA?
A SLA (Service Level Agreement) is a commitment to a merchant by the merchant account provider for delivery of services.

What is SLD?
SLD (Second-Level Domain) is the portion of an Internet address that identifies the specific and unique administrative owner associated with an IP address. The second-level domain name includes the TLD (top-level domain) name. For example, in ECommerce-Dictionary.com, 'ECommerce-Dictionary' is a second-level domain. 'ECommerce-Dictionary.com' is a second-level domain name and includes the top-level domain name of 'com'.

What is a Slotting Fee?
A slotting fee is a fee charged by sites to advertisers for premium ad positions on their site.

What is SMTP?
SMTP (Simple Mail Transfer Protocol) is a method computers use to send and receive email.

What is Sniffer?
Sniffer is a software program that detects a user's browser capabilities, such as bandwidth, JavaScript, plugins and screen resolution.

What is SOHO?
SOHO (Small Office Home Office) is a term used for the small office or home office environment.

What is Spam?
Spam typically refers to unsolicited junk email on the Internet. Historically, it meant an electronic message sent to a large group of people when such messages are prohibited or discouraged.

What is a Spider?
A spider (also known as a robot, crawler or bot) is a program that crawls the Internet retrieving Web documents whose owners want them to be searchable, then follows links found within each document to retrieve other Web documents.

What is a Splash Page?
A splash page is the first page of a site, shown before the home page, used to capture the visitor's attention for a short time as a promotion or lead-in to the proper home page.

What is SPM?
SPM (Sales Process Management) is the front-end evaluation and assessment of a company's sales process. Innovative diagnostics benchmark current processes, identify areas of improvement, and codify best sales practices. Through SPM, effective sales processes are put into place that enhance productivity and help generate and increase revenue.

What is a Sponsored Feature Listing?
A sponsored feature listing is a paid text ad displayed at the top of a Web page, before the search results, in search engines.

What is SQL?
SQL (Structured Query Language) is a standard interactive and programming language for getting information from and updating a database.

What is SSI?
SSI (Server-Side Include) is a variable value that a server can include in an HTML file before it sends it to the browser.

What is SSL?
SSL (Secure Socket Layer) is a commonly-used protocol for managing the security of a message transmitted via the Internet.

What is a Stealth Script?
A stealth script is a CGI script that switches page content depending on who or what is accessing the page.

What is Stickiness?
Stickiness is a measure of a user's loyalty to a site, measured in the amount of time spent over a given time period.

What is a Stock Photo?
A stock photo is a copyrighted photographic image that can be licensed for use in a Web site.

What is a Super Affiliate?
A super affiliate is a top performing affiliate, who is capable of generating a significant percentage of an affiliate program's sales.

What is a Superstitial?
A superstitial is a rich media ad format developed by Unicast that combines Flash or other animation technology with Java programming to deliver video-like Web commercials.

What is a Surround Session?
A surround session is a concept whereby ads follow readers as they link from page to page, deploying ads from only one advertiser during the entire visit.

 

SEO advice: url canonicalization

Before I start collecting feedback on the Bigdaddy data center, I want to talk a little bit about canonicalization, www vs. non-www, redirects, duplicate urls, 302 “hijacking,” etc. so that we’re all on the same page.

Q : What is a canonical url? Do you have to use such a weird word, anyway?
A : Sorry that it’s a strange word; that’s what we call it around Google. Canonicalization is the process of picking the best url when there are several choices, and it usually refers to home pages. For example, most people would consider these the same urls:
  • www.example.com
  • example.com/
  • www.example.com/index.html
  • example.com/home.asp
But technically all of these urls are different. A web server could return completely different content for all the urls above. When Google “canonicalizes” a url, we try to pick the url that seems like the best representative from that set.

Q : So how do I make sure that Google picks the url that I want?
A : One thing that helps is to pick the url that you want and use that url consistently across your entire site. For example, don’t make half of your links go to http://example.com/ and the other half go to http://www.example.com/ . Instead, pick the url you prefer and always use that format for your internal links.

Q : Is there anything else I can do?
A : Yes. Suppose you want your default url to be http://www.example.com/ . You can configure your webserver so that if someone requests http://example.com/, it does a 301 (permanent) redirect to http://www.example.com/ . That helps Google know which url you prefer to be canonical. Adding a 301 redirect can be an especially good idea if your site changes often (e.g. dynamic content, a blog, etc.).
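
The redirect itself normally lives in your web server's configuration rather than in application code, but as a minimal sketch of the behaviour, here's a tiny Python WSGI app that 301-redirects any non-canonical hostname to the www version (the hostnames are examples):

    from wsgiref.simple_server import make_server

    CANONICAL_HOST = 'www.example.com'  # the hostname you have chosen to keep

    def app(environ, start_response):
        host = environ.get('HTTP_HOST', '').split(':')[0]
        if host != CANONICAL_HOST:
            # Permanent redirect, e.g. example.com -> www.example.com
            location = 'http://' + CANONICAL_HOST + environ.get('PATH_INFO', '/')
            start_response('301 Moved Permanently', [('Location', location)])
            return [b'']
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [b'Served from the canonical hostname.']

    if __name__ == '__main__':
        make_server('', 8000, app).serve_forever()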

Q : If I want to get rid of domain.com but keep www.domain.com, should I use the url removal tool to remove domain.com?
A : No, definitely don’t do this. If you remove one of the www vs. non-www hostnames, it can end up removing your whole domain for six months. Definitely don’t do this. If you did use the url removal tool to remove your entire domain when you actually only wanted to remove the www or non-www version of your domain, do a reinclusion request and mention that you removed your entire domain by accident using the url removal tool and that you’d like it reincluded.

Q : I noticed that you don’t do a 301 redirect on your site from the non-www to the www version, Matt. Why not? Are you stupid in the head?
A : Actually, it’s on purpose. I noticed that several months ago but decided not to change it on my end or ask anyone at Google to fix it. I may add a 301 eventually, but for now it’s a helpful test case.

Q : So when you say www vs. non-www, you’re talking about a type of canonicalization. Are there other ways that urls get canonicalized?
A : Yes, there can be a lot, but most people never notice (or need to notice) them. Search engines can do things like keeping or removing trailing slashes, trying to convert urls with upper case to lower case, or removing session IDs from bulletin board or other software (many bulletin board software packages will work fine if you omit the session ID).
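
Most of those transformations are easy to express in code. Here's a minimal Python sketch of the kinds of normalisation described above; the session ID parameter names are assumptions, so adjust them for whatever software you run:

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    SESSION_PARAMS = {'sid', 'sessionid', 'phpsessid'}  # assumed names

    def canonicalize(url):
        """A rough sketch of the normalisations described above."""
        scheme, netloc, path, query, _ = urlsplit(url)
        netloc = netloc.lower()  # hostnames are case-insensitive
        path = path or '/'       # bare hosts get a trailing slash
        # Drop session IDs so one page doesn't look like many urls.
        params = [(k, v) for k, v in parse_qsl(query)
                  if k.lower() not in SESSION_PARAMS]
        return urlunsplit((scheme, netloc, path, urlencode(params), ''))

    print(canonicalize('HTTP://WWW.Example.COM?PHPSESSID=abc123'))
    # -> http://www.example.com/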

Q : Let’s talk about the inurl: operator. Why does everyone think that if inurl:mydomain.com shows results that aren’t from mydomain.com, it must be hijacked?
A : Many months ago, if you saw someresult.com/search2.php?url=mydomain.com, that would sometimes have content from mydomain. That could happen when the someresult.com url was a 302 redirect to mydomain.com and we decided to show a result from someresult.com. Since then, we’ve changed our heuristics to make showing the source url for 302 redirects much more rare. We are moving to a framework for handling redirects in which we will almost always show the destination url. Yahoo handles 302 redirects by usually showing the destination url, and we are in the middle of transitioning to a similar set of heuristics. Note that Yahoo reserves the right to have exceptions on redirect handling, and Google does too. Based on our analysis, we will show the source url for a 302 redirect less than half a percent of the time (basically, when we have strong reason to think the source url is correct).

Q : Okay, how about supplemental results. Do supplemental results cause a penalty in Google?
A : Nope.

Q : I have some pages in the supplemental results that are old now. What should I do?
A : I wouldn’t spend much effort on them. If the pages have moved, I would make sure that there’s a 301 redirect to the new location of pages. If the pages are truly gone, I’d make sure that you serve a 404 on those pages. After that, I wouldn’t put any more effort in. When Google eventually recrawls those pages, it will pick up the changes, but because it can take longer for us to crawl supplemental results, you might not see that update for a while.
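
It's worth verifying that your old pages really do return the 301 or 404 you intend. A minimal Python sketch, assuming an old_urls.txt file with one URL per line; it reports each URL's raw status code without following redirects:

    import urllib.request
    from urllib.error import HTTPError

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # surface 301/302 as errors instead of following them

    opener = urllib.request.build_opener(NoRedirect)
    with open('old_urls.txt', encoding='utf-8') as f:
        for url in (line.strip() for line in f):
            if not url:
                continue
            try:
                code = opener.open(url, timeout=10).getcode()
            except HTTPError as err:
                code = err.code  # 301, 404 etc. arrive here
            print(code, url)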

That’s about all I can think of for now. I’ll try to talk about some examples of 302’s and inurl: soon, to help make some of this more concrete.

For More info about  url canonicalization