Thursday, September 26, 2013

The New Google “Hummingbird” Algorithm

Google has a new search algorithm, the system it uses to sort through all the information it has when you search and come back with answers. It’s called “Hummingbird,” and below is what we know about it so far.

What’s a “search algorithm?”

That’s a technical term for what you can think of as a recipe that Google uses to sort through the billions of web pages and other information it has, in order to return what it believes are the best answers.

What’s “Hummingbird?”

It’s the name of the new search algorithm that Google is using, one that Google says should return better results.

So that “PageRank” algorithm is dead?

No. PageRank is one of over 200 major “ingredients” that go into the Hummingbird recipe. Hummingbird looks at PageRank — how important links to a page are deemed to be — along with other factors like whether Google believes a page is of good quality, the words used on it and many other things (see our Periodic Table Of SEO Success Factors for a better sense of some of these).
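To make the “recipe” idea concrete, here is a minimal, purely illustrative Python sketch of blending many signals into one score, with PageRank as just one weighted ingredient. The signal names, values and weights are invented; Google does not publish its actual signals or their weighting.

```python
# Hypothetical illustration only: the signal names, values and weights below
# are invented. Google publishes neither its ~200 signals nor their weighting.
def score_page(signals: dict, weights: dict) -> float:
    """Blend many ranking signals into a single score (a weighted sum)."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

page_signals = {"pagerank": 0.62, "content_quality": 0.80, "query_term_match": 0.55}
weights = {"pagerank": 0.3, "content_quality": 0.4, "query_term_match": 0.3}

print(round(score_page(page_signals, weights), 3))  # 0.671
```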

Why is it called Hummingbird?

Google told us the name comes from being “precise and fast.”

When did Hummingbird start? Today?

Google started using Hummingbird about a month ago, it said. Google only announced the change today.

What does it mean that Hummingbird is now being used?

Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one. It also did this so quickly that no one really noticed the switch.

When’s the last time Google replaced its algorithm this way?

Google struggled to recall when any type of major change like this last happened. In 2010, the “Caffeine Update” was a huge change. But that was also a change mostly meant to help Google better gather information (indexing) rather than sorting through the information. Google search chief Amit Singhal told me that perhaps 2001, when he first joined the company, was the last time the algorithm was so dramatically rewritten.

What about all these Penguin, Panda and other “updates” — haven’t those been changes to the algorithm?

Panda, Penguin and other updates were changes to parts of the old algorithm, but not an entire replacement of the whole. Think of it again like an engine. Those things were as if the engine received a new oil filter or had an improved pump put in. Hummingbird is a brand-new engine, though it continues to use some of the same parts of the old, like Penguin and Panda.

The new engine is using old parts?

Yes. And no. Some of the parts are perfectly good, so there was no reason to toss them out. Other parts are constantly being replaced. In general, Hummingbird — Google says — is a new engine built on both existing and new parts, organized in a way to especially serve the search demands of today, rather than one created for the needs of ten years ago, with the technologies back then.

What type of “new” search activity does Hummingbird help?

“Conversational search” is one of the biggest examples Google gave. When people speak their searches, they may find it more useful to have a conversation.

“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.

Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.

In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
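As a rough illustration of the difference between matching words and matching meaning, here is a toy Python sketch. The synonym table and scoring are invented for illustration only and say nothing about how Hummingbird actually works.

```python
# Toy contrast between literal word matching and crude "meaning" matching.
# The synonym table and scoring are invented; Hummingbird's internals are not public.
SYNONYMS = {"place": {"store", "shop", "location"}, "buy": {"purchase", "sell"}}

def expand(words):
    """Add known synonyms to a bag of words."""
    expanded = set(words)
    for w in words:
        expanded |= SYNONYMS.get(w, set())
    return expanded

def keyword_score(query, page):
    return len(set(query) & set(page))        # literal overlap only

def meaning_score(query, page):
    return len(expand(query) & expand(page))  # overlap after synonym expansion

query = ["place", "buy", "iphone"]
page = ["store", "sell", "iphone"]
print(keyword_score(query, page))  # 1 -- only "iphone" matches literally
print(meaning_score(query, page))  # 3 -- "place"/"store" and "buy"/"sell" now match
```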

I thought Google did this conversational search stuff already!

It does (see Google’s Impressive “Conversational Search” Goes Live On Chrome), but it had really only been doing it within its Knowledge Graph answers. Hummingbird is designed to apply the meaning technology to billions of pages from across the web, in addition to Knowledge Graph facts, which may bring back better results.

Does it really work? Any before-and-afters?

We don’t know. There’s no way to do a “before-and-after” ourselves, now. Pretty much, we only have Google’s word that Hummingbird is improving things. However, Google did offer some before-and-after examples of its own that it says show Hummingbird improvements.

A search for “acid reflux prescription” used to list a lot of drugs (such as this, Google said), which might not necessarily be the best way to treat the disease. Now, Google says, results have information about treatment in general, including whether you even need drugs, such as this as one of the listings.

A search for “pay your bills through citizens bank and trust bank” used to bring up the home page for Citizens Bank but now should return the specific page about paying bills.

A search for “pizza hut calories per slice” used to list an answer like this, Google said, but not one from Pizza Hut. Now, it lists this answer directly from Pizza Hut itself, Google says.

Could it be making Google worse?

Almost certainly not. While we can’t say that Google’s gotten better, we do know that Hummingbird — if it has indeed been used for the past month — hasn’t sparked any wave of consumers complaining that Google’s results suddenly got bad. People complain when things get worse; they generally don’t notice when things improve.

Does this mean SEO is dead?

No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.

Does this mean I’m going to lose traffic from Google?

If you haven’t in the past month, well, you came through Hummingbird unscathed. After all, it went live about a month ago. If you were going to have problems with it, you would have known by now.

By and large, there’s been no major outcry among publishers that they’ve lost rankings. This seems to support Google saying this is very much a query-by-query effect, one that may improve particular searches — particularly complex ones — rather than something that hits “head” terms that can, in turn, cause major traffic shifts.

But I did lose traffic!

Perhaps it was due to Hummingbird, but Google stressed that it could also be due to some of the other parts of its algorithm, which are always being changed, tweaked or improved. There’s no way to know.

How do you know all this stuff?

Google shared some of it at its press event today, and then I talked with two of Google’s top search execs, Amit Singhal and Ben Gomes, after the event for more details. I also hope to do a more formal look at the changes from those conversations in the near future. But for now, hopefully you’ve found this quick FAQ based on those conversations to be helpful.


Tags: hummingbird algorithm, google algorithm updates, hummingbird, search algorithm, google algorithm, nilesh patel, seo services provider, best information of the world, nilesh patel forum, fans of photography, nilesh patel seo

Google unveils major upgrade to search algorithm

Google has unveiled an upgrade to the way it interprets users' search requests.
 
The new algorithm, codenamed Hummingbird, is the first major upgrade for three years.  

It has already been in use for about a month, and affects about 90% of Google searches. 
 
At a presentation on Thursday, the search giant was short on specifics but said Hummingbird is especially useful for longer and more complex queries.
 
Google stressed that a new algorithm is important as users expect more natural and conversational interactions with a search engine - for example, using their voice to speak requests into mobile phones, smart watches and other wearable technology.
 
Hummingbird is focused more on ranking information based on a more intelligent understanding of search requests, unlike its predecessor, Caffeine, which was targeted at better indexing of websites.  

It is more capable of understanding concepts and the relationships between them rather than simply words, which leads to more fluid interactions. In that sense, it is an extension of Google's "Knowledge Graph" concept, introduced last year, aimed at making interactions more human.
 
In one example, shown at the presentation, a Google executive showed off a voice search through her mobile phone, asking for pictures of the Eiffel Tower. After the pictures appeared, she then asked how tall it was. After Google spoke back the correct answer, she asked "show me pictures of the construction" - at which point a list of images appeared.

Big payoffs?
 
However, one search expert cautioned that it was too early to determine Hummingbird's impact. "For me this is more of a coming out party, rather than making me think 'wow'," said Danny Sullivan, founder of Search Engine Land.
 
"If you've been watching this space, you'd have already seen how they've integrated it into the [predictive search app] Google Now and conversational search.
 
"To know that they've put this technology further into their index may have some big payoffs but we'll just have to see how it plays out," Mr Sullivan said.
 
The news was announced at an intimate press event at the Silicon Valley garage where founders Sergey Brin and Larry Page worked on the launch of the search engine, which is fifteen years old on Friday.
 
At the event, the search behemoth also announced an updated search app on Apple's iOS, as well as a more visible presence for voice search on its home page.
 

Tags: google algorithm updates, hummingbird, search algorithm, google algorithm, nilesh patel seo services provider, best information of the world, nilesh patel forum, fans of photography, nilesh patel seo

Signature-Allowed Forum Posting Sites List 2013

http://forums.digitalpoint.com
http://www.sitepoint.com/forums
http://www.webmasterforums.com
http://www.allcoolforum.com
http://www.warriorforum.com
http://forums.webicy.com
http://thehyipforum.com
http://www.webmasterforumsonline.com
http://www.webmasters.am/forum
http://nileshpatel.forumotion.com
http://www.webmasterforums.net
http://www.devhunters.com
http://www.webmaster-forum.net
http://www.geekvillage.com/forums
http://www.zymic.com/forum
http://www.webmastershelp.com
http://www.webmasterdesk.org
http://www.webmasterground.com
http://developers.evrsoft.com/forum
http://www.websitebabble.com
http://www.talkingcity.com
http://www.australianwebmaster.com
http://www.wtricks.com
http://www.forums.webzonetalk.com
http://www.htmlforums.com
http://www.searchbliss.com/forum
http://www.webmasterize.com
http://www.webmasterserve.com
http://www.freehostforum.com
http://www.seorefugee.com/forums
http://www.cre8asiteforums.com/forums
http://forums.delphiforums.com
http://www.web-mastery.net
http://www.webproworld.com
http://www.bzimage.org
http://www.v7n.com/forums
http://www.dnforum.com
http://www.webcosmoforums.com
http://www.affiliateseeking.com/forums
http://www.webmaster-forums.net
http://www.geekpoint.net
http://www.smallbusinessforums.org
http://forums.ukwebmasterworld.com
http://www.experienceadvertising.com/forum
http://opensourcephoto.net/forum
http://forums.seochat.com
http://forums.searchenginewatch.com
http://www.ihelpyou.com/forums
http://www.businesss-forum.com
http://www.9mb.com
http://acapella.harmony-central.com/forums
http://forums.seroundtable.com
http://forums.comicbookresources.com
http://www.acorndomains.co.uk
http://forums.onlinebookclub.org
http://www.ableton.com/forum
http://www.davidcastle.org/BB
http://www.webtalkforums.com
http://www.bloggertalk.com/forum.php
http://paymentprocessing.cc
http://www.directoryjunction.com/forums
http://www.internetmarketingforums.net
http://forum.joomla.org
http://loanofficerforum.com/forum
http://iq69.com/forums
http://forum.hot4s.com.au
http://forums.mysql.com
http://forums.amd.com/forum
http://forums.cnet.com
http://seotalk.me
https://www.computerbb.org
http://forum.vbulletinsetup.com
http://www.irishwebmasterforum.com
http://www.app-developers.com
http://forums.stuffdaily.com
http://forums.seo.com
http://www.webdigity.com
http://www.inboundlinksforum.com
http://forums.gentoo.org
http://ubuntuforums.org
http://forum.textpattern.com
http://talk.iwebtool.com
http://www.frogengine.com/forum
http://www.capitaltheory.com
http://www.smsbucket.com/forums/

Tags: signature allowed forums, Nilesh Patel, SEO Services Provider, Best Information of The World, Nilesh Patel Forum, Fans of Photography, Nilesh Patel SEO

Monday, September 23, 2013

On Site SEO Vs Off Site SEO Strategies

While there are advocates on each side, the most successful SEO strategy is one that incorporates both on-site AND off-site techniques.

Let’s take a look at exactly what each category means via handy checklists:

On-Site SEO

On-site SEO is exactly what it sounds like: SEO that can be done on your site without exterior elements (a rough audit sketch follows the checklist below). This can include the use of:
  • Title Tags and Meta Tags (Keyword, Description)
  • H1 Tags
  • Alt Tag optimization
  • Layout and Design Elements
  • Keyword Research
  • Keyword Density
  • Keyword-Formatted URLs
  • Fresh Relevant Content
  • XML Sitemap
  • Robots.txt file
  • 301 redirects
  • Loading Time Control (via HTTP compression or other method)
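As a practical starting point, here is a small Python sketch, using only the standard library, that fetches a page and reports a few of the checklist items above: the title tag, the meta description, H1 tags and images missing alt text. The URL is a placeholder, and a real audit would check many more factors.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class OnPageAudit(HTMLParser):
    """Collects a few on-site elements: <title>, meta description,
    H1 tags, and images with missing or empty alt text."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1s = []
        self.imgs_missing_alt = 0
        self._in = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._in = tag
            if tag == "h1":
                self.h1s.append("")
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content") or ""
        elif tag == "img" and not attrs.get("alt"):
            self.imgs_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1" and self.h1s:
            self.h1s[-1] += data

# Placeholder URL -- point this at your own site.
html = urlopen("https://example.com").read().decode("utf-8", "replace")
audit = OnPageAudit()
audit.feed(html)
print("Title:", audit.title.strip())
print("Meta description:", audit.meta_description)
print("H1 tags:", [h.strip() for h in audit.h1s])
print("Images missing alt text:", audit.imgs_missing_alt)
```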

Off-Site SEO

On the flipside, off-site SEO is equally as self-explanatory; it deals with all the elements of SEO-based marketing that are done outside of your site. These elements can include:
  • Directory Submission
  • Bookmarking Submission
  • Search Engine Submission
  • Article Submissions
  • Press Release Submission
  • Classified Submission
  • Forum Posting
  • Optimized Video Submissions
  • Guest Posting 
  • E-Mail Marketing
  • Social Media Marketing (Facebook, Twitter)
  • Commenting on related sites, forums and blogs
  • Keyword analysis
  • Competitor analysis
  • RSS Feeds
  • Web 2.0 marketing
  • Paid linking
  • Posting on Q&A sites
  • Setting up Google Webmaster Tools
  • Setting up Google Analytics
  • Trackbacks

On-site AND off-site: Together as One

When thinking about on-site and off-site SEO marketing you should be thinking about creating a unified marketing strategy.

Remember, SEO demands constant vigilance; success comes from a consistent string of on-site and off-site SEO strategies.

Tags: what is seo?, on page seo, off page seo, seo strategies, Nilesh Patel, SEO Services Provider, Best Information of The World, Nilesh Patel Forum, Fans of Photography, Nilesh Patel SEO

Saturday, September 21, 2013

What is SEO, and What Does it Mean Today?

For an intangible, virtual landscape, the internet is simply massive. Actually, massive doesn’t quite cover it. It’s a behemoth. It’s super-colossal. It’s really, really, really big. That is to say that there’s a lot of information available on the web, and it’s not always easy to locate the exact datum that you want. Simply combing through web pages one at a time to find what you’re looking for is the equivalent of searching for an atom-sized needle in a haystack the size of Texas. This is where search engines come in.

Search engines—such as Google—are designed to take an overview of the entire accessible internet and then give you links to sites that it believes are authoritative and relevant to your search. Relevance has to do with the words used on the page, while authoritativeness is usually based on the number of high-quality links that are directed to the page from other sites. It has been said that links are like votes, and in the world of Search Engine Optimization (SEO), that’s certainly true.
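The “links are like votes” idea can be made concrete with the original PageRank calculation, which repeatedly passes each page’s score along its outbound links until the scores settle. Below is a minimal Python sketch of that iteration over a tiny invented web of four pages; real systems work on billions of pages with many refinements.

```python
# A toy web of four pages; each key lists the pages it links out to.
def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """Minimal PageRank power iteration: pages pass their score ("votes")
    along their outbound links each round."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages          # dangling pages share evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(web))  # "c" collects the most votes, so it ranks highest
```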

SEO is a practice that began back in the 1990s, when people first started to realize that they could make money off of their websites. In order to do that, they would need people to visit the sites, and in order for that to happen, they would have to utilize search engine results to the best of their abilities.

Thus, website owners were tasked with creating authoritative content into which promising keywords and links could be inserted. At the same time, the purpose of good content is to draw other respected sites in the hope that they will link to the page in question. The end goal is to have enough quality links and relevant content that the target site shows up on the first page of search results (for example, a Google search for “Home Automation” should return top results from top-rated home automation providers such as Vivint, rather than random pages of nonsense that just happen to use the right keywords). Better still, if you can land the number-one spot for a specific search, you know that your site will be getting a substantial amount of traffic. SEO is now a major part of the digital marketing industry.

Of course, to some people, anything worth doing is worth doing underhandedly. Using unethical “Black Hat” SEO techniques (which are techniques that break terms and conditions set forth by search engines), some unauthoritative and irrelevant sites began to take advantage of—and ultimately damage—the entire system. Keyword stuffing, hidden text, and doorway pages were all used in this way, killing the credibility of otherwise viable search engines, and making it much more difficult to find useful information on the internet.

However, Google, the world’s most popular search engine, began to develop ways to fight against this type of devious SEO. Two specific updates, first Google Panda (in February 2011) and then Google Penguin (in April 2012) were released and strengthened the Google algorithm. Panda was basically an improved intelligence which was designed to keep low-quality sites away from the top ranking spots, whereas Penguin was focused more on identifying sites that utilize Black Hat techniques, and lowering their search engine ranking as a deterrent.

But these new updates aren’t perfect. As long as search engine results are an important factor to online business and advertising, there will be people looking for innovative new ways to increase their site’s ranking without having to actually improve its content. Naturally, search engines such as Google will continue to fight against these tactics with new updates and programs. One major factor in the future of SEO will be its involvement in social media. Search results will be forced to include more social media results, and will also take into account personal information to provide the best and most useful returns. Of course, as these changes begin to take place, one can expect SEO as we know it to change as well. One thing that won’t change, however, is the necessity of search engines to deliver the most relevant data from authoritative sources. After all, the internet isn’t getting any smaller, and we’re all going to need a little help navigating it.



Friday, September 20, 2013

What Makes You A Best In Class SEO Survey Says

Today, at Conductor’s annual client summit, #C3NY, Director of Research and Search Engine Land contributor Nathan Safran unveiled research from a pool of over 380 enterprise search marketers and SEO professionals, analyzing common behaviors which lead to success.

The three key areas where the most successful search marketers thrived? Content, reporting excellence, and internal education and evangelism.

The results of the study identify best practices in each of these areas to separate “best in class” from “laggards”. The characteristics that define “best in class” include being involved early in the content creation process and employing advanced reporting techniques.

Specifically, reporting best practice behaviors include:
  • Using reporting data to determine strategy
  • Reporting early and often
  • Varying reporting requirements by stakeholder interest
  • Reporting automation to free valuable time
  • Data mash-ups to draw meaningful insight from multiple variables
  • Drawing insights from ‘hidden’ data


How Much Does Budget Really Matter?

Interestingly, not as much as some might think. Conductor found that 43% of ‘best in class’ marketers had more than 10% of the overall marketing budget, while 57% of ‘laggards’ had less than 10% of the overall budget.


Size Of Search Team

Search professionals often feel that they need more bodies on their team, but the study shows that nearly a third of ‘best in class’ organizations have only a one-person team, while 68% of industry leaders have a team of just two to four people.


This study was published in collaboration with Search Engine Land, a media partner at this year’s C3 event hosted by Conductor.  You can download the complete study, which includes a foreword by Search Engine Land and Conductor’s tips to become a best in class search marketer here.



Tuesday, September 17, 2013

Nina Davuluri is first Miss America of Indian descent

Moments after winning the 2014 Miss America crown, Nina Davuluri described how delighted she is that the nearly century-old pageant sees beauty and talent of all kinds.


The 24-year-old Miss New York is the first contestant of Indian heritage to become Miss America; her talent routine was a Bollywood fusion dance.

"I'm so happy this organization has embraced diversity,'' she said in her first press conference after winning the crown in Atlantic City, New Jersey's Boardwalk Hall. "I'm thankful there are children watching at home who can finally relate to a new Miss America.''

Her pageant platform was "celebrating diversity through cultural competency."

The native of Syracuse, New York, wants to be a doctor and is applying to medical school with the help of a $50,000 scholarship she won as part of the pageant title.

She is the second consecutive Miss New York to win the Miss America crown, succeeding Mallory Hagan, who was selected in January when the pageant was still held in Las Vegas. The Miss America Organization will compensate Hagan for her shortened reign.

Davuluri's victory led to some negative comments on Twitter from users upset that someone of Indian heritage had won the pageant. She brushed those aside.

"I have to rise above that," she said. "I always viewed myself as first and foremost American."

She had planned to go to the scene of a devastating boardwalk fire in the New Jersey communities of Seaside Park and Seaside Heights Monday afternoon. But pageant officials canceled that visit after learning that Gov. Chris Christie was making cabinet officials available at that same time to business owners victimized by the fire.

Her first runner-up was Miss California, Crystal Lee. Other top 5 finalists included Miss Minnesota, Rebecca Yeh; Miss Florida, Myrrhanda Jones, and Miss Oklahoma, Kelsey Griswold.

In the run-up to the pageant, much attention was given to Miss Kansas, Theresa Vail, the Army sergeant who was believed to have been the first Miss America contestant to openly display tattoos. She has the Serenity Prayer on her rib cage, and a smaller military insignia on the back of one shoulder.

Vail won a nationwide "America's Choice" vote to advance as a semi-finalist, but failed to make it into the Top 10.

In a Twitter message on Sunday before the finals began, Vail wrote: "Win or not tonight, I have accomplished what I set out to do. I have empowered women. I have opened eyes."

Jones made it into the top 5 wearing a bedazzled knee brace. She tore knee ligaments Thursday while rehearsing her baton-twirling routine, which she executed flawlessly Sunday night.

The pageant had pitted 53 contestants, one from each state, plus the District of Columbia, Puerto Rico and the US Virgin Islands, in swimsuit, evening gown, talent and interview competitions.

Sam Haskell, CEO of the Miss America Organization, said he was thrilled it all played out in Atlantic City after a six-year stint in Las Vegas.

"This is where we belong," he told The Associated Press. "This is the home of Miss America, and this is where we're going to stay."

The pageant started in Atlantic City in 1921 as a way to extend the summer tourism season for an extra weekend. 



Monday, September 16, 2013

Why is Content Part of a Smart SEO Strategy?

Over the last several years, search engine optimization (SEO) has matured quite a bit. Now, it’s no longer the practice of stuffing web pages with as many carefully-placed keywords as possible and hoping that Google notices. These days, it takes a much more sophisticated and refined approach built on fresh, original content that will provide value to visitors, while also attracting search engines and helping sites to move up the natural search rankings.

Quality is Key

Creating content for SEO today means going beyond traditional SEO practices like on-page keyword optimization or link building. Although both of these still play an important role in a business’s SEO success, they can no longer be relied on as the best ways to drive search traffic to a website.

As time goes on and technology gets more sophisticated, Google continues to push for a quality over quantity approach. Sites that offer visitors valuable content are going to be looked upon more favorably by the search engine, and will – therefore – appear higher in the search rankings. In Google’s digital eyes, this means providing high-quality, relevant content on a regular basis.


Obviously, what defines “high quality” content is up to the person reading or watching it, but that hasn’t stopped Google from trying to filter the stuff it finds to be the most beneficial to the top of the search engine results pages (SERPs). Some of the things that Google looks for to determine quality are longer content, images, videos, correct spelling, proper grammar, proper text formatting and, of course, links – including both outbound links to other high quality sites and inbound links (and social shares) from high quality sources.

It’s also important to remember that content must be relevant to the website publishing it and the people who are most likely to read it. This means staying around the general topic of the website (as in, only publishing content about tech gadgets on a technology review site, rather than – say – automobile parts).  This way, the search engines will see a common, consistent theme across all of a site’s content when they come to crawl it. Relevance is important to Google, because it means that visitors that end up at a website looking for information about a specific topic will be able to find it quickly once they get there and poke around.

Kinds of Content (And How They Help)

Admittedly, “content” is a vague term that can be applied to pretty much anything on the Web, in one way or another. But there are certain types of content that fit into Google’s loose criteria for “high quality” that can also help businesses move up the search rankings. The most obvious (and the easiest to produce) are blog posts.

It seems that every business has a blog these days (or, at least, they should have one) that allows them to consistently publish new long-form content related to their specific industries. With blogs, companies can satisfy both Google (and other search engines) and their target audiences by publishing original posts that provide readers with some kind of informative or actionable value. They also allow bloggers to stay on top of timely or topical news items – another thing that search engines like.

But really, the most obvious way that blog posts help improve a site’s search rankings is that they give writers more opportunities to insert relevant keywords into their sites in a natural, readable way that will attract the search engines and cause them to register the site as being relevant to those specific terms. This benefit only grows and becomes more powerful as a site publishes more blog posts.

Blogs also provide businesses with a way to garner more backlinks from other high quality sites or blogs. This gives the business owner more authority with Google, makes the site more visible to its target audience and helps spread its content around the Web.

In addition, blog posts give companies content that they can push on social networks like Facebook, Twitter, LinkedIn and Google+, in turn giving their fans and followers a reason to visit their websites. Plus, when they like, share or retweet a company’s posts, it provides social signals that act like inbound links, which adds credibility to a website and results in another SEO benefit.

However, not everyone likes to read, so it’s worth keeping in mind that blog posts aren’t going to entice everyone in your target audience. That is why many businesses have started turning to infographics – large-format images that can be used to organize multiple data points on a single subject or topic. Many users like these because they can quickly read and understand a significant amount of information in a way that’s visually stimulating, making it easier to remember later.

Businesses like infographics because they represent a great way to increase traffic and to gain authority needed to help them move up the SERPs. Because of their visual nature, infographics (or well-designed infographics, at least) are far more likely than blog posts to go “viral,” meaning that they tend to receive more reposts and shares on social networks and other websites than standard blog posts. This results in more links, more traffic, a wider audience reach and, ultimately, more credibility with the search engines.

Content Optimization is the New SEO

Back when Internet technology and search engine algorithms were still in their infancy, web professionals tried to find crafty ways to artificially generate signals that would convince Google and other search engines of their credibility. This included tactics like keyword stuffing and low-quality link building that were only concerned with manipulating the search engines into prioritizing a site in the SERPs. Nowadays, search engines take a much more sophisticated approach to the way they determine which sites should be given authority in their niches.

These days, search engines require relevant, high quality and fresh content that is published on a regular basis. This can be done in the form of blog posts or infographics, as well as other types of “viral” content, such as videos, slideshows and more. Search engines want to be able to provide their users with valuable information in the same way that businesses want to offer their target audiences something relevant to their interests. An abundance of quality content is the answer for both parties. 



Friday, September 13, 2013

Panda Update: Cutts Call for Review of Site Content

Google maintains its stress on “quality” of content with the Panda-index integration

Google has always been pushing webmasters to better the quality of content on their sites, or face the penalty of being pushed to the far end of the search pages. Google’s “site quality algorithms” are designed to reduce the rankings of sites found to be hosting low-quality content, as the recent Panda update tackles the difficult task of algorithmically assessing website quality based on content. What this means is that sites with good-quality content will work their way up onto the first pages of search results, helping searchers find good-quality sites first, which falls in line with Google’s overall intent of “enhancing the user experience.”

Cutts’s call on Content

Google’s Anti-Spam Chief Matt Cutts, in answer to a question on Panda’s integration into Google indexing, said, “If you are not ranking as high as you were in the past, overall it is always a good idea to look at the quality of the content on your site. If there is content that is scraped, or duplicated, or just not as useful ... (you should think of remedies like) ... Can I come up with something that is original, something that people would really enjoy? ... Those kinds of things tend to be a little bit more likely to rank higher in our rankings.”

Panda Integrated into Google Indexing

The integration of Panda into normal indexing happened sometime in mid-March 2013. Unlike the explicit, sudden changes that have hit sites with every Panda or Penguin update, the integration of Panda into indexing will be less noticeable, though sites with low-quality content are sure to feel the heat.

At SMX West, Cutts revealed: “In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites. We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines… You are more likely in the future to see Panda deployed gradually as we rebuild the index. So you are less likely to see these large-scale sorts of changes.”

What This Means to Webmasters

As sites with good content pick their way to the top of the search pages, sites with thin, duplicated, scraped or outdated content will slip down in rank until they become invisible on the search pages. Getting right to the point, Matt Cutts calls for an immediate assessment of your site’s content, even if you have done everything possible, including optimizing internal pages and carrying out a site audit in the recent past. What webmasters should do is replace old and thin content with quality content that is interesting, engaging and influential. Google wants you to build the quality of content on your site, page by page, as just one page of bad content could pull the whole site down.
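To see how duplicated or scraped content can be caught algorithmically, here is a minimal Python sketch of one classic technique, w-shingling with Jaccard similarity. This is only an illustration of the general idea; Google’s actual systems are far more sophisticated and not public.

```python
# Minimal sketch of a classic near-duplicate test: w-shingling plus Jaccard
# similarity. Google's real duplicate detection is far more sophisticated.
def shingles(text: str, w: int = 3) -> set:
    """Break text into overlapping w-word phrases."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets, from 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

original = "fresh original content that people really enjoy reading"
scraped = "fresh original content that people enjoy reading daily"
print(round(jaccard(shingles(original), shingles(scraped)), 2))  # 0.33
```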

Here is the original Matt Cutts Q&A video on YouTube, with the transcript below.


“Recently Google integrated the Panda update into the normal indexing process. Now, how will webmasters get to know their site is hit by Panda? And, if the site is already hit, how will one know the site has recovered from Panda? (after having done remedial SEO)

Panda is an update we rolled out a couple of years ago, targeted towards lower-quality content. It used to be that roughly every month or so we had an update. We used to say there is something new, we’ve got a launch, we got new data, let’s refresh the data. And it got to a point with Panda that the changes were getting smaller, more incremental; we had pretty good signals… we pretty much got the low-hanging wins. So there weren’t a lot of really big changes going on with the latest Panda changes, and we said let’s go ahead and, rather than it be a discrete data push, i.e. something that happens every month or so at its own time, when we refresh the data, let’s just go ahead and integrate it into indexing.

So at this point we think that Panda is affecting a small enough number of webmasters on the edge. We put out a blog post, which I would recommend, penned by Amit Singhal. It talks about the sorts of signals we look at whenever we are looking to assess quality within Panda… basically we are looking for high-quality content. And if you think you might be affected by Panda, the kind of overriding rule is trying to make sure you’ve got high-quality content, the sort of content that people really enjoy, that’s compelling, the sort of thing they would love to read, that you might see in a magazine, in a book, that people would refer back to, or send to friends, those sorts of things. And that would be the overriding goal, and since Panda is integrated into our indexing, that remains the goal of our entire indexing system. So if you aren’t ranking as high as you were in the past, overall it is always a good idea to look at the quality of the content on your site. If there is content that is scraped, or duplicated, or just not as useful… can I come up with something that is original, something that people would really enjoy… those kinds of things tend to be a little bit more likely to rank higher in our rankings."



Wednesday, September 11, 2013

How would fingerprint technology benefit iPhone 5S users?


Reports from factory production lines and leaked parts indicate that Apple is about to put a fingerprint sensor into its next-generation flagship iPhone 5S. But what exactly can a fingerprint sensor do for the average consumer?
 

What does it do?

A fingerprint reader or sensor does what it says on the tin – it scans your fingerprint and matches it to a pre-defined image of your finger. Since every fingerprint is unique, the system can then securely verify your identity. 

How does it do it?

A type of image capture system specialised for quickly capturing and storing the imprint of your finger will be embedded below a swipe panel – in this case possibly below the home button on the iPhone 5S – which the user runs their finger over. The sensor captures the image and software analyses it for the skin indentation pattern on your fingertip, comparing it to a set of pre-stored data and verifying your identity. According to a recent patent filed by Apple in Europe, the sensor will implement an RF sensing system that will not only accurately capture the ridges of your finger, but also image the live skin below the surface of your fingertip to prevent spoofing of the system with a Mission Impossible-style fake fingerprint. 
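Stripped to its core, verification is a comparison between a stored template and a fresh capture, with identity accepted only above a similarity threshold. The toy Python sketch below illustrates that idea with made-up feature values; real minutiae matching and Apple’s RF-based design are vastly more complex.

```python
# Invented feature values and threshold, for illustration only. Real matchers
# compare minutiae patterns, not simple numeric lists, and Apple's design is
# not public.
def similarity(template: list, capture: list) -> float:
    """Fraction of stored feature values that a fresh capture reproduces."""
    matches = sum(1 for t, c in zip(template, capture) if abs(t - c) < 0.05)
    return matches / len(template)

enrolled = [0.12, 0.87, 0.45, 0.33, 0.91]  # features stored at enrolment
fresh    = [0.13, 0.85, 0.44, 0.71, 0.90]  # one feature differs noticeably

THRESHOLD = 0.8
score = similarity(enrolled, fresh)
print(score, "accept" if score >= THRESHOLD else "reject")  # 0.8 accept
```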

What will it enable?

Potentially, fingerprint readers could sound the death knell for passwords. The multi-character password is a failing piece of security, given how many passwords can be cracked by high-powered computers these days, even long or complex ones. Two-factor authentication, where another piece of the security puzzle, such as a secret code or key, is used to strengthen simple password logins, is currently the best system on offer to consumers.
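One widely used form of two-factor authentication is the time-based one-time password (TOTP, RFC 6238), where the “secret code” is derived from a shared key and the current time. Here is a minimal standard-library Python sketch; the secret shown is a placeholder, not a real credential.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period              # current 30-second step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Placeholder secret for demonstration, not a real credential.
print(totp("JBSWY3DPEHPK3PXP"))  # prints a 6-digit code that changes every 30s
```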

In theory, fingerprint scanners could allow users to completely remove the need for passwords, securely logging into their phones, and enabling higher security functions, which would be particularly useful for online banking and shopping without the need for two-factor authentication.

A built-in fingerprint scanner could also make the iPhone more amenable to big business for security reasons, although in reality, according to Matthew Finnie, CTO at Interoute, the owner operator of Europe's largest cloud services platform, "the smartphone is now intrinsic to how people work, so it's time for businesses to change".

"Rather than focusing on the security merits and nuances of the devices, attention should shift to how businesses should secure and control corporate data and make relevant parts securely accessible from anything, anywhere." 

Will it really work?

Fingerprint scanners in the past have been a bit hit and miss. The technology, although relatively established in industry and enterprise settings, has never really been available to the mass market consumer or on anything other than secure laptops. That's generally because it has been a frustrating experience for the end user.

If Apple manages to make the process of secure login via an in-built fingerprint sensor a smooth and seamless experience, it could revolutionise the way consumers use their phones and bring about faster, more secure platforms for developers to expand upon.

However, there have been rumours that the sensor Apple is expected to build into its next iPhone flagship has a limited use lifetime. For example, a rumoured 500-scan limit "could be used up in only six months, based on users accessing multiple accounts three times a day. This would render the scanner useless for the remainder of a typical mobile phone contract, potentially 18 months," according to research by David Webber, managing director of Intelligent Environment, a specialist in the financial security field. If a consumer keeps their smartphone for two years, the length of many current mobile phone contracts, there is a possibility that the fingerprint sensor could fail or cease to work, leaving users stranded without access to secure logins for their phone, banking or shopping.
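A quick sanity check of that figure: at three scans a day, a 500-scan budget would last about 500 ÷ 3 ≈ 167 days, or roughly five and a half months, which is consistent with the "only six months" estimate quoted above.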

What alternatives are there?
 

Biometric authentication, where a unique part of your body's function is used to verify your identity, is a growing field. Many different factors can be used to securely identify the consumer. Iris scanners were once hailed as the holy grail of identification, but the technology required to scan an iris accurately is both expensive and often bulky – not something suitable for phones yet. Recently, the unique rhythm of individual heartbeats has been pushed forward as another tool in the biometric armoury: a bracelet such as the Nymi monitors your pulse at your wrist, offering a much more realistic and consumer-friendly entry into biometric security.