Monday, March 24, 2014

Google’s Matt Cutts Gives SEO Advice For Times When Your Products Go Out Of Stock

In a recent video, Google’s Matt Cutts answered the question of what webmasters and site owners should do about out-of-stock products on their e-commerce sites.

Matt Cutts basically said it depends on the size of the e-commerce site. He broke it down into three sizes: small sites with tens of pages, medium sites with thousands of pages and massive sites with hundreds of thousands of pages or more.

Small E-Commerce Sites

Small sites that sell items such as handmade furniture should link from an out-of-stock product to related products. That way the customer can still see that the owner can make or design the item on display, while also seeing other products that are currently in stock and can be purchased today.

Of course, it may make sense to add a manufacturing time next to the items that are out of stock.

Medium E-Commerce Sites

The typical medium-sized e-commerce site sells thousands of products, and some of those products will inevitably be out of stock. In that case, the site owner should return a 404 – page not found – for the out-of-stock products.

That is, unless you know the date the products will come back into inventory. If you do, say so on the product page and let the customer choose whether to order the item for later delivery.

Otherwise, 404 the page, because it is frustrating for a customer to land on a product page for something they cannot buy.

Large E-Commerce Sites

For really large e-commerce sites with hundreds of thousands of pages, such as Craigslist, you should set the date the page will expire using the unavailable_after meta tag. This way, when the product is added, you can immediately set when that product page will expire based on an auction date or a go-stale date.

This information is treated as a removal request: it will take about a day after the removal date passes for the page to disappear from the search results. Google currently only supports unavailable_after for Google web search results.
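For reference, unavailable_after is set in a robots meta tag in the page's head; the date below is purely illustrative:

    <meta name="googlebot" content="unavailable_after: 25-Aug-2014 15:00:00 EST">

For non-HTML files, the same directive can be sent as an X-Robots-Tag HTTP response header instead.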

Here is the video:



For More Information about Google’s Matt Cutts Gives SEO Advice

Saturday, January 25, 2014

The Resources You Need to Stay Current With SEO In Under 100 Words

Understatement: SEO has changed.

To understand what’s in store for the future (which you can learn about here), you have to know where SEO has been (which you can learn about here).
With that context, let’s lay out some resources to make you a modern SEO.
First, learn how to optimize your blog content — a cornerstone of top-notch SEO. Then, figure out how to integrate search and social.
Finally, stay on top of those constantly changing trends. Resources like this SEO coffee tips series, and this mythbusting guide will keep you in the know.

The Latest SEO Challenge? Maintaining Rankings

Unfortunately for hopeful small businesses with polished websites and great new content, staying high up in the rankings of major search engines is an ongoing challenge. Because search engines constantly crawl for new pages and re-index content, it takes proactivity and perseverance to keep a website at the top of the rankings.

Several factors can cause a website to slip in the rankings. Chief among them, competitors optimize their own sites, which can cause theirs to rise and others to drop. There are also constant changes in search engine algorithms, which improve search results but often leave behind websites that relied on older SEO methods. Changes in search engine market share, and in the simple habits of online buyers, play a role as well.

Telx Web, a Miami website design business, announced today that it now provides new and improved rankings maintenance for clients. This means Telx Web monitors clients’ rankings weekly for targeted keywords, writes and distributes articles with links to clients’ websites, and builds back-links from other sites and social media postings.

This Miami SEO company has a reliable and reputable staff dedicated to updating and maintaining clients’ website rankings. A few years ago, techniques like “Meta Tags” were essential to maintaining traffic. As search engines have changed and grown, link building has come to be considered much more important, and the largest search engines have changed their algorithms along these lines. Telx Web is one of the few search engine optimization companies that will actually continue to write and distribute articles with links to client websites, provide back links and social media postings, and monitor the search status of clients’ sites.

Not only understanding current trends and search engine technologies, but also adapting to and overcoming changes in those trends, is what Telx Web is set on accomplishing for customers. Through SEO consulting, Telx can teach clients how to successfully optimize their businesses for search engines of all types and reach the customers they intend to impact.

About Telx Web - A hallmark of the best SEO companies is a full range of SEO features. Telx Web provides these and more: On-Page optimization, which involves the use of correct keywords and metadata; Off-Page optimization, which is all about back-linking to a site via other sites; and Social Media Marketing through blogs, Facebook, Twitter, Google+, and other social networking sites, along with manually submitting and monitoring the ranking of clients’ sites and pay-per-click campaigns. Telx also provides clients with scalable services, so that it can serve every budget and every business, small to large.
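As a simple illustration of the on-page side (the business name and copy below are invented for this example), correct keywords and metadata usually start in the page's head:

    <head>
      <title>Miami Web Design &amp; SEO Services | Example Company</title>
      <meta name="description" content="Affordable web design and search engine optimization for small businesses in Miami.">
    </head>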

For More Info about The Latest SEO Challenge

Friday, January 10, 2014

SEO in 2014: How to Prepare for Google's 2014 Algorithm Updates

It has been an incredibly eventful year in terms of updates from Google. Major 2013 changes included further releases of Penguin and Panda, Hummingbird taking flight, and the shift away from providing keyword data thanks to encrypted search.

Many have gone so far as to ask whether SEO as a profession is dead: for one interesting perspective, see my recent Forbes interview with Sam Roberts of VUDU Marketing. My own take is less alarmist: Google has taken major spam-fighting steps that have shifted the playing field for SEO professionals and anyone trying to get their site on the map in the year ahead.

At the same time, the need for an online presence has never been stronger, while the landscape has never been more competitive. The potential to make a real ROI impact with your company's online marketing initiative is greater than ever. But defaulting to so-called "gray hat" tactics no longer works. Instead, SEO professionals need to step up and embrace a more robust vision of our area of expertise.

You might call it a move from tactician to strategist: the best and most successful players in our space will work to anticipate Google's next moves and respond to them with laser focus. In a sense, the infinite digital game of chess that is SEO will continue, but the rules of the game have become more complex.

Through a mix of what I'm observing and reading and what I'm seeing working out in the field today for my clients, here are some suggestions for companies and SEO professionals that are thinking ahead to 2014 for their digital strategies.

Everything You Learned in 2013 is Still Relevant, Just Amplified

Looking closely at the targets of the 2013 updates (i.e., websites that cheat their way to the top of the rankings or provide no value to visitors), I anticipate seeing these themes carried forward throughout 2014. We can continue to expect micro-adjustments to Panda and Penguin that target both link quality and content quality.

Smart marketers will benefit from keeping a close eye on their link profiles, and performing periodic audits to identify and remove inbound links built unnaturally. High quality content investments will remain critical.

A solid SEO performance in 2014 is going to be built on a foundation of really understanding what happened in 2013, and what these changes mean both strategically and tactically for SEO. SEO really has changed in critical ways.

Content Marketing is Bigger than Ever

Content marketing will move from buzzword to mature marketing movement in 2014. From an SEO perspective, Google will be looking at companies that have robust content marketing efforts as a sign that they're the kind of business Google wants to support.

Think of all the advantages of a good content strategy:
  • Regular, helpful content targeted at your audience.
  • Social signals from regular sharing and engagement.
  • Freshness or signs that your site is alive and growing.
  • Increasing authority connected to your body of work.
Sound familiar? It's the very approach to SEO that all of Google's recent updates have been designed to shape.

What changes you need to make in 2014 depends largely on where your company stands now in relation to an active content marketing strategy. Companies with existing content strategies will need to assess the role of mobile, specifically.

If you've just begun to move in the direction of content marketing, it's time to really commit and diversify. If you haven't started yet, it's time to take the plunge.

Social Media Plays an Increasingly Visible Role

Social media has been a major player in the digital marketing landscape for the last few years. First we saw the rise of mega platforms like Facebook and Twitter. In the last couple of years, visual content from networks like Pinterest, Instagram, and various micro-video services has swept through.

Today, diversification is a major trend: depending on whom you're targeting, it's no longer enough to be active on a single network. In fact, the Content Marketing Institute recently released a study showing that the most successful B2B marketers are active on an average of seven networks. Companies and SEO professionals will need to be asking the following questions in the year ahead:
  • Are we taking our social media seriously? Are we employing the pillars of strong profiles, good content, reciprocity, and engagement?
  • Is easy social sharing enabled for all of our content? (See the markup sketch after this list.)
  • Does our content strategy include a dissemination phase that maximizes its potential for distribution through social networks?
  • Are we active on the social networks that matter in our industry?
  • Are we active on the social networks that matter to our customers?
  • Are we active on the social networks that matter to the search engines? (See below for more thoughts on making that strategic investment).
  • Does our social media marketing strategy stimulate the level of social signals required to achieve our goals?
Google's updates are likely to rely increasingly on social signals as a form of active human curation of good content.
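On the easy-sharing point, a minimal sketch of the markup most networks read when a page is shared is a set of Open Graph tags in the head (all URLs and text here are placeholders):

    <meta property="og:title" content="Your Article Title">
    <meta property="og:description" content="A one-sentence summary of the article.">
    <meta property="og:image" content="http://www.example.com/images/article-thumbnail.jpg">
    <meta property="og:url" content="http://www.example.com/article">

Pair this with visible share buttons for the networks that matter to your audience.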

Invest in Google+

In addition to strengthening your overall social media marketing position, it's going to be absolutely critical that you are investing in your Google+ presence.

Moz's most recent study of ranking factors confirms that Google+ is playing an increasingly significant role in a solid SEO ranking. The immediate areas to focus on include:
  • Establishing Google Authorship of your content and tying it to your Google+ account. Authorship, which brings your body of content together, will play an important role in the SERPs as well as strengthening your Author Rank. (See the markup sketch after this list.)
  • Those +1s add up. It isn't clear exactly how much +1s directly contribute, but it's fair to say they're a major factor in the "social signals" component of Google's algorithm. I expect this to increase in the year ahead.
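A minimal sketch of the Authorship wiring (the profile ID and byline below are placeholders): add a rel="author" link from your content to your Google+ profile, and list the site under "Contributor to" on that profile.

    <link rel="author" href="https://plus.google.com/112233445566778899000">
    <!-- or, as a visible byline -->
    <a href="https://plus.google.com/112233445566778899000?rel=author">by Jane Author</a>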

Hummingbird Was Just the Tip of the Mobile Iceberg

2014 will be the year of mobile SEO. Hummingbird was just the very small visible tip of a very large iceberg as Google struggles to respond to the rapidly shifting landscape where half of all Americans own smartphones and at least one-third own tablets. Those statistics will probably shift upward, maybe dramatically, after the 2013 holiday season.

As a result, your site's mobile performance matters to your SEO rankings. Properties that you're trying to rank need to be designed first for mobile and then scaled up for the big screen. If you don't have a mobile-optimized website, this needs to be your top priority in terms of SEO and design investments for 2014.
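At its most basic, a mobile-friendly page starts with a viewport declaration and styles that adapt to screen width; a minimal sketch (the class name here is hypothetical):

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      /* content fills small screens by default */
      .page-content { width: 100%; }
      /* wider, centered layout only when the screen allows it */
      @media (min-width: 768px) {
        .page-content { width: 960px; margin: 0 auto; }
      }
    </style>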

Some underlying changes that happened with Hummingbird, including the increasing importance of both semantic search and Knowledge Graph, will continue to grow in influence. Practically speaking, this is to help prepare the search engine for the rise of voice search associated with mobile. But it also has direct implications (which we're still learning about) for broader SEO. This is one area that you should pay close attention to, from how you structure your content to what content you choose to put out.

The Long Versus Short Debate

Which is better, long content or short content? The answer depends on who is creating the content, who is reading it, what it's about, in what context it's being consumed, and how you define "better."

For the purposes of this argument, which form of content will help you best prepare to rank well in 2014? Frustratingly for some, the answer is more "both/and" than "or."

Vocus recently cited a study that showed that the top 10 results for a specific keyword search tended to be more than 2,000 words in length. The validity of that study has been debated, but it's probably fair to say that length is a proxy for depth of expertise and value delivered to the reader.

Google rewards both expertise and value delivered to the reader. As a result, we've seen a trend where the "minimum desirable length" for text-based content has shifted from something in the range of 550 words to articles in the range of 1,000-plus words.

Yet we're also confronted with the reality of the mobile device: if I'm reading about something I'm only moderately interested in, there's a high probability that I won't want to scroll through 2,000 words on my iPhone. That leaves content marketers faced with the challenge of producing mobile-friendly content, which tends to be (in a sweeping generality) much, much shorter.

Proposed solutions have run the gamut from content mixes to site architectures that allow you to point readers to specific versions of content based on their devices. This is great for the user experience, but where it all comes out on the SEO algorithm front remains to be seen. For now, I'll just acknowledge that it's an area of concern that will continue to evolve and that it's something you should keep your eye on.

Advertising and PPC Have a Shifted Relationship with SEO

Since Google made the decision to encrypt the vast majority of its searches, our ability to access keyword data for research purposes has been restricted. However, there's a loophole. Keyword data is still available for advertisers using PPC on Google's platform.

More SEO budgets may be driven toward PPC simply because access to the data may otherwise be restricted. It's also possible that we'll see the release of a premium Google product to give us access to that data through another channel from Google in the year ahead.

Guest Blogging Remains One of the Most Effective Tactics, With a Caveat

Guest blogging has exploded in the past year, and it's going to remain one of the most effective means of building quality inbound links, traffic, and branding exposure in 2014. However, it's absolutely critical that you're creating high quality content, and using extremely stringent criteria when selecting your target sites.

In other words, you need to apply the same high ethos approach to guest blogging that you do to the rest of your SEO efforts. If you dip a toe into spammy waters where guest blogging is essentially scattershot article marketing with a 2014 update, you're likely to be penalized in a future Penguin update.

Conclusion

This has been a year of significant change in the SEO industry. Even contemplating strategies for 2014 can feel staggering.

The good news is that looking back, it's easy to see which direction the trends are heading in terms of the years ahead. Staying the course on solid white hat tactics and paying attention to a few priority areas that are shifting rapidly should give you the insights needed to improve your organic search visibility in 2014 and beyond.

What trends do you anticipate seeing from Google in the year ahead? How are you preparing?


Tags : Google's algorithm 2014, Nilesh Patel, SEO services provider, Nilesh Patel SEO

Thursday, November 28, 2013

SearchCap: The Day In Search, November 27, 2013

Below is what happened in search today, as reported on Search Engine Land and from other places across the Web.

From Search Engine Land:


  • At some point recently, Google started to indicate terms that are trademarked in the Keyword Planner with the TM mark. Dan Shure of Evolving SEO tweeted about the new feature, noting that McDonald’s “I’m Lovin’ It” tagline is not marked as trademarked while Staples’ “The Easy Button” is marked as such. (Burger King’s “Have It […]

  • We hear it regularly – the death knell tolling for SEO. In the past several years, we’ve been bombarded by a barrage of change in organic search, from Penguin to Panda to Hummingbird, from inbound link penalties to [not provided]. Let’s face it: the only constant in search is change. But I propose that SEO isn’t […]
  • The holidays are officially upon us, and if you’re reading this, I must give thanks to you for your time. When I still worked at Google, this was the time of year when new product launches came to a crawl since most advertisers were too busy to deal with change, and only the most persistent […]
  • Many marketers think of social media in the context of B2C companies: after all, 4 in 10 consumers buy products that they’ve favorited, liked, tweeted or pinned on various social networks. Yet, according to a recent study from MarketingProfs, 87% of B2B marketers use social media platforms in their content marketing efforts. In fact, of […]
  • In a significant move, Google rolled out CPM bidding by viewable impression in AdWords this week. Advertisers will only be charged for ad impressions that can actually be viewed in-screen by users, rather than on the traditional served impression basis. New reporting metrics are also available, all powered by Google’s viewability measurement solution, Active View, […]
For More Info about Search Cap

Google Keyword Planner Now Shows Trademarked Terms

At some point recently, Google started to indicate terms that are trademarked in the Keyword Planner with the TM mark.

Dan Shure of Evolving SEO tweeted about the new feature, noting that McDonald’s “I’m Lovin’ It” tagline is not marked as trademarked while Staples’ “The Easy Button” is marked as such. (Burger King’s “Have It Your Way” also gets a TM.)


What is marked as trademarked and what isn't appears to be somewhat inconsistent at this point. For example, in this set of queries for several Pittsburgh Steelers-related terms, "steellers" and "steelers pictures" are not marked as trademarked. I assumed that meant that "Pittsburgh" needed to be included in the search to trigger the trademark. However, for some reason "steelers hat" is marked with the trademark symbol.

The NHL was the only major sports organization I could find that wasn't marked; likewise, NHL team names such as Pittsburgh Penguins are not marked.

The trademarked terms can still be added to campaigns. The “TM” mark is not included with the keywords when they are added. Presumably Google is adding the marker to help eliminate user surprise when trademarked terms get disapproved. The inconsistency could cause more confusion if in fact keywords such as “steelers pictures” are eventually disapproved for trademark reasons.

For More Info about Google Keyword Planner

Friday, November 22, 2013

Google Seeking Feature Requests For Webmaster Tools

Google’s head of search spam Matt Cutts posted on his personal blog a request for webmasters to provide feedback and feature requests for Google Webmaster Tools.

Matt and the Google search quality team are looking for new ideas on what would make Google Webmaster Tools more useful to you. Matt talked about how far Webmaster Tools has come, but they want to continue making it more useful.

To submit feedback, go to Matt’s blog and leave your feedback.

Matt’s disclaimer:

To be clear, this is just some personal brainstorming–I’m not saying that the Webmaster Tools team will work on any of these. What I’d really like to hear is what you would like to see in 2014, either in Webmaster Tools or from the larger team that works with webmasters and site owners.


Monday, October 28, 2013

Google’s Matt Cutts : More Pages Does Not Equal Higher Rankings

In a new video released today by Google’s head of search spam, Matt Cutts, we learn that having more pages on a web site does not necessarily mean better rankings.

Matt Cutts said, “I wouldn’t assume that just because you have a large number of indexed pages that you automatically get a high-ranking. That’s not the case.”

He goes on to explain that the more pages you have, the more chances you have to rank for different keywords. Plus, the more pages you have, the more likely you are to have more overall links and PageRank, which do directly impact your rankings. But the number of pages on a site, by itself, does not have a direct ranking benefit.

Here is the video:


Matt said it at the end again, “just having a number of pages doesn’t give you a boost.”



Sunday, October 27, 2013

LiveZilla Live Chat Server Installation

I. Users & Groups
________________________________________________________________________
The LiveZilla Server Admin contains several functions and options you will need to set up, update and maintain LiveZilla Server installations. The first step when setting up LiveZilla is to create a LiveZilla Server installation on your webserver. Select "Create new LiveZilla Server" and press Next to proceed.

You are asked to enter the details of your administration account on the first page of the wizard. The username and the password you specify here will be used to maintain and update the server you are going to create, so keep them in mind. The administration account is your first internal user (operator), which can also be used for all daily tasks. You can create additional internal users once you have finished this installation.

The next step will be to create the first LiveZilla Group. Please enter the email address of your support department and click next again.

II. FTP Upload
_________________________________________________________________________
You can directly upload the files to your webserver using FTP. All you need is a working FTP account with permission to create files and folders (write access). Enter the account details on the FTP setup page and do not forget to validate the target location for your LiveZilla Server Installation. 

How do you find and validate the target location for your LiveZilla Server?
Press the Select button to see the file structure on your webserver. Make sure to create and select a dedicated folder as the target for the installation; otherwise you may overwrite some files of your website. Also make sure that you don't upload the files to a location above or outside of your public web folder (wwwroot, public_html or htdocs), or you will not be able to access your LiveZilla Server. Press Next to initiate the file upload process.

Alternative to FTP Upload: Extract to local system
__________________________________________________________________________
It's also possible (alternatively to FTP) to extract the files of the LiveZilla Server Installation to your local computer. In case you are working on the webserver you can extract the files directly into the public web folder.

This option should also be used if you intend to upload the server files to your webserver using a common FTP client (like FileZilla). Keep in mind that you will need to set the file permissions manually.

III. Determining the Server URL
__________________________________________________________________________
When all files have been uploaded to your webserver, you will be asked to determine the absolute URL of the folder of your new LiveZilla Server installation. Please do not proceed until you have passed the server test, which guarantees that the LiveZilla Server URL is correct.

How do you find the correct URL?
Copy the URL suggested by the wizard and try to open it in your web browser. If the wizard's suggestion was correct, you will see a LiveZilla confirmation page and you are done here. Otherwise, you will need to find the correct URL yourself. If you have uploaded the files to a folder named LiveZilla below your website structure, the URL will likely look like http://www.yourwebsite.domain/livezilla.

Once you have determined the right URL and passed the Server Test click next to proceed to the next step.

IV. MySQL Database Settings
__________________________________________________________________________
The final step in this wizard is to setup the MySQL connection on your new LiveZilla Server. 

MySQL Database

You will need to specify:

  • Host
    The Hostname or IP of your MySQL / Database Server. Please note that all connections to the database will be initiated by the LiveZilla Server Script (PHP) and not by your local LiveZilla Server Admin application. In most cases, Webserver and MySQL Server are running on the same (logical) server which means that localhost will work as hostname. If your MySQL database is installed on a remote machine you will need to enter the hostname of the remote database server.
  • User
    The database user you would like to use for LiveZilla. The user needs permission to CREATE/ALTER/DROP tables and SELECT/UPDATE/INSERT/DELETE data in that database.
  • Database Name
    The MySQL database you want to use for the LiveZilla Server Installation.
  • Prefix (optional)
    The prefix entered here will be added in front of each table name. This will help you to organize data in a database and prevent one type of data from interfering with another.
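As a purely hypothetical illustration of these fields (substitute your own hosting details), the form might be filled in like this:

    Host:          localhost
    User:          livezilla_user
    Database Name: livezilla_db
    Prefix:        lz_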

Enter your database details and press Create Tables to create the set of LiveZilla tables. Then validate database access and data consistency by pressing Validate Database.

Please note: you will not be able to create the tables and validate database access until the correct URL of your LiveZilla Server has been determined (previous page of this wizard)! 
_________________________________________________________________
Congratulations, your LiveZilla Server is now ready. The final step is to integrate LiveZilla with your websites.


Tags : LiveZilla Live Chat Server Installation, LiveZilla chat, Nilesh Patel, SEO services provider, fans of photography, Nilesh Patel SEO

Thursday, October 24, 2013

Matt Cutts At Pubcon 2013: Moonshots, Machine Learning & The Future Of Google Search

This morning, the head of Google’s webspam team, Matt Cutts, gave a keynote speech at Pubcon in Las Vegas. The keynote comes on the heels of a scathing day-one keynote from Jason Calacanis, who said that Google rules everything and that they were essentially evil. On Twitter yesterday, Matt asked if Jason wanted the polite response or a thorough one. All of us here in attendance are hoping for "thorough."


Matt starts with the state of the index, talking about where Google will go in the future.

He’s proud that Google has doubled down on ‘moonshot’ changes, specifically:

  • Knowledge Graph: Google has been trying to understand entities, not just the search strings. Essentially they are trying to learn about "things, not strings."
  • Voice Search
  • Conversational Search
  • Google Now: Matt is proud that today you sometimes don't even have to search to find the information you need.
  • Deep Learning: Google is looking more into the relationships between words, so it can read at a higher level and interpret meaning. This works well with voice search: a user asks Google, "Who is the Prime Minister of Turkey?", then searches again for "How old is he?", and Google can answer using the previous context.

Core Quality Changes

  • Hummingbird: This change targets better natural language understanding. Search is more than just matching words; instead it looks at the specific words that are most meaningful for intelligent scoring. For instance, in a voice search for "what is the capital of Texas, my dear," the "my dear" isn't that important, and Hummingbird is able to detect this. While Hummingbird affected 90% of queries, it was a very subtle change that most users didn't notice, but it will help users get more pertinent results.
  • Panda Softening: This is something Google has looked into to help bring some sites and content back.
  • Detecting/Boosting Authorities: Not done by hand, but applied by topic area. Webmasters can keep deepening their content on a topic to further their authoritativeness in a specific content area.
  • Smartphone Ranking: If your phone doesn't support Flash, Google won't show you a site that requires Flash.

Webspam Changes

  • Penguin 2.0 & 2.1: Penguin 2.0 was released and wasn't that intensive. Black hats said it wasn't big, so Google turned it up with 2.1. More changes will keep coming, so buckle up.
  • Spammy Query Algorithms: Queries like porn and payday loans will be targeted for better results. Right now the SERPs aren't great, but Google is working on it.
  • Advertorials/Native Advertising: Google has cracked down on publishers selling ads that blend in as editorial with dofollow links. You shouldn't be paying for links that pass PageRank.
  • Spam Networks: They've got a pretty good list and are just working their way down it. Matt joked that he should take a poll to determine who to axe next.

Communication

Google has done a great job of increasing the communication with webmasters, especially:
  • New videos for malware/hacking
  • Concrete examples in guidelines
  • >100 speaking events, Hangouts on Air, webmaster office hours
  • The "How Search Works" website

Future of Search

  • Machine Learning: Google's goal is to provide the world's information. The words "search engine" aren't anywhere in its mission statement. Google wants to be able to give answers to specific queries.
  • Mobile: Mobile is coming faster than anyone expected. 40% of YouTube videos are now served to mobile devices. If you haven't thought about mobile, it's time to start.
  • Social/Identity/Authorship: Matt starts with "Facebook did a great job of social and knowing who people are." He then notes that the signal is not just likes/+1s/tweets; in the long term, social signals are a sign of authority. If you are someone worth listening to, search engines will think you are worth listening to as well.

Webspam Trends

  • Hacking: For the next six months it may look like Google isn't working on much, because they are working on the next generation of hacked-site detection. Queries like "buy viagra" still look bad because people are breaking the law.
  • Hot Topics: Items like child porn, international issues and really nasty queries are being addressed.
  • No Toolbar PageRank update scheduled for the rest of the year: The pipeline for updating Toolbar PageRank broke this year and updates stopped. Google decided that wasn't such a bad thing and left it alone, since people seem to pay too much attention to the metric. It's something they will reassess at a later time.

Advice

  • Mobile: Get ready; you need a mobile plan.
  • Request Autocomplete: A new feature in Chrome that allows users to auto-fill forms. It saves users time by using the standard to pull in all their information, increasing the chance of conversion.
  • Ad-heavy pages above the fold: Some tweaks are coming to "turn up" this algorithm. Users shouldn't see a barrage of ads above the fold when they visit a site.
  • Tightening Authorship: Matt mentions that a tightening of Authorship may provide better results. Google is looking at roughly a 15% reduction to ensure that the quality of authorship stays high and relevant.
  • Rich Snippets: The ability to have and use rich snippets may be taken away from low-quality sites in the coming months. (See the markup sketch after this list.)
  • Smarter on JavaScript: Google is now fetching, rendering and indexing items called by JavaScript, and it is getting smarter at understanding JavaScript libraries.
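For context, rich snippets are generated from structured data markup such as the schema.org vocabulary. A minimal, purely hypothetical product example using microdata:

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Handmade Oak Coffee Table</span>
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
        Rated <span itemprop="ratingValue">4.7</span>/5 based on
        <span itemprop="reviewCount">89</span> reviews
      </div>
    </div>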
Now to the Q and A section:

Matt talks about +1s specifically: they are a short-term signal, but he is very bullish on the long-term signal of authorship. Next, Matt talks about negative SEO, which Google has worked on for years. With Penguin, it not only removes sites but can actually have a negative effect on a site. The disavow tool, announced last year, should be used as a last resort: use Webmaster Tools to find links and disavow them at the link or domain level. Webmaster Tools now gives better backlink samples, not just A-Z, so use it to help identify problems; you can see up to 100,000 links.

In response to Jason Calacanis' claims from yesterday, Matt polls the crowd on whether or not to go into the matter. The crowd wants to hear the response. Matt talks about the initial version of Panda and whether or not it should have been rolled out slowly. Matt says that this wouldn't have been good and cites multiple articles showing the degrading quality of the search results. Google needed Panda. A Googler made a personal blocklist extension to block specific sites, and nearly 200k users installed it; people did not want these content farms.

In response to Jason’s claims that Google wasn’t a good partner, Matt talked about the fact that no companies have partnerships with Google. There are YouTube partnerships, not Google search partnerships. In aggregate, Mahalo simply wasn’t a quality site and they came to an impasse at a personal meeting. This wasn’t even a webspam issue, it was a quality issue and nobody received special treatment.

With the Mahalo issue behind him, Matt talks about press releases. "If you are paying for PageRank, you probably aren't doing something right." Google has identified "a lot" of the top press release sites and ignores the links, but doesn't penalize those who are using them.

On infinite scrolling issues, Matt recommends using some type of paginated version as a safeguard so all content gets indexed. On the growing size of the Google bar, Matt mentions that they are aware of the size and pixels being taken up by Google.
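As a sketch of that pagination safeguard (the URLs are placeholders), each paginated view of the same series can declare its neighbors in the head so crawlers can walk the full set of content even if they never trigger the infinite scroll:

    <!-- on http://www.example.com/products?page=2 -->
    <link rel="prev" href="http://www.example.com/products?page=1">
    <link rel="next" href="http://www.example.com/products?page=3">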

That’s a wrap folks.

For More Info about Matt Cutts At Pubcon