Thursday, November 28, 2013

SearchCap: The Day In Search, November 27, 2013

Below is what happened in search today, as reported on Search Engine Land and from other places across the Web.

From Search Engine Land:


  • At some point recently, Google started to indicate terms that are trademarked in the Keyword Planner with the TM mark. Dan Shure of Evolving SEO tweeted about the new feature, noting that McDonald’s “I’m Lovin’ It” tagline is not marked as trademarked while Staples’ “The Easy Button” is marked as such. (Burger King’s “Have It […]

  • We hear it regularly – the death knell tolling for SEO. In the past several years, we’ve been bombarded by a barrage of change in organic search, from Penguin to Panda to Hummingbird, from inbound link penalties to [not provided]. Let’s face it: the only constant in search is change. But I propose that SEO isn’t […]
  • The holidays are officially upon us, and if you’re reading this, I must give thanks to you for your time. When I still worked at Google, this was the time of year when new product launches came to a crawl since most advertisers were too busy to deal with change, and only the most persistent […]
  • Many marketers think of social media in the context of B2C companies: after all, 4 in 10 consumers buy products that they’ve favorited, liked, tweeted or pinned on various social networks. Yet, according to a recent study from MarketingProfs, 87% of B2B marketers use social media platforms in their content marketing efforts. In fact, of […]
  • In a significant move, Google rolled out CPM bidding by viewable impression in AdWords this week. Advertisers will only be charged for ad impressions that can actually be viewed in-screen by users, rather than on the traditional served impression basis. New reporting metrics are also available, all powered by Google’s viewability measurement solution, Active View, […]
For More Info about Search Cap

Google Keyword Planner Now Shows Trademarked Terms

At some point recently, Google started to indicate terms that are trademarked in the Keyword Planner with the TM mark.

Dan Shure of Evolving SEO tweeted about the new feature, noting that McDonald’s “I’m Lovin’ It” tagline is not marked as trademarked while Staples’ “The Easy Button” is marked as such. (Burger King’s “Have It Your Way” also gets a TM.)


What is marked as trademarked and what isn’t appears to be somewhat inconsistent at this point. For example, in a set of queries for several Pittsburgh Steelers-related terms, “steellers” and “steelers pictures” are not marked as trademarked. I assumed that meant “Pittsburgh” needed to be included in the search to trigger the trademark. However, for some reason “steelers hat” is marked with the trademark symbol.

The NHL was the only major sports organization I could find that wasn’t marked; likewise, NHL team names such as Pittsburgh Penguins are not marked.

The trademarked terms can still be added to campaigns. The “TM” mark is not included with the keywords when they are added. Presumably Google is adding the marker to help eliminate user surprise when trademarked terms get disapproved. The inconsistency could cause more confusion if in fact keywords such as “steelers pictures” are eventually disapproved for trademark reasons.

For More Info about Google Keyword Planner

Friday, November 22, 2013

Google Seeking Feature Requests For Webmaster Tools

Google’s head of search spam Matt Cutts posted on his personal blog a request for webmasters to provide feedback and feature requests for Google Webmaster Tools.

Matt and the Google search quality team are looking for new ideas on what would make Google Webmaster Tools more useful to you. Matt talked about how far Webmaster Tools has come, but the team wants to continue making it more useful.

To submit feedback, leave a comment on Matt’s blog post.

Matt’s disclaimer:

To be clear, this is just some personal brainstorming–I’m not saying that the Webmaster Tools team will work on any of these. What I’d really like to hear is what you would like to see in 2014, either in Webmaster Tools or from the larger team that works with webmasters and site owners.


Monday, October 28, 2013

Google’s Matt Cutts: More Pages Does Not Equal Higher Rankings

In a new video released today by Google’s head of search spam, Matt Cutts, we learn that having more pages on a web site does not necessarily mean you will have better rankings.

Matt Cutts said, “I wouldn’t assume that just because you have a large number of indexed pages that you automatically get a high ranking. That’s not the case.”

He goes on to explain that the more pages you have, the more chances you have to rank for different keywords. Plus, the more pages you have, the more likely you are to have more overall links and PageRank, which do directly impact your rankings. But the number of pages on a site does not have a direct ranking benefit.

Here is the video:


Matt said it again at the end: “just having a number of pages doesn’t give you a boost.”



Sunday, October 27, 2013

LiveZilla Live Chat Server Installation

I. Users & Groups
________________________________________________________________________
The LiveZilla Server Admin contains several functions and options you will need to set up, update, and maintain LiveZilla Server installations. The first step when setting up LiveZilla is to create a LiveZilla Server installation on your webserver. Select "Create new LiveZilla Server" and press Next to proceed.

You are asked to enter the details of your administration account on the first page of the wizard. The username and password you specify here will be used to maintain and update the server you are going to create, so keep them safe. The administration account is also your first internal user (operator) and can be used for all daily tasks. You can create additional internal users once you have finished the installation.

The next step is to create the first LiveZilla group. Enter the email address of your support department and click Next.

II. FTP Upload
_________________________________________________________________________
You can directly upload the files to your webserver using FTP. All you need is a working FTP account with permission to create files and folders (write access). Enter the account details on the FTP setup page and do not forget to validate the target location for your LiveZilla Server Installation. 

How to find / validate the target location for your LiveZilla Server?
Press the Select button to see the file structure on your webserver. Make sure to create and select a dedicated folder as the target for the installation; otherwise you may overwrite files of your existing website. Also make sure that you don't upload the files to a location above or outside of your public web folder (wwwroot, public_html or htdocs), or you will not be able to access your LiveZilla Server. Press Next to initiate the file upload process.

Alternative to FTP Upload: Extract to local system
__________________________________________________________________________
It's also possible, as an alternative to FTP, to extract the files of the LiveZilla Server installation to your local computer. If you are working directly on the webserver, you can extract the files straight into the public web folder.

This option should also be used if you intend to upload the server files to your webserver using a common FTP client (such as FileZilla). Keep in mind that you will need to set the file permissions manually.

III. Determining the Server URL
__________________________________________________________________________
When all files have been uploaded to your webserver, you will be asked to determine the absolute URL of the folder containing your new LiveZilla Server installation. Please do not proceed until you have passed the server test, which confirms that the LiveZilla Server URL is correct.

How to find the correct URL?
Copy the URL suggested by the wizard and try to open it in your web browser. If the wizard's suggestion was correct, you will see a confirmation page and you are done here. Otherwise, you will need to find the correct URL yourself. If you uploaded the files to a folder named LiveZilla below your website root, the URL will likely look like http://www.yourwebsite.domain/livezilla.

Once you have determined the right URL and passed the Server Test click next to proceed to the next step.

IV. MySQL Database Settings
__________________________________________________________________________
The final step in this wizard is to set up the MySQL connection on your new LiveZilla Server.

MySQL Database

You will need to specify:

  • Host
    The hostname or IP of your MySQL/database server. Please note that all connections to the database will be initiated by the LiveZilla server script (PHP), not by your local LiveZilla Server Admin application. In most cases, the webserver and the MySQL server run on the same (logical) machine, which means that localhost will work as the hostname. If your MySQL database is installed on a remote machine, enter the hostname of the remote database server.
  • User
    The database user you would like LiveZilla to use. The user needs permission to CREATE/ALTER/DROP tables and to SELECT/UPDATE/INSERT/DELETE data in the chosen database.
  • Database Name
    The MySQL database you want to use for the LiveZilla Server Installation.
  • Prefix (optional)
    The prefix entered here will be added in front of each table name. This helps you organize data in a shared database and prevents LiveZilla's tables from interfering with those of other applications.

Enter your database details and press Create Tables to create the set of LiveZilla tables. Then validate database access and data consistency by pressing Validate Database.

Please note: you will not be able to create the tables and validate database access until the correct URL of your LiveZilla Server has been determined (previous page of this wizard)! 
_________________________________________________________________
Congratulations, your LiveZilla Server is now ready. The final step is to integrate LiveZilla with your websites.



Thursday, October 24, 2013

Matt Cutts At Pubcon 2013: Moonshots, Machine Learning & The Future Of Google Search

This morning, the head of Google’s webspam team, Matt Cutts, gave a keynote speech at Pubcon in Las Vegas. The keynote comes on the heels of yesterday's scathing day-one keynote from Jason Calacanis, who said that Google rules everything and that they are essentially evil. On Twitter yesterday, Matt asked if Jason wanted the polite response or a thorough one. All of us here in attendance are hoping for “thorough.”


Matt starts with the state of the index, talking about where Google will go in the future.

He’s proud that Google has doubled down on ‘moonshot’ changes, specifically:

  • Knowledge Graph Google has been trying to understand entities — not just the searches. So essentially they are trying to learn about “things not strings.”
  • Voice Search
  • Conversational Search
  • Google Now Matt is proud that today, sometimes you don’t even have to search to find information you need.
  • Deep Learning Google is looking more into the relationships between words, and will be able to read and interpret text at a higher level. This works well with voice search: when a user asks Google, “Who is the Prime Minister of Turkey?” and then asks, “How old is he?”, Google can answer using the context of the previous query.

Core Quality Changes

  • Hummingbird This change targets better natural language understanding. Search is more than just matching words; instead, it looks at which words are more meaningful for intelligent scoring. For instance, in a voice search for “what is the capital of Texas, my dear,” the “my dear” isn’t that important, and Hummingbird can detect this. While Hummingbird affected 90% of queries, it was a very subtle change that most users didn’t notice, but it will help users get more pertinent results.
  • Panda Softening This is something that Google has looked into to help bring some sites and content back.
  • Detecting/Boosting Authorities Not done by hand, but applies by topic areas. Webmasters can keep deepening their content on a topic to further their authoritativeness on a specific content area.
  • Smartphone Ranking If your phone doesn’t support Flash, Google won’t show you a site that requires Flash.

Webspam Changes

  • Penguin 2.0 & 2.1 Penguin 2.0 was released and wasn’t that intensive. Black hats said it wasn’t big, so Google turned it up in 2.1. More changes will be continually coming, so buckle up.
  • Spammy Query Algorithms Items like porn and payday loans will be targeted for better results. Right now the SERPS aren’t great, but they will be working on it.
  • Advertorials/Native Advertising Google has cracked down on publishers selling ads that blend in as editorial with dofollow links. You shouldn’t be paying for links that pass PageRank.
  • Spam Networks They’ve got a pretty good list and are just working their way down it. Matt joked that he should take a poll to determine who to axe next.

Communication

Google has done a great job of increasing communication with webmasters, especially:
  • New videos for malware/hacking
  • Concrete examples in guidelines
  • >100 speaking events, Hangouts on Air, webmaster office hours
  • The “How Search Works” website

Future of Search

  • Machine Learning Google’s goal is to provide the world with information. The phrase “search engine” appears nowhere in their mission statement. They want to be able to give answers to specific queries.
  • Mobile Mobile is coming faster than anyone expected. 40% of YouTube videos are now served to mobile devices. If you haven’t thought about mobile, it’s time to start thinking about it.
  • Social/Identity/Authorship Matt starts with, “Facebook did a great job of social and knowing who people are.” He then notes that the signal is not just likes/+1s/tweets; in the long term, social signals are a sign of authority. If you are someone worth listening to, search engines will think you are worth listening to as well.

Webspam Trends

  • Hacking Over the next six months, it’s going to look like Google isn’t working on much, but they are working on the next generation of hacking detection. Queries like “buy viagra” still look bad because people are breaking the law.
  • Hot Topics Items like child porn, international issues and really nasty queries are being addressed.
  • No Toolbar PageRank for the rest of the year The pipeline for updating Toolbar PageRank broke this year, and PageRank stopped updating. Google realized that this wasn’t so bad and left it alone, since people seem to pay too much attention to the metric. It’s something they will reassess at a later time.

Advice

  • Mobile Get ready, you need a mobile plan.
  • Request Autocomplete A new item in Chrome that allows users to auto-fill forms. It saves users time by using the standard to pull in all their information, increasing the chance of conversions.
  • Ad-heavy pages above the fold Some tweaks are coming to “turn up” this algorithm. Users shouldn’t see a barrage of ads above the fold when they visit a site.
  • Tightening Authorship Matt mentions that tightening Authorship may provide better results. Google is looking at a 15% reduction to ensure that the quality of authorship remains high and relevant.
  • Rich Snippets The ability to have and use rich snippets may be taken away from low-quality sites in the coming months.
  • Smarter on JavaScript Google is now fetching, rendering, and indexing content called by JavaScript. Google is getting smarter at understanding JavaScript libraries.
Now to the Q and A section:

Matt talks about +1s specifically: they are a short-term signal, but he is very bullish on the long-term signal of authorship. Next, Matt talks about negative SEO, which Google has worked on for years. With Penguin, it not only removes sites but can actually have a negative effect on a site. The disavow tool, announced last year, should be used as a last resort: use Webmaster Tools to find links and disavow them at the link or domain level. Webmaster Tools now gives better backlink samples, not just A-Z, so use it to help identify bad links; you can see up to 100,000 links.

In response to Jason Calacanis’ claims from yesterday, Matt polls the crowd on whether or not to go into the matter. The crowd wants to hear the response. Matt talks about the initial version of Panda and whether or not they should have rolled it out slowly. Matt says that this wouldn’t have been good and cites multiple articles showing the degrading quality of the search results. Google needed Panda. A Googler made a personal blocklist to block specific sites, and nearly 200k users installed it; people did not want these content farms.

In response to Jason’s claims that Google wasn’t a good partner, Matt talked about the fact that no companies have partnerships with Google. There are YouTube partnerships, not Google search partnerships. In aggregate, Mahalo simply wasn’t a quality site and they came to an impasse at a personal meeting. This wasn’t even a webspam issue, it was a quality issue and nobody received special treatment.

With the Mahalo issue behind him, Matt talks about press releases. “If you are paying for PageRank, you probably aren’t doing something right.” Google has identified “a lot” of the top press release sites and ignores their links, but doesn’t penalize those who use them.

On infinite scrolling issues, Matt recommends using some type of paginated versions as a safety guard to index all content. On the growing size of the Google bar, Matt mentions that they are aware of the size and pixels being taken up by Google.

That’s a wrap folks.

For More Info about Matt Cutts At Pubcon


Tuesday, October 22, 2013

Character Definitions for htaccess

#
the # instructs the server to ignore the line; it is used for including comments. each line of comments requires its own #. when including comments, it is good practice to use only letters, numbers, dashes, and underscores; this practice will help avoid potential server parsing errors.

[F]
Forbidden: instructs the server to return a 403 Forbidden to the client.

[L]
Last rule: instructs the server to stop rewriting after the current directive is processed.

[N]
Next: instructs Apache to rerun the rewriting process from the first rule, using the rewritten URL as input.

[G]
Gone: instructs the server to deliver Gone (no longer exists) status message.

[P]
Proxy: instructs server to handle requests by mod_proxy

[C]
Chain: instructs server to chain the current rule with the previous rule.

[R]
Redirect: instructs Apache to issue a redirect, causing the browser to request the rewritten/modified URL.

[NC]
No Case: defines any associated argument as case-insensitive. i.e., "NC" = "No Case".

[PT]
Pass Through: instructs mod_rewrite to pass the rewritten URL back to Apache for further processing.

[OR]
Or: specifies a logical "or" that ties two expressions together such that either one proving true will cause the associated rule to be applied.

[NE]
No Escape: instructs the server to parse output without escaping characters.

[NS]
No Subrequest: instructs the server to skip the directive if internal sub-request.

[QSA]
Append Query String: directs server to add the query string to the end of the expression (URL).

[S=x]
Skip: instructs the server to skip the next "x" number of rules if a match is detected.

[E=variable:value]
Environmental Variable: instructs the server to set the environmental variable "variable" to "value".

[T=MIME-type]
Mime Type: declares the mime type of the target resource.
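In practice, several of these flags are combined in a single comma-separated bracket pair at the end of a RewriteRule. A minimal sketch (the file names here are hypothetical placeholders):

```apache
# Enable the rewrite engine before using any RewriteRule
RewriteEngine On

# Match old-page.html case-insensitively [NC], issue a permanent
# redirect [R=301] to /new-page.html, and stop processing further
# rules for this request [L].
RewriteRule ^old-page\.html$ /new-page.html [NC,R=301,L]
```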

[]
specifies a character class, in which any character within the brackets will be a match. e.g., [xyz] will match either an x, y, or z.

[]+
character class in which any combination of items within the brackets will be a match. e.g., [xyz]+ will match any number of x’s, y’s, z’s, or any combination of these characters.

[^]
specifies not within a character class. e.g., [^xyz] will match any character that is neither x, y, nor z.

[a-z]
a dash (-) between two characters within a character class ([]) denotes the range of characters between them. e.g., [a-zA-Z] matches all lowercase and uppercase letters from a to z.

a{n}
specifies an exact number, n, of the preceding character. e.g., x{3} matches exactly three x’s.

a{n,}
specifies n or more of the preceding character. e.g., x{3,} matches three or more x’s.

a{n,m}
specifies a range of numbers, between n and m, of the preceding character. e.g., x{3,7} matches three, four, five, six, or seven x’s.

()
used to group characters together, thereby considering them as a single unit. e.g., (perishable)?press will match press, with or without the perishable prefix.

^
denotes the beginning of a regex (regex = regular expression) test string. i.e., the argument must begin with the character that follows.

$
denotes the end of a regex (regex = regular expression) test string. i.e., the argument must end with the character that precedes it.

?
declares as optional the preceding character. e.g., monzas? will match monza or monzas, while mon(za)? will match either mon or monza. i.e., x? matches zero or one of x.

!
declares negation. e.g., “!string” matches everything except “string”.

.
a dot (or period) indicates any single arbitrary character.

-
instructs “not to” rewrite the URL, as in “...domain.com.* - [F]”.

+
matches one or more of the preceding character. e.g., G+ matches one or more G’s, while "+" will match one or more characters of any kind.

*
matches zero or more of the preceding character. e.g., use “.*” as a wildcard.

|
declares a logical “or” operator. for example, (x|y) matches x or y.

\
escapes special characters ( ^ $ ! . * | ). e.g., use “\.” to indicate/escape a literal dot.

\.
indicates a literal dot (escaped).

/*
zero or more slashes.

.*
zero or more arbitrary characters.

^$
defines an empty string.

^.*$
the standard wildcard; matches an entire string of zero or more arbitrary characters.

[^/.]
defines one character that is neither a slash nor a dot.

[^/.]+
defines any number of characters, none of which is a slash or a dot.

http://
this is a literal statement — in this case, the literal character string, “http://”.

^domain.*
defines a string that begins with the term “domain”, which may then be followed by any number of arbitrary characters.

^domain\.com$
defines the exact string “domain.com”.
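The anchoring and escaping characters above are commonly combined in a canonical-hostname rule. A hedged sketch, with domain.com standing in for a real host:

```apache
RewriteEngine On

# ^domain\.com$ matches the exact host "domain.com": ^ and $ anchor
# the string, and \. escapes the literal dot. [NC] makes the host
# comparison case-insensitive.
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]

# (.*) captures the requested path, and $1 reinserts it in the
# target URL so the visitor lands on the same page under www.
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```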

-d
tests if string is an existing directory

-f
tests if string is an existing file

-s
tests if the file in the test string exists and has a non-zero size
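The -d and -f tests usually appear negated (with !) in the classic front-controller pattern, which routes every request that does not map to a real file or directory through a single script. A sketch, assuming a hypothetical index.php entry point:

```apache
RewriteEngine On

# If the requested filename is NOT an existing file (-f) ...
RewriteCond %{REQUEST_FILENAME} !-f
# ... and NOT an existing directory (-d) ...
RewriteCond %{REQUEST_FILENAME} !-d
# ... hand the request to index.php and stop rewriting.
RewriteRule ^.*$ /index.php [L]
```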

Redirection Header Codes
  • 301 – Moved Permanently
  • 302 – Moved Temporarily
  • 403 – Forbidden
  • 404 – Not Found
  • 410 – Gone
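Several of these status codes can be issued directly from .htaccess, either with the Redirect directive or with the corresponding RewriteRule flags. A few hedged examples (paths and hostnames are placeholders; the RewriteRule lines assume RewriteEngine On):

```apache
# 301 Moved Permanently: send requests under /old/ to a new location
Redirect 301 /old/ http://www.example.com/new/

# 410 Gone: tell clients a page has been removed for good
# (the lone dash means "do not rewrite the URL")
RewriteRule ^retired-page\.html$ - [G]

# 403 Forbidden: block direct access to a private directory
RewriteRule ^private/ - [F]
```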
