In #Google We Trust (Your Identity) – Search Engine Watch
Oct 30th, 2011 by netizenshane

"In 1993, Peter Steiner famously captioned a New Yorker cartoon, 'On the Internet, nobody knows you're a dog.' In 2011, trillions of dollars in e-commerce transactions and a growing number of other interactions have made knowing that someone is not only human, but a particular individual, increasingly important." – O'Reilly Radar

According to a White House blog post of October 14, 2011, Google, PayPal and Equifax are the first three official, federally credentialed Identity Providers in the NSTIC's (National Strategy for Trusted Identities in Cyberspace) Identity Ecosystem. Meh, you say, what the heck does that mean and why should I care?

Well hopefully today we can shed a little light on exactly what the NSTIC is, how it is being used, and why it (hopefully) matters to you.

Why Did The NSTIC Emerge?


The NSTIC arose out of the perceived need to create a more secure and frictionless space (read: no passwords) for online interactions and transactions, especially in the areas of e-commerce and government. The idea is to abolish passwords and instead verify your identity through trusted identity providers, and away you go: no log-ins, no fuss, no muss.

Well that is the simplified version. The full version of the NSTIC can be found here, though make sure you have settled in for a full day of reading and following up on references. Obviously, we will not go over everything it contains, but trust me, you will not be bored!

Hmm, curious? Innovation stopped because people had to log in? Really?

What Is The NSTIC?

The NSTIC is a government strategy that is guiding the development of the “Trust Framework” and the “Identity Ecosystem”.

The Trust Framework is the framework within which all Identity Providers, private (federally credentialed) companies, will operate. It sets out rules, procedures, and a network of peers that will regulate one another within the framework.

The Identity Ecosystem is the set of processes and protocols by which identities, yours and mine, will be held. It governs how your identity is transacted across providers and the third parties that interact with it. It will also define the systems to be used and how credentials are passed between the user and a third party, allowing the Identity Provider to act as the intermediary that confirms the trustworthiness of the individual.

Because the government is bound by legislation such as the Privacy Act of 1974, it would not be able to secure and authenticate individual identities without significant changes to our existing laws, so an Identity Provider will be a private corporation. And because relying on a single provider would be a poor choice both economically and for security, there will be many Identity Providers that users can select from to act as their intermediary.

Note that ALL private and government agencies participating in the NSTIC would be required to follow the FIPPs (Fair Information Practice Principles) standards to be credentialed. These are based on the Privacy Act of 1974, which itself applies only to federal agencies.

Trustmarks

These agencies then will be given Trustmarks, along with accredited third party recipients. A Trustmark is “a badge, seal, image or logo that indicates a product or service provider has met the requirements of the Identity Ecosystem, as determined by an accreditation authority. To maintain Trustmark integrity, the Trustmark itself must be resistant to tampering and forgery; participants should be able to both visually and electronically validate its authenticity. The Trustmark provides a visible symbol to serve as an aid for citizens and organizations to make informed choices about the providers and identity media they use.”

In the Identity Ecosystem and Trust Framework, Trustmarks serve almost the same purpose that, say, a VeriSign seal does for a website today, with one exception: without a Trustmark you would not be able to engage in the Ecosystem, or with any person or corporation that requires Identity Ecosystem credentials from its users.

Information Cards

On the individual user level, this concept of a Trustmark is handled with an "Information Card". An Information Card might combine the data you would find on your driver's license and your credit card, and would assist in verifying you to your Identity Provider.
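The NSTIC documents do not pin down a concrete card format, so purely as a mental model, here is a minimal sketch of the kinds of claims an Information Card might carry. Every field name here is hypothetical, not taken from the strategy documents.

```python
# Hypothetical sketch of the kinds of claims an Information Card might carry.
# Field names are illustrative only; the NSTIC does not mandate this structure.
information_card = {
    "card_id": "card-8f3a",              # identifier assigned by the Identity Provider
    "issuer": "ExampleIdentityProvider", # the federally credentialed Identity Provider
    "subject": {
        "legal_name": "Jane Q. Public",  # driver's-license-style identity data
        "date_of_birth": "1980-01-01",
        "address_verified": True,
    },
    "payment_verified": True,            # credit-card-style assurance, not the card number itself
    "assurance_level": 3,                # strength of the identity proofing performed
}
```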

Again, this is a simplified version; for a more detailed understanding, visit the NIST site on the NSTIC.

Privacy, The NSTIC and You.

The NSTIC claims the new Trust Framework and Identity Ecosystem will be privacy enhancing and will retain your anonymity. However, if you read the documents carefully, you will see this "privacy" applies only to the data shared between you and a third party; the documents do not state that the data the Identity Provider holds is any more private or anonymous.

Why?

Well, it is hard to say, but the NSTIC documents do seem to go to great lengths to NEVER address the actual database functionality at the transactional level. Why do I say this? Because I have read more than thirty of them, and they all write in glowing language about how your ID is protected at the third-party level, but do not describe how the data is stored and processed at the Identity Provider level. That was true until I ran across the precursor document Who Goes There? Authentication Through the Lens of Privacy (pages 190-191 in the PDF). Stop by and take a read; it is enlightening. Or just wait for my next article!

So How Does The Identity Process Work?

How does the actual identity transaction work? After all, someone, somewhere HAS TO BE holding all that data, right? Yes, someone is: your Identity Provider holds all your data, certificates of trust and information. Limited amounts of data from that profile are then exchanged with third parties, so you can perform transactions on the third-party site using the authenticated profile from your Identity Provider.
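As my own rough sketch of how such a transaction could flow (this is a simplification, not the NSTIC's actual protocol; the signing scheme and field names are assumptions), the Identity Provider signs a minimal assertion about you, and the third party verifies it without ever seeing your full profile:

```python
import hmac, hashlib, json

# Illustrative sketch: an Identity Provider releases only the attributes a
# third party needs, signed so the third party can trust them.

IDP_SECRET = b"shared-or-certificate-backed-key"   # assumed trust anchor

def issue_assertion(full_profile, requested_attributes):
    """Identity Provider side: release a limited, signed slice of the profile."""
    claims = {attr: full_profile[attr] for attr in requested_attributes}
    payload = json.dumps(claims, sort_keys=True).encode()
    signature = hmac.new(IDP_SECRET, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": signature}

def verify_assertion(assertion):
    """Third-party side: check the signature, never see the full profile."""
    payload = json.dumps(assertion["claims"], sort_keys=True).encode()
    expected = hmac.new(IDP_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["signature"])

profile = {"name": "Jane Q. Public", "over_18": True, "address": "..."}
assertion = issue_assertion(profile, ["over_18"])   # only what the site needs
print(verify_assertion(assertion))                  # True
```

Note that in this sketch the provider sees every request it signs, which is exactly the point made in the next section.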

Risk?

The risk is that the ID Provider that authenticates your data is also able to de-anonymize it, to discover who did what, when, where and why. Now you ask, why would they do that? Or you may say, meh, I am not doing anything wrong, why should I care?

Well, first I would say to you and the naysayers: no matter who has held your information or data online before, no one has ever held it in aggregate like this, as one huge data carrot.

Have you ever watched a great mystery show where the detectives have to go from clue to clue to put all the pieces together? That is how it is today. In the NSTIC world, that data is all held in one or a few Identity Provider databases, along with your online activities on ANY SITE that participates in the NSTIC online authentication system, and maybe even beyond that (have you heard of the Facebook tracking cookie that logs your visits to sites AFTER YOU LOG OUT of Facebook?).

I Don’t Care I Don’t Do Anything Wrong!

Well, to that I would say: ask the three WikiLeaks volunteers who were turned in to authorities after their Google accounts were searched without their knowledge because the federal government requested access (no subpoena needed). Or consider the new search Facebook is allowing employers to access, the one that lets them search not just you, but EVERYONE in your profile.

One only has to go to the EFF (EFF.org) to learn that the laws around privacy in cyberspace are weak and, in most cases, unspecified. In these cases your data is kept in a third-party database, so it can be accessed at the will of the company. And if you say it has nothing to do with you, I am sure that is true – oh wait – you have paid for all those downloads in your music files, right?

The Identity Ecosystem, The FTC and You.

So what does the data collection system look like? How does the government see you and your data in this new Ecosystem? Below is a draft from the FTC of how it sees your role in the data collection process. Ironically, you seem so large, until you get to the second image, where you are suddenly tiny and swarmed by entities.

So let’s take a look at the charts.

First are the Data Collectors

[FTC chart: Data Collectors]

Now We Have The Data Users (OUTSIDE RING)

[FTC chart: Data Users (outer ring) and Data Brokers]

And in between the Data Brokers

Notice in the second image the presence of government, law enforcement and, oddly enough, marketers. Marketers? Well, not that unusual when you consider that Google is an Identity Provider (and at some point Facebook may be too), along with others like them. Yes, your data, available to marketers. Now the question will be: at whose whim?

Voluntary

So it is all OK, then; after all, if you do not like this idea, the NSTIC documentation stresses that it is voluntary, that in this new Ecosystem the user has to be a willing participant who has volunteered to join. While this sounds great, volunteering, by definition, means you have the option not to volunteer, to NOT participate. The question then becomes, "Do you?"

Critics of the NSTIC (and I) argue that you will not truly have the ability to volunteer. When government agencies, state, local and federal alike, all move to this system, as they are doing now, you will not be able to interact with them without a proper Ecosystem ID.

Then think about it: what about G+, which, as Eric Schmidt of Google himself stated, is an identity network, not a social one? Then there are PayPal and Facebook; what happens when your everyday sites become Identity Providers? Sure, I guess you could completely opt out of your online life, but is that really practical? According to the NSTIC, there are supposed to be alternatives for those who do not want to participate, but there are no laws to force this, and for those who do offer a secondary option, how long will it be maintained?

The Good vs. The Bad

This press release by Identity Finder seems to be echoed by many in the security community. If implemented WITH NEW federal regulations:

If done well, an ideal NSTIC Identity Ecosystem could establish:

  • High levels of identity assurance online, increasing trust between Users and service providers
  • More secure online transactions
  • Innovation and new services
  • Improved privacy and anonymity
  • Increased convenience for Users and savings for service providers

HOWEVER, if the NSTIC fails to implement the necessary regulations, the resulting Identity Ecosystem could turn into a free-for-all identity marketplace and create the following risks:

  • Powerful identity credentials which, if lost or stolen, will enable hyper-identity theft
  • A false sense of control, privacy, and security among Users
  • New ways to covertly collect Users’ personal information
  • New markets in which to commoditize human identity
  • Few consumer protections against abuse or sharing personal information with third parties
  • No default legal recourse against participants who abuse personal information without consent

With the new credentialed Identity Providers established and NO NEW regulations in place, my fear is that no steps have been taken to prevent the second outcome.

It is quite simple: when you put something in a database and tie a user to a dataset, even by a unique identifier, somewhere that data is all held together, and reverse engineering it is not that difficult. Your identity not being kept with the government? Sure, that is a good thing, and of course the legal thing. However, is your data, where you go online, how you interact, what you do, better kept with private marketing and media companies, as illustrated in the FTC image? And what are the legal implications? Now that PayPal is a trusted provider, can the government go into your accounts without ever telling you? I don't know the answers to these questions, yet!
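To make the "reverse engineering is not that difficult" point concrete, here is a minimal illustration, with made-up data, of how a supposedly anonymous unique identifier lets separate activity logs be joined back into one profile:

```python
# Hypothetical illustration: even if each site only sees a pseudonymous ID,
# whoever holds the mapping (or enough joined logs) can reassemble the profile.
id_provider_registry = {"user-7f2c": {"name": "Jane Q. Public", "email": "jane@example.com"}}

shopping_log = [("user-7f2c", "bought office chair")]
health_log   = [("user-7f2c", "searched symptoms of insomnia")]
news_log     = [("user-7f2c", "read article on whistleblowers")]

profile = {}
for log in (shopping_log, health_log, news_log):
    for uid, activity in log:
        profile.setdefault(uid, []).append(activity)

# Joining the pseudonymous profile with the registry de-anonymizes it.
for uid, activities in profile.items():
    person = id_provider_registry.get(uid, {"name": "unknown"})
    print(person["name"], "->", activities)
```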

Til Then!

Understanding Google’s Latest Landing Page Quality Score Release
Oct 29th, 2011 by netizenshane

As Pamela Parker described at SEL recently, Google has announced that going forward, landing page quality will be a larger factor in an ad’s overall Quality Score.

Given that my last post for SEL was on Quality Score and suggested that landing page quality was mostly a hammer used to beat up bad actors and didn’t have much meaning for legitimate businesses, this announcement was a bit embarrassing. It also made me curious as to how quality will be defined.

I got a chance to speak with Jonathan Alferness, director of product development on the Google ads quality team, to delve deeper into the purpose of this initiative and try to glean insight as to how landing page quality would be measured.

Jonathan was very helpful with respect to the former question and understandably protective when it came to the latter.

Below, I’ll make a clear distinction between what Google is willing to say publicly about landing page quality and what I believe this means. I’ll summarize what Google has said either through my conversation with Jonathan or elsewhere in print in italics. My speculation will be in normal font.

Why Did Google Do This?

Jonathan made the case to Pamela and to me that the intent is to improve user experience.

When landing pages don’t line up well with the user’s search, the user has a poor experience, and ultimately that’s bad for everyone.

It’s bad for the user because they don’t end up where they expected to; it’s bad for the advertiser because the user clicked on an ad and ended up somewhere they didn’t want to be, which earns the advertiser no great brand impression; and it’s bad for Google, because that user is more likely to seek a different search experience and less likely to use Google’s sponsored links in the future.

This makes perfect sense. As I argued in last month’s post, Google maximizes its immediate term revenue by making QS 100% based on anticipated Click-Through Rate (CTR). Factors related to landing page or anything else reduce Google’s revenue per SERP view by reducing the importance of CTR.

However, poor landing page experiences might reduce Google’s long-term revenue by training users not to click on sponsored listings. That could jeopardize the business.

By creating a finer gradation between “your landing page stinks” and “it doesn’t stink” — essentially where we’ve been with landing page QS — Google rewards advertisers that pay attention to landing page decisions and creates another incentive to provide a great user experience.

In the long run, better landing pages lead to higher conversion rates, which in turn means an increase in the value of traffic, therefore allowing higher bids and correspondingly more traffic at the same efficiency. It’s a beautiful virtuous circle.
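To put rough numbers on that virtuous circle (the figures here are mine, purely for illustration, not anything Google or any advertiser has published):

```python
# Illustrative arithmetic only; the figures are made up.
value_per_conversion = 50.00      # what a sale/lead is worth to the advertiser
target_efficiency    = 0.50       # advertiser wants to spend at most 50% of value on media

for conversion_rate in (0.02, 0.03):              # before vs. after landing page improvements
    value_per_click = conversion_rate * value_per_conversion
    max_cpc_bid = value_per_click * target_efficiency
    print(f"conv rate {conversion_rate:.0%}: value/click ${value_per_click:.2f}, "
          f"affordable max CPC ${max_cpc_bid:.2f}")
# A 2% -> 3% conversion rate lift raises the affordable bid from $0.50 to $0.75,
# which buys more traffic at the same efficiency: the virtuous circle.
```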

It’s also at least part of the reason that Google bought Urchin and made Google Analytics free. Giving advertisers the tools to diagnose user behavior and improve website effectiveness should ultimately lead to better monetization of traffic and higher bids in the paid search auction. Smart. Very smart.

How Much Weight Will Be Given To Landing Page Quality?

More.

Google isn’t saying much on this topic, and what they’re saying is a bit of a mixed bag. Jonathan told Pamela that ads with high-quality landing pages will get a “strong boost”; he told me it would be a “slight boost”; and on the AdWords blog post it says: “We expect most campaigns will not see a significant change in overall performance.”

They began the roll-out in South America a month ago and appear to be rolling it out to smaller accounts in the US first. This suggests to me that they think it could have a material effect that they want to watch very carefully. Perhaps not.

Even minor changes to software should be tested thoroughly before roll-out. Perhaps most campaigns won't see a significant effect because most landing pages are roughly equivalent, so this change will mostly impact the "outliers"?

It’s hard to imagine that Google will sacrifice revenue on this, either. They’ve likely done the math to see how much gradations in landing page quality affect the propensity of people to use sponsored links in the immediate future, and have tuned this dial to be revenue positive even in the short run. It will go on to be yet another one of Google’s revenue dials, subject to periodic “tuning.”

How Will Google Determine ‘Quality’?

Jonathan said: “Google is focused on both relevance and content on the landing page in addition to the user response to such landing pages.”

I pressed on this with the question: “Will advertisers with Google conversion tracking on their sites have an advantage over other advertisers, given that conversion rates are a very clear signal as to whether users found what they wanted?”

His response was: “There is no special information derived for some customers but not others. We apply the principles of landing page quality and relevance to every ad and every landing page. The advertising system is dynamic, as an auction takes place for each query.”

The behavioral signals that Google traces must therefore be related to what it can track for all advertisers, not just those with Google conversion tracking: bounce rate, propensity to search again in short order, time on site, etc. I'd love to know how much of this behavioral data is gathered through Google toolbars as a sampling method rather than from watching users return to Google.com, but I don't know that.
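As a sketch of the kind of signal Google could compute for any advertiser without conversion tracking (my simplification, with fabricated log rows), a quick-return-to-SERP rate needs nothing more than ad-click and next-search timestamps:

```python
# Illustrative only: estimate a "pogo-stick" rate (user bounces back to the
# results page within a short window) from ad-click and next-search timestamps.
from datetime import datetime, timedelta

clicks = [  # (user, ad_click_time) -- fabricated sample data
    ("u1", datetime(2011, 10, 20, 10, 0, 0)),
    ("u2", datetime(2011, 10, 20, 10, 5, 0)),
]
next_searches = {  # user -> time of their next search on Google
    "u1": datetime(2011, 10, 20, 10, 0, 20),   # came straight back: 20 seconds
    "u2": datetime(2011, 10, 20, 10, 25, 0),   # stayed away for 20 minutes
}

THRESHOLD = timedelta(seconds=60)
quick_returns = sum(
    1 for user, clicked_at in clicks
    if user in next_searches and next_searches[user] - clicked_at < THRESHOLD
)
print(f"pogo-stick rate: {quick_returns / len(clicks):.0%}")   # 50%
```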

With respect to the relevance and content on the page, Google means to determine: Does this page match the user’s intent? Behavior is one guide, but content on the page will matter as well. Pages that seem to be about the topic of the user’s search will do better here than general landing pages.

It would be surprising if something as simple as echoing the user's search string or keyword back dynamically on every page would be meaningful in terms of content on the page, unless it affects user behavior.

I suspect the advertiser will be rewarded for landing a “ball peen hammer” search on a page with ball peen hammers on it, rather than a page with only wrenches.

High quality landing page for “ball peen hammer” search:

[Image: ball peen hammer category page]

Previously a high-quality landing page in Google's eyes — i.e., "not spam, and loads fast" — but now considered low-quality content:

[Image: wrench category page]

Another high-quality landing page for the same search (I bet they like a product page just as much):

[Image: ball peen hammer product page]

Whether the advertiser selects the ideal ball peen hammer page chosen among several options is probably beneath the level of detail they’re getting into at this point on the “content on the page” piece of the equation.

I’d be stunned if Google tried to differentiate between the first and third based on content on the page, but our experience suggests that the conversion rates will probably be higher for the first page.

One worry that I expressed was that Google would bake in some preconceived notions about good pages and bad pages.

For example, we’ve heard it expressed that Google thinks taking someone from a Google SERP to someone else’s SERP is a bad experience, therefore a website’s search results pages might be frowned upon as a landing page choice.

Would Google arbitrarily say this was a poor quality landing page for a “ball peen hammer” search?

[Image: ball peen hammer site-search results page]

That would fly in the face of our experience, which suggests that often a good site-search results page outperforms a well-chosen sub-category page in terms of conversion rates.

Similarly, a “view all” page often outperforms landing folks on page 1, but it’s conceivable that latency considerations due to loading “all” the images will trump what we know to be better performance from a conversion rate perspective.

Google could also conceivably decide which page on an advertiser’s site ranks highest for them in organic listings on a given search and reward folks who choose that page for paid search ads as well. That would rankle those of us who believe smart humans who pay attention pick better pages than Google does. More on that in a later post.

Jonathan responded that Google wasn’t baking in preconceived notions, that relevant content and user behavior would define the landing page quality.

Conclusions

Within e-commerce, this could serve as one more penalty for those who aren't mass merchants. If you sell office chairs of the $500 variety rather than the $40 variety, you'll likely find your conversion rate on the term "office chairs" to be worse than that of a "competitor" who sells the $40 version, even with qualifying ad copy such as "Starting at $499."

The conversion rate is lower, the bounce rate is higher, so behavioral signals after the click incur a landing page quality penalty on top of the CTR penalty incurred by the qualifying ad copy. This double whammy may make it even tougher to compete.

Maybe that’s the “right” result for users, but maybe it takes away diversity of choices for those who actually read the copy.

At the end of the day, it appears likely that this change will primarily reward advertisers for doing what they should have done all along. Pick the best page among the available choices for your PPC landing pages, and design those pages for maximum effectiveness in monetizing traffic. If you’ve been doing that since day one, expect a bit of a boost in traffic in the coming weeks.

Google Tweaks Competition Rank In AdWords Keyword Tool To “Low,” “Medium” or “High”
Oct 28th, 2011 by netizenshane

Many users are now seeing a change in the way that Google's AdWords Keyword Tool presents competition data. Instead of the traditional bars to display competitiveness, Google is using text descriptions.

Up until now, this tool used small green bars to display the competitiveness of keywords. The more green in the bar, the more competitive the term was:

[Image: old green competition bars in the AdWords Keyword Tool]

Now the Google tool is only showing text-based descriptions of competitiveness: Low, Medium or High:

[Image: new Low/Medium/High competition labels]

While the competition bars have been replaced by text in the web version, the good news is that Google has maintained the granularity behind how this ranking is assigned. In the previous version of the tool, users could download a report to get the exact number behind the green bar, and that is still the case after this change. Users who require more information can still download the report to a spreadsheet and get the same data:

[Image: downloaded keyword report showing numeric competition values]

For any of those who regularly use this competition metric, the new competitive labeling is as follows:

  • Low — Competitiveness number under .33
  • Medium — Competitiveness number between .33 and .66
  • High — Competitiveness number of .67 or higher
It should be noted that the competitiveness score relates directly to AdWords, not organic search. Google defines the metric as:

“This column shows the number of advertisers worldwide bidding on each keyword relative to all keywords across Google. The shaded bar represents a general low-to-high quantitative guide to help you determine how competitive ad placement is for a particular keyword.”
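For anyone who wants to keep working with the numeric value from the downloaded report, a minimal sketch of the mapping, assuming the bucket boundaries listed above, looks like this:

```python
# Maps the numeric competition value from the downloaded report back to
# Google's new text labels, assuming the bucket boundaries listed above.
def competition_label(score: float) -> str:
    if score < 0.33:
        return "Low"
    elif score < 0.67:
        return "Medium"
    return "High"

for value in (0.12, 0.50, 0.91):
    print(value, "->", competition_label(value))
```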

How To Save Money On AdWords Placements With Google Analytics
Oct 23rd, 2011 by netizenshane

Google’s display network can bring you tremendous amounts of clicks and conversions if used correctly. If it is not used correctly, you can quickly spend mass amounts of money and have nothing to show for it.

A couple of years ago, I wrote an article on how to manage the display network so you can spend most of your money on sites that are bringing in quality traffic. This is a quick graphic of the workflow that I still use today.

  • The Discovery Campaign has one of your lower daily budgets; its goal is to find good placements where you want to spend more money.
  • The Placements Campaign has one of your higher budgets, as it contains only sites that are helping you reach your overall goals.

[Image: Discovery/Placements campaign workflow]

While I find this workflow very useful, the overall problem is deciding when to block placements.

In your AdWords account, the only data you can see for any placement is conversions and conversion rates. The problem with so little data is that if you wait until you have enough statistically significant data to make a decision, you will never find all of the good placements, and you will have spent too much money on bad ones.

There is another way to gain insight into placements with Google Analytics that can help determine whether a site is sending you quality traffic.

Evaluating Placements With Google Analytics

[Image: AdWords placements report in Google Analytics]

To have access to this data, you need to link your Google Analytics account to your AdWords account.

Next, navigate to the placements information under the AdWords reports (found under traffic sources).

If you have goals set up, then you can sort by the goal completions, conversion rates and other data points to find the sites that are doing well for you.

[Image: placements sorted by goal completions]

While this data is useful for adding placements, it can also be useful for finding placements that you want to block even if you don’t have statistical data.

Sort Placements By Bounce Rates

Instead of trying to find sites that you want to add as placements, examine bounce rates to find sites where the traffic is so poor, you don’t want to wait for statistical data.

[Image: placements sorted by bounce rate]

In this case, we have a handful of sites that have each sent more than 18 visitors and have a 100% bounce rate. No one from any of those sites has even gone to a second page; therefore, we will often block these even though we don't have statistically significant data.

Please note, you don’t want to just block sites if their bounce rates are 100%. You should also double-check the ad copy and landing pages to make sure the offers are relevant for that site. If you consistently see high bounce rates for your display campaigns, then you might need to change the offer and landing page before deciding to block placements.

Just remember, a bounce in Google Analytics is a visitor who only went to a single page and then left your site. If someone gets to your landing page, picks up the phone and calls you, finishes an order over the phone and then leaves your site, they will be counted as a bounce even if they spent 20 minutes on the phone with you.
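Putting that into practice, here is a minimal sketch of flagging block candidates. It assumes you have exported the placements report to a CSV; the file name and column names are my own, not what Analytics exports by default.

```python
# Minimal sketch: flag placements that look like block candidates.
# Assumes an exported placements report saved as CSV with columns named
# "placement", "visits" and "bounce_rate" (0-100). Names are illustrative.
import csv

MIN_VISITS = 18        # enough traffic that a 100% bounce rate is meaningful
block_candidates = []

with open("placements_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        visits = int(row["visits"])
        bounce_rate = float(row["bounce_rate"])
        if visits >= MIN_VISITS and bounce_rate >= 100.0:
            block_candidates.append(row["placement"])

print("Review ad copy/landing page, then consider blocking:", block_candidates)
```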

Create Interaction Goals

A quick way of seeing which sites are bringing in good versus poor traffic is to create interaction goals within Google Analytics, which lets you create goals based upon time on site or page views per visit.
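To make the idea concrete, here is a small sketch with example thresholds of my own choosing (not defaults from Google Analytics):

```python
# Illustrative sketch of an interaction goal: a visit "completes" the goal if
# it lasts at least 90 seconds OR views at least 3 pages. Thresholds are examples.
def meets_interaction_goal(time_on_site_seconds: int, pageviews: int) -> bool:
    return time_on_site_seconds >= 90 or pageviews >= 3

visits = [(12, 1), (240, 2), (30, 4)]   # (seconds on site, pageviews) samples
completions = sum(meets_interaction_goal(t, p) for t, p in visits)
print(f"interaction goal completions: {completions}/{len(visits)}")
```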

[Image: creating an interaction goal in Google Analytics]

If you create goals with these types of metrics, then you can easily examine what sites are not meeting your basic minimum interaction and then block those sites that are underperforming.

Conversely, if you find that sites are bringing in visitors that are spending several minutes on your site, you shouldn’t block those sites until you have enough clicks to determine whether those visitors will eventually convert.

By using interaction goals, you can gain another level of insight into the placements where you are spending money, so you can make better decisions about blocking the sites or spending more money on the sites to gather more data.

[Image: placement report viewed by goal set]

If you have various types of goals on your site, I would recommend splitting out these types of goals by goal set. You might have one goal set that is all revenue events, and another one that is site interaction. By splitting these different types of goals out by goal set, you can see one tab of just interaction goals, and another tab of just revenue goals. That way your revenue goal events will not be polluted by site interaction events and vice versa.

Conclusion

Overall, I like Google’s display network. There is a lot of traffic and conversions to be had from managing it correctly. However, if managed incorrectly, the display network can be a money pit. Therefore, you do need a system for managing the display network so it will perform for you.

However, the patience and money required to always have statistically relevant data is beyond what most AdWords advertisers have. Therefore, when you see sites that have several visits and 100% bounce rates, feel free to block them quickly. When you see sites that have some visitors, and those visitors are spending time on your site, you should be more patient in determining whether the site will eventually be a converting one for you.

By using Google Analytics to examine your AdWords data, you can go beyond just examining conversion rates to also determining interaction rates and gaining another viewpoint into the placement sites where you are spending your money, so you can spend your budget as wisely as possible.

Siri, Quora, And The Future Of Search | TechCrunch
Oct 23rd, 2011 by netizenshane

With the rise of Google+, the decrease in controversial posting activity by famous tech people and the allure of other shiny new things, the majority of tech press has turned the focus of their gazes away from Quora, my favorite startup of 2010.

Well, now that Apple has gone and integrated the most sophisticated piece of AI ever to see the light of the consumer market into its iPhone 4S, I thought it was time to brush some dirt off Quora's shoulder and shine a light on what the future of the company could hold.

What most people who don’t get Quora miss when they write it off as “another Q&A site” (or whatever it is they say when they write it off) is this: When they first launched Quora in the fall of 2009, Quora’s founders and their first hire—designer Rebekah Cox—created the core of the most impressive “subjective knowledge extraction” machine ever constructed. (Yes, Wikipedia deserves its credit as the first juggernaut of this space, but Quora is positioned to eventually seize its mantle. Meanwhile, you could argue that the whole internet is the most impressive subjective knowledge extraction technology ever constructed, but that’s just semantics.)

By combining an answer voting mechanism and a reward addiction loop (upvotes are crack) with a strict identity requirement and a one-to-many follower model, Quora started solving the problem of extracting high-quality experiential knowledge out of humanity’s collective head and getting it into structured form on the internet. What’s more, Quora is also using humanity’s collective wisdom to rank it.

With this engine, Quora is building a database of human experience that could eventually contain the answers to a lot of questions people carrying the iPhones of the future might have.

Which brings me back to Siri.

For those of you who haven’t thought through it yet or haven’t played with the iPhone 4S, Siri is a game-changing technology: The thing knows how to translate the garble of human language into targeted API calls that subsequently pull out the correct information from a potentially ever-expanding set of databases (assuming that Apple one day integrates other databases into Siri, which I’m confident it will). The main thing standing between Siri and the best answer for our likely questions is that the database that contains these answers is still a work in progress.

That work in progress is Quora, which is probably why I heard the rumor that some massive search and advertising company that shall go unnamed until the next paragraph allegedly offered to pay upwards of $1B to acquire it.

If that rumor is true, it means that Google looked at Quora and understood the magnitude of the threat. If it’s not true, it means someone making high-level strategic decisions at Google is not paying attention. As I wrote in a post about Quora and Google in March:

Consider an internet on which the best answers to the majority of our queries come not from the vast, increasingly noisy expanses of the world-wide-web but from the concentrated knowledge and experience of its most articulate experts. Here, you no longer filter through 10 blue links (or hundreds) to find what you seek; you simply input your query and are delivered the top response. Should you find yourself asking a question no one has asked before, you merely add it to the stream, where it makes its way to the people who can answer it best.

Take Siri as the primary interface for these queries and that just about wraps it up: If Quora’s brilliant team successfully navigates the chasm between its passionate early adopters and the rest of the articulate set, their company could eventually, along with Siri, become an existential danger to the core of Google’s business.

Organic Click-Thru Rates Tumbling; Only 52% Click On Page One
Oct 22nd, 2011 by netizenshane

Only 52% of Google users click on an organic search result found on page one, and only 26% of Bing users do the same. Those stats are from A Tale of Two Studies: Establishing Google & BING Click-Through Rates, a new report from search agency Slingshot SEO. And they’re substantially lower than several prior CTR studies.

[Image: Slingshot SEO CTR curves, Google vs. Bing]

According to Slingshot’s research, a number one ranking on Google gets about an 18% click-thru rate and the number two organic listing gets about 10% CTR. Both of those are about double the CTRs of comparable organic listings on Bing — 9.66% and 5.51%, respectively.

Slingshot measured click-thrus on both search engines for 624 non-branded keywords between January and July of this year. The company identified a selection of keywords that showed stable rankings for 91 of its clients, which include large retailers and other enterprise-level clients.

This is the latest in a handful of attempts at measuring click-thru rates in organic search listings, and might be the most debated — largely because the numbers in Slingshot’s study are dramatically lower than previous CTR studies.

History Of CTR Studies

AOL Data, 2006

The first real chance that search marketers had to examine click-thru rates on organic search results came after AOL released 20 million search queries made by more than a half-million users in 2006.

[Image: AOL search data, CTR by position]

Based on that data, a number one ranking was worth about 42% of clicks, number two was worth about 12%, and ranking third was worth about 8.5% CTR. The 42% CTR for being number one is dramatically higher than the Slingshot study's figures, but the CTRs for lower positions are fairly similar.

The AOL data showed that the top 10 search results combined to get just less than 90% of all clicks — substantially higher than Slingshot’s 52% for Google and 26% for Bing.

Enquiro, 2007

In 2007, Enquiro (now known as Mediative) did a small CTR study that focused on the B2B market. The study is referenced about halfway down this blog post. Enquiro’s CTR figures were lower than the AOL data suggested a year earlier.

[Image: Enquiro CTR study results]

Enquiro’s data showed number one rankings getting about 25% of clicks, with CTR dropping to about 11% and 8% for the second and third positions, respectively. Overall, Enquiro’s study showed page one getting about 73% CTR.

In its blog post from last year, Enquiro goes on to say that it has “conducted additional research which suggests that the CTR is most likely around 26% for top organic, 12.68% for position two and 8.89% for third organic spot.”

Chitika, May 2010

About a year-and-a-half ago, the online ad network Chitika published its own study of organic CTRs from Google. Chitika examined Google traffic — 8.2 million impressions — coming into its network of advertiser sites and organized that traffic based on Google placement.

[Image: Chitika CTR by Google position]

The Chitika data shows the top three search results combining for almost 63% CTR, with the entire top ten broken down like this:

  1. 34.35%
  2. 16.96%
  3. 11.42%
  4. 7.73%
  5. 6.19%
  6. 5.05%
  7. 4.02%
  8. 3.47%
  9. 2.85%
  10. 2.71%

Chitika’s study credited page one of Google’s search results with a total CTR of almost 95 percent.
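Both of those figures are easy to sanity-check against the per-position numbers above:

```python
# Sanity check of the Chitika per-position CTRs quoted above.
ctr_by_position = [34.35, 16.96, 11.42, 7.73, 6.19, 5.05, 4.02, 3.47, 2.85, 2.71]
print(f"top three: {sum(ctr_by_position[:3]):.2f}%")   # 62.73% -> "almost 63%"
print(f"page one:  {sum(ctr_by_position):.2f}%")       # 94.75% -> "almost 95%"
```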

Optify, December 2010

Earlier this year, Optify published its results from a December 2010 study of Google CTRs covering both B2C and B2B websites.

[Image: Optify CTR curve]

Optify’s study showed a 36.4% CTR for the number one result in Google, followed by 12.5% and 9.5% for the second and third spots, respectively. Again, the top spot has a much higher CTR than in the Slingshot study, but other positions aren’t dramatically different.

Optify’s overall CTR for page one is about 89%, much higher than what Slingshot found for both Google and Bing.

Are CTRs Dropping? Why?

The search results page has changed dramatically since 2006 when AOL’s search data became public. In May 2007, Google launched Universal Search, which added news results, images, videos, local business listings with maps and a variety of other content types beyond the traditional 10 blue links. Last year, Google added Instant Search, a feature that changed the search results page while users typed in the search box.

Both of those, along with countless other features/tools, have changed the way in which searchers interact with search results. (Think about the number of queries that have instant answers and don't even require a click.)

When Bing launched in 2009, its search results page was anything but typical, too. Bing organized search results into categories and showed a lot more than 10 links on page one. It also shows Related Searches and Search History links above the fold, giving users several click options outside of the actual search results. Bing's web page previews — which Google has since copied — also mean that searchers can sometimes get the info they want without having to click on the search result.

Slingshot’s study aimed to measure the impact of new search results on CTR, but it “failed to prove that there are significant differences in user behavior regarding blended versus non-blended results.”

[Image: Slingshot blended vs. non-blended CTR comparison]

Slingshot’s conclusion: “The effect of blended results on user behavior remains to be seen.”

Yes, the search results page has changed. Yes, raw search numbers are going up. It could be that any CTR study will produce results that are unique to the keywords/industries that the study covers — i.e., if Slingshot had studied non-retail and non-enterprise keywords, CTRs might be quite different.

Ultimately, there’s no easy explanation for why click-thru rates are so much lower in this study than in past attempts to measure organic CTR.

Google Intros Dynamic Search Ads
Oct 22nd, 2011 by netizenshane

Google is opening wider a beta test of Dynamic Search Ads, an interesting new type of AdWords ad for larger advertisers that eliminates the need for keywords.

With this ad type, designed for retailers or other advertisers with large, often-changing inventory, Google automatically generates ad copy — based on the advertiser’s template — by looking at the content in the advertiser’s Web site. Google also automatically displays the ad in response to search terms it thinks are a good match, without the advertiser having to select keywords. Google has been using a similar no-keywords approach in its program for small local advertisers, AdWords Express.

For Dynamic Search Ads, advertisers input their Web site URL or the URL of a range of pages on their site — say, the pages a retailer uses to promote its women's clothing — and select a bid price based on the value of that category to them. Google then continually crawls the Web site so it knows when inventory changes, and can theoretically respond with relevant ads more quickly than a marketing team that's manually creating keywords and ads. The system is also designed to keep on top of changes in the types of queries people are performing — Google says 16% of searches every day are new.
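As a rough illustration of the mechanics (not Google's actual system; the template syntax and page data here are invented), the ad is assembled from a per-advertiser template plus whatever the crawl knows about the matched page:

```python
# Illustrative sketch only: dynamically build ad text from an advertiser
# template and crawled page data. Not Google's actual implementation.
crawled_page = {
    "url": "https://www.example-store.com/womens-clothing/red-wool-coat",
    "title": "Red Wool Coat",
    "category": "Women's Clothing",
}
advertiser_template = "Shop {title} | Free Shipping on {category}"

def build_dynamic_ad(page, template):
    headline = page["title"]                 # headline generated from the matched page
    description = template.format(**page)    # body built from the advertiser's template
    return {"headline": headline, "description": description, "destination": page["url"]}

print(build_dynamic_ad(crawled_page, advertiser_template))
```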

In an effort to keep this from impinging on advertisers’ existing campaigns, the system will hold back the dynamically generated ad in favor of advertiser-created copy, if the advertiser already has a campaign targeting the specific search term.

“We want to make sure it doesn’t affect keyword campaigns,” Baris Gultekin, director of AdWords product management, told me. “This is purely incremental.”

Gultekin says the company will provide advertisers with reporting on search terms that generated clicks, the matched destination pages and ad headlines generated, average CPC, clicks and conversions. Advertisers may optimize by adjusting a max CPC bid.

The new ad type has been in development for two and a half years, and “a couple hundred” advertisers across a variety of verticals have already been testing it. Gultekin says advertisers on average are seeing 5-10% increase in conversions with a positive ROI.

One advertiser in particular — ApartmentHomeLiving.com, a real estate Web site with constantly changing inventory — says it saw a 50% increase in conversions at an average cost-per-conversion that’s 73% less than their normal search ads. The company is already a seasoned search marketer with campaigns of up to 15 million keywords.

Dynamic Search Ads are available in all languages and all countries currently, but only to advertisers in the limited beta. The company is soliciting inquiries from customers that might be interested in participating in the beta in order to widen its reach.

Google: Local Now 40 Percent Of Mobile, Physical Distance Becomes A Ranking Variable For AdWords
Oct 22nd, 2011 by netizenshane

Today Google is introducing some new ad formats for mobile and announcing that it will now use location proximity as a scoring factor in deciding which AdWords to show the mobile user. Google’s Surojit Chatterjee also shared some mobile data in the context of announcing the new units and policy:

Click to Call is now driving “millions of calls per week” globally for Google. (This is a big contributor to Google’s $2.5 billion in mobile advertising.)

Google now officially says that 40% of mobile queries are related to location. (Microsoft has said this number is roughly 53 percent.)

Just as Google recently made the presence of a mobile-optimized landing page or site a factor in mobile ad quality scoring, it is now doing the same with location proximity:

Today we’re announcing that the distance between a user and an advertiser’s business location is now a factor in mobile search ads ranking. This means an ad for a business with a physical location close to a consumer may perform better in AdWords—driving more mobile traffic at a lower cost. The feature will be effective only when consumers opt in to share their device location for mobile searches.

Marketers will need to use Location Extensions to take advantage of this, and the smartphone user will need to be opted-in to location for location proximity to kick in. But it’s another platform-specific ad scoring variable being introduced in mobile.
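Google doesn't publish how distance is weighted, but as a toy sketch of what "distance as a ranking variable" could mean in practice (the scoring formula and weights below are entirely my own):

```python
# Toy illustration: fold distance into an ad score. The weighting and the
# formula are invented for illustration; Google does not publish its model.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def ad_score(bid, quality, distance_km, distance_weight=0.1):
    # Closer businesses get a boost; farther ones decay toward plain bid * quality.
    proximity_boost = 1 / (1 + distance_weight * distance_km)
    return bid * quality * proximity_boost

user = (37.7749, -122.4194)   # opted-in device location (San Francisco)
ads = [("Near Cafe", 1.00, 8, (37.78, -122.41)), ("Far Cafe", 1.20, 8, (37.30, -121.90))]
for name, bid, quality, loc in ads:
    d = haversine_km(*user, *loc)
    print(f"{name}: {d:.1f} km away, score {ad_score(bid, quality, d):.2f}")
```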

Beyond this, Google is debuting some new mobile ad formats:

Custom Search Ads for apps: will appear when users search within apps for content. Search ads can now be integrated into apps, which wasn't available previously.

Click to Download ads: will direct users to download areas in Android Market or iTunes app store.

Mobile App Extensions: these ads can direct people to specific apps or pages within apps already on their phones. Here’s Google’s example: “If someone searches for sneakers on a mobile device, they might see an ad that takes them directly into a cool shopping app they’ve installed on their phone.” It’s not clear what happens if the app isn’t on the user’s phone already.

Circulars (not limited to mobile): these are more graphically rich ads that contain product images, pricing and deals/offers. Google says, “When someone clicks on a search or display ad (on desktop, mobile or tablet devices), they may see these engaging ads which contain photos of relevant products and special offers.”

Google's mobile business is really accelerating, and these new ad formats and policies reflect that growth. There's nothing comparable to this coming out of Microsoft or Yahoo right now.

 

The Highs & Lows Of Search Retargeting: Version 3.0
Oct 22nd, 2011 by netizenshane

This industry evolves fast, but damn! Just 18 months ago, most media planners and search marketers had not heard of search retargeting, and already we are in what could easily be called version 3.0. With the agency hat back on (for today), we look at whether this tactic is living up to the growing hype.

When the principle was first explained to me, I was running an agency display media team at a search agency that was focused on direct response clients; I was therefore interested in tactics that involved precise data points as a way to focus on user intent.

Search retargeting seemed to fit the mold perfectly: target just those individuals with display ads who have actually searched for the terms that were relevant to the client, eliminating nearly all wastage from the plan.

We were building what we called the agency’s “foundation layer” of display: site retargeting to fix on-site conversion, search retargeting to prospect and plug the leak from SEM, and social retargeting to add further scale to the audience. So we picked five clients who had a pre-agreed testing budget and rolled out search retargeting with an early vendor, only to see four out of five of the campaigns bomb!

The primary reason was that in Search Retargeting 1.0, there was no scale in the data and little effort invested in the media placement. The campaigns were great when spending $100, but as soon as the vendor tried to scale to fill the budget, they would have to broad match and lose the relevancy, and of course the ROI.

Growing The Data – Search Retargeting 2.0

But sticking with it, campaigns began to perform better over time, and in almost direct correlation with the quantity of data that was available. Now we could focus more on the relevant terms and ignore some of the broad head terms. And as any search marketer can tell you, volume comes from the broad terms, but ROI comes from the specific. With data volume no longer such a problem, search retargeting 2.0 was on the horizon.

The theory states that search retargeting should outperform most other display placements because of its accuracy, and even come close to the performance of your search marketing efforts — but as an industry, it simply was not there yet.

But with the intersection of search and display looking like the future of digital, I left the agency to go help make this work.

The Data High

Too many marketers went through a phase of being high on data, believing that a single reference point was all that was needed to generate great DR results. They lost sight, in their excitement, of the continuing importance of creative messaging and the context of the media placement. Knowing who to talk to is important, but doing that in the right environment and with the right story really matters.

As popularity in search retargeting grew, so did the funding, which allowed the successful players to build their own DSP (Demand Side Platform) technology to manage the quantity of data and build in these essential elements. Marrying thousands of keywords with thousands of potential ad placements is not easy though, particularly when you have to do it in real time and at the keyword level.

Advanced Optimization — Search Retargeting 3.0

In a recent Chango search retargeting campaign from a large retailer, the need for keyword level optimization is clear:

Search retargeting example 1:

  • ‘clothes shop’ — CTR of 0.87%
  • ‘clothes shopping’ — CTR of 0.25%

Search retargeting example 2:

  • ‘shoes mens’ — CTR of 0.16%
  • ‘mens shoes’ — CTR of 0.21%

The search marketer is used to a world where this type of analysis is commonplace, but what is different is the choice of media sources.

In SEM, you choose from two major engines and then can add the extended network, usually by just ticking a box and forgetting it. But with real-time display, including search retargeting, we can buy in excess of 100,000 QPS (Queries Per Second — a simple measurement of media capacity).

Therefore managing search retargeting campaigns today is complex. Typically a campaign will need to be optimized manually once a day, but then “machine learning” must be used to balance the multitude of options available.

In our examples above, the term ‘clothes shop’ clearly had a better type of intent than ‘clothes shopping’ for our client, but that could only be determined by analyzing the placements across tens of thousands of sites. The balance of people and technology provides the scalable solution (but interestingly also blurs the line between agency and vendor).
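As a sketch of the keyword-level, placement-level bookkeeping this implies (toy data and a made-up pause threshold, not Chango's actual system):

```python
# Toy sketch: aggregate CTR by (keyword, placement) pair and flag laggards.
# Data and the pause threshold are invented for illustration.
impressions = {
    ("clothes shop", "news-site.example"):     (12000, 104),   # (impressions, clicks)
    ("clothes shop", "blog.example"):          (9000, 14),
    ("clothes shopping", "news-site.example"): (15000, 38),
}

PAUSE_BELOW_CTR = 0.002   # 0.2%
for (keyword, placement), (imps, clicks) in impressions.items():
    ctr = clicks / imps
    action = "keep" if ctr >= PAUSE_BELOW_CTR else "pause"
    print(f"{keyword!r} on {placement}: CTR {ctr:.2%} -> {action}")
```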

In addition, search retargeting 3.0 leverages dynamic creative, but unlike a typical dynamic setup, there is actual search data to work with, producing richer and more relevant experiences for the end consumer. Search retargeting sprang out from a sea of providers buying on the exchanges, but now seems to be leading in terms of what can really be achieved.

[Image: search retargeting diagram]

A Blessing & A Curse

Search retargeting would probably not be the name our micro-industry chooses if it got to choose again. When media planners hear it they immediately get excited, as they know a good media plan should always include some type of retargeting. But their first assumption is that it targets their existing site visitors. Many conversations begin by saying this isn’t the retargeting you thought it was!

But once marketers understand, they see its value for the long term. Like site retargeting and the SEM program itself, it typically becomes an evergreen program, running continuously as a reliable source of revenue.

In Summary …

Search retargeting arrived on the media scene less than two years ago, and less than one year ago for most media planners. It leverages the power of search and executes it with the scalability of display. It is enormously complex because of the volume of both keywords and media placements, so early campaigns were often not successful (v1.0). But as the industry grew, so did the data, and with it came a certain amount of reach (v2.0).

Today, major brands invest hundreds of thousands of dollars per month on evergreen and seasonal search retargeting campaigns, thanks to the results made possible by in-house DSP bidding technology with true keyword-level granularity. Machine learning, dynamic creative and lots of experience mean that version 3.0 is upon us … and growing.

Google Puts A Price On Privacy- SEL
Oct 22nd, 2011 by netizenshane

Google's a big company that goes after revenue in a variety of ways that some critics feel put users second. However, I'm struggling to think of other examples where Google has acted in such a crass, it's-all-about-the-revenue manner as it has this week. The best comparison I can think of is when Google decided to allow Chinese censorship. Yes, this is in the same league.

It's in that league because Google is a company that prides itself on doing right by the user. Yet in this case, it seems perfectly happy to sell out privacy, if you're an advertiser. That's assuming you believe that the Caller ID-like information that's being blocked (except for advertisers) is a privacy issue.

Google doesn’t, as best I can tell. Instead, the blocking is a pesky side effect to a real privacy enhancement Google made, a side effect Google doesn’t seem to want to cure for anyone but advertisers.

If it had taken a more thoughtful approach, ironically, Google could have pushed many sites across the web to become more secure themselves. It missed that opportunity.

I’ll cover all of this below, in detail. It’s a long article. If you prefer a short summary, skip to the last two sections, “Why Not Get Everyone To Be Secure” and “Moving Forward.”

Default Encrypted Search Begins

Let’s talk particulars. On Tuesday, Google announced that by default, it would encrypt the search sessions of anyone signed in to Google.com. This means that when someone searches, no one can see the results that Google is sending back to them.

That's good. Just as you might want your Gmail account encrypted, so that no one can see what you're emailing, you may also want the search results that Google is communicating back to you to be kept private.

That’s especially so because those search results are getting more personalized and potentially could be hacked. The EFF, in its post about Google’s change, pointed to two papers (here and here) about this.

Encryption Can Break Caller ID

There’s a side effect to encryption that involves what are called “referrers.” When someone clicks on a link from one web site that leads to another, most browsers pass along referrer data, which is sort of like a Caller ID for the internet. The destination web site can see where the person came from.

When someone comes from an encrypted site, this referrer information isn’t passed on unless they are going to another encrypted site. That means when Google moved to encrypted search, it was blocking this Caller ID on its end for virtually all the sites that it lists, since most of them don’t run encrypted or “secure” servers themselves.

This is a crucial point. Encryption — providing a secure web site — doesn’t block referrers if someone goes from one secure web site to another. Consider it like this:

  • Unsecure >>> passes referrer to >>> Unsecure
  • Secure >>> passes referrer to >>> Secure
  • Secure /// does NOT pass referrer to /// Unsecure

Google’s Referrer Problem

If everyone on the web ran secure servers, aside from the web being a more secure place in the way that Google itself wants it to be, the referrer hypocrisy that Google committed this week wouldn’t be an issue.

The vast majority of sites don't run secure servers, of course. That posed a problem for Google. Referrers from search engines are unique: for as long as we've had search engines — over 15 years — the links people click on from search engine results have contained the search terms people used.

For publishers, this has made search marketing incredibly powerful. They are able to tell exactly what terms were used when someone found their web site at a search engine like Yahoo, Bing or Google.

Moving to secure searching meant that Google was suddenly, dramatically, no longer going to send this information to publishers because, as I've covered, virtually none of those publishers were running secure servers. As a result, Google almost certainly realized there was going to be backlash.

Putting A Price On Privacy

Google could have endured the backlash, saying that if publishers still wanted this data, they could move to secure servers. Instead, it deliberately chose to override how referrers are passed, so that they would continue to be provided to just its advertisers.

Backlash from publishers, Google would endure; but apparently not backlash from those who made Google nearly $10 billion last quarter alone.

To solve this, Google changed from the standard way that referrers are supposed to be passed to its own unique system, which works like this:

  • Secure /// does NOT pass referrer to /// Unsecure unless…
  • Secure >>> passes referrer if ADVERTISER to >>> Unsecure
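To restate those rules as a toy function of my own (real browsers implement this in their Referer handling, not in anyone's application code):

```python
# Toy restatement of the referrer rules described above.
# Standard behavior: a secure (HTTPS) page does not pass a referrer to an
# unsecure (HTTP) destination. Google's exception: it still passes one to
# its advertisers' unsecure landing pages.
def referrer_passed(source_secure: bool, destination_secure: bool,
                    destination_is_google_advertiser: bool = False) -> bool:
    if not source_secure:
        return True                            # unsecure -> anywhere: referrer passed
    if destination_secure:
        return True                            # secure -> secure: referrer passed
    return destination_is_google_advertiser    # secure -> unsecure: normally blocked

print(referrer_passed(True, False))                                         # False: organic publisher
print(referrer_passed(True, False, destination_is_google_advertiser=True))  # True: paid click
```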

Let me be very clear. Google has designed things so that Caller ID still works for its advertisers, but not anyone else, even though the standard for secure services isn’t supposed to allow this. It broke the standard, deliberately, to prevent advertiser backlash.

The PR Plan For Publisher Backlash: It’s A Tiny Loss!

Google still knew there would be backlash from another group of publishers, those who have received this Caller ID referrer data from Google’s “free” or “organic” or “editorial” or “SEO” listings. What was the solution for that problem?

Here, Google seems to have a three-fold approach. First, suggest that only a tiny amount of data is being withheld. Some scoffed at Google's estimate, which I reported, that this would impact less than 10% of query data. But so far that seems to be holding true.

For example, here was our second most popular keyword sending us traffic from Google yesterday, according to Google Analytics:

[Screenshot: organic search traffic keywords in Google Analytics]

“Not Provided” is what Google reports in cases when it now blocks referrers — or technically, it still provides referrers but is specifically stripping search terms out of them.

Our number two keyword! And yet, we received nearly 15,000 keyword-related visits from Google yesterday. These terms that were withheld amounted to only 2.6% of them.

On my personal blog, this is in about the 2% range. SEOmoz reported around 2%, as well.

These low figures will make it easier for Google to gloss over publisher concerns, especially when they're almost all being voiced by those in the SEO industry. The industry has a bad name, so if it's against something, that can almost seem like a ringing endorsement that the change is good.

Ars Technica had some comments like this, in response to its story on the Google change:

I’m playing the saddest song in the world on the smallest violin in the world. Poor, poor, SEO leaches

I AM completely unsympathetic. The sooner these SEO leeches, parasites, spammers and scammers die die die the better off the web will be.

Don’t make this mistake. This is not just an SEO issue. This is a user privacy issue. SEOs are simply the harbingers spotting Google’s hypocrisy around privacy.

The Data’s Still Around!

The second bit of PR messaging was to reassure that plenty of search data can be found in another way through the Google Webmaster Central service.

This is true. Google does provide search query data through this service, and it’s warmly welcomed by many site owners.

However, Google also provides search query data to its advertisers through the AdWords system. That’s the publisher equivalent to Google Webmaster Central.

Since advertisers can get data through AdWords, just as publishers can use Google Webmaster Central, why does Google still need to deliberately override how referrers would normally be blocked just for advertisers?

Google argued in its blog post that advertisers needed referrer data "to measure the effectiveness of their campaigns and to improve the ads and offers they present to you." Outside of keyword-level conversion tracking and retargeting, that doesn't hold up for me. I'll come back to both.

Google Said Referrer Data Was Better

By the way, Google is on record as saying the data in Google Webmaster Central for publishers is not as good as referrer data.

This comes from an online exchange between Matt Cutts, the head of Google's web spam team, who acts as a liaison on many publisher issues, and Gabriel Weinberg, the founder of the tiny Duck Duck Go search engine, who was challenging Google over providing referrer information.

Weinberg wrote:

So now that we know what is going on, why allow this personal information to leak? As far as I can tell, the only reason is so Webmasters can do better at Google SEO. And that reason can be wholly mitigated through the use of Google’s Webmaster Tools.

Cutts responded (and I’ll bold the key part):

Google’s webmaster tools only provide a sampling of the data. We used to provide info for only 100 queries. Now we provide it for more queries, but it’s still a sample.

Please don’t make the argument that the data in our webmaster console is equivalent to the data that websites can currently find in their server logs, because that’s not the case.

In January of this year, data from Google Webmaster Central was deemed inferior to referrer data. In October, it’s repositioned as an acceptable alternative to blocking referrers.

Referrers Are Private!

Google’s third and most important method of countering backlash is to make out that referrer data is somehow so private that it can no longer be provided to publishers. If you read closely, however, you understand that Google never actually takes this position. Rather, it’s implied.

Google’s blog post on the change made no mention — none — that this move was done because referrers had private information that might leak out. It was only about protecting the search results themselves:

As search becomes an increasingly customized experience, we recognize the growing importance of protecting the personalized search results we deliver.

Remember those studies I mentioned? Those were all about search results, not about referrers.

Referrers only get mentioned in Google’s post as a heads-up to publishers that they’ll be lost, and not because they’re also private and need to be protected but rather — well, Google doesn’t explain why. The implication is that they just have to go.

As I’ve read stories in the broader press, I’ve seen the assumption that Google is blocking referrers because it considers them to be private. Heck, I came away from my initial interview with Google when the news broke thinking the same thing.

It’s no wonder. Because Google has deliberately broken security to pass referrers to advertisers but not publishers, it had to lump that qualification into the overall security story. It made referrer blocking seem like it was done to protect privacy, rather than the troublesome side effect it really was.

But You Didn’t Say They Were Private Before

To emphasize how not-private Google has viewed referrer data, consider two issues.

The first was in 2009, when Google made a change to its search results that broke referrers from being passed. Publishers were upset, and Google restored referrers.

Cutts — who, keep in mind, is one of the people Google had talk about this week's encryption change — tweeted "yay" about the restoration. Clearly, he didn't see any privacy issue with it then. He was happy Google went out of its way to bring referrers back.

[Screenshot: Duck Duck Go and Google]

Think 2009 is too far back? OK, at the beginning of this year, Duck Duck Go — aside from buying a billboard to attack Google on privacy grounds — launched an illustrated guide to alleged Google privacy issues, including concerns over referrer data.

In reaction to that, Cutts pushed back on referrers being a problem:

Referrers are a part of the way the web has worked since before Google existed. They’re a browser-level feature more than something related to specific websites.

When he was further challenged on the issue by Duck Duck Go’s founder Weinberg, Cutts specifically did not include referrers in a list of things that seemed to be private:

That was the same day we announced SSL search, which prevents referrers to http sites….

The fact is that Google has a good history of supporting privacy, from fighting overly broad subpoenas from the DOJ to SSL Search to creating a browser plugin to opt out of personalized advertising.

On a personal note here, I like Matt Cutts. I'm not trying to single him out unfairly by citing this stuff. He's just a Googler who is extremely close to the issue and knowledgeable about it, and even when he's speaking in a semi-official manner, what he says reflects back on what Google is doing.

Personally, I get the impression he might not agree with the referrer blocking for publishers but is going to put the best spin he can on a decision that his company made. Just my gut feel, and no special knowledge here. I could be wrong. Maybe I can get him to share more later.

Google Change Benefits Google

I think it’s fair to say that Google has not agreed with the view that referrers are private, nor has it clearly said referrers were blocked to protect privacy.

So why do it? One reason is that it makes Google more competitive. If someone lands on your web site, and you know the search term they used, you can then target them in various ways across the web with ads you believe reflect that search interest. All you need is the initial term.

This is called “retargeting,” and Google’s a leading provider of retargeted ads. When you cut the referrers out, except for your own advertisers, Google makes it harder for its competitors to offer retargeting services. Search marketers already understand this. Wait until Google’s anti-trust enemies clue in. They’ll be swooping in on this one (and we’ll have more to say on it in the future).

Another benefit is that it prevents anyone but Google’s own advertisers from doing keyword-level conversion tracking. With search referrers, you can determine what someone who searched for a particular term later did on your site. What further pages did they go to? Did they purchase a product or service? Without the search terms, you can’t do this degree of analysis.

That is, of course, unless you buy an ad. Conversion tracking at the keyword level turns into another sales feature for Google.
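For context, here is roughly what publishers lose. A sketch (my own names, not any analytics product's API) of keyword-level conversion tracking built on the referrer term from the earlier snippet:

// Sketch: remember the entry keyword, then attribute a later conversion to it.
// keywordFromReferrer() is the helper sketched earlier in this piece.
function rememberSearchTerm() {
  var term = keywordFromReferrer(document.referrer);
  if (term) sessionStorage.setItem('entryKeyword', term);
}

function recordConversion(orderValue) {
  var term = sessionStorage.getItem('entryKeyword') || '(not provided)';
  // In practice you would send this to your own analytics endpoint.
  console.log('conversion', term, orderValue);
}

Without the keyword in the referrer, rememberSearchTerm() has nothing to store, and every conversion falls into the "(not provided)" bucket.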

Didn’t Think Or Don’t Care?

I think the biggest reason Google hasn't fixed the broken referrer problem is either that it just didn't care about publishers or that it didn't really think the issues through.

Either is bad. The latter has some weight. Consider the last time Google broke referrers: Cutts explained that the impact just hadn't been considered:

[Cutts] says the team didn’t think about the referrer aspect. So they stopped. They’ve paused it until they can find out how to keep the referrers.

Surely someone had to have thought about the impact this time? Someone decided that it was a good idea to keep passing referrer information to advertisers. Someone decided, for whatever reason — and it wasn't privacy — that publishers couldn't keep getting this information. But what that reason is remains unclear.

Why Not Get Everyone To Be Secure?

What I do know is that Google missed a huge opportunity to make the entire web much more secure. Google could have declared that it was shifting its default search for everyone — not just logged-in users — to be secure. Privacy advocates would have loved this even more than the current change, which, using Google's own figures, protects less than 10% of Google.com searchers.

Google could have also said that if anyone wanted to continue receiving referrer data, they needed to shift to running secure servers themselves. Remember, referrers pass from secure server to secure server.

Millions of sites quickly adopted Google +1 buttons in the hopes they might get more traffic from Google. Those same millions would have shifted — and quickly — over to secure servers in order to continue receiving referrer data.

Better protection across the web for everyone, while maintaining the unwritten contract between search engines and the publishers that support them to provide referrer data. That would have been a good solution. Instead, we got Google providing protection for a sliver of those searching, withholding data from the majority of sites that support it and solving problems only for its advertisers.

Moving Forward

I’m expecting to talk further to Google about these issues, which I raised with the company right after writing my initial story. I’m still waiting for them to find anyone appropriate higher up in the company to respond. Fingers-crossed. The best I could get so far was this statement:

We’ve tried to strike a balance here — improving privacy for signed in users while also continuing to provide substantial query data to webmasters.

To conclude, I think the move to secure searching is great. I’d like to see more of it.

As for referrers, there are some who do believe that they are private. Chris Soghoian is a leading advocate about this, and I’d recommend anyone who wants to understand more to read the blog post he wrote about an FTC complaint he filed over the issue. Read the complaint, too. Also see Duck Duck Go’s DontTrack.us site.

In terms of Google blocking referrers, it already blocks tons of stuff it considers private from its search suggestions. Conceivably, it could use the same technology to filter search referrers, to help publishers and protect users.

But aside from that, if Google thinks this needs to be done for privacy reasons, then it needs to block referrers for everyone and not still allow them to work for advertisers. That move is one of the most disturbing, hypocritical things I’ve ever seen Google do. It also needs to take the further step and stop its own Chrome browser from passing them.

If blocking referrers isn't a privacy issue, then Google needs to provide referrer data to all publishers, not just those who advertise.

Google Merges Analytics, Webmaster Tools Data, Adds SEO Reports – Search Engine Watch (#SEW)
Oct 15th, 2011 by netizenshane

Google Webmaster Tools reports are now included as part of Google Analytics. You may recall that four months ago, Google Analytics launched a pilot program between the two services. As of today, the Analytics team announced the initial tests are over and everyone can now link Webmaster Tools and Google Analytics accounts. They've even enhanced the services based on feedback during the pilot.

Integrated Webmaster Tools Reports

For now, there are three new reports, “based on Webmaster Tools data.” All three can be found under the Traffic Sources section of Google Analytics, in a section labeled Search Engine Optimization. The three reports are Queries, Landing Pages, and Geographical Summary.

[Screenshot]

The Queries report provides information on the terms Google users searched for where your site's URL appeared in the results. The default report includes impressions (within a SERP), clicks, click-through rate, and the average position your site appeared in for all search queries. Secondary dimension choices are limited, but a few visitor stats are available.

The next report is all about landing pages. It displays the top pages that served as entry points to your site as a direct result of a Google search query. The same four metrics and secondary dimensions are used in this report.

The third report is the Geographical Summary report, which displays the origin countries of your visitors that originated from a Google SERP. This report also gives impressions, clicks and CTR.

How to Link Your Accounts

To use these reports, you must first link your accounts. Linking your accounts is a relatively simple process that starts in Google Webmaster Tools. Your main Webmaster Tools landing page lists the sites connected to your account.

[Screenshot]

Click the Manage Site menu and select Google Analytics property. You’ll be taken to a page labeled Enable Webmaster Tools data in Google Analytics where you can match up the site to the matching site in your Google Analytics profiles. Find the match and click Save. After confirming the action in a dialog box, your accounts will be connected.

If you did it right, the reports will appear. If not, you'll see something that resembles this:

[Screenshot]

Together At Last

It's a good first step at merging features. In true Google Analytics style, you can even filter the reports and add dimensions or metrics. However, don't get all excited about only checking one property for all your SEO needs.

Having the quick stats on search queries is very important. However, Webmaster Tools still presents more data, including changes in popularity for various queries. All the other cool metrics like inbound links, subscriber stats, and Googlebot crawler reports are still in Webmaster Tools.

How Inventory Feeds are Changing Paid Search Advertising – Search Engine Watch
Oct 15th, 2011 by netizenshane

Earlier this year, I laid out a vision for the future of paid search advertising on Google in The Dawn of Paid Search Without Keywords. The essential point in the article was that advertisers won’t rely on keyword/bid/match type combinations to target searchers.

That vision has been realized in ecommerce more quickly than any other vertical.

For any search, Google attempts to find the intersection between two fundamental questions:

  • What’s the most relevant result?
  • What’s the best way to monetize the search results most effectively (i.e., maximize yield)?

It used to be that a set of pages and text ads were presumed to be the best answer. Since the advent of universal search and universal paid search, the range of potential answers to a search query has grown to include:

  • A page
  • A product
  • A choice of products
  • Text
  • Images
  • Video
  • A hybrid result
  • A brand

When a search query has commercial or transactional intent, Google tips the scale even more aggressively toward monetization. Specifically, they’re increasingly monetizing search results through the promotion of products with images.

  • Product listing ads
  • Product extensions
  • Product search

Each of these kinds of commercial search results is driven by inventory feeds from a product catalog.

To illustrate this, let's consider someone researching "men's boots":

[Screenshot: Google search results for "men's boots"]

There are almost no informational results visible. From top to bottom, the results above the fold are dominated by:

  • Text ads with extensions (blue border): This is a standard text ad with some variation of Ad Sitelinks that include product and price.
  • Product listing ads (green border): Typically, product listing ads have only appeared in the top 2 positions of the right column. This variation, possibly a beta test, brings the images and text over similar to other ads in the premium positions.
  • Product search (red border): The organic listings including product search shopping results.

In cases where the ads aren’t perceived as being as relevant, product results shift to the right column, as is the case with “best men’s boots”:

[Screenshot: Google search results for "best men's boots"]

In the example above, I expanded the product results from the Zappos product extensions+ box. When collapsed, the right column shows more text ads.

When the search query includes adjectives that aren't directly related to a specific product, the ads become more traditional, but Google product search still dominates the organic results, as is the case for "men's designer boots":

[Screenshot: Google search results for "men's designer boots"]

What’s interesting here is that the product search delivers results whose title matches the search query. Title is the most visible of the product feed specifications that help drive results.

The more brand- and product-specific the query, the more likely it is that several merchants carry some of those items and have product-specific results to feed the SERPs. Take "kenneth cole boots" as an example:

[Screenshot: Google search results for "kenneth cole boots"]

One Product, Many Possible Results

It used to be that a product could only be marketed toward a particular set of keywords with a text result. Now, one product can appear in up to three places with different messaging and presentation in each place.

The question for retailers, then, is “How do we increase our exposures as profitably as possible for our product catalog and data feeds?”

The fundamental difference for inventory feed driven search is that relevance to the search query is influenced by attributes in the data feed. Google is creating search query/product pairs.

The system of matching queries to products isn't as direct as keywords. Consider the example of Product Listing Ads to illustrate the concept. It works like this:

  • Attributes: You create an inventory feed and specify all of the attributes for your products. This provides the information that will be visible to searchers (e.g., title, as well as meta data that Google uses to help determine the relevance of a given product to a query).
  • Product Targets: Product listing ads are done without keywords. Google determines which queries your products are relevant to. You can influence this process by specifying Product Targets. Product Targets use the attributes from your product feed to determine which products are eligible to be shown and create bidding groups. Product type is the most obvious example of how you might use attributes in designing product targets.
  • Product Filters: You may not want to advertise your entire product catalog or you may want to organize campaigns according to product lines. Product filters limit the potential products that are eligible to appear. Product filters are set at the campaign level and will also affect which products can appear in your product extensions.
  • Negative Keywords: As with conventional paid search, you can prevent products from appearing for specific queries with negative keywords. For example, you might add "men" or "men's" as a negative if you only sell women's designer boots.
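All of these controls key off the attributes carried in the feed itself. As a rough illustration only (the field names here are approximations; Google's feed specification defines the exact attribute names and accepted values), a single feed record might look like this:

// Illustrative feed record, expressed as a plain object (not Google's actual feed syntax).
var feedItem = {
  title: "Kenneth Cole Men's Leather Boot",  // the most visible attribute in results
  price: "149.99 USD",
  brand: "Kenneth Cole",
  product_type: "Apparel > Shoes > Boots",   // handy for product targets and filters
  condition: "new",
  availability: "in stock",
  link: "http://www.example.com/boots/kc-leather",
  image_link: "http://www.example.com/images/kc-leather.jpg"
};

Product targets and product filters then reference attributes like product_type or brand, rather than keywords.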

New Options, New Complexity

On the surface, Google is simplifying paid search: simply upload your feeds and let them do the work of matching products to search queries.

But, with new features come new variables to optimize:

  • Which images are most persuasive?
  • What promotions (extra text for product ads) will attract the right audience?
  • What is the most effective bid for these new ad types?

These ad types are also subject to the usual concerns of inventory feeds:

  • Which product to promote when multiple products match a query (a.k.a., canonicalization/unification)
  • Promotions/Scheduling
  • Low/no stock inventory
  • Seasonality

Retail Paid Search Without Keywords

Retail paid search is leading the charge in paid search without keywords. Paid search results are now beginning to resemble traditional product search results and new ad formats are increasingly adopting the models and controls of standard AdWords buys.

For retailers, it’s obvious that maximum exposure and profit in search marketing has become a blended mix of regular ads, ad extensions, product listing ads, product search results and organic listings. For paid search professionals working with retailers, feed management and sculpting is now as fundamental as keyword research in running effective campaigns.

PPC Bid Optimization Without Conversion Tracking – Search Engine Watch (#SEW)
Oct 11th, 2011 by netizenshane

Lesson 1 of running a PPC campaign: track your conversions. A simple rule that we all try to stick to. But sometimes we can’t.

Not all campaigns work so easily. It may be that conversions are too few and far between to make a difference. The website may convert everything over the phone through a system too complex or entrenched to track back to the keyword level.

Whatever the reason, there are two techniques you can use to help get your PPC account delivering more value, even if you can’t track it.

Option 1: Engagement Metrics

This one is hopefully going to seem like an obvious first step to most of you. If you have Google Analytics linked to your AdWords account, then you can create “engagement goals” and import them into AdWords as a conversion.

[Screenshot: engagement goal setup]

Engagement goals consist of tracking visitors who meet a threshold time on site or pages visited. In the screenshot above you can see how a goal would be set up to create a pages visited requirement of greater than 1. Doing this would let you see a goal in AdWords for any visits that did not bounce.

Based on the type of website you're running, you really need to think carefully about which metric (time on site or pages visited) is going to be more useful for you, and what your requirement is going to be to log it as a conversion in AdWords.

Remember! Google Analytics does not track the time the user spent on the final page of their visit. If the user read that page for a long period then left the site, you won’t see how long they spent on that final page. This can make time on site a slightly misleading metric for websites where visitors won’t view many pages.

If you want to log pages per visit, then decide what kind of visitors qualify as "engaged" enough to be worth logging. You will find that many visitors will either bounce or look at everything on the site. Averages suck. Your averages are being skewed by bounces and by visits with many, many pages viewed, so don't base your assessment on logging visitors who visit more pages than your site average.

Visiting more than one page (i.e., not a bounce) can be a good signal. If it takes at least two clicks to reach a product page or view a price, then your ideal threshold may be greater than two pages viewed.

Option 2: Relative Weighting

For some sites you can't even accurately track the above behavior. One-page visits might still be worth a lot to an information or blog site. Or you may have organizational reasons why you can't use Google Analytics or other web analytics packages.

You need to be more creative in these situations.

Step 1

[Screenshot]

Take your top 20 keywords by traffic. In most cases these will account for the majority of your total clicks, but if not then take 30, or 40. Don’t take too many at this stage, or you’re making a lot of extra work for yourself for not too much value. Exclude any of your brand keywords from this list.

Step 2

Rank these keywords by a “likelihood of conversion” factor. You need to know your business really well here, but you have to put your keywords into order sorted by how relevant and targeted they are. A particular keyword might suggest that everybody searching for it is looking for exactly your product. Alternatively it might have other meanings or be more generic. Rank these keywords by their “awesomeness” factor.

Step 3

[Screenshot]

Give each of these keywords a score from 1 to 10 for relevance. The best and highest should be 9 or 10 (this won’t always be the case, if none of your best keywords are in your top 20 by traffic) and work from there. Don’t feel like you need to go all the way down to 1. If the lowest keywords have maybe half the likelihood of conversion of the highest keywords, then respective scores of 5 and 10 would be appropriate.

Step 4

[Screenshot]

Add a column to your list that indicates the potential value of a conversion. This is only really necessary if different keywords imply a different value (e.g., they might be searches for a different product). If your website sells just one service, or value is not related to search term at all, skip this step and just add "1" to this column next to every keyword.

Step 5

[Screenshot]

Multiply these scores together for each keyword, and divide each one by the total sum of all scores added together. What you have now is an estimate for what proportion of your budget you should be spending on each keyword.

Remember that these keywords didn't make up your total spend. If the top 20 keywords accounted for 80 percent of your budget, then the proportions you've found are proportions of that 80 percent.
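The arithmetic in steps 3 through 5 boils down to a few lines. A sketch with made-up keywords and scores:

// Sketch: turn relevance scores and relative values into target spend proportions.
var keywords = [
  { keyword: 'buy acme widgets',   score: 10, value: 1 },
  { keyword: 'acme widget review', score: 7,  value: 1 },
  { keyword: 'widgets',            score: 4,  value: 1 }
];
var total = keywords.reduce(function (sum, k) { return sum + k.score * k.value; }, 0);
keywords.forEach(function (k) {
  // Proportion of the tracked (top 20) budget this keyword should get.
  k.targetShare = (k.score * k.value) / total;
});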

Deciding the Bids

You should now have each of your top 20 keywords with a score from 0 to 1 representing the proportion of your spend that should be going to this keyword. Achieving this takes some time. You’re going to need to analyze your spend levels on a daily basis, and make changes accordingly.

Each day, download the keyword data for the previous day from the account. For each keyword in your list, work out how much it spent (proportionally) compared to your target. If the keyword's spend is too high, reduce the bid by 10 percent. If it is too low, increase the bid by 10 percent. Set a sensible level of tolerance on this, or you'll be changing your bids even if your keywords are really close to perfect.

The process that will occur is this: every day each keyword that is spending too much will have its bid reduced a little. If the next day it is spending correctly, then you’ll stop making that change to that keyword and it’s nicely on target. If it still hasn’t reached target then the process will happen again.
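A minimal sketch of that daily loop, assuming you already have yesterday's spend per keyword and the target shares from the earlier step (again, my own illustration, not an AdWords feature):

// Sketch: nudge bids 10% toward each keyword's target share of spend.
// 'tolerance' keeps you from fiddling with keywords that are already close enough.
function adjustBids(keywords, totalSpend, tolerance) {
  keywords.forEach(function (k) {
    var actualShare = k.spendYesterday / totalSpend;
    if (actualShare > k.targetShare * (1 + tolerance)) {
      k.bid = k.bid * 0.9;  // spending too big a share: ease off
    } else if (actualShare < k.targetShare * (1 - tolerance)) {
      k.bid = k.bid * 1.1;  // spending too small a share: push a little harder
    }
  });
}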

The Inevitable Complication

You’re working from assumptions based on gut instinct and guesswork. These aren’t accurate and they never will be.

Don't go crazy looking to make each keyword fit your target perfectly, since your target is an educated guess at best. The process outlined above gives you a bidding strategy to meet your target, but it won't tell you whether the target itself was wrong.

The second issue to be aware of is that every time you change a bid, the proportion of budget spent by that keyword will change, and that in turn affects the proportion spent by all your other keywords too.

Imagine a situation with two keywords, "keyword a" and "keyword b". "Keyword a" spends $10 per day and "keyword b" spends $20 per day. If I conduct this analysis and increase the bids on "keyword a" so that it spends $20 per day, then "keyword b" is now spending a lower proportion of my new, higher total spend. And if my budget is limited, I may need to reduce "keyword b" to compensate.

Each action on a keyword affects the proportional spend on the others. Don’t be surprised if your keywords overreact.

In Summary

Having a lack of good conversion data is no excuse for not optimizing bids in a campaign. The campaign can still perform better and deep down you know it!

The two methods in this article will give you some pointers to go about pushing your spend towards your best converters and most profitable keywords, but your overall budget must still be determined by the profitability of PPC as a whole.

Re-assess your assumptions regularly. Don’t take your first best guess as fixed. Make changes and improvements over time as your knowledge grows and you can keep making the campaign better and better.

Google Eye-Tracking Study Highlights Impact of Local/Places Results – Search Engine Watch (#SEW)
Oct 10th, 2011 by netizenshane

SEOmoz just released the results of a pretty cool eye-tracking study, and a few of the results are surprising. Google's different SERP formats seem to have varying effects on users, and those using video and local search listings will find these results promising.

A few of the results actually suggest that if you're targeting a query with local connotations, you may be better off skipping those coveted top 3 organic spots and shooting for the local listings. To be clear, the study was small and examined just eight subjects shown five SERPs each. It would be very useful to see the results of a larger, broader study, given the promising outcome of this one for blended search results.

Another point readers need to be clear on – this study measured eye movements, not clicks or mouse activity.

SEOmoz used Mirametrix's S2 Eye Tracker, told the subject the search term, and let them check out each depersonalized, Chicago-based, full-screen SERP. The subject data was then aggregated and used to create heat maps showing which areas of the page people gravitated to.

7-Pack Shows Lots of Love for Local/Places Results

The first SERP listed your typical text results all the way down the page, with integrated Local/Places results and two maps on the right. People viewed this in a modified F shape, honing in on the top three or four results and the maps, so this one was pretty much what you would expect.

[Heat map screenshot]

The second SERP was for the term “pizza” and showed three organic results at the top, followed by a 7-pack of Local/Places results, which got most of the action. You can see that the top 3 and the map got a little bit of lovin’, but if you’re in Chicago and hoping to rank for pizza, you want to be in that 7-pack, not necessarily in one of the top three spots.

Video and Shopping Results Show Images Worth a Closer Look

Next up was a search for "how to make a pizza," chosen because it brought up video results as No. 2 and No. 3. This isn't exactly surprising either, because pictures break up the plain text and you would expect the thumbnail to draw some attention against Google's minimalist layout.

No. 2, the video result, seems to have outperformed even the top organic result, at least as far as drawing the reader's eye. The PPC ad on the right did all right, but nothing beyond No. 5 got any attention to speak of.

[Heat map screenshot]

For the fourth SERP, a search for “pizza cutters” brought up related searches at the top and shopping results in the fourth position. The related searches didn’t draw much attention, even though they’re right on top, and the top organic listings still did fairly well considering there are images below.

Considering the difference between the way these thumbs and the ones for videos were treated, I have to wonder if the related searches pushing the images farther down the page, coupled with the fact that they’re clustered together, takes away some of their power. It would be interesting to see how different image types and arrangements draw clicks as well as eye movement.

Expanded Sitelinks – Not So Hot

[Heat map screenshot]

This might just say something about the fact that the average person searching “Pizza Hut” is really just hungry and not doing a research project… I think the results could have been very different for a non-food product. It might actually be interesting to see a study on the impact of expanded sitelinks in different industries or categories.

Anyway, the “Pizza Hut” SERP had a PPC ad on top, a map on the right, and the Pizza Hut corporate site in the top organic spot with six expanded sitelinks. That listing didn’t get much attention. Just below it, there’s a location-finder result for “Pizza Hut locations in Chicago” that shows three Pizza Hut addresses. Surprisingly (to me) this didn’t get much love, either.

Below that, though, you see the Places listings for local Pizza Huts, and that’s the hot spot for the entire page, which looks really good for local listings since it’s about halfway down the page. This might actually be below the fold on some computers, so I don’t know that it would do as well in those cases.

Check out the entire study at SEOmoz, and hopefully this topic gets a further look! We’ve seen some interesting and sometimes surprising eye-tracking studies, including the one we did last year with Explido WebMarketing, which compared results across search engines, and our hit or miss evaluation of the new Google SERP when it came out last May.

7 SEO Twitter Tips to Rank Better in Search
Oct 7th, 2011 by netizenshane

Even though Google recently ended its Realtime Search deal with Twitter, which means Twitter's fire hose is no longer catalogued and used in real-time search results, there's still a good deal of SEO benefit to be gleaned from Twitter.

Basically, Twitter is a natural complement to your site and a way to gain some additional spots on organic search engine results pages (SERPs), helping your website and its content get found. So how can you best leverage your Twitter account for search? Here are 7 tips to help you get started.

1. Use Your Real Name Wisely

It’s important to note that your Twitter “Real Name” doesn’t have to be your actual name, unlike with Facebook. Instead, it can be virtually anything as long as it comes in under the 20-character limit Twitter imposes.

As such, your Twitter “Real Name” is perhaps the most important thing you can set on your Twitter profile. It appears in the title tag for both your Twitter profile and your individual tweet pages. It’s highly searchable, and it’s something that will appear in Google’s link to your profile.

This means you need to make sure your real name is relevant to you, a keyword people are likely to search for, and something that will make people want to click.

2. Make Your Username Count

Like your real name, your Twitter username is crucial content that will be displayed in the title of your Twitter profile page and, in some cases, your individual tweet pages. Also like your real name, it can be anything you want as long as it comes in under the 15-character limit Twitter places on usernames.

It is best to use this space wisely with an easily remembered, keyword-rich Twitter username that will get the attention of anyone searching for related keywords.

It may seem like an impossible task. Fifteen characters certainly isn’t a lot to work with, but a few quick Google searches for keywords related to your site can reveal what kinds of Twitter handles are ranking well, giving you a template for success.

3. Focus on Your Bio Next

After your real name and username, your bio is the next most important thing you can edit. At 160 characters, it’s longer than a tweet, and it can be crucial to your SEO as it is both highly indexable content, and the first few words of it also appear in your Twitter page’s description.

It is important to make your bio count. Make the first few words an interesting teaser that draws searchers to click and ensure that the entire bio has at least one or two of the keywords you’re targeting.

4. Link, Link and Link Some More

To be strong in search engines, your Twitter profile needs the same thing any other site needs: lots and lots of links. Link to your profile everywhere you can, and do so with strong keywords in the anchor text.

This works well because, even though your Twitter profile is closely related to your site, it is hosted on a different domain, meaning you can pass along a great deal of trust to it from your site. This makes your links to it much more valuable.

You can further this benefit by encouraging others to link to your Twitter profile as well, such as including it in an author byline when you do guest blogging, which will improve the amount of authority it receives from search engines.

5. Get Followers, Build Recognition

Every follower you get is more than just a person reading your tweets. It’s a link to your Twitter profile on their “Following” page and possibly retweets and mentions of your profile, which also include a link to you.

Although these links are “internal” in nature (meaning they are all links from within Twitter.com), they can help you compete with other Twitter profiles that might be on the same or a similar topic, giving you an additional edge.

Since many searchers who land on Twitter profiles were doing Twitter-specific searches, this could be a very powerful advantage to have.

6. Stay Focused With Your Tweets

While it’s certainly fine and maybe even a good thing to have some fun with your Twitter account and go off-topic from time to time, you need to stay focused and regularly publish tweets that are on-topic and keyword-rich.

It’s important to remember that your main Twitter profile, in Google’s eyes, is very much like any other page with a headline, body copy, and links. As with any other page, if that content is keyword-rich, it’s more likely to be ranked well.

Keeping your tweets focused lets you keep that copy keyword-dense, giving Google exactly what it wants to see and encouraging it to rank your Twitter page higher than other, less-focused accounts.

7. Don’t Forget Your URL

While it’s true that your URL doesn’t actually pass on any SEO authority due to Twitter’s use of the “nofollow” tag, it’s still an important tool for directing the traffic your Twitter profile gets back to your site.

Since the eventual goal of any Twitter presence is to turn that traffic back to your site and your business, forgetting to use your URL is a misstep you can’t afford to make.

In the end, what separates Twitter most from your run-of-the-mill sites is that it exists in an almost-completely enclosed ecosystem. Therefore, much of your link building has to be done from within, and nearly all of your content building is done in the form of tweets.

7 Google Analytics Advanced Segments I Love (and you should too) | SEOmoz
Oct 7th, 2011 by netizenshane

I love using Advanced Segments in Google Analytics. Sure, you can export a big chunk of data to Excel and then use some Excel wizardry to clean up the data and display it in different ways, but what if you just need a quick snapshot of certain traffic or trends and the default segments don't go far enough? I've put together a few of my favorite Google Analytics Advanced Segments for you, so that you can add them to your own Analytics and use them as you need.

Segment 1:  1 word keywords

This first segment will just give you all of the one-word keywords that are driving traffic to your site. Sometimes there are crap keywords, but I am increasingly finding it interesting how sites can rank for one-word keywords because of personalized results.

^\s*[^\s]+(\s+[^\s]+){0}\s*$

Segment 2: 2-3 words

This next statement gives you all of the two and three word keyphrases driving traffic to your site. These are normally the important terms for you, because 2-3 word phrases have been shown to drive the best targeted traffic. For example, "vouchers" might be too general to really bring you good traffic, but "voucher codes" or "quality voucher codes" will probably give you better traffic.

^\s*[^\s]+(\s+[^\s]+){1,2}\s*$

Segment 3: 4+ words

This segment is like the statements above, but instead gives you all keyword phrases that have four or more words in them. This is all you need for segmenting down to your longtail keywords quickly and easily in Analytics.

^\s*[^\s]+(\s+[^\s]+){3,}\s*$
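If you want to sanity-check these three patterns before saving them as segments, they simply count space-separated words. A quick sketch you could run in a browser console (the keywords are made up):

// Quick check of the word-count patterns.
var oneWord    = /^\s*[^\s]+(\s+[^\s]+){0}\s*$/;
var twoToThree = /^\s*[^\s]+(\s+[^\s]+){1,2}\s*$/;
var fourPlus   = /^\s*[^\s]+(\s+[^\s]+){3,}\s*$/;

oneWord.test('vouchers');                       // true
twoToThree.test('quality voucher codes');       // true
fourPlus.test('best voucher codes for shoes');  // true
twoToThree.test('vouchers');                    // false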

Segment 4: Social Media Traffic

Sometimes you need to see the easy breakdown of your referral sources, and even more often it can be helpful to see just your Social Media traffic. Here is how we have our Social Media Advanced Segment set up in the Distilled Analytics account:

Segment 5: Traffic to Blog (where URL is http://www.site.com/blog)

If you have a separate blog where you write and try to generate both links and traffic, it can be helpful to have a way to easily segment out the traffic to just this section of your site. Sure, you could just use the Filter section, but why do this when you can set up an Advanced Segment and have it forever? A little more work in the short term can save some big headaches. Note: this is really only helpful if your blog articles have the URL structure of http://www.site.com/blog/(post-url). Here is how I have set up an Advanced Segment to do just this:

Segment 6: Google as Source

Since SEOs often care mostly about Google traffic (for better or for worse), as it is where the bulk of our traffic comes from in almost every case, I also have an Advanced Segment set up to show me the traffic where Google is my source. Here you go:

Segment 7: Twitter Traffic

Segmenting down to just Twitter traffic is a chore, because so many different Twitter clients show up differently in your Analytics. This has become less of an issue recently, since Twitter now runs all shared links through its t.co shortener (which Tom wrote about here), but we still need to account for all the Twitter clients when segmenting out that traffic.

Here is the Advanced Segment I use. Have I missed any?

How to set up Advanced Segments for any section of your site.

Much like the "/blog" example above, if your site is broken into areas (for example, my site is broken into /category/search-engine-optimization and /category/social-media), you can also create Advanced Segments for these.

Here is how I set up an Advanced Segment for my SEO category:

Finishing Up

These are the most useful Google Analytics Advanced Segments that I have in my arsenal that can apply over almost any client. Others have written about more specific Advanced Segments in these posts:

Charting ‘Unique Keyphrases’ Using Advanced Segments

Guide to Setting up Advanced Segments in Google Analytics for Complex Brand Names

Segmenting Important Data through Advanced Segments

via 7 Google Analytics Advanced Segments I Love (and you should too) | SEOmoz.


Google AdWords Adds Mobile Page Call Tracking – Search Engine Watch (#SEW)
Oct 7th, 2011 by netizenshane

Google AdWords users will now be able to track the success of their campaigns in driving calls from mobile devices. The new code introduced by Google allows advertisers to get analytic data for calls placed from a mobile page.

How the On-Site Tracking Works

Google advertisers can place a JavaScript snippet on their site that allows tracking of targeted calling behavior. The advertiser's site has to be a mobile-accessible page that allows users to place a call by clicking on a specific field.

Once the code has been added, AdWords will display analytic data on the success of each keyword, ad group, and campaign as it relates to generating calls.

As stated by John Sutton, VP of Online Marketing at Red Ventures (a company that participated in the beta program for this feature), the ability to track calls both captures important user behavior data and opens up advanced controls (such as cost-per-acquisition bidding strategies or conversion optimization).

Google has already done data tracking on calls, but only within the confines of the “click-to-call” ad extension. This new tracking is distinct, covering actions on the advertiser’s web page, and the tracked customer actions don’t cost the advertiser anything extra.

Setting Up Call Tracking

Adding call tracking to your site is a two-step process. You simply:

  • Get the necessary code:
    • In your AdWords account, navigate to Reporting and Tools > Conversions > +New conversion.
    • Add the name of the campaign and select “Call” for the “Conversion location.” Then hit “Save and continue.”
    • Fill out the “Conversion category” and “Conversion value” fields with the best information available, then choose “I make changes to the code.”
    • A new window will open and provide you with the necessary code snippet. Save this for your records.
  • Plug the code in:
    • Copy the code generated in the segment above.
    • Paste the code into the HTML of the targeted landing page. The script will need to be placed in the page element (link, image, etc.) that results in a call.
    • AdWords will automatically recognize the code and start tracking information.
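The end result on the page looks roughly like the sketch below. Note that reportCallConversion() is a hypothetical placeholder standing in for whatever snippet AdWords generates for you in the steps above; use Google's actual generated code, not this name:

<!-- Sketch only: a click-to-call element on a mobile landing page. -->
<!-- reportCallConversion() is a hypothetical stand-in for the generated AdWords snippet. -->
<a href="tel:+15555550123" onclick="reportCallConversion();">Call us now</a>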

Especially given the rise in mobile technology, and predictions that the majority of U.S. cell phone owners will be wielding a smartphone by the end of 2012, call optimization of this type is likely to become more valuable in the months to come.

Update Made To Definition Of A Google Analytics Session
Oct 6th, 2011 by netizenshane

Google Analytics announced a change to the way sessions are calculated in the tool. While this change will not affect the majority of accounts significantly (according to the official blog post, "most users will see less than a 1% change"), it is an important one.

Below, I will describe why it is important and how this can affect some Google Analytics accounts (mostly companies that misuse Google Analytics campaign parameters), and what you can do to make sure you are on the right track.

What Is Changing & Why?

According to the official post, here is a summary of how Google Analytics has ended a session up till now:

  • Greater than 30 minutes have elapsed between pageviews for a single user
  • At the end of a day
  • When a user closes their browser

In the new model, Google Analytics will end a session when:

  • Greater than 30 minutes have elapsed between pageviews for a single user
  • At the end of a day
  • When any campaign information for the user changes. Campaign information includes: utm_source, utm_medium, utm_term, utm_content, utm_id, utm_campaign and auto-tagging from AdWords (gclid)

This change is an interesting move, as it will provide more accurate data when it comes to multi-channel attribution. Visits that would previously have been lumped together into one single session will now, in some specific situations, have their cookies updated and be counted as a new session.

For example, if someone visits a website from a PPC ad, leaves the site, and within 10 minutes comes back to the website through an organic link, the old model would count that as one long visit from PPC. In the new model, we would have two short visits, each attributed to its own source.

Do Not Use Campaign Parameters For In-Site Tracking

One of the mistakes I have seen when it comes to implementing Google Analytics is the usage of Campaign Parameters for in-site tracking (mostly tracking navigation usage or internal campaigns).

This practice produces inaccurate numbers for those analyzing in-site behavior, heavily affects the accuracy of your traffic sources and, therefore, should never be used. That is now even more true: with the current update to how sessions are defined, each time a visitor clicks on an internal link that uses campaign parameters, a new session starts, which artificially increases the number of visits in addition to the issues described above.

If you are currently using campaign parameters to track in-site behavior, here is what you should do:

  1. Remove all campaign parameters from your links. For example, if you have a link on your site such as searchengineland.com?utm_source=story&utm_medium=link&utm_campaign=launch, you should simply use searchengineland.com
  2. If you are tracking navigation elements on the site, using Event Tracking is the best way to go: add an onclick event to the "a" tag that includes the following: onclick="_gaq.push(['_trackEvent', 'navigation', 'link', 'launch']);"
  3. If you are tracking internal campaigns, using Custom Variables is the best way to go: add an onclick event to the "a" tag that includes the following: onclick="_gaq.push(['_setCustomVar', 5, 'internal_campaign', 'banner', 2]);" It is important to note that you should check with the other people involved in setting up Google Analytics to be sure slot 5 (the first value in the function above) is available for campaign tracking.

The above techniques will certainly provide you with accurate numbers in the best possible way without affecting your reports.

Are You Using Filters To Keep Your Analytics Data Honest?
Oct 6th, 2011 by netizenshane

Data is a fickle friend. It is only as reliable as the setup you have and the effort you put into understanding the numbers. There are so many variables that can kill your data, and fixing the things that are broken is always a process of elimination.

I can tell you that most skewed data is generally caused by two things: 1) incorrectly installed scripts, or 2) the site owners themselves.

You, your coworkers, and your employees are the ones who visit your site most often. Your marketing team and webmaster are probably a close second and third. Do those visits really represent honest traffic to your site, with an intent to buy, sign up or convert? No, not really.

In my experience with lodging and hospitality websites, many reservation agents either book phone call stays through the website, or ask potential guests to navigate a site with them.

Marketing Managers navigate through the site on an hourly basis. CEOs, should you be measuring your internal clicks on your site? No, of course you shouldn’t be!

There is an easy way to fix this, but I bet those who have actually taken this step are in the minority. You'd be surprised how many companies miss it.

What Are Common Causes Of Flawed Data?

There are many actions that skew data. Aside from the employee traffic, here are a few more examples:

  • Do you have a company-wide IT policy that every browser homepage must be your website's homepage? I've visited a few corporate offices that have this set up. Every visit, pageview, exit, or bounce from this page is skewing your data.
  • Do your office or complex guests access the Internet via a wireless connection that redirects them to your website automatically? These visits are from people who are most likely not going to buy from you at that time; they're probably already on site and just looking to check email or post to Facebook.

I’ll show you how to make your data honest, and get an idea of how much impact your team’s visits have on your own website in the process.

First you need to do some discovery and figure out which visitors are accessing your site and creating visits you don’t want to track. Here are some more examples in addition to those mentioned above:

  • Employees in your office
  • Employees from home or home offices
  • Webmaster from their office and/or home
  • Contractors who write or create collateral for you from their offices or home offices

Now that you know who the offenders are, you need to find out their IP addresses. Ask each person above to provide the IP address from their in-office and home-office computers.

It’s easy to find by having everyone access the site WhatIsMyIp.com from their computers in each location. Your IT department or manager can give you a list of IPs your in-house computers use to access the Internet.

Once we've collected the IP addresses, we can log in to your Google Analytics account and start adding filters. I'm assuming everyone is looking at the new version of Google Analytics for these steps.

Once you log in to your GA account, move all the way over to the left of the orange bar along the top and click on the gear icon.

[Screenshot]

Choose the correct profile, then click on the "Filters" tab.

[Screenshot]

Within the Filters tab, click the “+New Filter” button.  This is where you’re going to enter the details you’ve collected during your diligence above.

Now, there are quite a few ways to exclude or include data here. The simplest way is excluding by IP address. You can also see there are ways to set up profiles that only include data from certain referrers, or data from subdirectories on the site.

You'd use this in a separate profile that only tracked that data, but that's a discussion for another time. For today's exercise, we're going to exclude traffic by IP address.

[Screenshot]

Make sure you name each filter something descriptive.

Something like "In Office IP" or "Bob's at home IP" makes it easy to make changes in the future if someone changes their ISP or is no longer employed by the company. Fill in the information and save each filter. Every IP address needs a different filter, so this could take some time to set up, but you'll be looking at honest data when you're done.
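One small time-saver: if your office traffic all comes from a contiguous block of addresses, a single filter using a regular expression on the IP address field can cover the whole range instead of one filter per address. A hypothetical pattern covering 203.0.113.1 through 203.0.113.19 would look like this:

^203\.0\.113\.([1-9]|1[0-9])$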

The next thing I'd do is add an annotation to your data graph to remind anyone who looks at the data that employee and contractor IPs were removed on a specific date. This is really easy: a few clicks from the Analytics homepage and you're done.

Underneath the graph on your “My Site” Tab in the orange bar, click the down arrow.

[Screenshot]

Click “Create new annotation”

[Screenshot]

Then enter the date, your note, and click Save.

[Screenshot]

You can now see the annotation in the graphic, and you can see how excluding the IP address affected all of your data.

[Screenshot]

Many believe that setting up analytics is a matter of signing up for an account, and installing some scripts. It’s really more involved than that. There are many things that might look minor on the surface, but have a big impact on the way your data is collected and viewed.

Make sure your analytics is set up correctly; in the long run, it will save you time and mistakes. The truth is, dishonest data can lead to bad decisions, and in a competitive online marketplace, we can't afford too many of those.
