SEOsean.com Blog

A blog about search engine optimization and internet marketing.

How To Do Keyword Research Using Insights for Search

April 2nd, 2012 by SEOsean

For many of us, using the Google AdWords tool for keyword research has become a staple. After today’s blog post I’m hoping to shake that ground up a bit. You may or may not know about Google’s Insights for Search. It’s a free service that lets you see the top search terms over time and by region of the world, and it also provides related search term data. So let’s take a peek under the hood and see what Insights for Search can do for our keyword research. Below is a step-by-step process for using Insights for Search for keyword research.

  1. Set Up Excel Sheet – This is actually a pretty critical step even though it might seem like common sense. Make sure your Excel sheet has a couple of columns set up: one for the keyword and one for the average search volume over the past 12 months.
  2. Brainstorm Keywords – Before we start using the tool, make a list of keywords that you think are relevant and popularly searched. Write it down somewhere so you can use it in the next step.
  3. Search Insights – Next is the actual search step. What’s critical here are the settings you choose for your search. Under the search terms column on the tool’s web page, enter your keyword. Then, under the filter column, choose “Web Search” in the first row of dropdowns, “United States” (or your country) in the second row, and “Last 12 months” in the third row. This setup will give you the search volumes for each month over the last 12 months for users in the United States or your respective country. Once you’ve done that, click the search button to run the search.
  4. Record Data – Your next step is to record the data returned by the search. Record the total number located under the “Totals” section; this is the average of the search volume over the past 12-month period.
  5. Rinse and Repeat – Repeat the last two steps (#3-4) for each keyword you came up with in your brainstorming.
  6. Discover Keywords – While doing the above for each keyword, also take note of the suggested terms under the “Search terms” section. Add any relevant keywords you find there to your list.
  7. Sort – After you’ve finished making your list and recording all the average search volumes, it’s time to sort the list. Using Excel’s sort feature, sort the list from highest to lowest on the average search volume column. (If you’d rather script this part, see the sketch after this list.)
  8. Analysis – Now that you have a beautifully sorted list of search volumes paired with keywords, it’s time to analyze the list and decide which keywords to use. Remember, just because a keyword has the highest search volume doesn’t mean it’s the best keyword for you to optimize for. Factors such as competition and a keyword’s relevance to your site’s topic should also be considered. Another tip: I like to use the custom highlighters in Excel to mark which keywords I’m going to use, so I don’t get confused or forget which ones I decided on.
  9. Recheck Annually – As you may have guessed, the search volume numbers change monthly. So I suggest performing this task annually to make sure the keywords you’re targeting still get enough searches to warrant optimizing for. Mark it on your calendar so you don’t forget.
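If you’d rather script the record-average-sort part of steps 4-7 instead of doing it by hand in Excel, here’s a minimal Python sketch. It assumes you’ve already copied each keyword’s 12 monthly interest values out of Insights for Search; the keywords and numbers below are made up purely for illustration.

```python
# Hypothetical monthly interest values copied out of Insights for Search.
# (Insights reports relative interest on a 0-100 scale, not raw search counts.)
monthly_interest = {
    "running log": [55, 60, 58, 62, 70, 75, 80, 78, 72, 65, 60, 58],
    "running journal": [30, 32, 31, 35, 38, 40, 42, 41, 37, 34, 32, 31],
    "training log": [45, 47, 46, 50, 55, 58, 60, 59, 54, 50, 47, 46],
}

# Average over the past 12 months, then sort highest to lowest --
# the same thing the Excel sort in step 7 does.
averages = {kw: sum(values) / len(values) for kw, values in monthly_interest.items()}

for keyword, avg in sorted(averages.items(), key=lambda item: item[1], reverse=True):
    print(f"{keyword:20s} {avg:6.1f}")
```

From there, step 8 is unchanged: the highest average isn’t automatically the best target once you factor in competition and relevance.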

What’s the Percent Distribution of Search Result Clicks?

September 20th, 2011 by SEOsean

It’s been a while since Cornell University’s original study on the distribution of clicks by search engine users viewing search results pages (SERPs). In fact, it’s been many years - over five! In that time Google has changed, the Internet has changed, the way we use the Internet has changed, and most importantly the Google results page has drastically changed. So the question really becomes… does Cornell University’s study still provide us with accurate insight into the number of clicks each ranking site would receive in the SERPs?

SERPs Percent Clicks

I doubt we would get the same results today if this study were redone. For one, we now have map listings in the search results. These typically take up a large chunk of the results page and in many cases appear before the first “real” organic result, taking a healthy percentage of those users’ clicks with them. Although I think that depends a lot on the keyword. For example, a search for “pizza places” might get a lot of clicks on the map results, but a search for “contact lens store” might not, since users may be looking to buy contacts from a website rather than a physical storefront.

Next we have to look at things like Universal Search, where we’re seeing news, blog posts, Twitter tweets, images, shopping results and more in the search results. Additional elements like videos, thumbnails, star ratings, Sitelinks, and even Google’s new Google Plus all play a part in where users’ clicks go.

The other side of this is that not every search engine user sees the same results. If I do a search in Atlanta, GA I might see a different set of search results than someone searching in Pittsburgh, PA, which makes it harder to determine a standard set of percentages that SEOs and other online marketers can follow.

Even with all that, I do think we can come up with some reasonable percentages, but we would have to include percentages for all the variables that could appear in the search results. For example, if there are map listings in the results, the first real organic result might get 40% of the clicks, but without those map listings it might get 50%. So we would first have to compile a list of variables for search result pages.

Here is a quick shot at making that list:

  1. Local Place Listing present (Google Map listings)
  2. Twitter tweet feed present
  3. Ads above the SERPS
  4. News feed present
  5. Variations for location based search
  6. etc…
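As a rough illustration of how those variables could feed into click-share estimates, here’s a small sketch. Every percentage in it is a made-up placeholder rather than a measured figure; the point is only the structure - the estimated share for the #1 organic result shifts depending on which SERP features are present.

```python
# Illustrative only: made-up click-share estimates for the #1 organic result,
# adjusted for which SERP features appear on the page.
BASELINE_TOP_RESULT_SHARE = 0.50  # hypothetical share on a "plain" results page

# Hypothetical penalties for features that push organic listings down the page.
FEATURE_PENALTIES = {
    "local_place_listings": 0.10,
    "twitter_feed": 0.03,
    "ads_above_serps": 0.08,
    "news_feed": 0.04,
}

def estimated_top_result_share(features_present: set) -> float:
    """Estimate the click share of the first organic result for a given SERP."""
    share = BASELINE_TOP_RESULT_SHARE
    for feature in features_present:
        share -= FEATURE_PENALTIES.get(feature, 0.0)
    return round(max(share, 0.0), 2)

print(estimated_top_result_share(set()))                             # 0.5
print(estimated_top_result_share({"local_place_listings"}))          # 0.4
print(estimated_top_result_share({"local_place_listings", "ads_above_serps"}))  # 0.32
```

A real version would need measured percentages for each variable (and for every position, not just #1), which is exactly why a fresh study is needed.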

I believe a good study is in order here and would provide a lot of valuable insight for both SEOs and site owners.

Use YouTube Search Stories Creator for Backlinks

April 17th, 2010 by SEOsean

If you have not heard yet, YouTube released a new feature that lets you create your own search story videos, just like the ones in the Google ads. They look really professional, which got me thinking: someone savvy enough could come up with some great videos that would get people to link back to them, and all they have to do is use this simple video creator.

The video creator only takes a few minutes to make a video, and it even adds in the music for you. You do have to come up with at least six keywords you want your story to include, though. I wish you could choose fewer, but I guess that’s mandatory.

So how can you use this to get links back to your website? Well, first come up with a unique idea for a search. My personal favorites are Zombie Apocalypse, Cookie Monster’s Search Story, and Worth It. You can check out more on the search video creation page.

Anyways here is my search story video:

Google Can’t Add! New Webmaster Tools Impressions Data Doesn’t Add Up

April 17th, 2010 by SEOsean
2+2=5

At first I was very excited about the new Webmaster Tools impression data released this past week. It seemed very promising. Like any website owner, I’m sure you would love to know how many impressions (how many times) your site shows up in the search results for a given keyword/search term. And as an SEO services provider, it sounded like a dream come true.

It would seem simple to get this data correct, and you would think Google would make sure it’s reporting the right numbers at the risk of being called out on it, but I think someone at the Googleplex has a stuck 7 key on their adding machine.

Surprisingly, or not so surprisingly, the data just does not add up with the data you find in the AdWords Keyword Tool. I ran a little study on a few websites of mine and others before coming to this conclusion. I wish it were not so, but the data speaks for itself.

Here are my findings from my personal running log website - a great site, by the way, if you’re looking for a free, easy way to track running workouts.

Keyword | Webmaster Tools Impressions | AdWords Tool (exact match, global monthly search volume)
online running log | 260 | 590
free running log | 91 | 260
running logs free | 36 | 170

Some notes on how I conducted this study:
My website remained on the first page of Google during the entire 30-day period from which the data was gathered by Google’s Webmaster Tools. I also did not show up on any other search results page during this time - just page 1 of Google - so my results should, in theory, be very accurate. And when using the AdWords Tool, I made sure the match type was set to Exact and I used the Global Monthly Search Volume.

Now, I went into this knowing that the search volume from the AdWords Tool is an average over the past several months, so I take that into account when I look at the data from each. But even with that in mind, the data just does not add up. If the numbers were close, there would not be much cause for alarm, but they are not even close - they’re off by hundreds.

So my next logical step was to check another website’s data. I figured let’s look at someone with a higher-volume keyword; that way we can also see whether my results are just an oddity.

I chose to use the data Aaron Wall from SEObook.com posted on his site when he reviewed the new Webmaster Tools impression data. Granted, I don’t have access to his account, but I could see his top 3 keywords and their impression data in the image he posted on his website. We also have to take into account (and this is different from my website) that his site gets impressions on more than just the 1st page of Google (thus some duplicate impressions), so his impression numbers should actually be a little higher than the AdWords Tool data.

Here are his results:

Keyword | Webmaster Tools Impressions | AdWords Tool (exact match, global monthly search volume)
seo | 74,000 | 1,000,000
seo book | 5,400 | 9,900
seobook | 4,400 | 5,400

As you can see, the data is again completely off. The Webmaster Tools data doesn’t come close to the AdWords Tool data - it’s off by many thousands.
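To put a rough number on the gap, here’s a minimal Python sketch (the figures are the same ones from the two tables above) that computes how many times larger the AdWords exact-match number is than the Webmaster Tools impression count for each keyword.

```python
# Webmaster Tools impressions vs. AdWords exact-match global monthly volume,
# using the figures from the two tables above.
data = {
    # keyword: (webmaster_tools_impressions, adwords_exact_global_volume)
    "online running log": (260, 590),
    "free running log": (91, 260),
    "running logs free": (36, 170),
    "seo": (74000, 1000000),
    "seo book": (5400, 9900),
    "seobook": (4400, 5400),
}

for keyword, (wmt, adwords) in data.items():
    ratio = adwords / wmt  # how many times larger the AdWords number is
    print(f"{keyword:20s} WMT: {wmt:>9,}  AdWords: {adwords:>9,}  ratio: {ratio:.1f}x")
```

Even allowing for the fact that the AdWords figure is a multi-month average, the ratios run from roughly 1.2x up to more than 13x, which is the gap described above.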

So what’s my conclusion…? Well, I think the Webmaster Tools data is more accurate than the AdWords data - or so it seems. It would stand to reason that it’s easier for Google to report actual impressions of your website than to estimate them from search calculations, as the AdWords data does. But I might be wrong again - possibly neither data set is accurate.

I decided to take this study one step further. Let’s look at my clickthrough data in Webmaster Tools versus my Google Analytics data. In theory these should be the same - the number of clickthroughs should match the number of visits from that keyword in my Analytics data.

Here is what I found:

Keyword | Webmaster Tools Visits | Analytics Visits
online running log | 16 | 31
free running log | 28 | 89
running logs free | 5 | 31

Shocking! The data does not match up again. So now, do I really trust Google Analytics, or is it just that the Webmaster Tools data is plain wrong? I choose to trust my Google Analytics data, since I have tested it in the past against other visitor tracking software and they matched up.

So that means the Webmaster Tools data is just plain wrong - but what do we do? I don’t know that there is much we can do. We could inform Matt Cutts or submit a post on the Google Help forum, but I don’t know how much good that will do. I imagine we would only get a response back saying, yes, it’s not accurate. Well, we know that, so all we can really do is move on and take the data with a grain of salt.

The data is still useful. It’s just like the old green bars in the AdWords tool before it gave you search volume numbers. You can use the data as a general measure of how well a keyword does in terms of your clickthrough rate. Knowing that can actually help you make better decisions about which keywords to spend more time optimizing your website for. Aaron Wall was also kind enough to point this out using his own data. Thank you, Aaron.
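As an illustration of that idea, here’s a minimal sketch that turns the impression and click counts from my tables above into a clickthrough rate per keyword, so the keywords leaking the most clicks float to the top. It’s just the CTR arithmetic, nothing Aaron or Google provides.

```python
# Clickthrough rate per keyword: CTR = clicks / impressions.
# A low CTR on a high-impression keyword is a candidate for a better title,
# meta description, or more focused optimization.
keywords = {
    # keyword: (impressions, clicks) from Webmaster Tools
    "online running log": (260, 16),
    "free running log": (91, 28),
    "running logs free": (36, 5),
}

ctr = {kw: clicks / impressions for kw, (impressions, clicks) in keywords.items()}

# Sort by CTR, lowest first, to surface the keywords losing the most clicks.
for kw, rate in sorted(ctr.items(), key=lambda item: item[1]):
    impressions, clicks = keywords[kw]
    print(f"{kw:20s} impressions: {impressions:>4}  clicks: {clicks:>3}  CTR: {rate:.1%}")
```

In this set, “online running log” has by far the most impressions but the lowest CTR (about 6%), so that’s the keyword where better optimization would pay off most.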

Google And Twitter Spam Disaster

December 11th, 2009 by SEOsean

Google recently released its personalized search with the inclusion of Twitter tweets. It seems Google has reached another milestone in search technology… but wait a minute - one thing Google might have missed is that this could be a recipe for a spam disaster.

As Rae Hoffman of Outspoken Media points out in a recent blog post, it can be quite easy for someone to spam the results: “The most obvious thing was the ability to real time spam Google’s results.” Just post a tweet about a certain topic and you’re on the first page of Google!

The scary part in all this is that there seems to be no spam filter stopping someone’s tweet from showing up in the search results. In fact, I ran my own test to see if I could post something that would show up. I searched Google for SEO and posted a few things on Twitter, and you’ll see I quickly dominated the Twitter results with my tweets! FYI, my Twitter name is @gallagherdesign.

After running these tests, though, I did find a possible spam filter. It seems there is a time limit one must wait between tweets or Google won’t show your tweet. I also noticed Google would sometimes only show 1 or 2 of my tweets and seemed to block the rest. Maybe there is hope yet!

Now back to the topic in general… What does this all mean for brands? Well, imagine someone doesn’t like your brand, or they’re a competitor. They could just sit at their computer, or hire someone, to tweet all day about how bad your brand is. And what could you do to stop them and keep what they say from showing up in the search engines? Of course, they could always naturally optimize a web page and do the same thing.

So what about Google’s brand? I noticed there are no Twitter results showing up when you search the keyword “Google” - or at least when I searched, I found none. In addition, a lot of the brands I checked don’t have Twitter results showing up on the first search results page; instead you have to go into the options and click the “Latest” link to see them.

Maybe with time Google will add a Webmaster Tools option where you can opt in or out of allowing these types of results to show up for your brand - though it would probably only be for big brands like Nike or Walmart. It would be smart for them to add something like that.

Twitter Tweets in Google Search Results

October 21st, 2009 by SEOsean

Google just announced (10/21/09) that they will include relevant real-time Twitter tweets in their search results! Google states that in their quest to provide real-time search results they have decided to include tweets in their results.

How this affects search results:
Apparently only those searches that can be aided by a real-time observation will show related tweets. Google gives the example of searching for snow conditions at your favorite ski resort. So I would imagine the Google team will come up with an algorithm that recognizes specific searches and then displays the tweets. But what I’m interested in finding out is where these are going to be displayed. Are they going to be at the top of the search results, near the bottom, or off to the side somewhere? I guess we will have to wait and see.

When will we start to see tweets in the results:
According to Google, tweets should start appearing in its search results in the upcoming months. Google states: “…we look forward to having a product that showcases how tweets can make search better in the coming months.”

How this affects SEO:
Depending on how and where tweets are displayed in the search results, this could have a significant impact on search and SEO. If tweets are displayed in a highly visible spot, search users might be more inclined to read or click on a tweet rather than a regular search result. We will have to see a study on this, but it could well diminish the number of clicks on natural and pay-per-click listings.

So this could impact organic SEO in a negative way. But at the same time, it could also help. Users will be able to simply tweet about a topic they want to rank for and *chirp* - they’re on the first page of Google for a competitive keyword. Make that even better by adding a link in your tweet back to your site, and you have an effective traffic source.

Hear it from the horse’s mouth:
To read exactly what Google says about the subject, please review their recent blog post announcing the Twitter and Google partnership: RT @google: Tweets and updates and search, oh my!

Don’t forget to follow us on Twitter: @gallagherdesign

You Can Benefit From Being Content Scraped

September 25th, 2009 by SEOsean

If you own a website - especially a blog, or a site you post to frequently that could be content scraped - I’m sure you can see how frustrating it is when your hard work gets scraped off your site and onto someone else’s. Even though it’s frustrating and doesn’t seem right, I’ve still been debating whether or not you can actually be harmed by being content scraped.

Up until recently I was up in arms over the debate. Could you really be harmed by duplicate content on another site, and would that site show up higher than you in the search results? Those were good questions, and I’d seen answers and good arguments for both sides - some people saying yes, some saying no.

So I sought out a definitive answer to the question. I found that for my own blogs and websites, letting people scrape my content has not posed a problem. All of my sites show up higher in the search results than the content-scraping sites. I also don’t seem to be getting any sort of penalty from Google for this, so it all looks good from here.

But the question still remained, since this little study only covered a few websites. So I concluded that the answer might be different for each site: I’d have to test each site I work on and determine whether being scraped is a positive or a negative for that site. To my surprise, though, today I came across a video by Matt Cutts that puts some, if not all, of this question to rest.

In the above video Matt Cutts confirms that having people scrape your content can actually be beneficial to your site. How? Links (it’s all about the links). If the content that someone scraped contains links pointing back to your site, those links on the content scraper’s site count as links back to your site. That also means you should not be getting any sort of penalty for the duplicate content.

Pretty interesting. See, I was not sure Google would count these as links, because in most cases it’s a version of your content linking back to your content. But apparently Google gives you some credit for this, which I actually do see in the link results within my Webmaster Central account.

So I’d suggest letting people scrape your content, but make sure you include links back to your website in your content and that those links use the full website address. You might also do the same with the images in your posts.
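If your posts go out through a feed that scrapers pick up, one way to make sure any scraped copy still links back is to rewrite relative links to absolute ones before publishing. Here’s a minimal sketch that assumes the post body is an HTML snippet and uses BeautifulSoup plus urljoin; the base URL and helper name are my own illustrative choices, not something from the post.

```python
from urllib.parse import urljoin

from bs4 import BeautifulSoup  # pip install beautifulsoup4

SITE_URL = "http://www.seosean.com/"  # your site's base address

def absolutize_links(post_html: str, base_url: str = SITE_URL) -> str:
    """Rewrite relative href/src attributes to full URLs so scraped copies
    still point back at your site."""
    soup = BeautifulSoup(post_html, "html.parser")
    for tag, attr in (("a", "href"), ("img", "src")):
        for element in soup.find_all(tag):
            if element.get(attr):
                element[attr] = urljoin(base_url, element[attr])
    return str(soup)

# Example: a relative link becomes a full link back to the site.
print(absolutize_links('<p>Read my <a href="/blog/seo-tips">SEO tips</a>.</p>'))
```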

Now I guess the only questions remaining are:

  1. Is this always true for every site? I’ve heard other webmasters talk about how their site shows up below some content scraper. So I guess you need to make sure you have at least some PageRank and TrustRank; then this should not be an issue.
  2. There is still the issue that some people might visit the content scraper’s site instead of yours, for whatever reason. If you make revenue from ads or some other conversion, you are then losing out. In that case you might still not want people to scrape your content.

What are your thoughts? Let us know by commenting.
P.S. I’ll know if you’re scraping our content :)

Twitter Issues - Not Working!

August 7th, 2009 by SEOsean
Twitter Bird Dead

This is just a quick post about some Twitter issues I seem to be having. Twitter does not seem to be letting me log in from the home page, but I can from other pages using the exact same username/password. Anyway, at least I finally got around that hurdle.

Now, though, I don’t seem to be able to post anything new on Twitter. I just get the loading image when I click the update button on my Twitter home page after typing a message. I’ve even left the page open for a good hour or two and it’s still not working!

So I came back again later and it’s still not working! Ahrrrrrrr, I hate when this kind of stuff happens. It’s not that big of a deal, but I’d like to at least update my Twitter page with some of the new posts we’re making on our blog here.

I’ll try again later and hopefully they will have the issue resolved. I’m guessing it has something to do with the browser, though, as I’m able to update using a Twitter iPhone app. Plus I just saw this message posted on the Twitter status site: “Problems with updating from Firefox 3.5.” The only catch is I’m not using that version of Firefox - but I did just update my version, so maybe it has something to do with that.

Again hopefully they will get this issue fixed quickly!

Video on Domain Trust and Domain Authority

August 7th, 2009 by SEOsean

I recently stumbled onto a great video on Domain Trust and Domain Authority. In the past I’ve been skeptical of some of the ideas and thinking behind what Rand Fishkin from SEOmoz has said, but I agree with him on this understanding of Domain Trust and Domain Authority.

His video (posted below) outlines exactly what Domain Trust and Authority are and how to use them to benefit your site. So what are these things?

Domain Trust - the trust search engines place in your website’s domain. If you have higher domain trust, then your site - including every page on it - gets that same trust, because it’s all the same domain.

Domain Authority - the authority search engines place on your site’s domain name. Again, this authority affects all pages on the same domain.

Rand outlines the factors behind each of these with the following points, which I agree are some of the important signals that make up Domain Trust and Domain Authority (a toy sketch of how such signals might combine follows the two lists below).

Domain Trust

  1. Who links to you?
  2. Who do you link to?
  3. Registration info (domain name registration info)
  4. User data signals (data from internet users through various sources like browser, analytics, etc…)

Domain Authority

  1. Link juice/PageRank
  2. Diversity of link sources
  3. Temporal analysis (time or how quickly/slowly links are built)
  4. Distribution analysis (distribution of links to the pages within your site)
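To make the idea of combining signals concrete, here is a purely hypothetical sketch - not Rand’s formula and not anything a search engine actually uses - that folds a handful of made-up, normalized signal values into a single weighted score. All of the signal names, weights, and numbers are illustrative assumptions.

```python
# Hypothetical illustration only: a weighted combination of normalized
# (0.0-1.0) signals, loosely named after the trust factors listed above.
# None of these weights or values come from Google or SEOmoz.
TRUST_WEIGHTS = {
    "inbound_link_trust": 0.4,   # who links to you
    "outbound_link_trust": 0.2,  # who you link to
    "registration_info": 0.1,    # domain registration signals
    "user_data_signals": 0.3,    # browser/analytics-style usage data
}

def weighted_score(signals: dict, weights: dict) -> float:
    """Combine normalized signal values into a single score between 0 and 1."""
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

example_domain = {
    "inbound_link_trust": 0.8,
    "outbound_link_trust": 0.6,
    "registration_info": 0.9,
    "user_data_signals": 0.5,
}

print(f"Toy domain trust score: {weighted_score(example_domain, TRUST_WEIGHTS):.2f}")
```

The same shape would apply to the authority signals; only the inputs and weights would differ.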

Share your comments and thoughts on Domain Trust and Authority below by posting a comment!

Definitions for Twitter Terms

July 4th, 2009 by SEOsean

After recently starting a Twitter account for our company (check us out: @gallagherdesign) and hearing people ask what different things mean in the “Twitter language,” I decided I should write a post to help define that language. I’m also going to update this post as new words get introduced into the Twitter community.

  • Twitter - is a free micro-blogging website, or in other words a free social networking site, that lets users post short messages (limited to 140 characters) on their personal page, which can then be viewed by anyone.
  • Tweet - is the name given to the short updates you can post on your Twitter account/page.
  • Hashtag - is a tag made of a keyword or term starting with the hash (#) character and is used to indicate a category or topic for a tweet/post on Twitter. To use a hashtag, type the hash character and then the keyword/term with no spaces.
    Example: #seosean
  • RT - means to repeat (retweet) a tweet or message from someone else. The tweet/post usually contains the username of the person the message is from, to give them credit for it.
    Example: RT @gallagherdesign Twitter language defined: http://www.seosean.com
  • OH - means you are posting something you overheard that is not your original thought.
  • @ - the @ symbol designates a tweet/post directed at the user whose username immediately follows the symbol.
    Example: @gallagherdesign I love your site!