Thursday, October 25, 2012

Keyword Research Analysis



Five effective keyword research and analysis tips for website optimization:

Use multiple-word phrases
Instead of targeting only generic words, try synonyms and "long tail" keywords. Research has shown that multi-word phrases can bring more traffic to your site than highly competitive single keywords, because a long tail phrase acts as a unique keyword for the page and increases its chance of reaching the top of the results. Less generic keywords and key phrases can bring a good amount of traffic to the site early on.

Research keywords specifically for each page
Stuffing surplus targeted keywords into the meta tags, or reusing the same meta tags on every page of the site, is a waste of effort. Your keyword research needs to be page specific, focusing on only 2 to 5 keywords per page. This follows SEO best practice, as it gives each page of the site a chance to rank on its own.

Country specific Keyword Research
Do not forget that keyword search terms can be country specific. Different countries use different keyword terms, so research them and reference that country's search engine when doing your initial keyword research. For instance, the UK and Australia may use different expressions, terminology and spellings. Referencing the country specific search engine can yield a good amount of local traffic to your site.

Perform Effective Keyword Analysis
Effective keyword research for your website should be accompanied by an effective analysis of those keywords in the search engines. Check:

  • How competitive are your keywords, and how strong is the competition?
  • Are the other sites listed for your keywords truly your competitors?
  • Is this really the desired niche keyword for that page?
  • Are the sites listed for your keyword even related to your industry, products or services?


Ongoing Keyword Research

Keyword research and analysis is not a one-day job. It requires continuous effort, with consistent modifications as the market changes. Ongoing keyword research and refinement is a must for SEO best practice.

The whole search engine optimization process revolves around the website's overall impact and usefulness, and the content and keywords chosen for the optimization process ultimately decide its fortunes. Keyword research therefore requires expertise and should be done under the supervision of experts.



Keyword Research


Keyword Research is a process employed by search engine optimization professionals to discover and research actual search phrases which people enter into the search engines when conducting a search. Search Engine Optimization involves constant testing, experimenting and improvement. It all begins with words typed into a search box. Ranking for the "right" keywords can make or break your website. Through the detective work of puzzling out your market's keyword demand, you not only learn which terms and phrases to target with SEO, but also learn more about your customers as a whole.
It's not always about getting visitors to your site, but about getting the right kind of visitors.

When search engines look for information for their users, they scan websites for keywords and key phrases. Because search engines want the most informative and useful sites for their visitors, they prefer sites whose content gives the highest priority to the keywords being searched. Expert keyword research is the cornerstone of successful SEO. There are many keyword research and generator tools, but in reality these tools can only give an approximate idea of which keywords might be searched for a website like yours. The actual keyword analysis can only be done by a person who has analyzed the site from the perspective of its users and potential customers.

Let us talk about Google's AdWords Keyword tool which is a common starting point for SEO keyword research. It not only suggests keywords and provides estimated search volume, but also predicts the cost of running paid campaigns for these terms. To determine volume for a particular keyword, be sure to set the Match Type to [Exact] and look under Local Monthly Searches. Remember that these represent total searches. Depending on your ranking and click-through rate, the actual number of visitors you achieve for these keywords will usually be much lower.

Lastly, keywords should also be used in several other elements on your site:
  • Title Tag
  • Meta Description Tags
  • Headings
  • Alt text
  • Anchor Text/ Navigational Links
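As a sketch of those placements, here is what a page optimized around a hypothetical phrase like "running shoes" might look like (all names and URLs below are illustrative, not from the original post):

```html
<head>
  <!-- Title tag: primary keyword first, brand last -->
  <title>Running Shoes for Trail and Road | Example Store</title>
  <!-- Meta description: summarizes the page using the same phrase -->
  <meta name="description" content="Shop lightweight running shoes for trail and road, with free returns.">
</head>
<body>
  <!-- Heading repeats the key phrase -->
  <h1>Running Shoes</h1>
  <!-- Alt text describes the image with the keyword -->
  <img src="/images/trail-running-shoes.jpg" alt="Trail running shoes">
  <!-- Anchor text / navigational link uses the keyword, not "click here" -->
  <a href="/running-shoes/trail/">Browse trail running shoes</a>
</body>
```

Note how one key phrase threads through all five elements without being repeated unnaturally.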

Saturday, September 22, 2012

SEO Techniques for Website Optimization


1. Alt Tags
Alt attributes are a way to describe images to search spiders simply because search spiders cannot see them. It is not only an opportunity to add more content (with keywords) to your site, but it also helps your images to be searchable on Google. For best results, all images should also have distinct filenames.
Alt tags were originally designed to be displayed as an alternative text to an image in cases when an image cannot be displayed in a browser. Also, alt tags are displayed when users rest their mouse on the image.
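A minimal sketch of the alt attribute in practice (the filename and text are illustrative):

```html
<!-- Descriptive filename plus alt text that describes the image to search spiders -->
<img src="/images/red-leather-handbag.jpg" alt="Red leather handbag with shoulder strap">
```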

2. Title Tags
A title tag describes an online document and carries the most SEO weight for establishing keyword relevance on the page. The title tag appears at the very top of a browser or tab window, in search result pages, and on external websites.
At 70 characters or less (including spaces), the title tag should include a few relevant keywords and a company or brand name. Important keywords should be placed at the beginning of the title tag. It is recommended to avoid using stop words such as “the,” “is,” “can,” “and,” “but,” “while,” “that,” “we,” etc. that have no keyword value.
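For example, a title tag under 70 characters with the important keywords in front and the brand name at the end might look like this (names are illustrative):

```html
<head>
  <!-- Keywords first, brand last, no stop words -->
  <title>Handmade Leather Wallets | Example Goods</title>
</head>
```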

3. Meta Descriptions
Meta tags give search spiders a summary of what the page is about. Meta descriptions play an important role in search results and gaining users click-through. Google sometimes uses this summary to display in search result pages (SERPs) and highlights keywords relevant to the search query.
The optimal length of a meta description is 150-160 characters, and it should include keywords relevant to the title tag and content. Consider this an opportunity to advertise the content of your page to searchers.
Ideally, each page should be optimized around one main key phrase that should be included in your title tag, meta description, and content.
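Putting the pieces together, a page optimized around one key phrase might carry a title tag and meta description like this (a hypothetical example, not from the post):

```html
<head>
  <title>Organic Coffee Beans, Fresh Roasted | Example Roasters</title>
  <!-- Around 150-160 characters, echoing the key phrase from the title and content -->
  <meta name="description" content="Order organic coffee beans, fresh roasted to order and shipped within 24 hours. Single-origin beans and house blends available with free delivery.">
</head>
```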

4. Sitemap.xml
Google uses Sitemaps to learn about a website’s structure, which allows a better crawling of your site. Creating a Sitemap ensures that search spiders know all the pages of your site that may not be discoverable through the normal crawling process.
Sitemaps are also used to provide metadata about specific types of content on your site such as video, images, mobile, and news. Sitemap.xml files reside in the website main directory and have to be submitted to Google using Google Webmaster Tools.
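A minimal Sitemap.xml following the sitemaps.org protocol looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-10-25</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2012-09-01</lastmod>
  </url>
</urlset>
```

Only `loc` is required per URL; `lastmod`, `changefreq`, and `priority` are optional hints to the crawler.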

5. Robots.txt
In addition to Sitemaps, a robots.txt file is used to instruct search engine robots about access to your site. Before accessing pages of a site, search spiders check the robots.txt file to see if it restricts them from accessing certain pages. You only need a robots.txt file if the directory of your site includes content that you don’t want search engines to index. The robots.txt file resides in the main directory of your site.
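A small robots.txt sketch that blocks two private directories for all robots and advertises the Sitemap location (the paths are illustrative):

```text
# Applies to all search engine robots
User-agent: *
# Keep these directories out of the crawl
Disallow: /admin/
Disallow: /tmp/
# Optional: point spiders at the Sitemap
Sitemap: http://www.example.com/sitemap.xml
```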

6. Rich Snippets and Microdata
High ranking on the search engine result pages (SERPs) is no longer the only problem to solve with website optimization. When searchers glance through result pages, they look for additional information relevant to their query before they click through. Rich snippets provide searchers with that information and help them quickly decide whether the content on a web page is relevant.
To implement rich snippets, web developers have to add additional HTML markup to a website. One of the most widely used markups is microdata. Using simple HTML tags, microdata allows you to indicate specific types of information: reviews, people, products, businesses and organizations, recipes, events, and music and video content.
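As a sketch, a product review marked up with microdata using the schema.org vocabulary might look like this (the names and values are made up):

```html
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Example Running Shoe</span>
  </span>
  Reviewed by <span itemprop="author">Jane Doe</span>:
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    rated <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>.
  </div>
</div>
```

The `itemscope`/`itemtype` attributes declare what kind of thing is being described, and each `itemprop` labels a piece of data the search engine can pull into a rich snippet.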

7. Code Restructuring
Though search algorithms have changed dramatically since their original inception, search spiders still don’t search deep into a site and don’t spend much time on any given page. With this in mind, code restructuring of some pages is still a valid addition to any advanced SEO techniques. Web developers restructure a page (without affecting a page layout) in a way that places content higher in the code. This helps spiders quickly find relevant information.
The above advanced SEO techniques will significantly improve your website’s online visibility, but that’s not the limit of the optimization that can be done to a website. A website can be further optimized for better usability with accesskeys (keyboard shortcuts), tabindex (using the tab key to navigate through a page), and web accessibility for people with disabilities. It might sound like unnecessary optimization, but everything that makes a website more human friendly is highly valued by search engines.

Friday, August 31, 2012

What’s The Difference Between Google Analytics And Google Webmaster Tools?


In today’s post, you’ll discover the differences between Google Analytics and Google Webmaster tools and how they benefit your business.
Google Analytics: Google Analytics gives you the following information:
1. Web Traffic: The number of people visiting your site on a daily, monthly and yearly basis. You can also compare your traffic stats to previous years.
2. Visitors:
  • The number of website visitors that are new vs. returning.
  • The length of time your visitors spend on your site.
  • The number of pages your visitors view.
  • The number of visitors who landed on your site and left immediately.
  • The geographic location they arrived from.
  • The websites that referred visitors to your site.
3. Keywords:

  • The top keywords bringing traffic to your website.
  • The keywords producing the most sales.
  • The keywords keeping your visitors on your site the longest.
  • The keywords people type to find your site and cause them to leave right away.
4. Content: 
  • The web pages your visitors view most often.
  • The pages bringing you the most traffic.
  • The pages from which your visitors most often leave your site.
5. Conversions: The conversion rates for the goals you set on your site.
If the goal for your site is lead generation, how many of your total visitors are giving you their information in exchange for your lead magnet?
If your goal is to sell products, how many people buy when they land on your sales page?
Google Webmaster Tools: Google Webmaster Tools reveals how Google sees your site online. You can use this information to fix problems with your site and improve your site’s visibility with Google and your buyers. Below you’ll find just some of the information you can get from Google Webmaster Tools.
1. Crawling Errors: Google Webmaster Tools shows you the pages on your site that Google is unable to crawl. The most important aspect of SEO is crawler accessibility: if Google cannot crawl your website, you can forget about top rankings.
2. Search Queries: This is information on the pages of your site that Google has returned to searchers for specific queries.
You can also find information about the pages on your website that were shown to searchers most often. This is one of the most important and underutilized tools.
 
Here you’ll be able to get information on things like:
  • The total number of search queries that returned pages from your site.
  • Your top search queries that returned pages in Google.
  • The number of times your pages were viewed in Google’s search results.
  • The number of times your listings were clicked on for a particular search query.
  • The percentage of times your listing was clicked for a particular search query.
  • The average position of your website for a particular search query.
3. Links: The number of links coming into your site from other websites.
4. Malware: Google Webmaster Tools will notify you if your site becomes infected with malware. Google can block your site if it becomes infected. If this happens, you’ll need to have your site cleaned and resubmitted to Google for their approval. The way you do this is through Google webmaster tools.
5. HTML Errors: Google will notify you if they find any HTML errors on your site.
To summarize the differences between Google Analytics And Google Webmaster Tools:
  • Google analytics is basically a website statistics tool showing you how and how many people visit your site.
  • Google Webmaster tool shows you how Google sees your site from a search engine perspective.
The really great news is you can now connect your Google Analytics with your Google Webmaster Tools account.



Thursday, August 30, 2012

Difference between Google panda and Google Penguin


SEO marketers would surely have loved to receive a heads-up regarding the fine details of the Google Panda and Google Penguin updates: the now infamous search engine updates that caused several brands to lose visibility, and forced even more to make dramatic changes to their search engine optimization strategies.

Matt Cutts, one of the top software engineers at Google, actually warned the public about some of the changes in an interview. Cutts’s friendly heads-up involved two main issues regarding search:
  1. Over optimization will start to negatively affect rankings.
  2. The quality of content will soon be more of a focal point when determining rankings.
In Cutts’s own words, Google is attempting to “level the playing field” by essentially giving SEO marketers with great optimization skills and those with great content an equal opportunity to increase their visibility in the SERPs (Search Engine Result Pages). The mission is all about enhancing the efficiency of the so-called “GoogleBot”, the company’s software that crawls the web and indexes pages, and, in turn, creating a more relevant and useful experience for the internet searcher. As a result, those who go overboard with their SEO will start to be penalized, while those who improve in the quality department will be rewarded.

Google Panda
Google dropped the bomb that was the Panda update in February of 2011. As you probably know, Google Panda had some complex components to it, but its primary objectives were simple – penalize sites with lower quality content and in the process, return sites with high quality content to the top of the results where they belong.

Google Penguin
Google launched the Penguin update in April 2012. This focused more on tackling spam and penalizing sites that didn’t follow Google’s quality guidelines. Google Penguin also looks to tackle the black hat SEO techniques that many sites use to increase their search engine rankings. These include techniques such as:
  • Keyword/key phrase stuffing within content.
  • Comment spamming by including links using the same anchor text.
  • Links from article directories and directory listings.
  • Having excessive/unnatural internal links.
  • Too many external links pointing to your site that use the exact same anchor text.
  • Excessive links from low quality sites.

Google’s initiative to level the playing field is just another step in the road to making quality more important than quantity. It may take SEO marketers some getting used to, but those who want to keep benefiting from search traffic will embrace the Google Panda and Google Penguin updates and do what it takes to keep their rankings intact.

Wednesday, August 29, 2012

SEO Basics



1. Domain Canonicalization

Sounds complicated, huh? Don’t worry – it’s not.
Try going into your URL bar and entering the non-www and www versions of your site. If they both show up, as http://yoursite.com and http://www.yoursite.com, you have canonicalization issues. All you need to do here is 301 redirect the version with fewer links to the one with more links. You can use Open Site Explorer to determine which version has fewer links.
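On an Apache server, the redirect itself is commonly done with mod_rewrite in an .htaccess file. A sketch assuming the www version is the stronger one (swap the direction if yours is the opposite):

```apache
# Requires mod_rewrite; sends non-www traffic to www with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The 301 status code tells search engines the move is permanent, so the links pointing at the old version are consolidated onto the canonical one.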

2. Title

The title is one of the most important on-page SEO factors so make sure you get this right. Generally, you want all of your titles to have your target keywords in the front and have the site name in the back.
For example, look at how Zappos structures their title:
Shoes, Clothing, and More | Zappos.com

3. Breadcrumbs and Related Links

Internal links help structure the strength of inner pages, so take advantage of them. Think of your site as an ant hill with water pouring down the tunnels: ideally, the water would flow through all the tunnels and fill them all up. Breadcrumbs are a great way of creating internal links. Here’s an example of what a breadcrumb trail looks like: Home » Men’s Clothing » Shirts.

In addition, linking to related articles or related products (if you’re an e-commerce store) goes a long way in telling the search engines that these inner pages have some importance as well.
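A breadcrumb trail is just a row of internal links back up the site hierarchy. A minimal sketch with hypothetical URLs:

```html
<!-- Each crumb is an internal link passing strength to a category page -->
<div class="breadcrumbs">
  <a href="/">Home</a> &raquo;
  <a href="/mens-clothing/">Men's Clothing</a> &raquo;
  <a href="/mens-clothing/shirts/">Shirts</a>
</div>
```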


4. Robots control

Many webmasters like to use the robots.txt to block portions of their websites. As an alternative, try using the ‘noindex, follow’ tag on pages because this will allow the link juice to flow freely throughout your site. Using the robots.txt basically creates a black hole for link juice – it stops the flow.
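The tag goes in the head of each page you want kept out of the index while still letting link juice flow through its links (a minimal sketch):

```html
<head>
  <!-- Keep this page out of the index, but let spiders follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```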

5. Alt tags

As of today, search engines still have difficulty discerning what an image is about. Make sure you use alt tags when you add images so search engines can crawl the text. By doing this, each image you upload will yield some SEO benefit. Remember: no alt tag means you won’t get credit for the picture.

Here’s an example:
<img src="http://www.evergreensearch.com/wp-content/uploads/2012/01/zappos-breadcrumbs.jpeg" alt="zappos breadcrumbs" title="zappos breadcrumbs" width="326" height="186" class="aligncenter size-full wp-image-674" />

6. User Generated Content

Whether you own a blog or an e-commerce store, user generated content is a great way to boost long tail search traffic to your site. The additional benefits include adding more engagement and building trust on your site. Over 62% of people read online reviews, which means that if you have an e-commerce site, you’d be shooting yourself in the foot not to include user generated reviews. These will help increase conversion rates and help you generate content on your product pages to increase organic traffic.
If you have a blog, you can get great user engagement from your great content. The result: lively discussions in your blog comments section and added content at no cost. UGC is a scalable form of SEO that most webmasters should not miss out on.

Conclusion

SEO doesn’t have to be hard. While the industry is rapidly changing every year, there are still a few timeless SEO laws that still exist. Pay attention to those laws and make the simple changes outlined in this post to give your organic traffic a bump. Once you work out all the kinks on your site and are doing the most important things efficiently, you can dive into more heavy lifting SEO tasks. Until then, just get the basics right.