Wednesday, January 2, 2013

Focus on Target Audience to Improve SEO Efforts


If you want to perform well in the search engines, then you need to stop focusing on pleasing the search engines. That doesn't seem to make much sense, does it? Let's explain.

The search engines want to provide users with the best possible results. If the results aren't relevant to a search, don't meet a user's needs, or are filled with low-quality sites, the user isn't going to have a very good experience. The search engines' primary revenue source is visitors clicking on paid listings in the sponsored section. If search engine users aren't getting good results, they will find other ways to locate what they are looking for and won't be clicking on ads, which means the search engines won't be generating revenue. Like any other business, the search engines aim to please their customers, i.e. their users. For this reason, the search engines (more specifically, Google) are changing their algorithms to attempt to "think" more like a human. SEO is no longer about pleasing search engine robots; it's about pleasing actual people.

Think about all of the shifts that have taken place in the SEO industry even just within the last few years. One of the first major changes was the Google Panda update. The targets of the Panda update were websites that produced what was considered to be low-quality content. Website owners knew that in order to get recognized by the search engines they had to produce content across the web that included targeted keywords and links. In many cases, how an actual visitor viewed the content was an afterthought. Content distribution was done mostly for link-building purposes, and quantity was the focus over quality. The more recent Google Penguin update was a reminder that Google is serious about cleaning up and clearing out the spam that lives amid its search results. Penguin went after sites that were guilty of too much SEO, or "over-optimization". For a human visitor, there is no need to include five keyword anchor text links in a 500-word article or to launch keyword-rich-domain microsites. That kind of behavior is clearly meant for the search engines only, and the search engines have made their stance loud and clear: stop doing it already!

Another example of how the search engines are focusing more on the user experience to rank websites is the emphasis on social search. People want suggestions from other people. If certain content performs well in social media, it is presumably of good quality. The search engines no longer have to rely on backlink data alone to infer what kind of content people want to see. The evidence is right there in the re-tweets, shares, likes, comments, and so on.

The bottom line is to stop thinking so much about the search engine spiders and think about your target audience instead. What kind of content would they like to see? This doesn't mean that SEO best practice is dead. It's still important to conduct keyword research, optimize your website, and build inbound links, but the focus has shifted. Write content that appeals to your target audience and build links from places that they might actually visit. Keep it natural and don't over-optimize anything. By focusing on your target audience, you have the best chance of succeeding in the search engines, and online in general, for the long term.

Tuesday, December 18, 2012

How Do Search Engines Work?

The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious to everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea of what a site is about. This brief explanation is not the most precise because, as we will see next, search engines perform several activities in order to deliver search results: crawling, indexing, processing, calculating relevancy, and retrieving.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Given the number of pages on the Web (well over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or if an existing page has been modified; sometimes crawlers may not end up visiting your site for a month or two.
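
To make the crawl step concrete, here is a minimal sketch of how a spider follows links from page to page, written in Python using only the standard library. The starting URL, the page limit, and the very simple link extraction are illustrative assumptions, not how Googlebot actually works.

```python
# Minimal illustration of a crawl frontier: fetch a page, collect its links,
# and queue them for later visits. Real crawlers add politeness delays,
# robots.txt checks, deduplication, and massive parallelism.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    frontier = deque([start_url])   # pages waiting to be visited
    visited = set()                 # pages already fetched
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue                # unreachable pages are simply skipped
        visited.add(url)
        parser = LinkCollector()
        parser.feed(html)
        for link in parser.links:
            frontier.append(urljoin(url, link))  # resolve relative links
    return visited


if __name__ == "__main__":
    print(crawl("https://example.com"))  # example.com is a placeholder
```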

What you can do is check what a crawler sees on your site. As already mentioned, crawlers are not humans, and they do not see images, Flash movies, JavaScript, frames, password-protected pages, or directories, so if you have tons of these on your site, you'd better run a spider simulator (like the sketch below) to see whether these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed, etc. - in a word, they will be non-existent for search engines.
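
The "spider simulator" idea can be approximated with a few lines of Python: strip out everything a crawler ignores (scripts, styles, markup) and keep only the text it can actually read. This is a rough sketch, not the exact logic any search engine uses, and the sample page is made up.

```python
# Rough "spider simulator": show only the text a crawler can read on a page.
# Images, Flash, and JavaScript-generated content simply do not appear here.
from html.parser import HTMLParser


class TextOnlyView(HTMLParser):
    IGNORED = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.IGNORED:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.IGNORED and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())


html = """<html><head><title>Blue widgets</title>
<script>render_fancy_menu();</script></head>
<body><h1>Blue widgets for sale</h1><img src="widget.jpg"></body></html>"""

viewer = TextOnlyView()
viewer.feed(html)
print(" ".join(viewer.chunks))  # -> "Blue widgets Blue widgets for sale"
```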

After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from which it can later be retrieved. Essentially, the process of indexing is identifying the words and expressions that best describe the page and assigning the page to particular keywords. For a human it would not be possible to process such amounts of information, but search engines generally deal just fine with this task. Sometimes they might not get the meaning of a page right, but if you help them by optimizing it, it will be easier for them to classify your pages correctly and for you to get higher rankings.
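
Conceptually, the index is close to a dictionary that maps each word to the pages it appears on (an inverted index). The sketch below is a toy version in Python with invented page data; real indexes also store word positions, weights, and far more sophisticated text processing.

```python
# Toy inverted index: map every word to the set of pages containing it.
from collections import defaultdict

pages = {  # illustrative page texts, not real crawl data
    "https://example.com/red":  "red widgets and red gadgets",
    "https://example.com/blue": "blue widgets on sale",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

# Retrieval is then a lookup plus an intersection for multi-word queries.
query = "red widgets"
matches = set.intersection(*(index.get(w, set()) for w in query.lower().split()))
print(matches)  # {'https://example.com/red'}
```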

When a search request comes in, the search engine processes it, i.e. it compares the search string in the search request with the indexed pages in the database. Since it is likely that more than one page (in practice, millions of pages) contains the search string, the search engine starts calculating the relevancy of each of the pages in its index to the search string.

There are various algorithms for calculating relevancy. Each of these algorithms gives different relative weights to common factors like keyword density, links, or meta tags. That is why different search engines return different search results pages for the same search string. What is more, all major search engines, like Yahoo!, Google, and Bing, periodically change their algorithms, and if you want to stay at the top, you also need to adapt your pages to the latest changes. This is one reason (the other being your competitors) to devote ongoing effort to SEO if you'd like to stay at the top.
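
As a simplified illustration of "different relative weights for common factors," the sketch below scores pages by combining keyword density, a title match, and an inbound link count. The factors, weights, and sample pages are made up for the example; real ranking algorithms use hundreds of signals.

```python
# Illustrative relevancy score: weighted mix of keyword density,
# title match, and inbound link count. Weights are arbitrary examples.
WEIGHTS = {"density": 0.5, "title": 0.3, "links": 0.2}

def relevancy(query, page):
    # Density here assumes a single-word query, purely for simplicity.
    words = page["text"].lower().split()
    density = words.count(query.lower()) / max(len(words), 1)
    title_match = 1.0 if query.lower() in page["title"].lower() else 0.0
    link_score = min(page["inbound_links"] / 100, 1.0)  # cap at 100 links
    return (WEIGHTS["density"] * density
            + WEIGHTS["title"] * title_match
            + WEIGHTS["links"] * link_score)

pages = [
    {"title": "Blue widgets", "text": "blue widgets widgets sale", "inbound_links": 40},
    {"title": "Gadget news",  "text": "widgets mentioned once here", "inbound_links": 90},
]
# Sort candidate pages from most to least relevant for the query.
ranked = sorted(pages, key=lambda p: relevancy("widgets", p), reverse=True)
print([p["title"] for p in ranked])  # ['Blue widgets', 'Gadget news']
```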

The last step in search engines' activity is retrieving the results. Basically, it is nothing more than simply displaying them in the browser – i.e. the endless pages of search results that are sorted from the most relevant to the least relevant sites.

Thursday, October 25, 2012

Keyword Research Analysis



Five effective keyword research and analysis tips for website optimization:

Use multiple-word phrases
Instead of using more generic words, try synonyms or "long tail" keywords. Research has shown that multi-word phrases can bring more traffic to your site than highly competitive head keyword phrases. Long-tail keywords act as more distinctive keywords for the site, which increases the chance of ranking at the top, and these less generic keywords or key phrases can bring a good amount of traffic to the site early on.
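
As a quick illustration, a keyword list can be split into head terms and long-tail phrases simply by word count; the example keywords and the three-word threshold below are assumptions for the sketch.

```python
# Split a keyword list into short head terms and long-tail phrases.
keywords = [
    "shoes",                                    # highly competitive head term
    "running shoes",
    "best trail running shoes for flat feet",   # long-tail phrase
    "waterproof running shoes women",
]

LONG_TAIL_MIN_WORDS = 3  # illustrative threshold
long_tail = [k for k in keywords if len(k.split()) >= LONG_TAIL_MIN_WORDS]
head_terms = [k for k in keywords if len(k.split()) < LONG_TAIL_MIN_WORDS]

print("Target first:", long_tail)
print("Harder to win:", head_terms)
```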

Research keywords specifically for each page
Stuffing surplus targeted keywords into the meta tags and using the same meta tags on every page of the site is a complete waste. Your keyword research needs to be page-specific, focusing on only 2 to 5 keywords per page. This is more effective SEO practice, as it gives each page of the site a chance to rank highly on its own.

Country specific Keyword Research
Do not forget that keyword search terms can be country-specific. Different countries use different keyword terms, so research them and reference that country's version of the search engine when doing your initial keyword research. For instance, the UK and Australia may use different expressions, terminology, and spellings. Referencing the country-specific search engine can yield a good amount of local traffic to your site.

Perform Effective Keyword Analysis
Effective keyword research for your website should be accompanied by an effective analysis of those keywords in the search engines to check:

  • The competitiveness of your keywords, along with the strength of the competition.
  • Whether the other sites listed for your keywords are truly your competitors.
  • Whether it is really the desired niche keyword for that page.
  • Whether the sites listed for your keyword are even related to your industry, products or services.


Ongoing Keyword Research

Keyword research and analysis is not a one-day task. It requires continuous effort to stay competitive: your keyword list needs consistent modification as the market changes. Ongoing keyword research and refinement is a must for best SEO practice.

The whole search engine optimization process revolves around a website's overall impact and usefulness, and the content and keywords chosen for the optimization process largely decide its fortune. Therefore keyword research requires expertise and should be done under the supervision of experts.



Keyword Research


Keyword Research is a process employed by search engine optimization professionals to discover and research actual search phrases which people enter into the search engines when conducting a search. Search Engine Optimization involves constant testing, experimenting and improvement. It all begins with words typed into a search box. Ranking for the "right" keywords can make or break your website. Through the detective work of puzzling out your market's keyword demand, you not only learn which terms and phrases to target with SEO, but also learn more about your customers as a whole.
It's not always about getting visitors to your site, “but about getting the right kind of visitors.

When search engines look for information for web users, they look for keywords or key phrases on a website. Since search engines want the most informative and useful sites for their visitors, they prefer sites whose content gives the highest priority to the keywords being searched. Expert keyword research is the cornerstone of successful SEO. There are many keyword research and generator tools, but in reality these tools can only give an approximate idea of which keywords might be searched for a website like yours. The actual keyword analysis can only be done by a person who has analyzed the site with the user's perspective and potential customers in mind.

Let us talk about Google's AdWords Keyword Tool, which is a common starting point for SEO keyword research. It not only suggests keywords and provides estimated search volume, but also predicts the cost of running paid campaigns for these terms. To determine volume for a particular keyword, be sure to set the Match Type to [Exact] and look under Local Monthly Searches. Remember that these represent total searches. Depending on your ranking and click-through rate, the actual number of visitors you achieve for these keywords will usually be much lower.
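
To translate the tool's numbers into expected visits, a rough back-of-the-envelope calculation looks like the one below; the search volume and the click-through rates by position are illustrative guesses, not figures from Google.

```python
# Rough traffic estimate: exact-match monthly searches x expected CTR
# for your ranking position. All numbers here are illustrative guesses.
exact_match_monthly_searches = 2400

# Assumed average click-through rates by organic position (example values).
ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 5: 0.05, 10: 0.02}

for position, ctr in sorted(ctr_by_position.items()):
    estimated_visits = exact_match_monthly_searches * ctr
    print(f"Position {position:>2}: ~{estimated_visits:.0f} visits/month")
```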

Lastly, keywords should also be used in several other elements on your site (a quick placement check follows this list):
  • Title Tag
  • Meta Description Tags
  • Headings
  • Alt text
  • Anchor Text/ Navigational Links
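
A small script can pull those elements out of a page so you can check whether your keyword actually appears in them. This sketch uses Python's standard-library HTML parser; the sample page and the keyword are made up.

```python
# Extract the on-page elements where keywords should appear:
# title, meta description, headings, image alt text, and anchor text.
from html.parser import HTMLParser


class KeywordPlacementAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = {"title": "", "meta_description": "", "headings": [],
                      "alt_text": [], "anchor_text": []}
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.found["meta_description"] = attrs.get("content", "")
        elif tag == "img" and attrs.get("alt"):
            self.found["alt_text"].append(attrs["alt"])
        elif tag in ("title", "h1", "h2", "h3", "a"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        text = data.strip()
        if not text or self._current is None:
            return
        if self._current == "title":
            self.found["title"] += text
        elif self._current == "a":
            self.found["anchor_text"].append(text)
        else:
            self.found["headings"].append(text)


html = """<html><head><title>Handmade leather bags</title>
<meta name="description" content="Shop handmade leather bags."></head>
<body><h1>Handmade leather bags</h1>
<img src="bag.jpg" alt="brown handmade leather bag">
<a href="/shop">browse leather bags</a></body></html>"""

audit = KeywordPlacementAudit()
audit.feed(html)
keyword = "leather bags"
for element, value in audit.found.items():
    text = value if isinstance(value, str) else " | ".join(value)
    print(f"{element:16} {'OK' if keyword in text.lower() else 'missing'}: {text}")
```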

Saturday, September 22, 2012

SEO Techniques for Website Optimization


1. Alt Tags
Alt attributes are a way to describe images to search spiders simply because search spiders cannot see them. It is not only an opportunity to add more content (with keywords) to your site, but it also helps your images to be searchable on Google. For best results, all images should also have distinct filenames.
Alt text was originally designed to be displayed as alternative text for an image when the image cannot be displayed in a browser. Some older browsers also showed the alt text as a tooltip when users rested the mouse on the image.

2. Title Tags
A title tag describes an online document and carries the most SEO weight for establishing keyword relevance on the page. The title tag appears at the top of the browser or tab window, in search result pages, and on external websites.
At 70 characters or less (including spaces), the title tag should include a few relevant keywords and a company or brand name. Important keywords should be placed at the beginning of the title tag. It is recommended to avoid using stop words such as “the,” “is,” “can,” “and,” “but,” “while,” “that,” “we,” etc. that have no keyword value.
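
Here is a small sketch of those title-tag rules in code: keep the title at or under 70 characters, lead with the important keywords, and drop low-value stop words. The stop-word list, brand name, and example keywords are assumptions for illustration.

```python
# Apply the title-tag guidelines: strip stop words, keep keywords first,
# and warn if the result is longer than 70 characters (spaces included).
STOP_WORDS = {"the", "is", "can", "and", "but", "while", "that", "we", "a", "of"}
MAX_TITLE_LENGTH = 70

def build_title(keywords, brand):
    # Important keywords first, brand name last, stop words removed.
    words = [w for w in " ".join(keywords).split() if w.lower() not in STOP_WORDS]
    title = f"{' '.join(words)} | {brand}"
    if len(title) > MAX_TITLE_LENGTH:
        print(f"Warning: title is {len(title)} characters, trim it")
    return title

print(build_title(["handmade leather bags", "free shipping"], "Acme Leather"))
# -> "handmade leather bags free shipping | Acme Leather"
```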

3. Meta Descriptions
Meta description tags give search spiders a summary of what the page is about. Meta descriptions play an important role in search results and in earning users' click-throughs. Google often uses this summary in search result pages (SERPs) and highlights keywords relevant to the search query.
The optimal length of a meta description is between 150 and 160 characters, and it should include keywords that are relevant to the title tag and content. Consider it an opportunity to advertise the content of your page to searchers.
Ideally, each page should be optimized around one main key phrase that should be included in your title tag, meta description, and content.
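
The "one main key phrase per page" advice is easy to verify programmatically. The sketch below simply checks that an assumed key phrase appears in the title, the meta description, and the body copy, and that the description length stays in the 150-160 character range; all the sample data is invented.

```python
# Check that a page's main key phrase appears in its title, meta description,
# and body content, and that the meta description length is reasonable.
def check_page(key_phrase, title, meta_description, content):
    phrase = key_phrase.lower()
    return {
        "in_title": phrase in title.lower(),
        "in_meta_description": phrase in meta_description.lower(),
        "in_content": phrase in content.lower(),
        "meta_length_ok": 150 <= len(meta_description) <= 160,
    }

# Illustrative page data, not taken from a real site.
print(check_page(
    key_phrase="organic coffee beans",
    title="Organic Coffee Beans Roasted Weekly | Example Roasters",
    meta_description=("Buy organic coffee beans roasted in small batches every "
                      "week, with free shipping on orders over $30 and a "
                      "freshness guarantee on every single burlap bag."),
    content="Our organic coffee beans are roasted in small weekly batches...",
))
```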

4. Sitemap.xml
Google uses Sitemaps to learn about a website's structure, which allows better crawling of your site. Creating a Sitemap ensures that search spiders know about pages of your site that might not be discoverable through the normal crawling process.
Sitemaps are also used to provide metadata about specific types of content on your site, such as video, images, mobile, and news. The sitemap.xml file resides in the website's root directory and can be submitted to Google using Google Webmaster Tools.
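
A Sitemap is just an XML file listing your URLs. The short Python sketch below writes a minimal one with the standard library; the URLs are placeholders, and real sitemaps can also carry optional tags such as lastmod or changefreq.

```python
# Generate a minimal sitemap.xml listing a site's URLs.
import xml.etree.ElementTree as ET

urls = [  # placeholder URLs for the sketch
    "https://example.com/",
    "https://example.com/products",
    "https://example.com/contact",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in urls:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write the file that would live in the site's root directory.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```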

5. Robots.txt
In addition to Sitemaps, a robots.txt file is used to instruct search engine robots about access to your site. Before accessing the pages of a site, search spiders check the robots.txt file to see whether it restricts them from accessing certain pages. You only need a robots.txt file if your site includes content that you don't want search engines to crawl. The robots.txt file resides in the main directory of your site.
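
Python's standard library even includes a robots.txt parser, so you can check how a given crawler would treat a URL. The file contents and URLs in this sketch are hypothetical.

```python
# Check whether a URL may be crawled according to a site's robots.txt.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
# Normally you would call robots.set_url(".../robots.txt") and robots.read();
# here we parse an inline example file instead of fetching one.
robots.parse("""
User-agent: *
Disallow: /private/
""".splitlines())

print(robots.can_fetch("Googlebot", "https://example.com/products"))   # True
print(robots.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```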

6. Rich Snippets and Microdata
High ranking on the search engine result pages (SERPs) is no longer the only problem to solve with website optimization. When searchers glance through result pages, they look for additional information relevant to their query before they click through. Rich snippets provide searchers with that information and help them quickly decide whether the content on a web page is relevant.
To implement rich snippets, web developers add extra HTML markup to a website. One of the most widely used formats is microdata. Using simple HTML attributes, microdata allows you to mark up specific types of information: reviews, people, products, businesses and organizations, recipes, events, and music and video content.
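
To see what a rich-snippet parser looks for, the sketch below pulls itemtype and itemprop values out of a small microdata-annotated fragment using Python's standard-library HTML parser. The sample product markup is made up and deliberately simplified.

```python
# List the typed properties (itemprop values) that microdata adds to a page.
from html.parser import HTMLParser


class MicrodataProps(HTMLParser):
    def __init__(self):
        super().__init__()
        self.itemtypes = []
        self.properties = []
        self._pending_prop = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemtype" in attrs:
            self.itemtypes.append(attrs["itemtype"])
        if "itemprop" in attrs:
            if "content" in attrs:  # e.g. <meta itemprop="..." content="...">
                self.properties.append((attrs["itemprop"], attrs["content"]))
            else:
                self._pending_prop = attrs["itemprop"]

    def handle_data(self, data):
        if self._pending_prop and data.strip():
            self.properties.append((self._pending_prop, data.strip()))
            self._pending_prop = None


html = """<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Acme Espresso Machine</span>
  <meta itemprop="price" content="199.00">
  <span itemprop="description">Compact 15-bar espresso machine</span>
</div>"""

parser = MicrodataProps()
parser.feed(html)
print(parser.itemtypes)    # ['https://schema.org/Product']
print(parser.properties)   # [('name', ...), ('price', '199.00'), ('description', ...)]
```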

7. Code Restructuring
Though search algorithms have changed dramatically since their inception, search spiders still don't dig deep into a site and don't spend much time on any given page. With this in mind, code restructuring of some pages is still a valid addition to any advanced SEO technique. Web developers restructure a page (without affecting the page layout) so that the content appears higher in the source code. This helps spiders quickly find relevant information.
The above advanced SEO techniques will significantly improve your website's online visibility, but that's not the limit of the optimization that can be done to a website. A website can be further optimized for better usability with accesskeys (keyboard shortcuts), tabindex (using the tab key to navigate through a page), and web accessibility for people with disabilities. It might sound like unnecessary optimization, but everything that makes a website more human-friendly is highly valued by search engines.

Friday, August 31, 2012

What’s The Difference Between Google Analytics And Google Webmaster Tools?


In today's post, you'll discover the differences between Google Analytics and Google Webmaster Tools and how they benefit your business.
Google Analytics: Google Analytics gives you the following information:
1. Web traffic: The number of people visiting your site on a daily, monthly, and yearly basis. You can also compare your traffic stats to previous years.
2. Visitors:
  • The number of website visitors that are new vs. returning.
  • The length of time your visitors spend on your site.
  • The number of pages your visitors view.
  • The number of visitors who landed on your site and left immediately.
  • The geographic locations they arrived from.
  • The websites that referred visitors to your site.
3. Keywords:

  • The top keywords bringing traffic to your website.
  • The keywords producing the most sales.
  • The keywords keeping your visitors on your site the longest.
  • The keywords that bring people to your site but cause them to leave right away.
4. Content: 
  • The web pages your visitors view most often.
  • The pages bringing you the most traffic.
  • The pages from which your visitors most often leave your site.
5. Conversions: The conversion rates for the goals you set on your site.
If the goal for your site is lead generation, how many of your total visitors are giving you their information in exchange for your lead magnet?
If your goal is to sell products, how many people buy when they land on your sales page?
Google Webmaster Tools: Google Webmaster Tools reveals how Google sees your site online. You can use this information to fix problems with your site and improve your site's visibility with Google and your buyers. Below you'll find just some of the information you can get from Google Webmaster Tools.
1. Crawling Errors: Google Webmaster Tools shows you the pages on your site that Google is unable to crawl. Crawler accessibility is the most important aspect of SEO: if Google cannot crawl your website, you can forget about top rankings.
2. Search Queries: This is information on the pages from your site that Google has returned to searchers for specific queries.
You can also find information about the pages on your website that were shown to searchers most often. This is one of the most important and underutilized tools.
 
Here you’ll be able to get information on things like:
  • The total number of search queries that returned pages from your site.
  • Your top search queries that returned pages in Google.
  • The number of times your pages were viewed in Google's search results.
  • The number of times your listings were clicked on for a particular search query.
  • The percentage of times your listing was clicked for a particular search query (the click-through rate; see the quick calculation after this list).
  • The average position of your website for a particular search query.
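
The click-through percentage in that report is simply clicks divided by impressions; here is a quick sketch with made-up numbers.

```python
# Click-through rate for one search query: clicks / impressions.
impressions = 1250   # times your listing appeared for the query (example)
clicks = 85          # times it was clicked (example)

ctr = clicks / impressions * 100
print(f"CTR: {ctr:.1f}%")   # -> CTR: 6.8%
```
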
3. Links: The number of links coming into your site from other websites.
4. Malware: Google Webmaster Tools will notify you if your site becomes infected with malware. Google can block your site if it becomes infected. If this happens, you'll need to have your site cleaned and resubmitted to Google for approval, which you do through Google Webmaster Tools.
5. HTML Errors: Google will notify you if it finds any HTML errors on your site.
To summarize the differences between Google Analytics And Google Webmaster Tools:
  • Google Analytics is basically a website statistics tool, showing you how, and how many, people visit your site.
  • Google Webmaster Tools shows you how Google sees your site from a search engine perspective.
The really great news is you can now connect your Google Analytics with your Google Webmaster Tools account.



Thursday, August 30, 2012

Difference between Google panda and Google Penguin


SEO marketers would surely have loved to receive a heads-up regarding the fine details of the Google Panda and Google Penguin updates: the now infamous search engine updates that caused several brands to lose visibility and forced even more to make dramatic changes to their search engine optimization strategies.

Matt Cutts, one of the top software engineers at Google, actually warned the public about some of the changes in an interview. Cutts's friendly heads-up involved two main issues regarding search:
  1. Over optimization will start to negatively affect rankings.
  2. The quality of content will soon be more of a focal point when determining rankings.
In Cutts's own words, Google is attempting to "level the playing field" by giving SEO marketers with great optimization skills and those with great content an equal opportunity to increase their visibility in the SERPs (Search Engine Result Pages). The mission is all about enhancing the efficiency of the so-called "Googlebot", the company's software that crawls the web and indexes pages, and, in turn, creating a more relevant and useful experience for the internet searcher. As a result, those who go overboard with their SEO will start to be penalized, while those who improve in the quality department will be rewarded.

Google Panda
Google dropped the bomb that was the Panda update in February 2011. As you probably know, Google Panda had some complex components, but its primary objectives were simple: penalize sites with low-quality content and, in the process, return sites with high-quality content to the top of the results where they belong.

Google Penguin
Google launched the Penguin update in April 2012. It focused more on tackling spam and penalizing sites that didn't follow Google's quality guidelines. Google Penguin also looks to tackle black-hat SEO techniques that many sites use to increase their search engine rankings. These include techniques like:
  • Keyword/key-phrase stuffing within content.
  • Comment spamming by including links using the same anchor text.
  • Links from article directories and directory listings.
  • Having excessive/unnatural internal links.
  • Too many external links pointing to your site that use the exact same anchor text.
  • Excessive links from low quality sites.

Google's initiative to level the playing field is just another step on the road to making quality more important than quantity. It may take some getting used to, but SEO marketers who want to keep benefiting from search traffic will embrace the Google Panda and Google Penguin updates and do what it takes to keep their rankings intact.