Local Search Engine Marketing

As search engine optimization (SEO) and search engine marketing (SEM) grow rapidly, everybody is looking for new ways to earn high rankings on the major search engines and increase their online business. Marketing your website through local search engine and directory submissions is one of the best ways to get found by a local audience. Local search engines, also called regional search engines, target an audience from a particular region.

Local search engines work great if you are targeting a particular region for your online business. The major benefit of submitting a website to local search engines and directories is very specific, targeted visits that are most likely to convert into sales.

Local search engine marketing basically includes:

  • Local Search Engine Submissions
  • Local Directory Submissions
  • Local Yellow Pages Submissions
  • Local Classifieds Listing

Some common steps for local search engine optimization and marketing:

  • Search for and build a list of local search engines and directories
  • Submit your website to these local search engines and directories
  • Always submit your full and accurate address
  • Never forget to include a contact number, fax number and email IDs in your listing
  • Use a Google Maps embed to display your geo-location with a full route preview (see the sketch after this list)
  • Choose the most relevant category when submitting your website
  • Use targeted keywords in the title and description of your submission
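
For the Google Maps tip above, a minimal embed sketch is shown below. The address and embed URL here are illustrative, so replace them with the details of your own listing (Google Maps also offers a ready-made embed snippet for a map):

<!-- Hypothetical business address; replace with your own -->
<iframe width="425" height="350" frameborder="0" scrolling="no"
        src="https://maps.google.com/maps?q=123+Main+Street,+Springfield&output=embed">
</iframe>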

These are the most common tips for marketing your business online.


Tracking Clicks with Google Analytics

Google Analytics is the most popular traffic and visitor tracker for websites. Google Analytics uses the Urchin tracker to track visits to a particular website. Using Analytics we can track not just visits and page views but every click a visitor makes. We can track clicks on text links, images and other non-HTML components.

Tracking a visitor's click is very simple. We just need to call urchinTracker, a JavaScript function, on the component (image, text link, etc.). To call the urchinTracker function we use the onclick event of that particular element.

For example, see the code below to track clicks on an image:

Legacy tracking code (urchin.js):

<img src="/images/rss-icon.jpg"
     onclick="urchinTracker('/RSS-FEED');" />

New tracking code (ga.js):

<img src="/images/rss-icon.jpg"
     onclick="pageTracker._trackPageview('/RSS-FEED');" />

The following sequence of events occurs:

  • When a visitor clicks on that particular image, the browser captures the click.
  • The browser responds to the click by executing the urchinTracker code.
  • The urchinTracker function creates a pageview in Google Analytics named '/RSS-FEED' under the Content > Top Content section.


You can use this method to track clicks on almost any component on a web page, such as PDF files, email links and register buttons.
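
The same technique works for downloads and email links. Here is a minimal sketch using the ga.js pageTracker object; the file path, email address and virtual pageview names are hypothetical:

<!-- Track a PDF download as a virtual pageview -->
<a href="/files/brochure.pdf"
   onclick="pageTracker._trackPageview('/downloads/brochure-pdf');">Download Brochure</a>

<!-- Track clicks on an email link -->
<a href="mailto:info@example.com"
   onclick="pageTracker._trackPageview('/mailto/info');">Contact Us</a>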


Keywords Search and Selection Tips

The keyword selection process is the most crucial part of optimizing a website for search engines, and selecting effective keywords matters enormously.
So here are some effective tips for keyword selection:

  • Build a Keyword List: Create a list of keywords related to your products, services and the theme of the website. Use popular keyword research tools like Wordtracker, Overture, the Google keyword tool and others. Try all combinations and synonyms for every keyword. When researching keywords, always prefer key phrases over single keywords, because narrow (long tail) key phrases have a better chance of earning high search engine rankings.

  • Sort Keywords: After finding good keywords, sort them by search count (how many people search for a particular keyword) and competition (how many websites are listed for them in the major search engines); a small sorting sketch follows this list.

  • Select Keywords from the List: If the website is fresh and faces lots of competition in its domain, start with narrow keywords, i.e., keywords with a low search count and less competition. Long tail keywords (key phrases) generally have a low search count and low competition. After selecting those keywords, go for regional keywords (if you are targeting a specific regional audience).

  • The Magic of Misspelled Keywords: Using commonly misspelled keywords can also help you earn good search engine rankings. These misspelled keywords can be placed in META tags, which are not visible to users. Search engines consider some common misspellings when listing relevant web pages. But consistent use of such misspelled keywords can be harmful.

  • Constantly Watch Keyword Search Counts and Competition: Keyword research and selection is not a one-time process. Keep watching keyword trends and performance, and make researching and changing keywords an ongoing process.
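
As a rough illustration of the sorting step above, the JavaScript sketch below ranks keywords by the ratio of search count to competition. The figures and the scoring formula are made up for illustration; keyword tools report these numbers in their own ways:

// Hypothetical keyword data: monthly searches and competing pages
var keywords = [
  { phrase: "web design services chicago", searches: 880,   competition: 12000 },
  { phrase: "web design",                  searches: 90500, competition: 2500000 },
  { phrase: "affordable web design firm",  searches: 320,   competition: 4100 }
];

// Higher demand per competing page floats to the top
keywords.sort(function (a, b) {
  return (b.searches / b.competition) - (a.searches / a.competition);
});

// The long tail phrases come out first
for (var i = 0; i < keywords.length; i++) {
  console.log(keywords[i].phrase);
}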

So do not waste your effort researching and selecting a huge number of keywords; be specific.


How to Optimize a Dynamic Web Site

There are a lot of misconceptions about optimizing a dynamic website. We often hear that dynamic websites are not search engine friendly at all and cannot earn high search engine rankings. This is not true: dynamic websites can be search engine friendly if they are designed and developed with care from the start, keeping SEO techniques in mind.
Dynamic websites are basically database-driven websites built with server-side scripting. There are some problems with dynamic websites that make them difficult for search engine crawlers/bots to read.

Following are some basic problems with dynamic websites:

  • Virtual Dynamic Pages – Dynamic websites are database driven and do not have physical pages on the server, so it is difficult for search engines to read content from dynamically created web pages.
  • Complex URLs – Dynamic websites usually have dynamic URLs with lots of query strings, e.g., domain.com/products.php?id=1233

    Search engine bots may treat URLs with query strings as a never-ending series of links, which is not crawlable.

  • Difficult to give unique meta tags to dynamic pages.

Following are some tips to optimize dynamic websites:
  • URL Rewriting – Convert all your dynamic URLs containing query strings into search engine friendly (static) URLs; for example, convert domain.com/products.php?id=1233 to domain.com/products.php/id/1233/ (see the sketch after this list).
  • Create Static Search Engine Friendly Pages – In addition to all the dynamically created pages, it is good practice to create search engine friendly static pages with good keyword-rich content and links to other internal pages. Give these static pages descriptive, keyword-rich titles and meta tags.
  • Use CGI Scripts – To make URLs search engine friendly, use CGI scripts to map the information in the query strings onto path variables.
  • Create a Static Site Map – Create a static HTML site map for all static and dynamic pages (physical URLs). If you have lots of pages, never make the sitemap too long; split it into multiple pages.
  • Do Link Exchange – Get inbound links from related websites to increase link popularity for the website. See the criteria for choosing a good link exchange partner later in this guide.
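
As a sketch of the URL rewriting tip above, assuming an Apache server with mod_rewrite enabled, the rule below maps a clean URL onto the real dynamic script (the file name and URL pattern are illustrative):

# .htaccess
RewriteEngine On
# /products/1233/  ->  /products.php?id=1233
RewriteRule ^products/([0-9]+)/?$ products.php?id=$1 [L]
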
Using these tips you can make a dynamic website search engine friendly and earn high rankings on search engines.

Choosing a Search Engine Friendly CMS Software for Website

Content Management Systems (CMS) have long been used to make developing and managing content-rich websites easy. These tools give a very user-friendly interface for managing the components of any web page on the fly. There are a huge number of CMS packages available on the web, and very few of them are search engine friendly. Using a CMS to build a website brings some typical problems: CMS software usually produces dynamic URLs with query strings and often cannot give different meta tags to individual pages. Very few CMS packages have truly search engine (SEO) friendly functionality, though some, like the Joomla CMS, can take add-ons that make them search engine friendly. So before choosing a CMS you should know what an SEO-friendly CMS tool should offer.

Following are some common features a CMS tool should have to be truly search engine friendly:

  • Search engine friendly titles – The CMS must allow a different title tag for each individual page.
  • Unique meta tags – The CMS should allow unique meta tags for every page on the website.
  • Static URLs – The CMS should be able to create custom, static, search engine (SEO) friendly URLs.
  • 301 redirection functionality – The CMS tool should support permanent 301 redirects (see the sketch after this list).
  • ALT attributes for images – A search engine friendly CMS lets you give keyword-rich ALT attributes to the static images on a web page.
  • Customized CSS capability – A CMS should let you customize the CSS according to your needs.
  • Easy-to-manage folder structure – The linking and navigation structure should be easy to manage and understand.
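
For the 301 redirection feature above, this is the kind of permanent redirect a CMS should be able to emit when a page's URL changes. A minimal sketch of the equivalent Apache directive (the paths are illustrative):

# .htaccess
Redirect 301 /old-page.html http://www.example.com/new-page.html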

I would love to hear comments and experiences from readers about the different CMS packages available in the market.


Criteria for Choosing a Good Link Exchange Partner

In the process of link building, the most tedious task is choosing good partners to exchange links with. Link building has many benefits: a good link building campaign can help you increase traffic to your website, increase your link popularity with search engines and boost your search engine rankings. But you cannot simply exchange links with any website. There are some criteria that help you choose a good link partner.

Following are some tips (criteria) for choosing a good link exchange partner:

  • Theme and Service Relevancy – While searching for link partners, pick websites with the same theme and domain as your site's services/products. It is worthless to exchange links with a tourism website if you provide web design and development services.
  • Page Rank – When getting a link from a relevant website, the PageRank of the page where your link is placed should be good.
  • Number of Links – The page where your website's link is placed should not carry a huge list of links; try to link with pages that hold 20-25 links at most.
  • Quality of Links – The link page carrying your link should not contain irrelevant links, such as casino or adult site links.
  • Static HTML Links – Some webmasters give JavaScript links, which are of no use as far as backlinks or link popularity are concerned.
  • Check for NoFollow Links – Some tricky webmasters give links with the nofollow attribute, which is again useless (see the example after this list).
  • Redirected Links – Some webmasters provide redirected links with dynamic URLs/query strings, which is not a search engine friendly way to get links.
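
To spot the NoFollow trick mentioned above, view the source of the page carrying your link and check its rel attribute. A quick illustration (the URL and anchor text are hypothetical):

<!-- Passes link popularity -->
<a href="http://www.example.com/">Example Web Design</a>

<!-- Does NOT pass link popularity: note rel="nofollow" -->
<a href="http://www.example.com/" rel="nofollow">Example Web Design</a>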

While running link building campaigns, always focus on the quality of links instead of quantity.


Effective Link Building Techniques

Link exchange, or link building, is the most important and powerful way to increase your search engine rankings and PageRank as well as traffic to your website. After designing a website and applying all the on-page SEO techniques, it is time to boost your website's performance with off-page SEO activities. In the link building process, webmasters search for link partners and trade/exchange links with them. Almost all major search engines, like Google, Yahoo and MSN, give importance to the number of sites linking to your website; incoming links are like votes for your website. But choosing a bad or irrelevant link partner can even get your website banned from a search engine's index, so be careful while choosing your link partners.

Link building includes three kinds of linking strategies:

1. One-Way Linking – In this process you do not need to give anything in return for getting a link from another website. Some websites provide free links to your website.

2. Reciprocal Linking – This is the most popular way to increase your link popularity. In this process you find websites relevant to your business and services and then send them a request to place your link on their site, but before that you place their link on your website.

3. Three-Way Linking – This is a slightly more complex strategy for gaining high link popularity. Here you ask a webmaster for a link to your website, but you do not provide a link back from your own website; instead, the link back to your partner comes from another (third) website.

Before starting a link building process you need to understand how to select a good link partner. Always keep in mind that it is not just the quantity of links but the quality that matters.


Search Engine and Directory Submission

After completing the on-page SEO processes, it is time to plan a strong link building strategy for your website. There are many methods to gather links; one of them is getting your site listed in popular search engines and web directories. Once your site is complete, start submitting it to the major search engines like Google, Yahoo and MSN, and never forget to submit it to regional search engines as well. After submitting the website to the major search engines, the next step is selecting good directories and submitting your website to the most relevant category. There are lots of web directories available on the web, and every directory has different criteria for putting your link in its index: some give links without asking for anything in return (free web directories), others ask for a reciprocal link, and some are paid directories that charge money to list your link.


Following are the different types of submission:

Free Link Submission: There are a huge number of free web directories that you can use to submit your website. One of the most popular free web directories is DMOZ.org. In a free submission you just need to select the category most relevant to your website's theme and services.

Reciprocal Linking: Some web directories ask for a link to their home page from your website; these are called reciprocal links.

Paid Submission: Some web directories charge for including your website's link in their index. Featured link submission is of the same kind, but with this method your link gets a special place on their website, for example the home page or a featured links section.

Be careful when selecting a web directory for submission. Regional directories and service-specific directories are more valuable than general web directories.


The hullabaloo about PageRank

Are we becoming PageRank junkies?
There is an addiction in the online space, and it's called the hankering-after-PageRank addiction.

Before we all get bent out of shape worrying about our website's PageRank, let us take a collective deep breath and ponder the following statement:

Google does not rank websites, it ranks pages

And this is something we need to keep in mind. When submitting to directory sites and, more importantly, paid listings, it is the PageRank of the page your link is on that matters. Keep in mind that if that internal page ranks well, it should send traffic to your site too. It is possible for internal pages to gain a higher PageRank than the home page.

Possibly the best thing we can do is regularly change and update our pages. Fresh pages are seen as new content and bring the Google spider back. And needless to say, the more this happens, the more likely our PageRank will increase. This is also why blogs generally do well in natural search results: they provide regularly changing information.

Cheers to PageRank de-addiction!


Robots.txt File and Meta Robots Tag

The robots.txt file is used to give instructions to search engine crawlers (also known as bots or spiders) to allow or disallow some part of a website for crawling. This is also known as the Robots Exclusion Protocol. Whenever a search engine bot visits a website, it checks for the robots.txt file and follows the instructions given there. So if it finds instructions not to crawl the website, it moves on.

Syntax for the robots.txt file:

User-agent: *
Disallow: /

*In this syntax, User-agent specifies the name of a particular robot or * (for all robots), and Disallow gives the path disallowed for crawling (in our example the root path is given, so no robot will crawl any page of the website).

Examples:

User-agent: *
Disallow:
All robots can crawl your website; no folder has been disallowed.

User-agent: *
Disallow: /cgi-bin/
No robot will crawl content under the "cgi-bin" folder.

User-agent: GoogleBot
Disallow: /images/
GoogleBot will not crawl content under the "images" folder.

User-agent: *
Disallow: /
All bots have been disallowed from visiting any part of the website.

User-agent: GoogleBot
Disallow:

User-agent: *
Disallow: /
In this example we allow only GoogleBot to crawl our website and disallow all other bots.

Steps for Creating a robots.txt file

  1. Open any text editor (e.g., Notepad)

  2. Write your robots instructions

  3. Save the file as "robots.txt"

  4. Upload the robots.txt file to the root of your website

Meta Robots Tag

The Meta Robots tag is used to define page-specific instructions telling search engine robots whether to index the content of a page and whether to follow its links. This tag should be placed inside the HEAD tag.

Like other meta tags, the Robots meta tag uses the same attributes but with different values:

Syntax & Examples:
<meta name="ROBOTS" content="INDEX, NOFOLLOW">
<meta name="ROBOTS" content="NOINDEX, FOLLOW">
<meta name="ROBOTS" content="NOINDEX, NOFOLLOW">
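
For placement, the tag goes inside the HEAD section of the page. A minimal sketch (the page title is illustrative):

<head>
  <title>Example Page</title>
  <!-- Keep this page out of the index but let the bot follow its links -->
  <meta name="ROBOTS" content="NOINDEX, FOLLOW">
</head>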

Note: If no meta robots tag is given, the default is INDEX, FOLLOW.
