Thursday, June 12, 2014

6 Actionable Techniques of Onpage SEO



Best On-page SEO techniques

When it comes to on-page Search Engine Optimization, most of us think it is all about a few Meta tags and the Page Title, but it is not that simple, especially nowadays. Planning and properly executing on-page SEO for a webpage or website is time-consuming and requires expertise, research and analytical skills, experience, and a complete understanding of the technology. There are numerous on-page SEO factors which can lead a webpage to rank higher on the SERP (Search Engine Result Page). A properly structured and optimized webpage or website can retain users within the site, increase the number of page views, pull down the bounce rate and, above all, enhance the overall user experience. Here we are going to look at the six most important on-page SEO techniques, which deliver most of the results.

Keyword analysis, an indispensable part of SEO:

Analytical skill is an important competence of a Search Engine Optimizer. As optimizers, we need to carry out various kinds of research and analysis to make a website perform better on the major search engines. For a website, industry keywords are crucial, and a thorough, proper analysis of keywords can uncover an immense number of search queries for which our websites or webpages can rank higher on the SERP. But it is not only about ranking or getting visitors to our website; it is more about targeting the right visitors, the ones who are actually searching for something in our domain.

Brainstorming and compiling keywords from competitors' websites is the first step of an effective and strategic keyword analysis. For any type of business there are bound to be competitors trading in the same demographic. We can identify their websites simply by entering our first seed keyword in Google and then collecting the keywords those companies use on their webpages. There are a couple of good tools available on the internet for this job. The Moz browser toolbar is an excellent free SEO tool that can be used for gathering competitive keywords from different websites.

After collecting keywords from competitors' websites, it is time to refine and filter the best keywords according to their search data. The Google AdWords Keyword Planner is your best friend for this. Using this free tool from Google we can segregate keywords by their search volume and competition level. It is wise to select keywords which get a good amount of searches but have low or medium competition. Long-tail keywords that include a demographic (city or area) can be the best bet for a newly built or local website.

There are some related keywords which Google lists (mostly) at the bottom of the SERP when we search for something. These are LSI (Latent Semantic Indexing) keywords, and search engines use them to judge the relevancy of the content, products or services described on a webpage. We should use some of those LSI keywords in our page content for better and more accurate indexing.

Page Title, Meta Description and Headers are the places to start the optimization:

The Page Title is probably the most important on-page SEO factor. A properly written and targeted Page Title can not only boost the ranking but also deliver a better CTR (Click Through Rate) from the SERP. A well-optimized Page Title must target at least one keyword, and that keyword should come as early as possible, so it is best to start the Page Title with the main keyword(s). Page Titles should be unique across all the pages of a website: no two pages should contain the same or very similar keywords or keyword phrases in their Page Titles. In other words, we should not only write non-duplicate Page Titles but also target separate keywords in each page's Page Title.
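As a rough sketch, a page targeting the made-up phrase "wedding photographer in Austin" for a hypothetical studio might start its title with that keyword:

    <title>Wedding Photographer in Austin | Smith Photography Studio</title>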

The Meta Description helps bring visitors to our website from the different search engines. We should naturally work the main keyword and its variations into a webpage's Meta Description. It should sound appealing and trustworthy to users so that they click on our link on the SERP. There are plenty of instances where a top-ranked website gets fewer visitors than a lower-ranked one only because of its robotic, lifeless Meta Description.
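Continuing the same hypothetical example, a Meta Description for that page could read something like this (the wording and business details are invented for illustration):

    <meta name="description" content="Award-winning wedding photographer in Austin offering candid and traditional packages. Browse the portfolio and request a free quote today.">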

Headers are the headings of our webpage: H1 for the main heading and H2 to H6 for the sections beneath it. Remember to include at least one important keyword in the H1 and in the other headers. A webpage must contain a single H1 but can contain multiple H2, H3, H4, H5 and H6 tags. Headers should also follow their hierarchical order, meaning an H2 should not come before an H1. We regularly come across websites which have no headers at all, or whose headers are not optimized in any way. Do make sure that your website gets its optimized H1, H2 and H3 (at minimum).
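A minimal sketch of that hierarchy for the same hypothetical page, with one H1 followed by keyword-bearing sub-headings:

    <h1>Wedding Photography in Austin</h1>
      <h2>Candid Wedding Photography</h2>
      <h2>Traditional Wedding Albums</h2>
        <h3>Pricing and Packages</h3>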

Optimize your website's URLs with much care and affection:

A webpage's URL is as important as its Page Title. Search engines love short, keyword-rich URLs with a proper hierarchical structure. Avoid using spaces or underscores (_) to separate words in a URL; words should be separated by dashes (-) instead. An .htaccess file can be used to rewrite URLs into a search engine friendly format. Avoid arbitrary values, random numbers and special characters (?, &, etc.) in a link, and if you must use them (dynamic links for sorting, searching, etc.) then place a hash (#) before them. Short, well-optimized URLs are tremendously helpful for SEO: they not only help in ranking but are also easy to read, remember, share and link to.
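As an illustration (the domain, file name and rule below are all placeholders, and the rewrite assumes an Apache server where .htaccess overrides are allowed), a keyword-rich URL can be mapped internally to a dynamic script:

    Readable, keyword-rich URL:   http://www.example.com/wedding-photography/candid-packages
    Dynamic URL it maps to:       http://www.example.com/page.php?cat=wedding&slug=candid-packages

    # in .htaccess
    RewriteEngine On
    RewriteRule ^wedding-photography/([a-z0-9-]+)/?$ page.php?cat=wedding&slug=$1 [L]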

A website should open either with www or without www, not both. Unfortunately, most websites violate this rule and do not even use a canonical tag. As a result, Google and other search engines treat the two links (www.domain.com and domain.com) as different URLs with the same content, which creates a content duplication issue. Eventually the PageRank (PR) of that domain gets split between the www and non-www versions. Using .htaccess we can resolve this issue by permanently redirecting the domain to either the www version or the non-www version.
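A minimal sketch of that permanent redirect, again assuming an Apache server and using example.com as a placeholder domain (this version forces the www form; swap the pattern and target to force the non-www form instead):

    # in .htaccess
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]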

Content is King, so optimize it naturally and ethically:

Well-researched, informative, fresh and well-optimized content is an asset for a company (website/blog). Regularly creating good quality content in a particular niche can increase the crawling rate of a website, and as a result more and more of its pages will get indexed by the major search engines. This is a continuous process, and new resources or content should be published on the website or blog at a fairly regular interval. Engaging, detailed content can also boost traffic and social shares and signals, which are inseparable parts of today's Search Engine Optimization.

Publishing duplicate content by any means can be devastating for a website. In most cases those duplicate pages will not rank on the SERP, and if a website consists mostly of copied content that is already published somewhere else on the internet, the whole site will suffer for it.

Creating content consistently for an industry may not be very effective unless that content is well optimized, so that regular searchers can actually find it on Google and the other search engines. Content optimization involves many factors and techniques. The initial strategy should be to identify and target the prospective online customers of an industry, find out what they are looking for on the internet, and deliver content with up-to-date solutions.

At the core of content optimization there are a couple of things we need to maintain. We should mention the primary keyword within the first 100 to 150 words of the web content. Use bold and italics to highlight the industry keywords and maintain a keyword density of around 3% to 4%. Writing content in small paragraphs with proper headings is good for user experience and engagement. Concentrate on the quality of the content, not the quantity, and write a minimum of 500 words per page for better crawling and indexing.

Make sure that multiple webpages within a website do not carry the same or very similar content. If you cannot avoid it, use the canonical tag (rel="canonical") to tell search engines the single authoritative instance (URL) of those duplicate content pages.
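A quick sketch of the tag: each duplicate or near-duplicate page carries a line like this in its <head>, pointing at the one URL that should be treated as the original (the URL here is a placeholder):

    <link rel="canonical" href="http://www.example.com/wedding-photography/candid-packages">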

Using internal links to connect multiple pages of a website can boost the crawling rate and page views, and it also helps pull down the bounce rate. This in turn helps with better ranking on the SERP over time.

Pictures are just pixels, and search engines have difficulty reading or crawling them. Pictures or images in a website should be optimized in three steps. First, optimize the image file name by including a keyword that matches the image itself. Second, optimize the Alt attribute by making it keyword-rich. Lastly, optimize the image's Title attribute for both users and search engines with a meaningful keyword phrase.
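Put together, the three steps might look like this for the same hypothetical studio (file name, alt text and title are all invented for illustration):

    <img src="austin-wedding-photographer-sunset.jpg"
         alt="Bride and groom at sunset - Austin wedding photographer"
         title="Candid sunset photograph from an Austin wedding">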

Structured Data is the big thing in modern SEO:

Properly structuring the data of a webpage can not only boost traffic (through a higher CTR) but can also, indirectly, deliver a higher ranking on the SERP. Structured Data, or rich snippet markup, is used to tell search engines about some of the important parameters of a business (website), which helps in effective crawling and quicker indexing. Schema.org is a universal rich snippet vocabulary that is supported by all the major search engines and can be implemented on websites for almost any business.
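As a small sketch, the schema.org LocalBusiness type can be marked up with microdata directly in the HTML (the business details below are made up, and the same information can also be expressed in other formats such as JSON-LD):

    <div itemscope itemtype="http://schema.org/LocalBusiness">
      <span itemprop="name">Smith Photography Studio</span>
      <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
        <span itemprop="addressLocality">Austin</span>,
        <span itemprop="addressRegion">TX</span>
      </span>
      Phone: <span itemprop="telephone">(512) 555-0100</span>
    </div>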

Optimize the website for fast loading:

Google loves websites which load fast, without keeping its crawlers and searchers waiting. In general, fast-loading websites get more page views and have lower bounce rates than slow ones. There can be many reasons for a website loading slowly: unnecessary HTML elements, cluttered markup, poor coding style, heavy data fetching from databases, and so on. As optimizers, there are certain measures we can take to make a website faster.

Place script files before the closing body element (</body>). JavaScript and HTML do not load in parallel; they load serially, so putting our JavaScript in the HTML head (<head>…</head>) can delay the rest of the page and lead to slow loading. It is better to write or link our scripts just before the closing body tag. Using one big JavaScript file and one big CSS file instead of many small files also minimizes the number of HTTP requests, which reduces the loading time.
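A minimal page skeleton following that layout (the file names are placeholders for your own combined files):

    <html>
      <head>
        <title>Wedding Photography in Austin</title>
        <link rel="stylesheet" href="styles.min.css">  <!-- one combined CSS file -->
      </head>
      <body>
        <!-- visible page content loads and renders first -->
        ...
        <script src="scripts.min.js"></script>  <!-- one combined JS file, loaded last -->
      </body>
    </html>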

Optimize the image sizes and use CSS Sprites to decrease the response time of a website.
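A CSS sprite packs several small images (icons, buttons) into one file so the browser makes a single HTTP request; each element then shows its own slice via background-position. A rough sketch, with an invented file name and offsets:

    <style>
      .icon          { background-image: url("icons-sprite.png"); width: 32px; height: 32px; display: inline-block; }
      .icon-phone    { background-position: 0 0; }
      .icon-email    { background-position: -32px 0; }
      .icon-facebook { background-position: -64px 0; }
    </style>
    <span class="icon icon-email"></span>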

We can also use GZip compression and a Content Delivery Network (CDN) to make the site load faster.
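On an Apache server (an assumption; many hosts also expose this as a control panel setting), GZip compression can be switched on for text resources from the same .htaccess file used earlier:

    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
    </IfModule>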


Apart from all this, we must note that today most people are using smartphones and tablets and accessing the internet on the go. In the near future the number of mobile internet users is only going to grow. As a business it is almost impossible to ignore these users, so we have to design our websites in such a fashion that they open on any screen size and fit into it beautifully. In other words, we need responsive websites. Google loves this type of website and blog, and over time they are going to rank much higher than non-responsive sites on the SERP when someone searches from a smartphone or tablet.
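The starting point of a responsive layout is the viewport meta tag plus CSS media queries; a minimal sketch (the class name and the 600px breakpoint are arbitrary choices for illustration):

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      .content { width: 960px; margin: 0 auto; }
      @media (max-width: 600px) {
        .content { width: 100%; }  /* collapse to a single fluid column on small screens */
      }
    </style>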


