Friday, March 16, 2018

BASICS OF SEO | MUST KNOW




SEO isn't a one-time setup, and there are layers of complexity to it. What better place to start than with the basics?

SEO is not rocket science. While it's often best to hire an outside expert with prior SEO knowledge to help get your house in order, picking up the basics and reorganizing your website on your own is no big deal. Here are a few tweaks you can make to avoid running afoul of SEO guidelines.

META TAGS: 

The name says it all. Very simply, a meta tag contains metadata: data about the page itself. The content of meta tags generally describes information about the HTML page that cannot be represented by other HTML tags.



With respect to SEO, Google initially used the metadata of a web page in its search rankings, but early SEO "experts" began misusing this tool, cramming keywords unrelated to their page content into the metadata. Google eventually got wise to this and decided to devalue it. These days Google doesn't use meta keywords in its algorithm at all, because they're too easy to abuse. Yahoo! says it uses the keywords meta tag when ranking a page, so it makes sense to add one for Yahoo! and any other minor search engines that still use it. Likewise, Baidu's Chinese-language engine uses the keywords meta tag in its search ranking, so if a huge chunk of your traffic consists of Chinese-language users, make sure to include it in your HTML code. A small primer on the variety of meta tag attributes follows:

1. Title attribute: 

Title meta tags are the most important of all the meta tags discussed here. These tags have a real impact on search rankings and, perhaps just as importantly, are visible to the average user. When a user opens your web page, the title shown in the browser's title bar or tab is the content of your title tag.

2. Description attribute: 

The most commonly used tag is the "description" meta tag. In case of an empty description, search engines will generally generate one from the content of your page. For Google, adding the description meta tag does not result in a boost in the Search Engine Results Pages (SERPs), but the description may be used as the description for your SERP listings. It's important to note that the description tag won't always show up in the results of a Google search (Google frequently picks a snippet of text from the page itself), but it's useful in other ways.

3. Meta robots attribute 

With this attribute, you're telling search engines what to do with your web pages:
index/noindex: tells the engines whether to show your page in search results or not.
follow/nofollow: tells the engines what to do with links on your pages: whether they should trust and "follow" those links.
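As an illustrative sketch, the three tags above can be audited with Python's standard-library HTML parser; the sample page and its values below are made up:

```python
from html.parser import HTMLParser

# A minimal sketch: pull the <title>, meta description, and meta robots
# values out of a page so you can audit them. The sample page is hypothetical.
class MetaAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = """<html><head>
<title>Brown Leather Couches | Kim's Furniture</title>
<meta name="description" content="Hand-made brown leather couches.">
<meta name="robots" content="index, follow">
</head><body></body></html>"""

audit = MetaAudit()
audit.feed(page)
print(audit.title)                 # the text shown in the browser tab
print(audit.meta["description"])
print(audit.meta["robots"])
```

Running a check like this over your key pages is a quick way to spot empty descriptions or accidental noindex directives.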

URL STRUCTURE: 

The URL structure of your website's pages is one surefire indicator of how optimized your website is for a search engine. One mantra you must not ignore at any cost: hyphens in a URL give it a whole different meaning than underscores. An example would be when you google "brown leather couch".
If you'd like your page to feature in the search results, naming your URL "http://yourdomain.com/brown-leather-couches" is worlds apart from naming it "http://yourdomain.com/brown_leather_couches". The presence of underscores would cause your URL to be indexed as "brownleathercouches", which is entirely irrelevant to what you MEANT to provide. In a similar vein, using capital letters changes the meaning of your URL, so it's generally best to avoid them. Also, make sure your URL is 100% readable by humans.
If you are shown two entries:
http://yourdomain.com/brown-leather-couch
http://yourdomain.com/main.php?956564t65
it's pretty obvious you'll be more inclined to click the first link rather than the second. Another cardinal rule of URL structure is that your URL length must not exceed 2,048 characters.
Ignoring this means Internet Explorer won't be able to load the page. Moreover, your folder structure, and hence your URL structure, should indicate the importance of your content. If you've buried more important content in a nested URL structure, it's best to change it so the main content gets more prominence.
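The naming rules above can be sketched as a small slug helper: lowercase, hyphens instead of spaces and underscores, no stray symbols, and a length check against the cap mentioned above. The domain is hypothetical:

```python
import re

MAX_URL_LENGTH = 2048  # the cap discussed above

def slugify(name: str) -> str:
    """Turn a page name into a readable, search-friendly URL slug."""
    slug = name.lower()                       # avoid capital letters
    slug = re.sub(r"[_\s]+", "-", slug)       # spaces/underscores -> hyphens
    slug = re.sub(r"[^a-z0-9-]", "", slug)    # drop anything non-URL-safe
    return slug.strip("-")

url = "http://yourdomain.com/" + slugify("Brown Leather Couches")
print(url)   # http://yourdomain.com/brown-leather-couches
assert len(url) <= MAX_URL_LENGTH
```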
Having a search feature on your site generates dynamic, duplicate URLs to the same content, depending on the user's filtered search. Block these extra dynamic URLs in favour of the canonical (statically named) URL by using robots.txt. You can also help Google index your pages better by telling its robots that a particular page has an alternative mobile link, in case you've structured your content that way. In Google's words, "On the desktop page, add a special link rel="alternate" tag pointing to the corresponding mobile URL. This helps Googlebot discover the location of your site's mobile pages."
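As a sketch of the annotation pair Google describes for separate mobile URLs (the desktop page points at the mobile page with rel="alternate", and the mobile page points back with rel="canonical"), with hypothetical URLs:

```python
# A minimal sketch generating the two link annotations for separate mobile
# URLs. The media query is the one Google's documentation uses; the URLs
# here are made up.
def mobile_annotations(desktop_url: str, mobile_url: str):
    desktop_tag = ('<link rel="alternate" '
                   'media="only screen and (max-width: 640px)" '
                   f'href="{mobile_url}">')   # goes on the desktop page
    mobile_tag = f'<link rel="canonical" href="{desktop_url}">'  # on the mobile page
    return desktop_tag, mobile_tag

d, m = mobile_annotations("http://yourdomain.com/menu",
                          "http://m.yourdomain.com/menu")
print(d)
print(m)
```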
When you've changed a URL, remember that you are technically removing a page that has been indexed by Google. Remember to add a 301 redirect from your old URL to your new one in case of such a change.
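In practice you'd configure 301 redirects in your web server or CMS, but the logic amounts to a lookup table from old paths to new ones. A minimal sketch with hypothetical paths:

```python
# Old URL -> new URL; in a real deployment this lives in server config
# (Apache, nginx) rather than application code. The paths are hypothetical.
REDIRECTS = {
    "/brown_leather_couches": "/brown-leather-couches",
    "/old-menu": "/menu",
}

def resolve(path: str):
    """Return (status, location) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect to the new URL
    return 200, path                  # serve the page as-is

print(resolve("/brown_leather_couches"))   # (301, '/brown-leather-couches')
print(resolve("/menu"))                    # (200, '/menu')
```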

BROKEN LINKS

As mentioned before, changing URLs can lead to missing and broken links. Seeing as broken links stop search engine crawlers in their tracks, leading them to abandon that page and move on to the next one, any pages they haven't crawled won't be indexed or receive a ranking. If you'd like to avoid a mess like this and direct your traffic to the appropriate link, use a 301 redirect. It's best to run a broken link checker, with various such services available online, and reorder your website structure accordingly. Apart from search engines being unable to index your website, you'll also see traffic drop as visitors get frustrated with broken links and simply avoid visiting your website in the future.
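A broken-link checker boils down to two steps: collect every link on a page, then request each one and flag 4xx/5xx responses. A minimal sketch of both halves (the sample markup is made up; a real checker would fetch each collected URL over HTTP):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href on a page so each can be checked."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def is_broken(status_code: int) -> bool:
    # 4xx (e.g. 404 Not Found) and 5xx responses count as broken
    return status_code >= 400

collector = LinkCollector()
collector.feed('<p><a href="/menu">Menu</a> <a href="/old-page">Old</a></p>')
print(collector.links)                   # ['/menu', '/old-page']
print(is_broken(404), is_broken(200))    # True False
```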

SITE STRUCTURE 

The better your site structure, the better your ranking in the search engines. Every website has some structure. It might be rigorous and streamlined, or it may be a disorganized jumble of pages. If you are intentional and careful with your site structure, you will create a site that achieves search excellence.
Google's algorithm uses information from searchers to rank your site. If your site has poor click-through rates (CTRs) and low dwell time, it will not perform well in the SERPs. In contrast, when users find a site they like, i.e. a site with great structure, they don't bounce and they stay longer.
A clean site structure can reduce bounce rate and improve dwell time, both of which lead to improved rankings. One secret tool many forget are sitelinks, which appear under a site's Google search result. Seeing as Google creates sitelinks automatically using its own algorithm, and only for websites with a great site structure, the importance of good site structure can't be stressed enough.

To create this kind of site structure, you'll need to:

1. Plan out a hierarchy before you develop your website.

If you're building a website from scratch, you're in a position to plan out the site structure for the best SEO possible. The simplest hierarchy has the most important search ranking pages at the top and filters down to the less important pages: picture a tree with the homepage at the root, main categories beneath it, and subcategories below those. Considering that complicating the hierarchy makes it tougher for web crawlers to index your document, it makes sense to keep the main categories between two and five, and to have only relevant subcategories under each main category.

2. Create a URL structure that follows your navigation hierarchy.

The second main element in developing strong site structure is your URL structure. If you've logically thought through your hierarchy, this shouldn't be too difficult. Your URL structure follows your hierarchy.

So let's say the hierarchy for your restaurant website, "Kim's Restaurant", looks like this:
Main categories: "About Us", "Menu", "Locations", "Contact". Under "Menu", the subcategories are "Drinks", "Starters", "Main Course", "Desserts"; under "Locations", they are "Mumbai", "Bangalore", "Delhi". The URL structure for the Bangalore location would look like this: kimsrestaurant.com/locations/bangalore
Your URL structure will be organized according to your site hierarchy.
This means, obviously, that your URLs will have real words (not symbols) and appropriate keyword coverage.
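A sketch of deriving such a URL mechanically from the hierarchy, using the restaurant example above (the domain and helper are illustrative, not a real API):

```python
# A minimal sketch: build a URL path that mirrors the navigation hierarchy,
# using real words and hyphens. Domain and categories follow the example above.
def url_for(domain: str, *path: str) -> str:
    segments = [p.lower().replace(" ", "-") for p in path]
    return domain + "/" + "/".join(segments)

print(url_for("kimsrestaurant.com", "Locations", "Bangalore"))
# kimsrestaurant.com/locations/bangalore
```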

3. Create your site navigation in HTML or CSS.

When you create your navigation, keep the coding simple. HTML and CSS are your safest approach. Coding in JavaScript, Flash, or Ajax will limit the crawlers' ability to cover your site's well-thought-out navigation and hierarchy.

4. Use a shallow depth navigation structure.

As mentioned earlier, most page ranking algorithms use a page's depth relative to the main page as an indication of the importance of its content.

Make sure relevant and necessary content isn't buried deep within the hierarchy. Shallow sites work better, both from a usability and a crawler perspective, as noted in this Search Engine Journal article.
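Click depth can be measured directly: a breadth-first search from the homepage over internal links gives each page's minimum number of clicks from the front page. A sketch over a hypothetical link graph:

```python
from collections import deque

# A minimal sketch of a click-depth audit: BFS from the homepage gives each
# page its shortest distance in clicks. The link graph below is hypothetical.
def click_depths(links: dict, start: str = "/") -> dict:
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:            # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/menu", "/locations"],
    "/menu": ["/menu/drinks"],
    "/locations": ["/locations/bangalore"],
}
print(click_depths(site))
# {'/': 0, '/menu': 1, '/locations': 1, '/menu/drinks': 2, '/locations/bangalore': 2}
```

Pages that come back with large depths are candidates for promotion higher up the hierarchy or for extra internal links.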

5. Create a header that lists your main navigation pages.

Your top header should list your main pages. That's it. Though adding new menu elements doesn't reduce the efficiency of your SEO, it certainly distracts users from the main elements, which are the menu categories in your header. If you have footer links, make sure they are the same ones as in the header. Though this again isn't helpful for SEO directly, changing the order of links complicates the user experience, which will probably reduce the footfall on your website.

6. Link pages internally 

Since it's not possible to categorize all sub-pages and create a perfect hierarchy, it's advisable to internally link as many pages as possible, so as to: a) provide a useful pathway for navigating logically within your website, which helps reduce the number of clicks needed to reach a page. Considering our previous point about low-depth websites, this is a distinct advantage that cannot be countered. b) Allow the usage of keyword anchor text.

XML SITEMAPS

XML sitemaps serve a very niche purpose in search engine optimization: facilitating indexation. Posting an XML sitemap is kind of like rolling out the red carpet for search engines and giving them a road map of the preferred routes through the site. XML sitemaps basically give a website's crawlers a sense of direction about the links you want the bots to focus on. Since XML (Extensible Markup Language) is a machine-readable format meant for search engines and other data-reliant programs like feed readers, it's pretty appropriate for such scenarios. Even though the URLs in an XML sitemap can be on the same domain or on different subdomains and domains, each XML file is limited to 50,000 URLs and can be at most 10 MB in size. When a site contains more than 50,000 URLs or exceeds 10 MB, multiple XML sitemaps need to be generated and tied together with an XML sitemap index file. In the same way an XML sitemap lists URLs in a site, the XML sitemap index lists XML sitemaps for a site.

Because XML sitemaps serve as a set of recommended links to crawl, any non-canonical URLs should be excluded from the XML sitemap. Any URLs that have been disallowed in the robots.txt file, such as secure e-commerce pages or duplicate content, should also not be included.
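A sketch of generating sitemaps with the standard library, splitting at the 50,000-URL limit described above (the namespace is the one from the sitemaps.org protocol; the URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"   # sitemaps.org namespace
MAX_URLS = 50_000                                    # per-file limit discussed above

def build_sitemaps(urls):
    """Return one XML sitemap string per chunk of up to 50,000 URLs.

    Only canonical, crawlable URLs should be passed in. More than one
    returned sitemap means you also need a sitemap index file.
    """
    sitemaps = []
    for i in range(0, len(urls), MAX_URLS):
        urlset = ET.Element("urlset", xmlns=NS)
        for url in urls[i:i + MAX_URLS]:
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        sitemaps.append(ET.tostring(urlset, encoding="unicode"))
    return sitemaps

maps = build_sitemaps(["http://yourdomain.com/", "http://yourdomain.com/menu"])
print(maps[0])
```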

PAGE LOADING TIME:

It behoves content creators to keep their websites loading quickly so as not to put off the end user.
Not only will this improve functionality for visitors and keep them coming back for more of that snappy performance, it'll also help robots crawl and index your website faster and increase crawler activity. Always test your pages to ensure images, text, and other content load as quickly as possible. If you find content that slows down your page, replace it with faster-loading content or remove it altogether. Optimize your HTML, CSS, and JavaScript files to ensure that everything loads quickly. Google's PageSpeed Insights is a good tool for this.

MOBILE PERFORMANCE AND RESPONSIVENESS

Google's latest update affects the mobile version of your site. If you don't yet have an optimized mobile version, it's best to start now and create a fast-loading website that uses resources efficiently and responsively. Google has updated its algorithm to favour websites that are optimized for a mobile experience. Apart from loading speed, whether or not your website is responsive plays a major role in Google's indexing. Websites providing an optimal experience are ranked higher and crawled more often.

ROBOTS.TXT:

The Robots Exclusion Protocol (REP) is a group of web standards that regulate robot behaviour and search engine indexing. The original REP from 1994, extended in 1997, defines the crawler directives for robots.txt. Some search engines support extensions like URI patterns (wildcards).
An extension in 1996 defined indexer directives for use in robots meta tags. Webmasters can also apply REP tags in the HTTP header of non-HTML resources like PDF documents or images.
The latest, 2005, version contained the microformat rel="nofollow", defining how search engines should handle links where an element's REL attribute contains the value "nofollow". REP tags (noindex, nofollow, unavailable_after) steer particular tasks of indexers and bots. In some cases (nosnippet, noarchive, noodp), they provide directives for query engines at runtime of a search query. Unlike crawler directives, REP tags are interpreted differently by each search engine. For example, Google wipes out even URL-only listings and ODP references from its SERPs when a resource is tagged with "noindex", but Bing sometimes lists such external references to forbidden URLs on its SERPs. Although robots.txt lacks indexer directives, it is possible to set indexer directives for groups of URIs with server-side scripts that apply X-Robots-Tag headers to requested resources. This method requires programming skills and a good understanding of web servers and the HTTP protocol. Both Google and Bing make use of two regular-expression characters to identify pages or subfolders that an SEO wants excluded.
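A sketch of testing robots.txt rules with Python's standard-library REP parser. Note this parser handles plain Disallow/Allow prefixes; the wildcard extensions discussed here are search-engine-specific additions, not part of the original 1994 protocol. The rules and user agent below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block dynamic search-result URLs and a
# private section for all user agents.
rules = [
    "User-agent: *",
    "Disallow: /search",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "http://yourdomain.com/menu"))       # True
print(rp.can_fetch("Googlebot", "http://yourdomain.com/private/x"))  # False
```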

These two characters are the asterisk (*), a wildcard that represents any sequence of characters, and the dollar sign ($), which matches the end of the URL. Beware, though: the robots.txt file is public and available for everyone to see, which means people can view which sections the webmaster has blocked the engines from. If you have private information that you don't want publicly searchable, use a more secure approach such as password protection. You don't need an outsider to see a distinct improvement in your website traffic: just following some of these painless tweaks is a surefire recipe for success and higher traffic.

