
Tuesday, September 17, 2019

Site Map & Robots.txt

                         Site Map

A site map is a model of a website's content designed to help both users and search engines
navigate the site. A site map can be a hierarchical list of pages (with links) organized by topic, an
organization chart, or an XML document that provides instructions to search engine crawl bots.
Site map may also be spelled sitemap.
When the site map is for users, it is just a plain HTML file listing all the major pages on
a site.
In the context of search engines, the site map, also known as a sitemap.xml file, helps search
engine crawlers index all pages on the site. While a site map does not guarantee that every page
of a site will be crawled, major search engines recommend them.
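For illustration, here is a minimal sitemap.xml following the Sitemap protocol format (the URL and dates are placeholders): each page gets a <url> entry with its address in <loc>, and optional hints such as when it last changed and how often it changes.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-09-17</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

The optional tags are hints only; search engines may ignore <changefreq> and <priority> when deciding what to crawl.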
Site maps are especially important for sites that use Adobe Flash or JavaScript menus that do not
include HTML links. Google introduced Google Sitemaps to help Web crawlers find dynamic
pages, which were typically being missed. Bing and other major search engines also support this
protocol.
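A crawler's first step with such a file is simply to collect the <loc> entries into a list of pages to index. A minimal sketch using Python's standard library (the embedded sitemap and its URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

# A hypothetical two-page sitemap (URLs are placeholders).
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about.html</loc></url>
</urlset>"""

# The Sitemap protocol places every element in this XML namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(SITEMAP)
# Collect the page URLs a crawler would queue for indexing.
urls = [u.findtext("sm:loc", namespaces=NS) for u in root.findall("sm:url", NS)]
```

In practice a crawler would fetch the file over HTTP first; parsing is the same either way.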

                              Robots.txt

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a
standard used by websites to communicate with web crawlers and other web robots. The standard specifies
how to inform the web robot about which areas of the website should not be processed or scanned. Robots are
often used by search engines to categorize websites. Not all robots cooperate with the standard; email
harvesters, spambots, malware, and robots that scan for security vulnerabilities may even begin with the portions
of the website that they have been told to avoid. The standard can be used in conjunction with Sitemaps, a
robot inclusion standard for websites. 
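A cooperating crawler reads robots.txt before fetching anything else. The sketch below uses Python's standard-library parser against a hypothetical robots.txt (the paths and site are placeholders); it also shows how a Sitemap line can sit alongside the exclusion rules, tying the two standards together.

```python
import urllib.robotparser

# A hypothetical robots.txt: block /private/ for all robots,
# allow everything else, and advertise the sitemap's location.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

# Parse the rules with the standard-library robots.txt parser.
rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Ask whether a generic robot ("*") may fetch specific URLs.
allowed = rp.can_fetch("*", "https://www.example.com/index.html")
blocked = rp.can_fetch("*", "https://www.example.com/private/page.html")
```

Nothing enforces these answers; a well-behaved crawler checks can_fetch before each request, while the misbehaving robots mentioned above simply ignore the file.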
