Today’s Korea Herald has an interesting article about Korean websites. Believing it is safer to keep foreign search engines from indexing their content, many Korean websites use a robots.txt file to block search engine bots from indexing part (or all) of the site. Search engine crawlers read the robots.txt file before indexing a website; the file lists which pages the bots should not visit.
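For illustration (the directory names here are hypothetical), a minimal robots.txt asking every crawler to skip two sections of a site might look like this:

    # Applies to every crawler
    User-agent: *
    # Do not crawl these directories
    Disallow: /private/
    Disallow: /drafts/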
The complication is that many sites, including tourism, university, and government sites, are blocking potentially valuable traffic. Another problem is that some site owners believe robots.txt secures sensitive information, when in fact the file is publicly readable and compliance by bots is entirely voluntary; genuinely sensitive pages should be password protected instead. And finally, a mistake in the robots.txt file can keep a site from being indexed at all, costing it potential traffic.
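That last point is easy to trip over, because the format is unforgiving: an empty Disallow line permits everything, while a single slash forbids everything. As a sketch of the one-character difference:

    # Allows the entire site to be indexed
    User-agent: *
    Disallow:

    # Blocks the entire site from being indexed
    User-agent: *
    Disallow: /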