Today’s Korea Herald has an interesting article about Korean websites. Believing it is safer to keep foreign search engines from indexing their content, many Korean websites use robots.txt files to block search engine bots from indexing a portion (or all) of their site. A robots.txt file is fetched by well-behaved search engine bots before they crawl a site, and the rules inside it list which paths the bots should not crawl or index.
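As a rough illustration of how these rules work, here is a sketch using Python's standard-library `urllib.robotparser` module to interpret a hypothetical robots.txt (the paths and bot name below are made up for the example, not taken from any actual Korean site):

```python
from urllib import robotparser

# A hypothetical robots.txt: every crawler is kept out of /private/,
# and one specific bot ("BadBot") is blocked from the whole site.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "/index.html"))        # True  – public page
print(rp.can_fetch("*", "/private/a.html"))    # False – blocked directory
print(rp.can_fetch("BadBot", "/index.html"))   # False – bot blocked site-wide
```

Note that robots.txt is purely advisory: compliant crawlers like Googlebot honor it, but nothing in the protocol actually prevents a bot from fetching the listed pages anyway.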