Transparency at the new Whitehouse.gov

Kottke made an interesting post yesterday comparing the robots.txt files of the old and new Whitehouse.gov websites. A robots.txt file tells search engine crawlers which parts of a site they should or shouldn't include in their index.

The new one is:
User-agent: *
Disallow: /includes/

An example from the old one:
Disallow: /earmarks/search
Disallow: /earmarks/query.html

The old file ran to roughly 2,400 lines of disallows; there are more examples over on Kottke’s post.
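For a sense of how crawlers actually read these rules, here is a minimal sketch using Python’s standard-library robots.txt parser. The rules are fed in directly as text for illustration; a real crawler would fetch the live file instead, and the example URLs are just hypothetical paths on the site.

```python
from urllib.robotparser import RobotFileParser

# The new Whitehouse.gov rules, inlined for the sketch. A real crawler
# would use set_url("https://www.whitehouse.gov/robots.txt") and read().
rules = """User-agent: *
Disallow: /includes/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Under the new one-line policy, a path like the old earmarks search
# is crawlable...
print(parser.can_fetch("*", "https://www.whitehouse.gov/earmarks/search"))

# ...while anything under /includes/ remains off-limits.
print(parser.can_fetch("*", "https://www.whitehouse.gov/includes/x.html"))
```

The first `can_fetch` call returns True and the second False, which is the whole point of the comparison: the old file enumerated thousands of specific paths, while the new one blocks a single directory and leaves everything else open.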

I wonder whether it really points to transparency on their part, or just to the devs being in a hurry to get the website up in time.
