Robots.txt block all

Why Does WordPress Say To Disallow All PHP Pages

I'm just getting to grips with a duplicate content issue on my site. To be brief, I'm tackling it with a combination of the All In One SEO plugin and a robots.txt file.

I can't understand why the suggested robots.txt file for WordPress (posted on the WP support forum, I think) disallows all your PHP pages.

I am sure I am not the only one with PHP pages on their site that they want to have indexed.

So:
1. Why disallow the WordPress PHP pages?
2. Assuming it is a good thing to do, how do I disallow those pages while still allowing the ones I want Google to find?

Thanks in advance,

Ade

Well, some files just weren't meant to be seen. I keep the WP core files and some processing files out of public view for potential security reasons. They don't affect your rankings or anything. Here's my robots.txt:

User-agent: * addresses all crawlers. Then you Disallow or Allow whatever you do or don't want indexed.

User-agent: *
# disallow all files in these directories
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Allow: /images/

# disallow all files ending with these extensions
Disallow: /*.php$
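A quick way to sanity-check rules like these is Python's standard-library parser (example.com here is just a placeholder domain). One caveat: urllib.robotparser matches rule paths as plain prefixes and does not implement the * and $ wildcard extension that Google supports, so the Disallow: /*.php$ line has no effect in this check.

```python
# Sketch: checking the robots.txt above with Python's stdlib parser.
# Note that urllib.robotparser treats rule paths as prefixes only, so
# the wildcard rule "Disallow: /*.php$" is effectively ignored here.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Allow: /images/

Disallow: /*.php$
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
print(rp.can_fetch("*", "https://example.com/images/logo.png"))       # True
# Allowed by this parser; crawlers that honour wildcards would block it:
print(rp.can_fetch("*", "https://example.com/my-page.php"))           # True
```

To verify how the wildcard rule behaves for Google specifically, the robots.txt Tester in Google Search Console is the more reliable check.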


