I now work for a large site. It’s not unfamiliar territory, but I’m definitely getting back into the swing of running a large site (we’ll do over 10 million requests during our busy months). We’re in the middle of migrating our websites in-house onto larger servers that we’ll manage ourselves. It’s a daunting task because our company runs lots of smaller sites we’ve either picked up or somewhat abandoned along the way. These sites were written in older versions of PHP (probably PHP3) by the owner of the company as he was learning PHP. As a result I’m having to go back and fix include paths, etc. as we migrate to the new server. But how exactly do you automate the task of checking a site with hundreds to tens of thousands of pages for parse errors?
A friend of mine, Ian, who I do a lot of “bouncing off of,” had a great idea. Basically, you turn on log_errors in your php.ini, watch the log with tail -f /path/to/error.log | grep PHP, and then run a garden variety link checker against your site. As the link checker crawls your site, every page gets executed, so any parse errors will show up in your error log.
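Concretely, the workflow might look like the following sketch. The log path and site URL are placeholders, and wget is standing in here for whatever garden variety link checker you prefer:

```shell
# Terminal 1: watch the error log, filtering for PHP messages.
# Parse errors, warnings, and notices all begin with "PHP".
tail -f /path/to/error.log | grep PHP

# Terminal 2: crawl the site so every page actually gets executed.
# wget's spider mode follows links without saving the files it fetches.
wget --spider --recursive --no-verbose http://www.example.com/
```

Any page that fails to parse produces a blank (or partial) response to the crawler, but the real diagnostic is the line that appears in the tail output in the other terminal.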
There you have it: a simple and easy way to check a large site for parse errors. Turn your error_reporting up to E_ALL as well, and the same crawl will surface notices and other problems in your code, not just outright parse failures.
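In php.ini, that combination of settings might look like this (a sketch; the log path is a placeholder you’d adjust for your server):

```ini
; Report everything, including notices.
error_reporting = E_ALL
; Write errors to a log file instead of (or in addition to) the page.
log_errors = On
error_log = /path/to/error.log
```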