After 8 days, PHP stuck in "extended" usleep?

If you have any ideas, please throw them out there, thanks.

Seemed like a good idea to ask here first before adding lots more logging code. I thought the script was sufficiently tested, which is why I turned off the verbose logging to save disk space, because I want this script to run for years with very minimal supervision. My script does log MySQL errors, and I don't see any problem with MySQL.

I kick off the script from rc.local doing this:

su - usernameofthescript -c "nice /home/myscript/myscript.php < /dev/null &"

From "top" I can see the script sleeping, 0% CPU and only 1.9% memory. I just noticed the script jumped from 16:41.73 to 16:41.74 so apparently the script is still running. But somehow it's stuck somewhere since Saturday.

Inside the main loop I'm using ini_set('max_execution_time', 0);

Basically, the script sleeps for an hour, then does some file_get_contents() calls, stores the info to MySQL, sleeps for an hour, and so on for eternity. For 7 days it worked flawlessly. At the end of each cycle it logs how long it took to do its business (this worked for 7 days, every hour), but since yesterday it's not logging anything.
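In outline, the loop looks something like this (rough sketch; fetch_and_store() and log_cycle_time() are stand-ins for the real work):

ini_set('max_execution_time', 0);

while (true) {
    $start = microtime(true);

    fetch_and_store();                    // file_get_contents() + MySQL inserts

    $elapsed = microtime(true) - $start;
    log_cycle_time($elapsed);             // stand-in for my logging

    $remaining = 3600 - $elapsed;         // sleep out the rest of the hour
    if ($remaining > 0) {
        usleep((int) round($remaining * 1000000));
    }
}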

Because the CPU is at 0% right now for this script, I'm wondering if it's stuck in an "extended" usleep? I'll punch that into Google in a minute. The last logged cycle took 755.6 seconds, so it should have gone into usleep for only about 48 minutes. I'm wondering if "nice" somehow stretched that out?

top now says 16:41.75

12 Replies

Instead of usleep, I'm going to try this function I found on php.net

function my_sleep($seconds)
{
    // Sleep in one-second steps anchored to the start time, so a signal
    // that interrupts one step only shortens that step, not the whole wait.
    $start = microtime(true);
    for ($i = 1; $i <= $seconds; $i++) {
        @time_sleep_until($start + $i);
    }
}

"soulhunter1987 at post dot ru 18-Aug-2010 07:49

Since sleep() can be interrupted by signals i've made a function which can also be interrupted, but will continue sleeping after the signal arrived (and possibly was handled by callback). It's very useful when you write daemons and need sleep() function to work as long as you 'ordered', but have an ability to accept signals during sleeping. "

Alternatively, consider using something like cron if your situation requires something to happen once an hour. It's already there and running on your system, so you might as well use it if possible.

Also, I'm not sure how possible it is with PHP, but having some way to communicate with the sleeping process (either by sending it SIGUSR1 and having it spit stuff to syslog, or by having a socket) is really handy for situations like this.
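With the pcntl extension it's doable; something like this (just a sketch):

declare(ticks = 1);                 // let PHP check for pending signals

function usr1_handler($signo)
{
    // "kill -USR1 <pid>" makes the daemon report in via syslog.
    syslog(LOG_INFO, 'still alive at ' . date('c'));
}

openlog('myscript', LOG_PID, LOG_DAEMON);
pcntl_signal(SIGUSR1, 'usr1_handler');

// ... main loop goes here; note that sleep()/usleep() return early
// when a signal arrives, so re-check how much time is left afterwards.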

Way easier to use cron and much more reliable as well. Cron will always be called when it's supposed to.

Have it call the php script you wrote (minus the sleep functions).

PHP by default doesn't handle long-running scripts very well.
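In other words, strip it down to a single pass and let cron do the scheduling (sketch; paths are made up):

#!/usr/bin/php
<?php
// Installed with a crontab entry along the lines of:
//   0 * * * * nice /home/myscript/myscript.php >> /home/myscript/cron.log 2>&1

fetch_and_store();   // one cycle of the usual work, no loop, no sleep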

Initially I was using some bash code I found through Google, which I thought would be bulletproof. But I'm not a bash guru and the script didn't work, so I figured I'd give PHP-CLI a chance.

I took a quick look at cron and don't see anything in there to prevent overlapping runs if a task takes too long. I figure getting from 8 days to 8 years won't be that hard with PHP, and understanding what causes PHP to terminate is probably useful to know anyway.

Recently I saw the author of "node" cite PHP's inability to sleep as one of the main selling points of node. Sounded like smoke and mirrors to me, but what do I know, I'm not the author of a hot new language ;-)

Over the years, instead of using cron (typically not an option on shared hosts, hence my habit of avoiding it), I'd hook the maintenance into the website somehow, like "if we're done drawing the website and we haven't done x in y hours, do x." For little jobs that works fine, but this PHP-CLI script I'm tweaking now might do more and more stuff over time, to the point where it could be running in the background all day without sleeping.

@ferodynamics:

I took a quick look at cron and don't see anything in there to prevent overlapping runs if a task takes too long. I figure getting from 8 days to 8 years won't be that hard with PHP, and understanding what causes PHP to terminate is probably useful to know anyway.
There are several tricks that you can use to address this issue.

The easiest way would be to create a small "lock" file when your PHP script begins. Then register a shutdown function where you delete the lock file. Every time you run the script from cron, check if the lock file exists. If it does, the previous invocation hasn't completed yet, so just skip this cron job and try again next time.
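A minimal sketch of that (the lock path is arbitrary):

$lock = '/tmp/myscript.lock';

if (file_exists($lock)) {
    exit(0);   // previous run still going, skip this invocation
}

touch($lock);
register_shutdown_function(function () use ($lock) {
    unlink($lock);   // clean up so the next cron run can proceed
});

// ... do the actual work here ...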

The problem with this approach is that, if the PHP script doesn't shut down cleanly, the lock file never gets deleted and none of the subsequent cron jobs will get anything done. There are ways to fix this, too: for example, if you use Memcached, you could keep the lock there as a key that expires after a few hours, instead of using a file.

Another approach would be to use a "queue". Whenever you have a new job that needs to be done, add it to a queue. You could use something like Redis to manage the queue, or just query whatever database you're using. Write a PHP script that takes a few items off the queue, processes them, and exits promptly. If there's nothing in the queue, exit immediately. Also write a bash script that keeps calling that PHP script in a loop, maybe with a few seconds of sleep in between. Run the bash script at startup. That way, your queue will always be handled in time, and PHP won't need to sleep. (Sleeping in bash is much more reliable than sleeping in PHP. Anything you write in PHP should just do the job and get the hell out of there.) I used this trick once in the past because one of the PHP libraries I was using had a bad memory leak.
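The PHP end of it can be as dumb as this (sketch; the "jobs" table and process_job() are made up):

// Take a few items off the queue, handle them, and get out.
$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');

$jobs = $db->query('SELECT id, payload FROM jobs ORDER BY id LIMIT 10')
           ->fetchAll(PDO::FETCH_ASSOC);

foreach ($jobs as $job) {
    process_job($job['payload']);
    $db->prepare('DELETE FROM jobs WHERE id = ?')->execute(array($job['id']));
}

// Queue empty or work done: exit immediately and let the bash loop
// call us again in a few seconds.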

Also, instead of dumping everything to /dev/null, try pointing the output to a file. That way, you'll be able to see any unexpected errors which might cause your script to hang. (By the way, it's > /dev/null, not < /dev/null. The arrow points in the direction of dumping.)

Bottom line: PHP really isn't a good language for batch jobs from the CLI. Nowadays, I use Python for that.

@hybinet:

I used this trick once in the past because one of the PHP libraries I was using had a bad memory leak. […] Also, instead of dumping everything to /dev/null, try pointing the output to a file. (By the way, it's > /dev/null, not < /dev/null. The arrow points in the direction of dumping.) […] Bottom line: PHP really isn't a good language for batch jobs from the CLI.

Actually that's not a dump, there's no output, only basic logging. For some reason, PHP-CLI needs "< /dev/null" to keep executing in the background; no idea why, because the script is not expecting input. I found this tip somewhere in Google and indeed it's not optional ;-)

Yes, it's easy to use a ton of memory if you don't reset your variables, for example, but like I posted, I'm only using 1.9% memory. For now I won't play devil's advocate against PHP unless there's some irrefutable bug report I don't know about. After so many years, I'm thinking it's not a big deal to do while (true) and expect that to run at least until the next reboot. Time will tell ;-)

I've used PHP as a daemon via CLI and via crons, and never had an issue with either. One thing I suggest, if you're going the daemon route, is to create a "proper" daemon: look at the process control (pcntl) PHP module, use it to fork the PHP process, and assign handlers to the various signals. I had a daemon that pulled data from a redis queue using the blocking pop feature (it sits and does nothing until data is available).
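Roughly like this (a sketch; assumes the pcntl, posix, and phpredis extensions, and process_job() is made up):

declare(ticks = 1);

// Fork; the parent exits so the child keeps running in the background.
if (pcntl_fork() > 0) {
    exit(0);
}
posix_setsid();   // detach from the controlling terminal

$running = true;
pcntl_signal(SIGTERM, function () use (&$running) {
    $running = false;   // finish the current job, then stop cleanly
});

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

while ($running) {
    // Block up to 5 seconds waiting for work, so the loop can notice
    // a pending SIGTERM between jobs.
    $job = $redis->blPop(array('jobs'), 5);
    if (!empty($job)) {
        process_job($job[1]);   // $job is array(queue, value)
    }
}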

@ferodynamics:

Yes, it's easy to use a ton of memory if you don't reset your variables, for example, but like I posted, I'm only using 1.9% memory. For now I won't play devil's advocate against PHP unless there's some irrefutable bug report I don't know about. After so many years, I'm thinking it's not a big deal to do while (true) and expect that to run at least until the next reboot. Time will tell ;-)
If it works for you, good for you!

As for me, I just don't trust PHP enough to keep a script running for more than a few hours. The script I had trouble with in the past was a web crawler, which used Simple-HTML-DOM-Parser and HTML Purifier to extract information from some web pages and insert it into a database. No matter how diligently I unset() everything after each iteration, memory usage continued to grow, probably because PHP couldn't handle the complicated circular references that DOM trees inevitably produce. I don't know if it's fixed now, but it left a sour taste.
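For what it's worth, PHP 5.3 added a cycle collector that you can invoke by hand; something like this might have helped, or not (sketch; parse_page() and store() are made up):

gc_enable();   // enable the circular-reference collector (PHP >= 5.3)

foreach ($urls as $url) {
    $dom = parse_page($url);   // builds the DOM tree
    store($dom);               // writes the extracted data to the database
    unset($dom);               // drop our reference...
    gc_collect_cycles();       // ...and force collection of leftover cycles
}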

@obs:

I had a daemon that pulled data from a redis queue using the blocking pop feature (it sits and does nothing until data is available).
Haha, Redis BLPOP is wicked 8)

I wrote an IRC bot in PHP years ago. I used to run it for weeks or months at a time without issue.

Update: that sleep function is still working.

New sleep function didn't make a difference. Died after 7 days.

I wrote a blog post about the problem here:

http://ferodynamics.com/system_daemon-vs-cron-for-php-cli/

Summary: I think the issue was with file_get_contents(), but it's hard to say for sure.
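If it really was file_get_contents() blocking on a dead connection, an explicit timeout might be the fix (sketch; the URL is a placeholder):

// Without a timeout, file_get_contents() can block indefinitely on a
// stalled HTTP connection; a stream context caps the wait.
$ctx = stream_context_create(array(
    'http' => array('timeout' => 30),   // seconds
));

$data = file_get_contents('http://example.com/feed', false, $ctx);
if ($data === false) {
    error_log('fetch failed or timed out');
}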
