Why does my crawler script suddenly end with no error?


Posted on 16th Feb 2014 07:03 pm by admin

Hi.

I have written a web crawler script. It visits a large number of URLs with cURL.

After around 2-3 minutes of running, it just stops, with no error output or notices.

I have these settings:
Code:
set_time_limit(0);
ini_set('display_errors', 1);
error_reporting(E_ALL | E_STRICT);
Any ideas why it would just stop?
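
One way to see what is actually killing the run: output from display_errors is easily lost on a long-running request, and fatal errors (for example an exhausted memory_limit, a common cause of a cURL loop dying silently after a few minutes) bypass normal error handling. Below is a minimal sketch, not the asker's code, that routes errors to a log file and registers a shutdown handler so the last error and peak memory use are recorded; the log path crawler_errors.log is just an example name.

Code:
<?php
// Sketch: capture errors that never reach the screen on a long crawl.
// Assumes PHP 5.3+ and an example log file in the script's directory.
error_reporting(E_ALL | E_STRICT);
ini_set('display_errors', 0);                       // screen output may never be seen
ini_set('log_errors', 1);                           // send errors to a file instead
ini_set('error_log', __DIR__ . '/crawler_errors.log');
set_time_limit(0);

// Fatal errors skip custom error handlers, but a shutdown function
// still runs and can record what happened just before the script died.
register_shutdown_function(function () {
    $e = error_get_last();
    if ($e !== null) {
        error_log(sprintf('Shutdown with error: %s in %s on line %d',
            $e['message'], $e['file'], $e['line']));
    }
    error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');
});

If the log shows a memory error, the usual suspects in a cURL loop are handles that are never passed to curl_close() and response bodies accumulating in arrays. Also note that set_time_limit(0) only lifts PHP's own execution limit; a web server or proxy timeout can still terminate the process without any PHP error, so running the crawler from the CLI is worth trying as well.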
