Why does my crawler script suddenly end with no error?


Posted on 16th Feb 2014 07:03 pm by admin

Hi.

I have written a web crawler script. It visits a large number of URLs with cURL.

After around 2-3 minutes of running, it will just stop, with no error output or notices.

I have these settings:
Code:
set_time_limit(0);
ini_set('display_errors', 1);
error_reporting(E_ALL | E_STRICT);
Any ideas why it would just stop?
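A silent stop after a few minutes of crawling is often a fatal error (commonly memory exhaustion) that never reaches the screen, because fatal errors can occur after output has already been flushed or when `display_errors` is ineffective for that SAPI. One way to diagnose this is a shutdown handler that logs the last fatal error to a file, plus periodic memory reporting inside the crawl loop. This is a hedged sketch, not the original script; the log path and the loop are illustrative assumptions:

```php
<?php
// Log errors to a file so fatals survive even when nothing prints to screen.
ini_set('log_errors', 1);
ini_set('error_log', __DIR__ . '/crawler_errors.log'); // assumed path

// A shutdown function still runs after a fatal error, so we can capture it.
register_shutdown_function(function () {
    $err = error_get_last();
    $fatal = [E_ERROR, E_PARSE, E_CORE_ERROR, E_COMPILE_ERROR];
    if ($err !== null && in_array($err['type'], $fatal, true)) {
        error_log(sprintf(
            'FATAL: %s in %s on line %d',
            $err['message'], $err['file'], $err['line']
        ));
    }
});

// Inside the crawl loop, watch memory growth - a steady climb toward
// memory_limit would explain a stop at roughly the same elapsed time.
$bytes = memory_get_usage(true);
echo "memory in use: $bytes bytes\n";
```

If memory is the culprit, freeing each cURL handle with `curl_close()` after use and unsetting large response strings between requests usually flattens the growth; raising `memory_limit` only delays the stop.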

No comments posted yet
