Why does my crawler script suddenly end with no error?


Posted on 16th Feb 2014 07:03 pm by admin

Hi.

I have written a web crawler script. It visits a large number of URLs with cURL.
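A minimal sketch of that kind of per-URL cURL loop (the URLs and options below are illustrative assumptions, not the actual script, which isn't shown here). Closing the handle every iteration matters on long runs, otherwise memory keeps climbing:

Code:
<?php
// Illustrative crawl loop; placeholder URLs and options.
$urls = array('http://example.com/a', 'http://example.com/b');

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // keep one slow URL from hanging the run
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

    $body = curl_exec($ch);
    if ($body === false) {
        echo "cURL error for $url: " . curl_error($ch) . PHP_EOL;
    }
    curl_close($ch); // free the handle each iteration so memory stays flat
    // ... parse $body, queue new URLs, etc.
}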

After around 2-3 minutes of running, it will just stop, with no error output or notices.

I have these settings:
Code:
set_time_limit(0);
ini_set('display_errors', 1);
error_reporting(E_ALL | E_STRICT);
Any ideas why it would just stop?
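One way to surface a fatal that never gets displayed (a memory_limit hit, for instance) is to log errors to a file and register a shutdown handler. A minimal sketch, with the log file path chosen arbitrarily:

Code:
<?php
// Extra diagnostics for a "silent" stop; the log path is just an example.
ini_set('log_errors', 1);
ini_set('error_log', __DIR__ . '/crawler-error.log');

register_shutdown_function(function () {
    $err = error_get_last();
    if ($err !== null) {
        // Fatal errors (E_ERROR) land here even when nothing was displayed.
        error_log('Shutdown, last error: ' . print_r($err, true));
    }
    error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');
});

error_log('memory_limit=' . ini_get('memory_limit'));
error_log('max_execution_time=' . ini_get('max_execution_time'));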
