Why does my crawler script suddenly end with no error?


Posted on 16th Feb 2014 07:03 pm by admin

Hi.

I have written a web crawler script. It visits a large number of URLs with cURL.

After around 2-3 minutes of running, it just stops, with no error output or notices.

I have these settings:

Code:
set_time_limit(0);
ini_set('display_errors', 1);
error_reporting(E_ALL | E_STRICT);

Any ideas why it would just stop?
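A silent stop like this is often a fatal error (commonly "Allowed memory size exhausted") that never reaches the screen, or the process being killed from outside PHP when the script runs under a web server rather than the CLI. Below is a minimal diagnostic sketch, not the original script: the $urls list, the fetch() helper, and the log path are placeholders, and it assumes the crawler simply loops over URLs with curl_exec. It logs fatal errors via a shutdown function and tracks memory growth so the cause shows up in a log file even when nothing is printed.

Code:
<?php
// Diagnostic sketch (assumed CLI run; fetch() and the log path are placeholders).
set_time_limit(0);
ini_set('display_errors', 1);
ini_set('log_errors', 1);
ini_set('error_log', __DIR__ . '/crawler_errors.log'); // assumed log location
error_reporting(E_ALL | E_STRICT);

// Fatal errors bypass normal error handlers, but a shutdown function
// can still read them with error_get_last() and write them to the log.
register_shutdown_function(function () {
    $err = error_get_last();
    if ($err !== null) {
        error_log(sprintf('Shutdown error: %s in %s on line %d',
            $err['message'], $err['file'], $err['line']));
    }
    error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');
});

// Hypothetical helper wrapping curl_exec with a timeout and error logging.
function fetch($url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 30,   // don't hang forever on slow hosts
        CURLOPT_FOLLOWLOCATION => true,
    ]);
    $body = curl_exec($ch);
    if ($body === false) {
        error_log('cURL error for ' . $url . ': ' . curl_error($ch));
    }
    curl_close($ch);
    return $body;
}

$urls = ['http://example.com/']; // placeholder URL list
foreach ($urls as $i => $url) {
    fetch($url);
    // Log memory periodically so a creeping leak is visible before the crash.
    if ($i % 100 === 0) {
        error_log("Processed $i URLs, memory: " . memory_get_usage(true));
    }
}

Note that set_time_limit(0) only lifts PHP's own execution limit; if the script is triggered through Apache/nginx or PHP-FPM, the server can still terminate the request on its own timeout, so running the crawler from the command line is worth trying as well.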
