Why does my crawler script suddenly end with no error?


Posted on 16th Feb 2014 07:03 pm by admin

Hi.

I have written a web crawler script that visits a large number of URLs with cURL.

After around 2-3 minutes of running, it just stops, with no error output or notices.

I have these settings:

    set_time_limit(0);
    ini_set('display_errors', 1);
    error_reporting(E_ALL | E_STRICT);
Any ideas why it would just stop?
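A common cause of a silent halt like this is a fatal error that never reaches the screen, e.g. an out-of-memory abort as cURL handles and response buffers accumulate, since a memory exhaustion can kill the script before anything is displayed. A minimal diagnostic sketch, assuming a CLI PHP script (the log path is an illustrative assumption, not from the original code):

    <?php
    // Diagnostic sketch (not the original crawler code): route errors
    // to a file and catch fatal errors that display_errors can miss.
    ini_set('log_errors', 1);
    ini_set('error_log', '/tmp/crawler.log');   // assumed path

    register_shutdown_function(function () {
        $e = error_get_last();
        if ($e !== null && in_array($e['type'], [E_ERROR, E_PARSE, E_CORE_ERROR], true)) {
            error_log(sprintf('Fatal: %s in %s:%d', $e['message'], $e['file'], $e['line']));
        }
    });

    // Log peak memory periodically so an approaching memory_limit is visible.
    error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');

With this in place, a crash inside PHP should leave a final entry in the log; if the log stays clean when the script dies, the process is more likely being killed externally (a web-server timeout or the OS OOM killer). Calling curl_close() on each handle after use, or raising memory_limit, would address the memory-growth case.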
