Why does my crawler script suddenly end with no error?


Posted on 16th Feb 2014 07:03 pm by admin

Hi.

I have written a web crawler script that visits a large number of URLs with cURL.

After around 2-3 minutes of running, it will just stop, with no error output or notices.

I have these settings:

Code:
set_time_limit(0);
ini_set('display_errors', 1);
error_reporting(E_ALL | E_STRICT);
Any ideas why it would just stop?
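A frequent cause of this symptom is a fatal error (often memory exhaustion in a long-running loop) that display_errors never gets to print, or the process being killed externally. One way to see what happened is to log errors to a file and register a shutdown hook. This is a minimal diagnostic sketch, not the original script — the crawler.log path and the log threshold are assumptions:

```php
<?php
// Log to a file instead of relying on display_errors, which can miss
// fatals that occur after output has started (path is an assumption).
ini_set('log_errors', '1');
ini_set('error_log', __DIR__ . '/crawler.log');

// On shutdown, record the last error if it was fatal. A script that
// "just stops" will usually leave something here.
register_shutdown_function(function () {
    $err = error_get_last();
    $fatal = [E_ERROR, E_PARSE, E_CORE_ERROR, E_COMPILE_ERROR];
    if ($err !== null && in_array($err['type'], $fatal, true)) {
        error_log(sprintf('Fatal: %s in %s:%d',
            $err['message'], $err['file'], $err['line']));
    }
});

// Inside the crawl loop, watch memory growth: holding every response
// in memory is a classic way to hit memory_limit after a few minutes.
error_log('memory_limit=' . ini_get('memory_limit')
    . ' peak=' . memory_get_peak_usage(true));
```

Also note that set_time_limit(0) only lifts PHP's own execution limit — if the script runs through a web server, the server (e.g. Apache's Timeout, FastCGI's request timeout) can still terminate it silently, so long crawls are usually safer from the CLI.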

No comments posted yet
