Find only certain URLs from page ... regex (semi-complete script)

Posted on 16th Feb 2014 by admin

Hi guys,

What I need to do is take a page & extract all the URLs from the page & place them in an array.

However, I only need to grab certain URLs,

eg.

site1.com
site1.com/folder/thisfile.zip
site2.com
site2.com/some/folder/or/subfolder/1.mp3
site3.com

but then leave out of the array

site4.com
site5.com/the/script/needs/to/be/able/to/grab/sub/folders/and/files/2.mp3


Here's the script I've got so far, but it grabs ALL the links, so I need to modify it, perhaps with an if or switch statement, to check whether each link is one I actually want...


<?php

$string = '<a href="http://www.example.com">Example.com</a> has many links with
examples <a href="http://www.example.net/file.php">links</a> to many sites and
even urls without links like http://www.example.org just to fill the gaps and
not to forget this one http://phpro.org/tutorials/Introduction-to-PHP-Regex.html
which has a space after it. The script has been modified from its original so now
it grabs ssl such as https://www.example.com/file.php also';

/**
 * Get all URLs from a string (the string itself may be a URL)
 *
 * @param string $string
 *
 * @return array
 */
function getUrls($string)
{
    // Use ~ as the regex delimiter: with the original /.../ delimiters,
    // the unescaped // in https:// makes PCRE fail with an
    // "unknown modifier" error.
    $regex = '~https?://[^"\s]+~i';
    preg_match_all($regex, $string, $matches);
    return $matches[0];
}

$urls = getUrls($string);

foreach ($urls as $url) {
    echo $url.'<br />';
}

?>
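One way to keep only the sites you want is to filter the extracted URLs against a whitelist of hosts using parse_url(). Here's a minimal sketch; the host names (site1.com, etc.) are placeholders taken from the example list above, and the filterUrls() helper is something I've made up for illustration, so adjust both to your real domains:

```php
<?php

/**
 * Keep only URLs whose host is in the whitelist.
 *
 * @param array $urls         URLs as returned by getUrls()
 * @param array $allowedHosts e.g. array('site1.com', 'site2.com')
 *
 * @return array
 */
function filterUrls(array $urls, array $allowedHosts)
{
    $wanted = array();
    foreach ($urls as $url) {
        // parse_url() pulls just the host part out of the URL
        $host = parse_url($url, PHP_URL_HOST);
        // Strip a leading "www." so www.site1.com matches site1.com
        $host = preg_replace('/^www\./i', '', (string) $host);
        if (in_array($host, $allowedHosts, true)) {
            $wanted[] = $url;
        }
    }
    return $wanted;
}

// Example: site1.com and site2.com are kept, site4.com is dropped
$allowed = array('site1.com', 'site2.com', 'site3.com');
$urls = array(
    'http://site1.com/folder/thisfile.zip',
    'http://site4.com/skip/this',
    'https://www.site2.com/some/folder/or/subfolder/1.mp3',
);

print_r(filterUrls($urls, $allowed));

?>
```

This keeps the regex simple (it still grabs every URL) and moves the "do I want this link?" decision into plain PHP, which also handles subfolders and files automatically since only the host is compared.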
