I am trying to implement a secure PHP "get page" function after my website got hacked with LFI, RFI and directory traversal (I got the whole package, LOL).
Browsing the web, I found this script, which seems to work fine at preventing inclusion of files that I have not previously validated:
<?php
// Pick a page: default to home.php if nothing was passed.
$page = !empty($_GET['page']) ? $_GET['page'] : 'home.php';

// Only include values that are explicitly whitelisted.
switch ($page) {
    case 'home.php':
    case 'folder/page1.php':
    case 'folder/folder/page2.php':
        include $page;
        break;
    // don't accept any other values
    default:
        /* error 404 for example */
        unset($_GET['page']);
        $page = 'home.php';
        include $page;
        break;
}
?>
Regarding directory traversal, I found this old discussion here on Stack Overflow, where Patrick Moore suggests sanitizing such scripts to prevent directory traversal with:
$_GET['page'] = str_replace( array('..', '/', '\\'), '', $_GET['page'] );
However, I doubt I can use the snippet suggested by Patrick Moore: several of the pages I have to include live in subfolders (see the example script above), so I don't think I can strip the '/', and I am not sure how effective the script would be at preventing directory traversal if I only str_replace '..' and '\\'.
Any suggestion on how to sanitize the passed value given the folder structure of my website would be welcome.