Information Assurance

Toufic Arabi




Bugtraq Analysis

Directory Traversal Attack


Boxalino: a guided-search engine that offers a new type of search experience by enabling step-by-step guidance within the search process.

What the problem is:

Input passed via the URL parameter to

boxalino/client/desktop/default.htm

is not properly verified before being used to display files.

The Risk:

The lack of input validation can be exploited to disclose the contents of arbitrary files via directory traversal attacks.

What is a directory traversal attack:

-Properly controlling access to web content is crucial for running a secure web server

-A directory traversal attack is an exploit that allows attackers to access restricted directories and execute commands outside the web server's root directory

What web servers provide as security:

-Access Control Lists: a list the web server's administrator uses to indicate which users or groups can access, modify, or execute which files on the server

-Root directory: a directory to which users of the web server are confined and cannot cd out of

What you need for a directory traversal attack:

-a web server

-a little knowledge of where the default files and directories on the system are located

How do Directory Traversals work?

Suppose: http://test.webarticles.com/show.asp?view=oldarchive.html

What is happening is that you request the show.asp dynamic page and send it the parameter view with the value oldarchive.html. show.asp fetches that HTML file and sends it back to the browser, which displays it.
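The vulnerable pattern behind a page like show.asp can be sketched in Python (the web root and function name here are hypothetical, purely for illustration; this is not Boxalino's or webarticles' actual code):

```python
import os.path

WEB_ROOT = "/var/www/webarticles"  # hypothetical document root

def resolve(view):
    """Naive handler logic: joins the user-supplied 'view' parameter
    onto the web root with no validation whatsoever."""
    return os.path.normpath(os.path.join(WEB_ROOT, view))

# The normal case stays inside the web root:
print(resolve("oldarchive.html"))  # /var/www/webarticles/oldarchive.html
```

The danger is that the server trusts view completely: whatever path the browser sends is handed straight to the filesystem.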

What if an attacker sends this:

http://test.webarticles.com/show.asp?view=../../../../../Windows/system.ini

This will cause the dynamic page to retrieve the file system.ini from the file system and display it to the user. Each ../ tells the system to go one directory up the file tree. All an attacker has to do is guess how many directories he needs to go up the tree, and since he is only requesting pages, he has an effectively unlimited amount of time and error tolerance until he finds the right number of levels.
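The effect of each ../ can be seen by resolving the malicious path the same naive way (same hypothetical web root as above; the Windows path is shown POSIX-style just to make the climb visible):

```python
import os.path

WEB_ROOT = "/var/www/webarticles"  # hypothetical document root

# Each ../ climbs one level; enough of them reach the filesystem root,
# and the remainder descends into Windows/system.ini:
evil = "../../../../../Windows/system.ini"
resolved = os.path.normpath(os.path.join(WEB_ROOT, evil))
print(resolved)  # /Windows/system.ini -- outside the web root
```

Note that extra ../ segments past the root are harmless to the attacker, which is why over-guessing the depth still works.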

What could and should be done to prevent it:

-always run the latest version of the web server software

-Web Vulnerability Scanner: crawls the website and checks for directory traversal vulnerabilities, reports each vulnerability, and shows how to easily fix it; example tool: Acunetix Web Vulnerability Scanner

-Filter all user input, removing everything but the known-good data
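One way to implement that filter is to resolve the requested path and verify it still lies inside the web root before serving anything (a minimal sketch under the same hypothetical web root; real frameworks ship their own safe-serving helpers that should be preferred):

```python
import os.path

WEB_ROOT = "/var/www/webarticles"  # hypothetical document root

def is_safe(view):
    """Reject any 'view' value whose resolved path escapes WEB_ROOT."""
    resolved = os.path.normpath(os.path.join(WEB_ROOT, view))
    # The resolved path must remain strictly under the web root.
    return resolved.startswith(WEB_ROOT + os.sep)

print(is_safe("oldarchive.html"))                    # True
print(is_safe("../../../../../Windows/system.ini"))  # False
```

Checking the fully resolved path, rather than stripping ../ substrings from the input, avoids bypasses such as ....// sequences that survive a single round of string removal.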