Ideas for Improving ASP and ASP.NET Web Application Security - Part 2

by Brett Burridge

Use e-mails to report web application errors

Building an error reporting facility into your web applications can be
beneficial to site security. It also has the added benefit of notifying the
web developer as soon as bugs arise, enabling problems to be fixed and the
web application made more robust. If the error reporting uses e-mail to send
the errors, then the application will benefit from near real-time reporting
of errors and suspicious website activity.
An error reporting e-mailing function for classic ASP was
described in this ASPAlliance article: http://aspalliance.com/brettb/ErrorReportEmailer.asp.
Error handling in ASP.NET is much improved. There is an Application_Error
subroutine in Global.asax that is called whenever an unhandled error occurs
on any page in the web application. There is also a Page_Error event that is
called should there be an error on an individual page. ASP.NET also offers
improved tracing of errors, such as the ability to view the line number that
raised the error (an application must be compiled in Debug mode in order for
line numbers to be present in error reports).
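As a minimal sketch of the Application_Error approach (assuming the .NET 1.x
System.Web.Mail classes, a local SMTP server and illustrative addresses), the
handler in Global.asax might look like this in VB.NET:

    Sub Application_Error(ByVal sender As Object, ByVal e As EventArgs)
        ' Retrieve the last unhandled exception for this request
        Dim ex As Exception = Server.GetLastError()

        ' E-mail the details to the developer (the SMTP server and
        ' addresses are illustrative)
        System.Web.Mail.SmtpMail.SmtpServer = "localhost"
        System.Web.Mail.SmtpMail.Send("website@example.com", _
            "developer@example.com", _
            "Error in " & Request.Path, _
            ex.ToString())
    End Sub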
Once the error reporting e-mail function has been
incorporated into the website, the e-mails can then be monitored in order to
detect security issues. Depending on the way the application was coded, failed
login attempts, attempted SQL injection attacks or other suspicious activity
will often cause error reports to be generated.
Note that if the web application has a high level of traffic, it is
advisable to build in a limit to the number of e-mail error reports that are
sent in a specified time period. A variable within the ASP Application
object can be used to keep count of the e-mails sent, as in the sketch
below.
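A minimal classic ASP sketch of such a counter (the one-hour window and the
limit of 50 e-mails are illustrative assumptions):

    <%
    ' First request, or more than an hour since the window started:
    ' reset the counter (an Empty value coerces to an old date, so
    ' the first request always resets the window)
    If DateDiff("h", Application("ErrorMailWindowStart"), Now()) >= 1 Then
        Application.Lock
        Application("ErrorMailWindowStart") = Now()
        Application("ErrorMailCount") = 0
        Application.Unlock
    End If

    ' Only send a report if fewer than 50 have gone out this hour
    If Application("ErrorMailCount") < 50 Then
        Application.Lock
        Application("ErrorMailCount") = Application("ErrorMailCount") + 1
        Application.Unlock
        ' ... send the error report e-mail here ...
    End If
    %>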

Check SQL Server user permissions

This is basic security advice, but a surprising number of developers embed
the SQL Server system administrator (sa) account credentials within their
application connection strings. This leads to two major issues:

· The account credentials are visible to anyone who has access to the
application's source code.

· Should the website be compromised, the malicious user may be able to
delete tables, drop databases and do all manner of other undesirable things.

It is, therefore, highly recommended that a new SQL Server user account be
created for the Internet user. This user should only be given access to the
objects they need to access. If they only need read access to a table, for
example, then they should be given SELECT permission only, and not INSERT,
UPDATE or DELETE permission.
The use of stored procedures is highly recommended as a
means of improving security because then the user only needs to be given EXEC
permissions on the stored procedures they need to use.
Alternatively, it is possible to use Windows authentication for SQL Server
access, in which case, for applications using anonymous access, the
IUSR_machinename account could be configured as a SQL Server user and given
the minimum level of object access.
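An illustrative T-SQL sketch of setting up a minimally-privileged account
(the login, password, database and object names are all assumptions):

    -- Create a low-privilege login for the website
    EXEC sp_addlogin 'webuser', 'Str0ngP@ssw0rd'

    -- Give the login access to the web application's database
    USE MyWebSiteDB
    EXEC sp_grantdbaccess 'webuser'

    -- Grant only what the site needs: read access to one table...
    GRANT SELECT ON Articles TO webuser

    -- ...and EXEC permission on the stored procedures it calls
    GRANT EXECUTE ON SaveFeedback TO webuser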

Be wary of exposing sensitive information through Index Server

Index Server on Windows NT servers and Indexing Services on Windows 2000
servers offer good "out of the box" functionality for building website
search engines. Unfortunately, Index Server suffers from a few issues which
can cause security problems on a server:

· Index Server itself had a number of security flaws, which were resolved
with a number of service packs from Microsoft.

· Since Index Server catalogs files on the file system, it is possible for
content to appear in search results that you may not want.

· Index Server is unable to differentiate between content files and website
structure files. Consequently, it is possible for website include files and
other structural files to appear in search results.
A few years ago I built an add-on for Index Server called the Index Server
Companion that uses a web crawler to retrieve content from a website and
make it available for cataloging by Index Server (read more about the Index
Server Companion at
http://www.winnershtriangle.com/w/Products.IndexServerCompanion.asp). The
advantage of this approach is that since the website itself is crawled
rather than the files, the content of the pages appears exactly as the end
user would see it (i.e. all include files are included and ASP interpreted)
and there is no risk of unintentionally indexing content that should not
appear in search results.
The other advantage is that the Index Server Companion obeys
web server robots.txt files conforming to the robots exclusion protocol as well
as the robots meta tag in individual website pages.
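For example, a robots.txt file along these lines (the folder names are
illustrative) will keep compliant crawlers out of structural areas of a
site:

    # Exclude structural folders from compliant crawlers
    User-agent: *
    Disallow: /includes/
    Disallow: /admin/

Alternatively, an individual page can be excluded by placing
<meta name="robots" content="noindex"> in its <head> section.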
Microsoft's Site Server 3.0 had similar web crawling functionality available
through its Gatherer component, but unfortunately Site Server is no longer
available. Some of the functionality has been transferred to Microsoft's
SharePoint Portal Server, but sadly it does not do exactly what Site Server
used to do.

Do not use descriptive error messages on login or other pages

When creating a standard username/password login page, care
should be taken to ensure that malicious users are not given clues about the
nature of the login system. The following error messages should be avoided
when displaying a failed login attempt.
"The password for this user is
incorrect" - This confirms that the malicious user has a valid
username.
"The password should be 6 characters
long" - The malicious user now knows the length of a valid
password.
"The username could not be
found" - The malicious user can simply keep trying until they enter
a valid username.
Displaying a more generic error message along the lines of "The username or
password you entered is incorrect" offers fewer clues about the nature of
the login system in use. This makes it just a little bit more difficult for
someone to log in using someone else's account credentials.
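A minimal classic ASP sketch (CheckCredentials is a hypothetical helper that
verifies the username and password against the user database):

    <%
    If Not CheckCredentials(Request.Form("username"), _
                            Request.Form("password")) Then
        ' The same message is shown whether the username or the
        ' password was wrong, so no extra clues are given away
        Response.Write "The username or password you entered is incorrect."
    End If
    %>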

Limit the number of login attempts

The accessible nature of web-based applications, together with the ease of
writing automated HTTP scripts, means that it is relatively easy to write a
script that guesses website login credentials automatically. The task is
made even easier if the malicious user already knows a login name or the
website does not enforce strong passwords (i.e. mandatory case-sensitive,
mixed-case passwords, or passwords that include non-alphanumeric
characters).
For this reason, it is recommended to ensure that each
session has a limit to the number of failed login attempts. Since most automated
HTTP scripting methods do not support sessions, it is also recommended to
ensure there are not more than a certain number of failed login attempts from a
specific IP address in a specific time period.
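A minimal classic ASP sketch of the per-session limit, run whenever a login
attempt fails (the limit of 5 attempts is an assumption):

    <%
    ' Session("FailedLogins") starts Empty, which counts as zero
    Session("FailedLogins") = Session("FailedLogins") + 1
    If Session("FailedLogins") > 5 Then
        Response.Write "Too many failed login attempts."
        Response.End
    End If
    %>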
Monitoring the IIS web server log files for signs of
repeated, failed login attempts is also highly recommended. A utility such as
Microsoft's Log Parser (http://www.logparser.com/)
can be used to achieve this.
It may also be worth considering either temporarily or
permanently disabling the accounts of users that appear to have a large number
of failed login attempts in a specific time period.

Switch directory browsing off

Again, this is basic advice, but it is essential to ensure that the
directory browsing setting within IIS is switched off. If directory browsing
is on, a user will automatically see the contents of any folder that does
not contain a default document (e.g. default.htm or default.asp on most IIS
servers).
This is especially hazardous if you have confidential
information on your website or the website earns its revenue from selling
content that is downloaded from the website itself, such as documents or
software download files.
Needless to say, you should also ensure that IIS is configured to enable
default documents. Switching directory browsing off and enabling default
documents should be done at the website level, so that any new sub-folders
created on the website inherit the settings.
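If you prefer the command line, both settings can be applied with the
adsutil.vbs administration script (the metabase path W3SVC/1/ROOT below
assumes the default website; adjust the site number to suit):

    cscript adsutil.vbs SET W3SVC/1/ROOT/EnableDirBrowsing FALSE
    cscript adsutil.vbs SET W3SVC/1/ROOT/EnableDefaultDoc TRUE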

Use IP address restriction to improve administrative site security

If your website contains an administrative web user
interface accessible via the Internet, then it is advisable to use as much
security as possible. It is particularly advisable to restrict access to a
single IP address or a range of IP addresses if only one or several machines
are going to require access to the administrative functions.
IP address restrictions can be configured through the IIS management
console, and may be applied to entire websites, as well as individual
folders and even files. It is also possible to put IP address checks into
the application itself by making use of the REMOTE_ADDR server variable.
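A minimal classic ASP sketch of an application-level check (the permitted
address is illustrative):

    <%
    ' Only allow the administrative page to be viewed from a
    ' single permitted IP address
    If Request.ServerVariables("REMOTE_ADDR") <> "192.0.2.10" Then
        Response.Status = "403 Forbidden"
        Response.Write "Access denied."
        Response.End
    End If
    %>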

Be wary of using DSN-less connections

DSN-less connections are fairly common on websites that make use of
Microsoft Access. While DSN-less connections to file-based databases are
convenient in that they do not require access to the server in order to
configure DSN connections, there are security implications associated with
them. The main issue is that the DSN-less connection string will usually
contain the filename of the database, making it much easier for a malicious
user to find the database file should they be able to gain access to the
website's source code.
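For example, a typical DSN-less connection string for an Access database
(the path is illustrative) reveals the database's exact location to anyone
who can read the source:

    <%
    Dim conn
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
              "Data Source=C:\Inetpub\wwwroot\data\site.mdb"
    %>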
See the section "Secure your Access database"
below for other suggestions about improving the security of Microsoft Access
database driven websites.

Secure your Access database

If your website uses Microsoft Access (or another file-based database) then
particular care needs to be taken to ensure the information contained within
it does not find its way into the hands of malicious users. Needless to say,
sensitive information such as credit card numbers should never be stored
within the database, especially in an unencrypted state.
The following points will help to secure your database:

· Ensure that the database is not stored in a folder that is accessible
from the website. If the database is in a folder that is accessible from the
website (a large number of hosting companies set up websites this way) then
ensure that you cannot download the .mdb file using a web browser.

· Remember to password protect your database. This will prevent casual
users from looking in the database, although it is possible to get hold of
utilities that can be used to determine what the password is.

· Encrypt any sensitive data.
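As an illustrative sketch combining these points, the database sits outside
the web root and the Jet database password is supplied in the connection
string (the path and password are assumptions):

    <%
    ' The .mdb file is outside the web root, so it cannot be
    ' requested directly with a browser
    Dim conn
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
              "Data Source=D:\databases\site.mdb;" & _
              "Jet OLEDB:Database Password=s3cret"
    %>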

Be wary of uploaded files

If you have a file upload facility within your website then it is critical
to perform a check on the types of files that may be uploaded. This is
especially critical if the uploaded content is going to be saved to a folder
that is accessible via the web, because an uploaded file could then be
executed on the server when a user makes a standard browser request for it
once it has been uploaded.
Although it is essential to black-list certain file types
(such as .asp, .aspx, and if your server supports it, .php), a safer
alternative is to provide a white-list of specific file types that can be
uploaded (such as .jpg, .gif and .png for an image upload facility). It is
also worthwhile including a maximum file size that can be uploaded - most file
uploading server components allow such a limit to be set.
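A minimal classic ASP sketch of a white-list check (the allowed extensions
are illustrative, and uploadedFileName is assumed to come from your upload
component):

    <%
    Function IsAllowedFile(fileName)
        Dim ext
        ' Extract the text after the final "." in the filename;
        ' a name with no extension fails the check
        ext = LCase(Mid(fileName, InStrRev(fileName, ".") + 1))
        IsAllowedFile = (ext = "jpg" Or ext = "gif" Or ext = "png")
    End Function

    If Not IsAllowedFile(uploadedFileName) Then
        Response.Write "This file type cannot be uploaded."
        Response.End
    End If
    %>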
If you are intending to use uploaded files (such as resumes submitted by
candidates on a job vacancies site, for example) then it is also a good idea
to implement a virus checking facility before the content reaches any
business process that makes use of the uploaded file.

Submit your application to performance testing

Strange things can often happen to web applications when they are under
heavy loads. It is therefore worthwhile taking the time to test your web
application using an application such as OpenSTA (http://www.opensta.org/)
or one of the commercial web testing offerings.
Recently I subjected one of my own websites to performance testing, and
while the application performed well, I discovered that the default setting
for the ADODB.Connection's timeout was quite low. Increasing the timeout
ensured fewer users would ever see a timeout error. Whether they would stay
around to wait for the page to load is another matter entirely!
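For reference, both of the ADODB.Connection timeouts can be raised before
the connection is opened (the values are illustrative, and connectionString
is assumed to be defined elsewhere):

    <%
    Dim conn
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.ConnectionTimeout = 30  ' seconds allowed for opening the connection
    conn.CommandTimeout = 60     ' seconds allowed for a command to execute
    conn.Open connectionString
    %>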
While it is possible to write a quick Visual Basic application or script to
repeatedly request the same URL via HTTP, it is advisable to use a tool that
can perform real-world testing of your web application, such as performing
searches, logging in, submitting forms and exercising other functionality.
Do not forget that it is very inadvisable to subject your
live system to performance testing!