Catching Back Doors through Code Reviews

July 18, 2014
Pentests

Of late, code reviews have been gaining a lot of popularity. Organizations that until recently were content with a secure network and an occasional penetration test are now getting their application code reviewed before going live.

A code review, over and above what application penetration tests find, can uncover backdoors and Trojans in the code, whether they were introduced intentionally or inadvertently.

Insecurities in applications can arise for a number of reasons, an important one being the huge pressure on developers to meet functional requirements and deliver on time. Some common ways insecure code slips in are:

  1. A page is left unlinked from the rest of the application
  2. Test code is added and never deleted
  3. Web pages meant for other application modules are misplaced in the home directory
  4. A malicious developer intentionally plants a backdoor for future access

How do backdoors enter the application?

Consider a web-based application built in ASP.NET. The application has strict authentication and authorization controls, and a secure session management scheme has been implemented.

But unfortunately, one of the developers has unintentionally left some test pages in the application directory. One test page was written to execute a few database queries from the front end, basically for "ease of use". An attacker notices the test page while browsing the application, quickly replaces the web page name in the URL with the test page's name, accesses the page, and retrieves customers' credit card information. Thus, a small mistake in the development phase can result in the theft of confidential information.

The existence of a backdoor can allow attackers to inject, view, modify or delete database records and web pages without authorization. In some cases, an attacker may also use it to penetrate the system and execute operating system commands.

The key characteristics of backdoors are:

  1. Orphaned web pages
  2. Leftover debug code
  3. Invisible Parameters
  4. Unnecessary web pages
  5. Usage of DDL statements
  6. Usage of DELETE/UPDATE

Techniques to detect backdoors through code review

Let's see how to look for backdoors using each of the characteristics mentioned above.

Orphaned web pages

Look for web pages that are not linked or called from any other web page; these were probably used for testing and never removed. Such pages can be detected by analyzing page header directives and checking whether any other page ever calls them.

The task can be made easier by writing a Perl script that searches throughout the application for pages that are never linked from any other web page. Another approach is a simple string search: a Perl script that looks for a particular web page name, say test.aspx, throughout the application directory and prints every line of code that contains it. This method still requires manual analysis of the source code.
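As a rough illustration of the first approach, here is a minimal Perl sketch assuming an ASP.NET layout: it collects every .aspx file under a directory and flags pages whose names never appear in any other source file. The extension list and the "a reference is any occurrence of the file name" heuristic are assumptions to adapt to the codebase under review.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    my $appdir = shift // '.';
    my (%pages, %referenced);

    # Pass 1: collect the name of every .aspx page in the application.
    find(sub { $pages{$_} = $File::Find::name if /\.aspx$/i }, $appdir);

    # Pass 2: scan every source file for references to those page names.
    find(sub {
        return unless -f && /\.(aspx|ascx|cs|vb|master|config)$/i;
        my $current = $_;
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            for my $page (keys %pages) {
                # A reference from a *different* file means the page is linked.
                $referenced{$page} = 1
                    if $current ne $page && index($line, $page) >= 0;
            }
        }
        close $fh;
    }, $appdir);

    # Anything never referenced elsewhere is a candidate orphan.
    for my $page (sort keys %pages) {
        print "Possible orphan: $pages{$page}\n" unless $referenced{$page};
    }

Every hit still needs manual review: a page reached only through server-side redirects or configuration entries may look orphaned to a plain text search.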

Leftover debug code

Look for all web pages where a session object is assigned a value from user input. Session object variables hold information about a single user and are available to all web pages across the application. So, if a session object is assigned a value on one page, the same object can be used for making a decision, or building a SQL query, on another page.

Let's say a session object was used to test the role-based access feature of an application; the developer later decides to use classic ASP-style coding instead and forgets to delete the test code. An attacker who notices this can change the session object's value to gain higher-privileged access to the application, resulting in authorization bypass or privilege escalation. In short: if session objects are assigned values from user input and then used in authorization logic, it's a vulnerability.
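This pattern can be hunted mechanically. The following Perl sketch (same caveats as above about file extensions) flags lines where a Session variable is assigned directly from the Request object; the regex is deliberately loose, so expect to triage the output by hand.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    # Flag assignments of user input into session state, e.g.
    #   Session["role"] = Request.QueryString["role"];   (C# code-behind)
    #   Session("role") = Request.Form("role")           (VB / classic ASP)
    find(sub {
        return unless -f && /\.(aspx|ascx|asax|cs|vb|asp)$/i;
        my $file = $File::Find::name;
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            print "$file:$.: $line"
                if $line =~ /Session\s*[\[(].+?[\])]\s*=(?!=)\s*.*\bRequest\b/i;
        }
        close $fh;
    }, shift // '.');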

Invisible Parameters

Identify all GET or POST parameters parsed by each web page's server-side code. Then look for parameters that never appear in any client-side form or link: the server processes them, yet they are invisible to a legitimate user of the application.

The task can be simplified by writing a Perl script that extracts, into two lists, the parameters referenced in server-side code and the parameters present in client-side forms and links, then compares the two to find parameters that appear only in server-side code.
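A minimal sketch of that comparison, assuming ASP.NET-style Request accessors and plain HTML form fields; both regexes are rough heuristics and the file extensions are assumptions:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    my $appdir = shift // '.';
    my (%server, %client);

    find(sub {
        return unless -f && /\.(aspx|ascx|cs|vb|html?)$/i;
        my $file = $File::Find::name;
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            # Parameters read on the server: Request["x"], Request.Form("x"), ...
            while ($line =~ /\bRequest(?:\.Form|\.QueryString|\.Params)?\s*[\[(]\s*["']([^"']+)["']/gi) {
                $server{lc $1} ||= "$file:$.";
            }
            # Parameters visible on the client: form fields and query strings.
            while ($line =~ /\bname\s*=\s*["']([^"']+)["']/gi) { $client{lc $1} = 1 }
            while ($line =~ /[?&](\w+)=/g)                     { $client{lc $1} = 1 }
        }
        close $fh;
    }, $appdir);

    # A parameter the server reads but no form or link ever sends is
    # "invisible" -- a candidate backdoor switch worth reviewing by hand.
    for my $p (sort keys %server) {
        print "Invisible parameter '$p' first read at $server{$p}\n"
            unless $client{$p};
    }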

Unnecessary web pages

Look for web pages that do not belong in the application's current working directory. Pages meant for other application modules sometimes get placed into the application folder and are called only from those other modules.
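One way this could be automated, assuming the module under review lives in its own subdirectory of a larger codebase checkout, is to flag pages that ship with the module but are referenced only from outside it. A sketch:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;
    use File::Spec;

    # Usage: misplaced.pl <module-dir> <codebase-root>
    die "usage: $0 <module-dir> <codebase-root>\n" unless @ARGV == 2;
    my ($module, $root) = @ARGV;
    my $modabs  = File::Spec->rel2abs($module);
    my $rootabs = File::Spec->rel2abs($root);

    my (%pages, %in_module, %elsewhere);

    # Pages deployed with the module under review.
    find(sub { $pages{$_} = 1 if /\.aspx$/i }, $modabs);

    # Scan the whole codebase and note where each page is referenced from.
    find(sub {
        return unless -f && /\.(aspx|ascx|cs|vb|master)$/i;
        my $dir     = $File::Find::dir;   # absolute, since $rootabs is
        my $current = $_;
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            for my $page (keys %pages) {
                next if $current eq $page || index($line, $page) < 0;
                if (index("$dir/", "$modabs/") == 0) { $in_module{$page} = 1 }
                else                                 { $elsewhere{$page} = 1 }
            }
        }
        close $fh;
    }, $rootabs);

    # A page the module itself never uses, but other modules call, is
    # probably misplaced and should be moved or re-reviewed in context.
    for my $page (sort keys %pages) {
        print "Possibly misplaced: $page is only referenced outside $module\n"
            if $elsewhere{$page} && !$in_module{$page};
    }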

Usage of DDL statements

Search all web pages for DDL statements such as DROP, ALTER or CREATE. These operations should not be issued from code-behind; they should be handled through stored procedures instead.
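A quick first pass is to grep for DDL keywords inside string literals in page code. Here is a sketch; the keyword list, statement targets and file extensions are all assumptions to tune for the application at hand.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    # Flag string literals that build DDL statements in page code.
    # Legitimate schema changes belong in stored procedures or migration
    # scripts, not in code-behind.
    find(sub {
        return unless -f && /\.(aspx|ascx|cs|vb)$/i;
        my $file = $File::Find::name;
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            print "$file:$.: $line"
                if $line =~ /"[^"]*\b(DROP|ALTER|CREATE|TRUNCATE)\s+(TABLE|DATABASE|PROC(?:EDURE)?|VIEW|INDEX)\b/i;
        }
        close $fh;
    }, shift // '.');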

Usage of DELETE/UPDATE

In all web pages, look for DELETE and UPDATE statements that have no WHERE clause, or whose WHERE conditions always evaluate to true (for example, WHERE 1=1).
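The same grep-style approach works here. The sketch below pulls DELETE/UPDATE statements out of double-quoted string literals and flags missing WHERE clauses and the classic WHERE 1=1 tautology; SQL assembled across several lines or with other quoting styles will slip past it, so treat it as a starting point only.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    find(sub {
        return unless -f && /\.(aspx|ascx|cs|vb)$/i;
        my $file = $File::Find::name;
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            # Pull SQL out of double-quoted string literals on this line.
            while ($line =~ /"([^"]*\b(?:DELETE\s+FROM|UPDATE)\b[^"]*)"/gi) {
                my $sql = $1;
                print "$file:$.: $sql\n"
                    if $sql !~ /\bWHERE\b/i               # no WHERE at all
                    || $sql =~ /\bWHERE\s+1\s*=\s*1\b/i;  # always true
            }
        }
        close $fh;
    }, shift // '.');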

Best Practices

Here are some best practices that developers should keep in mind while building an application.

  1. Identify and remove all web pages that are not linked from any other application web page
  2. Identify and remove GET/POST parameters that are not used by the application
  3. Segregate web pages by module; it is best to host critical application modules on separate servers
  4. Do not assign values from user input to global variables such as session objects
  5. Always use stored procedures for DDL operations