Friday, December 14, 2007

Targeted attacks

Cyber crooks are beginning to take more time preparing attacks and doing thorough research, in hopes of pulling off social engineering attacks that will net them valuable data. Social engineering has become one of the most popular attack methods in recent years. Phishing attacks trick users into giving up their login credentials to sites such as PayPal, eBay, online banking and many others.

Recently a targeted attack was launched at Oak Ridge National Laboratory (ORNL) where crooks tricked employees into opening an attachment that appeared to be official.

"Oak Ridge National Laboratory (ORNL) recently experienced a sophisticated cyber attack that appears to be part of a coordinated attempt to gain access to computer networks at numerous laboratories and other institutions across the country. A hacker illegally gained access to ORNL computers by sending staff e-mails that appeared to be official legitimate communications. When the employees opened the attachment or accessed an embedded link, the hacker planted a program on the employees' computers that enabled the hacker to copy and retrieve information. The original e-mail and first potential corruption occurred on October 29, 2007. We have reason to believe that data was stolen from a database used for visitors to the Laboratory."

Attackers will gather as much information about an organization as they can and then craft the attack appropriately. If they are targeting a life insurance company, they know there is data somewhere that will allow them to commit identity theft; they just have to figure out how to get it. They will likely try to find out who the key people in the organization are and what their roles in the company are. If the attack is carried out by organized crime, they will likely have a lot of resources at their disposal. I would guess they would have security experts from the dark side on their team who have been searching for, and possibly finding, unknown vulnerabilities that can be exploited.

US-CERT just released an advisory about attacks involving exploitation of Microsoft Access Database files.

US-CERT is aware of a stack buffer overflow vulnerability in the way that Microsoft Access handles specially crafted database files. Opening a specially crafted Microsoft Access Database (e.g., .MDB) can cause arbitrary code execution without requiring any additional user interaction. Microsoft Access files are considered to be high-risk, so it may be possible to execute arbitrary code without using a vulnerability in Microsoft Access.

There isn't a lot of information in this advisory, and it isn't known whether this is an exploit for a previously unknown vulnerability. If it is, anyone with exploit code for it has a perfect tool to break into a network. It is unlikely that anti-virus or spam filters would detect this file if sent as an attachment, though some organizations will block this kind of attachment outright.

One effective way to deliver the goods would be to find an XSS (Cross Site Scripting) vulnerability on the target's own website and use it to host an evil link to the file, so the file appears to be located right on the organization's own website. An email can then be sent to the appropriate people (CEO, Finance Director, System Administrator, etc.) with the link and other text making it look legitimate. They may even include the company logo and an official-looking signature.

If one of the recipients opens the file (which currently isn't detected by AntiVirus), the exploit would run and possibly compromise the user's computer. Most of the time an exploit will download other malware from an external site and install it on the system. If the user is logged in with administrator rights, a rootkit could be installed to hide all the files planted on the system. So even if AntiVirus later learns how to detect the malware, the rootkit will hide it from the AntiVirus software. This could go undetected long enough for the attackers to capture keystrokes from the unsuspecting user. Once they get the login information they need, they can log in to other applications where the data may be stored. If the end user is a DBA or someone else with direct access to the sensitive data, it is pretty much game over.

You will likely hear me often mention one vulnerability (actually, it was a feature) that existed in every single version of Windows back to version 3.0 in 1990: the Windows Metafile vulnerability, which was made public in December 2005. This 'feature' would allow anyone with proper exploit code to gain complete control over the system. For 15 years this existed! We can only speculate about how many people found and exploited that vulnerability over the years. Microsoft eventually released a fix to address the problem, but they dragged their feet a little. SANS published a diary about a zero-day WMF exploit that caused the Internet to turn yellow (at least the SANS ISC Infocon color turned yellow). For Microsoft Windows users the Internet was just plain unsafe. There was no official patch and no workarounds initially, so everyone was vulnerable. With exploit code freely available, all you had to do was generate an evil WMF file, rename it to something.jpg (Windows spots the 'incorrect' filename and fixes it for you! doh!) and then embed it in your most trusted forum or your MySpace page and infect anyone who simply viewed the image.

It is very hard to protect yourself from targeted attacks that exploit unknown vulnerabilities. Those who watch their networks very closely are more likely to spot such an attack and act before much damage is done. An elite system administrator will have various things in place to watch for suspicious activity: intrusion detection systems, syslog servers, file monitoring software such as Tripwire, and host-based security suites that protect against viruses and spyware and include things like anomaly detection, web surfing protection and buffer overflow protection. The old days of just having AntiVirus software and a firewall are long gone. You really need as many layers of security and protection in place as possible, and then you still need to be very careful. And look at your logs often!

Saturday, December 8, 2007

Is illegal file sharing on college campuses really as high as they say it is?

Again I need to remind everyone the content of this blog is solely my own and does not represent the University of Pennsylvania. I am not authorized to speak on behalf of the University of Pennsylvania and will not do so. I am, however, an employee of a University and am personally concerned about my university as well as other universities out there. I'm hoping this blog may stir up a little discussion on what the real numbers are when it comes to students illegally sharing files.

I also want to state that I am not for the illegal sharing of files. I am absolutely against it. I just want to make sure that the numbers presented in the media are fair numbers. I have a feeling they aren't fair at all.

I'm sure we all remember when emails were leaked from a company called MediaDefender. The news spread fast, and a lot of people started looking to see what MediaDefender was up to. MediaDefender claims to offer an anti-piracy solution that stops the trade of illegally copied copyrighted material. A site called Mediadefender-defenders received a copy of the leaked emails and published them online for the whole world to see. In those emails, a few threads show that one of their customers wanted to know how many EDU (education) addresses were showing up on the Gnutella network. MediaDefender generated some reports from the data they had and provided a few updates to their customer.

The data contained in the section below can be found by using the following Google search: "edu ips" intitle:edu

When you see the numbers published in the media about the huge number of college students sharing illegal files and then look at the numbers below, there just seems to be a huge gap. I realize some of the EDU IP addresses may sit behind NAT (Network Address Translation), which enables multiple hosts on a private network to access the Internet using a single public IP address. It is safe to say the real numbers are probably a bit higher than the data shows, but I wouldn't imagine they are significantly higher. I don't have access to data that would show this, however.

In the data below, the first column is the date on which the numbers were gathered by MediaDefender. The second column (Uniq IPs) shows the count of unique IP addresses they saw on the Gnutella network. The third column (Uniq EDU IPs) shows the number of unique addresses that resolved to an EDU domain. The fourth column (% EDU IPs (MediaDefender)) is the percentage of EDU IP addresses on the Gnutella network as MediaDefender reported it. The fifth column (Actual % EDU IPs) is the percentage I calculated myself from their data.

To make a long story short, MediaDefender's data shows that the average percentage of EDU IP addresses found to be on the Gnutella network during the time they sampled the data is 1.76%. That number alone seems to be fairly low.

Date       Uniq IPs   Uniq EDU IPs   % EDU IPs (MediaDefender)   Actual % EDU IPs
02/01/07    342854        8398               2.40%                    2.45%
04/12/07    291001        7175               2.50%                    2.47%
06/14/07    265504        2475               0.93%                    0.93%
07/14/07    199333        1303               0.65%                    0.65%
Total      1098692       19351
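The "Actual % EDU IPs" column is straightforward arithmetic on MediaDefender's own counts, and the 1.76% overall average falls out of the totals. A quick sketch to reproduce it:

```python
# Reproduce the "Actual % EDU IPs" column from MediaDefender's raw counts.
samples = [
    ("02/01/07", 342854, 8398),
    ("04/12/07", 291001, 7175),
    ("06/14/07", 265504, 2475),
    ("07/14/07", 199333, 1303),
]

total_ips = sum(total for _, total, _ in samples)
total_edu = sum(edu for _, _, edu in samples)

for date, total, edu in samples:
    print(f"{date}  {edu / total:.2%}")

# Overall average across all sampled IPs:
print(f"Average: {total_edu / total_ips:.2%}")  # 1.76%
```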

Now we at least have some numbers. One thing to think about, however, is that the numbers above do not necessarily reflect the number of people sharing files illegally; they just show what MediaDefender saw on the Gnutella network. I believe the most popular application for this network is LimeWire. I was doing a bit of research on LimeWire a few months back and was trying to find a file to download that was at least 4 MB in size. Of course, I didn't want to download anything that would be considered illegal, so I kept searching and searching and searching. After several hours I just could not find any content over 4 MB that I felt comfortable downloading, and I gave up for lack of time to spend on it. So I guess I can raise my right eyebrow and give a "what are you doing on that network" look. Then again, I was there researching, so maybe others are as well.

There are other networks out there as well such as BitTorrent, Ares, FastTrack and many more. Some networks are used by various software vendors for pushing out a lot of legitimate content such as Open Source applications and operating systems as well as software patches.

"The MPAA estimates that about 44% of the movie industry's domestic losses to piracy -- over $500 million annually -- are attributed to college students illegally sharing files over peer-to-peer networks. "

Another article from Alternet says "But sticking with the MPAA's semi-bogus numbers, educational technology nonprofit Educause points out that "since less than 20 percent of college students live on campus and use the residence hall networks, this means that less than 4 percent of the infringers are using campus networks, and they are responsible for less than 9 percent of the losses. Over 91 percent of the claimed losses are on commercial networks." Get that: 4 out of every 100 infringers (even trusting the industry assessment of infringement, which usually is not too carefully defined) are on college networks"

Educause has a handy list of issues, links, and an action page here.

I don't have the technical skills or the resources to really dive in and get actual numbers. If I did, you can bet I would be diving in with full force. If some of you out there would like to start a project to find out the real numbers of student P2P users, it would be really valuable for the EDU community!

Friday, December 7, 2007

MPAA updates Universitytoolkit site

Now the site looks a bit more professional and provides basic information about the toolkit and its motives right on the front page. The actual toolkit remains unavailable at this time, but they say to check back soon. I imagine they still need to make sure they are complying with all the licensing requirements.

They do have a few problems, however. Even though they appear to be addressing some of the earlier issues that were brought up in this article, they are still being a little sloppy. They should seriously consider putting an 'under construction' page on the main site until they really go through everything with a fine-toothed comb. It is as if we are looking at the development site while it is still being worked on. You should really only use development servers to host sites that are not ready to be published into production.

For instance, some of their documentation is a bit contradictory.

In the two-page summary and the admin guide they have these bullets listed. The first line of each pair is from the summary document and the second line is from the admin document.

1) The University Toolkit is a free software application to analyze the traffic on campus local networks
2) The University Toolkit is a free software application designed to analyze the traffic on campus local networks

1) Creates a simple graphical report on the extent of file sharing occurring within the campus network
2) Creates a simple graphical report on the extent of file sharing occurring within the campus network

1) The University toolkit does not identify infringements
2) The University toolkit does not identify copyright infringements

1) No privacy issues - the content of traffic is never examined or displayed
2) No privacy issues - the content of traffic is never displayed

1) It does not communicate results to the MPA
2) It does not communicate results to the MPA

1) It is offered for free to all universities on CD and as a download on
2) It is offered for free to all universities on CD and as a download on

1) Requires minimal effort from IT staff.
2) Requires minimal effort from IT staff.

1) Access to NTop and Snort data for detailed analysis.
2) Provides access to NTop and Snort data for detailed analysis.

Looking at #4, about the privacy issues: they seem to have carried the old wording over into the two-page summary. That statement is still not correct. Snort does in fact examine the contents of traffic. They worded it properly in the admin document.

1) No privacy issues - the content of traffic is never examined or displayed
2) No privacy issues - the content of traffic is never displayed

I think it would really help them to bring in a project management team to help finish this. If the MPAA wants universities to seriously consider using this tool, they need to be serious themselves and do it right. Personally, I would like to see more useful links, including background on why this toolkit was created, a detailed blurb about what they hope will come from this endeavor, and what the real expectations are when sysadmins look at a report from the toolkit. Does the MPAA expect IT staff to investigate? What steps will be needed to make this tool useful in curbing illegal file sharing? A FAQ would be nice as well.

They also need to look at how the rest of us Internet users handle beta software. Beta means you have an application that is almost ready for production but still needs testing to work out all of the bugs. They should let people sign up for testing and give them the ability to provide feedback directly to the MPAA, or possibly through a discussion forum.

They are still not there yet, but this can be remedied. I think they did a lot of damage to themselves when they hastily slung this toolkit onto the web without really looking at it. Their reputation in software development is now somewhat tarnished.

Thursday, December 6, 2007

Apple leaves Windows and Mac systems vulnerable

Brian Krebs wrote about an unpatched vulnerability in QuickTime being exploited on November 27th, 2007. That was nine days ago, and Apple hasn't even acknowledged it yet. Hackers have had plenty of time to work on exploits for this vulnerability, and now there are two "Universal Exploits" at milw0rm that work on both Windows and Mac systems.

Personally, this is beginning to remind me of the huge WMF vulnerability Microsoft had just about two years ago. For the first time, I felt it was pretty much unsafe to surf the web. Microsoft was dragging their feet on releasing a patch, and at that time I decided to get a Mac, which I named MS06-001. I was pretty upset at Microsoft around that time for leaving everyone vulnerable for so long. Now here is Apple doing pretty much the same thing, but unlike Microsoft, they don't even acknowledge the problem. At least Microsoft tells you that you are pretty much screwed for now.

The SANS Internet Storm Center published a diary that includes some workarounds. These workarounds are not easy for your average user, since some technical computer knowledge is needed. QuickTime is so embedded in today's web experience that it is difficult to just uninstall the software (which is what I did, by the way). How does your average user even know they have a problem? And then there are logistics to think about when trying to deploy these workarounds in a large network environment. I am thinking of our University, which has well over 35,000 systems in a fairly decentralized setup. On one hand I want to recommend the workarounds, yet on the other hand I don't. Hearing at least something from Apple would make the decision easier. If a patch is coming soon, then maybe it is worth the risk to wait. If a patch is not coming soon, then maybe it is worth recommending the workarounds.

There are Apple users who are beginning to get fed up with Apple's lack of response as well. They are virtually begging Apple for a fix and getting no response whatsoever. Discussions at MacRumors show people are worried about it but seem to be somewhat in denial, or just not getting it. Numerous posts state that it will only affect Windows users and that Mac users have nothing to worry about. Unfortunately, this seems to be the default attitude of a lot of Mac users. The universal exploit was released in late November, yet there isn't a single mention of it in this thread.

This is a serious problem. A lot of what would be considered trusted sites allow users to embed content. It is really hard for the average user to protect themselves in this situation.

Tuesday, December 4, 2007

MPAA accused of violating Ubuntu's GPL

It appears that Matthew Garrett from the Ubuntu development team has had the MPAA toolkit removed. Matthew posted a very short journal entry showing the before and after snapshot of the toolkit's home page.

"MPAA don't fuck with my shit.

(And yes, I did attempt to contact them by email and phone before resorting to the more obnoxious behaviour of contacting the ISP. No reply to my email, and the series of friendly receptionists I got bounced between had no idea who would be responsible but promised me someone would call back. No joy there, either.)"

If you click the "context" link it will take you to the Washington Post article that Brian Krebs and I worked on.

Then Slashdot picked it up last night, and there are already 188 comments from readers.

"Ubuntu developer Matthew Garrett has succeeded in getting the MPAA to remove their 'University Toolkit' after claims it violated the GNU GPL. After several unsuccessful attempts to contact the MPAA directly, Garrett eventually emailed the group's ISP and the violating software was taken down."

Stay tuned....

Thursday, November 22, 2007

MPAA University Toolkit/Backdoor

The Motion Picture Association of America is sending letters to universities in hopes of getting them to install what they call the MPAA University Toolkit. You can find a posting about this on the Educause Security list. I don't have a copy of the letter, so I am not sure exactly what it says, but the fact remains that they are asking universities to install a toolkit that is currently a beta version. Beta software is not suitable for a production environment.

They are hoping that universities will install this toolkit to cut down on the amount of illegal file sharing. However, the tool seems to introduce some severe problems: it undermines the privacy of users and gives direct access (unauthenticated and totally anonymous) to the logs of all network traffic, accessible from any remote system on the Internet. It also appears that they are providing false information about exactly what this toolkit does.

I went ahead and downloaded this toolkit and performed some rudimentary tests. Below are the initial findings.

Xubuntu loads up and a terminal opens up. This lets you set your networking information and a 'session password'. You can then enter a name to describe the sensor.

You get a message that peerwatch is running and two URLs to manage the device.



During setup, the sensor checks whether there is a new version of the University Toolkit. In my case I had downloaded it just before installing it, yet it still reported that a new version was available. The current version at the time of testing is 1.2-RC5. The toolkit phones home to the site and checks a file called version.txt. The last-modified date of that file at this time was 10-12-2007, and the text of the file reads 1.2-RC3. So maybe they are just absent-minded about keeping the versions updated properly.
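The update check amounts to fetching a plain-text version string and comparing it against the local one. A minimal sketch of that logic; the toolkit's real update URL is not shown here, so VERSION_URL is a hypothetical placeholder, and the parsing scheme is an assumption based on the "1.2-RC3"/"1.2-RC5" strings observed:

```python
# Sketch of a version.txt-style update check. VERSION_URL is a hypothetical
# placeholder; the real toolkit's URL is not reproduced here.
from urllib.request import urlopen

VERSION_URL = "http://example.org/universitytoolkit/version.txt"  # hypothetical
LOCAL_VERSION = "1.2-RC5"

def fetch_remote_version(url=VERSION_URL):
    """Download and strip the advertised version string."""
    return urlopen(url, timeout=5).read().decode().strip()

def parse_version(v):
    """Turn '1.2-RC3' into a comparable tuple: (1, 2, 3)."""
    base, _, rc = v.partition("-RC")
    return tuple(int(x) for x in base.split(".")) + (int(rc or 0),)

def update_available(remote, local=LOCAL_VERSION):
    return parse_version(remote) > parse_version(local)

# With the server still advertising 1.2-RC3, a 1.2-RC5 sensor should see
# no update: update_available("1.2-RC3") is False.
```

Whatever the actual comparison is, the side effect is the same: every check-in hands the sensor's IP address to whoever runs the update server.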

I set up my home router to forward all unsolicited traffic from the Internet to my new MPAA University Toolkit. This means that any incoming traffic to my home router will go to the toolkit. I am testing the toolkit from a remote location about 17 miles from the sensor, with no VPN.

My device is now actively monitoring the network. Before we go much further let us take a look at the 2-page summary of the toolkit located at

The bullet points in the summary with my notes enclosed in []:

- The University Toolkit is a free software application to analyze the traffic on campus local networks
[It didn't cost me a thing to download.]

- Creates a simple graphical report on the extent of file sharing occurring within the campus network
[not just file sharing!]

- The University toolkit does not identify infringements
[This is true]

- No privacy issues - the content of traffic is never examined or displayed
[This is not true. There are a lot of privacy issues with this and I'll show you why later]

- It does not communicate results to the MPA
[This will take some time to verify. The sensor will check in for an update for a newer version. Doesn't that mean the MPAA now has the IP address of the toolkit sensor?! More on this later as well!]

- It is offered for free to all universities on CD and as a download on
[This is true]

- Requires minimal effort from IT staff.
[This isn't entirely true. The work that has to be done for this to be effective (for their purposes) is great.]

- Access to NTop and Snort data for detailed analysis.
[From a web based console that has no authentication and lets you view it from anywhere in the world. Yes, this is true!]

Whitney Houston, we have a problem.....

First, I have to admit I haven't done full packet captures or analyzed what data is written to disk. It would be much appreciated if someone out there has the time to do this. But just on the face of it, I can tell you there seem to be the makings of a great conspiracy theory. If any universities have installed this toolkit, you had better read further.

First, let us talk about the claim of 'no privacy issues - the content of traffic is never examined or displayed':

Nov xx xx:xx:xx ubuntu kernel: [ 223.078052] device eth0 entered promiscuous mode

The network card went into promiscuous mode, which makes it pass all traffic it receives to the CPU rather than just the packets addressed to it. This also allows the software to inspect the contents of those packets. There is more on this later in this post.

Another privacy issue: even though they ask you for a 'session password', the web application used to 'view reports' and 'view ntop' is not password protected. I was able to view what websites were visited, what email servers were communicated with, what remote desktop connections there were, and on and on, without supplying any credentials at all. This means that anyone (including the MPAA) could access this system remotely, view the console, and see all traffic from any host this sensor can see. (Don't forget, this system checks for updates, giving up the IP address of the sensor to the MPAA.) It also appears that logging for Apache2 is turned off, so you won't have any record of what was accessed in the web application.

Here is a screenshot of what I could see remotely without supplying any authentication whatsoever.

As you can see, it shows traffic to pretty much every host my home computer is communicating with. On a student ResNet network this would be on a much larger scale.

They did customize some firewall rules to specifically allow remote access to the MPAA Toolkit. If they didn't want it to be available remotely, I would think they would have either added a function in the setup that let you choose your own rules or password-protected the entire application.

#Allow NTOP and PEERWATCH connections

iptables -A INPUT -p tcp --dport 8180 -j ACCEPT
iptables -A OUTPUT -p tcp --sport 8180 -j ACCEPT
iptables -A INPUT -p tcp --dport 3000 -j ACCEPT
iptables -A OUTPUT -p tcp --sport 3000 -j ACCEPT

Port 8180 is the main MPAA Toolkit home page, which lets you see the "Top 10 File Transfers" and the "Top 10 Alerts" triggered by the Snort rules. There is also a link to an Advanced page that takes you to the NTOP application.

Port 3000 is the NTOP application, which gathers information about network traffic. NTOP will let you see ALL traffic on the network, not just the P2P-related traffic. If someone visits their bank, it will show up in this log, along with the IP address and MAC address of the host that visited the banking site. Now let's put on our 'conspiracy theory' hat. The MPAA could, in theory, get data about an IP address that is file sharing. Using their database of sensors that 'phoned home', they could directly access the sensor and get real-time traffic stats for the host they are investigating, including its MAC address. A MAC address is unique to a computer and by default doesn't change. A university will match the IP address and timestamp to a MAC address, then search their database to find the user. Just let that sink in a bit. To add to this conspiracy, let's think of some examples of what this tool could be used for. I'll put on an evil MPAA hat for a minute.

What if I log into my MPAA networked desktop computer and decide that today I want to look at the sensor on the University of Tinfoil Hats' network? I go there, load up the main application, and get a list of IP addresses that appear to be on the Kazaa network. I grab the MAC address and all the information I can get. Then I send that information to the packet henchmen (MediaDefender, etc.) and tell them to find this IP address and generate data for a DMCA notice. Now the MPAA has way more information than they are supposed to have: information that would normally require a subpoena. They can drill down and find out where that IP address is going. MySpace, The Pirate Bay, Grandma's website, whatever. If a person on the network has a personal website, they might even be able to get that user's name and contact information. And to add more weirdness to this scenario, Apache web server logging is disabled, so you will not know what IP address accessed the application or what pages were viewed.

So let's get to the claim that "the content of traffic is never examined or displayed":

They are using Snort, a free intrusion detection system that can view packet headers and packet contents and compare them against a list of predefined signatures to determine whether specific patterns exist in the packet. Normally this is used to detect attacks on the network, and it is also used to enforce company policies.

The MPAA included the following Snort rules in the distribution and listed them in the Snort configuration file:

include $RULE_PATH/bleeding-p2p.rules
include $RULE_PATH/local-ftp.rules
include $RULE_PATH/local-http.rules
include $RULE_PATH/local-smb.rules
include $RULE_PATH/p2p.rules

So it looks like they weren't telling the whole truth about the viewing of traffic content. This tool is in fact looking at that traffic, inside the packets (the content).

Example from local-ftp.rules:
alert tcp any 20 -> any any (msg: "FTP Download - MPEG Movie File - B2"; \
content: "00 00 01 B2"; depth: 6; rawbytes; \
sid: 1000501; rev: 1; \

Example from local-http.rules:
alert tcp any 80 -> any any (msg: "HTTP Download > 100M - MPEG Movie File - B2"; \
flow: established,from_server; \
content: "HTTP"; depth: 5; nocase; \
content: "200"; within: 8; \
content: "Content-Length\: "; within: 300; nocase; \
pcre: "/^[0-9]{9,}\r\n/R"; \
content: "0d 0a 0d 0a"; within: 100; \
content: "00 00 01 B2"; within: 6; \
sid: 1000101; rev: 1; \
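The pcre clause in that HTTP rule is what makes it a ">100M" rule: `^[0-9]{9,}` requires a Content-Length value of at least nine digits, i.e. 100,000,000 bytes or more. A standalone illustration of that size test against a raw response header (the function name is mine, not the toolkit's):

```python
# The rule's pcre clause matches a Content-Length of nine or more digits,
# i.e. >= 100,000,000 bytes (roughly 100 MB). Standalone check of that logic:
import re

LENGTH_RE = re.compile(rb"Content-Length: (\d{9,})\r\n", re.IGNORECASE)

def looks_like_large_download(header_bytes):
    """True if the HTTP header advertises a body of 100 MB or more."""
    return LENGTH_RE.search(header_bytes) is not None

small = b"HTTP/1.1 200 OK\r\nContent-Length: 734003\r\n\r\n"
large = b"HTTP/1.1 200 OK\r\nContent-Length: 734003200\r\n\r\n"
print(looks_like_large_download(small))  # False
print(looks_like_large_download(large))  # True
```

Note that even this "size only" test requires reading the HTTP headers out of the packet payload, which is content inspection.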

Example from their local-smb.rules:
alert tcp any 445 -> any any (msg: "SMB-445 Download > 100M - MPEG Movie File - B2"; \
flow: established,from_server; \
content: "SMB2E"; offset: 5; within: 4; nocase; \
content: "00 00 01 B2"; distance: 54; within: 4; \
sid: 1000301; rev: 1; \

The rules above do inspect the contents of the packet.
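All three rules key on the byte sequence 00 00 01 B2, an MPEG start code, inside the packet payload. A minimal sketch of the same kind of payload inspection, mimicking Snort's content/depth matching (the function is my illustration, not the toolkit's code):

```python
# The FTP, HTTP and SMB rules all look for the bytes 00 00 01 B2 (an MPEG
# start code) in the payload. A minimal imitation of Snort's content/depth
# matching:
MPEG_MARKER = bytes.fromhex("000001B2")

def payload_matches(payload, depth=None):
    """Look for the marker within the first `depth` bytes of the payload
    (anywhere in the payload if depth is None), like Snort's content/depth."""
    window = payload if depth is None else payload[:depth]
    return MPEG_MARKER in window

# The FTP rule uses depth: 6, so the marker must appear within the first
# 6 bytes of the payload:
print(payload_matches(b"\x00\x00\x01\xb2rest-of-stream", depth=6))  # True
print(payload_matches(b"padding\x00\x00\x01\xb2", depth=6))         # False
```

However you dress it up, matching byte patterns inside payloads is examining the content of traffic, which is exactly what the summary document says never happens.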

Example of their p2p.rules:
alert tcp $HOME_NET any -> $EXTERNAL_NET 8888 (msg:"P2P napster login"; flow:to_server,established; content:"00 02 00"; depth:3; offset:1; classtype:policy-violation; sid:549; rev:8;)

Isn't Napster totally legit now? Why would they include a rule to detect a Napster login?

There seem to be some false positives so far in initial testing. It showed Kazaa traffic coming from my computer, but I don't have Kazaa installed. In reality, it is fairly hard to reliably detect P2P traffic. A lot of P2P applications allow changing the port numbers used, as well as encrypting the traffic. Add the popularity of proxies such as Tor, and it gets even more difficult.

I think the MPAA jumped the gun a bit when they began sending out letters asking folks to install this toolkit. There are a lot of changes that need to be made before a tool like this could be installed in a production environment. And of course, anyone responsible for running a network should always look closely at devices and applications before putting them into production.