I apologize for not blogging more lately, but the Rugby World Cup has been on TV for the past two months and, being a huge rugby fan, I had to dedicate a portion of my life to it. Now that it’s over (Go Springboks!), I’m back 🙂
Here is the list:
Air Force Ready to Drop Cyber Bombs – You had to know it was coming.
“In the wake of several Chinese probes into the Defense Department’s non-classified computer and communications network, known as the NIPRNET, as well as German and British defense networks, the Air Force has made it clear it feels that, to fight effectively in cyberspace, a military must be on the offensive.”
Reading List – October 10, 2007 – Good collection of posts to check out.
A few things on my reading list for today … The first is a three-part series on crimeware (malware specifically designed to yield money for its operators through direct financial theft) by CSO magazine. It’s an interesting look and shows that the underground economy is just as skilled as the fully legit software economy at adapting to “everything as a service”. The series covers all sorts of links between groups and techniques. Excerpts below …
A collection of educational security incidents as of late:
Financial Information On Thousands E-mailed To Student
Contractor Loses Decade’s Worth of Louisiana Student Financial Aid Data
Stolen Flash Drive Contained Student Data
Student Worker Steals UNCC Student Credit Card Information
MSU Extended University Computer Breached
Open FTP Files Contain Student Information
Cisco closing internal research group? – For all the money Cisco makes, I can’t figure out why they think they can afford not to keep funding this group.
Dark Reading quotes a Cisco spokesman as saying that the CIAG still exists, but the article goes on to say that the group’s research projects were on hold as of Tuesday. Some of the research includes SCADA security research, a honeynet for SCADA systems, Internet DNS scanning, a study of “collateral damage” on network devices from malware attacks, a VoIP threat study, and the Common Vulnerability Scoring System (CVSS), reports Dark Reading.
BlackEnergy DDoS Bot – Analysis Available – Check out the report. It’s quite interesting.
BlackEnergy is an HTTP-based botnet used primarily for DDoS attacks. Unlike most common bots, this bot does not communicate with the botnet master using IRC. Also, we do not see any exploit activities from this bot, unlike a traditional IRC bot. This is a small (under 50KB) binary for the Windows platform that uses a simple grammar to communicate. Most of the botnets we have been tracking (over 30 at present) are located in Malaysian and Russian IP address space and have targeted Russian sites with their DDoS attacks. This report is based on analysis of the distribution package of the BlackEnergy botnet, tracking approximately 30 live and distinct botnets, and disassembly of several samples captured in the wild.
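Since the bot phones home over plain HTTP rather than IRC, its check-ins tend to show up in web or proxy logs as small, regular requests to the same URI. Here’s a minimal Python sketch of hunting for that kind of periodic beaconing; the log format, field layout, and jitter threshold are my own assumptions, not anything from the report:

```python
import re
from collections import defaultdict
from statistics import pstdev

# Assumed log format: "<epoch-ts> <client-ip> <something> <uri> ..." --
# adjust the regex to match your own proxy or web server logs.
LINE = re.compile(r"^(?P<ts>\d+)\s+(?P<src>\S+)\s+\S+\s+(?P<uri>\S+)")

def find_beacons(log_lines, min_hits=10, max_jitter=5.0):
    """Flag (src, uri) pairs whose request intervals are suspiciously regular."""
    times = defaultdict(list)
    for line in log_lines:
        m = LINE.match(line)
        if m:
            times[(m["src"], m["uri"])].append(int(m["ts"]))
    for key, ts in times.items():
        if len(ts) < min_hits:
            continue
        ts.sort()
        gaps = [b - a for a, b in zip(ts, ts[1:])]
        # Near-constant inter-arrival gaps suggest an automated check-in.
        if pstdev(gaps) <= max_jitter:
            yield key, sum(gaps) / len(gaps)

if __name__ == "__main__":
    with open("access.log") as f:  # placeholder file name
        for (src, uri), period in find_beacons(f):
            print(f"{src} polls {uri} every ~{period:.0f}s")
```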
Transient Electromagnetic Devices (TEDs) Can Threaten Our IT Infrastructure – Didn’t they do this in Ocean’s 11? 🙂
Many people recognize the old term electromagnetic pulse, or EMP. The EMP effect was first observed during the early testing of high-altitude airburst nuclear weapons. In the past, producing an EMP generally required a nuclear detonation. Today a destructive EMP can be produced without a nuclear device. The development of Transient Electromagnetic Devices (TEDs) now makes the threat of an EMP attack much more likely.
Web Application Scanning Depth Statistics – Agreed, it’s not easy to find a ‘one size fits all’ approach when evaluating web application scanners.
One of the most difficult aspects of web application security scanners is understanding how to evaluate them. Obviously the false positive and false negative ratios are important, but they are often difficult to measure, as they depend on the web application in question. However, Larry Suto came up with a very interesting concept for making unbiased measurements of web application scanners. One of the most important measurements is understanding how well the spider portion of the scanner works.
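One cheap way to get at that spider measurement: instrument a test application whose full URL space you already know, then diff it against what the scanner actually requested. A rough Python sketch (the input file names are placeholders):

```python
# Compare the URLs a scanner's spider actually requested against the
# known URL space of a test application you control. The two input
# files are placeholders: one URL (or request path) per line.

def load(path):
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

known = load("known_urls.txt")          # every page the test app serves
crawled = load("scanner_requests.txt")  # what the spider actually hit

missed = sorted(known - crawled)
coverage = 100.0 * len(known & crawled) / len(known)

print(f"spider coverage: {coverage:.1f}% of {len(known)} known URLs")
for url in missed:
    print("  never requested:", url)
```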
Forensics: New Options for the Enterprise – Nice tip of the hat to the importance of log analysis and log retention for use in forensic investigations.
Log analysis in particular has long been a thorn in IT’s side. Either you tried hard to forget that terabyte or so of raw log data just sitting there, or you paid through the nose for a security information manager. Now, affordable log analyzers are available from companies like LogLogic that can justify their existence by satisfying provisions of Sarbanes-Oxley and the Payment Card Industry Data Security Standard. Meanwhile, packet-capture products from vendors such as Network Instruments and NetWitness not only enable investigators to do full session reconstruction, they also help the network team diagnose performance problems. Finally, products from Clearwell Systems and Athena Archiver mean IT can handle e-mail analysis in-house. While aimed at e-discovery, these tools will also be invaluable when investigating claims of harassment or other inappropriate behavior involving e-mail communications.
Auditing and Securing Multifunction Devices from the SANS Information Security Reading Room
Honeynet Project’s status report for 2007 – I especially enjoyed the ‘lessons learned’ section. Thanks Anton for pointing this out.
Securing the Gateway to Your Enterprise: Web Services – Great article that you should take a look at if you run an IIS web server.
Eugene Siu, a Senior Security Consultant on the ACE Team, has just published a great article summarizing some of the pitfalls and issues around web services security. You can read the whole article here.
First Line of Defense for Web Applications – Part 1 – Good review of validating input when developing software.
There are lots of security principles one should be aware of while developing software, but at the heart of any secure application there should be a first line of defense – and the mother of all defenses is: Input Validation!
There is so much buzz around how hackers break in and what offensive techniques they use, but at the core it is the mitigation strategy that matters to me and many of my customers. Lack of input validation is one of the _core_ vulnerabilities behind almost all web attacks. If we can get this one thing right, we can save a lot of money down the road. This series of blog posts will talk in detail about input validation strategies for web applications. We will also take a look at some interesting top validation bloopers.
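For the impatient, the gist of the allow-list approach in a minimal Python sketch; the field names and patterns are illustrative, not from the article:

```python
import re

# Allow-list validation: each field declares exactly what it may contain,
# and everything else is rejected. The field names and patterns here are
# illustrative -- tighten them to match your own data model.
RULES = {
    "username": re.compile(r"^[A-Za-z0-9_]{3,20}$"),
    "zip_code": re.compile(r"^\d{5}(-\d{4})?$"),
    "quantity": re.compile(r"^[1-9]\d{0,3}$"),
}

def validate(form: dict) -> dict:
    """Return only known fields that match their patterns; reject the rest."""
    errors = {}
    for field, pattern in RULES.items():
        value = form.get(field, "")
        if not pattern.fullmatch(value):
            errors[field] = repr(value)
    if errors:
        raise ValueError(f"input validation failed: {errors}")
    return {field: form[field] for field in RULES}

# Good input passes; something like "2; DROP TABLE orders" in quantity raises.
print(validate({"username": "alice_1", "zip_code": "90210", "quantity": "2"}))
```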
The DMZ Isn’t Dead…It’s Merely Catatonic – I agree, the DMZ does not provide “defense in depth” but does help isolate systems.
Joel Espenschied over at Computerworld wrote a topical piece today titled “The DMZ’s not dead…whatever the vendors are telling you.” Joel basically suggests that due to poorly written software, complex technology such as Web Services and SOA, and poor operational models, the DMZ provides the requisite layers of defense in depth to provide the security we need.
I’m not so sure I’d suggest that DMZs provide “defense in depth.” I’d suggest they provide segmentation and isolation, but if you look at most DMZ deployments they represent the typical Octopus approach to security: a bunch of single segments isolated by one (or a cluster of) firewalls. It’s the crap surrounding these segments that is appropriately tagged with the DiD moniker.
md5deep Version 2.0 – Hey…cool 🙂
Jesse Kornblum has released version 2.0 of his popular file-hashing application md5deep. The tool now supports Unicode characters in file names when run on the Microsoft Windows platform. md5deep can now also process hash values from hash sets in EnCase format (.hash). Please see the changelog for details and further bug fixes.
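For the curious, the core of what md5deep does (recursive MD5s of a directory tree, printed in the familiar “hash  path” format) fits in a few lines of Python. A rough sketch of `md5deep -r DIR`, not a replacement for the real tool:

```python
import hashlib
import os
import sys

# Rough equivalent of `md5deep -r DIR`: walk a tree and print
# "<md5>  <path>" for every regular file, reading in chunks so
# large files don't exhaust memory.  usage: python md5walk.py DIR

def md5_file(path, chunk=1 << 20):
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

for root, _dirs, files in os.walk(sys.argv[1]):
    for name in files:
        path = os.path.join(root, name)
        try:
            print(f"{md5_file(path)}  {path}")
        except OSError as e:
            print(f"error reading {path}: {e}", file=sys.stderr)
```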
Poll: Which Logs Do You Collect? – Please contribute to the poll…I’m curious what people are collecting as well.
I figured I’d do a poll a week since people really like it. So, my first poll-a-week: Which Logs Do You Collect?
Defining Digital Forensics – Yes…it would be great.
Wouldn’t it be great if we could just look up the term “digital forensics” in the dictionary? Unfortunately, as you and others have found, it is not that easy. Even better, wouldn’t it be great if we could sort out who is really performing digital forensics versus those performing media analysis, software code analysis, and/or network analysis? In the past, most have used other terms such as computer forensics; intrusion forensics; video forensics; audio forensics; and digital and multimedia forensics. It is past time for someone to succinctly coin this term…
[PCI] Compliance Stats Q3 2007 – Interesting results. I wonder how many people care outside of PCI ASVs…and maybe the customers of the non-compliant organizations.
You should check out the newly released compliance statistics for Q3 2007.
98% of Level 1 and 2 merchants confirmed that they do not store prohibited data. Acquirers of Level 1 and 2 merchants that continue to store prohibited data are currently subject to monthly fines.
Intro to Reverse Engineering – Part 2 – Yay, part 2 of the article.
In Part 1, Intro to Reverse Engineering – No Assembly Required, we extended the series of coding articles for non-programmers with an area of high interest in the infosec community. We’re proud to be able to bring you the highly anticipated follow-up complete with screen shots, sample code and applications. This one is long and detailed, so strap yourselves in for some great educational content.
This paper is designed to outline some essential reverse engineering concepts, tools and techniques – primarily, debuggers and using the debugging process to reverse engineer application functions and algorithms. It is assumed you have knowledge of basic assembly and C programming. An understanding of Win32 programming and API calls is also helpful. This tutorial does not necessarily have to be read in order (although that is strongly advised), as some sections do not contain information that directly relates to subsequent sections. However, if you begin skipping around and find that you have trouble understanding a concept, or feel like you missed an explanation, it would be best to go back to previous sections of the tutorial and read them first.
Well I’ve finally lost my cold…and as a reward…I’ve thrown out my back. *shakes fist*
Here is the list:
Virtualization Security Training? – That’s not a half bad idea 🙂
If the industry is having trouble finding IT generalists with training in virtualization security, I can only imagine the dearth of qualified security experts in the hopper. I wonder when the first SANS course in virtualization security will surface?
Common Criteria Web Application Security Scoring (CCWAPSS) Released – Interesting white paper. Has anyone implemented this scoring system internally?
The purpose of the CCWAPSS scoring scale is to share a common evaluation method for web application security assessments/pentests between security auditors and final customers.
This scale does not aim to replace other evaluation standards but suggests a simple way of evaluating the security level of a web application.
The Merits Of Threat Modeling – I suspect that threat modeling exercises would help prevent quite a few design flaws if organizations took the time to hold them.
As a consultant, I have been involved with many-a threat modeling exercise. Oftentimes, they are boring, process-intensive sessions where you stare out the window praying that the meeting ends or that the lunch you ate contained botulism. They are also boring, process-intensive meetings that have more impact on the long-term security of your organization than just about anything else you are likely to do.
WinHex, X-Ways Forensics, X-Ways Investigator 14.4 released – Quite the list of features in this release.
Official release of SQL Power Injector 1.2 – Download Now! – Another tool to try.
SQL Power Injector is a graphical application created in .NET 1.1 that helps the penetration tester inject SQL commands into a web page.
For now it is SQL Server, Oracle and MySQL compliant, but it is possible to use it with any existing DBMS when using the inline injection (Normal mode).
Moreover, this application will gather all the parameters you need to test for SQL injection, whether sent by GET or POST, thus avoiding the need to use several applications or a proxy to intercept the data.
The emphasis for this release is maturity, stability and reliability with secondary goals of usability, documentation and innovation.
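If you want to script a quick first pass before firing up a GUI tool, the underlying idea is simply tampering with each parameter and watching for behavioral differences. A crude Python sketch using the third-party requests library; the target URL, parameters, and payloads are placeholders, and obviously you should only probe systems you are authorized to test:

```python
import requests  # third-party HTTP library

# Crude boolean-blind SQL injection check: append a tautology and a
# contradiction to each GET parameter. If the always-true variant looks
# like the baseline page but the always-false variant does not, the
# parameter is likely injectable. TARGET and PARAMS are placeholders --
# only probe systems you are authorized to test.
TARGET = "http://testsite.example/item.php"
PARAMS = {"id": "1", "cat": "books"}

def fetch(params):
    return requests.get(TARGET, params=params, timeout=10).text

baseline = fetch(PARAMS)

for name, value in PARAMS.items():
    true_body = fetch({**PARAMS, name: value + "' AND '1'='1"})
    false_body = fetch({**PARAMS, name: value + "' AND '1'='2"})
    if true_body == baseline and false_body != baseline:
        print(f"parameter {name!r} looks injectable (boolean-blind)")
```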
Lessons From a Cyberdefense Competition Red Team – Michael posted his insights from his recent ISU Red Team involvement (Part 1, Part 2, Part 3). It sounds like it was a good opportunity.
This weekend Iowa State University held its annual CyberDefense Competition in Ames, Iowa. The event is hosted by students and faculty from the Information Assurance Student Group and the Electrical and Computer Engineering department. In the event, teams of students attempt to deploy and manage various services representative of normal business applications. During the 20 hours the event covers, the teams are scored on their service uptimes as tracked by network monitoring (Nagios) and other neutral teams acting as normal users of the services. In addition, much like the real world, there is another team of students, faculty, and area professionals acting as attackers, intent on owning and bringing down those offered services. The services the teams were required to offer were web services (with pre-packaged web content), mail (smtp and imap), a telnet shell, ftp, wireless access for normal users, and dns to get it all working.
Something You Should Know: FTC Is Aggressively Going After Companies With Poor Security – Witch hunt or proactive initiative? 🙂
Of all the U.S. government regulatory oversight agencies, the Federal Trade Commission (FTC) is the most active and aggressive in looking for and applying penalties to organizations that not only are in noncompliance with laws and regulations, but also those who are not in compliance with their own information security and privacy promises; in other words, those that are practicing “unfair and deceptive trade practices.”
Indiana State Police Forensics Field Triage Program a Success – Good news!
Approximately two years ago, the Indiana State Police instituted a unique program in which examiners conduct on scene computer forensics. The goal of the Computer Forensics Field Triage program is to utilize departmental resources efficiently to improve cyber crime investigations by conducting on scene computer examinations in a forensically sound manner. The program was an immediate success. Investigators found that conducting examinations on scene was far superior to conducting examinations in a laboratory setting. Specific circumstances sometimes dictate that an on scene examination is the only viable alternative…
Website Vulnerability Statistics (17 mo. and counting) – Download the report and give it a read.
It’s that time of the quarter where we get to release our WhiteHat Website Security Statistics Report (PDF) – the aggregate vulnerability data we’ve collected when assessing the custom web applications of hundreds of the largest and most popular websites on a continuous basis (weekly is typical). This data is also very different from Symantec, Mitre (CVE), IBM (ISS) X-Force, and others who track publicly disclosed vulnerabilities in commercial and open source software products. WhiteHat’s report focuses solely on previously unknown vulnerabilities in custom web applications, code unique to that organization, on real-world websites.
Auditing open source software – Great post on auditing open source software with some solid examples.
Google encourages its employees to contribute back to the open source community, and there is no exception in Google’s Security Team. Let’s look at some interesting open source vulnerabilities that were located and fixed by members of Google’s Security team. It is interesting to classify and aggregate the code flaws leading to the vulnerabilities, to see if any particular type of flaw is more prevalent.
Why won’t this cold go away?
Here is the list:
aircrack-ptw – Fast WEP Cracking Tool for Wireless Hacking – Still using WEP? Want to reconsider that?
The aircrack team were able to extend Klein’s attack and optimize it for usage against WEP. Using this version, it is possible to recover a 104 bit WEP key with probability 50% using just 40,000 captured packets. For 60,000 available data packets, the success probability is about 80% and for 85,000 data packets about 95%. Using active techniques like deauth and ARP re-injection, 40,000 packets can be captured in less than one minute under good conditions. The actual computation takes about 3 seconds and 3 MB main memory on a Pentium-M 1.7 GHz and can additionally be optimized for devices with slower CPUs. The same attack can be used for 40 bit keys too with an even higher success probability.
No Ring Untarnished – Interesting article on kernel vulnerabilities.
Kernel vulnerabilities themselves are nothing new, of course. The exploitation of local kernel flaws has been a popular pastime for many researchers and hackers over the years, and in many cases these flaws were shown to be exploited just as reliably as a local flaw in userland software. However, being local to the system has its advantages; the level of interactivity with the system and the data that is available make for more reliable and/or predictable results. We have seen more than a fair share of remote kernel flaws over the years as well, some of which were leveraged in historical attacks (such as the Teardrop denial of service attack).
Some logging notes – Michael mentions on his blog that he doesn’t feel he performs enough logging. From the comments it’s easy to tell where Anton and I stand on this practice 🙂
My own logging? At home, I don’t do enough. At my last job, we did logging, but didn’t use it enough or probably use it properly. At my current job, we don’t do enough logging at all.
Log Trustworthiness Hierarchy – I like this post. One thing I’d like to see is how this hierarchy could be impacted by ‘trusted’ systems that aren’t tuned to remove false positives, aren’t continuously updated for vulnerabilities, etc.
So, do you trust your logs to accurately depict what happened on the system or network? Which logs do you trust the most? How do we increase this trust?
My first draft of such trust hierarchy follows below (from low trust to high trust):
Compromised system logs (mostly pure distilled crap 🙂 but might contain bits that the attacker missed/ignored)
Desktop / laptop OS and application logs (possibly changed by users, legitimate systems owners, etc)
All logs from other systems where ‘root’/Admin access is not controlled (e.g. test servers, etc)
Unix application logs (file-based)
Local Windows application logs
Local Unix OS syslogs
Unix kernel audit logs, process accounting records
Local Windows server OS (a little harder to change)
Database logs (more trusted since DBA cannot touch them, while ‘root’ can)
Other security appliance logs (located on security appliances)
Various systems logs centralized to a syslog server
Network device and firewall logs (centralized to syslog server)
Logs centralized to a log management system via a real-time feed (obviously, transport encryption adds even more trust)
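Moving logs up that hierarchy is often just a configuration exercise. A minimal Python sketch of pushing application logs to a central syslog server in near real time (the collector host and port are assumptions for your environment):

```python
import logging
import logging.handlers

# Ship application logs to a central syslog server in near real time.
# loghost.example.com:514 over UDP is an assumption -- substitute your
# own collector, and prefer TCP or an encrypted transport where trust
# matters most.
handler = logging.handlers.SysLogHandler(address=("loghost.example.com", 514))
handler.setFormatter(logging.Formatter("myapp: %(levelname)s %(message)s"))

log = logging.getLogger("myapp")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("user alice logged in from 10.0.0.5")
log.warning("3 failed logins for user bob")
```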
Seek and Destroy: Enhancing America’s Digital First Strike Capabilities – I tend to believe that these capabilities are already in place or are currently in development.
What if the cyber attacks went beyond military targets and focused on civilian infrastructure? Would we look at this any differently than a physical attack on our infrastructure? Given our reliance on digital technology, is there really a difference?
And now for some security papers:
Forensic Analysis of a SQL Server 2005 Database Server
Understanding the Importance of and Implementing Internal Security Measures
Tuning an IDS/IPS From The Ground UP
OS and Application Fingerprinting Techniques
Another Presentation: FINAL Full Log Mining Slides – Thanks to Anton for posting another one of his excellent presentations.
Today I am happy to release what I consider to be my most interesting old presentation – a full slide deck on log mining. It covers a few years of my research into using simple data mining techniques to analyze logs stored in a relational database. It even comes with examples of real intrusions caught by my algorithms as well as tips on reproducing my results in your environment.
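In the spirit of the deck’s simple-techniques theme, one classic trick is rare-event analysis: normalize messages into templates and surface the ones you almost never see. A small Python sketch; the log format and rarity threshold are mine, not from Anton’s slides:

```python
import re
from collections import Counter

# Rare-event log mining: normalize each message into a template by
# stripping the variable bits (IPs, numbers), count the templates, and
# surface the ones that almost never occur. The file name and the 0.1%
# threshold are illustrative.

def template(line):
    line = re.sub(r"\b\d{1,3}(\.\d{1,3}){3}\b", "<ip>", line)
    line = re.sub(r"\d+", "<n>", line)
    return line.strip()

with open("messages.log") as f:
    counts = Counter(template(line) for line in f)

total = sum(counts.values())
for tmpl, n in counts.most_common()[::-1]:   # rarest first
    if n / total < 0.001:
        print(f"{n:6d}  {tmpl}")
```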
NSA writes more potent malware than hacker – Hmm…this kind of goes back to my first strike point above 🙂
A project aimed at developing defences against malware that attacks unpatched vulnerabilities involved tests on samples developed by the NSA.
The ultra-secretive US spy agency supplied network testing firm Iometrix with eight worms as part of its plans to develop what it describes as the industry’s first Zero-day Attack Test Platform.
Richard Dagnell, VP of sales and marketing at Iometrix, said the six month project also featured tests involving two worm samples developed by a convicted hacker. The potency of the malware supplied by the NSA far exceeded that created by the hacker.
A Waste of Time – Yikes…not exactly a glowing review.
It’s just wrong when a company like Cisco charges an outrageous amount of money for a class that doesn’t deliver anything. I’ve been to other classes that were either free or less than $200 for two days that I gained much more from. After the class was finished we filled out a class evaluation and I made sure to let it be known that I was unhappy. I was nice and constructive with my criticism. One of the questions was “Based on your experience in this class would you take another Cisco Authorized Training Class?” My answer was a resounding “NO!”. This is my first CAT class and I’m sure that many of them are very well done, but this isn’t one of them.
Congratulations Brian Granier! – I had the pleasure of attending the graduation ceremony while at SANS 2007 in Las Vegas. Congrats Brian!
Our handler Brian Granier this week became the second student to graduate from the SANS Technology Institute!
Microsoft’s Anemone Project – This was the first I’d heard of this initiative. It’s a great idea for reading traffic prior to and after encryption.
Ubiquitous network monitoring using endsystems is fundamentally different from other edge-based monitoring: the goal is to passively record summaries of every flow on the network rather than to collect availability and performance statistics or actively probe the network…
It also provides a far more detailed view of traffic because endsystems can associate network activity with host context such as the application and user that sent a packet. This approach restores much of the lost visibility and enables new applications such as network auditing, better data centre management, capacity planning, network forensics, and anomaly detection.
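To see what that host context buys you over a passive tap, consider how an endsystem can tie each of its own flows to the owning process. A hedged sketch using the third-party psutil module (not part of Anemone, just an illustration of the idea):

```python
import psutil  # third-party; not part of Anemone -- illustration only

# An endsystem's view of its own network activity: unlike a passive
# network tap, the host can tie each flow to the process (and therefore
# the application and user) that owns it.
for conn in psutil.net_connections(kind="inet"):
    if not conn.raddr:  # skip sockets with no remote end (listeners)
        continue
    name = user = "?"
    if conn.pid:
        try:
            proc = psutil.Process(conn.pid)
            name, user = proc.name(), proc.username()
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    print(f"{conn.laddr.ip}:{conn.laddr.port} -> "
          f"{conn.raddr.ip}:{conn.raddr.port} {conn.status} [{name}/{user}]")
```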