Wednesday, December 4, 2013

Desktop Packet Reference Tool

A long while ago, I started working on a Windows app for helping people learn about packets. Unfortunately, life has had a crazy way of getting "in the way" and after at least a year, I have finally found some time to get back to working on this app.

Initially I was creating this more as a Snort rule generator and tester to help people learn more about creating Snort rules, and how they have changed over each version. The tabs for each packet type were an afterthought at the time. However, I decided it would be more fun for me to focus on the different protocols and not mess around with creating a tool that, quite frankly, already exists within Snort.

I think I will have this, and two other big projects, done by Christmas. I did want to put a couple screenshots out there in case anyone with an interest runs across this blog and has changes/additions to suggest, or even complaints. Just remember, this is a fun thing for me and I hope that people will be able to utilize it once it is finished.

The application opens with this view:

[Screenshot: initial view of the application]
The above just shows the initial view when the application is opened. I haven't changed the title yet, which is why it still reads "IDS Rule Builder." The user can select any of the visible tabs and see the standard view of the packet for the selected protocol. As of now, once a field on a tab is selected, that field's information stays visible on that tab view until a different field is selected. This means that a user can select IP Version, for example, switch over to the TCP Packet tab and select "Source Port", and then switch between these two tabs without having the selected information cleared.
[Screenshot: "IP Version" selected on the IP Packet tab]
And here is a view of the application after the user selects "IP Version" under the "IP Packet" tab. The "Byte Offset" text is sort of a favorite of mine. I believe that anyone who has ever tried to follow Mike Poor on audio when he discusses tcpdump filters and byte offsets will appreciate having a tool like this to visualize and reinforce what Poor is saying. :-)


Anyway, this is just a preview as I still have a LOT of the information text strings to build and add in. I am toying with the idea of highlighting the selected field in some fashion, but I'm not sure if that's necessary. I do think I will get this done prior to Christmas (along with the other two projects) and once it's complete I will put the project, source, and executables up on Bitbucket publicly.

Hope everyone had a wonderful and safe Thanksgiving and that their Christmas festivities are fun and safe as well! :-)

dw

Thursday, November 14, 2013

Sitting the GWAPT exam - Updating my certs

Four months ago I enrolled in the SANS On-Demand course for SANS542, Web Application Penetration Testing. I chose this course as I love pen testing, I understand and enjoy working with web technologies (My Master's project was web services), and the course looked exciting as Kevin Johnson, the creator of SamuraiWTF, was the "instructor" for the course.

I was initially able to find some time both at work and at home to devote to listening to the audio and reading the books. However, I found my workload quickly increasing, causing me to back-burner my study efforts on SANS542. About three weeks ago, I realized that, while I was away in Vegas on a work trip, my exam availability would expire prior to my return. I asked SANS for a few days' extension and was, to no surprise, told no. So I scheduled my exam for the afternoon of the last Friday that I would be in Vegas.

Had I known that by the time the test arrived I would be two hellish days into a flu that lasted almost six, I would have scheduled the exam for earlier. Apart from being disgustingly sick, I was also swamped with work and thus had a grand total of 12 hours to prepare for the exam.

So, with little preparation and the flu, I drove on and sat the exam. I ended up using the full two-hour time limit, as I had to dig through the books for some obscure answers on some tools I don't use very often. However, I am pleased to say that I finished the exam with a passing score of 91.43%. Had I not already had some solid experience in web app pentesting, I am certain that I would not have passed this exam at all. I felt the exact same way after I sat and passed the GREM, and I do think that SANS did a good thing in making the test shorter in both total number of questions and time.

It is my belief that this new testing format will decrease the number of honestly unqualified people "earning" GIAC certs, despite the "watering down" of these and other certs, such as the CISSP, by different branches of the US.

I will stop ranting now, before I get too high on my soapbox. The confidence I have in my technical abilities has really been reinforced by quite a few external sources over the last few weeks. So much so that my passing of the GWAPT has me seriously considering a number of different options for my next challenge, including the GSE.

So I am now in a position to decide what I want to do next. For the GSE, I still need SANS504 (which I've been told I could probably sit cold and still ace). However, there are still at least five other SANS courses I would like to take, as well as the OSCP, the SNORT-CP, the MCITP, and two software engineering certs offered by the IEEE.

I'm not bragging...I just think it's somewhat funny how many certs I currently hold. Anytime we are allowed to use tuition reimbursement funds for courses, we are required to pick courses that prep for certifications, and then we have to sit and pass said certs.

What I currently hold:
Sec+
CISSP
MCP
ArcSight Admin
ArcSight Analyst
GSEC
GCIA
GPEN
GREM
GCNA
GWAPT



Thursday, October 24, 2013

PowerShell and Nessus


Wouldn't it be nice if every PT tool spat out its results in the exact same format? I'd be happy if Nessus, nmap, Metasploit, and Nikto all used the exact same format for output, whether that means CSVs with the same order for the "header" rows or XML with the same elements and tree structure. However, reality is slightly skewed...I mean different. :-)  Recently I worked on a PT mission where it was my responsibility to compile the Nessus results and then parse them for certain items. This was the easy part. The fun part came when I needed an easy way to convert the CVE numbers in the CSV output to what the Army uses: IAVMs/IAVAs. Easy, right?

I thought so, figuring it to be just a "simple" find & replace operation. But, and this is where reality came into perspective, it did take a little work to get some PowerShell scripts working: stripping out what I wanted into multiple output files (still CSVs, intentionally) and then converting the CVE numbers in ALL of those output files.

You may be asking, "Dave, why didn't you just convert the CVEs first and then strip out the results you wanted?" Good question. The answer is, to me, very simple: I would rather my find-and-replace operation only process what it absolutely must in terms of lines/words of a file, as opposed to processing the entirety of ALL of my output files. In this case I had five separate output files, totaling only 105MB. This isn't a big amount when one considers the computing power available today. The parsed output files numbered six and weighed in at a glaringly heavy 18.8MB...you might call this a feather-weight contender.

Since the find-and-replace operation runs line by line, at least the way I wrote it, this 18.8MB is a much better number to run through. I should also mention that this particular pentest was very small in scope and size, consisting of five Class 'C' blocks with a grand total of less than 500 live hosts between them. That said, I think any PT'er with even a few PTs under their belt would recognize how small the original Nessus output files (105MB) were.

Below is what I wrote to meet my needs on this particular mission. I am in the process of turning this into a dynamic module, allowing the user to select the same items (or some combination of what I hard-coded), as well as the proper directories for the original Nessus output CSVs and the final output of the script. A rough sketch of where that is headed follows the script below.
#Parse-Nessus.ps1
#
# Ingredients:
#   1) Directory of Nessus output files
#      - In CSV format
#   2) User has ability to read and write to appropriate directories
#   3) User is able to read/modify the original and output files
#   4) IF you want to convert CVE's to IAVA/IAVM numbers, the mapping can be found at:
#       - http://iase.disa.mil/stigs/downloads/xls/iavm-to-cve(u).xls
#          - This file has a lot of other information in it. I found it easier to strip out just the CVE and IAVA columns and store them in a new CSV file labeled:
#          - ReplacementList.csv
#   5) I also found a great script to start with...had to modify it for me, but the link is:
#       - http://tangodude.wordpress.com/2013/04/15/powershell-multiple-find-replace-in-files-with-lookup-list/
#
# Comments: The big problem here is that the list from DISA doesn't seem to have all of the CVE numbers so there is still a little manual work that has to be done

#In order to strip out just the CVE and IAVA numbers from the xls spreadsheet, we need to convert it from an XLS document to a CSV document:
$xlCSV=6
$Excelfilename = "C:\users\UserOne\Desktop\iavm-to-cve(u).xls"
$CSVfilename = "C:\users\UserOne\Desktop\TempListing.csv"
$OutCSVfilename = "C:\users\UserOne\Desktop\ReplacementList.csv"
#create an Excel object
$TempExcel = New-Object -comobject Excel.Application
#we don't need to actually open Excel
$TempExcel.Visible = $False
#we don't need macro or other alerts
$TempExcel.displayalerts=$False
#Open the downloaded XLS file with the Excel
$TempWorkbook = $TempExcel.Workbooks.Open($Excelfilename)
#Now save the opened file with the new filename and as an CSV file
$TempWorkbook.SaveAs($CSVfilename, $xlCSV)
#Close the Excel object
$TempExcel.Quit()
#Just in case, Really close the Excel object :-)
if(Get-Process excel -ErrorAction SilentlyContinue){Stop-Process -Name excel}

#we don't need to keep the converted CSV file.
del $CSVfilename  

#we will be reading this CSV file into a hash table but first, let's parse out the info we really want
#Where the Nessus CSV files are located
$CSVSourceDir = "C:\Users\UserOne\Desktop\nessus"
#Get ONLY the CSV files you want. In this example, the files have names like scan1.csv, scan2.csv, so the -like "sc*.csv" will select only our scan output files
$DataFiles = Get-ChildItem $CSVSourceDir -force | Where { $_.Name -like "sc*.csv" } | Foreach-Object -process { $_.FullName }
#If you want to know how many files were stored in the DataFiles object
[int] $DataFilesCount = $DataFiles.Count
#Again, this is just for verifying the number of CSV files
Write-Output "Discovered $DataFilesCount CSV Data files in $CSVSourceDir "

#Now that we have the needed files in the DataFiles object, it's time to strip out the data that we want.
#First, declare some vars for the output files. Here you can see the items I was most interested in.
#One small note of caution: create the directory structure that you want to use.
$outFileHighs = "C:\Users\UserOne\Desktop\nessus\Parsed\NessusResults_ALL_Highs.csv"
$outFileAdobeReader = "C:\Users\UserOne\Desktop\nessus\Parsed\NessusResults_ALL_Adobe_Reader.csv"
$outfileShockwave = "C:\Users\UserOne\Desktop\nessus\Parsed\NessusResults_ALL_Adobe_Shockwave.csv"
$outFileFlash = "C:\Users\UserOne\Desktop\nessus\Parsed\NessusResults_ALL_Adobe_Flash.csv"
$outFileRCE = "C:\Users\UserOne\Desktop\nessus\Parsed\NessusResults_ALL_Remote_Code_Execution.csv"
$outFileOracleJava = "C:\Users\UserOne\Desktop\nessus\Parsed\NessusResults_ALL_Oracle_Java.csv"

#Now let's parse through the DataFiles. The ForEach loop will load each file, search for each wanted item,
ForEach ($DataFile in $DataFiles)
{
        #Again, some verbosity here for info/debugging purposes
        $FileInfo = Get-Item $DataFile
        $LogDate = $FileInfo.LastWriteTime
        Write-Output "Reading data from $DataFile ($LogDate ) "
       
        #Let the Parsing begin. Each parsing line appends the selected data line to the specified CSV file. The NoTypeInformation switch is personal preference...but I'd recommend using it.
        #Find all vulns for Adobe Reader that have an actual Risk value and the Risk value isn't "None"
        [array]$CSVData += Import-Csv $DataFile | where {$_.Description -Match "Adobe Reader" -and $_.Risk -ne "None" } | select "Plugin ID", CVE, Risk, Host, Protocol, Port, Name, Description | Export-Csv $outFileAdobeReader -NoTypeInformation -Append
      
        #Find all high vulns
        [array]$CSVData2 += Import-Csv $DataFile | where {$_.risk -eq "high"} | select "Plugin ID", CVE, Risk, Host, Protocol, Port, Name, Description | Export-Csv $outFileHighs -NoTypeInformation -Append
       
        #Find all vulns for Shockware that have an actual Risk value and the Risk value isn't "None"
        [array]$CSVData3 += Import-Csv $DataFile | where {$_.Description -Match "Shockwave" -and $_.Risk -ne "None" } | select "Plugin ID", CVE, Risk, Host, Protocol, Port, Name, Description | Export-Csv $outfileShockwave -NoTypeInformation -Append
       
        #Find all vulns for Flash that have an actual Risk value and the Risk value isn't "None"
        [array]$CSVData4 += Import-Csv $DataFile | where {$_.Description -Match "Flash" -and $_.Risk -ne "None" } | select "Plugin ID", CVE, Risk, Host, Protocol, Port, Name, Description | Export-Csv $outFileFlash -NoTypeInformation -Append
       
        #Find all vulns for "Remote Code Execution" that have an actual Risk value and the Risk value isn't "None"
        [array]$CSVData5 += Import-Csv $DataFile | where {$_.Description -Match "Remote Code Execution" -and $_.Risk -ne "None"} | select "Plugin ID", CVE, Risk, Host, Protocol, Port, Name, Description | Export-Csv $outFileRCE -NoTypeInformation -Append
       
        #Find all vulns for Java that have an actual Risk value and the Risk value isn't "None"
        [array]$CSVData6 += Import-Csv $DataFile | where {$_.Description -Match "Java" -and $_.Name -Match "Java" -and $_.Name -notmatch "JavaScript" -and $_.Risk -ne "None" } | select "Plugin ID", CVE, Risk, Host, Name, Description | Export-Csv $outFileOracleJava -NoTypeInformation -Append 
       
        #If you want to track how many lines are added to each file you can do something like this
        #[int] $CSVDataCount6 += $CSVData6.Count
        #Write-Output "Imported $CSVDataCount6 records"
}

#OK, so now that we have all of our files with the output we want, it's time to create the Hash table
#I manually removed the header row of the ReplacementList.csv file. You can do the same or modify the code that creates the file above
$HashTable  = @(get-content C:\Users\UserOne\Desktop\nessus\ReplacementList.csv ) -replace ",","=" | convertfrom-stringdata
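#For reference: after the -replace above, each line ends up looking something like "CVE-YYYY-NNNN=<IAVA number>"
#(placeholder values here, not real ones), and ConvertFrom-StringData turns each of those lines into a
#key/value pair (CVE number as the key, IAVA number as the value) that the replacement loop below walks through.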

#The location of the parsed nessus values. By using a standard naming convention, I can again use a wildcard mask to load only the files I want
$ParsedFiles = "C:\users\UserOne\Desktop\nessus\Parsed\Nessus*.csv"

#Let's get the nessus csv files and run them through a ForEach loop.
#We load each file, using the FullName value (path and filename) into a var and then use
#an HashTable enumerator to make the changes.
gci $ParsedFiles |
ForEach-Object {
    $Content = gc -Path $_.FullName;
    foreach ($h in $HashTable.GetEnumerator()) {
        $old = $($h.Keys)
        $new = $($h.Values)
        $Content = $Content -Replace "$old", "$new"
    }
    Set-Content -Path $_.FullName -Value $Content
}
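
As mentioned above the script, I am working on turning this hard-coded version into something dynamic. Purely as a sketch of where that is headed (the function and parameter names here are mine and not final), it might look something like this:

function Parse-NessusCsv {
    param(
        [Parameter(Mandatory=$true)][string]$SourceDir,
        [Parameter(Mandatory=$true)][string]$OutDir,
        [string[]]$Keywords = @("Adobe Reader","Shockwave","Flash","Remote Code Execution","Java")
    )
    #Grab only the scan output files, same wildcard idea as above
    $files = Get-ChildItem $SourceDir -Filter "sc*.csv"
    foreach ($keyword in $Keywords) {
        #One output file per keyword, e.g. NessusResults_ALL_Adobe_Reader.csv
        $outFile = Join-Path $OutDir ("NessusResults_ALL_" + ($keyword -replace " ","_") + ".csv")
        foreach ($file in $files) {
            Import-Csv $file.FullName |
                where { $_.Description -Match $keyword -and $_.Risk -ne "None" } |
                select "Plugin ID", CVE, Risk, Host, Protocol, Port, Name, Description |
                Export-Csv $outFile -NoTypeInformation -Append
        }
    }
}
#Example call:
#Parse-NessusCsv -SourceDir "C:\Users\UserOne\Desktop\nessus" -OutDir "C:\Users\UserOne\Desktop\nessus\Parsed"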


Wednesday, October 9, 2013

Powershell for Hashes and Timestamps

I have recently been loving the functionality that PowerShell provides. I think it's Microsoft's best attempt at a *nix-like shell system. There is so much that I have been able to do just playing around with it that today, when needed, I was able to bang out two quick scripts. I figured I would post one here now (and maybe the second one after I clean it up...and maybe "module-ize" it).

I have recently been put through the wringer, first by Dell (in regard to a failed hard drive on my primary laptop) and then by Microsoft (in regard to my primary Live account being hacked and misused). Because of these issues, I have had to rebuild and set up my system from scratch. What I found while doing this was that I had somehow made a good number of "backup" copies of the source directory of a big project I have been working on. As I use Bitbucket, this normally wouldn't be a problem...except for the fact that there were a good number of changes I made while the last hard drive was failing, which caused multiple pushes to fail when the laptop froze up. This left me unsure which folder was the main one and which were backups for the project. What to do?

I decided that the easiest thing would be to have a spreadsheet of the filepath, filename, the MD5 of each file, and the Last Write Time of each file. So, I moved all of the folders under one temporary one on my desktop. Now I just had to gather that metadata. PowerShell and the PowerShell Community Extensions to the rescue!

The PowerShell Community Extensions (PSCX, http://pscx.codeplex.com/) provides a useful Get-Hash function. This function can produce a number of different types of hashes depending on the switches applied by the user. Even better, this function accepts pipelined input and can pipe its own output onward, which comes in very handy.

To get to some code, using the PSCX Get-Hash function is as easy as:

Get-Hash MyFile.txt

The above will by default produce the MD5 hash of MyFile.txt and will output four data values:
- Path: the full path and name of the file
- Algorithm: the algorithm used. In the example above the output would be 'MD5'
- HashString: the hash string based on the algorithm used
- Hash: the system datatype of the HashString
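
Since Get-Hash accepts pipelined input, hashing an entire folder can be a one-liner. A quick sketch (the folder path here is just an example):

Get-ChildItem C:\Temp | Where-Object { !$_.PSIsContainer } | Get-Hash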

For my purposes, I only care about the 1st and 3rd columns (Path, HashString). However, this is still not enough information. The below script is the solution that works for me. I think I am going to convert this to get rid of the hard coded values at some point in the very near future.



####################################################
# FileName: Get-HashesAndTimeStamps.ps1            #
# Author: Dave Werden                              #
# Date:   9 Oct 2013                               #
# NOTES:                                           #
# The four columns produced by the PSCX's Get-Hash #
# module are: Path, Algorithm,HashString,Hash      #
# Dependencies: The PSCX pack must be installed and#
#  imported in order to make use of the Get-Hash   #
#  module.                                         #
####################################################



#Hardcoded csv filepath and name
$outCSVFileTemp = "C:\users\dwerd_000\Desktop\SB_File_Hashes_Full.csv"

#Hardcoded location of files
$sbpath ="C:\users\dwerd_000\Desktop\ScoutNB_Collections\"
#create the collection of files
$sbfiles = gci $sbpath -Recurse | ? { !$_.PSIsContainer }


#process each file, getting the file hash and last write time for each
#ouput goes to file defined in outCSVFileTemp above
foreach ($sbfile in $sbfiles ) {
   
    #smarter to grab file's LastWriteTime value first in order to append to the Get-Hash object
    $sbfileTime = $sbfile.LastWriteTime.ToString("dd/MM/yyyy HH:mm:ss")
    Get-Hash $sbfile | Select-Object Path,HashString,@{Name='LastModified';Expression={$sbfileTime}} | Export-Csv $outCSVFileTemp -Append

 }


To quickly explain what is going on here exactly:
$sbfiles is set to contain all of the files in the given path. This is done recursively and excludes folders themselves.
Next, a foreach loop is used to process each file by:
   - First grabbing the file's LastWriteTime property, using the given format and saving to $sbfileTime
   - Next (and this was the FUN part) the Get-Hash output object is cut up using the Select-Object function, where only the Path and HashString 'columns' are retained and a third column (LastModified) is added and set to the value of $sbfileTime
   - The "new" object, consisting of the Path, HashString, and LastModified columns/values, is then exported to $outCSVFileTemp.

By running this script, I am able to use one spreadsheet to identify the newest version of each file, as well as whether multiple copies of the same file are identical or different. While there is probably a way to automate this in PowerShell, I still prefer to do these kinds of tasks semi-manually by using Excel's ability to filter/sort as well as its ability to highlight duplicates (HashString, in this case). The only other action that I currently do manually, but may add to this script, is splitting the Path value into the full path to the lowest folder in one column and the filename by itself in a second column (not sure which way to go on this).
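
For what it's worth, Split-Path makes that folder/filename split pretty painless. A rough sketch of what it could look like if I do add it (the output filename below is made up):

Import-Csv $outCSVFileTemp |
    Select-Object @{Name='Folder';Expression={Split-Path $_.Path -Parent}},
                  @{Name='FileName';Expression={Split-Path $_.Path -Leaf}},
                  HashString, LastModified |
    Export-Csv "C:\users\dwerd_000\Desktop\SB_File_Hashes_Split.csv" -NoTypeInformation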

Anyway, it was a lot of fun to bang this out and to see that I ended up with a CSV file of exactly the data I needed and nothing else. The other PowerShell script I knocked out today was a (for now) hardcoded parser for finding specific items in one or more Nessus results files and creating an appropriately named CSV file for each found subset. Maybe later this week or next I will post that up as well...actually, I am certain I will, as I have not found a good PS or other tool to find and compile the subsets I need from Nessus in order to provide valid data for PT reports.

Monday, July 22, 2013

Network Security Monitoring Book

Last week I happened to notice that Richard Bejtlich's new book, The Practice of Network Security Monitoring: Understanding Incident Detection and Response, was available for pre-sale from the No Starch website. After considering it for a short time, I decided that I would go ahead and make the pre-sale purchase, and that I would buy the hard copy so that I would get the free eBook with it. This afternoon I downloaded the eBook (all three formats of it) and the hard copy should be mailed to me next week sometime. But I am not sure what I think about this yet.

I am actually in the middle of reading two books for book reviews. One book review is for the SIGSOFT quarterly publication and the other is for the SIGACT quarterly. I have really enjoyed doing these book reviews as it gives me (usually) a free copy of a new book and I get to share my opinion with any of the readers of these journals. It is these two current reviews that make me a little uncertain of my choice to go ahead and buy the Bejtlich book: do I have ANY time to actually read this book right now?

I hope so, as I am looking forward to this book! Other than a general respect for Mr. Bejtlich's accomplishments and my understanding of his position on things that interest me, the book had one HUGE selling point for me: Doug Burks's SecurityOnion. I am a major fan of SecurityOnion and I think its inclusion in this book is just awesome! It also looks as though the book goes beyond just the installation and configuration of SecurityOnion, in that SecurityOnion seems to be the foundation of the book itself.

More about this book in the future...

As of now, the book is still available for pre-sale and has a 30% discount available:

http://nostarch.com/nsm

Thursday, May 2, 2013

PowerShell for your Network Adapter

Recently, I needed to buy a new computer for my development work. I could have shopped around and bought either an earlier version of Windows or just a good machine for a Linux distro. However, since the particular laptop I needed to replace was a Windows Vista box, I decided to stick with Windows and set my dev environment up the same way.

This is how I came to own a Windows 8 laptop. I now have a Windows 8 phone as well (thanks to T-Mobile NOT having a single Droid phone that I liked), and while it syncs well with my new laptop, it's a whole different domain of complaints. It is a complaint about my Windows 8 laptop that has caused me to write this long overdue entry to this blog: after my computer sleeps for some [apparently] arbitrary amount of time, my Wi-Fi adapter is disconnected and will not successfully reconnect. I should say that the adapter will not connect without disabling and then re-enabling it. Since I am already playing around a bit with PowerShell, I figured, why not write a script that I can just run when needed? The following covers my efforts (successful efforts, of course!).

Before creating a PowerShell script, it is important to know if you can actually execute said script. As with almost everything else in Windows since the UAC was introduced, just logging in as an administrator doesn't automatically cause every process you start to have the same credentials. PowerShell scripts are no different, so here are a couple key points.

1. Your environment must allow for the running of the PowerShell scripts that you want to save and execute.
2. For writing and testing your scripts, I find it best to open PowerShell (either the command line or the ISE) with "Run as Administrator".

Number 2 above is easy...at least I hope it is for anyone who has read to at least this point of this entry. :-) However, Number 1 may cause some heartache for someone. So, and without a big long description of using PowerShell itself, to determine if you can save and execute PowerShell scripts:

PS C:\> Get-ExecutionPolicy
...should return either "RemoteSigned" or "Unrestricted"
I tend to prefer the Unrestricted option while I am working.

To change the execution policy:
PS C:\> Set-ExecutionPolicy Unrestricted
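
As a side note, if you would rather not open things up machine-wide, Set-ExecutionPolicy also takes a -Scope parameter; for example, to allow locally written scripts for just your own account:

PS C:\> Set-ExecutionPolicy RemoteSigned -Scope CurrentUser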

Now that this is out of the way, time for the hard and complicated stuff.

Before I continue though, I want to report that I happen to like PowerShell. I have been tinkering with it in my tiny bit of free time (when I should probably be doing more with helping test SecurityOnion -- an absolutely AWESOME product!!!). Until today, I have been approaching PS from a standpoint of what it can/can't do for a pentest, flow data, and remote execution (from an SA standpoint). Today, I needed something quick that would let me avoid having to open up multiple windows and go through the disable/enable motions. Below is the fruit of this strenuous labor.

#***********************************************************
#RenewWiFi.ps1
#author: me
#notes: this was way too easy
#***********************************************************
#disable Wi-Fi adapter and DO NOT show the confirm window
Disable-NetAdapter -Name wi-fi -Confirm:$false

#Re-enable the adapter
Enable-NetAdapter -Name wi-fi
#***********************************************************
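
If you want a quick sanity check that the adapter actually came back up, something like the following can be tacked onto the end (an optional extra, not part of the original script):

#Optional: give the adapter a few seconds, then confirm it is back up
Start-Sleep -Seconds 5
Get-NetAdapter -Name wi-fi | Select-Object Name, Status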

...and that's it. Really, an actual script file isn't even needed to do this...but it lets me double-click once and get back to something REALLY important: checking on my Detroit Tigers!!!

dw