Sunday, March 01, 2009

Bash scripts

Bash is a command-line shell and scripting language available on virtually every free software platform. Please visit http://en.wikipedia.org/wiki/Bash for more information on bash.


The following bash scripts can be used to download information available on the Internet for offline use.


Script to download judgments of the Supreme Court of India published by NIC


NIC has made available the full text of more than 18,000 judgments of the Supreme Court from 1950 till date, along with recent judgments of various High Courts, at http://www.judis.nic.in/judisf.htm


The following block of code may be copied and saved as get_scj.sh


#!/bin/bash
#AUTHOR: ramanraj.k@gmail.com 2004-06-05
#script to download Supreme Court case reports published at NIC for offline use
#needed: bash, wget, and of course, a connection to the internet
#usage: get_scj.sh from_no to_no
# get_scj.sh 1 500 # download cases 1 to 500
# get_scj.sh 501 1000 # download cases 501 to 1000
# cases numbered from_no through to_no are downloaded
# and saved as sc_no.html, for example sc_1.html ... sc_500.html

LIMIT=$2
for ((a=$1; a <= LIMIT; a++))
do
    file=sc_$a.html
    wget -t1 "http://www.judis.nic.in/sc/qrydisp.asp?tfnm=$a" -O "$file"
    # remove the file if nothing was downloaded for this case number
    if [ ! -s "$file" ]; then
        rm "$file"
    fi
done
#alert user with bell
echo -e "\a"
exit 0


Set permissions on get_scj.sh to make the script executable with the command:

# chmod 755 get_scj.sh

Instructions for use are included in the script itself. As of date, more than 18,000 reportable judgments of the Supreme Court are available online, amounting to nearly 650 MB. The script does not keep track of the last case downloaded, so the range of reports to be downloaded must be given while invoking the script. To find the last case downloaded, one could issue the command:


# ls -alt sc_*.html
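
The listing is sorted by modification time, with the most recent file first. Going a step further, the last case on disk can be worked out automatically and the download resumed from there. The following wrapper is a minimal sketch, assuming get_scj.sh and its sc_no.html files are in the current directory; the upper bound of 18000 is only illustrative:

#!/bin/bash
#resume_scj.sh - resume downloading from the last saved judgment (a sketch)
#find the highest case number among the sc_no.html files already on disk
LAST=$(ls sc_*.html 2>/dev/null | sed 's/^sc_//; s/\.html$//' | sort -n | tail -1)
LAST=${LAST:-0} #start from case 1 if nothing is on disk yet
./get_scj.sh $((LAST + 1)) 18000 #18000 is illustrative only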

Script to download Madras High Court cause list



#!/bin/bash
#AUTHOR: ramanraj.k@gmail.com 15-02-2003
#script to (1) download chennai high court cause list
# (2) search for required information using grep
# (3) save the lists under ./chennai_new for later reference
#usage: get_list.sh day
# where day can be mon, tue, wed, thu, fri or weekly
#
#needed: bash, wget, and of course, a connection to the internet

# map the day given on the command line to the directory name used on the server
DAYLIST="$1"
echo "$DAYLIST"
case "$DAYLIST" in
    mon)    DAYDIR=omon ;;
    tue)    DAYDIR=otues ;;
    wed)    DAYDIR=owed ;;
    thu)    DAYDIR=othurs ;;
    fri)    DAYDIR=ofri ;;
    weekly) DAYDIR=weekly ;;
    *)      echo "usage: get_list.sh mon|tue|wed|thu|fri|weekly"; exit 1 ;;
esac

#last week's list, to compare lengths
ls -al chennai_new/"$DAYDIR"

#download lists
wget "http://causelists.nic.in/chennai_new/$DAYDIR/cl.html" -O cl.html
wget "http://causelists.nic.in/chennai_new/$DAYDIR/misc.html" -O misc.html
wget "http://causelists.nic.in/chennai_new/$DAYDIR/dnot.html" -O dnot.html
wget "http://causelists.nic.in/chennai_new/$DAYDIR/sitarr.html" -O sitarr.html

#concatenate the lists into a single file for searching
cat sitarr.html dnot.html cl.html misc.html > cl_temp.html

#check and create destination directory
mkdir -p chennai_new/"$DAYDIR"

#copy lists to destination directory
cp ./cl.html ./chennai_new/"$DAYDIR"/cl.html
cp ./misc.html ./chennai_new/"$DAYDIR"/misc.html
cp ./dnot.html ./chennai_new/"$DAYDIR"/dnot.html
cp ./sitarr.html ./chennai_new/"$DAYDIR"/sitarr.html
cp ./cl_temp.html ./chennai_new/"$DAYDIR"/full_list.html

#checklist
# show file size
ls -alH cl_temp.html
# warn if the transfer was interrupted
grep --ignore-case "transfer interrupted" cl_temp.html

# check list cl_temp.html for names of interest [edit the pattern to suit your need]
grep --ignore-case "\(k.*\|r.*m.*\)\(kr.*\|ram.*\)raj\|justice\|j.r.k" cl_temp.html

#alert user with bell
echo -e "\a"
exit 0
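
The script also lends itself to scheduling. As a sketch, assuming get_list.sh is saved in the home directory and made executable, a crontab entry like the following would fetch Monday's cause list at 6 p.m. every Sunday:

#fetch Monday's cause list at 6 p.m. every Sunday (a sketch)
0 18 * * 0 $HOME/get_list.sh mon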