Random Linux CLI Commands
A few random CLI commands.


dig

take a list of domains from a file and have dig resolve all resource records (RRs)

dig +nocmd -f DomainList.txt any +noall +answer >> DNSlist.txt

cmp

compare 2 files; -s, --quiet, --silent suppress all normal output

cmp -s DNSlist.txt DNSlist1.txt

wpscan

enumerate only vulnerable plugins (vp), vulnerable themes (vt), and users (u)

ruby wpscan.rb -u nob.ro -e vp,vt,u

nmap

-A (aggressive scan) enables OS detection (-O), version detection (-sV), default script scanning (-sC), and traceroute; -p selects the ports to scan (here all of them)

sudo nmap -A -p "*" 192.168.1.5

grep

find a string (e.g. a database name) in all files inside a folder (-r recursive, -n line numbers, -w whole word)

grep -rnw 'public_html' -e "db_db"

find a string in all files in a folder and move the files that contain it into another folder; grep's -Z prints NUL-terminated file names, so xargs needs -0 to split on NUL

grep -i -Z -r -l 'string' . | xargs -0 -I{} mv {} ./folder_name
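A quick sketch of the idea, using made-up file names (hit.txt contains the string, miss.txt does not):

```shell
# hypothetical layout: two files, only hit.txt contains "needle"
mkdir -p folder_name
printf 'needle here\n' > hit.txt
printf 'nothing\n'     > miss.txt
# -Z prints NUL-terminated file names; xargs -0 splits on NUL
grep -irlZ 'needle' . | xargs -0 -I{} mv {} ./folder_name
# hit.txt ends up in folder_name/; miss.txt stays where it was
```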

get lines A(fter) and B(efore) the search string

grep -A2 -B3 something somefile.txt
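For example, with a hypothetical six-line input, -B2 prints the two lines before the match and -A1 the one line after:

```shell
printf '1\n2\n3\nX\n4\n5\n' | grep -A1 -B2 X
# prints:
# 2
# 3
# X
# 4
```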

take all strings starting with e.g. "www." and ending in ">" and print each on a new line

grep -oP "(?=www.).*?(?=>)" domainlist.txt
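A small demo with a made-up input file (note -P requires a grep built with PCRE support):

```shell
# hypothetical input: two tags on one line
printf '<a href=www.example.com><a href=www.test.org>\n' > domainlist.txt
grep -oP "(?=www.).*?(?=>)" domainlist.txt
# prints:
# www.example.com
# www.test.org
```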

grep for multiple strings

grep -e "string 1" -e "string 2" -e "string 3" file.txt

grep regular expression to find emails in a text:

grep -E -o "\b[a-zA-Z0-9._-]+@[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b" file.txt

grep -o '[[:alnum:]+\.\_\-]*@[[:alnum:]+\.\_\-]*' file.txt | sort | uniq -i

sed

replace a string with another string in a file, e.g. replace "www." with nothing (escape the dot so it matches a literal dot, not any character)

sed -e 's/www\.//g' domainlist.txt

keep only digits and dots, deleting every other character

sed "s/[^0-9.]//g"
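For example, stripping a made-up line down to its IP address:

```shell
printf 'addr=10.0.0.1;\n' | sed "s/[^0-9.]//g"
# prints: 10.0.0.1
```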

display only the text between 2 strings (here BEGIN and END)

sed -e 's/^.*BEGIN//g;s/END.*$//g'
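A quick demo with an invented input: the first expression deletes everything up to and including BEGIN, the second deletes END and everything after it:

```shell
printf 'xxBEGINsecretENDyy\n' | sed -e 's/^.*BEGIN//g;s/END.*$//g'
# prints: secret
```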

tr

delete a character from a file's contents. tr reads standard input, so redirect the file into it; \ is the shell escape character.

tr -d \" < file

tr -d / < file

host

read domains line by line from a list and write the output of host for each domain to another file

while read LINE ; do host "$LINE"; done < list-domains.txt > host-domains.txt

watch

watch open TCP/UDP ports with a socket summary

watch ss -stplu

watch system temperatures with lm-sensors (apt-get install lm-sensors)

watch sensors

watch the list of open connections, updated each second

watch -n 1 lsof -i -P

sort uniq

Remove duplicate lines from a file (sort, then collapse adjacent duplicates)

sort file.txt | uniq

Note: uniq -u instead prints only the lines that occur exactly once, dropping every duplicated line entirely.
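The difference between plain uniq and uniq -u, shown on a made-up four-line file:

```shell
printf 'a\nb\na\nc\n' > file.txt
sort file.txt | uniq      # a, b, c -- one copy of each line
sort file.txt | uniq -u   # b, c    -- only lines that occur exactly once
```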

less

Similar to tail -f, but you can switch back to normal less: Ctrl+C stops following, and F resumes watching the file

less +F file.txt

mdns

Browse mdns on network

avahi-browse -a -r

mdns-scan

dpkg

List all files installed by a package

dpkg --listfiles [package]

Find which package owns a file

dpkg -S [file]

wget

Clone a site, converting links for local browsing

wget --recursive --no-clobber --page-requisites --html-extension  --convert-links  --domains site.tld http://site.tld

awk

get lines that are in file2 but not in file1

awk 'FNR==NR {a[$0]++; next} !a[$0]' file1 file2
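A worked example of the idiom with two throwaway files:

```shell
printf 'a\nb\n' > file1
printf 'b\nc\n' > file2
# first pass (FNR==NR) records file1's lines in the array a;
# second pass prints only the lines of file2 not seen in file1
awk 'FNR==NR {a[$0]++; next} !a[$0]' file1 file2
# prints: c
```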

comm

get lines that are in file2 but not in file1 (comm needs sorted input)

sort file1 > file1.sorted
sort file2 > file2.sorted
comm -1 -3 file1.sorted file2.sorted
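The same throwaway-file example as above: -1 hides lines unique to file1 and -3 hides common lines, leaving column 2, the lines unique to file2:

```shell
printf 'b\na\n' > file1
printf 'c\nb\n' > file2
sort file1 > file1.sorted
sort file2 > file2.sorted
comm -1 -3 file1.sorted file2.sorted
# prints: c
```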

rev

reverse a string

echo "string" | rev

Posted on:

December 28, 2014