Don't scan, just ask: A new approach to identifying vulnerable web applications. 28th Chaos Communication Congress, 12/28/11 - Berlin




Summary

It's about identifying web applications and systems. Classical network reconnaissance techniques mostly rely on technical facts, like net blocks or URLs. They don't take business connections into view and hence don't identify all potential targets. Therefore Spider-Pig was developed: a tool that searches for web applications connected to a company, based on a scored keyword list and a crawler. That way, web applications run by satellite companies or service providers can be identified as well.

TABLE OF CONTENTS
- Overview of classical network reconnaissance
- Restrictions and possible enhancements
- Presentation of Spider-Pig and its approach
- Statistics from a real-life scenario
- Conclusion

Classical network reconnaissance

TECHNIQUES
- Query NIC and RIPE databases
- Reverse DNS
- DNS zone transfer
- Research in forums, social networks, Google hacking
- Identify routing and hence new systems
- Brute-force DNS

TOOLS
whois, dig, web browser, Maltego, ...

What the business looks like

Company A.0: THE TARGET :)

SATELLITE COMPANIES
- Branch office
- Partner firm
- Company A.1

SERVICE PROVIDERS
- Payment provider
- Separate webhoster
- Marketing campaign
- Service center
- Company B

What we see with network reconnaissance

- Company A.0: ALL (?) net blocks, IP addresses, URLs, DNS information
- Company A.1: SOME (!) net blocks, IP addresses, URLs, DNS information
- Company B: nothing

Restrictions and possible enhancements

RESTRICTIONS
- Scans operate solely on a technical level: IPs, net blocks, URLs.
- No business relations are taken into account.
- Potentially incomplete view: you can only search for what you know (e.g. RIPE queries).
- Probably just one company in focus.
- Not all targets are identified!

ENHANCEMENTS
- For an attacker it doesn't matter whether he attacks company A or B: both offer access to sensitive data and cause reputational damage.
- Take business relationships into account.
- Build a logical network between systems, independent of the owner.
- Identify (all) applications / systems linked to a company on a business-connection level, independent of the provider.
- Hence the development of Spider-Pig.

Approach of Spider-Pig

Spider-Pig identifies web applications connected to a company in an automated manner and hence makes it possible to identify potential targets (web applications and systems) for further attacks. Due to German law the tool will not be released to the public, but its methodology will be explained.

1. Model business: Model the business, services and relationships of the company in scope, based on a scored keyword list with terms like the company name, product names and tax numbers. Note: no technical details (e.g. IPs) have to be provided.
2. Perform search: Based on the keywords, different search engines are queried in an automated manner. Furthermore, whois information is gathered for the identified domains and IP addresses. Identified web sites are crawled for links to new web pages. Duplicates are removed from the results.
3. Calculate results: For each domain a score is calculated and a sorted list of results is generated. The top results are reviewed manually and potential targets are extracted.
4. Exploit: The targets are sent to a custom vulnerability scanner that uses the data from the crawling phase and attacks the web sites in an automated manner. At the end, we hopefully get a shell! :)

First step: Model Business

Idea
- Web applications connected to a company contain terms like the company name, products or imprint details.
- There are imprecise terms, like product names, that lead to many unrelated web sites (e.g. Amazon).
- There are precise and very specific terms, like tax numbers or imprint details, that lead to company pages.

Solution
- Manually assemble a list that contains terms specific to the company in focus.
- Assign each entry a score reflecting how unique and specific it is.

Note: Experience showed that ~200 terms are sufficient and already lead to hundreds of thousands of results. Technical details (e.g. URLs or software used) can be included but don't have to be. A score from 1-100 works pretty well, but can be adapted.
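Such a scored list is easy to represent as a simple map from term to score. A minimal sketch in Perl, with made-up terms and scores for a fictitious "Company A" (this is illustrative only, not the actual Spider-Pig code):

```perl
use strict;
use warnings;

# Scored keyword list: term => score (1-100, higher = more specific).
my %keywords = (
    'Company A GmbH'         => 90,   # exact legal name: very specific
    'DE123456789'            => 100,  # tax/VAT number: unique to the company
    'info@company-a.example' => 95,   # imprint contact address
    'SuperWidget 3000'       => 40,   # product name: somewhat specific
    'widget'                 => 5,    # generic term: low score
);

# Query the most specific terms first.
for my $term (sort { $keywords{$b} <=> $keywords{$a} } keys %keywords) {
    printf "%-26s score %3d\n", $term, $keywords{$term};
}
```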

Second step: Perform Search

Idea
- The keywords are used to query different search engines (e.g. Google, Bing, Yahoo).
- The results are read and a list is built, with the fully qualified domain name (FQDN) as the index.
- Every time an FQDN is part of a search result, the score assigned to that search term is added to the FQDN's total.
- For the IP and domain name a whois query is executed to get details on the owner. If the whois record contains the company name, address or a keyword, the FQDN is marked as a hit.
- Example: If you focus on a certain market (e.g. Europe), you can exclude all other results based on the country code received via whois.

Process: 1. Take keyword from list -> 2. Query search engines -> 3. Check results and calculate score -> 4. Perform whois query -> 5. Additional steps

Note: Using the FQDN as identifier already removes duplicates and speeds up the process.

FQDN      | IP      | URL                        | Title         | Score      | Hit
www.a.com | 1.2.3.4 | http://www.a.com/about     | Company A     | 10 (9 + 1) |
          |         | http://www.a.com/product1  | View product1 |            | N
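A minimal sketch of this per-FQDN scoring, assuming the CPAN modules URI and Net::Whois::Raw (the talk does not say which whois implementation the tool actually uses):

```perl
use strict;
use warnings;
use URI;
use Net::Whois::Raw qw(whois);

my %score;   # FQDN => accumulated keyword score
my %hit;     # FQDN => 1 if the whois record matched the company

sub process_result {
    my ($url, $keyword_score, $company_re) = @_;
    my $fqdn = eval { URI->new($url)->host } or return;

    # Every search result found for this keyword adds the keyword's score.
    $score{$fqdn} += $keyword_score;

    # Perform whois only once per FQDN (the FQDN index deduplicates work).
    return if exists $hit{$fqdn};
    my $record = eval { whois($fqdn) } // '';
    $hit{$fqdn} = ($record =~ /$company_re/i) ? 1 : 0;
}

# Example: one result for the keyword "Company A" (score 9).
process_result('http://www.a.com/about', 9, qr/Company\s+A/);
```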

Second step: Perform Search (implementation notes)

- The search component was developed in Perl and is no more than 503 LOC :)
- Google and Bing are used as search engines.
- Results are stored in memory and written to disk at regular intervals.

Querying NIC and RIPE databases
- Depending on the database, the results have different layouts. Consider this when parsing the whois responses.
- If you query too fast, you get blocked. Hence you have to rotate the queried systems or throttle the querying speed.
- As the search results can easily comprise many thousands of systems, remember that even a small delay per query will delay the whole process by days.
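The rotation and throttling can be as simple as round-robin over several whois servers plus a short delay. A hedged sketch, again using Net::Whois::Raw; the server list and delay are illustrative assumptions:

```perl
use strict;
use warnings;
use Net::Whois::Raw qw(whois);
use Time::HiRes qw(sleep);

my @servers = ('whois.ripe.net', 'whois.arin.net', 'whois.apnic.net');
my $i = 0;

sub polite_whois {
    my ($query) = @_;
    my $server = $servers[$i++ % @servers];  # rotate the queried systems
    sleep 0.5;                               # throttle; tune to avoid blocks
    return eval { whois($query, $server) } // '';
}
```

Even 0.5 s per query adds up to hours over thousands of queries, which is exactly the delay the slide warns about.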

Second step: Perform Search (using search engines)

For Perl there are many modules for different search engines; however, most of them don't work. But curl and some regular expressions are fine to implement it yourself. Google covers pretty much everything, but Bing leads to some additional results.

Google
- The Google Custom Search API can be used.
- 1,000 queries per day are free.
- If you want to use the service efficiently, you need to buy a search key, which is charged by the number of search queries.
- For a normal search run this should come to ~$200.
- Consider different languages (!) as they will lead to diverse results, and disable the filter (!)

Bing
- A Bing API is available.
- An AppID is needed; it can be requested for free at the Bing Developer Center.
- The service is free of charge; however, the queries have to be throttled.
- Consider different markets (!) as they will lead to diverse results, and disable the filter (!)

Google: https://www.googleapis.com/customsearch/v1?q=[qry]&num=10&start=[pointer]&lr=[lang]&key=[...]&cx=[cx]&filter=0
Bing: http://api.search.live.net/xml.aspx?appid=[appid]&query=[qry]&sources=web&adult=off&market=[market]&web.count=50&web.offset=[pointer]
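A sketch of querying the Google endpoint shown above with plain LWP instead of curl. The key and cx values are placeholders, and the items/link/title fields follow the Custom Search JSON API; the 2011-era Bing endpoint above has long since been retired:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use URI::Escape qw(uri_escape);
use JSON qw(decode_json);

my ($key, $cx) = ('YOUR_API_KEY', 'YOUR_CX');   # placeholders
my $ua = LWP::UserAgent->new(timeout => 10);

# Returns a list of [link, title] pairs for one page of results.
sub google_search {
    my ($query, $start) = @_;
    my $url = 'https://www.googleapis.com/customsearch/v1'
            . '?q=' . uri_escape($query)
            . '&num=10&start=' . ($start // 1)
            . "&key=$key&cx=$cx&filter=0";      # filter=0: disable filtering
    my $res = $ua->get($url);
    return () unless $res->is_success;
    my $data = decode_json($res->decoded_content);
    return map { [ $_->{link}, $_->{title} ] } @{ $data->{items} // [] };
}

print "$_->[1]\t$_->[0]\n" for google_search('"Company A GmbH"');
```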

Second step: Crawl

Idea
- The search queries lead to a scored list of potential targets. The higher the score, the more probable it is that the web page is connected to the company in focus.
- Company web pages probably link to further company web pages.
- Web pages with a high score (e.g. 200) should therefore be crawled and checked for external links.

Crawling top-scored web pages
Select certain systems from the list (see third step) and feed them to a crawler. The crawler then identifies new external links on these web pages and hence discovers new web applications that might be of interest. These can be appended to the results list. Furthermore, the crawler can already perform the first step of the vulnerability scan.

Second step: Crawl (credits)

The vulnerability management framework (pita) mentioned during the talk hasn't been developed by me. It is the result of multiple years of development by various security consultants working at diverse companies and as freelancers! For Spider-Pig, just some modifications were made to its base components.

Crawler
- Written in Perl
- Identifies external links and adds them to the results list
- Performs the first step of the VA
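The external-link extraction can be sketched with HTML::LinkExtor from CPAN; the real crawler's internals are not public, so treat this purely as an illustration:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;

# Fetch a page and return links pointing to other hosts.
sub external_links {
    my ($page_url) = @_;
    my $ua  = LWP::UserAgent->new(timeout => 10);
    my $res = $ua->get($page_url);
    return () unless $res->is_success;

    my $base = URI->new($page_url)->host;
    my (%seen, @external);
    my $parser = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        return unless $tag eq 'a';
        my $href = $attr{href} or return;
        my $abs  = URI->new_abs($href, $page_url);
        return unless $abs->scheme && $abs->scheme =~ /^https?$/;
        push @external, "$abs"
            if $abs->host ne $base && !$seen{ $abs->host }++;
    });
    $parser->parse($res->decoded_content);
    return @external;   # candidate new web applications for the results list
}
```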

Third step: Calculate Results

Idea
- Now a scored list with all identified targets is available.
- If no filter is applied, the list contains hundreds of thousands of results and cannot be checked manually.
- Results with a low score (e.g. <10) are probably not relevant.
- Results with a very high score are probably irrelevant as well, as they refer to online shops, portals, etc.
- Hence only a small portion of the results has to be analyzed manually.

(Chart: score per web page; the interesting results sit in the middle band between the low-score noise and the very high-score portals.)
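The filtering itself is a one-liner once the scores exist. A sketch; the lower bound (<10) is from the talk, while the upper cutoff of 1000 is an assumed example:

```perl
use strict;
use warnings;

my ($LOW, $HIGH) = (10, 1000);   # HIGH is an assumption, see above

# Keep mid-band scores only, sorted from highest to lowest.
sub interesting {
    my %score = @_;              # FQDN => accumulated score
    return sort { $score{$b} <=> $score{$a} }
           grep { $score{$_} >= $LOW && $score{$_} <= $HIGH }
           keys %score;
}

my @review = interesting(
    'www.a.com'         => 210,   # plausible company page: keep
    'big-shop.example'  => 4200,  # online shop / portal: drop
    'unrelated.example' => 3,     # noise: drop
);
print "$_\n" for @review;         # only www.a.com remains for manual review
```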

Fourth step: Exploit

Idea
- Real targets have now been identified that are somehow connected to the company in focus.
- Information regarding the content, forms, etc. has been saved in the database during the crawling.
- This basis can be used for an automated vulnerability scan.

pita
Note (again): The vulnerability management framework (pita) mentioned during the talk hasn't been developed by me. It is the result of multiple years of development by various security consultants.

Based on the identified web applications and the data collected during the crawling (e.g. forms that can be submitted), the vulnerability management framework pita is started. pita runs:
- Fuzzer (XSS, SQL injection, ...)
- Web Application Fingerprinter (wafp)
- Cookie/session analyzer
- SSL tests
- 3rd-party tools (sqlmap, nmap, Nikto, ...)

At the end we get a report with all vulnerabilities, produced from nothing but keywords.
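pita itself is not public, but the 3rd-party part of such a pipeline can be imagined as plain tool invocations driven by the crawl data. A purely illustrative sketch (the sqlmap/Nikto options are real, but this is not pita's actual code):

```perl
use strict;
use warnings;

my @targets = ('http://www.a.com/about');   # would come from the crawl database

for my $url (@targets) {
    # sqlmap: --batch answers prompts automatically, --forms tests found forms
    system('sqlmap', '-u', $url, '--batch', '--forms');

    # Nikto scans the web server behind the URL's host
    my ($host) = $url =~ m{^https?://([^/:]+)};
    system('nikto', '-h', $host) if defined $host;
}
```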

Statistics

If we were to run Spider-Pig against a big, well-known company, we might get the following results:

Keywords used:            287
Duration (including VA):  7 days (it could be optimized :))
Costs:                    ~200 dollars, due to Google; without Google, 0 dollars
Unique FQDNs:             150,871
FQDNs after filter:       50,784 (after a filter for EU countries was applied)
Applications reviewed:    300
Confirmed applications:   223 (marketing campaigns, customer service, official web pages, ...)

Applications were hosted on external and internal systems. The official web pages appeared within the top 10! It works! :)

Vulnerabilities: ...

Conclusion

- Classical network reconnaissance doesn't take all aspects into account and hence does not identify all potential targets of a company in focus. Hence the development of Spider-Pig, a tool to model business connections.
- Spider-Pig makes it possible to identify web applications linked to a company through their business connections. Based on a list of keywords, without any technical details, Spider-Pig attacks companies. Besides assembling the list of keywords and manually reviewing the results, no human interaction is needed.
- Spider-Pig collects some interesting statistics/information and offers new perspectives.
- Companies should not only focus on protecting their own systems: an attacker will (most likely) take the easiest way in. Companies should choose their service providers with caution, and sensitive information should not be spread across multiple systems and business partners.

Thanks! Questions?

Contact me: fmihailowitsch [at] deloitte.de

Credits: Richard Sammet, who developed the idea & tool with me!