UC San Diego Electronic Theses and Dissertations


Title: Understanding URL Abuse for Profit
Author: Chachra, Neha
Peer reviewed | Thesis/dissertation

escholarship.org, powered by the California Digital Library, University of California

UNIVERSITY OF CALIFORNIA, SAN DIEGO

Understanding URL Abuse for Profit

A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Computer Science

by

Neha Chachra

Committee in charge:

Stefan Savage, Co-Chair
Geoffrey M. Voelker, Co-Chair
James H. Fowler
Kirill Levchenko
Lawrence K. Saul

2015

Copyright Neha Chachra, 2015. All rights reserved.

The Dissertation of Neha Chachra is approved and is acceptable in quality and form for publication on microfilm and electronically:

Co-Chair
Co-Chair

University of California, San Diego
2015

DEDICATION

To mom, for instilling a love of science in me.

TABLE OF CONTENTS

Signature Page
Dedication
Table of Contents
List of Figures
List of Tables
Acknowledgements
Vita
Abstract of the Dissertation

Chapter 1  Introduction

Chapter 2  Using Crawling to Study Large-Scale Fraud on the Web
    Introduction
    Related Work
    Architecture
        Selena
        Oliver-I
        Oliver-II
        Stallone
        Charlotte
    Browser Instrumentation using Custom Extensions
    Responding to Deterrence
    Summary
    Acknowledgements

Chapter 3  Characterizing Affiliate Marketing Abuse
    Introduction
    Background
    Methodology
        Identifying Affiliate URLs and Cookies
        User Study
        Crawling
        Browser Extension Analysis
    Results
        Networks Affected by Cookie-Stuffing
        Prevalence of Cookie-Stuffing Techniques
        Fraudulent Browser Extensions
        Prevalence of Affiliate Marketing
    Summary
    Acknowledgements

Chapter 4  Characterizing Domain Abuse and the Revenue Impact of Blacklisting
    Introduction
    Background
    Data Sets
        Authenticity and Ethics
        GlavMed and SpamIt
        URIBL
        Spam Feeds
    Domain Abuse
        Overall Observations
        Advertising Vectors
        Infrastructure Domains
        Purchased Traffic
    Blacklisting
        Blacklisting Speed
        Coverage
        Blacklisted Resource
        Blacklisting Penalty
    Discussion
        A Simple Revenue Model
        Changing Blacklisting Penalty
        Increasing Coverage
    Related Work
    Summary
    Acknowledgements

Chapter 5  Conclusion
    Dissertation Summary
    Future Directions and Final Thoughts

Bibliography

LIST OF FIGURES

Figure 2.1. System design for Oliver-I, the second version of the Web crawler, built in 2010.
Figure 2.2. System design for the proof-of-concept crawler, Charlotte, built in 2015.
Figure 3.1. Different actors and revenue flow in the affiliate marketing ecosystem. The left half of the figure depicts a potential customer receiving an affiliate cookie, while the right half shows the use of the affiliate cookie to determine payout upon a successful transaction.
Figure 3.2. Stuffed cookie distribution for the top 10 categories of impacted merchants.
Figure 4.1. Revenue from clicks on different kinds of referrers.
Figure 4.2. Spammers seamlessly switch from one free hosting site to another in the face of takedowns.
Figure 4.3. Revenue of domains before and after blacklisting. Note that the x-axis is non-linear.
Figure 4.4. The highest cost of a domain a spammer can afford (y-axis) against the time delay (x-axis) in blacklisting.

LIST OF TABLES

Table 2.1. Different versions of the Web crawler we built for studying fraudulent ecosystems.
Table 2.2. Some of the supported features for interacting with Web pages and the corresponding challenges we faced.
Table 3.1. Examples of affiliate URLs and cookies for different affiliate programs.
Table 3.2. Affiliate programs affected by cookie-stuffing and the distribution of cookie-stuffing techniques corresponding to the 12K stuffed cookies we detected.
Table 3.3. Affiliate programs that AffTracker users received cookies for.
Table 4.1. Classification of referrers used by SpamIt affiliates.
Table 4.2. Classification of referrers used by GlavMed affiliates.
Table 4.3. Example referrers for advertising vectors (email and Web search), infrastructure domains (free hosting, bulk, and compromised), and purchased traffic.
Table 4.4. Statistical differences between blacklisted and non-blacklisted domains.

ACKNOWLEDGEMENTS

I would like to thank my wonderful advisers, Stefan Savage and Geoff Voelker. They have been remarkable advisers and mentors throughout my doctoral program. Through their guidance, encouragement, and constructive feedback, they taught me how to do research and how to best present my work in speech and in writing. I am also grateful to them for encouraging my community development initiatives within CSE.

I am also thankful to Kirill Levchenko, who has not only been a supportive committee member, but also a wonderful friend and mentor. His astuteness and work ethic have been inspirational from the beginning of my doctoral program. I would also like to thank James H. Fowler and Lawrence K. Saul for being on my doctoral committee and being available whenever I needed any help.

My co-authors have been remarkable to work with, and I am very grateful for all their help. Different co-authors inspired me to learn different skills. I am especially grateful to Chris Grier, Alexandros Kapravelos, Christian Kreibich, Damon McCoy, and Vern Paxson. Julie Connor, Brian Kantor, Cindy Moore, and Jennifer Folkestad have also helped immeasurably with various administrative tasks, for which I am sincerely grateful.

I am also indebted to all my friends within CSE and outside who anchored me, guided me, inspired me, and cheered me on at my lowest points. In particular, I am grateful to Alexander Bakst, Lakshmi Balachandran, Karyn Benson, Dimple Chugani, Sabrina Feve, Olga Gromadzka, Tristan Halvorson, Aswath Krishnan, Brenna Lanton, Sohini Manna, Keaton Mowery, Valentin Robert, Zachary Tatlock, and Rosalydia Tamayo.

The CSE community at UC San Diego has been like a family over the last six years, and I am thankful to all the faculty and students here who make it such a special

department. Specifically, I would like to thank Marc Andrysco, Dimitar Bounov, Joe DeBlasio, Jessica Gross, Tristan Halvorson, Danny Huang, Ranjit Jhala, Lonnie Liu, Nadyne Nawar, David Kohlbrenner, Sorin Lerner, John McCullough, Sarah Meiklejohn, Marti Motoyama, Arjun Roy, Alex Snoeren, Cynthia Taylor, Panagiotis Vekris, Sravanthi Kota Venkata, Ganesh Venkatesh, Michael Vrable, David Wargo, and David Y. Wang, among many others who make this department so special. I would also like to express my gratitude to Steve Checkoway for being a good friend and for creating the dissertation template, which proved to be indispensable for writing my dissertation. I am also very grateful to Danny Huang for taking over the responsibilities of running the Learn from Peers talk series that I started. Finally, I am grateful to my mother, Praveen Chachra, and my brother, Ricky Chachra, for being unfailingly supportive and loving.

Chapter 2, in part, is a reprint of the material as it appears in Proceedings of the 4th USENIX Workshop on Cyber Security Experiment and Test (CSET). Chris Kanich, Neha Chachra, Damon McCoy, Chris Grier, David Y. Wang, Marti Motoyama, Kirill Levchenko, Stefan Savage, Geoffrey M. Voelker. The dissertation author was the primary investigator and author of this paper.

Chapter 3, in part, is a reprint of the material as it appears in Proceedings of the 2015 ACM Conference on Internet Measurement Conference (IMC). Neha Chachra, Stefan Savage, and Geoffrey M. Voelker. The dissertation author was the primary investigator and author of this paper.

Chapter 3, in part, is also a reprint of material as it appears in Proceedings of the 23rd USENIX Security Symposium. Alexandros Kapravelos, Chris Grier, Neha Chachra, Christopher Kruegel, Giovanni Vigna, and Vern Paxson. The dissertation author was the primary investigator and author of this paper.

Chapter 4, in full, is a reprint of the material as it appears in Proceedings of the Workshop on the Economics of Information Security (WEIS). Neha Chachra, Damon McCoy, Stefan Savage, Geoffrey M. Voelker. The dissertation author was the primary investigator and author of this paper.

VITA

2009  Bachelor of Science in Computer Science, University of California, San Diego
      Research Assistant, University of California, San Diego
2012  Master of Science in Computer Science, University of California, San Diego
2015  Doctor of Philosophy in Computer Science, University of California, San Diego

PUBLICATIONS

Affiliate Crookies: Characterizing Affiliate Marketing Abuse. Neha Chachra, Stefan Savage, and Geoffrey M. Voelker. In Proceedings of the 2015 ACM Conference on Internet Measurement Conference (IMC), October 2015.

Empirically Characterizing Domain Abuse and the Revenue Impact of Blacklisting. Neha Chachra, Damon McCoy, Stefan Savage, Geoffrey M. Voelker. In Proceedings of the Workshop on the Economics of Information Security (WEIS), June 2014.

Hulk: Eliciting Malicious Behavior in Browser Extensions. Alexandros Kapravelos, Chris Grier, Neha Chachra, Christopher Kruegel, Giovanni Vigna, and Vern Paxson. In Proceedings of the 23rd USENIX Security Symposium, August 2014.

Manufacturing Compromise: The Emergence of Exploit-as-a-Service. Chris Grier, Lucas Ballard, Juan Caballero, Neha Chachra, Christian J. Dietrich, Kirill Levchenko, Panayiotis Mavrommatis, Damon McCoy, Antonio Nappa, Andreas Pitsillidis, Niels Provos, M. Zubair Rafique, Moheeb Abu Rajab, Christian Rossow, Kurt Thomas, Vern Paxson, Stefan Savage, Geoffrey M. Voelker. In Proceedings of the 2012 ACM Conference on Computer and Communications Security (CCS), October 2012.

No Plan Survives Contact: Experience with Cybercrime Measurement. Chris Kanich, Neha Chachra, Damon McCoy, Chris Grier, David Y. Wang, Marti Motoyama, Kirill Levchenko, Stefan Savage, Geoffrey M. Voelker. In Proceedings of the 4th USENIX Workshop on Cyber Security Experiment and Test (CSET), August 2011.

Click Trajectories: End-to-End Analysis of the Spam Value Chain. Kirill Levchenko, Andreas Pitsillidis, Neha Chachra, Brandon Enright, Márk Félegyházi, Chris Grier, Tristan Halvorson, Chris Kanich, Christian Kreibich, He Liu, Damon McCoy, Nicholas Weaver, Vern Paxson, Stefan Savage, Geoffrey M. Voelker. In Proceedings of the IEEE Symposium on Security and Privacy, May 2011.

ABSTRACT OF THE DISSERTATION

Understanding URL Abuse for Profit

by

Neha Chachra

Doctor of Philosophy in Computer Science

University of California, San Diego, 2015

Professor Stefan Savage, Co-Chair
Professor Geoffrey M. Voelker, Co-Chair

Large-scale online scam campaigns pose a significant security threat to casual Internet users. Attackers simultaneously abuse millions of URLs to swindle visitors by selling counterfeit goods, by phishing to steal user credentials for various online services, or even by infecting user machines with malware. In this dissertation, I address the problem of studying these large-scale fraudulent ecosystems that heavily rely on URL abuse for profit. I demonstrate the feasibility of analyzing ground truth data at scale to derive valuable insights about the underlying business model, allowing me to assess the impact of different interventions on attacker revenue.

First, I address the challenge of collecting high-fidelity ground truth data under adversarial conditions. I describe the design of an efficient Web crawler that mimics real user activity to elicit fraudulent behavior from Web sites. I then use the crawler to detect affiliate marketing fraud on hundreds of Web sites. Fraudulent affiliates target merchants who outsource their affiliate programs to large affiliate networks to a much greater extent than merchants who run their own affiliate programs. Profit-oriented attackers seek to minimize costs to maximize profit. Therefore, the use of more sophisticated and expensive techniques against in-house affiliate programs suggests stricter policing by these programs.

Subsequently, I analyze the ground truth sales data for two major counterfeit pharmaceutical programs with total sales of $41M over three years. Attackers advertising via spam and black-hat search-engine optimization show different patterns of domain abuse to maximize profit under differing defensive pressures. To analyze the efficacy of intervention, I use concurrent blacklisting data and study the impact of blacklisting on spammer revenue. Blacklisting, the most popular intervention universally used against abusive URLs, is effective in limiting revenue from the specific URLs that are blacklisted. However, it does not undermine overall profitability, due to the very low cost of replacing domains, high consumer demand for counterfeit pharmaceuticals, logistical difficulties in rapid detection and universal deployment of blacklists, and the sophistication and ingenuity of attackers in the face of takedowns.

Chapter 1

Introduction

Almost every major online service today is plagued with spam content designed to lure visitors to attacker-controlled Web pages that are deployed to profit from unscrupulous activities. Examples of these activities include the sale of counterfeit pharmaceuticals and luxury goods, phishing to steal user credentials for banking sites and other online services, and infecting user machines with adware and other malware. Usually, scammers advertise their Web sites through URLs contained within spam messages [49, 96, 100], spam posts on social networks [32, 85], blogs [58], and almost any site supporting public content creation, or through black-hat search-engine optimization [48, 90]. Some of these fraudulent campaigns use botnets to rapidly create vast quantities of spam content all over the Web [70, 89].

Understanding the revenue model of attackers is critical for identifying effective intervention techniques to undermine profitability. The scale of abuse coupled with the presence of an intelligent adversary makes it infeasible to reason about attacker resources in isolation, thus creating a need for observational analysis through direct engagement with attacker infrastructure. In this thesis, I directly address the challenges inherent in measuring profit-driven abuse of URLs for different fraudulent activities. In particular, I demonstrate the feasibility of collecting and analyzing large-scale ground truth data to reason about the cost dynamics within fraudulent ecosystems, thereby placing us on a

strong footing to devise better interventions.

I describe the design of a custom Web crawler for collecting large-scale data about scam content hosted on attacker-controlled URLs. I implemented the crawler to closely mimic real user activity to elicit malicious behavior from scam sites by instrumenting a modern browser with sophisticated custom add-ons. Subsequently, I used the crawler to collect various features (e.g., screenshots, HTML content) from millions of abusive Web sites. Attackers usually respond to crawling with deterrence or aggression, giving rise to challenges in building and maintaining a defensive crawling infrastructure. I describe some of these challenges stemming from adversarial response, and also address concerns arising from the scale of measurement studies. Our key insights include instrumenting modern browsers to achieve verisimilitude, deploying evasive techniques such as using multiple diverse IP addresses for crawling, and continuously updating our crawling infrastructure in response to the inevitable evolution of attacker capabilities.

I then use ground truth data to analyze abuse in two separate profit-oriented fraudulent ecosystems. In the first study, I use large-scale data collected from crawling and a user study to explain the dynamics of affiliate marketing fraud (Chapter 3). Affiliate fraud garnered widespread media attention in 2014 with the imprisonment of Shawn Hogan [86], an eBay affiliate indicted for wire fraud of $15.5M [31]. There have been multiple similar legal disputes over affiliate marketing since then [25]. These reports motivated me to measure affiliate fraud on the Web. In particular, I crawl Web sites to detect cookie-stuffing, whereby a fraudulent affiliate causes the user's browser to receive a special cookie (i.e., an affiliate cookie) so that, when the user makes a purchase on a retailer site, the fraudulent affiliate earns a commission.
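The core detection idea can be sketched in a few lines: a crawler that records the Set-Cookie headers received on each page visit can flag affiliate cookies that arrive without any affiliate link having been clicked. The cookie names and patterns below are hypothetical placeholders, not the actual signatures used in Chapter 3.

```python
import re

# Hypothetical affiliate-cookie signatures; the actual study matched
# cookie names and formats cataloged per affiliate program (Table 3.1).
AFFILIATE_COOKIE_PATTERNS = {
    "ExampleNetwork": re.compile(r"^aff_id=\d+"),
    "ExampleMerchant": re.compile(r"^partner=\w+"),
}

def find_stuffed_cookies(visited_url, set_cookie_headers):
    """Flag affiliate cookies received on a page where the visitor never
    clicked an affiliate link -- the signature of cookie-stuffing."""
    hits = []
    for program, pattern in AFFILIATE_COOKIE_PATTERNS.items():
        for header in set_cookie_headers:
            if pattern.match(header):
                hits.append((program, visited_url))
    return hits

# A blog page that silently sets an affiliate cookie is suspicious:
flags = find_stuffed_cookies(
    "http://example-blog.invalid/post",
    ["aff_id=12345; Domain=.example-merchant.invalid", "session=abc123"],
)
print(flags)  # [('ExampleNetwork', 'http://example-blog.invalid/post')]
```

In practice the headers would come from the instrumented browser described in Chapter 2, which observes every response the page triggers, including those from scripts and hidden elements.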
I find that attackers target merchants outsourcing the operation of their affiliate programs to large affiliate networks much more than merchants who run their own affiliate programs. I measure the different

techniques deployed by affiliates to fraudulently earn commissions and observe attackers using more sophisticated and expensive techniques against in-house affiliate programs. Since fraudsters are motivated to minimize cost and maximize profit, the use of more expensive techniques against in-house affiliate programs suggests stricter policing by these programs. Using a concurrent dataset containing the source code of all publicly available browser extensions for Google Chrome, I also identify fraudulent browser extensions that perpetrate affiliate fraud by tampering with the Web sites visited by the extension users. Unlike Web sites perpetrating cookie-stuffing fraud, a browser extension provides much greater control to a fraudulent affiliate, who can ensure that an extension user has an affiliate cookie just before making a purchase, thereby guaranteeing a commission for the affiliate.

In the second case study, I determine the cost dynamics of domain abuse in the counterfeit pharmaceutical market using a publicly available ground truth dataset for two of the largest counterfeit pharmaceutical campaigns, with sales worth $41M over three years. While the efficacy of interventions such as domain blacklisting and takedowns is frequently debated [27, 38], I discern the actual revenue impact of defensive interventions on spammer profitability. By looking at the Referer fields for actual transactions on counterfeit Web sites, I characterize the abuse of domains for advertising counterfeit drugs in spam messages and in search-engine results. This dataset shows that attackers abuse different advertising vectors and online services simultaneously to maximize revenue (Chapter 4). Furthermore, spamming strategies are sensitive to the pressures of interventions and takedowns. Typically, spammers respond with ingenuity and agility to replace the URLs that are taken down and use strategies that mitigate the risk of detection.
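As an illustration of this Referer-based analysis, a classifier over transaction referrers might look like the sketch below. The host lists are invented placeholders; the real study's categories (advertising vectors, infrastructure domains, purchased traffic) come from the GlavMed and SpamIt transaction logs summarized in Tables 4.1 through 4.3.

```python
from collections import Counter
from urllib.parse import urlparse

# Placeholder host lists; the actual study derived its categories from
# the referrers observed in the counterfeit-pharmacy transaction data.
SEARCH_ENGINES = {"www.google.com", "www.bing.com", "search.yahoo.com"}
FREE_HOSTING = {"pages.example-freehost.invalid", "blog.example-freehost.invalid"}

def classify_referrer(referrer):
    """Map one HTTP Referer value to a coarse advertising vector."""
    if not referrer:
        return "none/direct"
    host = urlparse(referrer).netloc
    if host in SEARCH_ENGINES:
        return "web search"
    if host in FREE_HOSTING:
        return "free hosting"
    return "other"

def tally_vectors(referrers):
    """Aggregate the referrers of transaction records into vector counts."""
    return Counter(classify_referrer(r) for r in referrers)

counts = tally_vectors([
    "https://www.google.com/search?q=cheap+meds",
    "http://pages.example-freehost.invalid/pharma",
    "",
])
print(counts)  # Counter({'web search': 1, 'free hosting': 1, 'none/direct': 1})
```

Joining such per-transaction labels with sale amounts is what allows revenue to be attributed to each advertising vector, as in Figure 4.1.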
The ground truth data also suggests that domain blacklisting, a universally popular intervention used against abusive URLs, is only locally effective in limiting revenue from the specific domains that are blacklisted. At a macro level, the

inexpensive and replaceable nature of URLs, coupled with strong consumer demand for counterfeit drugs, ensures that blacklisting is merely a small cost of doing business and does not undermine the overall profitability of online counterfeit pharmacies.

While I describe only two case studies for understanding the abuse of URLs and the efficacy of interventions, the general approach of collecting large-scale data through crawling and analyzing it to infer attacker costs and revenue is applicable to studying any other adversarial campaign reliant on domain abuse at scale.

This dissertation is structured as follows. Chapter 2 describes the design of the efficient, high-fidelity crawler we use for measuring fraudulent ecosystems. Chapter 3 provides an analysis of abuse in the affiliate marketing ecosystem. Chapter 4 describes domain abuse in the counterfeit pharmaceutical market and the impact of domain blacklisting on attacker revenue. Finally, Chapter 5 summarizes this dissertation and provides some ideas for future directions.
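The intuition that blacklisting is "a small cost of business" can be made concrete with a break-even calculation in the spirit of Chapter 4's simple revenue model: a domain is worth buying as long as the revenue it earns before (and, at a reduced rate, after) being blacklisted exceeds its price. All numbers below are hypothetical, chosen only to show the shape of the computation, not taken from the dataset.

```python
def max_affordable_domain_cost(revenue_per_hour, blacklist_delay_hours,
                               post_blacklist_fraction=0.1,
                               residual_hours=24.0):
    """Highest price a spammer can pay for a domain and still break even.

    revenue_per_hour: sales attributed to the domain while unlisted.
    blacklist_delay_hours: time until the domain lands on a blacklist.
    post_blacklist_fraction: share of revenue surviving blacklisting
        (blacklists are not universally deployed, so some traffic remains).
    residual_hours: how long that residual revenue keeps flowing.
    All parameter values here are hypothetical.
    """
    pre = revenue_per_hour * blacklist_delay_hours
    post = revenue_per_hour * post_blacklist_fraction * residual_hours
    return pre + post

# Faster blacklisting shrinks the price a spammer can afford to pay:
slow = max_affordable_domain_cost(revenue_per_hour=5.0, blacklist_delay_hours=48)
fast = max_affordable_domain_cost(revenue_per_hour=5.0, blacklist_delay_hours=2)
print(slow, fast)  # 252.0 22.0
```

This is the relationship plotted in Figure 4.4: when domains cost only a few dollars and blacklisting takes hours or days, even aggressive blacklisting leaves the attacker comfortably profitable.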

Chapter 2

Using Crawling to Study Large-Scale Fraud on the Web

In this chapter, we describe the design of a Web crawler to gather large-scale data for studying fraudulent ecosystems on the Internet. We synthesize our experiences and the lessons we learned from building and maintaining crawling infrastructure used to visit millions of Web sites. We describe the operational challenges to achieving verisimilitude (obtaining measurements that capture real behaviors), many of which arise from the adversarial nature of the measurement process. Since 2010, we have also iterated on the crawler infrastructure to accommodate the storage needs of the ever-growing scale of collected data, to support new functionality needed by other members of our group for various projects (e.g., the ability to set user-agents to study cloaking [90]), and to take advantage of the latest technology available to us. We discuss the challenges we overcame and provide valuable insight into building crawlers that engage an adversary.

2.1 Introduction

Scammers often attract customers to their Web sites through large-scale advertising via spam messages or spam comments on blogs, social networks, etc., or

through black-hat advertising and search-engine optimization. Irrespective of the advertising vector used, the end result is the same: scammers abuse URLs to profit from unscrupulous activities. To discover new avenues for intervention, security researchers often perform observational, data-driven studies to reason about attacker resources and the underlying business models. Sometimes researchers can opportunistically benefit from publicly available data (Chapter 4), but generally we have to gather data about illicit ecosystems ourselves.

A natural approach for collecting high-quality data about scammer infrastructure is to directly engage with it as a victim would. For example, using a tool to automatically visit spam-advertised URLs in a browser engages the network infrastructure of an attacker in a manner very similar to real users. Furthermore, when studying large-scale campaigns that abuse millions of domains, it is necessary to use an automated tool that can engage a sufficiently large segment of the ecosystem to identify major actors, common payment mechanisms, and other resources. Thus, we developed an efficient Web crawler to study scams heavily reliant on domain abuse.

At its core, a Web crawler can use any tool for making Web requests. However, not all tools are created equal. Simple tools such as wget are inadequate for comprehensively studying fraudulent online markets because they do not emulate a real user well and, thus, do not always elicit malicious behavior from Web sites. For example, in Chapter 3, we discovered many fraudulent Web sites that use simple JavaScript to create hidden DOM elements for affiliate fraud. Such behavior is completely opaque to simple tools that do not execute scripts. The challenges of mimicking a real user with high fidelity and of eliciting the relevant malicious behavior from Web sites motivated us to use a modern browser in our crawler.
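Once an instrumented browser has executed a page's scripts and captured the rendered DOM, spotting such hidden elements is straightforward. The sketch below shows one illustrative heuristic (the marker list is invented, not the chapter's actual detector) run over serialized DOM content; note that it only works on the post-JavaScript DOM, since a raw wget fetch would never contain a script-injected iframe.

```python
import re

# Illustrative invisibility markers; the real detector in Chapter 3
# used criteria derived from observed cookie-stuffing pages.
HIDDEN_MARKERS = ('display:none', 'visibility:hidden',
                  'width="1"', 'height="1"', 'width="0"', 'height="0"')

def find_hidden_iframes(rendered_dom):
    """Return iframe tags that appear deliberately invisible in the
    serialized post-JavaScript DOM -- the pattern fraudulent affiliates
    use to load merchant URLs without the visitor's knowledge."""
    hits = []
    for tag in re.findall(r"<iframe[^>]*>", rendered_dom, flags=re.IGNORECASE):
        normalized = tag.lower().replace(" ", "")
        if any(marker in normalized for marker in HIDDEN_MARKERS):
            hits.append(tag)
    return hits

# The merchant URL and affiliate parameter here are hypothetical:
dom = ('<p>Great product reviews!</p>'
       '<iframe src="http://merchant.invalid/?aff=42" width="1" height="1">'
       '</iframe>')
print(len(find_hidden_iframes(dom)))  # 1
```

Running the same check over the raw, unexecuted HTML of such a page would find nothing, which is precisely why script execution in a real browser matters.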
Over time, we instrumented two popular browsers, Firefox and Google Chrome, with add-ons to emulate real user activity while crawling.

Besides emulating a real user, we also faced unexpected challenges stemming from the adversarial nature of our studies. In any adversarial system, every actor plays the dual roles of attacker and defender. We learned that scammers have become adept at evading defensive crawlers like ours, and respond with deterrence or aggression. Thus, we were forced to build evasive capabilities into our crawlers as well. Broadly, we evade detection by intelligently rate-limiting the number of visits per domain and by diversifying the range of IP addresses we use.

Over the years, we learned how to build an efficient Web crawler that can collect data at scale in an adversarial system. In this chapter, we describe the building blocks needed to construct a Web crawler like ours and the commonly desired features that we implemented. Furthermore, we explain some important rules of thumb for future architects of such crawlers.

This chapter is organized in the following manner. Section 2.3 describes the evolution of our crawler's architecture to accommodate the needs of different studies, such as crawling millions of spam-advertised domains [49] and crawling to detect affiliate fraud (Chapter 3). Section 2.4 describes the different capabilities we built into the custom browser extensions to extract various features from the visited Web pages. Next, in Section 2.5, we discuss the evolution of our crawling in response to detection by scammers. Finally, Section 2.6 summarizes our key takeaways from building and maintaining crawlers to study adversarial ecosystems.

2.2 Related Work

Crawling is a common approach for data collection on the Web. Perhaps the most popular publicly available crawling dataset is the Common Crawl [2] corpus. The 149 TB of Common Crawl data from August 2015 contains Web content from 1.84 billion Web pages along with the HTTP headers for all requests. However, the Common Crawl

data is unsuitable for studying fraudulent ecosystems because the Common Crawl crawler is easily detectable: it uses a special User-Agent field set to CCBot. Thus, an attacker can detect and evade the Common Crawl crawler simply by inspecting the User-Agent header in the HTTP request. Furthermore, even if an attacker fails to actively evade the Common Crawl crawler, the data corpus is still incomplete for studying scam campaigns because Common Crawl actively attempts to avoid spam and other low-quality URLs while crawling [97]. In fact, we manually verified that the Common Crawl corpus did not contain the fraudulent URLs we discovered while studying affiliate fraud (Chapter 3).

Crawling attacker-controlled URLs, such as those advertised in spam, is the most natural way to engage the infrastructure that directly underlies the attacker business model, and has long been a standard technique among security groups in both academia and industry. Academics have used crawling for a variety of measurement problems, including the network characteristics of spam relays [70], Web hosting [45], phishing sites [59], blacklist effectiveness [77], spamming botnets [39, 96], and fast-flux networks [39], to name a few. While most of these studies focus on the inferences derived from the collected data, we describe the challenges encountered in building and maintaining crawler infrastructure.

2.3 Architecture

As late as 2007, researchers could successfully rely on command-line tools (e.g., curl) to request Web content from attacker-controlled servers [8]. However, as Web environments have grown increasingly complex, attackers have adopted more sophisticated techniques to lead customers to their Web sites while simultaneously avoiding automated crawlers. Along with all popular Web sites, scam sites have also adopted scripting languages (e.g., JavaScript, ActionScript) to dynamically generate content on the fly, sometimes conditionally based on user actions. For example, while

studying affiliate fraud on the Web (Section 3.3.3), we observed that the fraudulent affiliate in control of bestwordpressthemes.com caused the user's browser to fetch a fraudulent affiliate cookie only when another specific cookie was not already present on the user's browser. Under these circumstances, a simple command-line tool cannot fully trigger the malicious functionality of a Web site.

Broadly, crawling today requires using a popular browser to ensure fidelity because browsers are extensible platforms: carefully constructed add-ons can make them act more like real users. In our case, we instrumented two popular browsers, Google Chrome [33] and Firefox [62], using custom extensions. While using a modern browser has several advantages, crawling URLs using a full browser greatly increases the CPU and memory resources required; connection timeouts from blacklisting and unreachable domains further tie up resources, preventing quick recycling (although some of this can be addressed through per-machine parallelism and configuring kernels to provision more network resources). Thus, for most large-scale projects it is a requirement to dedicate a cluster of machines to comprehensively study an ecosystem. Consequently, we designed the Web crawler to run efficiently on a cluster.

As with any large software system, we iterated over the crawler design several times to implement new features and to address the limitations of the existing design.

Table 2.1. Different versions of the Web crawler we built for studying fraudulent ecosystems.

Crawler     Year   Salient Features
Selena      2009   Based on the Selenium Web testing framework
Oliver-I    2010   Custom built; uses Firefox; tightly coupled with a Postgres database
Oliver-II   2012   Decoupled from the database; uses the Hadoop file storage system
Stallone    2012   Stand-alone version of Oliver-II
Charlotte   2015   Uses Google Chrome; offers flexibility in data storage
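The conditional behavior observed on bestwordpressthemes.com suggests one way a crawler can surface cookie-conditioned logic: visit the same URL twice, once with an empty cookie jar and once carrying the suspected marker cookie, and compare what the server sets. The sketch below simulates that idea end to end; the page model and cookie names are invented for illustration and are not the site's actual implementation.

```python
def simulated_scam_page(request_cookies):
    """Toy model of the conditional behavior described above: the page
    sets a fraudulent affiliate cookie only when its own marker cookie
    is absent, so repeat visitors are not re-stuffed. Cookie names are
    hypothetical."""
    if "stuffed_marker" in request_cookies:
        return []  # marker present: set nothing, stay inconspicuous
    return ["stuffed_marker=1", "aff_id=12345"]

def differential_probe(page):
    """Visit once with an empty cookie jar and once with the marker
    cookie; a difference in Set-Cookie behavior reveals logic that a
    single visit (or a script-blind tool) would miss."""
    fresh_visit = page(set())
    repeat_visit = page({"stuffed_marker"})
    return fresh_visit != repeat_visit

print(differential_probe(simulated_scam_page))  # True
```

A real probe would drive two isolated browser profiles through the instrumented crawler rather than call a function, but the comparison logic is the same.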
Table 2.1 shows the different versions of the crawler that we built over the years. In this

section, we describe the system design and the key changes we made in each iteration of the crawler.

2.3.1 Selena

In 2009, we used Selenium, a Web testing framework, to create our first crawler, called Selena, which ran on a cluster of 15 nodes. Specifically, we used Selenium-Grid [72], which allowed running multiple browser instances concurrently on a cluster and automatically requesting features (e.g., HTML content) from the visited pages. However, since Selenium was developed as a framework for testing Web applications under the programmer's control, we faced challenges in using it for crawling Web sites under the attacker's control. Because Selenium was designed for testing Web applications on many different browsers, the resulting crawler was too complex and inefficient for our needs, making maintenance difficult. For example, implementing our own code for taking a screenshot of a Web page proved very complex. Thus, we abandoned Selenium in favor of building our own crawling infrastructure from scratch.

2.3.2 Oliver-I

We built the next version of the crawler, called Oliver-I, in 2010. Figure 2.1 shows the overall design of Oliver-I. At its core, it uses the Firefox browser instrumented with a custom extension for visiting Web sites. For storing the workload of URLs to be crawled and the results from Web visits, we use Postgres, a relational database. Oliver-I runs on a cluster of 30 nodes. To fully utilize a node, we run multiple instances of Firefox on each node. A node-controller process manages the interaction between the machines and the database. It interacts with the browsers using thread-controllers. Each thread-controller manages a single instance of Firefox.

The crawler operates as follows. First, the node-controller retrieves URLs from

the crawl queue in the database and stores them in a local queue shared between all of the thread-controllers. A thread-controller for a browser pops URLs from the local queue for visiting. The thread-controllers are responsible for requesting various features about the Web visit (e.g., page screenshots) using API calls exposed by the custom extension running in the browser. The thread-controller combines all the features from a visit and puts them in a local result queue, again shared between all of the thread-controllers. The node-controller empties the result queue by performing batch insertions into the database. This design is replicated on each of the 30 machines across the cluster. To ensure maximum hardware utilization, we run 40 instances of Firefox concurrently on each machine.

Figure 2.1. System design for Oliver-I, the second version of the Web crawler, built in 2010.

Besides emulating a real user, performing a study relying on Web crawling requires the cluster to be robust to failure conditions in order to crawl in steady state for a period of time (several months in our case). To tolerate intermittent network or server issues, Oliver-I makes multiple attempts to visit a URL before deciding it is not valid. To prevent stalled connections from idling a browser instance indefinitely, it times out long

page loads after multiple minutes. It also detects browser failures (e.g., a hung process) with heartbeat requests from the thread-controller every 15 seconds; if a browser does not respond, the thread-controller restarts it. To ameliorate the effects of any malware infections, memory leaks, or other resource leaks, we reboot the crawler and its browsers on every machine in the crawling cluster every 24 hours. We used this crawler to visit millions of spam-advertised domains [49]. During busy days of crawling spam feed corpuses, we crawled over 600K URLs/day, with brief bursts corresponding to peak rates of 2M URLs/day.

2.3.3 Oliver-II

We visited millions of spam-advertised URLs using Oliver-I steadily over a few months [49]. Recognizing the value of crawling to gather large-scale data, we allowed other members of our research group to use the crawler as well. Even though we built several of the features described in Section 2.4, our original system design was inherently limited in catering to the needs of different projects and to the scale of measurement studies. Thus, we iterated over the original system design to create Oliver-II. The new design offers flexibility for the needs of different projects and is much more scalable. However, we only focused on improving the overall system design and left the core browser instrumentation intact from Oliver-I. Thus, Oliver-II has the same capabilities for interacting with Web pages as Oliver-I, and it handles hung browsers and page errors in the same manner as well. Originally, we only needed a specific set of features from every Web visit to a spam-advertised URL, which we stored in a Postgres database. Like any other relational database, Postgres provides guarantees of consistency, atomicity, and durability. The collected data was, therefore, consistent and easy to query and analyze. However, as the scale of the data grew, we reached the maximum write-through rate to the database, and

adding more nodes to the cluster had no effect on the rate of crawling. Thus, we had to revisit our system design, and we decoupled the database from the crawler. We adopted the Hadoop file storage system, which allows storing data in files that are replicated across a cluster for redundancy. Another change we made from Oliver-I was to allow users to request only the features necessary for their studies. For example, while we collected the DOM content, screenshots, HTTP headers, embedded page components, etc., for studying spam-advertised domains, we only needed the Set-Cookie headers and the DOM content of a select few elements to study affiliate fraud (Chapter 3). Allowing users to request only specific features from their crawl jobs allows us to use the file system storage more efficiently. Furthermore, decoupling from the relational database in favor of a loosely structured file storage system enables us to implement multiple new features (e.g., the ability to execute arbitrary JavaScript) without being constrained to a specific database schema.

2.3.4 Stallone

We recognized that some users wanted to use a crawler but did not need the resources of an entire cluster. Therefore, we created a stand-alone version of Oliver-II called Stallone, and made it publicly available. By default, Stallone stores the crawling results on the local file system.

2.3.5 Charlotte

As mentioned before, when we redesigned the crawler to create Oliver-II, we did not modify the core browser instrumentation. Thus, Oliver-II uses the same browser extension as Oliver-I. Eventually, as with any software engineering system, the legacy code from 2010 became outdated and limited the performance of the crawler. When we

needed crawling to study affiliate marketing abuse, we decided to experiment with newer technology to implement a prototype by instrumenting Google Chrome. In 2010, Firefox was the only browser that offered sufficient functionality to browser extensions, but by 2015, Google Chrome had become a major player in the browser extension ecosystem. Thus, we created a new prototype for the crawler, called Charlotte, using Google Chrome as the browser and a much simpler system architecture. Charlotte is a smaller, proof-of-concept Web crawler with only four nodes. The system design for this crawler is shown in Figure 2.2. As with Firefox, we again instrumented Google Chrome using a custom extension. We used Charlotte for visiting approximately 500K domains to detect affiliate cookies from Web visits (Chapter 3).

Figure 2.2. System design for the proof-of-concept crawler, Charlotte, built in 2015.

The new design consists of two queues on a fast key-value store called Redis. Redis provides atomic push and pop actions for queues, and thus supports concurrent requests. The extension interacts with the Redis database over HTTP using asynchronous requests through an off-the-shelf server, Webdis, which serializes HTTP requests into Redis requests.
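Charlotte's two-queue flow can be sketched compactly. The sketch below stands in for Redis with a locked in-memory queue (Redis's LPUSH/RPOP provide the same atomicity over the network); the job format, the fetch callback, and all names are illustrative assumptions rather than Charlotte's actual schema:

```python
import json
import threading
from collections import deque

class MiniQueue:
    """In-memory stand-in for a Redis list: push and pop are atomic."""
    def __init__(self):
        self._items = deque()
        self._lock = threading.Lock()

    def push(self, item):          # analogous to LPUSH
        with self._lock:
            self._items.appendleft(item)

    def pop(self):                 # analogous to RPOP; None when empty
        with self._lock:
            return self._items.pop() if self._items else None

def crawl_worker(crawl_q, result_q, fetch):
    """Emulates the extension's loop: grab a job (URL plus requested
    features), visit it, and push a serialized result object."""
    while True:
        job = crawl_q.pop()
        if job is None:
            break
        task = json.loads(job)                 # {"url": ..., "features": [...]}
        page = fetch(task["url"])              # stand-in for the browser visit
        result = {"url": task["url"],
                  "features": {f: page.get(f) for f in task["features"]}}
        result_q.push(json.dumps(result))      # serialized, like Charlotte's JS object
```

A separate drain process, like the one used in Chapter 3, would simply pop from the result queue and batch-insert into whatever store the study requires.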

Once the node-controller starts the browser instances on a node, the browser extension starts crawling. It grabs a URL from the Redis crawl queue (along with a list of features to extract from the URL visit), visits the URL, and saves the visit features in a JavaScript object. This object is serialized and saved in the result queue on Redis. For the study in Chapter 3, a separate process emptied the Redis result queue into a Postgres database for ease of analysis. However, our storage schema is not coupled with crawling, and one could equivalently transfer the data from the result queue onto a file system or a different relational database.

The key differences between the architectures in Figure 2.1 and Figure 2.2 are the absence of thread-controllers and the simplification of the node-controller. While the node-controller in Figure 2.1 manages 30 different thread-controllers and interacts with the Postgres database, the node-controller in Figure 2.2 simply observes the machine utilization to control the number of running browser instances for maximum hardware utilization. Beyond starting and stopping browser instances and ensuring that all the required processes are running, the node-controller performs no other task. Furthermore, since the extension itself can intelligently visit URLs and save all the features directly, we do not need a separate thread-controller to individually request different Web visit features by making API calls to the browser extension. Charlotte handles failure conditions in the same manner as Oliver-I. Specifically, it restarts hung browsers and times out every page load after three minutes.

Overall, we found it much simpler to rewrite a crawler after maintaining existing crawlers for multiple years. While one can attribute the ease and expedience of creating a new crawler to our experience, the improved API available in browsers and newly available off-the-shelf tools such as Redis certainly proved beneficial in simplifying the crawler architecture, thereby making it easier to maintain.
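The node-controller's utilization-driven scaling can be illustrated with a small decision function. The thresholds, step size, and instance cap below are invented for illustration and are not the values Charlotte used:

```python
def target_instances(current, cpu_util, mem_util,
                     low=0.60, high=0.85, max_instances=40):
    """Decide how many browser instances a node should run next.

    Scales up one instance at a time while the machine is underutilized,
    and scales down when either CPU or memory is nearly saturated.
    The thresholds here are illustrative, not Charlotte's actual values.
    """
    busy = max(cpu_util, mem_util)
    if busy > high and current > 1:
        return current - 1          # back off: machine saturated
    if busy < low and current < max_instances:
        return current + 1          # room to spare: start another browser
    return current                  # utilization in the sweet spot
```

Moving one step at a time keeps the controller simple and avoids oscillation, which matters when each step means starting or killing a whole browser process.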

Table 2.2. Some of the supported features for interacting with Web pages and the corresponding challenges we faced.

    Feature                          Challenge
    Issuing clicks                   Pop-ups requiring clicks to proceed
    Setting custom headers           Web site cloaking
    Reconstructing redirect chains   Limited extension developer API in 2010

2.4 Browser Instrumentation using Custom Extensions

When studying a fraudulent online marketplace, a researcher might desire a variety of features about the attacker-controlled Web pages. While some features (e.g., the HTML content of a page) might be useful for studying almost any fraudulent ecosystem, other features might be relevant only in specific use cases (e.g., HTTP cookies). Over the course of a few years, we built and deployed a variety of features required for different projects. In some cases, we also had to build features primarily as a response to the adversarial nature of the studies. We discuss some of the capabilities we built into the different versions of the crawler below, and the corresponding challenges we faced (Table 2.2). All of the functionality described below is implemented within the custom extensions used for instrumenting browsers.

Issuing clicks. While crawling spam-advertised URLs [49], we noticed that spammed sites increasingly used more sophisticated redirection techniques designed to trick users, but also to make crawling more difficult. In particular, sites use JavaScript to present pop-ups to users that require a mouse click event to proceed to the final landing page, and use image overlays on the page to the same effect. Thus, we added the ability to issue clicks in our browser extension such that it detects and clicks on pop-ups and images to trigger these sophisticated redirects. While it is simple to issue clicks using a browser extension in both Google

Chrome and Firefox browsers, the challenge is to stay current with the different techniques scammers deploy so that one can add the appropriate capabilities to the crawler.

Setting custom request headers. To crawl URLs in search results to explore Web site cloaking and black-hat search-engine optimization (SEO) activity in [90], we observed the need for yet more sophisticated mimicry to emulate real users. In particular, crawling a cloaked page returns different results depending on the HTTP Referer and User-Agent fields. Sites decide whether a request comes from the result of a search based upon the contents of the Referer, cloaking the contents otherwise. Sites further return different content depending on the operating system specified in the User-Agent string (e.g., a scam site will sell fake anti-virus software to Windows-based visitors and offer an iPod scam to Mac-based visitors). A crawling system for such URLs therefore requires the further ability to parameterize specific HTTP fields for each URL crawled. Thus, we added the ability to modify outgoing HTTP request headers in our custom extensions.

Redirections. While crawling spam-advertised domains, we observed significant use of URL redirection to bring customers to storefront sites. Generally, visiting a spammed URL results in one or more redirects before finally landing the user on a storefront Web site where the user can buy goods such as counterfeit pharmaceuticals. Often, the spammed URLs are hosted on abused free hosting domains (e.g., imgur.com) or cheap bulk-purchased domains (Section 4.4.3), because free hosting and bulk-purchased domains impose very low cost on spammers when the URLs are blacklisted or taken down. Even when the redirects are simple 301 or 302 HTTP redirects, in 2010 we found it challenging to reconstruct the redirect chains of URLs from the visited URL to the final landing page. Most of the challenge arose from the lack of appropriate API calls

exposed to Firefox extension developers, and from the large number of embedded page components (e.g., advertisements) that often result in hundreds of network requests for every visited page. Fortunately, when we updated our crawler in 2015, we found that the browser extension API had become much richer over the years; as a result, we found it straightforward to reconstruct the redirect chains for URLs requesting fraudulent affiliate cookies (Chapter 3).

Querying specific DOM elements. For some projects, we need to analyze specific DOM elements in a Web page. For example, while studying affiliate fraud, we wanted to analyze whether a DOM element associated with a specific Web request was visible to the end user. Extracting DOM elements by processing raw HTML and HTTP headers, as collected using simple tools like wget, is extremely challenging. However, modern browsers are complex pieces of software built to gracefully handle the vast majority of programming errors when parsing page content into a structured DOM tree. Furthermore, browsers expose multiple functions to extension developers for querying specific page elements and their style properties. Thus, a modern crawler can effectively delegate all the actions needed to gather data to a browser instrumented with a custom extension, thereby significantly reducing the workload of post-processing scripts. In fact, in Chapter 3, the browser extension installed on Google Chrome performs all of the actions we needed for studying affiliate fraud on the Web. The custom extension causes the browser to automatically visit a page; it then analyzes the incoming HTTP Set-Cookie headers and parses the styling properties of the DOM elements corresponding to the cookie requests.

Executing arbitrary scripts on demand. Over time, as described in Section 2.3, we exposed the crawler to all of the members of our research group. One commonly requested

feature was the ability to extract context-specific information from pages (e.g., all frames containing advertisements). Instead of modifying the custom extension for every feature request, we allowed injection of arbitrary custom JavaScript (through eval) into the visited Web pages as requested by the crawler user.

Besides the features listed above, we also implemented a few features in the crawler that we do not discuss here because we did not face any significant challenges in their implementation. Some examples include the ability to take a screenshot of a Web page, to store the server IP address, to store the contents of all embedded components such as included CSS or JavaScript files, and to save HTTP request and response headers. All of these features are again implemented in the custom browser extensions for both Google Chrome and Firefox.

2.5 Responding to Deterrence

In an adversarial environment, both the attacker and the defender attempt to evade detection by the opponent. While studying spammed URLs [49], we observed that spam sites blacklisted IP addresses we used for crawling. (In our related activities monitoring underground forums, and through collaborations with similarly focused researchers, we have found a range of blacklist firewall configurations designed to specifically block traffic from various security groups, including our own. This blacklisting includes both individual IP addresses and entire address ranges, /24 and larger, associated with particular security organizations.) To counter blacklisting, in every version of the crawler starting with Oliver-I, we have used a combination of prevention and detection. To avoid being blacklisted, we tunnel HTTP requests through proxies running in multiple disparate IP address ranges, using various cloud hosting and IP address reselling services, as well as address blocks loaned to us by individuals and via experimental allocations from the Regional Internet Registries. We then randomize HTTP requests across the address ranges to minimize the footprint of any single IP address for any given site. Blacklisting manifests either as DNS

errors (the name server is also commonly an element of scam infrastructure), 5xx HTTP error codes, or connection timeouts. We detect blacklisting by monitoring the rates of such errors and reacting when short-term rates well exceed long-term rates. In response, we retry requests using a different IP address range. Once again, the lessons we learned in 2010 proved useful in 2015 when we created Charlotte. While studying affiliate fraud (Chapter 3), we learned of a high-profile case of a fraudulent affiliate, Shawn Hogan, indicted for wire fraud of $15.5M against eBay's affiliate program [31]. Shawn Hogan only perpetrated affiliate fraud once per IP [16], again strongly suggesting the need for a diverse IP range for studying fraud.

Besides blacklisting crawler IPs, we also observed more aggressive actions from spammers, such as an implicit DDoS on crawlers via spam poisoning. In particular, the Rustock bot started emitting large amounts of spam containing URLs with random .com domains (literally millions of both real and unregistered domains, none of which was truly being advertised [14]). The purpose of this campaign appears to be both poisoning blacklisting services with large numbers of false positives and overwhelming crawlers such as ours with timeouts and diverse useless page loads. When this behavior started in September of 2010, we were able to manually identify some lexical patterns used across most of these URLs and tried to filter them out using regular expressions. This approach was ultimately unsuccessful as the operators of Rustock changed their poisoning code to become ever more random. To address this issue we added state to our crawler and, instead of blindly crawling all URLs, use a method that tracks the appearance of individual registered domains over time. Thus, Oliver-I schedules crawls based on how frequently a registered domain has been seen. This approach prioritizes new domains and minimizes the overhead and blacklisting risk of re-crawling the same domain many times, while not crawling the millions of domains that are only ever seen once.
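A minimal sketch of such frequency-based scheduling follows, assuming a back-off policy in which a domain is crawled on its 2nd, 4th, 8th, ... sighting; the exact policy is an illustrative guess, not Oliver-I's actual schedule:

```python
class DomainScheduler:
    """Schedule crawls by how often a registered domain has been seen.

    A domain is crawled on its 2nd, 4th, 8th, ... sighting (powers of
    two), so the millions of poisoned domains that appear only once are
    never crawled, while persistently advertised domains are revisited
    with exponentially decreasing frequency. This back-off policy is an
    illustrative assumption, not Oliver-I's actual schedule.
    """

    def __init__(self):
        self.sightings = {}   # registered domain -> times seen in the feed

    def should_crawl(self, domain):
        n = self.sightings.get(domain, 0) + 1
        self.sightings[domain] = n
        return n > 1 and (n & (n - 1)) == 0   # n is a power of two, n >= 2
```

For a domain that appears 100 times in a feed, this schedule issues only six crawls; a poisoned one-off domain costs nothing at all.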

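The blacklist-detection rule described earlier in this section (react when short-term error rates well exceed long-term rates) can be sketched with two exponentially weighted moving averages; the smoothing factors, trigger ratio, and rate floor below are illustrative assumptions, not the crawler's actual parameters:

```python
class BlacklistDetector:
    """Flags likely blacklisting when the recent error rate spikes
    well above the historical baseline. Each observation is True for
    an error (DNS failure, 5xx, connection timeout) and False for a
    successful fetch."""

    def __init__(self, fast=0.2, slow=0.01, ratio=3.0, floor=0.3):
        self.fast_rate = 0.0   # short-term EWMA of the error rate
        self.slow_rate = 0.0   # long-term EWMA of the error rate
        self.fast, self.slow = fast, slow
        self.ratio, self.floor = ratio, floor

    def observe(self, error):
        x = 1.0 if error else 0.0
        self.fast_rate += self.fast * (x - self.fast_rate)
        self.slow_rate += self.slow * (x - self.slow_rate)
        # Trigger only once the short-term rate is both non-trivial
        # and a multiple of the long-term baseline, so a single
        # transient error never causes an IP-range switch.
        return (self.fast_rate > self.floor and
                self.fast_rate > self.ratio * max(self.slow_rate, 0.01))
```

When the detector fires, the crawler would retry the affected requests from a different IP address range, as described above.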
2.6 Summary

Data collected from Web crawling can provide valuable insight into underground activities on the Internet. Attackers increasingly make use of techniques to treat real users and automated crawlers differently. Thus, crawlers today need to be sophisticated enough to mimic a real user in order to gather useful data. We built such a crawler by instrumenting a modern browser using a custom extension, and used it across a series of projects. The crawler can collect a variety of features, including the HTML content of a page, scripts and stylesheets embedded within a page, all network headers, and screenshots of Web pages. It can also perform specific actions such as clicking or executing arbitrary JavaScript on demand. We were also forced to use a wide range of proxy IP addresses to protect our crawler nodes from being blacklisted by spammers; cloud hosting and IP address resellers proved to be expedient and inexpensive solutions. Finally, the scale of spammer activities and the resource usage of browsers necessitated running multiple instances of browsers across an entire cluster to successfully crawl a large number of URLs in a timely manner.

Even though instrumenting a modern browser provides flexibility in interacting with the crawled Web page, such as by clicking, we have had to stay current with the evolution of various techniques used by spammers and continuously update our crawler over the years. We described the different versions of the crawler we built between 2009 and 2015 to make it more versatile and easier to maintain. In the next chapter, we analyze abuse in the affiliate marketing ecosystem. We deploy the crawler to discover Web sites stealing commissions on user purchases from popular online retailers such as Amazon.

Acknowledgements

Chapter 2, in part, is a reprint of the material as it appears in Proceedings of the 4th USENIX Workshop on Cyber Security Experimentation and Test (CSET). Chris Kanich, Neha Chachra, Damon McCoy, Chris Grier, David Y. Wang, Marti Motoyama, Kirill Levchenko, Stefan Savage, Geoffrey M. Voelker. The dissertation author was the primary investigator and author of this paper.

Chapter 3

Characterizing Affiliate Marketing Abuse

In this chapter, we provide a case study of detecting fraud in an ecosystem using ground truth data that we collect using the Web crawler infrastructure described in Chapter 2. Specifically, we use the page content and the HTTP headers exchanged with attacker-controlled Web sites to measure and characterize the techniques deployed by unscrupulous affiliates to fraudulently earn commissions on purchases from major e-commerce merchants such as Amazon and Macy's.

3.1 Introduction

Affiliate marketing is a popular form of pay-per-action or pay-per-sale advertising whereby independent marketers are paid a commission on converting traffic (e.g., clicks that culminate in a sale). Heralded as "the holy grail of online advertising" a decade ago [83], affiliate marketing has become prevalent across the Web, complementing more traditional forms of display advertising. Affiliate marketing is often described as a low-risk proposition for merchants, as they pay out only upon the successful completion of sales. Consequently, affiliate marketing attracts significant investment from almost every major online retailer, some

of whom also invest in multiple third-party affiliate advertising programs. Similarly, it is an attractive proposition for independent marketers, as they can create online content (e.g., book reviews) that can be monetized simultaneously as a means to attract likely converting traffic and to host contextual advertising. For every click that converts into a sale, affiliate marketing is frequently much more profitable than display ads because the commission earned is typically between 4% and 10% of the sales revenue [6, 84].

Like almost all economic activity on the Web, affiliate marketing also attracts the attention of fraudsters looking to make easy cash. Affiliate fraud garnered widespread media attention in 2014 with the imprisonment of Shawn Hogan [86], an eBay affiliate indicted for wire fraud of $15.5M through the use of a technique called cookie-stuffing [31], whereby the Web cookies used to determine the likely source of user traffic are overwritten without the user's knowledge. There have been multiple similar legal disputes over affiliate marketing since then [25]. Besides media attention, affiliate marketing has also been a subject of academic research to understand the incentives in the ecosystem and the extent of affiliate fraud [26, 78].

We study the affiliate fraud ecosystem using ground truth data we collect by crawling hundreds of thousands of domains with a modern browser instrumented using a custom extension (Chapter 2). Our extension, AffTracker, can identify affiliate cookies for six of the top affiliate programs. From crawling likely sources of cookie-stuffing, we find that large affiliate networks such as CJ Affiliate (formerly Commission Junction) and Rakuten LinkShare (recently renamed Rakuten Affiliate Network) are targeted by cookie-stuffing orders of magnitude more often than affiliate programs run by merchants themselves, such as the Amazon Associates Program. Lower attempted fraud, coupled with the much higher use of evasive cookie-stuffing techniques against in-house affiliate programs, suggests that such programs enjoy stricter policing, thereby making them more difficult targets of fraud.
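At its core, a tool like AffTracker must decide whether a Set-Cookie header carries an affiliate-tracking cookie. A hedged sketch of that check follows; the signature table is a hypothetical stand-in, since the actual per-network cookie names and domains are not listed here:

```python
from http.cookies import SimpleCookie

# Hypothetical signatures: cookie-name prefixes that an affiliate
# network might set on its tracking domain. Real signatures would be
# derived by observing legitimate affiliate clicks for each network.
AFFILIATE_COOKIE_SIGNATURES = {
    "examplenetwork.com": ("aff_", "clickid_"),
}

def affiliate_cookies(set_cookie_header, response_domain):
    """Return the names of cookies in a Set-Cookie header that match
    a known affiliate-tracking signature for the responding domain."""
    prefixes = AFFILIATE_COOKIE_SIGNATURES.get(response_domain, ())
    jar = SimpleCookie()
    jar.load(set_cookie_header)
    return [name for name in jar
            if any(name.startswith(p) for p in prefixes)]
```

In the crawler, this check would run against every response in a visit's redirect chain; a match on a page the user never knowingly clicked is the signal of a stuffed cookie.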

Figure 3.1. Different actors and revenue flow in the affiliate marketing ecosystem. The left half of the figure depicts a potential customer receiving an affiliate cookie, while the right half shows the use of the affiliate cookie to determine payout upon a successful transaction.

Our analysis also shows that retailers in the Apparel, Department Stores, and Travel and Hotels sectors of e-commerce are disproportionately targeted by affiliate fraud on the Web, usually through domains typosquatted on the merchant's trademarks. Furthermore, we also identify several browser extensions that are complicit in affiliate fraud. We find that all of these extensions, some with thousands of users each, earn commissions by silently modifying the merchant URLs visited by the extension users while browsing the Web. Finally, we evaluate data from a two-month in situ user study with 70+ users and find that affiliate marketing is dominated by a small number of affiliates while cookie-stuffing fraud is rarely encountered. Overall, our targeted crawl and user study both suggest that the problem, while real, appears to be less prevalent than suggested by previous reports.

3.2 Background

Online merchants benefit from affiliate marketing through customized and targeted advertising for their products. For example, when an affiliate reviews a bicycle on


LivePoplet: Technology That Enables Mashup of Existing Applications

LivePoplet: Technology That Enables Mashup of Existing Applications LivePoplet: Technology That Enables Mashup of Existing Applications Akihiko Matsuo Kenji Oki Akio Shimono (Manuscript received January 29, 2009) We have developed LivePoplet, a technology that allows the

More information

A Review Paper on Network Security Attacks and Defences

A Review Paper on Network Security Attacks and Defences EUROPEAN ACADEMIC RESEARCH Vol. IV, Issue 12/ March 2017 ISSN 2286-4822 www.euacademic.org Impact Factor: 3.4546 (UIF) DRJI Value: 5.9 (B+) A Review Paper on Network Security Attacks and ALLYSA ASHLEY

More information

CASE STUDY TOP 10 AIRLINE SOLVES AUTOMATED ATTACKS ON WEB & MOBILE

CASE STUDY TOP 10 AIRLINE SOLVES AUTOMATED ATTACKS ON WEB & MOBILE CASE STUDY TOP 10 AIRLINE SOLVES AUTOMATED ATTACKS ON WEB & MOBILE The Customer: Top 10 Airline CREDENTIAL STUFFING KILLCHAIN A Top 10 Global Airline that earns over $15 Billion in annual revenue and serves

More information

Fast and Evasive Attacks: Highlighting the Challenges Ahead

Fast and Evasive Attacks: Highlighting the Challenges Ahead Fast and Evasive Attacks: Highlighting the Challenges Ahead Moheeb Rajab, Fabian Monrose, and Andreas Terzis Computer Science Department Johns Hopkins University Outline Background Related Work Sampling

More information

Machine-Powered Learning for People-Centered Security

Machine-Powered Learning for People-Centered Security White paper Machine-Powered Learning for People-Centered Security Protecting Email with the Proofpoint Stateful Composite Scoring Service www.proofpoint.com INTRODUCTION: OUTGUNNED AND OVERWHELMED Today

More information

THE ACCENTURE CYBER DEFENSE SOLUTION

THE ACCENTURE CYBER DEFENSE SOLUTION THE ACCENTURE CYBER DEFENSE SOLUTION A MANAGED SERVICE FOR CYBER DEFENSE FROM ACCENTURE AND SPLUNK. YOUR CURRENT APPROACHES TO CYBER DEFENSE COULD BE PUTTING YOU AT RISK Cyber-attacks are increasingly

More information

AdvOSS AAA: Architecture, Call flows and implementing emerging business use cases

AdvOSS AAA: Architecture, Call flows and implementing emerging business use cases AdvOSS AAA: Architecture, Call flows and implementing emerging business use cases An AdvOSS White Paper Latest version of this white paper can always be found at http://advoss.com/resources/whitepapers/advoss-aaa-workflows.pdf

More information

Reducing the Cost of Incident Response

Reducing the Cost of Incident Response Reducing the Cost of Incident Response Introduction Cb Response is the most complete endpoint detection and response solution available to security teams who want a single platform for hunting threats,

More information

Cloud Computing: Making the Right Choice for Your Organization

Cloud Computing: Making the Right Choice for Your Organization Cloud Computing: Making the Right Choice for Your Organization A decade ago, cloud computing was on the leading edge. Now, 95 percent of businesses use cloud technology, and Gartner says that by 2020,

More information

The Credential Phishing Handbook. Why It Still Works and 4 Steps to Prevent It

The Credential Phishing Handbook. Why It Still Works and 4 Steps to Prevent It The Credential Phishing Handbook Why It Still Works and 4 Steps to Prevent It Introduction Phishing is more than 20 years old, but still represents more than 90% of targeted attacks. The reason is simple:

More information

WHAT IS MALICIOUS AUTOMATION? Definition and detection of a new pervasive online attack

WHAT IS MALICIOUS AUTOMATION? Definition and detection of a new pervasive online attack WHAT IS MALICIOUS AUTOMATION? Definition and detection of a new pervasive online attack INTRODUCTION WHAT IS I n this whitepaper, we will define the problem of malicious automation and examine some of

More information

Combatting Browser Fingerprinting with ChromeDust

Combatting Browser Fingerprinting with ChromeDust Combatting Browser Fingerprinting with ChromeDust Ram Bhaskar Rishikesh Tirumala Timmy Galvin 6.858 Final Project (Lab 7) December 12, 2013 Introduction

More information

Enterprise D/DoS Mitigation Solution offering

Enterprise D/DoS Mitigation Solution offering Enterprise D/DoS Mitigation Solution offering About the Domain TCS Enterprise Security and Risk Management (ESRM) offers full services play in security with integrated security solutions. ESRM s solution

More information

RSA INCIDENT RESPONSE SERVICES

RSA INCIDENT RESPONSE SERVICES RSA INCIDENT RESPONSE SERVICES Enabling early detection and rapid response EXECUTIVE SUMMARY Technical forensic analysis services RSA Incident Response services are for organizations that need rapid access

More information

How Microsoft IT Reduced Operating Expenses Using Virtualization

How Microsoft IT Reduced Operating Expenses Using Virtualization How Microsoft IT Reduced Operating Expenses Using Virtualization Published: May 2010 The following content may no longer reflect Microsoft s current position or infrastructure. This content should be viewed

More information

THE EFFECTIVE APPROACH TO CYBER SECURITY VALIDATION BREACH & ATTACK SIMULATION

THE EFFECTIVE APPROACH TO CYBER SECURITY VALIDATION BREACH & ATTACK SIMULATION BREACH & ATTACK SIMULATION THE EFFECTIVE APPROACH TO CYBER SECURITY VALIDATION Cymulate s cyber simulation platform allows you to test your security assumptions, identify possible security gaps and receive

More information

You Are Being Watched Analysis of JavaScript-Based Trackers

You Are Being Watched Analysis of JavaScript-Based Trackers You Are Being Watched Analysis of JavaScript-Based Trackers Rohit Mehra IIIT-Delhi rohit1376@iiitd.ac.in Shobhita Saxena IIIT-Delhi shobhita1315@iiitd.ac.in Vaishali Garg IIIT-Delhi vaishali1318@iiitd.ac.in

More information

Managed Enterprise Phishing Protection. Comprehensive protection delivered 24/7 by anti-phishing experts

Managed Enterprise Phishing Protection. Comprehensive protection delivered 24/7 by anti-phishing experts Managed Enterprise Phishing Protection Comprehensive protection delivered 24/7 by anti-phishing experts MANAGED ENTERPRISE PHISHING PROTECTION 24/7 expert protection against phishing attacks that get past

More information

Integrated Access Management Solutions. Access Televentures

Integrated Access Management Solutions. Access Televentures Integrated Access Management Solutions Access Televentures Table of Contents OVERCOMING THE AUTHENTICATION CHALLENGE... 2 1 EXECUTIVE SUMMARY... 2 2 Challenges to Providing Users Secure Access... 2 2.1

More information

A Guide to Closing All Potential VDI Security Gaps

A Guide to Closing All Potential VDI Security Gaps Brought to you by A Guide to Closing All Potential VDI Security Gaps IT and security leaders are embracing virtual desktop infrastructure (VDI) as a way to improve security for an increasingly diverse

More information

BUFFERZONE Advanced Endpoint Security

BUFFERZONE Advanced Endpoint Security BUFFERZONE Advanced Endpoint Security Enterprise-grade Containment, Bridging and Intelligence BUFFERZONE defends endpoints against a wide range of advanced and targeted threats with patented containment,

More information

2018 Edition. Security and Compliance for Office 365

2018 Edition. Security and Compliance for Office 365 2018 Edition Security and Compliance for Office 365 [Proofpoint has] given us our time back to focus on the really evil stuff. CISO, Global 500 Manufacturer Like millions of businesses around the world,

More information

Office 365 Buyers Guide: Best Practices for Securing Office 365

Office 365 Buyers Guide: Best Practices for Securing Office 365 Office 365 Buyers Guide: Best Practices for Securing Office 365 Microsoft Office 365 has become the standard productivity platform for the majority of organizations, large and small, around the world.

More information

An Introduction to the Waratek Application Security Platform

An Introduction to the Waratek Application Security Platform Product Analysis January 2017 An Introduction to the Waratek Application Security Platform The Transformational Application Security Technology that Improves Protection and Operations Highly accurate.

More information

Real-time Communications Security and SDN

Real-time Communications Security and SDN Real-time Communications Security and SDN 2016 [Type here] Securing the new generation of communications applications, those delivering real-time services including voice, video and Instant Messaging,

More information

Webomania Solutions Pvt. Ltd. 2017

Webomania Solutions Pvt. Ltd. 2017 The other name for link manipulation is Phishing or you can say link manipulation is type of phishing attack done generally to mislead the user to a replica website or a looka-like of some well-known site.

More information

Streaming Prevention in Cb Defense. Stop malware and non-malware attacks that bypass machine-learning AV and traditional AV

Streaming Prevention in Cb Defense. Stop malware and non-malware attacks that bypass machine-learning AV and traditional AV Streaming Prevention in Cb Defense Stop malware and non-malware attacks that bypass machine-learning AV and traditional AV 2 STREAMING PREVENTION IN Cb DEFENSE OVERVIEW Over the past three years, cyberattackers

More information

How technology changed fraud investigations. Jean-François Legault Senior Manager Analytic & Forensic Technology June 13, 2011

How technology changed fraud investigations. Jean-François Legault Senior Manager Analytic & Forensic Technology June 13, 2011 How technology changed fraud investigations Jean-François Legault Senior Manager Analytic & Forensic Technology June 13, 2011 The Changing Cyberfraud Landscape Underground Economy Malware Authors Organized

More information

Monetizing Attacks / The Underground Economy

Monetizing Attacks / The Underground Economy Monetizing Attacks / The Underground Economy CS 161: Computer Security Prof. Vern Paxson TAs: Jethro Beekman, Mobin Javed, Antonio Lupher, Paul Pearce & Matthias Vallentin http://inst.eecs.berkeley.edu/~cs161/

More information

How To Construct A Keyword Strategy?

How To Construct A Keyword Strategy? Introduction The moment you think about marketing these days the first thing that pops up in your mind is to go online. Why is there a heck about marketing your business online? Why is it so drastically

More information

MOBILE DEFEND. Powering Robust Mobile Security Solutions

MOBILE DEFEND. Powering Robust Mobile Security Solutions MOBILE DEFEND Powering Robust Mobile Security Solutions Table of Contents Introduction Trustlook SECURE ai Mobile Defend Who Uses SECURE ai Mobile Defend? How it Works o Mobile Device Risk Score o Mobile

More information

Botnet Communication Topologies

Botnet Communication Topologies Understanding the intricacies of botnet Command-and-Control By Gunter Ollmann, VP of Research, Damballa, Inc. Introduction A clear distinction between a bot agent and a common piece of malware lies within

More information

Intelligent and Secure Network

Intelligent and Secure Network Intelligent and Secure Network BIG-IP IP Global Delivery Intelligence v11.2 IP Intelligence Service Brian Boyan - b.boyan@f5.com Tony Ganzer t.ganzer@f5.com 2 Agenda Welcome & Intro Introduce F5 IP Intelligence

More information

Beyond Blind Defense: Gaining Insights from Proactive App Sec

Beyond Blind Defense: Gaining Insights from Proactive App Sec Beyond Blind Defense: Gaining Insights from Proactive App Sec Speaker Rami Essaid CEO Distil Networks Blind Defense Means Trusting Half Your Web Traffic 46% of Web Traffic is Bots Source: Distil Networks

More information

OWASP Top 10 The Ten Most Critical Web Application Security Risks

OWASP Top 10 The Ten Most Critical Web Application Security Risks OWASP Top 10 The Ten Most Critical Web Application Security Risks The Open Web Application Security Project (OWASP) is an open community dedicated to enabling organizations to develop, purchase, and maintain

More information

ATTIVO NETWORKS THREATDEFEND INTEGRATION WITH MCAFEE SOLUTIONS

ATTIVO NETWORKS THREATDEFEND INTEGRATION WITH MCAFEE SOLUTIONS PARTNER BRIEF ATTIVO NETWORKS THREATDEFEND INTEGRATION WITH MCAFEE SOLUTIONS INTRODUCTION Attivo Networks has partnered with McAfee to detect real-time in-network threats and to automate incident response

More information

Paper. Delivering Strong Security in a Hyperconverged Data Center Environment

Paper. Delivering Strong Security in a Hyperconverged Data Center Environment Paper Delivering Strong Security in a Hyperconverged Data Center Environment Introduction A new trend is emerging in data center technology that could dramatically change the way enterprises manage and

More information

Next Generation Privilege Identity Management

Next Generation Privilege Identity Management White Paper Next Generation Privilege Identity Management Nowadays enterprise IT teams are focused on adopting and supporting newer devices, applications and platforms to address business needs and keep

More information

Detecting Spam Web Pages

Detecting Spam Web Pages Detecting Spam Web Pages Marc Najork Microsoft Research Silicon Valley About me 1989-1993: UIUC (home of NCSA Mosaic) 1993-2001: Digital Equipment/Compaq Started working on web search in 1997 Mercator

More information

FTA 2017 SEATTLE. Cybersecurity and the State Tax Threat Environment. Copyright FireEye, Inc. All rights reserved.

FTA 2017 SEATTLE. Cybersecurity and the State Tax Threat Environment. Copyright FireEye, Inc. All rights reserved. FTA 2017 SEATTLE Cybersecurity and the State Tax Threat Environment 1 Agenda Cybersecurity Trends By the Numbers Attack Trends Defensive Trends State and Local Intelligence What Can You Do? 2 2016: Who

More information

ECONOMICAL, STORAGE PURPOSE-BUILT FOR THE EMERGING DATA CENTERS. By George Crump

ECONOMICAL, STORAGE PURPOSE-BUILT FOR THE EMERGING DATA CENTERS. By George Crump ECONOMICAL, STORAGE PURPOSE-BUILT FOR THE EMERGING DATA CENTERS By George Crump Economical, Storage Purpose-Built for the Emerging Data Centers Most small, growing businesses start as a collection of laptops

More information

Wire Fraud Begins to Hammer the Construction Industry

Wire Fraud Begins to Hammer the Construction Industry Wire Fraud Begins to Hammer the Construction Industry Cybercriminals are adding new housing construction to their fraud landscape and likely on a wide scale. Created and published by: Thomas W. Cronkright

More information

SandBlast Agent FAQ Check Point Software Technologies Ltd. All rights reserved P. 1. [Internal Use] for Check Point employees

SandBlast Agent FAQ Check Point Software Technologies Ltd. All rights reserved P. 1. [Internal Use] for Check Point employees SandBlast Agent FAQ What is Check Point SandBlast Agent? Check Point SandBlast Agent defends endpoints and web browsers with a complete set of realtime advanced browser and endpoint protection technologies,

More information

ADVANCED THREAT PREVENTION FOR ENDPOINT DEVICES 5 th GENERATION OF CYBER SECURITY

ADVANCED THREAT PREVENTION FOR ENDPOINT DEVICES 5 th GENERATION OF CYBER SECURITY ADVANCED THREAT PREVENTION FOR ENDPOINT DEVICES 5 th GENERATION OF CYBER SECURITY OUTLINE Advanced Threat Landscape (genv) Why is endpoint protection essential? Types of attacks and how to prevent them

More information

EBOOK. Stopping Fraud. How Proofpoint Helps Protect Your Organization from Impostors, Phishers and Other Non-Malware Threats.

EBOOK. Stopping  Fraud. How Proofpoint Helps Protect Your Organization from Impostors, Phishers and Other Non-Malware Threats. EBOOK Stopping Email Fraud How Proofpoint Helps Protect Your Organization from Impostors, Phishers and Other Non-Malware Threats www.proofpoint.com EBOOK Stopping Email Fraud 2 Today s email attacks have

More information

How to Fight Back against Phishing A guide to mitigating and deterring attacks targeting your customers

How to Fight Back against Phishing A guide to mitigating and deterring attacks targeting your customers White Paper How to Fight Back against Phishing A guide to mitigating and deterring attacks targeting your customers 2013 Copyright Ecrime Management Strategies, Inc. All rights reserved. PhishLabs and

More information

DoS Cyber Attack on a Government Agency in Europe- April 2012 Constantly Changing Attack Vectors

DoS Cyber Attack on a Government Agency in Europe- April 2012 Constantly Changing Attack Vectors DoS Cyber Attack on a Government Agency in Europe- April 2012 Constantly Changing Attack Vectors 1 Table of Content Preamble...3 About Radware s DefensePro... 3 About Radware s Emergency Response Team

More information

Phishing Activity Trends

Phishing Activity Trends Phishing Activity Trends Report for the Month of September, 2007 Summarization of September Report Findings The total number of unique phishing reports submitted to APWG in September 2007 was 38,514, an

More information

Security and Compliance for Office 365

Security and Compliance for Office 365 Security and Compliance for Office 365 [Proofpoint has] given us our time back to focus on the really evil stuff. CISO, Global 500 Manufacturer Like millions of businesses around the world, you may be

More information

Securing Today s Mobile Workforce

Securing Today s Mobile Workforce WHITE PAPER Securing Today s Mobile Workforce Secure and Manage Mobile Devices and Users with Total Defense Mobile Security Table of Contents Executive Summary..................................................................................

More information

Using Red Hat Network Satellite to dynamically scale applications in a private cloud

Using Red Hat Network Satellite to dynamically scale applications in a private cloud Using Red Hat Network Satellite to dynamically scale applications in a private cloud www.redhat.com Abstract Private cloud infrastructure has many clear advantages, not the least of which is the decoupling

More information

Panda Security 2010 Page 1

Panda Security 2010 Page 1 Panda Security 2010 Page 1 Executive Summary The malware economy is flourishing and affecting both consumers and businesses of all sizes. The reality is that cybercrime is growing exponentially in frequency

More information

ATTIVO NETWORKS THREATDEFEND PLATFORM INTEGRATION WITH CISCO SYSTEMS PROTECTS THE NETWORK

ATTIVO NETWORKS THREATDEFEND PLATFORM INTEGRATION WITH CISCO SYSTEMS PROTECTS THE NETWORK PARTNER BRIEF ATTIVO NETWORKS THREATDEFEND PLATFORM INTEGRATION WITH CISCO SYSTEMS PROTECTS THE NETWORK INTRODUCTION Attivo Networks has partnered with Cisco Systems to provide advanced real-time inside-the-network

More information

The Top 6 WAF Essentials to Achieve Application Security Efficacy

The Top 6 WAF Essentials to Achieve Application Security Efficacy The Top 6 WAF Essentials to Achieve Application Security Efficacy Introduction One of the biggest challenges IT and security leaders face today is reducing business risk while ensuring ease of use and

More information

Quick recap on ing Security Recap on where to find things on Belvidere website & a look at the Belvidere Facebook page

Quick recap on  ing  Security Recap on where to find things on Belvidere website & a look at the Belvidere Facebook page Workshop #7 Email Security Previous workshops 1. Introduction 2. Smart phones & Tablets 3. All about WatsApp 4. More on WatsApp 5. Surfing the Internet 6. Emailing Quick recap on Emailing Email Security

More information

AKAMAI CLOUD SECURITY SOLUTIONS

AKAMAI CLOUD SECURITY SOLUTIONS AKAMAI CLOUD SECURITY SOLUTIONS Whether you sell to customers over the web, operate data centers around the world or in the cloud, or support employees on the road, you rely on the Internet to keep your

More information

Protect Your Data the Way Banks Protect Your Money

Protect Your Data the Way Banks Protect Your Money Protect Your Data the Way Banks Protect Your Money A New Security Model Worth Understanding and Emulating Enterprise security traditionally relied on a fortress strategy that locked down user endpoints

More information

FIREWALL PROTECTION AND WHY DOES MY BUSINESS NEED IT?

FIREWALL PROTECTION AND WHY DOES MY BUSINESS NEED IT? WHAT IS FIREWALL PROTECTION AND WHY DOES MY BUSINESS NEED IT? While firewalls started life simply protecting networks from outside hacks and attacks, the role of the firewall has greatly evolved to take

More information

With turing you can: Identify, locate and mitigate the effects of botnets or other malware abusing your infrastructure

With turing you can: Identify, locate and mitigate the effects of botnets or other malware abusing your infrastructure Decoding DNS data If you have a large DNS infrastructure, understanding what is happening with your real-time and historic traffic is difficult, if not impossible. Until now, the available network management

More information

Standard Course Outline IS 656 Information Systems Security and Assurance

Standard Course Outline IS 656 Information Systems Security and Assurance Standard Course Outline IS 656 Information Systems Security and Assurance I. General Information s Course number: IS 656 s Title: Information Systems Security and Assurance s Units: 3 s Prerequisites:

More information

PALANTIR CYBERMESH INTRODUCTION

PALANTIR CYBERMESH INTRODUCTION 100 Hamilton Avenue Palo Alto, California 94301 PALANTIR CYBERMESH INTRODUCTION Cyber attacks expose organizations to significant security, regulatory, and reputational risks, including the potential for

More information

DreamFactory Security Guide

DreamFactory Security Guide DreamFactory Security Guide This white paper is designed to provide security information about DreamFactory. The sections below discuss the inherently secure characteristics of the platform and the explicit

More information

[Rajebhosale*, 5(4): April, 2016] ISSN: (I2OR), Publication Impact Factor: 3.785

[Rajebhosale*, 5(4): April, 2016] ISSN: (I2OR), Publication Impact Factor: 3.785 IJESRT INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY A FILTER FOR ANALYSIS AND DETECTION OF MALICIOUS WEB PAGES Prof. SagarRajebhosale*, Mr.Abhimanyu Bhor, Ms.Tejashree Desai, Ms.

More information

Robust Defenses for Cross-Site Request Forgery Review

Robust Defenses for Cross-Site Request Forgery Review Robust Defenses for Cross-Site Request Forgery Review Network Security Instructor:Dr. Shishir Nagaraja Submitted By: Jyoti Leeka October 16, 2011 1 Introduction to the topic and the reason for the topic

More information

Cisco Start. IT solutions designed to propel your business

Cisco Start. IT solutions designed to propel your business Cisco Start IT solutions designed to propel your business Small and medium-sized businesses (SMBs) typically have very limited resources to invest in new technologies. With every IT investment made, they

More information

Fighting Spam, Phishing and Malware With Recurrent Pattern Detection

Fighting Spam, Phishing and Malware With Recurrent Pattern Detection Fighting Spam, Phishing and Malware With Recurrent Pattern Detection White Paper September 2017 www.cyren.com 1 White Paper September 2017 Fighting Spam, Phishing and Malware With Recurrent Pattern Detection

More information

Eliminating the Blind Spot: Rapidly Detect and Respond to the Advanced and Evasive Threat

Eliminating the Blind Spot: Rapidly Detect and Respond to the Advanced and Evasive Threat WHITE PAPER Eliminating the Blind Spot: Rapidly Detect and Respond to the Advanced and Evasive Threat Executive Summary Unfortunately, it s a foregone conclusion that no organisation is 100 percent safe

More information

Novetta Cyber Analytics

Novetta Cyber Analytics Know your network. Arm your analysts. Introduction Novetta Cyber Analytics is an advanced network traffic analytics solution that empowers analysts with comprehensive, near real time cyber security visibility

More information

KnowBe4 is the world s largest integrated platform for awareness training combined with simulated phishing attacks.

KnowBe4 is the world s largest integrated platform for awareness training combined with simulated phishing attacks. KnowBe4 is the world s largest integrated platform for awareness training combined with simulated phishing attacks. About Us The world s most popular integrated Security Awareness Training and Simulated

More information

Detect Cyber Threats with Securonix Proxy Traffic Analyzer

Detect Cyber Threats with Securonix Proxy Traffic Analyzer Detect Cyber Threats with Securonix Proxy Traffic Analyzer Introduction Many organizations encounter an extremely high volume of proxy data on a daily basis. The volume of proxy data can range from 100

More information

SEO Get Google 1 st Page Rankings

SEO Get Google 1 st Page Rankings 1. SEO can decrease your cost per acquisition Another benefit of SEO is that it is free. It is far less expensive than advertising to acquire customers. The only costs in SEO are the costs to hire the

More information

UNCLASSIFIED. R-1 Program Element (Number/Name) PE D8Z / Software Engineering Institute (SEI) Applied Research. Prior Years FY 2013 FY 2014

UNCLASSIFIED. R-1 Program Element (Number/Name) PE D8Z / Software Engineering Institute (SEI) Applied Research. Prior Years FY 2013 FY 2014 Exhibit R-2, RDT&E Budget Item Justification: PB 2015 Office of Secretary Of Defense Date: March 2014 0400: Research, Development, Test & Evaluation, Defense-Wide / BA 2: COST ($ in Millions) Prior Years

More information

Author: Tonny Rabjerg Version: Company Presentation WSF 4.0 WSF 4.0

Author: Tonny Rabjerg Version: Company Presentation WSF 4.0 WSF 4.0 Author: Tonny Rabjerg Version: 20150730 Company Presentation WSF 4.0 WSF 4.0 Cybercrime is a growth industry. The returns are great, and the risks are low. We estimate that the likely annual cost to the

More information