Quantitative Vulnerability Assessment of Systems Software


Quantitative Vulnerability Assessment of Systems Software. Omar H. Alhazmi, Yashwant K. Malaiya. Colorado State University.

Motivation. A vulnerability is a defect which enables an attacker to bypass security measures [Schultz et al.: 7]. For ordinary defects, reliability modeling and software reliability growth models (SRGMs) have been around for decades. If we assume that vulnerabilities are a special class of faults, the question becomes: to what degree are reliability terms and models applicable to vulnerabilities and security? [Littlewood et al.: 14]. The need for quantitative measurement and estimation is becoming more crucial.

Outline of Our Goals. Develop quantitative models to estimate vulnerability discovery, using calendar time and using equivalent effort. Validate these measurements and models by testing them against available data. Identify security assessment metrics: vulnerability density and the vulnerability-to-total-defect ratio.

Time-based vulnerability discovery model. What factors impact the discovery process? The changing environment: the share of the installed base and the number of global internet users. Discovery effort and discoverers (developers, white hats, or black hats); discovery effort is proportional to the installed base over time. Vulnerability finders' rewards: greater rewards mean higher motivation. The security level desired for the system: server or client.

Time-based vulnerability discovery model. Each vulnerability is recorded; the data are available [ICAT, Microsoft] but need compilation and filtering. The data show three phases for an OS. Assumptions: discovery is driven by the reward factor and is influenced by changes in market share. [Figure: cumulative vulnerabilities vs. time, showing Phases 1, 2, and 3.]

Time-based vulnerability discovery model: a three-phase, S-shaped model. Phase 1: knowledge low, installed base low. Phase 2: knowledge high, installed base high and growing. Phase 3: knowledge high, installed base dropping. The vulnerability time growth model is governed by $\frac{dy}{dt} = Ay(B - y)$, whose solution is $y(t) = \frac{B}{BCe^{-ABt} + 1}$.
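As a concrete reference, here is a minimal Python sketch of this logistic model and its rate; the parameter values are placeholders for illustration, not fitted results:

```python
import numpy as np

def aml(t, A, B, C):
    """Alhazmi-Malaiya Logistic (AML) model: the solution of
    dy/dt = A*y*(B - y) is y(t) = B / (B*C*exp(-A*B*t) + 1)."""
    return B / (B * C * np.exp(-A * B * t) + 1.0)

def discovery_rate(t, A, B, C):
    """dy/dt = A*y*(B - y): low in phases 1 and 3, peaking in phase 2."""
    y = aml(t, A, B, C)
    return A * y * (B - y)

# Placeholder parameters (not fitted values), evaluated monthly over 4 years.
t = np.arange(0, 48)
print(aml(t, A=0.005, B=40.0, C=0.5).round(1))
```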

Time-based model, Windows 98. Fitted parameters: A = 0.004873, B = 37.7328, C = 0.5543; χ² = 7.365 against a critical value of 60.481; P-value = 1 − 7.6×10⁻¹¹. [Figure: fitted curve and total Windows 98 vulnerabilities, Jan-99 through Sep-02.]

Time-based model, Windows NT 4.0. Fitted parameters: A = 0.000692, B = 136, C = 0.52288; χ² = 35.584 against a critical value of 103.01; P-value = 0.9999973. [Figure: fitted curve and total Windows NT 4.0 vulnerabilities, Aug-96 through Apr-03.]
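A sketch of how such a fit and χ² check can be reproduced with SciPy. The synthetic series below merely stands in for the compiled monthly counts, and the degrees of freedom are an assumption (the slide's critical value of 60.481 corresponds to df = 44, i.e. n − 1 for 45 monthly points):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import chi2

def aml(t, A, B, C):
    return B / (B * C * np.exp(-A * B * t) + 1.0)

# Synthetic stand-in: the AML curve with the reported Windows 98
# parameters plus noise, in place of the real monthly vulnerability data.
t = np.arange(45)                                   # months, Jan-99 onward
y = aml(t, 0.004873, 37.7328, 0.5543)
y += np.random.default_rng(0).normal(0.0, 0.5, t.size)

(A, B, C), _ = curve_fit(aml, t, y, p0=[0.005, y.max(), 0.5])

expected = aml(t, A, B, C)
chi_sq = np.sum((y - expected) ** 2 / expected)     # Pearson chi-square
df = t.size - 1                                     # assumption; see above
print(f"A={A:.6f} B={B:.2f} C={C:.4f}")
print(f"chi2={chi_sq:.2f} critical={chi2.ppf(0.95, df):.2f} "
      f"p={chi2.sf(chi_sq, df):.6f}")
```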

Usage-based vulnerability discovery model. The data: the global internet population and the market share of the system over a period of time. Equivalent effort: the real environment performs intensive testing, and malicious activity is proportional to overall activity. Effort is defined as $E = \sum_{i=0}^{n} U_i P_i$, where $U_i$ is the number of internet users (in millions) in period $i$ and $P_i$ is the market share of the OS in that period. [Figures: internet growth, Dec-1995 through May-2004, rising from 16 to 757 million users; market-share percentages over time for Windows 95, 98, XP, NT, 2000, and others.]
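A small sketch of the effort computation under this definition; the user counts and market shares below are illustrative stand-ins, not the slide's exact series:

```python
# Equivalent effort E = sum_i U_i * P_i, in million user-months.
# Illustrative figures only (not the slide's actual data).
internet_users = [359, 451, 479, 513, 558, 569]        # millions, per month
market_share   = [0.60, 0.58, 0.55, 0.52, 0.50, 0.47]  # fraction on this OS

E = sum(u * p for u, p in zip(internet_users, market_share))
print(f"equivalent effort: {E:.0f} million user-months")
```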

Usage-based vulnerability discovery model. The model: $y = B(1 - e^{-\lambda_{vu} E})$, exponential growth with effort; this is the basic reliability model [Musa: 21], with time eliminated in favor of usage. [Figure: vulnerabilities vs. usage, 0-7500 million user-months.]

Effort-based model, Windows 98. Fitted parameters: B = 37, $\lambda_{vu}$ = 0.000505; χ² = 3.510 against a critical value of 44.9853; P-value = 1 − 3.3×10⁻¹¹. [Figure: actual vulnerabilities and fitted curve vs. usage, 0-7500 million user-months.]

Effort-based model, Windows NT 4.0. Fitted parameters: B = 108, $\lambda_{vu}$ = 0.003061; χ² = 15.05 against a critical value of 42.5569; P-value = 0.985. [Figure: actual vulnerabilities and fitted curve vs. usage, 0-1500 million user-months.]
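A minimal sketch of the effort-based model, evaluated with the reported Windows 98 fit (B = 37, λ_vu = 0.000505) to reproduce the fitted curve:

```python
import numpy as np

def effort_model(E, B, lam):
    """Effort-based model y = B * (1 - exp(-lam * E)): Musa's basic
    exponential reliability model with usage (effort) in place of time."""
    return B * (1.0 - np.exp(-lam * E))

E = np.linspace(0.0, 7500.0, 11)           # million user-months
y = effort_model(E, B=37.0, lam=0.000505)  # reported Windows 98 fit
for e, v in zip(E, y):
    print(f"E={e:6.0f}  expected vulnerabilities={v:5.1f}")
```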

Discussion. Excellent fit for Windows 98 and NT 4.0; the model fits the data for all OSs examined. Deviations from the model are caused by overlap between Windows 98 and Windows XP, and between Windows NT 4.0 and Windows 2000: vulnerabilities in shared code may be detected in the newer OS. An approach for handling such overlap is needed. [Figure: fitted curve and total Windows 98 vulnerabilities, Jan-99 through Sep-02.]

Vulnerability density and defect density. Defect density is a valuable metric for planning test effort and is used for setting release quality targets, although only limited defect density data are available. Vulnerabilities are a class of defects, and vulnerability data are in the public domain. Is vulnerability density a useful measure? Is it related to defect density? Are vulnerabilities 5% of defects [Longstaff: 20], or 1% of defects [Anderson]? Answering this can be a major step toward measuring security.

Vulnerability density and defect density. Defect densities (per KLOC): Windows 95/98: 0.33-0.56; NT 4.0/2000: 0.63-1.8. The ratio $V_{KD}/D_{KD}$ ranges over 0.24-1.62%, less than the suggested 5%.

System   | MSLOC | Known defects (1000s) | D_KD (/KLOC) | Known vulnerabilities | V_KD (/KLOC) | V_KD/D_KD
Win 95   | 15    | 5      | 0.33  | 49  | 0.0033 | 0.98%
NT 4.0   | 16    | 10     | 0.625 | 162 | 0.0101 | 1.62%
Win 98   | 18    | 10     | 0.556 | 58  | 0.0032 | 0.58%
Win 2000 | 35    | 63     | 1.8   | 151 | 0.0043 | 0.24%
Win XP   | 40    | 106.5* | 2.66* | 71  | 0.0018 | 0.067%*

* The number of defects for Windows XP is for the beta version.
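The densities and ratios in the table follow directly from the definitions; a short sketch that recomputes them:

```python
# Vulnerability density V_KD and defect density D_KD from the table above.
systems = {
    # name: (MSLOC, known defects in thousands, known vulnerabilities)
    "Win 95":   (15, 5,     49),
    "NT 4.0":   (16, 10,   162),
    "Win 98":   (18, 10,    58),
    "Win 2000": (35, 63,   151),
    "Win XP":   (40, 106.5, 71),  # XP defect count is for the beta version
}
for name, (msloc, defects_k, vulns) in systems.items():
    kloc = msloc * 1000
    d_kd = defects_k * 1000 / kloc   # defects per KLOC
    v_kd = vulns / kloc              # vulnerabilities per KLOC
    print(f"{name:9s} D_KD={d_kd:5.3f} V_KD={v_kd:.4f} ratio={v_kd/d_kd:.2%}")
```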

Vulnerability density and defect density. [Charts: vulnerability density (per KLOC) and defect density (per KLOC) for Win 95, NT 4.0, Win 98, Win 2000, and Win XP, visualizing the table above.]


Results and conclusions. Vulnerability discovery models: a time-based model and an effort-based model; the data fit both models. Vulnerability density evaluation gives expected ranges of values. Future work: modeling the impact of shared code, further validation, reward analysis, and other risk factors such as patches and vulnerability exploitation.

Summary and conclusions. We have introduced two models, a time-based vulnerability model and a usage-based vulnerability model; both showed acceptable goodness of fit under the chi-square test. We have also introduced measurements: vulnerability density, and vulnerability density vs. defect density.

Vulnerability Discovery in Multi-Version Software Systems. Jinyoo Kim, Yashwant K. Malaiya, Indrakshi Ray. {jyk6457, malaiya, iray}@cs.colostate.edu

Outline: Motivation for this study. Related work and data sources. Vulnerability discovery models (VDMs). Software evolution. Multi-version software discovery model. Apache, MySQL, and Windows XP data. Conclusions and future work.

Vulnerability discovery models describe vulnerability discovery against time. Uses: security maintenance management (estimating the number of vulnerabilities, patch development planning, guiding test effort); applicability to categories (causes and severity levels); and software risk evaluation, combined with vulnerability exploitation and attack surface.

Motivation for multi-version VDMs: the superposition effect on the vulnerability discovery process due to shared code in successive versions; examination of software evolution and its impact on vulnerability introduction and discovery; and other factors impacting the vulnerability discovery process not considered before.

Related work. Software reliability growth models: Logarithmic-Poisson reliability model (Musa 84). Vulnerability discovery process: quadratic and linear models (Rescorla 05), thermodynamic model (Anderson 01), logistic (AML) and effort-based models (Alhazmi 2004-05). Software evolution: trends of software evolution (Eick 01). Application of reliability growth models: reliability growth using OpenBSD (Ozment 06).

Software evolution: the modification of software during maintenance or development, through fixes and feature additions, influenced by competition. Code decay and code addition introduce new vulnerabilities. Successive versions of a software system can share a significant fraction of their code.

Software evolution, Apache and MySQL. [Charts: lines of code per version, split into initial code and added code, for Apache 1.3.0 through 1.3.36 and MySQL 4.0.0 through 4.0.26.] Modification: Apache 43%, MySQL 31%.

Vulnerability discovery and evolution, Apache and MySQL. [Charts: percentage of code added in the next version and cumulative vulnerability discovery over release dates, Jun-98 through Jun-06 for Apache and Oct-01 through Oct-06 for MySQL; reliability growth alongside increasing code.] Some vulnerabilities are in the added code; many are inherited from previous versions.

Code sharing and vulnerabilities. Observation: vulnerability discovery increases again after the saturation predicted by AML modeling. A multi-version vulnerability discovery trend must account for the superposition effect of components shared between several versions of the software. [Figure: vulnerability discovery rate vs. calendar time for the 1st version, the 2nd version, and their shared part, with each version's total trend.]

Multi-version vulnerability discovery model (MVDM). [Figure: discovery rate vs. calendar time for the 1st version, the 2nd version, their shared part, and each version's total trend.] The cumulative MVDM superposes the AML curves of the two versions: $\omega(t) = \frac{B}{BCe^{-ABt} + 1} + \frac{B'}{B'C'e^{-A'B'(t - \tau)} + 1}$, where the primed parameters describe the next version, $\tau$ is its release time, and $\alpha$ is the fraction of code shared with the next version.

         | Previous version    | Next version        | Shared code ratio α
Apache   | 1.3.24 (3-21-2002)  | 2.0.35 (4-6-2002)   | 20.16%
MySQL    | 4.1.1 (12-1-2003)   | 5.0.0 (12-22-2003)  | 83.52%
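A sketch of the superposition idea in Python, assuming the second version's curve is shifted by its release time τ; the paper's exact parameterization may differ:

```python
import numpy as np

def aml(t, A, B, C):
    return B / (B * C * np.exp(-A * B * t) + 1.0)

def mvdm(t, p1, p2, tau):
    """Superpose the AML curve of the original version with that of its
    successor released tau months later. p1 and p2 are (A, B, C) tuples."""
    first = aml(t, *p1)
    second = np.where(t >= tau, aml(np.maximum(t - tau, 0.0), *p2), 0.0)
    return first + second

# Placeholder parameters for two overlapping versions (not fitted values).
t = np.linspace(0, 96, 97)  # months
total = mvdm(t, (0.004, 40, 0.5), (0.003, 60, 0.5), tau=36)
print(total.round(1))
```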

One vs. two humps. [Figures: number of vulnerabilities and cumulative vulnerabilities vs. calendar time under a one-humped vulnerability discovery model, contrasted with the two-humped shape produced by the superposition effect.]

Multi-version vulnerability discovery model: the one-humped case. [Figures: vulnerability rate and vulnerability count vs. calendar time for the 1st version, the shared part, and the total.] Superposition may result in a single hump with a prolonged linear period.

Seasonality in Vulnerability Discovery in Major Software Systems. HyunChul Joh, Yashwant K. Malaiya (malaiya@cs.colostate.edu). Department of Computer Science, Colorado State University.

Background. Vulnerability: a defect which enables an attacker to bypass security measures [1]. Vulnerability discovery model (VDM): a probabilistic method for modeling the discovery of software vulnerabilities [2]; discovery spans a few years, from introduction to replacement. Seasonality: periodic variation, a well-known statistical notion quite common in economic time series, biological systems, stock markets, etc. The Halloween indicator: low stock returns in May-October. [1] Schultz, Brown, and Longstaff, Responding to Computer Security Incidents, 1990. [2] Ozment, Improving Vulnerability Discovery Models, 2007.

Motivation (visual observation). [Figure: cumulative number of Windows NT 4.0 vulnerabilities, Jan-1995 through Sep-2007, with the per-month counts and the fitted AML curve.]

Examining seasonality. Is the seasonal pattern statistically significant, and what is its periodicity? Analysis: seasonal index analysis with a chi-square significance test, and autocorrelation function analysis. Establishing significance would enhance VDMs' predictive ability.

Data sets. The data sets analyzed here are Windows NT, Internet Information Services (IIS) server, and Internet Explorer (IE), drawn from the National Vulnerability Database (NVD) [3], the U.S. government repository of vulnerability management data collected and organized using specific standards. [3] National Institute of Standards and Technology, National Vulnerability Database.

Vulnerabilities disclosed by month:

Month | WinNT '95-'07 | IIS '96-'07 | IE '97-'07
Jan   | 42    | 15   | 15
Feb   | 20    | 10   | 32
Mar   | 12    | 2    | 22
Apr   | 13    | 11   | 29
May   | 18    | 12   | 41
Jun   | 24    | 17   | 45
Jul   | 18    | 11   | 53
Aug   | 17    | 7    | 42
Sep   | 11    | 6    | 26
Oct   | 14    | 6    | 20
Nov   | 18    | 7    | 26
Dec   | 51    | 28   | 93
Total | 258   | 132  | 444
Mean  | 21.5  | 11   | 37
s.d.  | 12.37 | 6.78 | 20.94

[Chart: percentage of vulnerabilities per month for Windows NT, IIS, and Internet Explorer; December dominates in all three.]

Seasonal index: measures how much the average for a particular period tends to be above (or below) the expected value. H₀: no seasonality is present. We evaluate it using the monthly seasonal index values given by [4]: $s_i = d_i / \bar{d}$, where $s_i$ is the seasonal index for the $i$-th month, $d_i$ is the mean value for the $i$-th month, and $\bar{d}$ is the grand average; significance is tested with $\chi^2 = \sum_{i=1}^{12} (d_i - \bar{d})^2 / \bar{d}$ against the critical value 19.68 (χ² at the 0.05 level with 11 degrees of freedom).

Seasonal index values:

Month       | WinNT    | IIS     | IE
Jan         | 1.95     | 1.36    | 0.41
Feb         | 0.93     | 0.91    | 0.86
Mar         | 0.56     | 0.81    | 0.59
Apr         | 0.60     | 1.00    | 0.78
May         | 0.84     | 1.09    | 1.11
Jun         | 1.12     | 1.55    | 1.22
Jul         | 0.84     | 1.00    | 1.43
Aug         | 0.79     | 0.64    | 1.14
Sep         | 0.51     | 0.55    | 0.70
Oct         | 0.65     | 0.55    | 0.54
Nov         | 0.84     | 0.64    | 0.70
Dec         | 2.37     | 2.55    | 2.51
χ² critical | 19.68    | 19.68   | 19.68
χ²          | 78.37    | 46      | 130.43
p-value     | 3.04e-12 | 3.23e-6 | 1.42e-6

[4] Hossein Arsham, Time-Critical Decision Making for Business Administration. Available: http://home.ubalt.edu/ntsbarsh/business-stat/stat-data/forecast.htm#rseasonindx
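This computation can be reproduced exactly from the monthly table above; for Windows NT, the following sketch yields the indexes 1.95 (Jan) through 2.37 (Dec), χ² = 78.37, and the 19.68 critical value:

```python
import numpy as np
from scipy.stats import chi2

# Monthly Windows NT counts from the data-set table (Jan..Dec).
winnt = np.array([42, 20, 12, 13, 18, 24, 18, 17, 11, 14, 18, 51])

grand_avg = winnt.mean()                    # d-bar = 21.5
seasonal_index = winnt / grand_avg          # s_i = d_i / d-bar
chi_sq = np.sum((winnt - grand_avg) ** 2 / grand_avg)

print(seasonal_index.round(2))              # Jan 1.95 ... Dec 2.37
print(chi_sq.round(2), chi2.ppf(0.95, 11).round(2))  # 78.37 vs 19.68
print(chi2.sf(chi_sq, 11))                  # ~3e-12: reject H0
```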

Autocorrelation function (ACF): a plot of autocorrelation values. With time series values $z_b, z_{b+1}, \ldots, z_n$, the ACF at lag $k$, denoted $r_k$, is [5]: $r_k = \frac{\sum_{t=b}^{n-k} (z_t - \bar{z})(z_{t+k} - \bar{z})}{\sum_{t=b}^{n} (z_t - \bar{z})^2}$, where $\bar{z} = \frac{1}{n-b+1}\sum_{t=b}^{n} z_t$. It measures the linear relationship between time series observations separated by a lag of $k$ time units. Hence, when an ACF value lies outside the confidence interval at lag $t$, a recurring relationship every $t$ periods along the timeline is indicated. [5] B. L. Bowerman and R. T. O'Connell, Time Series Forecasting: Unified Concepts and Computer Implementation, 2nd ed., Boston: Duxbury Press, 1987.
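A sketch of the sample ACF with the usual white-noise confidence band; the repeated 12-month series is synthetic, chosen so the lag-12 correlation stands out:

```python
import numpy as np

def acf(z, max_lag):
    """Sample autocorrelation r_k for lags 1..max_lag (Bowerman & O'Connell)."""
    z = np.asarray(z, dtype=float)
    zbar = z.mean()
    denom = np.sum((z - zbar) ** 2)
    return np.array([np.sum((z[:-k] - zbar) * (z[k:] - zbar)) / denom
                     for k in range(1, max_lag + 1)])

# A 12-month pattern repeated three times: the lag-12 ACF is prominent.
series = np.array([42, 20, 12, 13, 18, 24, 18, 17, 11, 14, 18, 51] * 3)
r = acf(series, 14)
band = 1.96 / np.sqrt(series.size)       # approximate 95% interval
for k, v in enumerate(r, start=1):
    flag = " <-- outside interval" if abs(v) > band else ""
    print(f"lag {k:2d}: {v:+.2f}{flag}")
```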

Autocorrelation (ACF) results. As expected, lags corresponding to 6 months or its multiples have ACF values outside the confidence interval. [ACF plots; the upper and lower dotted lines mark the 95% confidence intervals.] An event occurring at time t + k (k > 0) lags behind an event occurring at time t; lags are in months.

Conclusion and future work. The results show strong seasonality in the systems examined, with higher discovery rates in some months; this needs to be taken into account to make accurate projections. Future work: study of diverse software products, commercial and open source, to identify the causes of seasonality and possible variation across software systems.

Halloween indicator, also known as "Sell in May and go away." Global study (1973-1996); for 1950-2008, November-April returned 12.47% annualized (s.d. 12.58%) vs. 10.92% (s.d. 17.76%) over all 12 months. The effect holds in 36 of 37 developing and developed nations, with data going back to 1694, and has no convincing explanation. [Chart: average monthly returns, January through December, roughly in the range −0.01 to 0.02.] Jacobsen, Ben and Bouman, Sven, "The Halloween Indicator, 'Sell in May and Go Away': Another Puzzle" (July 2001). Available at SSRN: http://ssrn.com/abstract=76248

References
O. H. Alhazmi, Y. K. Malaiya, and I. Ray, "Measuring, Analyzing and Predicting Security Vulnerabilities in Software Systems," Computers & Security, vol. 26, no. 3, May 2007, pp. 219-228.
J. Kim, Y. K. Malaiya, and I. Ray, "Vulnerability Discovery in Multi-Version Software Systems," Proc. 10th IEEE Int. Symp. on High Assurance Systems Engineering (HASE), Dallas, Nov. 2007, pp. 141-148.
H. Joh and Y. K. Malaiya, "Seasonal Variation in the Vulnerability Discovery Process," Proc. 2nd IEEE Int. Conf. on Software Testing, Verification, and Validation, April 2009, pp. 191-200.
G. Schryen, "Is Open Source Security a Myth? What Do Vulnerability and Patch Data Say?," Communications of the ACM, vol. 54, no. 5, May 2011, pp. 130-140.