Convexity, concavity, super-additivity, and sub-additivity of cost function without fixed cost


MPRA
Munich Personal RePEc Archive

Convexity, concavity, super-additivity, and sub-additivity of cost function without fixed cost

Yasuhito Tanaka and Masahiko Hattori

31 July 2017

Online at https://mpra.ub.uni-muenchen.de/80502/
MPRA Paper No. 80502, posted 1 August 2017 05:30 UTC

Convexity, concavity, super-additivity, and sub-additivity of cost function without fixed cost

Masahiko Hattori*
Faculty of Economics, Doshisha University, Kamigyo-ku, Kyoto, 602-8580, Japan.

and

Yasuhito Tanaka†
Faculty of Economics, Doshisha University, Kamigyo-ku, Kyoto, 602-8580, Japan.

Abstract

With zero fixed cost, convexity of a cost function implies super-additivity, and concavity of a cost function implies sub-additivity, but the converse relations do not hold. However, in addition to the zero fixed cost condition we impose the following assumption: (1) if a cost function is convex in some interval, it is convex throughout the domain; (2) if a cost function is concave in some interval, it is concave throughout the domain. Then super-additivity implies convexity and sub-additivity implies concavity. Consequently, super-additivity and convexity are equivalent, and sub-additivity and concavity are equivalent.

Keywords: cost function, convexity, concavity, super-additivity, sub-additivity

JEL Classification: D43, L13

*mhattori@mail.doshisha.ac.jp
†yasuhito@mail.doshisha.ac.jp

1 Introduction

Convexity and concavity are important properties of firms' cost functions. Super-additivity and sub-additivity are other important properties. A cost function $c(x)$ is convex when it satisfies

$$c(\lambda x + (1-\lambda)y) \leq \lambda c(x) + (1-\lambda)c(y) \quad \text{for } 0 \leq \lambda \leq 1,\ x \geq 0,\ y \geq 0.$$

It is concave when it satisfies

$$c(\lambda x + (1-\lambda)y) \geq \lambda c(x) + (1-\lambda)c(y) \quad \text{for } 0 \leq \lambda \leq 1,\ x \geq 0,\ y \geq 0.$$

It is super-additive if it satisfies

$$c(x + y) \geq c(x) + c(y) \quad \text{for } x \geq 0,\ y \geq 0.$$

It is sub-additive if it satisfies

$$c(x + y) \leq c(x) + c(y) \quad \text{for } x \geq 0,\ y \geq 0.$$

It is well known that with zero fixed cost, that is, $c(0) = 0$, convexity implies super-additivity and concavity implies sub-additivity, but the converse relations do not hold. See Bruckner and Ostrow (1962) and Sen and Stamatopoulos (2016). Referring to Bourin and Hiai (2015), Sen and Stamatopoulos (2016) pointed out that the following function is super-additive but not convex:

$$x e^{-\frac{1}{x^2}},\quad x \geq 0.$$

However, if, in addition to the zero fixed cost condition, we impose the following assumption, we can show that super-additivity implies convexity, and sub-additivity implies concavity.

Assumption 1. (1) If a cost function is convex in some interval, it is convex throughout the domain.
(2) If a cost function is concave in some interval, it is concave throughout the domain.

Then super-additivity and convexity are equivalent, and sub-additivity and concavity are equivalent.

Assumption 1 excludes the case where a cost function is strictly convex in some interval and strictly concave in another interval. The above-mentioned $x e^{-\frac{1}{x^2}}$ is such a function. We assume that a cost function $c(x)$ is positive for positive $x$, increasing and continuous.
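As a quick numerical illustration of this counterexample, the following sketch (not part of the original paper; the grid, test points, and tolerance are arbitrary choices) checks super-additivity of $c(x) = x e^{-1/x^2}$ on a grid and exhibits a failure of the convexity (midpoint) inequality for larger $x$.

```python
import math

def c(x: float) -> float:
    """Cost function c(x) = x * exp(-1/x^2) with c(0) = 0 (zero fixed cost)."""
    return 0.0 if x == 0.0 else x * math.exp(-1.0 / x**2)

# Check super-additivity c(x + y) >= c(x) + c(y) on a grid.
grid = [0.1 * k for k in range(0, 51)]          # x, y in [0, 5]
superadditive = all(c(x + y) >= c(x) + c(y) - 1e-12 for x in grid for y in grid)
print("super-additive on grid:", superadditive)  # expected: True

# Convexity would require c((x + y)/2) <= (c(x) + c(y))/2 for all x, y.
# The function is concave for large x, so the midpoint inequality fails there.
x, y = 2.0, 4.0
print("midpoint value c((x+y)/2):", c((x + y) / 2))
print("average (c(x)+c(y))/2   :", (c(x) + c(y)) / 2)
print("convexity violated:", c((x + y) / 2) > (c(x) + c(y)) / 2)  # expected: True
```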

2 Main results

We show the following theorems.

Theorem 1. (1) If there is no fixed cost, convexity of a cost function implies its super-additivity.
(2) If there is no fixed cost, concavity of a cost function implies its sub-additivity.

Proof. (1) Convexity $\Rightarrow$ super-additivity: For $0 \leq \lambda \leq 1$ and $x \geq 0$, convexity implies

$$c(\lambda x + (1-\lambda) \cdot 0) \leq \lambda c(x) + (1-\lambda)c(0) = \lambda c(x).$$

Thus,

$$c(\lambda x) \leq \lambda c(x).$$

Then, for $x > 0$ and $y \geq 0$,

$$c(x) + c(y) = c\left(\frac{x}{x+y}(x+y)\right) + c\left(\frac{y}{x+y}(x+y)\right) \leq \frac{x}{x+y}c(x+y) + \frac{y}{x+y}c(x+y) = c(x+y).$$

(2) Concavity $\Rightarrow$ sub-additivity: For $0 \leq \lambda \leq 1$ and $x \geq 0$, concavity implies

$$c(\lambda x + (1-\lambda) \cdot 0) \geq \lambda c(x) + (1-\lambda)c(0) = \lambda c(x).$$

Thus,

$$c(\lambda x) \geq \lambda c(x).$$

Then, for $x > 0$ and $y \geq 0$,

$$c(x) + c(y) = c\left(\frac{x}{x+y}(x+y)\right) + c\left(\frac{y}{x+y}(x+y)\right) \geq \frac{x}{x+y}c(x+y) + \frac{y}{x+y}c(x+y) = c(x+y).$$
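Theorem 1 is easy to confirm numerically for particular cost functions. The sketch below is illustrative only; the cost functions $c(x) = x^2$ (convex) and $c(x) = \sqrt{x}$ (concave) and the grid are our own choices, not taken from the paper.

```python
import math

def convex_cost(x: float) -> float:
    """A convex cost function with zero fixed cost: c(x) = x^2."""
    return x * x

def concave_cost(x: float) -> float:
    """A concave cost function with zero fixed cost: c(x) = sqrt(x)."""
    return math.sqrt(x)

grid = [0.25 * k for k in range(0, 41)]  # x, y in [0, 10]
tol = 1e-12

# Theorem 1(1): convexity + c(0) = 0  =>  c(x + y) >= c(x) + c(y)
super_ok = all(convex_cost(x + y) >= convex_cost(x) + convex_cost(y) - tol
               for x in grid for y in grid)

# Theorem 1(2): concavity + c(0) = 0  =>  c(x + y) <= c(x) + c(y)
sub_ok = all(concave_cost(x + y) <= concave_cost(x) + concave_cost(y) + tol
             for x in grid for y in grid)

print("x^2 is super-additive on grid:", super_ok)    # expected: True
print("sqrt(x) is sub-additive on grid:", sub_ok)    # expected: True
```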

Theorem 2. Suppose that a cost function satisfies Assumption 1, and there is no fixed cost.
(1) Super-additivity of a cost function implies convexity.
(2) Sub-additivity of a cost function implies concavity.

Proof. (1) Super-additivity $\Rightarrow$ convexity: Let $x > 0$. By super-additivity, for any $0 \leq \lambda \leq 1$,

$$c(x) \geq c(\lambda x) + c((1-\lambda)x).$$

This implies, for any $0 \leq \lambda \leq 1$,

$$c(x) \geq c(\lambda x + (1-\lambda)\cdot 0) + c(\lambda \cdot 0 + (1-\lambda)x). \quad (1)$$

By the zero fixed cost condition, for any $0 \leq \lambda \leq 1$,

$$c(x) = \lambda c(x) + (1-\lambda)c(0) + \lambda c(0) + (1-\lambda)c(x). \quad (2)$$

(1) and (2) imply that, for any $0 \leq \lambda \leq 1$,

$$\lambda c(x) + (1-\lambda)c(0) \geq c(\lambda x + (1-\lambda)\cdot 0), \quad \text{(3a)}$$

or

$$\lambda c(0) + (1-\lambda)c(x) \geq c(\lambda \cdot 0 + (1-\lambda)x) \quad \text{(3b)}$$

holds. Suppose that for some $x$ and some $0 < \lambda < 1$, (3a) does not hold, that is,

$$\lambda c(x) + (1-\lambda)c(0) < c(\lambda x + (1-\lambda)\cdot 0). \quad \text{(4a)}$$

Then, (3b) must hold with strict inequality, that is,

$$\lambda c(0) + (1-\lambda)c(x) > c(\lambda \cdot 0 + (1-\lambda)x). \quad \text{(4b)}$$

(4a) means that $c(x)$ cannot be convex (nor linear) throughout $[0, x]$. Similarly, (4b) means that $c(x)$ cannot be concave (nor linear) throughout $[0, x]$. This contradicts Assumption 1. If $c(x)$ is not differentiable at, for example, $y$, it may be neither concave nor convex (nor linear) in an interval including $y$. However, if we exclude a pathological function which is continuous everywhere but differentiable nowhere, any function is concave or convex in some interval. Then, by Assumption 1, it is concave or convex throughout the domain. Therefore, both (3a) and (3b) must hold for any $x > 0$ and any $0 \leq \lambda \leq 1$. Alternatively, we can prove the same result assuming that (3b) instead of (3a) does not hold.

Let $y = \alpha x$ for $x > 0$ and $0 < \alpha < 1$. Assume that for some $0 < \lambda < 1$,

$$\lambda c(x) + (1-\lambda)c(y) < c(\lambda x + (1-\lambda)y). \quad (5)$$

Let

$$z = \lambda x + (1-\lambda)y = [\lambda + (1-\lambda)\alpha]x.$$

By (3a),

$$\alpha c(x) \geq c(\alpha x) = c(y).$$

Then, from (5),

$$\left(\frac{\lambda}{\alpha} + 1 - \lambda\right)c(y) < c(z).$$

This means

$$c(y) < \beta c(z) + (1-\beta)c(0), \quad (6)$$

where

$$\beta = \frac{\alpha}{\lambda + (1-\lambda)\alpha} > 0 \quad \text{and} \quad 1 - \beta = \frac{\lambda(1-\alpha)}{\lambda + (1-\lambda)\alpha} > 0.$$

Note that $\beta z = \alpha x = y$, so (6) compares $c$ at the convex combination $y = \beta z + (1-\beta)\cdot 0$ with the corresponding combination of $c(z)$ and $c(0)$. (5) means that $c(x)$ cannot be convex (nor linear) throughout $[y, x]$. (6) means that $c(x)$ cannot be concave (nor linear) throughout $[0, z]$. This contradicts Assumption 1. Therefore, for any $x > y \geq 0$ and any $0 \leq \lambda \leq 1$,

$$\lambda c(x) + (1-\lambda)c(y) \geq c(\lambda x + (1-\lambda)y).$$

Thus, $c(x)$ is convex throughout the domain.

(2) Sub-additivity $\Rightarrow$ concavity: Let $x > 0$. By sub-additivity, for any $0 \leq \lambda \leq 1$,

$$c(x) \leq c(\lambda x) + c((1-\lambda)x).$$

This implies, for any $0 \leq \lambda \leq 1$,

$$c(x) \leq c(\lambda x + (1-\lambda)\cdot 0) + c(\lambda \cdot 0 + (1-\lambda)x). \quad (7)$$

By the zero fixed cost condition, for any $0 \leq \lambda \leq 1$,

$$c(x) = \lambda c(x) + (1-\lambda)c(0) + \lambda c(0) + (1-\lambda)c(x). \quad (8)$$

(7) and (8) imply that, for any $0 \leq \lambda \leq 1$,

$$\lambda c(x) + (1-\lambda)c(0) \leq c(\lambda x + (1-\lambda)\cdot 0), \quad \text{(9a)}$$

or

$$\lambda c(0) + (1-\lambda)c(x) \leq c(\lambda \cdot 0 + (1-\lambda)x) \quad \text{(9b)}$$

holds. Suppose that for some $x$ and some $0 < \lambda < 1$, (9a) does not hold, that is,

$$\lambda c(x) + (1-\lambda)c(0) > c(\lambda x + (1-\lambda)\cdot 0). \quad \text{(10a)}$$

Then, (9b) must hold with strict inequality, that is,

$$\lambda c(0) + (1-\lambda)c(x) < c(\lambda \cdot 0 + (1-\lambda)x). \quad \text{(10b)}$$

(10a) means that $c(x)$ cannot be concave throughout $[0, x]$. Similarly, (10b) means that $c(x)$ cannot be convex throughout $[0, x]$. This contradicts Assumption 1. Therefore, both (9a) and (9b) must hold for any $x > 0$ and any $0 \leq \lambda \leq 1$. Alternatively, we can prove the same result assuming that (9b) instead of (9a) does not hold.

Let $y' = \alpha' x$ for $x > 0$ and $0 < \alpha' < 1$. Assume that for some $0 < \lambda < 1$,

$$\lambda c(x) + (1-\lambda)c(y') > c(\lambda x + (1-\lambda)y'). \quad (11)$$

Let

$$z' = \lambda x + (1-\lambda)y' = [\lambda + (1-\lambda)\alpha']x.$$

By (9a),

$$\alpha' c(x) \leq c(\alpha' x) = c(y').$$

Then, from (11),

$$\left(\frac{\lambda}{\alpha'} + 1 - \lambda\right)c(y') > c(z').$$

This means

$$c(y') > \beta' c(z') + (1-\beta')c(0), \quad (12)$$

where

$$\beta' = \frac{\alpha'}{\lambda + (1-\lambda)\alpha'} > 0 \quad \text{and} \quad 1 - \beta' = \frac{\lambda(1-\alpha')}{\lambda + (1-\lambda)\alpha'} > 0.$$

(11) means that $c(x)$ cannot be concave throughout $[y', x]$. (12) means that $c(x)$ cannot be convex throughout $[0, z']$. This contradicts Assumption 1. Therefore, for any $x > y' \geq 0$ and any $0 \leq \lambda \leq 1$,

$$\lambda c(x) + (1-\lambda)c(y') \leq c(\lambda x + (1-\lambda)y').$$

Thus, $c(x)$ is concave throughout the domain.
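The algebra behind $\beta$ (and its primed counterpart $\beta'$) in the proof above can be verified symbolically. The following sketch is an illustrative check rather than part of the paper, and assumes the SymPy library is available; it confirms that $\beta z = y$, so that $y$ is indeed the convex combination $\beta z + (1-\beta)\cdot 0$, and that $1-\beta$ has the stated form.

```python
import sympy as sp

alpha, lam, x = sp.symbols('alpha lambda x', positive=True)

y = alpha * x                                  # y = alpha * x
z = (lam + (1 - lam) * alpha) * x              # z = lambda*x + (1 - lambda)*y
beta = alpha / (lam + (1 - lam) * alpha)       # beta as defined in the proof

# beta * z should equal y, so y is the convex combination beta*z + (1 - beta)*0.
print(sp.simplify(beta * z - y))               # expected: 0

# 1 - beta should equal lambda*(1 - alpha) / (lambda + (1 - lambda)*alpha).
print(sp.simplify((1 - beta) - lam * (1 - alpha) / (lam + (1 - lam) * alpha)))  # expected: 0
```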

If $c(x)$ is twice continuously differentiable, convexity of $c(x)$ is equivalent to non-negativity of its second order derivative, that is, $c''(x) \geq 0$, and its concavity is equivalent to non-positivity of its second order derivative, that is, $c''(x) \leq 0$. We present a proof of Theorem 2 using differentiability.

Proof of Theorem 2 with differentiability. (1) Super-additivity $\Rightarrow$ convexity: Let $x > 0$ and $y > 0$. By super-additivity,

$$c(x + y) \geq c(x) + c(y).$$

With the zero fixed cost condition, we have

$$\frac{c(x + y) - c(x)}{y} \geq \frac{c(y) - c(0)}{y}.$$

Let $y \to 0$:

$$\lim_{y \to 0}\frac{c(x + y) - c(x)}{y} \geq \lim_{y \to 0}\frac{c(y) - c(0)}{y}.$$

Thus,

$$c'(x) \geq c'(0).$$

This means

$$\frac{c'(x) - c'(0)}{x} \geq 0.$$

Letting $x \to 0$, we obtain

$$c''(0) = \lim_{x \to 0}\frac{c'(x) - c'(0)}{x} \geq 0.$$

Suppose that there exist $x_2 > x_1 > 0$ such that $c'(x_2) < c'(x_1)$. Since $c'(x_2) \geq c'(0)$, we have $c'(x_1) > c'(0)$. Then, $c(x)$ is strictly concave in an interval within $[x_1, x_2]$ and at the same time strictly convex in an interval within $[0, x_1]$. This contradicts Assumption 1. Thus, $c''(x) \geq 0$ throughout the domain and $c(x)$ is convex.

(2) Sub-additivity $\Rightarrow$ concavity: Let $x > 0$ and $y > 0$. By sub-additivity,

$$c(x + y) \leq c(x) + c(y).$$

With the zero fixed cost condition, we have

$$\frac{c(x + y) - c(x)}{y} \leq \frac{c(y) - c(0)}{y}.$$

Let $y \to 0$:

$$\lim_{y \to 0}\frac{c(x + y) - c(x)}{y} \leq \lim_{y \to 0}\frac{c(y) - c(0)}{y}.$$

Thus,

$$c'(x) \leq c'(0).$$

This means

$$\frac{c'(x) - c'(0)}{x} \leq 0.$$

Letting $x \to 0$, we obtain

$$c''(0) = \lim_{x \to 0}\frac{c'(x) - c'(0)}{x} \leq 0.$$

Suppose that there exist $x'_2 > x'_1 > 0$ such that $c'(x'_2) > c'(x'_1)$. Since $c'(x'_2) \leq c'(0)$, we have $c'(x'_1) < c'(0)$. Then, $c(x)$ is strictly convex in an interval within $[x'_1, x'_2]$ and at the same time strictly concave in an interval within $[0, x'_1]$. This contradicts Assumption 1. Thus, $c''(x) \leq 0$ throughout the domain and $c(x)$ is concave.
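This second-derivative characterization also makes clear why the counterexample from the introduction is compatible with Theorem 2: $c(x) = x e^{-1/x^2}$ violates Assumption 1 because its second derivative changes sign. The sketch below is an illustrative check, not part of the paper, and assumes the SymPy library is available.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
c = x * sp.exp(-1 / x**2)          # the super-additive but non-convex cost function

c2 = sp.simplify(sp.diff(c, x, 2))  # second derivative c''(x)
print(c2)

# c'' is positive near 0 (convex region) and negative for larger x (concave region),
# so the function is neither convex nor concave throughout the domain.
print(float(c2.subs(x, 1)))   # expected: positive
print(float(c2.subs(x, 3)))   # expected: negative
```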

3 Concluding Remark

We usually assume that a cost function is convex throughout the domain or concave throughout the domain. Therefore, if there is no fixed cost, we can use convexity and super-additivity of a cost function interchangeably, and can use concavity and sub-additivity of a cost function interchangeably.

References

Bourin, J.-C. and Hiai, F. (2015) Anti-norms on finite von Neumann algebras, Publications of the Research Institute for Mathematical Sciences, 51, pp. 207-235.

Bruckner, A. M. and Ostrow, E. (1962) Some function classes related to the class of convex functions, Pacific Journal of Mathematics, 12, pp. 1203-1215.

Sen, D. and Stamatopoulos, G. (2016) Licensing under general demand and cost functions, European Journal of Operational Research, 253, pp. 673-680.