
(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)
(19) World Intellectual Property Organization, International Bureau
(10) International Publication Number: WO 2016/ A1
(43) International Publication Date: 21 April 2016
(51) International Patent Classification: G06F 15/18
(21) International Application Number: PCT/US2015/
(22) International Filing Date: 14 October 2015
(25) Filing Language: English
(26) Publication Language: English
(30) Priority Data: 62/063,819, 14 October 2014, US
(71) Applicant: SKYTREE, INC. [US/US]; Technology Drive, Suite 700, San Jose, CA (US).
(72) Inventors: GIBIANSKY, Maxsim; 825 East Evelyn Avenue #506, Sunnyvale, CA (US). RIEGEL, Ryan; 679 N. Capitol Avenue #3, San Jose, CA (US). YANG, Yi; 613 Arcadia Terrace Apt. #304, Sunnyvale, CA (US). RAM, Parikshit; 384 7th St. NE, Atlanta, GA (US). GRAY, Alexander; 550 Moreland Way, Unit 3407, Santa Clara, CA (US).
(74) Agents: HOLMES, Matthew, M. et al.; Patent Law Works LLP, 201 S. Main Street, Suite 250, Salt Lake City, UT (US).
(81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, AO, AT, AU, AZ, BA, BB, BG, BH, BN, BR, BW, BY, BZ, CA, CH, CL, CN, CO, CR, CU, CZ, DE, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IR, IS, JP, KE, KG, KN, KP, KR, KZ, LA, LC, LK, LR, LS, LU, LY, MA, MD, ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI, NO, NZ, OM, PA, PE, PG, PH, PL, PT, QA, RO, RS, RU, RW, SA, SC, SD, SE, SG, SK, SL, SM, ST, SV, SY, TH, TJ, TM, TN, TR, TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.
(84) Designated States (unless otherwise indicated, for every kind of regional protection available): ARIPO (BW, GH, GM, KE, LR, LS, MW, MZ, NA, RW, SD, SL, ST, SZ, TZ, UG, ZM, ZW), Eurasian (AM, AZ, BY, KG, KZ, RU, [Continued on next page]
(54) Title: CONFIGURABLE MACHINE LEARNING METHOD SELECTION AND PARAMETER OPTIMIZATION SYSTEM AND METHOD
(57) Abstract: A system and method for selecting a machine learning method and optimizing the parameters that control its behavior, including receiving data; determining, using one or more processors, a first candidate machine learning method; tuning, using one or more processors, one or more parameters of the first candidate machine learning method; determining, using one or more processors, that the first candidate machine learning method and a first parameter configuration for the first candidate machine learning method are the best based on a measure of fitness subsequent to satisfaction of a stop condition; and outputting, using one or more processors, the first candidate machine learning method and the first parameter configuration for the first candidate machine learning method.
Figure 2

WO 2016/ A1
(84) Designated States (continued): TJ, TM), European (AL, AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU, LV, MC, MK, MT, NL, NO, PL, PT, RO, RS, SE, SI, SK, SM, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ, GW, KM, ML, MR, NE, SN, TD, TG).
Published: with international search report (Art. 21(3))

CONFIGURABLE MACHINE LEARNING METHOD SELECTION AND PARAMETER OPTIMIZATION SYSTEM AND METHOD

CROSS-REFERENCE TO RELATED APPLICATIONS
[001] The present application claims priority, under 35 U.S.C. 119, of U.S. Provisional Patent Application No. 62/063,819, filed October 14, 2014 and entitled "Configurable Machine Learning Method Selection and Parameter Optimization System and Method for Very Large Data Sets," the entirety of which is hereby incorporated by reference.

BACKGROUND OF THE INVENTION
1. Field of the Invention
[002] The disclosure is related generally to machine learning involving data and in particular to a system and method for selecting between different machine learning methods and optimizing the parameters that control their behavior.

2. Description of Related Art
[003] With the rapid development of science and engineering, people who analyze data are faced with more and more models and algorithms to choose from, and almost all of them are highly parameterized. In order to obtain satisfactory performance, an appropriate model and/or algorithm with optimized parameter settings has to be carefully selected based on the given dataset, and solving this high dimensional optimization problem has become a challenging task.

[004] One commonly used parameter tuning method is grid search, which conducts an exhaustive search in a confined domain for each parameter. However, this traditional method is restricted to tuning over parameters within one model, and can be extremely computationally intensive when tuning more than one parameter, as is typically necessary for the best-performing models on the largest datasets, which often have dozens of parameters or more. Additionally, the statistical performance of grid search is highly sensitive to user input, e.g., the search range and the step size. This makes grid search unapproachable for non-expert users, who may conclude that a particular machine learning method is inferior when actually they have just misjudged the appropriate ranges for one or more of its parameters. To alleviate these drawbacks, researchers have proposed techniques such as iterative refinement, which can accelerate the tuning process to some extent, but unfortunately still requires input from users and is not efficient enough for high dimensional cases. Random search is another popular method, but its performance is also sensitive to the initial setting and the dataset. Regardless, neither of these two techniques can effectively help select from among different models and/or algorithms.

[005] Recently, researchers have proposed another type of method, model-based parameter tuning, which has been shown to outperform traditional methods on high dimensional problems. Previous work on model-based tuning methods includes the tree-structured Parzen estimator (TPE), proposed by Bergstra, J. S., Bardenet, R., Bengio, Y., and Kegl, B., "Algorithms for hyper-parameter optimization," Advances in Neural Information Processing Systems (2011), and sequential model-based algorithm configuration (SMAC), proposed by Hutter, F., Hoos, H., and Leyton-Brown, K., "Sequential model-based optimization for general algorithm configuration," Learning and Intelligent Optimization, Springer Berlin Heidelberg (2011). Thornton, C., Hutter, F., Hoos, H. H., and Leyton-Brown, K., "Auto-WEKA: Combined selection and hyperparameter optimization of classification algorithms," Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM (2013), combined the work in the above papers and applied different techniques for tuning classification algorithms implemented in the Waikato Environment for Knowledge Analysis (WEKA). However, this model is restricted to the classification task on small datasets, and it does not allow users to specify and configure the tuning space for a specific task.

[006] Thus, there is a need for a system and method that selects between different machine learning methods and optimizes the parameters that control their behavior.
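By way of illustration only, the sketch below shows the grid search and random search baselines discussed above applied to a single model; the train_fn, score_fn, and parameter names are hypothetical placeholders and not part of this disclosure.

```python
import itertools
import random

def grid_search(train_fn, score_fn, param_grid):
    """Exhaustively evaluate every combination in a user-supplied grid.

    param_grid: dict mapping parameter name -> list of candidate values.
    train_fn(params) -> model; score_fn(model) -> measure of fitness (higher is better).
    """
    best_params, best_score = None, float("-inf")
    names = sorted(param_grid)
    for values in itertools.product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = score_fn(train_fn(params))
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

def random_search(train_fn, score_fn, param_ranges, n_trials=50, seed=0):
    """Sample parameter settings uniformly from user-supplied (low, high) ranges."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in param_ranges.items()}
        score = score_fn(train_fn(params))
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Note that the cost of the exhaustive loop grows exponentially with the number of parameters, and both helpers depend entirely on the user supplying sensible ranges, which are exactly the drawbacks identified in paragraph [004].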

SUMMARY OF THE INVENTION
[007] The present invention overcomes one or more of the deficiencies of the prior art at least in part by providing a system and method for selecting between different machine learning methods and optimizing the parameters that control their behavior.

[008] According to one innovative aspect of the subject matter described in this disclosure, a system comprises: one or more processors; and a memory storing instructions that, when executed by the one or more processors, cause the system to: receive data; determine a first candidate machine learning method; tune one or more parameters of the first candidate machine learning method; determine that the first candidate machine learning method and a first parameter configuration for the first candidate machine learning method are the best based on a measure of fitness subsequent to satisfaction of a stop condition; and output the first candidate machine learning method and the first parameter configuration for the first candidate machine learning method.

[009] In general, another innovative aspect of the subject matter described in this disclosure may be embodied in methods that include receiving data; determining, using one or more processors, a first candidate machine learning method; tuning, using one or more processors, one or more parameters of the first candidate machine learning method; determining, using one or more processors, that the first candidate machine learning method and a first parameter configuration for the first candidate machine learning method are the best based on a measure of fitness subsequent to satisfaction of a stop condition; and outputting, using one or more processors, the first candidate machine learning method and the first parameter configuration for the first candidate machine learning method.

[0010] Other aspects include corresponding methods, systems, apparatus, and computer program products. These and other implementations may each optionally include one or more of the following features.

[0011] For instance, the operations further include: determining a second candidate machine learning method; tuning, using one or more processors, one or more parameters of the second candidate machine learning method, the second candidate machine learning method differing from the first candidate machine learning method; and wherein the determination that the first candidate machine learning method and the first parameter configuration for the first candidate machine learning method are the best based on the measure of fitness includes determining that the first candidate machine learning method and the first parameter configuration for the first candidate machine learning method provide superior performance with regard to the measure of fitness when compared to the second candidate machine learning method with the second parameter configuration.

[0012] For instance, the features include: the tuning of the one or more parameters of the first candidate machine learning method is performed using a first processor of the one or more processors, and the tuning of the one or more parameters of the second candidate machine learning method is performed using a second processor of the one or more processors in parallel with the tuning of the first candidate machine learning method.
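By way of illustration only, a minimal sketch of the selection flow described in paragraphs [008] through [0012]: each candidate machine learning method is tuned on its own worker process, and the method and parameter configuration with the best measure of fitness are output. The tune() stand-in (a simple random search), the search spaces, and the fitness_fn signature are assumptions made for the sketch, not the claimed model-based tuner.

```python
import random
from concurrent.futures import ProcessPoolExecutor

# Hypothetical search spaces for two candidate methods; a real system would take
# these from the user or from a catalog of supported methods.
SEARCH_SPACES = {
    "svm": {"C": (0.01, 100.0), "gamma": (1e-4, 1.0)},
    "gbm": {"learning_rate": (0.01, 0.3), "n_trees": (50, 500)},
}

def tune(method_name, fitness_fn, budget=50, seed=0):
    """Stand-in tuner: random search over the method's space, returning the best
    (parameter configuration, fitness) pair found within the budget."""
    rng = random.Random(seed)
    space = SEARCH_SPACES[method_name]
    best_params, best_fit = None, float("-inf")
    for _ in range(budget):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in space.items()}
        fit = fitness_fn(method_name, params)
        if fit > best_fit:
            best_params, best_fit = params, fit
    return best_params, best_fit

def select_best_method(candidate_methods, fitness_fn, budget=50):
    """Tune each candidate in parallel (one worker per method) and output the
    method and parameter configuration that are best by the measure of fitness."""
    with ProcessPoolExecutor(max_workers=len(candidate_methods)) as pool:
        futures = {m: pool.submit(tune, m, fitness_fn, budget) for m in candidate_methods}
        results = {m: f.result() for m, f in futures.items()}
    best = max(results, key=lambda m: results[m][1])
    best_params, best_fitness = results[best]
    return best, best_params, best_fitness
```

The one-worker-per-candidate layout mirrors the parallel tuning of paragraph [0012]; a single processor alternating between candidates, as in paragraph [0013] below, would simply interleave calls to tune() instead.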

[0013] For instance, the features include: a first processor of the one or more processors alternates between the tuning of the one or more parameters of the first candidate machine learning method and the tuning of the one or more parameters of the second candidate machine learning method.

[0014] For instance, the features include: a greater portion of the resources of the one or more processors is dedicated to tuning the one or more parameters of the first candidate machine learning method than to tuning the one or more parameters of the second candidate machine learning method based on tuning already performed on the first candidate machine learning method and the second candidate machine learning method, the tuning already performed indicating that the first candidate machine learning method is performing better than the second candidate machine learning method based on the measure of fitness.

[0015] For instance, the features include: the user specifies the data, and wherein the first candidate machine learning method and the second candidate machine learning method are selected and the tunings and determination are performed automatically, either without user-provided information or with user-provided information.

[0016] For instance, the features include: tuning the one or more parameters of the first candidate machine learning method further comprising: setting a prior parameter distribution; generating a set of sample parameters for the one or more parameters of the first candidate machine learning method based on the prior parameter distribution; forming a new parameter distribution based on the prior parameter distribution and the previously generated set of sample parameters for each of the one or more parameters of the first candidate machine learning method; and generating a new set of sample parameters for the one or more parameters of the first candidate machine learning method.

[0017] For instance, the operations further include: determining the stop condition is not met; setting the new parameter distribution as the previously learned parameter distribution and setting the new set of sample parameters as the previously generated set of sample parameters; and repeatedly forming a new parameter distribution based on the previously learned parameter distribution and the previously generated sample parameters for each of the one or more parameters of the first candidate machine learning method, generating a new set of sample parameters for the one or more parameters of the first candidate machine learning method, and setting the new parameter distribution as the previously learned parameter distribution and setting the new set of sample parameters as the previously generated set of sample parameters until the stop condition is met.
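One way to read the distribution-based tuning of paragraphs [0016] and [0017] is as the sampling loop sketched below: sample parameters from the current distribution, evaluate them, refit the distribution from the samples, and repeat until the stop condition is met. The Gaussian form of the distribution, the elite-fraction refit, and the simple round-limit stop condition are illustrative assumptions, not the disclosed algorithm.

```python
import random
import statistics

def tune_by_distribution(evaluate, prior, n_samples=20, elite_frac=0.25,
                         max_rounds=30, seed=0):
    """Iteratively tune one parameter per entry of `prior`.

    prior: dict name -> (mean, std) describing the prior parameter distribution.
    evaluate: callable(params_dict) -> measure of fitness (higher is better).
    """
    rng = random.Random(seed)
    dist = dict(prior)
    best_params, best_fit = None, float("-inf")
    for _ in range(max_rounds):                      # stop condition (round limit here)
        samples = []
        for _ in range(n_samples):                   # generate a set of sample parameters
            params = {k: rng.gauss(mu, sd) for k, (mu, sd) in dist.items()}
            fit = evaluate(params)
            samples.append((fit, params))
            if fit > best_fit:
                best_params, best_fit = params, fit
        samples.sort(key=lambda s: s[0], reverse=True)
        elite = [p for _, p in samples[: max(2, int(elite_frac * n_samples))]]
        # Form the new parameter distribution from the previously generated samples.
        dist = {
            k: (statistics.mean(p[k] for p in elite),
                statistics.stdev(p[k] for p in elite) + 1e-6)
            for k in dist
        }
    return best_params, best_fit
```

Each round's fitted distribution becomes the "previously learned parameter distribution" for the next round, which is what allows the search to concentrate on promising regions without user-supplied step sizes.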

[0018] For instance, the features include: one or more of the determination of the first candidate machine learning method and the tuning of the one or more parameters of the first candidate machine learning method are based on a previously learned parameter distribution.

[0019] For instance, the features include: the received data includes at least a portion of a Big Data data set, and wherein the tuning of the one or more parameters of the first candidate machine learning method is based on the Big Data data set.

[0020] Advantages of the system and method described herein may include, but are not limited to, automatic selection of a machine learning method and optimized parameters from among multiple possible machine learning methods, parallelization of tuning one or more machine learning methods and associated parameters, selection and optimization of a machine learning method and associated parameters using Big Data, using a previous distribution to identify one or more of a machine learning method and one or more parameter configurations likely to perform well based on a measure of fitness, executing any of the preceding for a novice user, and allowing an expert user to utilize his/her domain knowledge to modify the execution of the preceding.

[0021] The features and advantages described herein are not all-inclusive and many additional features and advantages will be apparent to one of ordinary skill in the art in view of the figures and description. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and not to limit the scope of the inventive subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals are used to refer to similar elements.
[0023] Figure 1 is a block diagram of an example system for machine learning method selection and parameter optimization according to one implementation.
[0024] Figure 2 is a block diagram of an example of a selection and optimization server according to one implementation.
[0025] Figure 3 is a flowchart of an example method for a parameter optimization process according to one implementation.
[0026] Figure 4 is a flowchart of an example method for a machine learning method selection and parameter optimization process according to one implementation.
[0027] Figure 5 is a graphical representation of example input options available to users of the system and method according to one implementation.
[0028] Figure 6 is a graphical representation of an example user interface for receiving user inputs according to one implementation.
[0029] Figures 7a and 7b are illustrations of an example hierarchical relationship between parameters according to one or more implementations.

[0030] Figure 8 is a graphical representation of an example user interface for output of the machine learning method selection and parameter optimization process according to one implementation.

DETAILED DESCRIPTION
[0031] One or more of the deficiencies of existing solutions noted in the background are addressed by the disclosure herein. In the description below, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention. For example, the present invention is described in one implementation below with reference to particular hardware and software implementations. However, the present invention applies to other types of implementations distributed in the cloud, over multiple machines, using multiple processors or cores, using virtual machines or appliances, or integrated as a single machine.

[0032] Reference in the specification to "one implementation" or "an implementation" means that a particular feature, structure, or characteristic described in connection with the implementation is included in at least one implementation of the invention. The appearances of the phrase "in one implementation" in various places in the specification are not necessarily all referring to the same implementation. In particular, the present invention is described below in the context of multiple distinct architectures, and some of the components are operable in multiple architectures while others are not.

[0033] Some portions of the detailed descriptions are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.

[0034] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission, or display devices.

[0035] The present disclosure also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.

[0036] Aspects of the method and system described herein, such as the logic, may also be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices, and standard cell-based devices, as well as application specific integrated circuits. Some other possibilities for implementing aspects include: memory devices, microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc. Furthermore, aspects may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on.

[0037] Furthermore, the algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is described without reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.

[0038] A system and method for selecting between different machine learning methods and optimizing the parameters that control their behavior is described. The disclosure is particularly applicable to a machine learning method selection and parameter optimization system and method implemented in a plurality of lines of code and provided in a client/server system, and it is in this context that the disclosure is described. It will be appreciated, however, that the system and method have greater utility because they can be implemented in hardware (examples of which are described below in more detail) or implemented on other computer systems such as a cloud computing system, a standalone computer system, and the like, and these implementations are all within the scope of the disclosure.

[0039] A method and system are disclosed for automatically and simultaneously selecting between distinct machine learning models and finding optimal model parameters for various machine learning tasks. Examples of machine learning tasks include, but are not limited to, classification, regression, and ranking. The performance can be measured by and optimized using one or more measures of fitness. The one or more measures of fitness used may vary based on the specific goal of a project. Examples of potential measures of fitness include, but are not limited to, error rate, F-score, area under curve (AUC), Gini, precision, performance stability, time cost, etc.
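For illustration only, the sketch below shows one way several of the measures of fitness listed above could be exposed behind a single "higher is better" interface so that a tuner can compare configurations uniformly; the metric names, the 0.5 decision threshold, and the negation of error- and time-like measures are assumptions of the sketch rather than requirements of the disclosure.

```python
def make_fitness(metric="auc"):
    """Return a fitness function for the chosen measure; higher is always better,
    so error-like and time-like measures are negated."""
    def fitness(y_true, y_score, elapsed_seconds=0.0):
        if metric == "error_rate":
            preds = [1 if s >= 0.5 else 0 for s in y_score]
            errors = sum(p != t for p, t in zip(preds, y_true))
            return -errors / len(y_true)
        if metric == "auc":
            # Rank-based AUC: probability that a random positive outscores a random negative.
            pos = [s for s, t in zip(y_score, y_true) if t == 1]
            neg = [s for s, t in zip(y_score, y_true) if t == 0]
            wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
            return wins / (len(pos) * len(neg))
        if metric == "time_cost":
            return -elapsed_seconds
        raise ValueError(f"unknown measure of fitness: {metric}")
    return fitness
```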

[0040] Unlike the traditional grid-search-based tuning method, the model-based automatic parameter tuning method described herein is able to explore the entire space formed by different models together with their associated parameters. The model-based automatic parameter tuning method described herein is further able to intelligently and automatically detect effective search directions and refine the tuning region, and hence arrive at the desired result in an efficient way. Further, unlike other previous work, the method is able to run on datasets that are too large to be stored and/or processed on a single computer, can evaluate and learn from multiple parameter configurations simultaneously, and is appropriate for users with different skill levels.

[0041] Figure 1 shows an implementation of a system 100 for selecting between different machine learning methods and optimizing the parameters that control their behavior. In the depicted implementation, the system 100 includes a selection and optimization server 102, a plurality of client devices 114a...114n, a production server 108, a data collector 110, and an associated data store 112. In Figure 1 and the remaining figures, a letter after a reference number, e.g., "114a," represents a reference to the element having that particular reference number. A reference number in the text without a following letter, e.g., "114," represents a general reference to instances of the element bearing that reference number. In the depicted implementation, these entities of the system 100 are communicatively coupled via a network 106.

[0042] In some implementations, the system 100 includes one or more selection and optimization servers 102 coupled to the network 106 for communication with the other components of the system 100, such as the plurality of client devices 114a...114n, the production server 108, and the data collector 110 and associated data store 112. In some implementations, the selection and optimization server 102 may be a hardware server, a software server, or a combination of software and hardware.

[0043] In some implementations, the selection and optimization server 102 is a computing device having data processing (e.g. at least one processor), storing (e.g. a pool of shared or unshared memory), and communication capabilities. For example, the selection and optimization server 102 may include one or more hardware servers, server arrays, storage devices and/or systems, etc. In some implementations, the selection and optimization server 102 may include one or more virtual servers, which operate in a host server environment and access the physical hardware of the host server including, for example, a processor, memory, storage, network interfaces, etc., via an abstraction layer (e.g., a virtual machine manager). In some implementations, the selection and optimization server 102 may optionally include a web server 116 for processing content requests, such as a Hypertext Transfer Protocol (HTTP) server, a Representational State Transfer (REST) service, or some other server type, having structure and/or functionality for satisfying content requests and receiving content from one or more computing devices that are coupled to the network 106 (e.g., the production server 108, the data collector 110, the client device 114, etc.).

[0044] In some implementations, the components of the selection and optimization server 102 may be configured to implement the selection and optimization unit 104 described in more detail below. In some implementations, the selection and optimization server 102 determines a set of one or more candidate machine learning methods, automatically and intelligently tunes one or more parameters in the set of one or more candidate machine learning methods to optimize performance (based on the one or more measures of fitness), and selects the best-performing (based on the one or more measures of fitness) machine learning method and the tuned parameter configuration associated therewith. For example, the selection and optimization server 102 receives a set of training data (e.g. via a data collector 110), determines that a first machine learning method and a second machine learning method are candidate machine learning methods, determines that the measure of fitness is AUC, automatically and intelligently tunes the parameters of the first candidate machine learning method to maximize AUC, automatically and intelligently tunes, at least in part, the parameters of the second candidate machine learning method to maximize AUC, determines that the first candidate machine learning method with its tuned parameters has a greater maximum AUC than the second candidate machine learning method, and selects the first candidate machine learning method with its tuned parameters.

[0045] In one implementation, a model includes a choice of a machine learning method (e.g. GBM or SVM), hyperparameter settings (e.g. SVM's regularization term), and parameter settings (e.g. SVM's alpha coefficients on each data point), and the system and method herein can determine any of these values, which define a model. It should be recognized that indicators such as "first" and "second" (e.g. with regard to candidate machine learning methods, parameters, processors, etc.) are used for clarity and convenience as identifiers and do not necessarily indicate an ordering in time, rank, or otherwise.
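As a purely illustrative sketch of the preceding paragraph, a model configuration can be represented as the chosen method plus its hyperparameter and parameter settings together with the fitness achieved; the field names and the example values below are hypothetical, not drawn from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class ModelConfiguration:
    """One point in the joint search space: the choice of machine learning method
    plus the values that define a model."""
    method: str                                                     # e.g. "svm" or "gbm"
    hyperparameters: Dict[str, Any] = field(default_factory=dict)   # e.g. SVM's regularization term
    parameters: Dict[str, Any] = field(default_factory=dict)        # e.g. learned alpha coefficients
    fitness: float = float("-inf")                                  # measure of fitness achieved, e.g. AUC

# Made-up fitness values, purely to show how configurations would be compared.
svm_config = ModelConfiguration("svm", {"C": 1.0, "kernel": "rbf"}, fitness=0.87)
gbm_config = ModelConfiguration("gbm", {"learning_rate": 0.1, "n_trees": 200}, fitness=0.91)
best = max([svm_config, gbm_config], key=lambda c: c.fitness)
```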

[0046] Although only a single selection and optimization server 102 is shown in Figure 1, it should be understood that there may be a number of selection and optimization servers 102 or a server cluster depending on the implementation. Similarly, it should be understood that the features and functionality of the selection and optimization server 102 may be combined with the features and functionalities of one or more other servers 108/110 into a single server (not shown).

[0047] The data collector 110 is a server/service which collects data and/or analyses from other servers (not shown) coupled to the network 106. In some implementations, the data collector 110 may be a first or third-party server (that is, a server associated with a separate company or service provider), which mines data, crawls the Internet, and/or receives/retrieves data from other servers. For example, the data collector 110 may collect user data, item data, and/or user-item interaction data from other servers and then provide it and/or perform analysis on it as a service. In some implementations, the data collector 110 may be a data warehouse or belong to a data repository owned by an organization.

[0048] The data store 112 is coupled to the data collector 110 and comprises a non-volatile memory device or similar permanent storage device and media. The data collector 110 stores the data in the data store 112 and, in some implementations, provides access to the selection and optimization server 102 to retrieve the data collected by the data store 112 (e.g. training data, response variables, rewards, tuning data, test data, user data, experiments and their results, learned parameter settings, system logs, etc.). In machine learning, a response variable, which may occasionally be referred to herein as a "response," refers to a data feature containing the objective result of a prediction. A response may vary based on the context (e.g. based on the type of predictions to be made by the machine learning method). For example, responses may include, but are not limited to, class labels (classification), targets (general, but particularly relevant to regression), rankings (ranking/recommendation), ratings (recommendation), dependent values, predicted values, or objective values.

[0049] Although only a single data collector 110 and associated data store 112 are shown in Figure 1, it should be understood that there may be any number of data collectors 110 and associated data stores 112. In some implementations, there may be a first data collector 110 and associated data store 112 accessed by the selection and optimization server 102 and a second data collector 110 and associated data store 112 accessed by the production server 108. In some implementations, the data collector 110 may be omitted. For example, in some implementations, the data store 112 may be included in or otherwise accessible to the selection and optimization server 102 (e.g. as network accessible storage or as one or more storage device(s) included in the selection and optimization server 102).

[0050] In some implementations, the one or more selection and optimization servers 102 include a web server 116. The web server 116 may facilitate the coupling of the client devices 114 to the selection and optimization server 102 (e.g. negotiating a communication protocol, etc.) and may prepare the data and/or information, such as forms, web pages, tables, plots, etc., that is exchanged with each client computing device 114. For example, the web server 116 may generate a user interface to submit a set of data for processing and then return a user interface to display the results of machine learning method selection and parameter optimization as applied to the submitted data. Also, instead of or in addition to a web server 116, the selection and optimization server 102 may implement its own API for the transmission of instructions, data, results, and other information between the selection and optimization server 102 and an application installed or otherwise implemented on the client device 114.

[0051] The production server 108 is a computing device having data processing, storing, and communication capabilities. For example, the production server 108 may include one or more hardware servers, server arrays, storage devices and/or systems, etc. In some implementations, the production server 108 may include one or more virtual servers, which operate in a host server environment and access the physical hardware of the host server including, for example, a processor, memory, storage, network interfaces, etc., via an abstraction layer (e.g., a virtual machine manager). In some implementations, the production server 108 may include a web server (not shown) for processing content requests, such as a Hypertext Transfer Protocol (HTTP) server, a Representational State Transfer (REST) service, or some other server type, having structure and/or functionality for satisfying content requests and receiving content from one or more computing devices that are coupled to the network 106 (e.g., the selection and optimization server 102, the data collector 110, the client device 114, etc.). In some implementations, the production server 108 may receive the selected machine learning method with the optimized parameters for deployment and deploy the selected machine learning method with the optimized parameters (e.g. on a test dataset in batch mode or online for data analysis).

[0052] The network 106 is a conventional type, wired or wireless, and may have any number of different configurations such as a star configuration, token ring configuration, or other configurations known to those skilled in the art. Furthermore, the network 106 may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate. In one implementation, the network 106 may include a peer-to-peer network. The network 106 may also be coupled to or include portions of a telecommunications network for sending data in a variety of different communication protocols. In some instances, the network 106 includes Bluetooth communication networks or a cellular communications network. In some instances, the network 106 includes a virtual private network (VPN).

[0053] The client devices 114a...114n include one or more computing devices having data processing and communication capabilities. In some implementations, a client device 114 may include a processor (e.g., virtual, physical, etc.), a memory, a power source, a communication unit, and/or other software and/or hardware components, such as a display, graphics processor (for handling general graphics and multimedia processing for any type of application), wireless transceivers, keyboard, camera, sensors, firmware, operating systems, drivers, and various physical connection interfaces (e.g., USB, HDMI, etc.). The client device 114a may couple to and communicate with other client devices 114n and the other entities of the system 100 (e.g. the selection and optimization server 102) via the network 106 using a wireless and/or wired connection.

[0054] A plurality of client devices 114a...114n are depicted in Figure 1 to indicate that the selection and optimization server 102 may communicate and interact with a multiplicity of users on a multiplicity of client devices 114a...114n. In some implementations, the plurality of client devices 114a...114n may include a browser application through which a client device 114 interacts with the selection and optimization server 102, may include an application installed enabling the device to couple and interact with the selection and optimization server 102, may include a text terminal or terminal emulator application to interact with the selection and optimization server 102, or may couple with the selection and optimization server 102 in some other way. In the case of a standalone computer embodiment of the machine learning method selection and parameter optimization system 100, the client device 114 and the selection and optimization server 102 are combined together, and the standalone computer may, similar to the above, generate a user interface using a browser application, an installed application, a terminal emulator application, or the like.

[0055] Examples of client devices 114 may include, but are not limited to, mobile phones, tablets, laptops, desktops, terminals, netbooks, server appliances, servers, virtual machines, TVs, set-top boxes, media streaming devices, portable media players, navigation devices, personal digital assistants, etc. While two client devices 114a and 114n are depicted in Figure 1, the system 100 may include any number of client devices 114. In addition, the client devices 114a...114n may be the same or different types of computing devices.

[0056] It should be understood that the present disclosure is intended to cover the many different implementations of the system 100. In a first example, the selection and optimization server 102, the data collector 110, and the production server 108 may each be dedicated devices or machines coupled for communication with each other by the network 106. In a second example, two or more of the servers 102, 110, and 108 may be combined into a single device or machine (e.g. the selection and optimization server 102 and the production server 108 may be included in the same server). In a third example, any one or more of the servers 102, 110, and 108 may be operable on a cluster of computing cores in the cloud and configured for communication with each other. In a fourth example, any one or more of the servers 102, 110, and 108 may be virtual machines operating on computing resources distributed over the Internet. In a fifth example, any one or more of the servers 102, 110, and 108 may each be dedicated devices or machines that are firewalled or completely isolated from each other (e.g., the servers 102 and 108 may not be coupled for communication with each other by the network 106).

[0057] While the selection and optimization server 102 and the production server 108 are shown as separate devices in Figure 1, it should be understood that, in some implementations, the selection and optimization server 102 and the production server 108 may be integrated into the same device or machine. While the system 100 shows only one device 102, 106, 108, 110, and 112 of each type, it should be understood that there could be any number of devices of each type. For example, in one embodiment, the system includes multiple selection and optimization servers 102.

[0058] Moreover, it should be understood that some or all of the elements of the system 100 could be distributed and operate in the cloud using the same or different processors or cores, or multiple cores allocated for use on a dynamic, as-needed basis. Furthermore, it should be understood that the selection and optimization server 102 and the production server 108 may be firewalled from each other and have access to separate data collectors 110 and associated data stores 112. For example, the selection and optimization server 102 and the production server 108 may be in a network-isolated configuration.

[0059] Referring now to Figure 2, an example implementation of a selection and optimization server 102 is described in more detail. The illustrated selection and optimization server 102 comprises a processor 202, a memory 204, a display module 206, a network I/F module 208, an input/output device 210, and a storage device 212 coupled for communication with each other via a bus 220. The selection and optimization server 102 depicted in Figure 2 is provided by way of example, and it should be understood that it may take other forms and include additional or fewer components without departing from the scope of the present disclosure. For instance, various components may be coupled for communication using a variety of communication protocols and/or technologies including, for instance, communication buses, software communication mechanisms, computer networks, etc. While not shown, the selection and optimization server 102 may include various operating systems, sensors, additional processors, and other physical configurations.

[0060] The processor 202 comprises an arithmetic logic unit, a microprocessor, a general purpose controller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or some other processor array, or some combination thereof to execute software instructions by performing various input, logical, and/or mathematical operations to provide the features and functionality described herein. The processor 202 processes data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. The processor(s) 202 may be physical and/or virtual, and may include a single core or a plurality of processing units and/or cores. Although only a single processor is shown in Figure 2, multiple processors may be included. It should be understood that other processors, operating systems, sensors, displays, and physical configurations are possible. In some implementations, the processor(s) 202 may be coupled to the memory 204 via the bus 220 to access data and instructions therefrom and store data therein. The bus 220 may couple the processor 202 to the other components of the selection and optimization server 102 including, for example, the display module 206, the network I/F module 208, the input/output device(s) 210, and the storage device 212.

[0061] The memory 204 may store and provide access to data to the other components of the selection and optimization server 102. The memory 204 may be included in a single computing device or a plurality of computing devices. In some implementations, the memory 204 may store instructions and/or data that may be executed by the processor 202. For example, as depicted in Figure 2, the memory 204 may store the selection and optimization unit 104, and its respective components, depending on the configuration. The memory 204 is also capable of storing other instructions and data, including, for example, an operating system, hardware drivers, other software applications, databases, etc. The memory 204 may be coupled to the bus 220 for communication with the processor 202 and the other components of the selection and optimization server 102.

[0062] The instructions and/or data stored by the memory 204 may comprise code for performing any and/or all of the techniques described herein. The memory 204 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory, or some other memory device known in the art. In some implementations, the memory 204 also includes a non-volatile memory such as a hard disk drive or flash drive for storing information on a more permanent basis. The memory 204 is coupled by the bus 220 for communication with the other components of the selection and optimization server 102. It should be understood that the memory 204 may be a single device or may include multiple types of devices and configurations.

[0063] The display module 206 may include software and routines for sending processed data, analytics, or results for display to a client device 114, for example, to allow a user to interact with the selection and optimization server 102. In some implementations, the display module may include hardware, such as a graphics processor, for rendering interfaces, data, analytics, or recommendations.

[0064] The network I/F module 208 may be coupled to the network 106 (e.g., via signal line 214) and the bus 220. The network I/F module 208 links the processor 202 to the network 106 and other processing systems. The network I/F module 208 also provides other conventional connections to the network 106 for distribution of files using standard network protocols such as TCP/IP, HTTP, HTTPS, and SMTP as will be understood by those skilled in the art. In an alternate implementation, the network I/F module 208 is coupled to the network 106 by a wireless connection and the network I/F module 208 includes a transceiver for sending and receiving data. In such an alternate implementation, the network I/F module 208 includes a Wi-Fi transceiver for wireless communication with an access point. In another alternate implementation, the network I/F module 208 includes a Bluetooth transceiver for wireless communication with other devices. In yet another implementation, the network I/F module 208 includes a cellular communications transceiver for sending and receiving data over a cellular communications network such as via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, wireless access protocol (WAP), etc. In still another implementation, the network I/F module 208 includes ports for wired connectivity such as, but not limited to, universal serial bus (USB), secure digital (SD), CAT-5, CAT-5e, CAT-6, fiber optic, etc.

[0065] The input/output device(s) ("I/O devices") 210 may include any device for inputting or outputting information from the selection and optimization server 102 and can be coupled to the system either directly or through intervening I/O controllers. The I/O devices 210 may include a keyboard, mouse, camera, stylus, touch screen, display device to display electronic images, printer, speakers, etc. An input device may be any device or mechanism for providing or modifying instructions in the selection and optimization server 102. An output device may be any device or mechanism for outputting information from the selection and optimization server 102; for example, it may indicate the status of the selection and optimization server 102, such as whether it has power and is operational, has network connectivity, or is processing transactions.

[0066] The storage device 212 is an information source for storing and providing access to data, such as a plurality of datasets. The data stored by the storage device 212 may be organized and queried using various criteria including any type of data stored by it. The storage device 212 may include data tables, databases, or other organized collections of data. The storage device 212 may be included in the selection and optimization server 102 or in another computing system and/or storage system distinct from but coupled to or accessible by the selection and optimization server 102. The storage device 212 can include one or more non-transitory computer-readable mediums for storing data. In some implementations, the storage device 212 may be incorporated with the memory 204 or may be distinct therefrom. In some implementations, the storage device 212 may store data associated with a relational database management system (RDBMS) operable on the selection and optimization server 102. For example, the RDBMS could include a structured query language (SQL) RDBMS, a NoSQL RDBMS, various combinations thereof, etc. In some instances, the RDBMS may store data in multi-dimensional tables comprised of rows and columns, and manipulate, e.g., insert, query, update, and/or delete, rows of data using programmatic operations. In some implementations, the storage device 212 may store data associated with a Hadoop distributed file system (HDFS) or a cloud-based storage system such as Amazon S3.

[0067] The bus 220 represents a shared bus for communicating information and data throughout the selection and optimization server 102. The bus 220 can include a communication bus for transferring data between components of a computing device or between computing devices, a network bus system including the network 106 or portions thereof, a processor mesh, a combination thereof, etc. In some implementations, the processor 202, memory 204, display module 206, network I/F module 208, input/output device(s) 210, storage device 212, various other components operating on the selection and optimization server 102 (operating systems, device drivers, etc.), and any of the components of the selection and optimization unit 104 may cooperate and communicate via a communication mechanism included in or implemented in association with the bus 220. The software communication mechanism can include and/or facilitate, for example, inter-process communication, local function or procedure calls, remote procedure calls, an object broker (e.g., CORBA), direct socket communication (e.g., TCP/IP sockets) among software modules, UDP broadcasts and receipts, HTTP connections, etc. Further, any or all of the communication could be secure (e.g., SSH, HTTPS, etc.).

[0068] As depicted in Figure 2, the selection and optimization unit 104 may include, and may signal the following to perform their functions: a machine learning method unit 230, a parameter optimization unit 240, a result scoring unit 250, and a data management unit 260. These components 230, 240, 250, 260, and/or components thereof, may be communicatively coupled by the bus 220 and/or the processor 202 to one another and/or to the other components 206, 208, 210, and 212 of the selection and optimization server 102. In some implementations, the components 230, 240, 250, and/or 260 may include computer logic (e.g., software logic, hardware logic, etc.) executable by the processor 202 to provide their acts and/or functionality. In any of the foregoing implementations, these components 230, 240, 250, and/or 260 may be adapted for cooperation and communication with the processor 202 and the other components of the selection and optimization server 102.

[0069] For clarity and convenience, the disclosure will occasionally refer to the following example scenario and system: assume that a user desires to classify e-mails as spam or not spam; also, assume that the data includes e-mails correctly labeled as spam or not spam, the labels ("spam" and "not spam"), and some tuning data; furthermore, assume that the system 100 supports only two machine learning methods - support vector machines (SVM) and gradient boosted machines (GBM); additionally, assume that the user desires the machine learning method and parameter setting that results in the greatest accuracy. However, it should be recognized that this is merely one example, and other examples and implementations may perform different tasks (e.g. rank instead of classify), have different data (e.g. different labels), support a different number of machine learning methods and/or different machine learning methods, etc.
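By way of illustration only, the running example above might look like the following sketch, with scikit-learn's SVC and GradientBoostingClassifier standing in for the system's SVM and GBM implementations, a toy in-memory e-mail set standing in for the labeled data, plain grid search standing in for the model-based tuner described herein, and accuracy used as the measure of fitness per the scenario.

```python
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy data purely for illustration; 1 = spam, 0 = not spam.
emails = ["win money now", "meeting at noon", "cheap pills", "lunch tomorrow?"]
labels = [1, 0, 1, 0]

candidates = {
    "svm": (make_pipeline(TfidfVectorizer(), SVC()),
            {"svc__C": [0.1, 1, 10], "svc__kernel": ["linear", "rbf"]}),
    "gbm": (make_pipeline(TfidfVectorizer(), GradientBoostingClassifier()),
            {"gradientboostingclassifier__learning_rate": [0.05, 0.1],
             "gradientboostingclassifier__n_estimators": [50, 100]}),
}

best_name, best_search = None, None
for name, (pipeline, grid) in candidates.items():
    # Accuracy is the measure of fitness the user asked for in this scenario.
    search = GridSearchCV(pipeline, grid, scoring="accuracy", cv=2)
    search.fit(emails, labels)
    if best_search is None or search.best_score_ > best_search.best_score_:
        best_name, best_search = name, search

# Output the selected machine learning method and its parameter configuration.
print(best_name, best_search.best_params_, best_search.best_score_)
```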


More information

EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06T 15/60 ( )

EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06T 15/60 ( ) (19) (12) EUROPEAN PATENT APPLICATION (11) EP 2 234 069 A1 (43) Date of publication: 29.09.2010 Bulletin 2010/39 (51) Int Cl.: G06T 15/60 (2006.01) (21) Application number: 09364002.7 (22) Date of filing:

More information

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

EP A1 (19) (11) EP A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 153(4) EPC

EP A1 (19) (11) EP A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 153(4) EPC (19) (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 13(4) EPC (11) EP 2 482 24 A1 (43) Date of publication: 01.08.2012 Bulletin 2012/31 (21) Application number: 818282. (22) Date of

More information

30 June 2011 ( ) W / / / / A

30 June 2011 ( ) W / / / / A (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

(12) Patent Application Publication (10) Pub. No.: US 2014/ A1

(12) Patent Application Publication (10) Pub. No.: US 2014/ A1 (19) United States US 20140282538A1 (12) Patent Application Publication (10) Pub. No.: US 2014/0282538A1 ConoVer et al. ( 43) Pub. Date: Sep. 18, 2014 (54) (71) (72) (73) (21) (22) (60) MINIMIZING SCSI

More information

Gesture-Based Controls Via Bone Conduction

Gesture-Based Controls Via Bone Conduction ( 9 of 13 ) United States Patent Application 20150128094 Kind Code A1 Baldwin; Christopher ; et al. May 7, 2015 Gesture-Based Controls Via Bone Conduction Abstract Concepts and technologies are disclosed

More information

FILE SYSTEM 102 DIRECTORY MODULE 104 SELECTION MODULE. Fig. 1

FILE SYSTEM 102 DIRECTORY MODULE 104 SELECTION MODULE. Fig. 1 (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

(CN). PCT/CN20 14/ (81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM,

(CN). PCT/CN20 14/ (81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

Figure 1. (43) International Publication Date WO 2015/ Al 9 July 2015 ( ) W P O P C T. [Continued on nextpage]

Figure 1. (43) International Publication Date WO 2015/ Al 9 July 2015 ( ) W P O P C T. [Continued on nextpage] (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

Lionbridge ondemand for Adobe Experience Manager

Lionbridge ondemand for Adobe Experience Manager Lionbridge ondemand for Adobe Experience Manager Version 1.1.0 Configuration Guide October 24, 2017 Copyright Copyright 2017 Lionbridge Technologies, Inc. All rights reserved. Published in the USA. March,

More information

ica) Inc., 2355 Dulles Corner Boulevard, 7th Floor, before the expiration of the time limit for amending the

ica) Inc., 2355 Dulles Corner Boulevard, 7th Floor, before the expiration of the time limit for amending the (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (43) International Publication Date (10) International

More information

(51) Int Cl.: G06F 21/00 ( ) G11B 20/00 ( ) G06Q 10/00 ( )

(51) Int Cl.: G06F 21/00 ( ) G11B 20/00 ( ) G06Q 10/00 ( ) (19) Europäisches Patentamt European Patent Office Office européen des brevets (12) EUROPEAN PATENT APPLICATION (11) EP 1 724 699 A1 (43) Date of publication: 22.11.2006 Bulletin 2006/47 (21) Application

More information

EP A1 (19) (11) EP A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 153(4) EPC

EP A1 (19) (11) EP A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 153(4) EPC (19) (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 153(4) EPC (11) EP 2 493 239 A1 (43) Date of publication: 29.08.2012 Bulletin 2012/35 (21) Application number: 10829523.9 (22) Date

More information

TEPZZ _Z_56ZA_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06F 17/30 ( )

TEPZZ _Z_56ZA_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06F 17/30 ( ) (19) TEPZZ _Z_6ZA_T (11) EP 3 1 60 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 07.12.16 Bulletin 16/49 (1) Int Cl.: G06F 17/ (06.01) (21) Application number: 16176.9 (22) Date of filing:

More information

Virtual Private Radio via Virtual Private Network - patent application

Virtual Private Radio via Virtual Private Network - patent application From the SelectedWorks of Marc A Sherman February, 2006 Virtual Private Radio via Virtual Private Network - patent application Marc A Sherman Available at: https://works.bepress.com/marc_sherman/2/ UNITED

More information

TEPZZ 78779ZB_T EP B1 (19) (11) EP B1 (12) EUROPEAN PATENT SPECIFICATION

TEPZZ 78779ZB_T EP B1 (19) (11) EP B1 (12) EUROPEAN PATENT SPECIFICATION (19) TEPZZ 78779ZB_T (11) EP 2 787 790 B1 (12) EUROPEAN PATENT SPECIFICATION (4) Date of publication and mention of the grant of the patent: 26.07.17 Bulletin 17/ (21) Application number: 12878644.9 (22)

More information

10 December 2009 ( ) WO 2009/ A2

10 December 2009 ( ) WO 2009/ A2 (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (43) International Publication Date (10) International

More information

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau 1111111111111111 111111 111111111111111 111 111 11111111111111111111

More information

TEPZZ Z7999A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: B05B 15/04 ( )

TEPZZ Z7999A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: B05B 15/04 ( ) (19) TEPZZ Z7999A_T (11) EP 3 7 999 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 23.08.17 Bulletin 17/34 (1) Int Cl.: B0B 1/04 (06.01) (21) Application number: 1617686.1 (22) Date of filing:

More information

TEPZZ _4748 A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION

TEPZZ _4748 A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION (19) TEPZZ _4748 A_T (11) EP 3 147 483 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 29.03.17 Bulletin 17/13 (21) Application number: 161896.0 (1) Int Cl.: F02C 9/28 (06.01) F02C 9/46 (06.01)

More information

(12) Patent Application Publication (10) Pub. No.: US 2008/ A1

(12) Patent Application Publication (10) Pub. No.: US 2008/ A1 (19) United States US 20080244164A1 (12) Patent Application Publication (10) Pub. No.: US 2008/0244164 A1 Chang et al. (43) Pub. Date: Oct. 2, 2008 (54) STORAGE DEVICE EQUIPPED WITH NAND FLASH MEMORY AND

More information

Note: Text based on automatic Optical Character Recognition processes. SAMSUNG GALAXY NOTE

Note: Text based on automatic Optical Character Recognition processes. SAMSUNG GALAXY NOTE Note: Text based on automatic Optical Character Recognition processes. SAMSUNG GALAXY NOTE PRIORITY This application is a Continuation of U.S. application Ser. No. 14/540,447, which was filed in the U.S.

More information

EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: H04L 12/56 ( )

EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: H04L 12/56 ( ) (19) (12) EUROPEAN PATENT APPLICATION (11) EP 1 760 963 A1 (43) Date of publication: 07.03.07 Bulletin 07/ (1) Int Cl.: H04L 12/6 (06.01) (21) Application number: 06018260.7 (22) Date of filing: 31.08.06

More information

DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IS, KE, KG, KM, KN, KP, KR, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.

DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IS, KE, KG, KM, KN, KP, KR, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW. (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (43) International Publication Date (10) International

More information

(12) Patent Application Publication (10) Pub. No.: US 2017/ A1

(12) Patent Application Publication (10) Pub. No.: US 2017/ A1 (19) United States US 20170 126039A1 (12) Patent Application Publication (10) Pub. No.: US 2017/0126039 A1 NGUYEN (43) Pub. Date: (54) BATTERY CHARGER WITH USB TYPE-C (52) U.S. Cl. ADAPTER CPC... H02J

More information

(12) Patent Application Publication (10) Pub. No.: US 2012/ A1

(12) Patent Application Publication (10) Pub. No.: US 2012/ A1 (19) United States US 20120047545A1 (12) Patent Application Publication (10) Pub. No.: US 2012/0047545 A1 SELLERS et al. (43) Pub. Date: Feb. 23, 2012 (54) TOPOGRAPHIC FRAUD DETECTION (52) U.S. Cl....

More information

*EP A2* EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2005/37

*EP A2* EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2005/37 (19) Europäisches Patentamt European Patent Office Office européen des brevets *EP007312A2* (11) EP 1 7 312 A2 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 14.09.0 Bulletin 0/37 (1) Int Cl.

More information

DNSSEC Workshop. Dan York, Internet Society ICANN 53 June 2015

DNSSEC Workshop. Dan York, Internet Society ICANN 53 June 2015 DNSSEC Workshop Dan York, Internet Society ICANN 53 June 2015 First, a word about our host... 2 Program Committee Steve Crocker, Shinkuro, Inc. Mark Elkins, DNS/ZACR Cath Goulding, Nominet Jean Robert

More information

TEPZZ _9 7A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2017/29

TEPZZ _9 7A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2017/29 (19) TEPZZ _9 7A_T (11) EP 3 193 237 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 19.07.2017 Bulletin 2017/29 (1) Int Cl.: G06F 1/32 (2006.01) (21) Application number: 1714829.0 (22) Date

More information

(12) United States Patent (10) Patent No.: US 8,385,897 B1

(12) United States Patent (10) Patent No.: US 8,385,897 B1 US0083.85897 B1 (12) United States Patent (10) Patent No.: Yadav-Ranjan (45) Date of Patent: Feb. 26, 2013 (54) GLOBAL TEXT MESSAGING SYSTEMAND (56) References Cited METHOD U.S. PATENT DOCUMENTS (75) Inventor:

More information

TEPZZ 5976 A T EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G08G 5/00 ( ) H04M 1/725 (2006.

TEPZZ 5976 A T EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G08G 5/00 ( ) H04M 1/725 (2006. (19) TEPZZ 976 A T (11) EP 2 97 633 A2 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 29.0.13 Bulletin 13/22 (1) Int Cl.: G08G /00 (06.01) H04M 1/72 (06.01) (21) Application number: 12193473.1

More information

TEPZZ Z47A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06Q 30/00 ( )

TEPZZ Z47A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: G06Q 30/00 ( ) (19) TEPZZ _ _Z47A_T (11) EP 3 131 047 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 1.02.17 Bulletin 17/07 (1) Int Cl.: G06Q /00 (12.01) (21) Application number: 160297.4 (22) Date of

More information

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1 US 20160364902A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2016/0364902 A1 Hong et al. (43) Pub. Date: (54) HIGH QUALITY EMBEDDED GRAPHICS (52) U.S. Cl. FOR REMOTE VISUALIZATION

More information

TEPZZ 6 8A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION

TEPZZ 6 8A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION (19) TEPZZ 6 8A_T (11) EP 3 121 638 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 2.01.17 Bulletin 17/04 (21) Application number: 1380032.1 (1) Int Cl.: G02B 27/01 (06.01) G06F 11/16 (06.01)

More information

System and method for encoding and decoding data files

System and method for encoding and decoding data files ( 1 of 1 ) United States Patent 7,246,177 Anton, et al. July 17, 2007 System and method for encoding and decoding data files Abstract Distributed compression of a data file can comprise a master server

More information

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT)

(12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (l J w;~:s~:!~:::.:opcrty ~ llllllllllll~~~~~~~~;~~~~~~~~~~~~~~~~.~~~~~!~~~~~llllllllll (43) International Publication

More information

2016 Survey of Internet Carrier Interconnection Agreements

2016 Survey of Internet Carrier Interconnection Agreements 2016 Survey of Internet Carrier Interconnection Agreements Bill Woodcock Marco Frigino Packet Clearing House November 21, 2016 PCH Peering Survey 2011 Five years ago, PCH conducted the first-ever broad

More information

Xying. GoD-12 ACL 1-1. (12) Patent Application Publication (10) Pub. No.: US 2009/ A1. (19) United States SUPPLIER POLICY DRIVER/-108 PLATFORM

Xying. GoD-12 ACL 1-1. (12) Patent Application Publication (10) Pub. No.: US 2009/ A1. (19) United States SUPPLIER POLICY DRIVER/-108 PLATFORM (19) United States US 20090172797A1 (12) Patent Application Publication (10) Pub. No.: US 2009/0172797 A1 Yao et al. (43) Pub. Date: Jul. 2, 2009 (54) METHOD AND SYSTEM FOR SECURING APPLICATION PROGRAMINTERFACES

More information

Scalable networ...a network environment - Google Patents.pdf

Scalable networ...a network environment - Google Patents.pdf University of Maryland at College Park From the SelectedWorks of Puneet Kumar Fall November 10, 2016 Scalable networ...a network environment - Google Patents.pdf Puneet Kumar, University of Maryland at

More information

... (12) Patent Application Publication (10) Pub. No.: US 2003/ A1. (19) United States. icopying unit d:

... (12) Patent Application Publication (10) Pub. No.: US 2003/ A1. (19) United States. icopying unit d: (19) United States US 2003.01.01188A1 (12) Patent Application Publication (10) Pub. No.: US 2003/0101188A1 Teng et al. (43) Pub. Date: May 29, 2003 (54) APPARATUS AND METHOD FOR A NETWORK COPYING SYSTEM

More information

TEPZZ 8Z9Z A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: H04L 12/26 ( )

TEPZZ 8Z9Z A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: H04L 12/26 ( ) (19) TEPZZ 8Z9Z A_T (11) EP 2 809 033 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 03.12.14 Bulletin 14/49 (1) Int Cl.: H04L 12/26 (06.01) (21) Application number: 1417000.4 (22) Date

More information

(43) International Publication Date \ / 0 1 / 1 ' 9 September 2011 ( ) 2 1 VI / A 2

(43) International Publication Date \ / 0 1 / 1 ' 9 September 2011 ( ) 2 1 VI / A 2 (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization International Bureau (10) International Publication Number (43) International

More information

TEPZZ 85 9Z_A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION

TEPZZ 85 9Z_A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION (19) TEPZZ 8 9Z_A_T (11) EP 2 83 901 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 01.04.1 Bulletin 1/14 (21) Application number: 141861.1 (1) Int Cl.: G01P 21/00 (06.01) G01C 2/00 (06.01)

More information

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1. Choi et al. (43) Pub. Date: Apr. 27, 2006

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1. Choi et al. (43) Pub. Date: Apr. 27, 2006 US 20060090088A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2006/0090088 A1 Choi et al. (43) Pub. Date: Apr. 27, 2006 (54) METHOD AND APPARATUS FOR Publication Classification

More information

TEPZZ 8864Z9A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: B60W 30/14 ( ) B60W 50/00 (2006.

TEPZZ 8864Z9A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (51) Int Cl.: B60W 30/14 ( ) B60W 50/00 (2006. (19) TEPZZ 8864Z9A_T (11) EP 2 886 9 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 24.06. Bulletin /26 (1) Int Cl.: B60W /14 (06.01) B60W 0/00 (06.01) (21) Application number: 106043.7

More information

EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2008/32

EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2008/32 (19) (12) EUROPEAN PATENT APPLICATION (11) EP 1 93 663 A1 (43) Date of publication: 06.08.08 Bulletin 08/32 (1) Int Cl.: G06F 21/00 (06.01) G06F 3/023 (06.01) (21) Application number: 07124.4 (22) Date

More information

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1 (19) United States US 2006.0062400A1 (12) Patent Application Publication (10) Pub. No.: Chia-Chun (43) Pub. Date: Mar. 23, 2006 (54) BLUETOOTH HEADSET DEVICE CAPABLE OF PROCESSING BOTH AUDIO AND DIGITAL

More information

*EP A1* EP A1 (19) (11) EP A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art.

*EP A1* EP A1 (19) (11) EP A1. (12) EUROPEAN PATENT APPLICATION published in accordance with Art. (19) Europäisches Patentamt European Patent Office Office européen des brevets *EP00182883A1* (11) EP 1 82 883 A1 (12) EUROPEAN PATENT APPLICATION published in accordance with Art. 18(3) EPC (43) Date

More information

(12) Patent Application Publication (10) Pub. No.: US 2014/ A1

(12) Patent Application Publication (10) Pub. No.: US 2014/ A1 US 20140O82324A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2014/0082324 A1 Elhamias et al. (43) Pub. Date: Mar. 20, 2014 (54) METHOD AND STORAGE DEVICE FOR (52) U.S. Cl.

More information

(12) Patent Application Publication (10) Pub. No.: US 2003/ A1

(12) Patent Application Publication (10) Pub. No.: US 2003/ A1 (19) United States US 2003.01.10403A1 (12) Patent Application Publication (10) Pub. No.: US 2003/0110403 A1 Crutchfield et al. (43) Pub. Date: Jun. 12, 2003 (54) SYSTEM FOR SHARED POWER SUPPLY IN COMPUTER

More information

(12) Patent Application Publication (10) Pub. No.: US 2004/ A1

(12) Patent Application Publication (10) Pub. No.: US 2004/ A1 (19) United States US 2004O231004A1 (12) Patent Application Publication (10) Pub. No.: US 2004/0231004 A1 Seo (43) Pub. Date: (54) HTTP BASED VIDEO STREAMING APPARATUS AND METHOD IN MOBILE COMMUNICATION

More information

SYSTEM AND PROCESS FOR ALTERING MUSICAL OUTPUT FOR AUDIO ENTERTAINMENT BASED ON LOCATION

SYSTEM AND PROCESS FOR ALTERING MUSICAL OUTPUT FOR AUDIO ENTERTAINMENT BASED ON LOCATION SYSTEM AND PROCESS FOR ALTERING MUSICAL OUTPUT FOR AUDIO ENTERTAINMENT BASED ON LOCATION BACKGROUND [001] Embodiments of the invention described in this specification relate generally to audio entertainment

More information

*EP A2* EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2000/33

*EP A2* EP A2 (19) (11) EP A2 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2000/33 (19) Europäisches Patentamt European Patent Office Office européen des brevets *EP002842A2* (11) EP 1 028 42 A2 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 16.08.00 Bulletin 00/33 (1) Int

More information

MAWA Forum State of Play. Cooperation Planning & Support Henk Corporaal MAWA Forum Chair

MAWA Forum State of Play. Cooperation Planning & Support Henk Corporaal MAWA Forum Chair MAWA Forum State of Play Cooperation Planning & Support Henk Corporaal MAWA Forum Chair Content Background MAWA Initiative Achievements and Status to date Future Outlook 2 Background MAWA Initiative The

More information

2016 Survey of Internet Carrier Interconnection Agreements

2016 Survey of Internet Carrier Interconnection Agreements 2016 Survey of Internet Carrier Interconnection Agreements Bill Woodcock Marco Frigino Packet Clearing House February 6, 2017 PCH Peering Survey 2011 Five years ago, PCH conducted the first-ever broad

More information

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1 US 2016O156189A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2016/0156189 A1 Ci (43) Pub. Date: Jun. 2, 2016 (54) CLOUD BASED ENERGY SYSTEM (52) U.S. Cl. CPC. H02J 3/32 (2013.01);

More information

(12) United States Patent (10) Patent No.: US 6,208,340 B1. Amin et al. (45) Date of Patent: Mar. 27, 2001

(12) United States Patent (10) Patent No.: US 6,208,340 B1. Amin et al. (45) Date of Patent: Mar. 27, 2001 USOO620834OB1 (12) United States Patent (10) Patent No.: US 6,208,340 B1 Amin et al. (45) Date of Patent: Mar. 27, 2001 (54) GRAPHICAL USER INTERFACE 5,317,687 5/1994 Torres... 395/159 INCLUDING A DROP-DOWN

More information

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1

(12) Patent Application Publication (10) Pub. No.: US 2006/ A1 (19) United States US 200601 01189A1 (12) Patent Application Publication (10) Pub. No.: US 2006/0101189 A1 Chandrasekaran et al. (43) Pub. Date: (54) SYSTEM AND METHOD FOR HOT (52) U.S. Cl.... 711 f6 CLONING

More information

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1 US 2016035.0099A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2016/035.0099 A1 Suparna et al. (43) Pub. Date: Dec. 1, 2016 (54) APPLICATION DEPLOYMENT TO VIRTUAL Publication

More information

(12) Patent Application Publication (10) Pub. No.: US 2010/ A1

(12) Patent Application Publication (10) Pub. No.: US 2010/ A1 (19) United States US 2010.019 1896A1 (12) Patent Application Publication (10) Pub. No.: US 2010/0191896 A1 Yang et al. (43) Pub. Date: Jul. 29, 2010 (54) SOLID STATE DRIVE CONTROLLER WITH FAST NVRAM BUFFER

More information

(12) United States Patent (10) Patent No.: US 6,650,589 B2

(12) United States Patent (10) Patent No.: US 6,650,589 B2 USOO6650589B2 (12) United States Patent (10) Patent No.: US 6,650,589 B2 Clark (45) Date of Patent: Nov. 18, 2003 (54) LOW VOLTAGE OPERATION OF STATIC 6,205,078 B1 * 3/2001 Merritt... 365/226 RANDOMACCESS

More information

ALTERNATIVE CHARGE CONTROL SYSTEM FOR MERCHANDISE DISPLAY SECURITY SYSTEM

ALTERNATIVE CHARGE CONTROL SYSTEM FOR MERCHANDISE DISPLAY SECURITY SYSTEM Technical Disclosure Commons InVue Defensive Publications Defensive Publications Series August 11, 2017 ALTERNATIVE CHARGE CONTROL SYSTEM FOR MERCHANDISE DISPLAY SECURITY SYSTEM InVue Security Products

More information

(12) Patent Application Publication (10) Pub. No.: US 2004/ A1

(12) Patent Application Publication (10) Pub. No.: US 2004/ A1 (19) United States US 2004O246971A1 (12) Patent Application Publication (10) Pub. No.: US 2004/0246971 A1 Banerjee et al. (43) Pub. Date: Dec. 9, 2004 (54) APPARATUS FOR ENABLING MULTI-TUPLE TCP SOCKETS

More information

TEPZZ 57 7 ZA_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2013/13

TEPZZ 57 7 ZA_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION. (43) Date of publication: Bulletin 2013/13 (19) TEPZZ 57 7 ZA_T (11) EP 2 573 720 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 27.03.2013 Bulletin 2013/13 (51) Int Cl.: G06Q 10/00 (2012.01) (21) Application number: 11182591.5 (22)

More information

TEPZZ 99894ZA_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION

TEPZZ 99894ZA_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION (19) TEPZZ 99894ZA_T (11) EP 2 998 9 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 23.03.16 Bulletin 16/12 (21) Application number: 18973.3 (1) Int Cl.: G07C 9/00 (06.01) B62H /00 (06.01)

More information

(12) Patent Application Publication (10) Pub. No.: US 2008/ A1

(12) Patent Application Publication (10) Pub. No.: US 2008/ A1 (19) United States US 2008.0068375A1 (12) Patent Application Publication (10) Pub. No.: US 2008/0068375 A1 Min et al. (43) Pub. Date: Mar. 20, 2008 (54) METHOD AND SYSTEM FOR EARLY Z (30) Foreign Application

More information

(12) United States Patent

(12) United States Patent US007107617B2 (12) United States Patent Hursey et al. (10) Patent No.: (45) Date of Patent: Sep. 12, 2006 (54) MALWARE SCANNING OF COMPRESSED COMPUTER S (75) Inventors: Nell John Hursey, Hertfordshire

More information

I International Bureau (10) International Publication Number (43) International Publication Date

I International Bureau (10) International Publication Number (43) International Publication Date (12) INTERNATIONAL APPLICATION PUBLISHED UNDER THE PATENT COOPERATION TREATY (PCT) (19) World Intellectual Property Organization I International Bureau (10) International Publication Number (43) International

More information

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1

(12) Patent Application Publication (10) Pub. No.: US 2016/ A1 US 20160261583A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2016/0261583 A1 ZHANG (43) Pub. Date: Sep. 8, 2016 (54) METHOD AND APPARATUS FOR USER Publication Classification

More information

Components of a personal computer

Components of a personal computer Components of a personal computer Computer systems ranging from a controller in a microwave oven to a large supercomputer contain components providing five functions. A typical personal computer has hard,

More information

TEPZZ Z5_748A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION

TEPZZ Z5_748A_T EP A1 (19) (11) EP A1 (12) EUROPEAN PATENT APPLICATION (19) TEPZZ Z_748A_T (11) EP 3 01 748 A1 (12) EUROPEAN PATENT APPLICATION (43) Date of publication: 03.08.16 Bulletin 16/31 (21) Application number: 118.1 (1) Int Cl.: H04L 12/14 (06.01) H04W 48/18 (09.01)

More information

SURVEY ON APPLICATION NUMBERING SYSTEMS

SURVEY ON APPLICATION NUMBERING SYSTEMS Ref.: Examples and IPO practices page: 7..5.0 SURVEY ON APPLICATION NUMBERING SYSTEMS Editorial note by the International Bureau The following survey presents the information on various aspects of application

More information

(12) United States Patent (10) Patent No.: US 6,199,058 B1

(12) United States Patent (10) Patent No.: US 6,199,058 B1 USOO6199058B1 (12) United States Patent (10) Patent No.: US 6,199,058 B1 Wong et al. (45) Date of Patent: Mar. 6, 2001 (54) REPORT SERVER CACHING 5,168,444 12/1992 Cukor et al.... 705/1 5,625,818 4/1997

More information

(12) Patent Application Publication (10) Pub. No.: US 2011/ A1

(12) Patent Application Publication (10) Pub. No.: US 2011/ A1 US 2011 O270691A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2011/0270691 A1 Park (43) Pub. Date: Nov. 3, 2011 (54) METHOD AND SYSTEM FOR PROVIDING Publication Classification

More information

(12) Patent Application Publication (10) Pub. No.: US 2013/ A1. Large et al. (43) Pub. Date: Aug. 8, 2013

(12) Patent Application Publication (10) Pub. No.: US 2013/ A1. Large et al. (43) Pub. Date: Aug. 8, 2013 (19) United States US 201302011 12A1 (12) Patent Application Publication (10) Pub. No.: US 2013/0201112 A1 Large et al. (43) Pub. Date: Aug. 8, 2013 (54) LOW-LATENCY TOUCH-INPUT DEVICE (52) U.S. Cl. USPC...

More information

(12) United States Patent (10) Patent No.: US 6,526,272 B1

(12) United States Patent (10) Patent No.: US 6,526,272 B1 USOO6526272B1 (12) United States Patent (10) Patent No.: Bansal et al. (45) Date of Patent: Feb. 25, 2003 (54) REDUCING CALLING COSTS FOR 6,167,250 A * 12/2000 Rahman et al... 455/408 WIRELESS PHONES USING

More information