Package 'elmNNRcpp'. July 21, 2018


Type Package
Title The Extreme Learning Machine Algorithm
Version 1.0.1
Date 2018-07-21
URL https://github.com/mlampros/elmNNRcpp
BugReports https://github.com/mlampros/elmNNRcpp/issues
Description Training and predict functions for Single Hidden-layer Feedforward Neural Networks (SLFN) using the Extreme Learning Machine (ELM) algorithm. The ELM algorithm differs from the traditional gradient-based algorithms in its very short training time: it requires no iterative tuning, and there are no parameters such as learning rate, momentum or number of epochs to set. This is a reimplementation of the 'elmNN' package using 'RcppArmadillo' after the 'elmNN' package was archived. For more information, see "Extreme learning machine: Theory and applications" by Guang-Bin Huang, Qin-Yu Zhu, Chee-Kheong Siew (2006), Elsevier B.V., <doi:10.1016/j.neucom.2005.12.126>.
License GPL (>= 2)
Encoding UTF-8
LazyData true
Depends R (>= 3.0.2), KernelKnn
Imports Rcpp (>= 0.12.17)
LinkingTo Rcpp, RcppArmadillo (>= 0.8)
Suggests testthat, covr, knitr, rmarkdown
VignetteBuilder knitr
RoxygenNote 6.0.1
NeedsCompilation yes
Author Lampros Mouselimis [aut, cre], Alberto Gosso [aut]
Maintainer Lampros Mouselimis <mouselimislampros@gmail.com>
Repository CRAN
Date/Publication 2018-07-21 20:00:03 UTC
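The one-shot training the Description refers to can be sketched in a few lines of base R: draw random input weights, compute the hidden-layer activations, then solve for the output weights with a single Moore-Penrose pseudo-inverse. This is an illustrative sketch of the ELM idea only, not the package's RcppArmadillo implementation; the helper names elm_fit_sketch and elm_predict_sketch are hypothetical.

```r
# Illustrative ELM sketch in base R -- NOT the elmNNRcpp implementation.
# elm_fit_sketch / elm_predict_sketch are hypothetical helper names.
elm_fit_sketch = function(x, y, nhid, seed = 1) {
  set.seed(seed)
  # random input weights, drawn once and never tuned
  w = matrix(runif(ncol(x) * nhid, min = -1, max = 1), ncol = nhid)
  h = 1 / (1 + exp(-(x %*% w)))        # hidden-layer activations (sigmoid)
  beta = MASS::ginv(h) %*% y           # single Moore-Penrose solve for output weights
  list(w = w, beta = beta)
}

elm_predict_sketch = function(fit, newdata) {
  h = 1 / (1 + exp(-(newdata %*% fit$w)))
  h %*% fit$beta
}

# Fit a noiseless linear target; predictions should track y closely
set.seed(42)
x = matrix(rnorm(200), ncol = 2)
y = matrix(x %*% c(2, -1), ncol = 1)
fit = elm_fit_sketch(x, y, nhid = 50)
pred = elm_predict_sketch(fit, x)
```

Because the only fitted quantity is the least-squares solution for beta, there is no learning rate, momentum or epoch count to choose, which is what makes ELM training fast.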

R topics documented:
    elm_predict
    elm_train
    onehot_encode
    Index

elm_predict             Extreme Learning Machine predict function

Description

    Extreme Learning Machine predict function

Usage

    elm_predict(elm_train_object, newdata, normalize = FALSE)

Arguments

    elm_train_object    the output of the elm_train function
    newdata             an input matrix with the same number of columns as the x parameter of the elm_train function
    normalize           a boolean specifying whether the output predictions should be normalized in case of classification. If TRUE, the values of each row of the output probability matrix that are less than 0 or greater than 1 are pushed into the [0, 1] range

Examples

    library(elmNNRcpp)

    # Regression

    data(Boston, package = 'KernelKnn')
    Boston = as.matrix(Boston)
    dimnames(Boston) = NULL

    x = Boston[, -ncol(Boston)]
    y = matrix(Boston[, ncol(Boston)], nrow = nrow(Boston), ncol = 1)

    out_regr = elm_train(x, y, nhid = 20, actfun = 'purelin', init_weights = 'uniform_negative')
    pr_regr = elm_predict(out_regr, x)
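The normalize behaviour described above amounts to pushing out-of-range row values back into [0, 1]. A rough base-R sketch of that clipping (illustrative only, not the package's internal code; clip01 is a hypothetical helper name):

```r
# Illustrative clipping sketch -- not elmNNRcpp's internal code.
clip01 = function(p) pmin(pmax(p, 0), 1)   # hypothetical helper

p = matrix(c(-0.2, 0.7, 1.3,
              0.1, 0.4, 0.5), nrow = 2, byrow = TRUE)
p_norm = clip01(p)   # values below 0 become 0, values above 1 become 1
```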

    # Classification

    data(ionosphere, package = 'KernelKnn')

    x_class = ionosphere[, -c(2, ncol(ionosphere))]
    x_class = as.matrix(x_class)
    dimnames(x_class) = NULL

    y_class = as.numeric(ionosphere[, ncol(ionosphere)])
    y_class_onehot = onehot_encode(y_class - 1)     # class labels should begin from 0

    out_class = elm_train(x_class, y_class_onehot, nhid = 20, actfun = 'relu')
    pr_class = elm_predict(out_class, x_class, normalize = TRUE)

elm_train               Extreme Learning Machine training function

Description

    Extreme Learning Machine training function

Usage

    elm_train(x, y, nhid, actfun, init_weights = "normal_gaussian",
              bias = FALSE, moorep_pseudoinv_tol = 0.01, leaky_relu_alpha = 0,
              seed = 1, verbose = FALSE)

Arguments

    x           a matrix. The columns of the input matrix should be of type numeric
    y           a matrix. In case of regression the matrix should have n rows and 1 column. In case of classification it should consist of n rows and m columns, where m > 1 equals the number of unique labels
    nhid        a numeric value specifying the number of hidden neurons. Must be >= 1
    actfun      a character string specifying the type of activation function. It should be one of the following: 'sig' (sigmoid), 'sin' (sine), 'radbas' (radial basis), 'hardlim' (hard-limit), 'hardlims' (symmetric hard-limit), 'satlins' (saturating linear), 'tansig' (tan-sigmoid), 'tribas' (triangular basis), 'relu' (rectified linear unit) or 'purelin' (linear)

    init_weights
                a character string specifying the distribution from which the input weights and the bias should be initialized. It should be one of the following: 'normal_gaussian' (in the range [0,1]), 'uniform_positive' (in the range [0,1]) or 'uniform_negative' (in the range [-1,1])
    bias        either TRUE or FALSE. If TRUE then bias weights will be added to the hidden layer
    moorep_pseudoinv_tol
                a numeric value. See the reference web-links for more details on the Moore-Penrose pseudo-inverse and specifically on the pseudo-inverse tolerance value
    leaky_relu_alpha
                a numeric value between 0.0 and 1.0. If 0.0 then a simple relu ( f(x) = 0 for x < 0, f(x) = x for x >= 0 ) activation function is used, otherwise a leaky-relu ( f(x) = alpha * x for x < 0, f(x) = x for x >= 0 ). It is applicable only if actfun equals 'relu'
    seed        a numeric value specifying the random seed. Defaults to 1
    verbose     a boolean. If TRUE then information will be printed in the console

Details

    The input matrix should be of type numeric. This means the user should convert any character, factor or boolean columns to numeric values before using the elm_train function.

References

    http://arma.sourceforge.net/docs.html
    https://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_inverse
    https://www.kaggle.com/robertbm/extreme-learning-machine-example
    http://rt.dgyblog.com/ml/ml-elm.html

Examples

    library(elmNNRcpp)

    # Regression

    data(Boston, package = 'KernelKnn')
    Boston = as.matrix(Boston)
    dimnames(Boston) = NULL

    x = Boston[, -ncol(Boston)]
    y = matrix(Boston[, ncol(Boston)], nrow = nrow(Boston), ncol = 1)

    out_regr = elm_train(x, y, nhid = 20, actfun = 'purelin', init_weights = 'uniform_negative')

    # Classification

    data(ionosphere, package = 'KernelKnn')

    x_class = ionosphere[, -c(2, ncol(ionosphere))]
    x_class = as.matrix(x_class)
    dimnames(x_class) = NULL

    y_class = as.numeric(ionosphere[, ncol(ionosphere)])
    y_class_onehot = onehot_encode(y_class - 1)     # class labels should begin from 0

    out_class = elm_train(x_class, y_class_onehot, nhid = 20, actfun = 'relu')

onehot_encode           One-hot-encoding of the labels in case of classification

Description

    One-hot-encoding of the labels in case of classification

Usage

    onehot_encode(y)

Arguments

    y           a numeric vector consisting of the response variable labels. The minimum value of the unique labels should begin from 0

Examples

    library(elmNNRcpp)

    y = sample(0:3, 100, replace = TRUE)
    y_expand = onehot_encode(y)
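A rough base-R model of what the one-hot expansion produces (illustrative only; onehot_encode itself is implemented in the package's compiled code, and onehot_sketch is a hypothetical helper name):

```r
# Illustrative one-hot sketch -- not the package's onehot_encode implementation.
onehot_sketch = function(y) {
  stopifnot(min(y) == 0)                   # labels must begin from 0, as documented
  diag(max(y) + 1)[y + 1, , drop = FALSE]  # row i is the indicator row for label y[i]
}

y = c(0, 2, 1, 0)
m = onehot_sketch(y)   # a 4 x 3 matrix with a single 1 per row
```

Each row contains exactly one 1, in the column corresponding to that observation's label, which matches the y matrix shape elm_train expects for classification.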

Index

    elm_predict
    elm_train
    onehot_encode