DEEP LEARNING IN PYTHON Introduction to deep learning
Imagine you work for a bank. You need to predict how many transactions each customer will make next year.
Example as seen by linear regression
[Table columns: Age, Bank Balance, Number of Transactions, Retirement Status]
Example as seen by linear regression
[Two plots of Predicted Transactions vs. Bank Balance, each with lines for Retired and Not Retired customers: left, a model with no interactions; right, a model with interactions]
Interactions
Neural networks account for interactions really well
Deep learning uses especially powerful neural networks for: text, images, videos, audio, source code
Course structure
First two chapters focus on conceptual knowledge
Debug and tune deep learning models on conventional prediction problems
Lay the foundation for progressing towards modern applications
This will pay off in the third and fourth chapters
Build deep learning models with keras
In [1]: import numpy as np
In [2]: from keras.layers import Dense
In [3]: from keras.models import Sequential
In [4]: predictors = np.loadtxt('predictors_data.csv', delimiter=',')
In [5]: n_cols = predictors.shape[1]
In [6]: model = Sequential()
In [7]: model.add(Dense(100, activation='relu', input_shape=(n_cols,)))
In [8]: model.add(Dense(100, activation='relu'))
In [9]: model.add(Dense(1))
Deep learning models capture interactions
[Diagram: Age, Bank Balance, Retirement Status → Number of Transactions]
Interactions in neural network
[Diagram: input layer (Age, Income, # Accounts) → hidden layer → output layer (Number of Transactions)]
DEEP LEARNING IN PYTHON Let's practice!
DEEP LEARNING IN PYTHON Forward propagation
Bank transactions example
Make predictions based on: Number of children, Number of existing accounts
Forward propagation
[Diagram: inputs # children = 2 and # accounts = 3 feed two hidden nodes; with weights (1, 1) and (-1, 1) the hidden nodes take values 5 and 1, and with output weights (2, -1) the predicted # transactions is 9]
Forward propagation
Multiply-add process (a dot product)
Forward propagation is done for one data point at a time
The output is the prediction for that data point
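The multiply-add process is exactly a dot product. A quick sketch checking this, reusing the example values from the diagram (inputs 2 and 3, hidden-node weights 1 and 1):

```python
import numpy as np

# Example values from the slides: inputs (# children = 2, # accounts = 3)
# and the weights (1, 1) into the top hidden node
input_data = np.array([2, 3])
node_weights = np.array([1, 1])

# Multiply-add process: elementwise product, then sum
multiply_add = (input_data * node_weights).sum()

# The same computation expressed as a dot product
dot_product = np.dot(input_data, node_weights)

print(multiply_add, dot_product)  # both give 5
```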
Forward propagation code
In [1]: import numpy as np
In [2]: input_data = np.array([2, 3])
In [3]: weights = {'node_0': np.array([1, 1]),
   ...:            'node_1': np.array([-1, 1]),
   ...:            'output': np.array([2, -1])}
In [4]: node_0_value = (input_data * weights['node_0']).sum()
In [5]: node_1_value = (input_data * weights['node_1']).sum()
Forward propagation code
In [6]: hidden_layer_values = np.array([node_0_value, node_1_value])
In [7]: print(hidden_layer_values)
[5 1]
In [8]: output = (hidden_layer_values * weights['output']).sum()
In [9]: print(output)
9
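The interactive session above can be wrapped in a reusable function. This is a sketch; `predict_with_network` is a name chosen here for illustration, not one from the slides:

```python
import numpy as np

weights = {'node_0': np.array([1, 1]),
           'node_1': np.array([-1, 1]),
           'output': np.array([2, -1])}

def predict_with_network(input_data, weights):
    """Forward propagation for one data point: multiply-add into each
    hidden node, then multiply-add the hidden values into the output."""
    node_0_value = (input_data * weights['node_0']).sum()
    node_1_value = (input_data * weights['node_1']).sum()
    hidden_layer_values = np.array([node_0_value, node_1_value])
    return (hidden_layer_values * weights['output']).sum()

prediction = predict_with_network(np.array([2, 3]), weights)
print(prediction)  # 9, matching the interactive session
```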
DEEP LEARNING IN PYTHON Let's practice!
DEEP LEARNING IN PYTHON Activation functions
Linear vs. nonlinear functions
[Plots: examples of linear functions and nonlinear functions]
Activation functions Applied to node inputs to produce node output
Improving our neural network
[Diagram: the bank-transactions network with inputs 2 and 3, hidden node values 5 and 1, and output 9]
Activation functions
[Diagram: the same network with tanh applied in the hidden layer; the hidden node values become tanh(2 + 3) and tanh(-2 + 3)]
ReLU (Rectified Linear Activation)
[Plot: the rectifier function]
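The rectifier itself is a one-liner; as a sketch:

```python
import numpy as np

def relu(x):
    """Rectified linear activation: 0 for negative inputs,
    the input itself otherwise."""
    return np.maximum(0, x)

print(relu(-5))  # 0
print(relu(5))   # 5
```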
Activation functions
In [1]: import numpy as np
In [2]: input_data = np.array([-1, 2])
In [3]: weights = {'node_0': np.array([3, 3]),
   ...:            'node_1': np.array([1, 5]),
   ...:            'output': np.array([2, -1])}
In [4]: node_0_input = (input_data * weights['node_0']).sum()
In [5]: node_0_output = np.tanh(node_0_input)
In [6]: node_1_input = (input_data * weights['node_1']).sum()
In [7]: node_1_output = np.tanh(node_1_input)
In [8]: hidden_layer_outputs = np.array([node_0_output, node_1_output])
In [9]: output = (hidden_layer_outputs * weights['output']).sum()
In [10]: print(output)
0.99010953783342
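For comparison, here is an analogous calculation with ReLU in place of tanh (a sketch with illustrative weights):

```python
import numpy as np

def relu(x):
    """Rectified linear activation."""
    return np.maximum(0, x)

input_data = np.array([-1, 2])
weights = {'node_0': np.array([3, 3]),
           'node_1': np.array([1, 5]),
           'output': np.array([2, -1])}

node_0_output = relu((input_data * weights['node_0']).sum())  # relu(3) = 3
node_1_output = relu((input_data * weights['node_1']).sum())  # relu(9) = 9
hidden_layer_outputs = np.array([node_0_output, node_1_output])
output = (hidden_layer_outputs * weights['output']).sum()     # 2*3 - 1*9 = -3
print(output)  # -3
```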
DEEP LEARNING IN PYTHON Let's practice!
DEEP LEARNING IN PYTHON Deeper networks
Multiple hidden layers
[Diagram, built up step by step: an input (age) is propagated through two hidden layers; each node value is calculated with the ReLU activation function, with negative multiply-add results becoming 0, until the final output value of 364 is reached]
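The layer-by-layer calculation sketched in the diagram can be written generically. This is a sketch with made-up illustrative weights, not the values from the slide:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def forward(input_data, layer_weights):
    """Propagate one data point through successive layers, applying the
    ReLU activation after every layer except the output layer."""
    values = input_data
    for i, w in enumerate(layer_weights):
        values = w @ values                    # multiply-add for a whole layer
        if i < len(layer_weights) - 1:         # no activation on the output
            values = relu(values)
    return values

# Illustrative weights: 2 inputs -> 2 hidden nodes -> 2 hidden nodes -> 1 output
layer_weights = [np.array([[1, 1], [-1, 1]]),
                 np.array([[2, 0], [0, 2]]),
                 np.array([[2, -1]])]

result = forward(np.array([2, 3]), layer_weights)
print(result)  # [18]
```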
Representation learning Deep networks internally build representations of patterns in the data Partially replace the need for feature engineering Subsequent layers build increasingly sophisticated representations of raw data
Deep learning
Modeler doesn't need to specify the interactions
When you train the model, the neural network gets weights that find the relevant patterns to make better predictions
DEEP LEARNING IN PYTHON Let's practice!