Neural networks and deep learning are the buzzwords lately. Machine learning has been in vogue for some time, but the easy availability of storage and processing power has made it popular. The interest is palpable in business schools as well. ML techniques have not percolated much from IT departments to the business side, but everybody seems to be interested. So, let us build a neural network model in 10 minutes.
This is the scenario:
You have a collection of independent variables (IVs) that predict a dependent variable (DV). You have a theoretical model and want to know if it is good enough. Remember, we are not testing the model; we are just checking how good the IVs are at predicting the DV. If they are not good predictors to start with, why waste time conjuring a fancy model? Sounds familiar? Let’s get started.
The first step is to import a few modules. If you don’t know what these are, just copy, paste and move on. Consider them a header that you require.
# Modules
import sys

import numpy
from imblearn.over_sampling import RandomOverSampler
from keras.layers import Dense
from keras.models import Sequential
from pandas import read_csv
Create a CSV file with your data, with the last column as your DV. Now import that file.
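To make the expected layout concrete, here is a hypothetical CSV (the column names and values are invented for illustration): the IV columns come first and the binary DV sits in the last column. A quick sketch of reading it with pandas:

```python
from io import StringIO

from pandas import read_csv

# Hypothetical contents of mydata.csv: three IVs, then the binary DV
csv_text = """age,income,visits,churned
34,52000,3,0
51,38000,1,1
29,61000,5,0
45,47000,2,1
"""

dataset = read_csv(StringIO(csv_text), header=0)
(nrows, ncols) = dataset.shape
print(nrows, ncols)  # 4 rows, 4 columns
```

The real script reads the file path from the command line instead of a string buffer, but the shape unpacking works the same way.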
# Import data
dataset = read_csv(sys.argv[1], header=0)
(nrows, ncols) = dataset.shape
nrows and ncols are the numbers of rows and columns. Note that sys.argv[1] is the file name passed on the command line, and header=0 tells pandas that the first row holds the column names. Now separate the DV (y) from the IVs (X) as below.
# Separate DV from IVs
values = dataset.values
X = values[:, 0:ncols-1]
y = values[:, ncols-1]
In most cases, you will be trying to predict a rare event. So add some oversampling for taste 🙂
# Oversampling
ros = RandomOverSampler(random_state=0)
X_R, y_R = ros.fit_resample(X, y)  # fit_sample in older imbalanced-learn versions
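Under the hood, random oversampling simply duplicates rows of the minority class (sampled with replacement) until the two classes are balanced. A minimal numpy sketch of the same idea, using invented toy labels:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced data: 6 negatives, 2 positives (invented for illustration)
X = np.arange(16).reshape(8, 2)
y = np.array([0, 0, 0, 0, 0, 0, 1, 1])

# Duplicate minority-class rows (with replacement) until classes balance
minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=np.sum(y == 0) - minority.size, replace=True)
X_R = np.vstack([X, X[extra]])
y_R = np.concatenate([y, y[extra]])

print(np.bincount(y_R))  # both classes now have 6 samples
```

RandomOverSampler does essentially this, with random_state playing the role of the seeded generator above.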
Create, compile and fit the model.
# Create model
model = Sequential()
model.add(Dense(12, input_dim=ncols-1, kernel_initializer='uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X_R, y_R, epochs=150, batch_size=10, verbose=2)
The three model.add statements represent the three layers of the neural network. The first number passed to Dense is the number of neurons in that layer; input_dim is the number of IVs (ncols-1). You can play with these values a bit, and these settings should work in most business cases. Read this for more information.
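A quick sanity check on the architecture: a Dense layer with n_in inputs and n_out neurons has (n_in + 1) * n_out trainable parameters (weights plus one bias per neuron). Assuming a hypothetical 8 input features, the three layers above work out as:

```python
def dense_params(n_in, n_out):
    # weight matrix (n_in * n_out) plus one bias per output neuron
    return (n_in + 1) * n_out

n_features = 8  # hypothetical; in the script this is ncols - 1

layers = [(n_features, 12), (12, 8), (8, 1)]
total = sum(dense_params(i, o) for i, o in layers)
print(total)  # 108 + 104 + 9 = 221
```

This matches the per-layer counts Keras reports via model.summary(), which is handy when you resize the layers.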
Now evaluate the model.
# Evaluate the model (evaluate returns [loss, accuracy], so index into both lists)
scores = model.evaluate(X_R, y_R)
print("\n")
print("\n Accuracy of the model")
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1] * 100))
print("\n --------------------------------------------------")
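Since the output layer is a single sigmoid neuron, model.predict returns probabilities between 0 and 1; to turn them into class labels you threshold them, conventionally at 0.5. A numpy sketch with made-up probabilities standing in for the model's output:

```python
import numpy as np

# Hypothetical probabilities, shaped like the output of model.predict(X_R)
probs = np.array([[0.91], [0.12], [0.47], [0.63]])

# Threshold at 0.5 to obtain hard 0/1 class labels
labels = (probs > 0.5).astype(int).ravel()
print(labels)  # [1 0 0 1]
```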
Put this code in a file (say, nnet.py) and run it as below.
python nnet.py mydata.csv
Or just use QRMine; nnet.py is included there.
Operationalizing Neural Network models
Shortly, I will show you how to operationalize a model using Flask.