Cardiovascular diseases (CVDs) are the leading cause of death globally, taking an estimated 17.9 million lives each year and accounting for 31% of all deaths worldwide. Heart failure is a common consequence of CVDs, and this dataset contains 12 features that can be used to predict mortality from heart failure. Most cardiovascular diseases can be prevented by addressing behavioural risk factors such as tobacco use, unhealthy diet, obesity, physical inactivity and harmful use of alcohol through population-wide strategies. People who already have cardiovascular disease, or who are at high cardiovascular risk (due to one or more risk factors such as hypertension, diabetes, hyperlipidaemia or already-established disease), need early detection and management, and this is where a machine learning model can be of great help.
import warnings
warnings.filterwarnings('ignore')
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn import preprocessing
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn import svm
from keras.layers import Dense, Dropout
from keras.models import Sequential
from keras import callbacks
from sklearn.metrics import precision_score, recall_score, confusion_matrix, classification_report, accuracy_score, f1_score
Heart_Failure = pd.read_csv('heart_failure.csv')
Heart_Failure.head()
| | age | anaemia | creatinine_phosphokinase | diabetes | ejection_fraction | high_blood_pressure | platelets | serum_creatinine | serum_sodium | sex | smoking | time | DEATH_EVENT |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 75.0 | 0 | 582 | 0 | 20 | 1 | 265000.00 | 1.9 | 130 | 1 | 0 | 4 | 1 |
1 | 55.0 | 0 | 7861 | 0 | 38 | 0 | 263358.03 | 1.1 | 136 | 1 | 0 | 6 | 1 |
2 | 65.0 | 0 | 146 | 0 | 20 | 0 | 162000.00 | 1.3 | 129 | 1 | 1 | 7 | 1 |
3 | 50.0 | 1 | 111 | 0 | 20 | 0 | 210000.00 | 1.9 | 137 | 1 | 0 | 7 | 1 |
4 | 65.0 | 1 | 160 | 1 | 20 | 0 | 327000.00 | 2.7 | 116 | 0 | 0 | 8 | 1 |
Heart_Failure.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 299 entries, 0 to 298
Data columns (total 13 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 age 299 non-null float64
1 anaemia 299 non-null int64
2 creatinine_phosphokinase 299 non-null int64
3 diabetes 299 non-null int64
4 ejection_fraction 299 non-null int64
5 high_blood_pressure 299 non-null int64
6 platelets 299 non-null float64
7 serum_creatinine 299 non-null float64
8 serum_sodium 299 non-null int64
9 sex 299 non-null int64
10 smoking 299 non-null int64
11 time 299 non-null int64
12 DEATH_EVENT 299 non-null int64
dtypes: float64(3), int64(10)
memory usage: 30.5 KB
cols= ["#CD5C5C","#FF0000"]
ax = sns.countplot(x= Heart_Failure["DEATH_EVENT"], palette= cols)
ax.bar_label(ax.containers[0])
[Text(0, 0, '203'), Text(0, 0, '96')]
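The counts above show a roughly 2:1 class imbalance (203 survivors vs. 96 deaths), which is worth keeping in mind when reading accuracy figures later. A quick sketch of computing that ratio, using a hypothetical series mirroring the counts shown above since the raw CSV is not reproduced here:

```python
import pandas as pd

# Hypothetical series mirroring the counts shown above (203 survivors, 96 deaths)
death_event = pd.Series([0] * 203 + [1] * 96, name="DEATH_EVENT")

counts = death_event.value_counts()
print(counts.to_dict())                 # {0: 203, 1: 96}
print(round(counts[0] / counts[1], 2))  # 2.11
```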
Heart_Failure.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
age | 299.0 | 60.833893 | 11.894809 | 40.0 | 51.0 | 60.0 | 70.0 | 95.0 |
anaemia | 299.0 | 0.431438 | 0.496107 | 0.0 | 0.0 | 0.0 | 1.0 | 1.0 |
creatinine_phosphokinase | 299.0 | 581.839465 | 970.287881 | 23.0 | 116.5 | 250.0 | 582.0 | 7861.0 |
diabetes | 299.0 | 0.418060 | 0.494067 | 0.0 | 0.0 | 0.0 | 1.0 | 1.0 |
ejection_fraction | 299.0 | 38.083612 | 11.834841 | 14.0 | 30.0 | 38.0 | 45.0 | 80.0 |
high_blood_pressure | 299.0 | 0.351171 | 0.478136 | 0.0 | 0.0 | 0.0 | 1.0 | 1.0 |
platelets | 299.0 | 263358.029264 | 97804.236869 | 25100.0 | 212500.0 | 262000.0 | 303500.0 | 850000.0 |
serum_creatinine | 299.0 | 1.393880 | 1.034510 | 0.5 | 0.9 | 1.1 | 1.4 | 9.4 |
serum_sodium | 299.0 | 136.625418 | 4.412477 | 113.0 | 134.0 | 137.0 | 140.0 | 148.0 |
sex | 299.0 | 0.648829 | 0.478136 | 0.0 | 0.0 | 1.0 | 1.0 | 1.0 |
smoking | 299.0 | 0.321070 | 0.467670 | 0.0 | 0.0 | 0.0 | 1.0 | 1.0 |
time | 299.0 | 130.260870 | 77.614208 | 4.0 | 73.0 | 115.0 | 203.0 | 285.0 |
DEATH_EVENT | 299.0 | 0.321070 | 0.467670 | 0.0 | 0.0 | 0.0 | 1.0 | 1.0 |
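The summary above shows heavily skewed features: creatinine_phosphokinase, for instance, has a mean of about 582 but a max of 7861. A standard Tukey fence check, computed from the quartiles reported above, flags such values as outliers:

```python
# Quartiles for creatinine_phosphokinase, taken from the describe() output above
q1, q3 = 116.5, 582.0
iqr = q3 - q1
upper_fence = q3 + 1.5 * iqr
print(upper_fence)  # 1280.25; the observed max of 7861 lies far beyond this fence
```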
cmap = sns.diverging_palette(2, 165, s=80, l=55, n=9)
corrmat = Heart_Failure.corr()
plt.subplots(figsize=(20,20))
sns.heatmap(corrmat,cmap= cmap,annot=True, square=True)
<AxesSubplot:>
“time” (the follow-up period) is the most important feature, as the inverse relationship with the target shows: being diagnosed early and treated in time reduces the chance of a fatal outcome.
“serum_creatinine” is the next most important feature: elevated serum creatinine is a marker of impaired kidney function, which is strongly associated with worse outcomes in heart failure patients.
“ejection_fraction” also has a significant influence on the target variable, which is expected, since it essentially measures the pumping efficiency of the heart.
The pattern for “age” is consistent with the heart's functioning declining with ageing.
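A convenient way to read these feature-target relationships off the heatmap is to extract the DEATH_EVENT column of the correlation matrix and sort it. A minimal sketch on a toy frame (in the notebook itself, Heart_Failure would be used in place of df):

```python
import pandas as pd

# Toy frame illustrating the idea; Heart_Failure would be used in the notebook
df = pd.DataFrame({
    "time":             [4, 6, 7, 200, 250, 280],
    "serum_creatinine": [1.9, 1.1, 1.3, 0.9, 1.0, 0.8],
    "DEATH_EVENT":      [1, 1, 1, 0, 0, 0],
})
corr = df.corr()["DEATH_EVENT"].drop("DEATH_EVENT").sort_values()
print(corr)  # time comes out strongly negative, serum_creatinine positive
```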
plt.figure(figsize=(15,10))
age_plot = sns.countplot(x=Heart_Failure['age'], data=Heart_Failure, hue="DEATH_EVENT", palette=cols)
age_plot.set_title("Distribution Of Age", color="#774571")
Text(0.5, 1.0, 'Distribution Of Age')
feature = ["age","creatinine_phosphokinase","ejection_fraction","platelets","serum_creatinine","serum_sodium", "time"]
for i in feature:
    plt.figure(figsize=(10,7))
    sns.swarmplot(x=Heart_Failure["DEATH_EVENT"], y=Heart_Failure[i], color="black", alpha=0.7)
    sns.boxenplot(x=Heart_Failure["DEATH_EVENT"], y=Heart_Failure[i], palette=cols)
    plt.show()
sns.kdeplot(x=Heart_Failure["time"], y=Heart_Failure["age"], hue =Heart_Failure["DEATH_EVENT"], palette=cols)
<AxesSubplot:xlabel='time', ylabel='age'>
X=Heart_Failure.drop(["DEATH_EVENT"],axis=1)
y=Heart_Failure["DEATH_EVENT"]
col_names = list(X.columns)
s_scaler = preprocessing.StandardScaler()
X_scaled= s_scaler.fit_transform(X)
X_scaled = pd.DataFrame(X_scaled, columns=col_names)
X_scaled.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
age | 299.0 | 5.703353e-16 | 1.001676 | -1.754448 | -0.828124 | -0.070223 | 0.771889 | 2.877170 |
anaemia | 299.0 | 1.009969e-16 | 1.001676 | -0.871105 | -0.871105 | -0.871105 | 1.147968 | 1.147968 |
creatinine_phosphokinase | 299.0 | 0.000000e+00 | 1.001676 | -0.576918 | -0.480393 | -0.342574 | 0.000166 | 7.514640 |
diabetes | 299.0 | 9.060014e-17 | 1.001676 | -0.847579 | -0.847579 | -0.847579 | 1.179830 | 1.179830 |
ejection_fraction | 299.0 | -3.267546e-17 | 1.001676 | -2.038387 | -0.684180 | -0.007077 | 0.585389 | 3.547716 |
high_blood_pressure | 299.0 | 0.000000e+00 | 1.001676 | -0.735688 | -0.735688 | -0.735688 | 1.359272 | 1.359272 |
platelets | 299.0 | 7.723291e-17 | 1.001676 | -2.440155 | -0.520870 | -0.013908 | 0.411120 | 6.008180 |
serum_creatinine | 299.0 | 1.425838e-16 | 1.001676 | -0.865509 | -0.478205 | -0.284552 | 0.005926 | 7.752020 |
serum_sodium | 299.0 | -8.673849e-16 | 1.001676 | -5.363206 | -0.595996 | 0.085034 | 0.766064 | 2.582144 |
sex | 299.0 | -8.911489e-18 | 1.001676 | -1.359272 | -1.359272 | 0.735688 | 0.735688 | 0.735688 |
smoking | 299.0 | -1.188199e-17 | 1.001676 | -0.687682 | -0.687682 | -0.687682 | 1.454161 | 1.454161 |
time | 299.0 | -1.901118e-16 | 1.001676 | -1.629502 | -0.739000 | -0.196954 | 0.938759 | 1.997038 |
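Note that describe() reports a std of 1.001676 rather than exactly 1. StandardScaler normalizes with the population std (ddof=0), while pandas' describe() reports the sample std (ddof=1), and sqrt(299/298) ≈ 1.001676. A small check on synthetic data with the same row count:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(299, 3))  # same number of rows as the dataset
X_scaled = StandardScaler().fit_transform(X)

print(np.allclose(X_scaled.mean(axis=0), 0))         # True
print(np.allclose(X_scaled.std(axis=0, ddof=0), 1))  # True (population std)
print(round(X_scaled.std(axis=0, ddof=1)[0], 6))     # 1.001676, matching describe()
```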
colors =["#CD5C5C","#F08080","#FA8072","#E9967A","#FFA07A"]
plt.figure(figsize=(20,10))
sns.boxenplot(data = X_scaled,palette = colors)
plt.xticks(rotation=60)
plt.show()
X_train, X_test, y_train,y_test = train_test_split(X_scaled,y,test_size=0.30,random_state=25)
model1 = svm.SVC()
model1.fit(X_train, y_train)
y_pred = model1.predict(X_test)
model1.score(X_test, y_test)
0.7888888888888889
print(classification_report(y_test, y_pred))
precision recall f1-score support
0 0.84 0.85 0.84 60
1 0.69 0.67 0.68 30
accuracy 0.79 90
macro avg 0.76 0.76 0.76 90
weighted avg 0.79 0.79 0.79 90
cmap1 = sns.diverging_palette(2, 165, s=80, l=55, n=9)
plt.subplots(figsize=(10,7))
cf_matrix = confusion_matrix(y_test, y_pred)
sns.heatmap(cf_matrix/np.sum(cf_matrix), cmap = cmap1, annot = True, annot_kws = {'size':25})
<AxesSubplot:>
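The SVC above runs with default hyperparameters (C=1.0, RBF kernel). A sketch of tuning C and gamma with cross-validated grid search, shown here on synthetic stand-in data; in the notebook, X_train and y_train would be passed to fit instead:

```python
from sklearn import svm
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the scaled training data
X, y = make_classification(n_samples=200, n_features=12, random_state=25)

param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}
grid = GridSearchCV(svm.SVC(), param_grid, cv=5)
grid.fit(X, y)
print(grid.best_params_)          # best C/gamma combination found
print(round(grid.best_score_, 3)) # mean cross-validated accuracy of that combination
```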
early_stopping = callbacks.EarlyStopping(
    min_delta=0.001,           # minimum amount of change to count as an improvement
    patience=20,               # how many epochs to wait before stopping
    restore_best_weights=True)
# Initialising the NN
model = Sequential()
# layers
model.add(Dense(units = 16, kernel_initializer = 'uniform', activation = 'relu', input_dim = 12))
model.add(Dense(units = 8, kernel_initializer = 'uniform', activation = 'relu'))
model.add(Dropout(0.25))
model.add(Dense(units = 8, kernel_initializer = 'uniform', activation = 'relu'))
model.add(Dropout(0.5))
model.add(Dense(units = 1, kernel_initializer = 'uniform', activation = 'sigmoid'))
# Compiling the ANN
model.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
# Train the ANN
history = model.fit(X_train, y_train, batch_size = 25, epochs = 80,callbacks=[early_stopping], validation_split=0.25)
Epoch 1/80
7/7 [==============================] - 1s 33ms/step - loss: 0.6928 - accuracy: 0.6410 - val_loss: 0.6908 - val_accuracy: 0.8302
Epoch 2/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6918 - accuracy: 0.6346 - val_loss: 0.6885 - val_accuracy: 0.8302
Epoch 3/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6909 - accuracy: 0.6346 - val_loss: 0.6863 - val_accuracy: 0.8302
Epoch 4/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6901 - accuracy: 0.6346 - val_loss: 0.6843 - val_accuracy: 0.8302
Epoch 5/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6892 - accuracy: 0.6346 - val_loss: 0.6823 - val_accuracy: 0.8302
Epoch 6/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6884 - accuracy: 0.6346 - val_loss: 0.6800 - val_accuracy: 0.8302
Epoch 7/80
7/7 [==============================] - 0s 8ms/step - loss: 0.6873 - accuracy: 0.6346 - val_loss: 0.6772 - val_accuracy: 0.8302
Epoch 8/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6860 - accuracy: 0.6346 - val_loss: 0.6742 - val_accuracy: 0.8302
Epoch 9/80
7/7 [==============================] - 0s 8ms/step - loss: 0.6845 - accuracy: 0.6346 - val_loss: 0.6712 - val_accuracy: 0.8302
Epoch 10/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6823 - accuracy: 0.6346 - val_loss: 0.6670 - val_accuracy: 0.8302
Epoch 11/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6805 - accuracy: 0.6346 - val_loss: 0.6616 - val_accuracy: 0.8302
Epoch 12/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6760 - accuracy: 0.6346 - val_loss: 0.6533 - val_accuracy: 0.8302
Epoch 13/80
7/7 [==============================] - 0s 8ms/step - loss: 0.6710 - accuracy: 0.6346 - val_loss: 0.6422 - val_accuracy: 0.8302
Epoch 14/80
7/7 [==============================] - 0s 9ms/step - loss: 0.6619 - accuracy: 0.6346 - val_loss: 0.6277 - val_accuracy: 0.8302
Epoch 15/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6482 - accuracy: 0.6346 - val_loss: 0.6073 - val_accuracy: 0.8302
Epoch 16/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6352 - accuracy: 0.6346 - val_loss: 0.5811 - val_accuracy: 0.8302
Epoch 17/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6156 - accuracy: 0.6346 - val_loss: 0.5487 - val_accuracy: 0.8302
Epoch 18/80
7/7 [==============================] - 0s 7ms/step - loss: 0.6024 - accuracy: 0.6346 - val_loss: 0.5128 - val_accuracy: 0.8302
Epoch 19/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5714 - accuracy: 0.6346 - val_loss: 0.4776 - val_accuracy: 0.8302
Epoch 20/80
7/7 [==============================] - 0s 8ms/step - loss: 0.5533 - accuracy: 0.6346 - val_loss: 0.4429 - val_accuracy: 0.8302
Epoch 21/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5448 - accuracy: 0.6346 - val_loss: 0.4079 - val_accuracy: 0.8302
Epoch 22/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5098 - accuracy: 0.6346 - val_loss: 0.3860 - val_accuracy: 0.8302
Epoch 23/80
7/7 [==============================] - 0s 6ms/step - loss: 0.5223 - accuracy: 0.6346 - val_loss: 0.3647 - val_accuracy: 0.8302
Epoch 24/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5233 - accuracy: 0.6346 - val_loss: 0.3509 - val_accuracy: 0.8302
Epoch 25/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5034 - accuracy: 0.6346 - val_loss: 0.3438 - val_accuracy: 0.8302
Epoch 26/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4793 - accuracy: 0.6346 - val_loss: 0.3324 - val_accuracy: 0.8302
Epoch 27/80
7/7 [==============================] - 0s 8ms/step - loss: 0.5073 - accuracy: 0.6346 - val_loss: 0.3219 - val_accuracy: 0.8302
Epoch 28/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4910 - accuracy: 0.6346 - val_loss: 0.3147 - val_accuracy: 0.8302
Epoch 29/80
7/7 [==============================] - 0s 8ms/step - loss: 0.5124 - accuracy: 0.6346 - val_loss: 0.3113 - val_accuracy: 0.8302
Epoch 30/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5133 - accuracy: 0.6346 - val_loss: 0.3111 - val_accuracy: 0.8302
Epoch 31/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4766 - accuracy: 0.6346 - val_loss: 0.3088 - val_accuracy: 0.8302
Epoch 32/80
7/7 [==============================] - 0s 9ms/step - loss: 0.4975 - accuracy: 0.6346 - val_loss: 0.3080 - val_accuracy: 0.8302
Epoch 33/80
7/7 [==============================] - 0s 7ms/step - loss: 0.5191 - accuracy: 0.6346 - val_loss: 0.3073 - val_accuracy: 0.8302
Epoch 34/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4627 - accuracy: 0.6346 - val_loss: 0.3029 - val_accuracy: 0.8302
Epoch 35/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4563 - accuracy: 0.6346 - val_loss: 0.2953 - val_accuracy: 0.8302
Epoch 36/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4848 - accuracy: 0.6346 - val_loss: 0.2915 - val_accuracy: 0.8302
Epoch 37/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4772 - accuracy: 0.6346 - val_loss: 0.2920 - val_accuracy: 0.8302
Epoch 38/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4652 - accuracy: 0.6346 - val_loss: 0.2912 - val_accuracy: 0.8302
Epoch 39/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4511 - accuracy: 0.6346 - val_loss: 0.2884 - val_accuracy: 0.8302
Epoch 40/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4453 - accuracy: 0.6346 - val_loss: 0.2868 - val_accuracy: 0.8302
Epoch 41/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4740 - accuracy: 0.6346 - val_loss: 0.2862 - val_accuracy: 0.8302
Epoch 42/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4492 - accuracy: 0.6346 - val_loss: 0.2844 - val_accuracy: 0.8302
Epoch 43/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4367 - accuracy: 0.6346 - val_loss: 0.2828 - val_accuracy: 0.8302
Epoch 44/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4651 - accuracy: 0.6346 - val_loss: 0.2812 - val_accuracy: 0.8302
Epoch 45/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4550 - accuracy: 0.6346 - val_loss: 0.2799 - val_accuracy: 0.8302
Epoch 46/80
7/7 [==============================] - 0s 8ms/step - loss: 0.4571 - accuracy: 0.6346 - val_loss: 0.2819 - val_accuracy: 0.8868
Epoch 47/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4477 - accuracy: 0.7564 - val_loss: 0.2817 - val_accuracy: 0.8868
Epoch 48/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4479 - accuracy: 0.7756 - val_loss: 0.2814 - val_accuracy: 0.8868
Epoch 49/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4063 - accuracy: 0.7949 - val_loss: 0.2817 - val_accuracy: 0.8868
Epoch 50/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4578 - accuracy: 0.8205 - val_loss: 0.2802 - val_accuracy: 0.8868
Epoch 51/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4499 - accuracy: 0.8205 - val_loss: 0.2792 - val_accuracy: 0.8868
Epoch 52/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4375 - accuracy: 0.7949 - val_loss: 0.2771 - val_accuracy: 0.8868
Epoch 53/80
7/7 [==============================] - 0s 9ms/step - loss: 0.4340 - accuracy: 0.8333 - val_loss: 0.2761 - val_accuracy: 0.8868
Epoch 54/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4182 - accuracy: 0.8333 - val_loss: 0.2745 - val_accuracy: 0.8868
Epoch 55/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4072 - accuracy: 0.8333 - val_loss: 0.2755 - val_accuracy: 0.8868
Epoch 56/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4231 - accuracy: 0.8333 - val_loss: 0.2741 - val_accuracy: 0.8868
Epoch 57/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4398 - accuracy: 0.8397 - val_loss: 0.2732 - val_accuracy: 0.8868
Epoch 58/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4172 - accuracy: 0.8654 - val_loss: 0.2722 - val_accuracy: 0.8868
Epoch 59/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4264 - accuracy: 0.8397 - val_loss: 0.2722 - val_accuracy: 0.8868
Epoch 60/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4185 - accuracy: 0.8397 - val_loss: 0.2724 - val_accuracy: 0.8868
Epoch 61/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4107 - accuracy: 0.8397 - val_loss: 0.2743 - val_accuracy: 0.8868
Epoch 62/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4347 - accuracy: 0.8526 - val_loss: 0.2762 - val_accuracy: 0.8679
Epoch 63/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4120 - accuracy: 0.8397 - val_loss: 0.2782 - val_accuracy: 0.8679
Epoch 64/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4141 - accuracy: 0.8462 - val_loss: 0.2798 - val_accuracy: 0.8679
Epoch 65/80
7/7 [==============================] - 0s 6ms/step - loss: 0.3953 - accuracy: 0.8654 - val_loss: 0.2792 - val_accuracy: 0.8679
Epoch 66/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4200 - accuracy: 0.8654 - val_loss: 0.2790 - val_accuracy: 0.8679
Epoch 67/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4119 - accuracy: 0.8654 - val_loss: 0.2845 - val_accuracy: 0.8679
Epoch 68/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4106 - accuracy: 0.8462 - val_loss: 0.2856 - val_accuracy: 0.8679
Epoch 69/80
7/7 [==============================] - 0s 7ms/step - loss: 0.3859 - accuracy: 0.8590 - val_loss: 0.2860 - val_accuracy: 0.8491
Epoch 70/80
7/7 [==============================] - 0s 7ms/step - loss: 0.4077 - accuracy: 0.8654 - val_loss: 0.2886 - val_accuracy: 0.8491
Epoch 71/80
7/7 [==============================] - 0s 7ms/step - loss: 0.3842 - accuracy: 0.8718 - val_loss: 0.2867 - val_accuracy: 0.8491
Epoch 72/80
7/7 [==============================] - 0s 11ms/step - loss: 0.3903 - accuracy: 0.8718 - val_loss: 0.2844 - val_accuracy: 0.8679
Epoch 73/80
7/7 [==============================] - 0s 6ms/step - loss: 0.3757 - accuracy: 0.8654 - val_loss: 0.2845 - val_accuracy: 0.8679
Epoch 74/80
7/7 [==============================] - 0s 7ms/step - loss: 0.3903 - accuracy: 0.8654 - val_loss: 0.2835 - val_accuracy: 0.8679
Epoch 75/80
7/7 [==============================] - 0s 8ms/step - loss: 0.3863 - accuracy: 0.8654 - val_loss: 0.2853 - val_accuracy: 0.8491
Epoch 76/80
7/7 [==============================] - 0s 6ms/step - loss: 0.3679 - accuracy: 0.8718 - val_loss: 0.2855 - val_accuracy: 0.8491
Epoch 77/80
7/7 [==============================] - 0s 6ms/step - loss: 0.3748 - accuracy: 0.8782 - val_loss: 0.2842 - val_accuracy: 0.8491
Epoch 78/80
7/7 [==============================] - 0s 6ms/step - loss: 0.4015 - accuracy: 0.8654 - val_loss: 0.2846 - val_accuracy: 0.8491
val_accuracy = np.mean(history.history['val_accuracy'])
print("\n%s: %.2f%%" % ('val_accuracy is', val_accuracy*100))
val_accuracy is: 84.83%
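One caveat: averaging val_accuracy over every epoch folds the early, barely-trained epochs into the summary, so the final-epoch (or best-epoch) value is the more usual figure to report. A tiny numpy sketch of the difference, on hypothetical per-epoch accuracies:

```python
import numpy as np

# Hypothetical per-epoch validation accuracies
val_acc_history = [0.83, 0.85, 0.87, 0.89]

print(round(float(np.mean(val_acc_history)), 2))  # 0.86, mean over all epochs
print(val_acc_history[-1])                        # 0.89, the final-epoch value
```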
history_df = pd.DataFrame(history.history)
plt.plot(history_df.loc[:, ['loss']], "#CD5C5C", label='Training loss')
plt.plot(history_df.loc[:, ['val_loss']],"#FF0000", label='Validation loss')
plt.title('Training and Validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend(loc="best")
plt.show()
history_df = pd.DataFrame(history.history)
plt.plot(history_df.loc[:, ['accuracy']], "#CD5C5C", label='Training accuracy')
plt.plot(history_df.loc[:, ['val_accuracy']],"#FF0000", label='Validation accuracy')
plt.title('Training and Validation accuracy')
plt.xlabel('Epochs')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
y_pred = model.predict(X_test)
y_pred = (y_pred > 0.4)
np.set_printoptions()
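The 0.4 cut-off above trades precision for recall: lowering the threshold turns more borderline probabilities into positive predictions, which matters here because missing a death event (a false negative) is the costlier error. A small illustration with hypothetical probabilities and labels:

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

# Hypothetical predicted probabilities and true labels
y_true = np.array([0, 0, 1, 1, 1, 0])
y_prob = np.array([0.20, 0.45, 0.55, 0.80, 0.42, 0.10])

for threshold in (0.5, 0.4):
    y_hat = (y_prob > threshold).astype(int)
    print(threshold,
          round(precision_score(y_true, y_hat), 2),
          round(recall_score(y_true, y_hat), 2))
# 0.5 -> precision 1.0, recall 0.67; 0.4 -> precision 0.75, recall 1.0
```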
cmap1 = sns.diverging_palette(2, 165, s=80, l=55, n=9)
plt.subplots(figsize=(10,7))
cf_matrix = confusion_matrix(y_test, y_pred)
sns.heatmap(cf_matrix/np.sum(cf_matrix), cmap = cmap1, annot = True, annot_kws = {'size':25})
<AxesSubplot:>
print(classification_report(y_test, y_pred))
precision recall f1-score support
0 0.84 0.78 0.81 60
1 0.62 0.70 0.66 30
accuracy 0.76 90
macro avg 0.73 0.74 0.73 90
weighted avg 0.77 0.76 0.76 90
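With only 299 rows, a single 70/30 split gives a noisy estimate: with 90 test samples, each prediction moves accuracy by more than a percentage point. A sketch of k-fold cross-validation for a steadier comparison between the two models, again on synthetic stand-in data (in the notebook, X_scaled and y would be used):

```python
from sklearn import svm
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the scaled features and target
X, y = make_classification(n_samples=299, n_features=12, random_state=25)

scores = cross_val_score(svm.SVC(), X, y, cv=5, scoring="f1")
print(scores.round(3), round(scores.mean(), 3))  # per-fold and mean F1
```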