
8-2-3rdmar-ai-jan-2024-batch

March 19, 2024

1 Image Recognition using CNN on CIFAR-10 Dataset


In this project we will use the CIFAR-10 dataset. It contains thousands of pictures of 10 different kinds of objects, such as airplanes, automobiles and birds.
Each image in the dataset includes a matching label, so we know what kind of object it shows.
The images in the CIFAR-10 dataset are only 32x32 pixels.
[1]: #!pip install --upgrade numpy

[2]: #!pip install --upgrade tensorflow keras

[3]: import keras


from keras.datasets import cifar10
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D
from pathlib import Path
from keras.utils import to_categorical

C:\ProgramData\Anaconda3\lib\site-packages\scipy\__init__.py:146: UserWarning: A
NumPy version >=1.16.5 and <1.23.0 is required for this version of SciPy
(detected version 1.26.4
warnings.warn(f"A NumPy version >={np_minversion} and <{np_maxversion}"
WARNING:tensorflow:From C:\Users\Mohit
Tripathi\AppData\Roaming\Python\Python39\site-packages\keras\src\losses.py:2976:
The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use
tf.compat.v1.losses.sparse_softmax_cross_entropy instead.

Load the dataset


[4]: (X_train,y_train),(X_test, y_test)=cifar10.load_data()

Normalize the data


[5]: X_train=X_train.astype('float32')
X_test=X_test.astype('float32')
X_train/=255.0
X_test/=255.0
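The cast-and-divide above maps each uint8 pixel value from the 0-255 range into [0, 1], which keeps the network's inputs small and uniformly scaled. A minimal sketch of the same operation on a toy array (the values here are made up for illustration):

```python
import numpy as np

# Toy stand-in for image data: values in the uint8 range 0-255
pixels = np.array([[0, 128, 255]], dtype=np.uint8)

# Same normalization as above: cast to float32, then divide by 255
scaled = pixels.astype("float32") / 255.0

print(scaled.dtype)   # float32
print(scaled.min(), scaled.max())  # values now lie in [0.0, 1.0]
```

The cast must come first: integer division on uint8 would round every value down to 0 or 1.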

Convert class vectors to binary class matrices
[6]: y_train=to_categorical(y_train,10)
y_test=to_categorical(y_test,10)
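`to_categorical` turns each integer label into a one-hot row: a vector of zeros with a single 1 at the label's index. A minimal numpy re-implementation of the same idea, to show what the resulting matrix looks like:

```python
import numpy as np

def one_hot(labels, num_classes):
    """Minimal sketch of what to_categorical does to integer labels."""
    out = np.zeros((len(labels), num_classes), dtype="float32")
    out[np.arange(len(labels)), labels] = 1.0  # set a single 1 per row
    return out

print(one_hot([0, 3, 9], 10))
# each row has exactly one 1, at index 0, 3 and 9 respectively
```

This matches the categorical_crossentropy loss used later, which expects one row of class probabilities per sample.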

[7]: model=Sequential()
model.add(Conv2D(32,(3,3),padding='same',input_shape=(32,32,3),activation='relu'))

model.add(Conv2D(32,(3,3),activation='relu'))
model.add(MaxPooling2D(pool_size=(2,2)))
model.add(Dropout(0.25))

model.add(Conv2D(64,(3,3),padding='same',activation='relu'))
model.add(Conv2D(32,(3,3),activation='relu'))
model.add(MaxPooling2D(pool_size=(2,2)))
model.add(Dropout(0.25))

model.add(Flatten())
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10,activation='softmax'))

WARNING:tensorflow:From C:\Users\Mohit
Tripathi\AppData\Roaming\Python\Python39\site-packages\keras\src\backend.py:873:
The name tf.get_default_graph is deprecated. Please use
tf.compat.v1.get_default_graph instead.

WARNING:tensorflow:From C:\Users\Mohit
Tripathi\AppData\Roaming\Python\Python39\site-
packages\keras\src\layers\pooling\max_pooling2d.py:161: The name tf.nn.max_pool
is deprecated. Please use tf.nn.max_pool2d instead.

[8]: model.compile(
loss='categorical_crossentropy',
optimizer='adam',
metrics=['accuracy'])

model.summary()

WARNING:tensorflow:From C:\Users\Mohit
Tripathi\AppData\Roaming\Python\Python39\site-
packages\keras\src\optimizers\__init__.py:309: The name tf.train.Optimizer is
deprecated. Please use tf.compat.v1.train.Optimizer instead.

Model: "sequential"
_________________________________________________________________
Layer (type)                    Output Shape              Param #
=================================================================
conv2d (Conv2D)                 (None, 32, 32, 32)        896
conv2d_1 (Conv2D)               (None, 30, 30, 32)        9248
max_pooling2d (MaxPooling2D)    (None, 15, 15, 32)        0
dropout (Dropout)               (None, 15, 15, 32)        0
conv2d_2 (Conv2D)               (None, 15, 15, 64)        18496
conv2d_3 (Conv2D)               (None, 13, 13, 32)        18464
max_pooling2d_1 (MaxPooling2D)  (None, 6, 6, 32)          0
dropout_1 (Dropout)             (None, 6, 6, 32)          0
flatten (Flatten)               (None, 1152)              0
dense (Dense)                   (None, 512)               590336
dropout_2 (Dropout)             (None, 512)               0
dense_1 (Dense)                 (None, 10)                5130
=================================================================
Total params: 642570 (2.45 MB)
Trainable params: 642570 (2.45 MB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
Train the model
[11]: model.fit(
    X_train,
    y_train,
    batch_size=32,
    epochs=12,
    validation_data=(X_test, y_test),
    shuffle=True)

Epoch 1/12
1563/1563 [==============================] - 79s 51ms/step - loss: 0.7935 - accuracy: 0.7224 - val_loss: 0.8110 - val_accuracy: 0.7217
Epoch 2/12
1563/1563 [==============================] - 82s 52ms/step - loss: 0.7688 - accuracy: 0.7313 - val_loss: 0.8039 - val_accuracy: 0.7226
Epoch 3/12
1563/1563 [==============================] - 90s 58ms/step - loss: 0.7393 - accuracy: 0.7408 - val_loss: 0.7014 - val_accuracy: 0.7560
Epoch 4/12
1563/1563 [==============================] - 94s 60ms/step - loss: 0.7160 - accuracy: 0.7486 - val_loss: 0.7234 - val_accuracy: 0.7539
Epoch 5/12
1563/1563 [==============================] - 77s 49ms/step - loss: 0.6989 - accuracy: 0.7558 - val_loss: 0.7185 - val_accuracy: 0.7573
Epoch 6/12
1563/1563 [==============================] - 65s 41ms/step - loss: 0.6750 - accuracy: 0.7649 - val_loss: 0.6682 - val_accuracy: 0.7683
Epoch 7/12
1563/1563 [==============================] - 68s 44ms/step - loss: 0.6595 - accuracy: 0.7688 - val_loss: 0.7020 - val_accuracy: 0.7600
Epoch 8/12
1563/1563 [==============================] - 67s 43ms/step - loss: 0.6509 - accuracy: 0.7712 - val_loss: 0.7156 - val_accuracy: 0.7529
Epoch 9/12
1563/1563 [==============================] - 68s 44ms/step - loss: 0.6331 - accuracy: 0.7774 - val_loss: 0.6964 - val_accuracy: 0.7626
Epoch 10/12
1563/1563 [==============================] - 75s 48ms/step - loss: 0.6272 - accuracy: 0.7778 - val_loss: 0.6782 - val_accuracy: 0.7651
Epoch 11/12
1563/1563 [==============================] - 76s 49ms/step - loss: 0.6095 - accuracy: 0.7858 - val_loss: 0.6812 - val_accuracy: 0.7710
Epoch 12/12
1563/1563 [==============================] - 79s 51ms/step - loss: 0.6077 - accuracy: 0.7876 - val_loss: 0.6468 - val_accuracy: 0.7781

[11]: <keras.src.callbacks.History at 0x260763504c0>

Save the neural network architecture


[12]: model_structure=model.to_json()
f=Path("model_structure.json")
f.write_text(model_structure)

[12]: 6342

Save the trained neural network weights


[13]: model.save_weights("model_weight.h5")

Making Predictions on the images

[14]: from keras.models import model_from_json
from pathlib import Path
from keras.preprocessing import image
import numpy as np

[15]: class_labels=[
"Planes",
"car",
"Bird",
"Cat",
"Deer",
"Dog",
"Frog",
"Horse",
"Boat",
"Truck"
]

Load the JSON file that contains the model structure


[16]: f=Path("model_structure.json")
model_structure=f.read_text()

Recreate the Keras model object from the JSON data


[17]: model=model_from_json(model_structure)

Reload the model's trained weights


[18]: model.load_weights("model_weight.h5")

Load an image file to test


[19]: import matplotlib.pyplot as plt
from tensorflow.keras.utils import load_img, img_to_array
img=load_img("dog.png",target_size=(32,32))
plt.imshow(img)

[19]: <matplotlib.image.AxesImage at 0x260770ac400>

Convert the image to a numpy array
[20]: from tensorflow.keras.utils import img_to_array
image_to_test=img_to_array(img)

[21]: list_of_images=np.expand_dims(image_to_test,axis=0)
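Keras models always predict on a batch of inputs, so the single image needs a leading batch axis. A sketch of what `np.expand_dims(..., axis=0)` does to the shape:

```python
import numpy as np

single_image = np.zeros((32, 32, 3), dtype="float32")  # one CIFAR-sized image
batch = np.expand_dims(single_image, axis=0)           # a "list" of one image

print(single_image.shape)  # (32, 32, 3)
print(batch.shape)         # (1, 32, 32, 3)
```

`single_image[np.newaxis, ...]` is an equivalent spelling of the same operation.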

Make predictions using the model


[22]: results=model.predict(list_of_images)

1/1 [==============================] - 0s 355ms/step


Since we are only testing one image, we only need to check the first result
[23]: single_result=results[0]

We will get a likelihood score for all 10 possible classes. Find out which class has the highest score
[24]: most_likely_class_index=int(np.argmax(single_result))
class_likelihood=single_result[most_likely_class_index]

Get the name of the most likely class


[25]: class_label=class_labels[most_likely_class_index]

[26]: print("This image is a {} (likelihood: {:.2f})".format(class_label, class_likelihood))

This image is a Cat (likelihood: 1.00)
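Beyond the single most likely class, the softmax output can be ranked to see the runners-up. A sketch using `np.argsort` on a hypothetical result vector (these probabilities are made up for illustration, not the notebook's actual output):

```python
import numpy as np

class_labels = ["Planes", "car", "Bird", "Cat", "Deer",
                "Dog", "Frog", "Horse", "Boat", "Truck"]

# Hypothetical softmax output for one image
single_result = np.array([0.01, 0.02, 0.05, 0.70, 0.03,
                          0.10, 0.02, 0.03, 0.02, 0.02])

# argsort gives ascending order; reverse it for most-to-least likely
ranking = np.argsort(single_result)[::-1]
for i in ranking[:3]:
    print(f"{class_labels[i]}: {single_result[i]:.2f}")
```

For a batch of images, `np.argsort(results, axis=1)` ranks every row at once.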

2 Sentiment Classification using NLP and a Classification Algorithm

Sentiment analysis is a means to identify the view or emotion behind a situation.
It basically means analysing a piece of text (or speech, or any other mode of communication) to find the emotion or intent behind it.
"This burger has a very bad taste"- negative review
"I ordered this pizza today"- neutral sentiment/review
"I love this cheese sandwich, it's so delicious"- positive review
[14]: import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import re

[15]: import nltk


from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer

[16]: from sklearn.feature_extraction.text import CountVectorizer


from sklearn.model_selection import GridSearchCV
from sklearn.ensemble import RandomForestClassifier

[17]: from sklearn.metrics import accuracy_score, precision_score, recall_score, confusion_matrix, roc_curve
from sklearn.metrics import classification_report, plot_confusion_matrix
# note: plot_confusion_matrix was removed in scikit-learn 1.2;
# newer versions use ConfusionMatrixDisplay instead

[18]: df_train=pd.read_csv("train.txt",delimiter=";",names=['text','label'])
df_val=pd.read_csv("val.txt",delimiter=";",names=['text','label'])

[19]: df=pd.concat([df_train, df_val])


df.reset_index(inplace=True, drop=True)

[20]: print("Shape of the dataframe: ",df.shape)


df.sample(5)

Shape of the dataframe: (18000, 2)

[20]: text label
9645 i feeling im look a like those innocent lame h… joy
16221 i still don t feel devastated by the break up sadness
16392 i remember feeling disheartened one day when w… sadness
8886 i feel i am so strong enough to take this pain… joy
871 i feel the most uncertain about the project fear

[21]: import warnings


warnings.filterwarnings("ignore")

sns.countplot(df.label)

[21]: <AxesSubplot:xlabel='label', ylabel='count'>

Positive sentiment- joy, love, surprise
Negative sentiment- anger, sadness, fear
Now we will create a custom encoder to convert the categorical target labels to numerical values, i.e. 0 and 1.
[22]: def custom_encoder(df):
    df.replace(to_replace="surprise",value=1, inplace=True)
    df.replace(to_replace="love",value=1, inplace=True)
    df.replace(to_replace="joy",value=1, inplace=True)
    df.replace(to_replace="fear",value=0, inplace=True)
    df.replace(to_replace="anger",value=0, inplace=True)
    df.replace(to_replace="sadness",value=0, inplace=True)

[23]: custom_encoder(df['label'])
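The six replace calls implement one fixed label-to-binary mapping. The same idea can be expressed more compactly as a dictionary lookup; a pandas-free sketch of that mapping:

```python
# Same positive/negative grouping as the custom encoder above
label_map = {"surprise": 1, "love": 1, "joy": 1,
             "fear": 0, "anger": 0, "sadness": 0}

labels = ["joy", "sadness", "love", "fear"]
encoded = [label_map[l] for l in labels]
print(encoded)  # [1, 0, 1, 0]
```

With pandas, the equivalent one-liner would be `df['label'].map(label_map)`, which also makes unmapped labels show up as NaN instead of passing through silently.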

[24]: sns.countplot(df.label)

[24]: <AxesSubplot:xlabel='label', ylabel='count'>

Preprocessing Steps
Get rid of any non-alphabetic characters
Convert the string to lowercase, because Python is case-sensitive
Check for and remove the stopwords
Perform lemmatization
[25]: lm=WordNetLemmatizer()

[26]: def text_transformation(df_col):
    corpus=[]
    for item in df_col:
        new_item=re.sub('[^a-zA-Z]',' ',str(item))
        new_item=new_item.lower()
        new_item=new_item.split()
        new_item=[lm.lemmatize(word) for word in new_item if word not in set(stopwords.words('english'))]
        corpus.append(' '.join(str(x) for x in new_item))
    return corpus
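The transformation can be seen in isolation with a tiny self-contained sketch: strip non-letters, lowercase, tokenize, drop stopwords. This version uses a small hand-picked stopword list so it runs without NLTK (the notebook uses NLTK's full English list plus lemmatization):

```python
import re

# A tiny illustrative stopword list; the notebook uses NLTK's full English list
STOPWORDS = {"i", "am", "the", "a", "this", "so", "to"}

def normalize(text):
    text = re.sub("[^a-zA-Z]", " ", text)       # keep letters only
    tokens = text.lower().split()               # lowercase + tokenize
    return " ".join(t for t in tokens if t not in STOPWORDS)

print(normalize("I love this cheese sandwich, it's so delicious!"))
# → "love cheese sandwich it s delicious"
```

Note how the apostrophe in "it's" becomes a space, leaving a stray "s" token; the notebook's regex has the same behaviour, which is why its sample texts show fragments like "don t".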

[27]: corpus=text_transformation(df['text'])

[28]: cv=CountVectorizer(ngram_range=(1,2))
traindata=cv.fit_transform(corpus)
X=traindata
y=df.label

Now we will fit the data into grid search and view the best parameters using the best_params_
attribute
[29]: parameters={'max_features':('auto','sqrt'),
'n_estimators':[5,10],
'max_depth':[10,None],
'min_samples_split':[5],
'min_samples_leaf':[1],
'bootstrap':[True]
}
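The size of this grid is the product of the option counts per parameter; here that is 2 x 2 x 2 x 1 x 1 x 1 = 8 candidate models, which is why eight results appear below. A sketch enumerating the combinations with the standard library:

```python
from itertools import product

parameters = {'max_features': ('auto', 'sqrt'),
              'n_estimators': [5, 10],
              'max_depth': [10, None],
              'min_samples_split': [5],
              'min_samples_leaf': [1],
              'bootstrap': [True]}

# Every combination GridSearchCV will evaluate
combos = list(product(*parameters.values()))
print(len(combos))  # 2 * 2 * 2 * 1 * 1 * 1 = 8
```

With cv=5, each combination is trained five times, so the search fits 40 forests in total.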

[30]: grid_search=GridSearchCV(RandomForestClassifier(), parameters, cv=5, return_train_score=True, n_jobs=-1)
grid_search.fit(X,y)
grid_search.best_params_

[30]: {'bootstrap': True,
 'max_depth': None,
 'max_features': 'auto',
 'min_samples_leaf': 1,
 'min_samples_split': 5,
 'n_estimators': 10}

We can view all the models with their respective parameters, mean test score and rank using GridSearchCV's cv_results_ attribute.
[31]: for i in range(8):
    print('Parameters: ', grid_search.cv_results_['params'][i])
    print('Mean test Score: ',grid_search.cv_results_['mean_test_score'][i])
    print("Rank: ",grid_search.cv_results_['rank_test_score'])

Parameters: {'bootstrap': True, 'max_depth': 10, 'max_features': 'auto',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 5}
Mean test Score: 0.5937222222222223
Rank: [8 5 7 6 4 1 3 2]
Parameters: {'bootstrap': True, 'max_depth': 10, 'max_features': 'auto',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 10}
Mean test Score: 0.6168333333333333
Rank: [8 5 7 6 4 1 3 2]
Parameters: {'bootstrap': True, 'max_depth': 10, 'max_features': 'sqrt',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 5}
Mean test Score: 0.5997777777777779
Rank: [8 5 7 6 4 1 3 2]

Parameters: {'bootstrap': True, 'max_depth': 10, 'max_features': 'sqrt',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 10}
Mean test Score: 0.6161666666666668
Rank: [8 5 7 6 4 1 3 2]
Parameters: {'bootstrap': True, 'max_depth': None, 'max_features': 'auto',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 5}
Mean test Score: 0.9116111111111111
Rank: [8 5 7 6 4 1 3 2]
Parameters: {'bootstrap': True, 'max_depth': None, 'max_features': 'auto',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 10}
Mean test Score: 0.9369444444444444
Rank: [8 5 7 6 4 1 3 2]
Parameters: {'bootstrap': True, 'max_depth': None, 'max_features': 'sqrt',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 5}
Mean test Score: 0.9189444444444443
Rank: [8 5 7 6 4 1 3 2]
Parameters: {'bootstrap': True, 'max_depth': None, 'max_features': 'sqrt',
'min_samples_leaf': 1, 'min_samples_split': 5, 'n_estimators': 10}
Mean test Score: 0.933
Rank: [8 5 7 6 4 1 3 2]
Now we will choose the best parameters obtained from GridSearchCV, create a final random forest classifier model, and then train our model.
[32]: rfc= RandomForestClassifier(max_features=grid_search.best_params_['max_features'],
                            max_depth=grid_search.best_params_['max_depth'],
                            n_estimators=grid_search.best_params_['n_estimators'],
                            min_samples_split=grid_search.best_params_['min_samples_split'],
                            min_samples_leaf=grid_search.best_params_['min_samples_leaf'],
                            bootstrap=grid_search.best_params_['bootstrap'])

rfc.fit(X,y)

[32]: RandomForestClassifier(min_samples_split=5, n_estimators=10)

Test Data Transformation


[33]: test_df=pd.read_csv('test.txt',delimiter=';',names=['text','label'])

[34]: X_test, y_test=test_df.text, test_df.label

# encode the labels into two classes, 0 and 1
# (custom_encoder modifies the series in place and returns None,
# so its return value should not be assigned)
custom_encoder(y_test)

# preprocessing of text
test_corpus=text_transformation(X_test)

# convert the text data into vectors
testdata=cv.transform(test_corpus)

# predict the target
predictions=rfc.predict(testdata)

Model Evaluation
We will evaluate our model using various metrics, such as accuracy score, precision score, recall score and the confusion matrix.
[35]: acc_score= accuracy_score(y_test, predictions)
pre_score= precision_score(y_test, predictions)
rec_score=recall_score(y_test, predictions)

print('Accuracy Score:',acc_score)
print("Precision Score:",pre_score)
print('Recall Score',rec_score)

print("-"*50)

cr=classification_report(y_test, predictions)
print(cr)

Accuracy Score: 0.943
Precision Score: 0.9448123620309051
Recall Score 0.9304347826086956
--------------------------------------------------
              precision    recall  f1-score   support

           0       0.94      0.95      0.95      1080
           1       0.94      0.93      0.94       920

    accuracy                           0.94      2000
   macro avg       0.94      0.94      0.94      2000
weighted avg       0.94      0.94      0.94      2000
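All of these metrics derive from the four confusion-matrix counts: true/false positives and true/false negatives. A small sketch with illustrative counts (not the notebook's actual values) showing how each score is computed:

```python
# Toy confusion-matrix counts (illustrative only)
tp, fp, fn, tn = 90, 5, 7, 98

precision = tp / (tp + fp)                    # of predicted positives, how many were right
recall    = tp / (tp + fn)                    # of actual positives, how many were found
accuracy  = (tp + tn) / (tp + fp + fn + tn)   # overall fraction correct
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean

print(round(precision, 3), round(recall, 3), round(accuracy, 3), round(f1, 3))
```

This is why precision and recall can diverge even when accuracy looks good: they weight false positives and false negatives differently.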

ROC Curve- We will get the class probabilities using the predict_proba() method of the random forest classifier and then plot the curve.
[36]: predictions_probability= rfc.predict_proba(testdata)
fpr,tpr,thresholds=roc_curve(y_test, predictions_probability[:,1])

plt.plot(fpr,tpr)
plt.plot([0,1])
plt.title('ROC Curve')
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.show()
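The ROC curve is often summarised by the area under it (AUC): 1.0 is a perfect classifier, 0.5 is random guessing. A sketch computing AUC with the trapezoid rule on made-up (fpr, tpr) points; sklearn's `roc_auc_score` would give the real value for this model:

```python
# Made-up ROC points; fpr must be sorted ascending
fpr = [0.0, 0.1, 0.3, 1.0]
tpr = [0.0, 0.8, 0.95, 1.0]

# Trapezoid rule: sum of (width * average height) over each segment
auc = sum((fpr[i + 1] - fpr[i]) * (tpr[i + 1] + tpr[i]) / 2
          for i in range(len(fpr) - 1))
print(auc)  # 0.8975 for these toy points
```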

As we can see, our model performed very well in classifying the sentiments, with accuracy, precision and recall scores of approximately 94%.
Now we will check custom inputs as well and let our model identify the sentiment of the input statement.
[37]: def expression_check(prediction_input):
    if prediction_input==0:
        print("Input statement has negative sentiment")
    elif prediction_input==1:
        print("Input statement has positive sentiment")
    else:
        print("Invalid Statement")

A function that takes the input statement and performs the same transformations we did earlier
[38]: def sentiment_predictor(input):
    input=text_transformation(input)
    transformed_input=cv.transform(input)
    prediction=rfc.predict(transformed_input)
    expression_check(prediction)

[39]: input1=["Sometimes I just don't want to go out"]


input2=["I bought a new phone and it's so good"]

[40]: sentiment_predictor(input1)
sentiment_predictor(input2)

Input statement has negative sentiment


Input statement has positive sentiment

3 Chatbot using NLP and Neural Networks in Python

Tags are the classes
Patterns are what the user is going to ask
Responses are the chatbot's replies
[1]: data={"intents":[
    {"tag":"greeting",
     "patterns":["Hello","How are you?","Hi There","Hi","What's up"],
     "responses":["Howdy Partner!","Hello","How are you doing?","Greetings!","How do you do"]
    },
    {"tag":"age",
     "patterns":["how old are you","when is your birthday","when was you born"],
     "responses":["I am 24 years old","I was born in 1966","My birthday is July 3rd and I was born in 1996","03/07/1996"]
    },
    {"tag":"date",
     "patterns":["what are you doing this weekend",
                 "do you want to hangout sometime?","what are your plans for this week"],
     "responses":["I am available this week","I don't have any plans","I am not busy"]
    },
    {"tag":"name",
     "patterns":["what's your name","what are you called","who are you"],
     "responses":["My name is Kippi","I'm Kippi","Kippi"]
    },
    {"tag":"goodbye",
     "patterns":["bye","g2g","see ya","adios","cya"],
     "responses":["It was nice speaking to you","See you later","Speak Soon"]
    },
]}

For each tag we created, we specify patterns. Essentially, these define the different ways a user may pose a query to the chatbot.
The chatbot then takes these patterns and uses them as training data to determine what someone is asking, so that its response is relevant to that question.
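Once a tag has been predicted, the reply side is simple: pick a random entry from that tag's responses. A self-contained sketch over a trimmed copy of the intents structure above:

```python
import random

# A trimmed copy of the intents structure above
data = {"intents": [
    {"tag": "greeting",
     "patterns": ["Hello", "Hi"],
     "responses": ["Howdy Partner!", "Hello"]},
    {"tag": "goodbye",
     "patterns": ["bye", "see ya"],
     "responses": ["See you later", "Speak Soon"]},
]}

def respond(tag):
    """Pick a random response for a predicted tag; None if the tag is unknown."""
    for intent in data["intents"]:
        if intent["tag"] == tag:
            return random.choice(intent["responses"])
    return None

print(respond("greeting"))
```

The neural network's only job is the pattern-to-tag mapping; the response selection stays this simple lookup.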
[2]: import json
import string
import random

import nltk
import numpy as np
from nltk.stem import WordNetLemmatizer

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense,Dropout

nltk.download("punkt")
nltk.download("wordnet")


[nltk_data] Downloading package punkt to C:\Users\Mohit


[nltk_data] Tripathi\AppData\Roaming\nltk_data…
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package wordnet to C:\Users\Mohit
[nltk_data] Tripathi\AppData\Roaming\nltk_data…
[nltk_data] Package wordnet is already up-to-date!

[2]: True

In order to create our training data, the below steps are followed:

Create a vocabulary of all the words used in the patterns
Create a list of the classes- the tags of each intent
Create a list of all the patterns within the intents file
Create a list of all the associated tags to go with each pattern in the intents file

Initialising the lemmatizer to get the stems of words

[3]: lemmatizer=WordNetLemmatizer()

words=[]
classes=[]
doc_x=[]
doc_y=[]

Loop through all the intents

Tokenize each pattern and append the tokens to words; append the patterns and their associated tags to their respective lists
[4]: for intent in data["intents"]:
    for pattern in intent["patterns"]:
        tokens=nltk.word_tokenize(pattern)
        words.extend(tokens)
        doc_x.append(pattern)
        doc_y.append(intent["tag"])
    if intent["tag"] not in classes:
        classes.append(intent["tag"])

Lemmatize all the words in the vocab and convert them to lowercase
[5]: words=[lemmatizer.lemmatize(word.lower()) for word in words if word not in string.punctuation]

Sorting the vocab and classes in alphabetical order and taking the set to ensure no duplicates occur
[6]: words=sorted(set(words))
classes=sorted(set(classes))

[7]: print(words)

["'s", 'adios', 'are', 'birthday', 'born', 'bye', 'called', 'cya', 'do',
'doing', 'for', 'g2g', 'hangout', 'hello', 'hi', 'how', 'is', 'name', 'old',
'plan', 'see', 'sometime', 'there', 'this', 'to', 'up', 'wa', 'want', 'week',
'weekend', 'what', 'when', 'who', 'ya', 'you', 'your']

[8]: print(classes)

['age', 'date', 'goodbye', 'greeting', 'name']

[9]: print(doc_x)

['Hello', 'How are you?', 'Hi There', 'Hi', "What's up", 'how old are you',
'when is your birthday', 'when was you born', 'what are you doing this weekend',
'do you want to hangout sometime?', 'what are your plans for this week', "what's
your name", 'what are you called', 'who are you', 'bye', 'g2g', 'see ya',
'adios', 'cya']

[10]: print(doc_y)

['greeting', 'greeting', 'greeting', 'greeting', 'greeting', 'age', 'age',
'age', 'date', 'date', 'date', 'name', 'name', 'name', 'goodbye', 'goodbye',
'goodbye', 'goodbye', 'goodbye']
List for training data
[11]: training=[]
out_empty=[0]*len(classes)

# creating a bag of words model
for idx, doc in enumerate(doc_x):
    bow=[]
    text=lemmatizer.lemmatize(doc.lower())
    for word in words:
        bow.append(1) if word in text else bow.append(0)
    output_row=list(out_empty)
    output_row[classes.index(doc_y[idx])]=1
    training.append([bow, output_row])

random.shuffle(training)
training=np.array(training,dtype=object)

train_X=np.array(list(training[:,0]))
train_y=np.array(list(training[:,1]))
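The bag-of-words step above can be isolated into a tiny standalone function. Note that, like the notebook's loop, it uses substring membership (`word in text` where `text` is a string), so it marks a vocab word present whenever it appears anywhere in the text, even inside another word:

```python
def bag_of_words(text, vocab):
    """1 if the vocab word occurs in the (lowercased) text, else 0."""
    text = text.lower()
    return [1 if word in text else 0 for word in vocab]

vocab = ["are", "hello", "how", "you"]
print(bag_of_words("How are you?", vocab))  # [1, 0, 1, 1]
```

Tokenizing the text and testing `word in tokens` would avoid the substring quirk, at the cost of diverging from the training loop's behaviour.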

The model will look at the features, predict the tag associated with them, and then select an appropriate message/response from that tag.

[12]: input_shape=(len(train_X[0]),)
output_shape=len(train_y[0])

epochs=500

[13]: from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

# Create a Sequential model
model = Sequential()
model.add(Dense(128, input_shape=input_shape, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.3))
model.add(Dense(output_shape, activation='softmax'))

# Create the Adam optimizer with a specified learning rate
adam = tf.keras.optimizers.Adam(learning_rate=0.01)

# Compile the model using the Adam optimizer
model.compile(loss='categorical_crossentropy',
              optimizer=adam,
              metrics=['accuracy'])

print(model.summary())


Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 4736

dropout (Dropout) (None, 128) 0

dense_1 (Dense) (None, 64) 8256

dropout_1 (Dropout) (None, 64) 0

dense_2 (Dense) (None, 5) 325

=================================================================
Total params: 13317 (52.02 KB)
Trainable params: 13317 (52.02 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
None

[14]: model.fit(x=train_X, y=train_y, epochs=500, verbose=1)

Epoch 1/500
WARNING:tensorflow:From C:\Users\Mohit
Tripathi\AppData\Roaming\Python\Python39\site-
packages\keras\src\utils\tf_utils.py:492: The name tf.ragged.RaggedTensorValue
is deprecated. Please use tf.compat.v1.ragged.RaggedTensorValue instead.

WARNING:tensorflow:From C:\Users\Mohit
Tripathi\AppData\Roaming\Python\Python39\site-
packages\keras\src\engine\base_layer_utils.py:384: The name

18
tf.executing_eagerly_outside_functions is deprecated. Please use
tf.compat.v1.executing_eagerly_outside_functions instead.

1/1 [==============================] - 2s 2s/step - loss: 1.7215 - accuracy:


0.3158
Epoch 2/500
1/1 [==============================] - 0s 16ms/step - loss: 1.5838 - accuracy:
0.3158
Epoch 3/500
1/1 [==============================] - 0s 15ms/step - loss: 1.3981 - accuracy:
0.5263
Epoch 4/500
1/1 [==============================] - 0s 18ms/step - loss: 1.3111 - accuracy:
0.5789
Epoch 5/500
1/1 [==============================] - 0s 15ms/step - loss: 1.2066 - accuracy:
0.6316
Epoch 6/500
1/1 [==============================] - 0s 12ms/step - loss: 1.0635 - accuracy:
0.6842
Epoch 7/500
1/1 [==============================] - 0s 11ms/step - loss: 1.0505 - accuracy:
0.7368
Epoch 8/500
1/1 [==============================] - 0s 13ms/step - loss: 0.7819 - accuracy:
0.8947
Epoch 9/500
1/1 [==============================] - 0s 14ms/step - loss: 0.7170 - accuracy:
0.8421
Epoch 10/500
1/1 [==============================] - 0s 11ms/step - loss: 0.6563 - accuracy:
0.7895
Epoch 11/500
1/1 [==============================] - 0s 11ms/step - loss: 0.6571 - accuracy:
0.8947
Epoch 12/500
1/1 [==============================] - 0s 9ms/step - loss: 0.4337 - accuracy:
0.9474
Epoch 13/500
1/1 [==============================] - 0s 9ms/step - loss: 0.3369 - accuracy:
0.9474
Epoch 14/500
1/1 [==============================] - 0s 11ms/step - loss: 0.3091 - accuracy:
0.9474
Epoch 15/500
1/1 [==============================] - 0s 11ms/step - loss: 0.1857 - accuracy:
1.0000
Epoch 16/500

19
1/1 [==============================] - 0s 10ms/step - loss: 0.1736 - accuracy:
1.0000
Epoch 17/500
1/1 [==============================] - 0s 12ms/step - loss: 0.2928 - accuracy:
0.8947
Epoch 18/500
1/1 [==============================] - 0s 11ms/step - loss: 0.1294 - accuracy:
1.0000
Epoch 19/500
1/1 [==============================] - 0s 12ms/step - loss: 0.1065 - accuracy:
1.0000
Epoch 20/500
1/1 [==============================] - 0s 11ms/step - loss: 0.2748 - accuracy:
0.9474
Epoch 21/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0781 - accuracy:
1.0000
Epoch 22/500
1/1 [==============================] - 0s 11ms/step - loss: 0.1004 - accuracy:
0.9474
Epoch 23/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0378 - accuracy:
1.0000
Epoch 24/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0333 - accuracy:
1.0000
Epoch 25/500
1/1 [==============================] - 0s 16ms/step - loss: 0.1022 - accuracy:
0.8947
Epoch 26/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0080 - accuracy:
1.0000
Epoch 27/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0499 - accuracy:
1.0000
Epoch 28/500
1/1 [==============================] - 0s 13ms/step - loss: 0.1325 - accuracy:
0.9474
Epoch 29/500
1/1 [==============================] - 0s 15ms/step - loss: 0.0453 - accuracy:
1.0000
Epoch 30/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0154 - accuracy:
1.0000
Epoch 31/500
1/1 [==============================] - 0s 16ms/step - loss: 0.0283 - accuracy:
1.0000
Epoch 32/500

20
1/1 [==============================] - 0s 19ms/step - loss: 0.0367 - accuracy:
1.0000
Epoch 33/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0170 - accuracy:
1.0000
Epoch 34/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0031 - accuracy:
1.0000
Epoch 35/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0052 - accuracy:
1.0000
Epoch 36/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0090 - accuracy:
1.0000
Epoch 37/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0119 - accuracy:
1.0000
Epoch 38/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0039 - accuracy:
1.0000
Epoch 39/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0175 - accuracy:
1.0000
Epoch 40/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0188 - accuracy:
1.0000
Epoch 41/500
1/1 [==============================] - 0s 16ms/step - loss: 0.0085 - accuracy:
1.0000
Epoch 42/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0041 - accuracy:
1.0000
Epoch 43/500
1/1 [==============================] - 0s 14ms/step - loss: 5.3311e-04 -
accuracy: 1.0000
Epoch 44/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0110 - accuracy:
1.0000
Epoch 45/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0023 - accuracy:
1.0000
Epoch 46/500
1/1 [==============================] - 0s 15ms/step - loss: 0.0078 - accuracy:
1.0000
Epoch 47/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0093 - accuracy:
1.0000
Epoch 48/500

21
1/1 [==============================] - 0s 12ms/step - loss: 0.0016 - accuracy:
1.0000
Epoch 49/500
1/1 [==============================] - 0s 10ms/step - loss: 9.0239e-05 -
accuracy: 1.0000
Epoch 50/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0336 - accuracy:
1.0000
Epoch 51/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0402 - accuracy:
1.0000
Epoch 52/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0031 - accuracy:
1.0000
Epoch 53/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0023 - accuracy:
1.0000
Epoch 54/500
1/1 [==============================] - 0s 14ms/step - loss: 5.9949e-04 -
accuracy: 1.0000
Epoch 55/500
1/1 [==============================] - 0s 16ms/step - loss: 0.0392 - accuracy:
1.0000
Epoch 56/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0069 - accuracy:
1.0000
Epoch 57/500
1/1 [==============================] - 0s 15ms/step - loss: 0.0053 - accuracy:
1.0000
Epoch 58/500
1/1 [==============================] - 0s 16ms/step - loss: 0.0012 - accuracy:
1.0000
Epoch 59/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0059 - accuracy:
1.0000
Epoch 60/500
1/1 [==============================] - 0s 14ms/step - loss: 4.7009e-04 -
accuracy: 1.0000
Epoch 61/500
1/1 [==============================] - 0s 15ms/step - loss: 4.7488e-04 -
accuracy: 1.0000
Epoch 62/500
1/1 [==============================] - 0s 12ms/step - loss: 3.5188e-04 -
accuracy: 1.0000
Epoch 63/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0032 - accuracy:
1.0000
Epoch 64/500

22
1/1 [==============================] - 0s 11ms/step - loss: 0.0038 - accuracy:
1.0000
Epoch 65/500
1/1 [==============================] - 0s 11ms/step - loss: 7.6957e-04 -
accuracy: 1.0000
Epoch 66/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0253 - accuracy:
1.0000
Epoch 67/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0139 - accuracy:
1.0000
Epoch 68/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0016 - accuracy:
1.0000
Epoch 69/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0026 - accuracy:
1.0000
Epoch 70/500
1/1 [==============================] - 0s 13ms/step - loss: 3.2516e-04 -
accuracy: 1.0000
Epoch 71/500
1/1 [==============================] - 0s 14ms/step - loss: 9.8762e-04 -
accuracy: 1.0000
Epoch 72/500
1/1 [==============================] - 0s 17ms/step - loss: 2.0152e-04 -
accuracy: 1.0000
Epoch 73/500
1/1 [==============================] - 0s 18ms/step - loss: 0.0015 - accuracy:
1.0000
Epoch 74/500
1/1 [==============================] - 0s 15ms/step - loss: 0.0090 - accuracy:
1.0000
Epoch 75/500
1/1 [==============================] - 0s 16ms/step - loss: 0.0050 - accuracy:
1.0000
Epoch 76/500
1/1 [==============================] - 0s 33ms/step - loss: 0.0186 - accuracy:
1.0000
Epoch 77/500
1/1 [==============================] - 0s 18ms/step - loss: 0.0033 - accuracy:
1.0000
Epoch 78/500
1/1 [==============================] - 0s 17ms/step - loss: 7.3342e-04 -
accuracy: 1.0000
Epoch 79/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0082 - accuracy:
1.0000
Epoch 80/500

23
1/1 [==============================] - 0s 15ms/step - loss: 0.0766 - accuracy:
0.9474
Epoch 81/500
1/1 [==============================] - 0s 20ms/step - loss: 3.6914e-04 -
accuracy: 1.0000
Epoch 82/500
1/1 [==============================] - 0s 21ms/step - loss: 2.8317e-05 -
accuracy: 1.0000
Epoch 83/500
1/1 [==============================] - 0s 18ms/step - loss: 0.0015 - accuracy:
1.0000
Epoch 84/500
1/1 [==============================] - 0s 23ms/step - loss: 3.1079e-04 -
accuracy: 1.0000
Epoch 85/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0046 - accuracy:
1.0000
Epoch 86/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0393 - accuracy:
1.0000
Epoch 87/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0067 - accuracy:
1.0000
Epoch 88/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0047 - accuracy:
1.0000
Epoch 89/500
1/1 [==============================] - 0s 14ms/step - loss: 5.4569e-04 -
accuracy: 1.0000
Epoch 90/500
1/1 [==============================] - 0s 17ms/step - loss: 6.4029e-04 -
accuracy: 1.0000
Epoch 91/500
1/1 [==============================] - 0s 14ms/step - loss: 9.5446e-04 -
accuracy: 1.0000
Epoch 92/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0066 - accuracy:
1.0000
Epoch 93/500
1/1 [==============================] - 0s 20ms/step - loss: 0.0029 - accuracy:
1.0000
Epoch 94/500
1/1 [==============================] - 0s 17ms/step - loss: 0.0033 - accuracy:
1.0000
Epoch 95/500
1/1 [==============================] - 0s 14ms/step - loss: 2.9102e-04 -
accuracy: 1.0000
Epoch 96/500
1/1 [==============================] - 0s 12ms/step - loss: 5.8412e-04 -
accuracy: 1.0000
Epoch 97/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0160 - accuracy:
1.0000
Epoch 98/500
1/1 [==============================] - 0s 13ms/step - loss: 3.3784e-04 -
accuracy: 1.0000
Epoch 99/500
1/1 [==============================] - 0s 12ms/step - loss: 8.7976e-04 -
accuracy: 1.0000
Epoch 100/500
1/1 [==============================] - 0s 11ms/step - loss: 9.0057e-04 -
accuracy: 1.0000
Epoch 101/500
1/1 [==============================] - 0s 15ms/step - loss: 1.0364e-04 -
accuracy: 1.0000
Epoch 102/500
1/1 [==============================] - 0s 13ms/step - loss: 5.0469e-04 -
accuracy: 1.0000
Epoch 103/500
1/1 [==============================] - 0s 16ms/step - loss: 2.5477e-04 -
accuracy: 1.0000
Epoch 104/500
1/1 [==============================] - 0s 15ms/step - loss: 4.9313e-04 -
accuracy: 1.0000
Epoch 105/500
1/1 [==============================] - 0s 20ms/step - loss: 3.7212e-04 -
accuracy: 1.0000
Epoch 106/500
1/1 [==============================] - 0s 16ms/step - loss: 0.0014 - accuracy:
1.0000
Epoch 107/500
1/1 [==============================] - 0s 13ms/step - loss: 8.2698e-04 -
accuracy: 1.0000
Epoch 108/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0011 - accuracy:
1.0000
Epoch 109/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0063 - accuracy:
1.0000
Epoch 110/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0041 - accuracy:
1.0000
Epoch 111/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0037 - accuracy:
1.0000
Epoch 112/500
1/1 [==============================] - 0s 11ms/step - loss: 7.3420e-04 -
accuracy: 1.0000
Epoch 113/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0061 - accuracy:
1.0000
Epoch 114/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0034 - accuracy:
1.0000
Epoch 115/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0376 - accuracy:
1.0000
Epoch 116/500
1/1 [==============================] - 0s 15ms/step - loss: 0.0020 - accuracy:
1.0000
Epoch 117/500
1/1 [==============================] - 0s 13ms/step - loss: 8.8638e-04 -
accuracy: 1.0000
Epoch 118/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0120 - accuracy:
1.0000
Epoch 119/500
1/1 [==============================] - 0s 11ms/step - loss: 1.2360e-05 -
accuracy: 1.0000
Epoch 120/500
1/1 [==============================] - 0s 12ms/step - loss: 4.0708e-04 -
accuracy: 1.0000
Epoch 121/500
1/1 [==============================] - 0s 12ms/step - loss: 5.4011e-04 -
accuracy: 1.0000
Epoch 122/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0019 - accuracy:
1.0000
Epoch 123/500
1/1 [==============================] - 0s 13ms/step - loss: 4.9367e-04 -
accuracy: 1.0000
Epoch 124/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0015 - accuracy:
1.0000
Epoch 125/500
1/1 [==============================] - 0s 13ms/step - loss: 7.9986e-04 -
accuracy: 1.0000
Epoch 126/500
1/1 [==============================] - 0s 13ms/step - loss: 5.7606e-04 -
accuracy: 1.0000
Epoch 127/500
1/1 [==============================] - 0s 12ms/step - loss: 1.0804e-04 -
accuracy: 1.0000
Epoch 128/500
1/1 [==============================] - 0s 18ms/step - loss: 1.0476e-04 -
accuracy: 1.0000
Epoch 129/500
1/1 [==============================] - 0s 13ms/step - loss: 2.5918e-04 -
accuracy: 1.0000
Epoch 130/500
1/1 [==============================] - 0s 14ms/step - loss: 4.0657e-04 -
accuracy: 1.0000
Epoch 131/500
1/1 [==============================] - 0s 13ms/step - loss: 3.1231e-04 -
accuracy: 1.0000
Epoch 132/500
1/1 [==============================] - 0s 11ms/step - loss: 5.3596e-04 -
accuracy: 1.0000
Epoch 133/500
1/1 [==============================] - 0s 12ms/step - loss: 4.1711e-04 -
accuracy: 1.0000
Epoch 134/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0024 - accuracy:
1.0000
Epoch 135/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0585 - accuracy:
0.9474
Epoch 136/500
1/1 [==============================] - 0s 12ms/step - loss: 8.2224e-05 -
accuracy: 1.0000
Epoch 137/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0073 - accuracy:
1.0000
Epoch 138/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0035 - accuracy:
1.0000
Epoch 139/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0020 - accuracy:
1.0000
Epoch 140/500
1/1 [==============================] - 0s 9ms/step - loss: 1.5986e-05 -
accuracy: 1.0000
Epoch 141/500
1/1 [==============================] - 0s 12ms/step - loss: 5.6657e-05 -
accuracy: 1.0000
Epoch 142/500
1/1 [==============================] - 0s 17ms/step - loss: 2.7055e-04 -
accuracy: 1.0000
Epoch 143/500
1/1 [==============================] - 0s 12ms/step - loss: 3.3329e-04 -
accuracy: 1.0000
Epoch 144/500
1/1 [==============================] - 0s 9ms/step - loss: 6.5242e-04 -
accuracy: 1.0000
Epoch 145/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0168 - accuracy:
1.0000
Epoch 146/500
1/1 [==============================] - 0s 9ms/step - loss: 1.7040e-04 -
accuracy: 1.0000
Epoch 147/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0057 - accuracy:
1.0000
Epoch 148/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0063 - accuracy:
1.0000
Epoch 149/500
1/1 [==============================] - 0s 10ms/step - loss: 2.9529e-04 -
accuracy: 1.0000
Epoch 150/500
1/1 [==============================] - 0s 10ms/step - loss: 5.2132e-05 -
accuracy: 1.0000
Epoch 151/500
1/1 [==============================] - 0s 9ms/step - loss: 0.2337 - accuracy:
0.9474
Epoch 152/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0032 - accuracy:
1.0000
Epoch 153/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0012 - accuracy:
1.0000
Epoch 154/500
1/1 [==============================] - 0s 13ms/step - loss: 6.4493e-05 -
accuracy: 1.0000
Epoch 155/500
1/1 [==============================] - 0s 12ms/step - loss: 4.8396e-04 -
accuracy: 1.0000
Epoch 156/500
1/1 [==============================] - 0s 11ms/step - loss: 8.6659e-04 -
accuracy: 1.0000
Epoch 157/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0028 - accuracy:
1.0000
Epoch 158/500
1/1 [==============================] - 0s 11ms/step - loss: 2.3377e-04 -
accuracy: 1.0000
Epoch 159/500
1/1 [==============================] - 0s 9ms/step - loss: 6.7364e-04 -
accuracy: 1.0000
Epoch 160/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0013 - accuracy:
1.0000
Epoch 161/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0126 - accuracy:
1.0000
Epoch 162/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0092 - accuracy:
1.0000
Epoch 163/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0307 - accuracy:
1.0000
Epoch 164/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0095 - accuracy:
1.0000
Epoch 165/500
1/1 [==============================] - 0s 10ms/step - loss: 7.1895e-04 -
accuracy: 1.0000
Epoch 166/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0030 - accuracy:
1.0000
Epoch 167/500
1/1 [==============================] - 0s 11ms/step - loss: 2.6939e-05 -
accuracy: 1.0000
Epoch 168/500
1/1 [==============================] - 0s 15ms/step - loss: 2.7585e-04 -
accuracy: 1.0000
Epoch 169/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0054 - accuracy:
1.0000
Epoch 170/500
1/1 [==============================] - 0s 12ms/step - loss: 4.6489e-04 -
accuracy: 1.0000
Epoch 171/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0032 - accuracy:
1.0000
Epoch 172/500
1/1 [==============================] - 0s 9ms/step - loss: 7.9278e-04 -
accuracy: 1.0000
Epoch 173/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0197 - accuracy:
1.0000
Epoch 174/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0320 - accuracy:
1.0000
Epoch 175/500
1/1 [==============================] - 0s 11ms/step - loss: 6.2070e-04 -
accuracy: 1.0000
Epoch 176/500
1/1 [==============================] - 0s 12ms/step - loss: 5.3930e-04 -
accuracy: 1.0000
Epoch 177/500
1/1 [==============================] - 0s 10ms/step - loss: 4.6545e-04 -
accuracy: 1.0000
Epoch 178/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0054 - accuracy:
1.0000
Epoch 179/500
1/1 [==============================] - 0s 11ms/step - loss: 3.6334e-04 -
accuracy: 1.0000
Epoch 180/500
1/1 [==============================] - 0s 11ms/step - loss: 3.0922e-04 -
accuracy: 1.0000
Epoch 181/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0013 - accuracy:
1.0000
Epoch 182/500
1/1 [==============================] - 0s 10ms/step - loss: 4.6118e-05 -
accuracy: 1.0000
Epoch 183/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0312 - accuracy:
1.0000
Epoch 184/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0015 - accuracy:
1.0000
Epoch 185/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0021 - accuracy:
1.0000
Epoch 186/500
1/1 [==============================] - 0s 11ms/step - loss: 6.2088e-05 -
accuracy: 1.0000
Epoch 187/500
1/1 [==============================] - 0s 12ms/step - loss: 1.7385e-05 -
accuracy: 1.0000
Epoch 188/500
1/1 [==============================] - 0s 11ms/step - loss: 5.8248e-04 -
accuracy: 1.0000
Epoch 189/500
1/1 [==============================] - 0s 9ms/step - loss: 1.8288e-05 -
accuracy: 1.0000
Epoch 190/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0032 - accuracy:
1.0000
Epoch 191/500
1/1 [==============================] - 0s 9ms/step - loss: 5.0561e-04 -
accuracy: 1.0000
Epoch 192/500
1/1 [==============================] - 0s 11ms/step - loss: 6.3544e-04 -
accuracy: 1.0000
Epoch 193/500
1/1 [==============================] - 0s 10ms/step - loss: 1.0818e-04 -
accuracy: 1.0000
Epoch 194/500
1/1 [==============================] - 0s 12ms/step - loss: 2.1622e-04 -
accuracy: 1.0000
Epoch 195/500
1/1 [==============================] - 0s 14ms/step - loss: 4.7941e-05 -
accuracy: 1.0000
Epoch 196/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0158 - accuracy:
1.0000
Epoch 197/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0021 - accuracy:
1.0000
Epoch 198/500
1/1 [==============================] - 0s 12ms/step - loss: 8.4342e-05 -
accuracy: 1.0000
Epoch 199/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0093 - accuracy:
1.0000
Epoch 200/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0179 - accuracy:
1.0000
Epoch 201/500
1/1 [==============================] - 0s 8ms/step - loss: 1.8732e-04 -
accuracy: 1.0000
Epoch 202/500
1/1 [==============================] - 0s 10ms/step - loss: 3.2656e-05 -
accuracy: 1.0000
Epoch 203/500
1/1 [==============================] - 0s 8ms/step - loss: 2.0093e-04 -
accuracy: 1.0000
Epoch 204/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0204 - accuracy:
1.0000
Epoch 205/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0038 - accuracy:
1.0000
Epoch 206/500
1/1 [==============================] - 0s 9ms/step - loss: 1.6218e-05 -
accuracy: 1.0000
Epoch 207/500
1/1 [==============================] - 0s 8ms/step - loss: 3.4038e-04 -
accuracy: 1.0000
Epoch 208/500
1/1 [==============================] - 0s 9ms/step - loss: 1.6801e-05 -
accuracy: 1.0000
Epoch 209/500
1/1 [==============================] - 0s 11ms/step - loss: 9.4768e-05 -
accuracy: 1.0000
Epoch 210/500
1/1 [==============================] - 0s 11ms/step - loss: 5.8913e-06 -
accuracy: 1.0000
Epoch 211/500
1/1 [==============================] - 0s 11ms/step - loss: 3.2545e-05 -
accuracy: 1.0000
Epoch 212/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0012 - accuracy:
1.0000
Epoch 213/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0152 - accuracy:
1.0000
Epoch 214/500
1/1 [==============================] - 0s 13ms/step - loss: 1.8364e-05 -
accuracy: 1.0000
Epoch 215/500
1/1 [==============================] - 0s 11ms/step - loss: 3.7995e-04 -
accuracy: 1.0000
Epoch 216/500
1/1 [==============================] - 0s 12ms/step - loss: 6.6881e-06 -
accuracy: 1.0000
Epoch 217/500
1/1 [==============================] - 0s 10ms/step - loss: 2.2759e-04 -
accuracy: 1.0000
Epoch 218/500
1/1 [==============================] - 0s 10ms/step - loss: 5.4537e-04 -
accuracy: 1.0000
Epoch 219/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0029 - accuracy:
1.0000
Epoch 220/500
1/1 [==============================] - 0s 9ms/step - loss: 5.9881e-05 -
accuracy: 1.0000
Epoch 221/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0024 - accuracy:
1.0000
Epoch 222/500
1/1 [==============================] - 0s 13ms/step - loss: 0.1154 - accuracy:
0.9474
Epoch 223/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0015 - accuracy:
1.0000
Epoch 224/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0072 - accuracy:
1.0000
Epoch 225/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0023 - accuracy:
1.0000
Epoch 226/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0387 - accuracy:
1.0000
Epoch 227/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0429 - accuracy:
0.9474
Epoch 228/500
1/1 [==============================] - 0s 16ms/step - loss: 8.9267e-05 -
accuracy: 1.0000
Epoch 229/500
1/1 [==============================] - 0s 14ms/step - loss: 3.8157e-04 -
accuracy: 1.0000
Epoch 230/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0014 - accuracy:
1.0000
Epoch 231/500
1/1 [==============================] - 0s 13ms/step - loss: 1.3584e-04 -
accuracy: 1.0000
Epoch 232/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0013 - accuracy:
1.0000
Epoch 233/500
1/1 [==============================] - 0s 10ms/step - loss: 2.5206e-05 -
accuracy: 1.0000
Epoch 234/500
1/1 [==============================] - 0s 11ms/step - loss: 3.8300e-05 -
accuracy: 1.0000
Epoch 235/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0209 - accuracy:
1.0000
Epoch 236/500
1/1 [==============================] - 0s 11ms/step - loss: 3.3711e-05 -
accuracy: 1.0000
Epoch 237/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0597 - accuracy:
0.9474
Epoch 238/500
1/1 [==============================] - 0s 11ms/step - loss: 9.5675e-06 -
accuracy: 1.0000
Epoch 239/500
1/1 [==============================] - 0s 11ms/step - loss: 2.0777e-04 -
accuracy: 1.0000
Epoch 240/500
1/1 [==============================] - 0s 12ms/step - loss: 7.7991e-04 -
accuracy: 1.0000
Epoch 241/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0065 - accuracy:
1.0000
Epoch 242/500
1/1 [==============================] - 0s 11ms/step - loss: 1.9212e-04 -
accuracy: 1.0000
Epoch 243/500
1/1 [==============================] - 0s 10ms/step - loss: 2.8262e-05 -
accuracy: 1.0000
Epoch 244/500
1/1 [==============================] - 0s 10ms/step - loss: 3.5018e-04 -
accuracy: 1.0000
Epoch 245/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0031 - accuracy:
1.0000
Epoch 246/500
1/1 [==============================] - 0s 8ms/step - loss: 1.6251e-04 -
accuracy: 1.0000
Epoch 247/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0043 - accuracy:
1.0000
Epoch 248/500
1/1 [==============================] - 0s 10ms/step - loss: 6.1612e-05 -
accuracy: 1.0000
Epoch 249/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0564 - accuracy:
0.9474
Epoch 250/500
1/1 [==============================] - 0s 10ms/step - loss: 3.1480e-04 -
accuracy: 1.0000
Epoch 251/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0058 - accuracy:
1.0000
Epoch 252/500
1/1 [==============================] - 0s 9ms/step - loss: 3.8275e-04 -
accuracy: 1.0000
Epoch 253/500
1/1 [==============================] - 0s 10ms/step - loss: 2.3889e-05 -
accuracy: 1.0000
Epoch 254/500
1/1 [==============================] - 0s 9ms/step - loss: 8.0083e-05 -
accuracy: 1.0000
Epoch 255/500
1/1 [==============================] - 0s 11ms/step - loss: 6.5539e-04 -
accuracy: 1.0000
Epoch 256/500
1/1 [==============================] - 0s 11ms/step - loss: 1.4717e-04 -
accuracy: 1.0000
Epoch 257/500
1/1 [==============================] - 0s 15ms/step - loss: 3.4407e-04 -
accuracy: 1.0000
Epoch 258/500
1/1 [==============================] - 0s 10ms/step - loss: 1.9701e-04 -
accuracy: 1.0000
Epoch 259/500
1/1 [==============================] - 0s 9ms/step - loss: 4.8871e-04 -
accuracy: 1.0000
Epoch 260/500
1/1 [==============================] - 0s 11ms/step - loss: 1.1646e-04 -
accuracy: 1.0000
Epoch 261/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0807 - accuracy:
0.9474
Epoch 262/500
1/1 [==============================] - 0s 12ms/step - loss: 3.4960e-05 -
accuracy: 1.0000
Epoch 263/500
1/1 [==============================] - 0s 14ms/step - loss: 6.2992e-06 -
accuracy: 1.0000
Epoch 264/500
1/1 [==============================] - 0s 16ms/step - loss: 3.1913e-04 -
accuracy: 1.0000
Epoch 265/500
1/1 [==============================] - 0s 20ms/step - loss: 6.9100e-05 -
accuracy: 1.0000
Epoch 266/500
1/1 [==============================] - 0s 16ms/step - loss: 6.6240e-05 -
accuracy: 1.0000
Epoch 267/500
1/1 [==============================] - 0s 14ms/step - loss: 3.1856e-05 -
accuracy: 1.0000
Epoch 268/500
1/1 [==============================] - 0s 15ms/step - loss: 1.5767e-04 -
accuracy: 1.0000
Epoch 269/500
1/1 [==============================] - 0s 11ms/step - loss: 5.0470e-05 -
accuracy: 1.0000
Epoch 270/500
1/1 [==============================] - 0s 11ms/step - loss: 6.1674e-06 -
accuracy: 1.0000
Epoch 271/500
1/1 [==============================] - 0s 44ms/step - loss: 1.7310e-04 -
accuracy: 1.0000
Epoch 272/500
1/1 [==============================] - 0s 12ms/step - loss: 0.0041 - accuracy:
1.0000
Epoch 273/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0106 - accuracy:
1.0000
Epoch 274/500
1/1 [==============================] - 0s 12ms/step - loss: 2.5710e-04 -
accuracy: 1.0000
Epoch 275/500
1/1 [==============================] - 0s 10ms/step - loss: 1.6078e-04 -
accuracy: 1.0000
Epoch 276/500
1/1 [==============================] - 0s 10ms/step - loss: 5.0694e-06 -
accuracy: 1.0000
Epoch 277/500
1/1 [==============================] - 0s 12ms/step - loss: 9.7674e-05 -
accuracy: 1.0000
Epoch 278/500
1/1 [==============================] - 0s 9ms/step - loss: 1.4605e-05 -
accuracy: 1.0000
Epoch 279/500
1/1 [==============================] - 0s 10ms/step - loss: 3.2632e-05 -
accuracy: 1.0000
Epoch 280/500
1/1 [==============================] - 0s 14ms/step - loss: 1.7308e-04 -
accuracy: 1.0000
Epoch 281/500
1/1 [==============================] - 0s 18ms/step - loss: 3.1057e-06 -
accuracy: 1.0000
Epoch 282/500
1/1 [==============================] - 0s 11ms/step - loss: 1.8918e-04 -
accuracy: 1.0000
Epoch 283/500
1/1 [==============================] - 0s 11ms/step - loss: 1.9737e-05 -
accuracy: 1.0000
Epoch 284/500
1/1 [==============================] - 0s 11ms/step - loss: 2.3026e-06 -
accuracy: 1.0000
Epoch 285/500
1/1 [==============================] - 0s 13ms/step - loss: 0.0018 - accuracy:
1.0000
Epoch 286/500
1/1 [==============================] - 0s 11ms/step - loss: 8.3446e-07 -
accuracy: 1.0000
Epoch 287/500
1/1 [==============================] - 0s 9ms/step - loss: 2.9275e-05 -
accuracy: 1.0000
Epoch 288/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0023 - accuracy:
1.0000
Epoch 289/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0016 - accuracy:
1.0000
Epoch 290/500
1/1 [==============================] - 0s 9ms/step - loss: 3.1480e-05 -
accuracy: 1.0000
Epoch 291/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0023 - accuracy:
1.0000
Epoch 292/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0019 - accuracy:
1.0000
Epoch 293/500
1/1 [==============================] - 0s 11ms/step - loss: 2.1900e-05 -
accuracy: 1.0000
Epoch 294/500
1/1 [==============================] - 0s 8ms/step - loss: 0.0011 - accuracy:
1.0000
Epoch 295/500
1/1 [==============================] - 0s 15ms/step - loss: 1.9760e-05 -
accuracy: 1.0000
Epoch 296/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0013 - accuracy:
1.0000
Epoch 297/500
1/1 [==============================] - 0s 8ms/step - loss: 3.7723e-05 -
accuracy: 1.0000
Epoch 298/500
1/1 [==============================] - 0s 9ms/step - loss: 1.0127e-04 -
accuracy: 1.0000
Epoch 299/500
1/1 [==============================] - 0s 12ms/step - loss: 2.9489e-07 -
accuracy: 1.0000
Epoch 300/500
1/1 [==============================] - 0s 10ms/step - loss: 2.8904e-05 -
accuracy: 1.0000
Epoch 301/500
1/1 [==============================] - 0s 11ms/step - loss: 4.9501e-06 -
accuracy: 1.0000
Epoch 302/500
1/1 [==============================] - 0s 10ms/step - loss: 2.6226e-06 -
accuracy: 1.0000
Epoch 303/500
1/1 [==============================] - 0s 10ms/step - loss: 7.2795e-05 -
accuracy: 1.0000
Epoch 304/500
1/1 [==============================] - 0s 9ms/step - loss: 2.0046e-04 -
accuracy: 1.0000
Epoch 305/500
1/1 [==============================] - 0s 9ms/step - loss: 3.7399e-04 -
accuracy: 1.0000
Epoch 306/500
1/1 [==============================] - 0s 10ms/step - loss: 1.0269e-04 -
accuracy: 1.0000
Epoch 307/500
1/1 [==============================] - 0s 11ms/step - loss: 1.0300e-04 -
accuracy: 1.0000
Epoch 308/500
1/1 [==============================] - 0s 9ms/step - loss: 7.2716e-06 -
accuracy: 1.0000
Epoch 309/500
1/1 [==============================] - 0s 12ms/step - loss: 1.9073e-06 -
accuracy: 1.0000
Epoch 310/500
1/1 [==============================] - 0s 11ms/step - loss: 1.8840e-04 -
accuracy: 1.0000
Epoch 311/500
1/1 [==============================] - 0s 13ms/step - loss: 9.9319e-05 -
accuracy: 1.0000
Epoch 312/500
1/1 [==============================] - 0s 12ms/step - loss: 4.4107e-06 -
accuracy: 1.0000
Epoch 313/500
1/1 [==============================] - 0s 10ms/step - loss: 6.3743e-06 -
accuracy: 1.0000
Epoch 314/500
1/1 [==============================] - 0s 8ms/step - loss: 2.9788e-04 -
accuracy: 1.0000
Epoch 315/500
1/1 [==============================] - 0s 9ms/step - loss: 2.1991e-04 -
accuracy: 1.0000
Epoch 316/500
1/1 [==============================] - 0s 9ms/step - loss: 2.5473e-06 -
accuracy: 1.0000
Epoch 317/500
1/1 [==============================] - 0s 11ms/step - loss: 5.1892e-05 -
accuracy: 1.0000
Epoch 318/500
1/1 [==============================] - 0s 10ms/step - loss: 3.7456e-06 -
accuracy: 1.0000
Epoch 319/500
1/1 [==============================] - 0s 8ms/step - loss: 5.4585e-07 -
accuracy: 1.0000
Epoch 320/500
1/1 [==============================] - 0s 10ms/step - loss: 2.8085e-05 -
accuracy: 1.0000
Epoch 321/500
1/1 [==============================] - 0s 9ms/step - loss: 6.6378e-06 -
accuracy: 1.0000
Epoch 322/500
1/1 [==============================] - 0s 10ms/step - loss: 1.0706e-04 -
accuracy: 1.0000
Epoch 323/500
1/1 [==============================] - 0s 10ms/step - loss: 5.0067e-06 -
accuracy: 1.0000
Epoch 324/500
1/1 [==============================] - 0s 10ms/step - loss: 8.0243e-06 -
accuracy: 1.0000
Epoch 325/500
1/1 [==============================] - 0s 9ms/step - loss: 5.2388e-06 -
accuracy: 1.0000
Epoch 326/500
1/1 [==============================] - 0s 8ms/step - loss: 4.0314e-04 -
accuracy: 1.0000
Epoch 327/500
1/1 [==============================] - 0s 10ms/step - loss: 2.8902e-04 -
accuracy: 1.0000
Epoch 328/500
1/1 [==============================] - 0s 13ms/step - loss: 1.6940e-07 -
accuracy: 1.0000
Epoch 329/500
1/1 [==============================] - 0s 15ms/step - loss: 6.2417e-05 -
accuracy: 1.0000
Epoch 330/500
1/1 [==============================] - 0s 9ms/step - loss: 1.9468e-05 -
accuracy: 1.0000
Epoch 331/500
1/1 [==============================] - 0s 11ms/step - loss: 2.5178e-04 -
accuracy: 1.0000
Epoch 332/500
1/1 [==============================] - 0s 8ms/step - loss: 1.8070e-06 -
accuracy: 1.0000
Epoch 333/500
1/1 [==============================] - 0s 9ms/step - loss: 1.7704e-05 -
accuracy: 1.0000
Epoch 334/500
1/1 [==============================] - 0s 10ms/step - loss: 1.7096e-05 -
accuracy: 1.0000
Epoch 335/500
1/1 [==============================] - 0s 9ms/step - loss: 2.0578e-05 -
accuracy: 1.0000
Epoch 336/500
1/1 [==============================] - 0s 9ms/step - loss: 1.5027e-04 -
accuracy: 1.0000
Epoch 337/500
1/1 [==============================] - 0s 9ms/step - loss: 5.2450e-06 -
accuracy: 1.0000
Epoch 338/500
1/1 [==============================] - 0s 9ms/step - loss: 1.1001e-04 -
accuracy: 1.0000
Epoch 339/500
1/1 [==============================] - 0s 7ms/step - loss: 8.6551e-04 -
accuracy: 1.0000
Epoch 340/500
1/1 [==============================] - 0s 9ms/step - loss: 3.4194e-06 -
accuracy: 1.0000
Epoch 341/500
1/1 [==============================] - 0s 10ms/step - loss: 7.5727e-06 -
accuracy: 1.0000
Epoch 342/500
1/1 [==============================] - 0s 9ms/step - loss: 2.4469e-07 -
accuracy: 1.0000
Epoch 343/500
1/1 [==============================] - 0s 10ms/step - loss: 9.7807e-06 -
accuracy: 1.0000
Epoch 344/500
1/1 [==============================] - 0s 11ms/step - loss: 5.3454e-06 -
accuracy: 1.0000
Epoch 345/500
1/1 [==============================] - 0s 9ms/step - loss: 2.6163e-06 -
accuracy: 1.0000
Epoch 346/500
1/1 [==============================] - 0s 10ms/step - loss: 3.5449e-06 -
accuracy: 1.0000
Epoch 347/500
1/1 [==============================] - 0s 8ms/step - loss: 6.9263e-06 -
accuracy: 1.0000
Epoch 348/500
1/1 [==============================] - 0s 11ms/step - loss: 1.8726e-04 -
accuracy: 1.0000
Epoch 349/500
1/1 [==============================] - 0s 9ms/step - loss: 5.5441e-04 -
accuracy: 1.0000
Epoch 350/500
1/1 [==============================] - 0s 10ms/step - loss: 5.8662e-06 -
accuracy: 1.0000
Epoch 351/500
1/1 [==============================] - 0s 9ms/step - loss: 2.2033e-05 -
accuracy: 1.0000
Epoch 352/500
1/1 [==============================] - 0s 9ms/step - loss: 1.6782e-05 -
accuracy: 1.0000
Epoch 353/500
1/1 [==============================] - 0s 10ms/step - loss: 1.1280e-05 -
accuracy: 1.0000
Epoch 354/500
1/1 [==============================] - 0s 10ms/step - loss: 3.8055e-05 -
accuracy: 1.0000
Epoch 355/500
1/1 [==============================] - 0s 9ms/step - loss: 9.1603e-07 -
accuracy: 1.0000
Epoch 356/500
1/1 [==============================] - 0s 9ms/step - loss: 1.9628e-04 -
accuracy: 1.0000
Epoch 357/500
1/1 [==============================] - 0s 12ms/step - loss: 1.7128e-06 -
accuracy: 1.0000
Epoch 358/500
1/1 [==============================] - 0s 12ms/step - loss: 3.6260e-05 -
accuracy: 1.0000
Epoch 359/500
1/1 [==============================] - 0s 13ms/step - loss: 2.8547e-06 -
accuracy: 1.0000
Epoch 360/500
1/1 [==============================] - 0s 10ms/step - loss: 2.6469e-05 -
accuracy: 1.0000
Epoch 361/500
1/1 [==============================] - 0s 11ms/step - loss: 2.2984e-05 -
accuracy: 1.0000
Epoch 362/500
1/1 [==============================] - 0s 10ms/step - loss: 5.0127e-04 -
accuracy: 1.0000
Epoch 363/500
1/1 [==============================] - 0s 10ms/step - loss: 8.7347e-05 -
accuracy: 1.0000
Epoch 364/500
1/1 [==============================] - 0s 10ms/step - loss: 7.4095e-05 -
accuracy: 1.0000
Epoch 365/500
1/1 [==============================] - 0s 11ms/step - loss: 2.8218e-04 -
accuracy: 1.0000
Epoch 366/500
1/1 [==============================] - 0s 9ms/step - loss: 1.2548e-07 -
accuracy: 1.0000
Epoch 367/500
1/1 [==============================] - 0s 10ms/step - loss: 1.6782e-04 -
accuracy: 1.0000
Epoch 368/500
1/1 [==============================] - 0s 9ms/step - loss: 1.9450e-07 -
accuracy: 1.0000
Epoch 369/500
1/1 [==============================] - 0s 10ms/step - loss: 1.8823e-08 -
accuracy: 1.0000
Epoch 370/500
1/1 [==============================] - 0s 10ms/step - loss: 2.7983e-06 -
accuracy: 1.0000
Epoch 371/500
1/1 [==============================] - 0s 10ms/step - loss: 2.9993e-04 -
accuracy: 1.0000
Epoch 372/500
1/1 [==============================] - 0s 8ms/step - loss: 1.8697e-06 -
accuracy: 1.0000
Epoch 373/500
1/1 [==============================] - 0s 11ms/step - loss: 9.8374e-06 -
accuracy: 1.0000
Epoch 374/500
1/1 [==============================] - 0s 10ms/step - loss: 2.2006e-05 -
accuracy: 1.0000
Epoch 375/500
1/1 [==============================] - 0s 10ms/step - loss: 1.4743e-05 -
accuracy: 1.0000
Epoch 376/500
1/1 [==============================] - 0s 11ms/step - loss: 1.0029e-04 -
accuracy: 1.0000
Epoch 377/500
1/1 [==============================] - 0s 9ms/step - loss: 1.2673e-05 -
accuracy: 1.0000
Epoch 378/500
1/1 [==============================] - 0s 9ms/step - loss: 3.6954e-06 -
accuracy: 1.0000
Epoch 379/500
1/1 [==============================] - 0s 9ms/step - loss: 2.0579e-06 -
accuracy: 1.0000
Epoch 380/500
1/1 [==============================] - 0s 9ms/step - loss: 6.5248e-06 -
accuracy: 1.0000
Epoch 381/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0870 - accuracy:
0.9474
Epoch 382/500
1/1 [==============================] - 0s 10ms/step - loss: 4.0405e-06 -
accuracy: 1.0000
Epoch 383/500
1/1 [==============================] - 0s 10ms/step - loss: 7.7285e-05 -
accuracy: 1.0000
Epoch 384/500
1/1 [==============================] - 0s 11ms/step - loss: 3.3136e-05 -
accuracy: 1.0000
Epoch 385/500
1/1 [==============================] - 0s 8ms/step - loss: 0.0018 - accuracy:
1.0000
Epoch 386/500
1/1 [==============================] - 0s 8ms/step - loss: 2.7325e-04 -
accuracy: 1.0000
Epoch 387/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0626 - accuracy:
0.9474
Epoch 388/500
1/1 [==============================] - 0s 11ms/step - loss: 4.0695e-05 -
accuracy: 1.0000
Epoch 389/500
1/1 [==============================] - 0s 12ms/step - loss: 3.6763e-04 -
accuracy: 1.0000
Epoch 390/500
1/1 [==============================] - 0s 9ms/step - loss: 4.5584e-05 -
accuracy: 1.0000
Epoch 391/500
1/1 [==============================] - 0s 8ms/step - loss: 0.0100 - accuracy:
1.0000
Epoch 392/500
1/1 [==============================] - 0s 8ms/step - loss: 0.1189 - accuracy:
0.9474
Epoch 393/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0152 - accuracy:
1.0000
Epoch 394/500
1/1 [==============================] - 0s 11ms/step - loss: 9.2274e-05 -
accuracy: 1.0000
Epoch 395/500
1/1 [==============================] - 0s 8ms/step - loss: 7.9994e-06 -
accuracy: 1.0000
Epoch 396/500
1/1 [==============================] - 0s 9ms/step - loss: 2.8484e-06 -
accuracy: 1.0000
Epoch 397/500
1/1 [==============================] - 0s 10ms/step - loss: 1.9561e-05 -
accuracy: 1.0000
Epoch 398/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0076 - accuracy:
1.0000
Epoch 399/500
1/1 [==============================] - 0s 10ms/step - loss: 5.4835e-06 -
accuracy: 1.0000
Epoch 400/500

1/1 [==============================] - 0s 10ms/step - loss: 5.4271e-06 -
accuracy: 1.0000
Epoch 401/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0018 - accuracy:
1.0000
Epoch 402/500
1/1 [==============================] - 0s 14ms/step - loss: 1.3206e-04 -
accuracy: 1.0000
Epoch 403/500
1/1 [==============================] - 0s 10ms/step - loss: 2.9400e-05 -
accuracy: 1.0000
Epoch 404/500
1/1 [==============================] - 0s 14ms/step - loss: 3.0425e-05 -
accuracy: 1.0000
Epoch 405/500
1/1 [==============================] - 0s 13ms/step - loss: 1.4912e-05 -
accuracy: 1.0000
Epoch 406/500
1/1 [==============================] - 0s 10ms/step - loss: 9.6060e-05 -
accuracy: 1.0000
Epoch 407/500
1/1 [==============================] - 0s 8ms/step - loss: 2.2446e-05 -
accuracy: 1.0000
Epoch 408/500
1/1 [==============================] - 0s 9ms/step - loss: 3.7830e-04 -
accuracy: 1.0000
Epoch 409/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0074 - accuracy:
1.0000
Epoch 410/500
1/1 [==============================] - 0s 9ms/step - loss: 7.4603e-04 -
accuracy: 1.0000
Epoch 411/500
1/1 [==============================] - 0s 8ms/step - loss: 2.0290e-04 -
accuracy: 1.0000
Epoch 412/500
1/1 [==============================] - 0s 10ms/step - loss: 1.2698e-05 -
accuracy: 1.0000
Epoch 413/500
1/1 [==============================] - 0s 23ms/step - loss: 6.4726e-05 -
accuracy: 1.0000
Epoch 414/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0016 - accuracy:
1.0000
Epoch 415/500
1/1 [==============================] - 0s 10ms/step - loss: 5.2263e-06 -
accuracy: 1.0000
Epoch 416/500

1/1 [==============================] - 0s 9ms/step - loss: 1.9793e-05 -
accuracy: 1.0000
Epoch 417/500
1/1 [==============================] - 0s 9ms/step - loss: 5.2322e-04 -
accuracy: 1.0000
Epoch 418/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0016 - accuracy:
1.0000
Epoch 419/500
1/1 [==============================] - 0s 13ms/step - loss: 6.3905e-05 -
accuracy: 1.0000
Epoch 420/500
1/1 [==============================] - 0s 13ms/step - loss: 1.8964e-04 -
accuracy: 1.0000
Epoch 421/500
1/1 [==============================] - 0s 9ms/step - loss: 5.8702e-05 -
accuracy: 1.0000
Epoch 422/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0022 - accuracy:
1.0000
Epoch 423/500
1/1 [==============================] - 0s 10ms/step - loss: 2.8230e-05 -
accuracy: 1.0000
Epoch 424/500
1/1 [==============================] - 0s 11ms/step - loss: 1.1073e-05 -
accuracy: 1.0000
Epoch 425/500
1/1 [==============================] - 0s 9ms/step - loss: 3.3325e-05 -
accuracy: 1.0000
Epoch 426/500
1/1 [==============================] - 0s 10ms/step - loss: 1.2642e-04 -
accuracy: 1.0000
Epoch 427/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0371 - accuracy:
0.9474
Epoch 428/500
1/1 [==============================] - 0s 10ms/step - loss: 1.4172e-05 -
accuracy: 1.0000
Epoch 429/500
1/1 [==============================] - 0s 11ms/step - loss: 1.7379e-06 -
accuracy: 1.0000
Epoch 430/500
1/1 [==============================] - 0s 11ms/step - loss: 1.2146e-05 -
accuracy: 1.0000
Epoch 431/500
1/1 [==============================] - 0s 10ms/step - loss: 9.1977e-06 -
accuracy: 1.0000
Epoch 432/500

1/1 [==============================] - 0s 10ms/step - loss: 0.0325 - accuracy:
1.0000
Epoch 433/500
1/1 [==============================] - 0s 10ms/step - loss: 7.7546e-05 -
accuracy: 1.0000
Epoch 434/500
1/1 [==============================] - 0s 13ms/step - loss: 8.0305e-06 -
accuracy: 1.0000
Epoch 435/500
1/1 [==============================] - 0s 11ms/step - loss: 9.2664e-06 -
accuracy: 1.0000
Epoch 436/500
1/1 [==============================] - 0s 9ms/step - loss: 0.0031 - accuracy:
1.0000
Epoch 437/500
1/1 [==============================] - 0s 8ms/step - loss: 0.0018 - accuracy:
1.0000
Epoch 438/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0025 - accuracy:
1.0000
Epoch 439/500
1/1 [==============================] - 0s 12ms/step - loss: 0.1294 - accuracy:
0.9474
Epoch 440/500
1/1 [==============================] - 0s 10ms/step - loss: 3.3758e-05 -
accuracy: 1.0000
Epoch 441/500
1/1 [==============================] - 0s 11ms/step - loss: 1.0882e-04 -
accuracy: 1.0000
Epoch 442/500
1/1 [==============================] - 0s 9ms/step - loss: 7.5290e-07 -
accuracy: 1.0000
Epoch 443/500
1/1 [==============================] - 0s 9ms/step - loss: 1.5246e-04 -
accuracy: 1.0000
Epoch 444/500
1/1 [==============================] - 0s 10ms/step - loss: 3.0154e-05 -
accuracy: 1.0000
Epoch 445/500
1/1 [==============================] - 0s 11ms/step - loss: 1.4870e-06 -
accuracy: 1.0000
Epoch 446/500
1/1 [==============================] - 0s 10ms/step - loss: 4.3103e-06 -
accuracy: 1.0000
Epoch 447/500
1/1 [==============================] - 0s 10ms/step - loss: 0.0471 - accuracy:
0.9474
Epoch 448/500

1/1 [==============================] - 0s 12ms/step - loss: 7.9146e-04 -
accuracy: 1.0000
Epoch 449/500
1/1 [==============================] - 0s 14ms/step - loss: 0.0026 - accuracy:
1.0000
Epoch 450/500
1/1 [==============================] - 0s 11ms/step - loss: 9.3230e-06 -
accuracy: 1.0000
Epoch 451/500
1/1 [==============================] - 0s 10ms/step - loss: 5.2134e-05 -
accuracy: 1.0000
Epoch 452/500
1/1 [==============================] - 0s 11ms/step - loss: 7.6343e-05 -
accuracy: 1.0000
Epoch 453/500
1/1 [==============================] - 0s 10ms/step - loss: 0.1937 - accuracy:
0.9474
Epoch 454/500
1/1 [==============================] - 0s 12ms/step - loss: 2.2584e-04 -
accuracy: 1.0000
Epoch 455/500
1/1 [==============================] - 0s 10ms/step - loss: 1.7630e-06 -
accuracy: 1.0000
Epoch 456/500
1/1 [==============================] - 0s 9ms/step - loss: 5.8779e-05 -
accuracy: 1.0000
Epoch 457/500
1/1 [==============================] - 0s 10ms/step - loss: 2.5598e-06 -
accuracy: 1.0000
Epoch 458/500
1/1 [==============================] - 0s 9ms/step - loss: 1.8948e-06 -
accuracy: 1.0000
Epoch 459/500
1/1 [==============================] - 0s 11ms/step - loss: 5.1384e-06 -
accuracy: 1.0000
Epoch 460/500
1/1 [==============================] - 0s 11ms/step - loss: 7.2093e-04 -
accuracy: 1.0000
Epoch 461/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0069 - accuracy:
1.0000
Epoch 462/500
1/1 [==============================] - 0s 10ms/step - loss: 6.1361e-06 -
accuracy: 1.0000
Epoch 463/500
1/1 [==============================] - 0s 9ms/step - loss: 8.7838e-08 -
accuracy: 1.0000
Epoch 464/500

1/1 [==============================] - 0s 15ms/step - loss: 0.0016 - accuracy:
1.0000
Epoch 465/500
1/1 [==============================] - 0s 11ms/step - loss: 4.8959e-04 -
accuracy: 1.0000
Epoch 466/500
1/1 [==============================] - 0s 10ms/step - loss: 3.6390e-07 -
accuracy: 1.0000
Epoch 467/500
1/1 [==============================] - 0s 8ms/step - loss: 4.2162e-06 -
accuracy: 1.0000
Epoch 468/500
1/1 [==============================] - 0s 11ms/step - loss: 0.0013 - accuracy:
1.0000
Epoch 469/500
1/1 [==============================] - 0s 9ms/step - loss: 4.6051e-06 -
accuracy: 1.0000
Epoch 470/500
1/1 [==============================] - 0s 14ms/step - loss: 1.2401e-04 -
accuracy: 1.0000
Epoch 471/500
1/1 [==============================] - 0s 18ms/step - loss: 1.5433e-04 -
accuracy: 1.0000
Epoch 472/500
1/1 [==============================] - 0s 11ms/step - loss: 2.8924e-06 -
accuracy: 1.0000
Epoch 473/500
1/1 [==============================] - 0s 13ms/step - loss: 2.2319e-04 -
accuracy: 1.0000
Epoch 474/500
1/1 [==============================] - 0s 16ms/step - loss: 7.5286e-05 -
accuracy: 1.0000
Epoch 475/500
1/1 [==============================] - 0s 17ms/step - loss: 8.9651e-06 -
accuracy: 1.0000
Epoch 476/500
1/1 [==============================] - 0s 15ms/step - loss: 0.1130 - accuracy:
0.9474
Epoch 477/500
1/1 [==============================] - 0s 17ms/step - loss: 7.7172e-07 -
accuracy: 1.0000
Epoch 478/500
1/1 [==============================] - 0s 15ms/step - loss: 6.0859e-07 -
accuracy: 1.0000
Epoch 479/500
1/1 [==============================] - 0s 14ms/step - loss: 1.7271e-05 -
accuracy: 1.0000
Epoch 480/500

1/1 [==============================] - 0s 16ms/step - loss: 0.0015 - accuracy:
1.0000
Epoch 481/500
1/1 [==============================] - 0s 16ms/step - loss: 2.3790e-05 -
accuracy: 1.0000
Epoch 482/500
1/1 [==============================] - 0s 17ms/step - loss: 7.7923e-06 -
accuracy: 1.0000
Epoch 483/500
1/1 [==============================] - 0s 16ms/step - loss: 2.0920e-05 -
accuracy: 1.0000
Epoch 484/500
1/1 [==============================] - 0s 14ms/step - loss: 1.6965e-04 -
accuracy: 1.0000
Epoch 485/500
1/1 [==============================] - 0s 13ms/step - loss: 5.4584e-06 -
accuracy: 1.0000
Epoch 486/500
1/1 [==============================] - 0s 14ms/step - loss: 1.9463e-04 -
accuracy: 1.0000
Epoch 487/500
1/1 [==============================] - 0s 13ms/step - loss: 1.1452e-04 -
accuracy: 1.0000
Epoch 488/500
1/1 [==============================] - 0s 13ms/step - loss: 1.6752e-06 -
accuracy: 1.0000
Epoch 489/500
1/1 [==============================] - 0s 16ms/step - loss: 1.3176e-07 -
accuracy: 1.0000
Epoch 490/500
1/1 [==============================] - 0s 12ms/step - loss: 1.6349e-05 -
accuracy: 1.0000
Epoch 491/500
1/1 [==============================] - 0s 10ms/step - loss: 5.6237e-05 -
accuracy: 1.0000
Epoch 492/500
1/1 [==============================] - 0s 10ms/step - loss: 1.4253e-05 -
accuracy: 1.0000
Epoch 493/500
1/1 [==============================] - 0s 12ms/step - loss: 7.0946e-05 -
accuracy: 1.0000
Epoch 494/500
1/1 [==============================] - 0s 12ms/step - loss: 3.7489e-04 -
accuracy: 1.0000
Epoch 495/500
1/1 [==============================] - 0s 12ms/step - loss: 1.9612e-05 -
accuracy: 1.0000
Epoch 496/500

1/1 [==============================] - 0s 10ms/step - loss: 0.1280 - accuracy:
0.9474
Epoch 497/500
1/1 [==============================] - 0s 9ms/step - loss: 1.7322e-05 -
accuracy: 1.0000
Epoch 498/500
1/1 [==============================] - 0s 12ms/step - loss: 2.7015e-04 -
accuracy: 1.0000
Epoch 499/500
1/1 [==============================] - 0s 15ms/step - loss: 2.1618e-04 -
accuracy: 1.0000
Epoch 500/500
1/1 [==============================] - 0s 12ms/step - loss: 2.2602e-05 -
accuracy: 1.0000

[14]: <keras.src.callbacks.History at 0x21eba215df0>
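The per-epoch lines above follow a fixed format, so if the training log was captured as text (rather than read from the returned `History` object), the loss and accuracy values can be recovered for plotting with a small regex. This is a hedged sketch; in practice, reading `history.history["loss"]` from the `fit` return value is simpler, and the sample log string here is illustrative:

```python
import re

# Matches the trailing "loss: <x> - accuracy: <y>" part of a Keras progress line,
# including scientific notation like 1.4743e-05
LINE_RE = re.compile(r"loss: ([0-9.e+-]+) - accuracy: ([0-9.]+)")

def parse_history(log):
    """Extract (loss, accuracy) pairs from captured Keras progress-bar output."""
    return [(float(m.group(1)), float(m.group(2)))
            for m in LINE_RE.finditer(log)]

log = ("1/1 [====] - 0s 12ms/step - loss: 2.2602e-05 - accuracy: 1.0000\n"
       "1/1 [====] - 0s 9ms/step - loss: 0.1280 - accuracy: 0.9474\n")
print(parse_history(log))
```

The extracted pairs can then be fed directly to a plotting library to visualize the training curve.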

[15]: def clean_text(text):
          # Tokenize the sentence and reduce each token to its lemma
          tokens = nltk.word_tokenize(text)
          tokens = [lemmatizer.lemmatize(word) for word in tokens]
          return tokens

      def bag_of_words(text, vocab):
          # Binary bag-of-words vector: 1 where a vocabulary word appears
          tokens = clean_text(text)
          bow = [0] * len(vocab)
          for w in tokens:
              for idx, word in enumerate(vocab):
                  if word == w:
                      bow[idx] = 1
          return np.array(bow)
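`bag_of_words` above relies on the NLTK tokenizer and a `WordNetLemmatizer` defined in an earlier cell. The encoding itself is easy to see in isolation; the following is a self-contained sketch that substitutes plain whitespace tokenization for NLTK (so it runs without any downloads) and returns a list instead of a NumPy array:

```python
def bag_of_words_demo(text, vocab):
    # 1 where the vocabulary word occurs in the sentence, else 0
    tokens = text.lower().split()
    return [1 if word in tokens else 0 for word in vocab]

vocab = ["hello", "name", "your", "bye"]
print(bag_of_words_demo("what is your name", vocab))  # [0, 1, 1, 0]
```

Each input sentence thus becomes a fixed-length 0/1 vector over the vocabulary, which is what the dense network in the earlier cells consumes.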

[16]: def pred_class(text, vocab, labels):
          # Encode the sentence and keep every class above the confidence threshold
          bow = bag_of_words(text, vocab)
          result = model.predict(np.array([bow]))[0]
          thresh = 0.2
          y_pred = [[idx, res] for idx, res in enumerate(result) if res > thresh]

          # Most confident class first
          y_pred.sort(key=lambda x: x[1], reverse=True)

          return_list = []
          for r in y_pred:
              return_list.append(labels[r[0]])
          return return_list

      def get_response(intents_list, intents_json):
          # Return a random canned response for the top predicted tag
          tag = intents_list[0]
          list_of_intents = intents_json["intents"]
          for i in list_of_intents:
              if i["tag"] == tag:
                  result = random.choice(i["responses"])
                  break
          return result
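`get_response` assumes the structure of the `intents.json` file loaded earlier into `data`: a top-level `"intents"` list whose entries each carry a `"tag"` and a list of `"responses"`. The lookup can be exercised in isolation; the tag names and replies below are made up for illustration and are not from the project's actual intents file:

```python
import random

# Hypothetical intents data mirroring the expected JSON shape
intents_json = {"intents": [
    {"tag": "greeting", "responses": ["Hello!", "How are you doing?"]},
    {"tag": "goodbye",  "responses": ["See you later!"]},
]}

def get_response_demo(intents_list, intents_json):
    # Pick a random canned reply for the top-ranked predicted tag
    tag = intents_list[0]
    for intent in intents_json["intents"]:
        if intent["tag"] == tag:
            return random.choice(intent["responses"])

print(get_response_demo(["goodbye"], intents_json))  # See you later!
```

Because `random.choice` draws uniformly from the matching tag's responses, repeated identical inputs can produce different replies.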

Running the chatbot


[17]: while True:
          message = input("")
          intents = pred_class(message, words, classes)
          result = get_response(intents, data)
          print(result)

hello
1/1 [==============================] - 0s 67ms/step
How are you doing?
what is your name
1/1 [==============================] - 0s 16ms/step
I'm Kippi
goodbye
1/1 [==============================] - 0s 15ms/step
Howdy Partner!

---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
Input In [17], in <cell line: 2>()
      1 while True:
----> 2     message=input("")
      3     intents=pred_class(message, words, classes)
      4     result=get_response(intents,data)

File C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py:1075, in Kernel.raw_input(self, prompt)
   1071 if not self._allow_stdin:
   1072     raise StdinNotImplementedError(
   1073         "raw_input was called, but this frontend does not support input requests."
   1074     )
-> 1075 return self._input_request(
   1076     str(prompt),
   1077     self._parent_ident["shell"],
   1078     self.get_parent("shell"),
   1079     password=False,
   1080 )

File C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py:1120, in Kernel._input_request(self, prompt, ident, parent, password)
   1117     break
   1118 except KeyboardInterrupt:
   1119     # re-raise KeyboardInterrupt, to truncate traceback
-> 1120     raise KeyboardInterrupt("Interrupted by user") from None
   1121 except Exception:
   1122     self.log.warning("Invalid Message:", exc_info=True)

KeyboardInterrupt: Interrupted by user
