Problem with Accuracy

Hello,
For quite some time now, I have been trying my hand at the Pokemon classification problem using AlexNet. I have tried playing with the hyperparameters of the AlexNet code and increasing the number of epochs, but nothing seems to work: I still only get an accuracy of about 13% after 250 epochs. Please help me with this.
My code is given below

import keras
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, Flatten, Conv2D, MaxPooling2D
from keras.layers import BatchNormalization
import numpy as np
np.random.seed(1000)
model = Sequential()
model.add(Conv2D(filters=96, input_shape=(227,227,3), kernel_size=(11,11), strides=(4,4), padding='valid'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='valid'))
model.add(BatchNormalization())
model.add(Conv2D(filters=256, kernel_size=(11,11), strides=(1,1), padding='valid'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='valid'))
model.add(BatchNormalization())
model.add(Conv2D(filters=384, kernel_size=(3,3), strides=(1,1), padding='valid'))
model.add(Activation('relu'))
model.add(BatchNormalization())
model.add(Conv2D(filters=384, kernel_size=(3,3), strides=(1,1), padding='valid'))
model.add(Activation('relu'))
model.add(BatchNormalization())
model.add(Conv2D(filters=256, kernel_size=(3,3), strides=(1,1), padding='valid'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='valid'))
model.add(BatchNormalization())
model.add(Flatten())
model.add(Dense(4096, input_shape=(227*227*3,)))  # input_shape is redundant here since this is not the first layer
model.add(Activation('relu'))
model.add(Dropout(0.4))
model.add(BatchNormalization())
model.add(Dense(4096))
model.add(Activation('relu'))
model.add(Dropout(0.4))
model.add(BatchNormalization())
model.add(Dense(1000))
model.add(Activation('relu'))
model.add(Dropout(0.4))
model.add(BatchNormalization())
model.add(Dense(10))
model.add(Activation('softmax'))

model.summary()

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

This is the same code as AlexNet; for comparison, see this reference notebook:

https://colab.research.google.com/drive/111ROij5_im8IjaM2GSB1wHO5HcBCzu_S
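If the model builds and trains (model.summary() will confirm the shapes), an accuracy stuck around 13% after 250 epochs usually points at the training setup rather than the layer stack. Two things worth checking are whether the pixel values are rescaled to [0, 1] before they reach the network, and whether Adam's default learning rate (1e-3) is too high for training a model this deep from scratch. Below is a minimal training sketch, not your exact pipeline: it assumes a recent tf.keras, images arranged one sub-folder per class under a hypothetical data/train directory, and 10 classes to match your final Dense(10) layer (on older standalone Keras, use lr= instead of learning_rate= and fit_generator instead of fit):

from keras.preprocessing.image import ImageDataGenerator
from keras.optimizers import Adam

# Rescale raw 0-255 pixels to [0, 1]; unscaled inputs often keep accuracy near chance level.
train_datagen = ImageDataGenerator(rescale=1./255, validation_split=0.2)

train_gen = train_datagen.flow_from_directory(
    'data/train',               # placeholder path: one sub-folder per Pokemon class
    target_size=(227, 227),     # matches the (227,227,3) input of the first Conv2D
    batch_size=32,
    class_mode='categorical',   # one-hot labels, as required by categorical_crossentropy
    subset='training')

val_gen = train_datagen.flow_from_directory(
    'data/train',
    target_size=(227, 227),
    batch_size=32,
    class_mode='categorical',
    subset='validation')

# A smaller learning rate than Adam's default is often more stable when
# training a network this deep from scratch on a small dataset.
model.compile(loss='categorical_crossentropy',
              optimizer=Adam(learning_rate=1e-4),
              metrics=['accuracy'])

model.fit(train_gen, epochs=50, validation_data=val_gen)

If flow_from_directory reports a different number of classes than 10, change the final Dense layer to match before training.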

I hope this clears your doubt.
If you still have questions or don't find the answer satisfactory, feel free to reopen the doubt.