In [1]: # SL II (ANN) - Assignment 12 (Group C2)
import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.layers import Conv2D, Dense, Flatten, MaxPooling2D
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import Adam
# Load the dataset
(x_train, y_train), (x_test, y_test) = mnist.load_data()
# Preprocess the data
x_train = x_train / 255.0
x_test = x_test / 255.0
# Reshape the input data to have a channel dimension
x_train = x_train.reshape((x_train.shape[0], 28, 28, 1))
x_test = x_test.reshape((x_test.shape[0], 28, 28, 1))
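# Quick shape check (an optional addition, not in the original cell): after reshaping,
# the arrays should be (60000, 28, 28, 1) for training and (10000, 28, 28, 1) for test.
print(x_train.shape, x_test.shape)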
# Define the CNN model
model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax')
])
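# Optional sanity check (not in the original cell): print the layer output shapes
# and parameter counts so the architecture can be verified before compiling.
model.summary()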
# Compile the model
model.compile(
    optimizer=Adam(learning_rate=0.001),
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)
# Train the model
model.fit(x_train, y_train, validation_split=0.1, epochs=10, batch_size=32)
# Evaluate the model
test_loss, test_acc = model.evaluate(x_test, y_test)
print(f"Test loss: {test_loss:.4f}, Test accuracy: {test_acc:.4f}")
Epoch 1/10
1688/1688 [==============================] - 14s 8ms/step - loss: 0.1610 - accuracy: 0.9514 - val_loss: 0.0625 - val_accuracy: 0.9797
Epoch 2/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0556 - accuracy: 0.9826 - val_loss: 0.0409 - val_accuracy: 0.9885
Epoch 3/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0385 - accuracy: 0.9881 - val_loss: 0.0400 - val_accuracy: 0.9885
Epoch 4/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0287 - accuracy: 0.9909 - val_loss: 0.0327 - val_accuracy: 0.9902
Epoch 5/10
1688/1688 [==============================] - 14s 8ms/step - loss: 0.0225 - accuracy: 0.9930 - val_loss: 0.0329 - val_accuracy: 0.9913
Epoch 6/10
1688/1688 [==============================] - 14s 8ms/step - loss: 0.0156 - accuracy: 0.9947 - val_loss: 0.0328 - val_accuracy: 0.9900
Epoch 7/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0132 - accuracy: 0.9957 - val_loss: 0.0399 - val_accuracy: 0.9902
Epoch 8/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0104 - accuracy: 0.9967 - val_loss: 0.0418 - val_accuracy: 0.9892
Epoch 9/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0081 - accuracy: 0.9974 - val_loss: 0.0536 - val_accuracy: 0.9903
Epoch 10/10
1688/1688 [==============================] - 13s 8ms/step - loss: 0.0086 - accuracy: 0.9969 - val_loss: 0.0388 - val_accuracy: 0.9920
313/313 [==============================] - 1s 3ms/step - loss: 0.0424 - accuracy: 0.9904
Test loss: 0.0424, Test accuracy: 0.9904
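As a quick follow-up to the reported test accuracy, the sketch below (not part of the original notebook) spot-checks a few individual predictions and saves the trained network for later reuse. It reuses model, x_test, and y_test from above; the filename mnist_cnn.keras is only an illustrative choice.

import numpy as np

# Predict class probabilities for the first five test images and compare
# the most likely digit against the true labels
probs = model.predict(x_test[:5])      # softmax outputs, shape (5, 10)
preds = np.argmax(probs, axis=1)
print("predicted:", preds, "actual:", y_test[:5])

# Persist the trained model so it can be reloaded later without retraining
model.save("mnist_cnn.keras")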