Introduction
In this project, we will apply the knowledge gained from previous modules to build an image classification model using PyTorch. The project covers the following steps:
- Data Loading and Preprocessing
- Building the Neural Network
- Training the Model
- Evaluating the Model
- Saving and Loading the Model
Step 1: Data Loading and Preprocessing
1.1 Importing Libraries
First, we need to import the necessary libraries.
import torch
import torch.nn as nn
import torch.optim as optim
import torchvision
import torchvision.transforms as transforms
from torch.utils.data import DataLoader
1.2 Data Transformation
We will use the CIFAR-10 dataset for this project. The dataset consists of 60,000 32x32 color images in 10 classes, with 6,000 images per class.
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
])
1.3 Loading the Dataset
We will load the training and test datasets using torchvision.datasets.
trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                        download=True, transform=transform)
trainloader = DataLoader(trainset, batch_size=100, shuffle=True, num_workers=2)

testset = torchvision.datasets.CIFAR10(root='./data', train=False,
                                       download=True, transform=transform)
testloader = DataLoader(testset, batch_size=100, shuffle=False, num_workers=2)
Step 2: Building the Neural Network
2.1 Defining the Network Architecture
We will define a simple Convolutional Neural Network (CNN) for image classification.
class SimpleCNN(nn.Module):
    def __init__(self):
        super(SimpleCNN, self).__init__()
        # Two convolutional blocks: 3 -> 32 -> 64 channels, each followed by 2x2 max pooling
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.conv2 = nn.Conv2d(32, 64, 3, padding=1)
        self.pool = nn.MaxPool2d(2, 2)
        # After two poolings, the 32x32 input is reduced to 8x8 feature maps with 64 channels
        self.fc1 = nn.Linear(64 * 8 * 8, 512)
        self.fc2 = nn.Linear(512, 10)
        self.relu = nn.ReLU()
        self.dropout = nn.Dropout(0.5)

    def forward(self, x):
        x = self.pool(self.relu(self.conv1(x)))
        x = self.pool(self.relu(self.conv2(x)))
        x = x.view(-1, 64 * 8 * 8)  # flatten for the fully connected layers
        x = self.relu(self.fc1(x))
        x = self.dropout(x)
        x = self.fc2(x)
        return x

net = SimpleCNN()
Step 3: Training the Model
3.1 Defining Loss Function and Optimizer
We will use Cross-Entropy Loss and the Adam optimizer.
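The training loop in the next section expects these to be defined as criterion and optimizer. A minimal sketch, using the net created in Step 2; the learning rate of 0.001 is an assumption (it is also Adam's default):

criterion = nn.CrossEntropyLoss()                    # cross-entropy loss for 10-class classification
optimizer = optim.Adam(net.parameters(), lr=0.001)   # Adam optimizer; lr=0.001 assumed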
3.2 Training Loop
We will train the model for 10 epochs.
for epoch in range(10):  # loop over the dataset multiple times
    running_loss = 0.0
    for i, data in enumerate(trainloader, 0):
        inputs, labels = data

        optimizer.zero_grad()
        outputs = net(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()

        running_loss += loss.item()
        if i % 100 == 99:  # print every 100 mini-batches
            print(f'[Epoch {epoch + 1}, Batch {i + 1}] loss: {running_loss / 100:.3f}')
            running_loss = 0.0

print('Finished Training')
Step 4: Evaluating the Model
4.1 Testing the Model
We will evaluate the model on the test dataset.
net.eval()  # switch to evaluation mode so dropout is disabled during testing

correct = 0
total = 0
with torch.no_grad():
    for data in testloader:
        images, labels = data
        outputs = net(images)
        _, predicted = torch.max(outputs.data, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

print(f'Accuracy of the network on the 10000 test images: {100 * correct / total:.2f}%')
Step 5: Saving and Loading the Model
5.1 Saving the Model
We will save the trained model to a file.
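A minimal sketch using torch.save to store the model's learned parameters (its state_dict); the filename cifar10_cnn.pth is only an example:

# Save only the learned parameters rather than the whole model object
torch.save(net.state_dict(), 'cifar10_cnn.pth')  # example filename (assumed)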
5.2 Loading the Model
We can load the model later for inference or further training.
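A minimal sketch of loading those parameters back into a fresh SimpleCNN instance, assuming the same example filename as above:

loaded_net = SimpleCNN()                                    # re-create the architecture
loaded_net.load_state_dict(torch.load('cifar10_cnn.pth'))   # load the saved parameters
loaded_net.eval()                                           # evaluation mode for inference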
Conclusion
In this project, we successfully built, trained, and evaluated a simple CNN for image classification using the CIFAR-10 dataset. We also learned how to save and load the model for future use. This project serves as a practical application of the concepts covered in the previous modules and provides a solid foundation for more complex image classification tasks.
PyTorch: From Beginner to Advanced
Module 1: Introduction to PyTorch
- What is PyTorch?
- Setting Up the Environment
- Basic Tensor Operations
- Autograd: Automatic Differentiation
Module 2: Building Neural Networks
- Introduction to Neural Networks
- Creating a Simple Neural Network
- Activation Functions
- Loss Functions and Optimization
Module 3: Training Neural Networks
Module 4: Convolutional Neural Networks (CNNs)
- Introduction to CNNs
- Building a CNN from Scratch
- Transfer Learning with Pre-trained Models
- Fine-Tuning CNNs
Module 5: Recurrent Neural Networks (RNNs)
- Introduction to RNNs
- Building an RNN from Scratch
- Long Short-Term Memory (LSTM) Networks
- Gated Recurrent Units (GRUs)
Module 6: Advanced Topics
- Generative Adversarial Networks (GANs)
- Reinforcement Learning with PyTorch
- Deploying PyTorch Models
- Optimizing Performance