
Deep Learning with PyTorch

A Review By Thom Ives, Ph.D.

I am writing this during my review of PyTorch, for which I am mostly using Deep Learning with PyTorch while also exploring other educational materials for PyTorch. I highly recommend Deep Learning with PyTorch. It is amazingly well written and well organized. While it is not my only source for reviewing PyTorch, it is my central source.

The quickest way to benefit from PyTorch is to use pretrained models. Predefined models are available in torchvision.models.

from torchvision import models

Let's look at the available models. There are many!

dir(models)
['AlexNet',
'DenseNet',
'EfficientNet',
'GoogLeNet',
'GoogLeNetOutputs',
'Inception3',
'InceptionOutputs',
'MNASNet',
'MobileNetV2',
'MobileNetV3',
'RegNet',
'ResNet',
'ShuffleNetV2',
'SqueezeNet',
'VGG',
'_GoogLeNetOutputs',
'_InceptionOutputs',
'__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__path__',
'__spec__',
'_utils',
'alexnet',
'densenet',
'densenet121',
'densenet161',
'densenet169',
'densenet201',
'detection',
'efficientnet',
'efficientnet_b0',
'efficientnet_b1',
'efficientnet_b2',
'efficientnet_b3',
'efficientnet_b4',
'efficientnet_b5',
'efficientnet_b6',
'efficientnet_b7',
'feature_extraction',
'googlenet',
'inception',
'inception_v3',
'mnasnet',
'mnasnet0_5',
'mnasnet0_75',
'mnasnet1_0',
'mnasnet1_3',
'mobilenet',
'mobilenet_v2',
'mobilenet_v3_large',
'mobilenet_v3_small',
'mobilenetv2',
'mobilenetv3',
'quantization',
'regnet',
'regnet_x_16gf',
'regnet_x_1_6gf',
'regnet_x_32gf',
'regnet_x_3_2gf',
'regnet_x_400mf',
'regnet_x_800mf',
'regnet_x_8gf',
'regnet_y_16gf',
'regnet_y_1_6gf',
'regnet_y_32gf',
'regnet_y_3_2gf',
'regnet_y_400mf',
'regnet_y_800mf',
'regnet_y_8gf',
'resnet',
'resnet101',
'resnet152',
'resnet18',
'resnet34',
'resnet50',
'resnext101_32x8d',
'resnext50_32x4d',
'segmentation',
'shufflenet_v2_x0_5',
'shufflenet_v2_x1_0',
'shufflenet_v2_x1_5',
'shufflenet_v2_x2_0',
'shufflenetv2',
'squeezenet',
'squeezenet1_0',
'squeezenet1_1',
'vgg',
'vgg11',
'vgg11_bn',
'vgg13',
'vgg13_bn',
'vgg16',
'vgg16_bn',
'vgg19',
'vgg19_bn',
'video',
'wide_resnet101_2',
'wide_resnet50_2']

This table helps to explain the above output.

Name Type          Meaning
Capitalized names  Popular model classes
Lowercase names    Functions returning specific, preconfigured instances of those classes

For example, DenseNet is a model class. The lowercase functions densenet121, densenet161, densenet169, and densenet201 each return an instance of it, where the number indicates the depth (number of layers) of that particular DenseNet architecture.
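To make the class-versus-function distinction concrete, here is a quick sketch (growth_rate and num_init_features are just two of the DenseNet constructor's parameters, shown with their default values):

# Class: we configure the architecture ourselves
custom_densenet = models.DenseNet(growth_rate=32, num_init_features=64)

# Functions: each returns a ready-made, specific architecture
dn121 = models.densenet121()  # 121-layer DenseNet, randomly initialized
dn121_pretrained = models.densenet121(pretrained=True)  # same architecture, ImageNet weights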

AlexNet
AlexNet was a barrier breaker for deep learning. It won the 2012 ILSVRC (ImageNet Large Scale Visual Recognition Challenge) by a large margin. The second-place model did not use deep learning, and the win gave the field confidence that deep learning could make a significant contribution to vision tasks. That encouragement led to a succession of breakthroughs.

Running AlexNet
As explained above, if we want to run AlexNet, we need to create an instance of it.
alexnet = models.AlexNet()

Now alexnet is an instance of the AlexNet class and can run the AlexNet architecture. Even though AlexNet is 10 years old at the time of this writing, it has many details; it is a genuine deep learning architecture.

When we feed it image inputs of the correct size (and we could automate the resizing on the front end as needed), it will provide a classification output. How many classes can it choose from? 1,000. Again, small by today's standards (2022), but let's respect it as the architecture that showed that deep learning neural nets were promising. Today we have transformers doing amazing things, and I tend to think that AlexNet helped the data science world get there.

Next we need to initialize our instance or train it. This simply means that we need the weights for the
connections between the neurons in the different layers, OR we need to train alexnet to determine
those weights.
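
To make "weights" concrete, here is a small sketch inspecting the randomly initialized instance we just created:

first_conv = alexnet.features[0]  # the first convolutional layer of AlexNet
print(first_conv.weight.shape)    # torch.Size([64, 3, 11, 11]) - 64 filters over 3 input channels

Those weight tensors are exactly what training, or loading a pretrained checkpoint, fills with useful values.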

For now, we won't train an instance of AlexNet. We will use a pretrained model.

Once we have input images to feed our instance of AlexNet, the line we'd use to predict the class
would be

output_name = alexnet(image)
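
Note that image here must be a preprocessed 4-D tensor of shape (batch, channels, height, width), not a raw image file. A quick sketch with a dummy input, just to show the shapes involved:

import torch

dummy_batch = torch.randn(1, 3, 224, 224)  # one fake 224x224 RGB image
scores = alexnet(dummy_batch)              # raw class scores, shape (1, 1000)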

Uh oh. Creating models.AlexNet() gives us the architecture with randomly initialized weights, not pretrained ones. No worries. We can easily load a pretrained AlexNet. AlexNet is, after all, still famous, and rightfully so. A quick Google search will lead us right where we need to go. I Google searched "pretrained alexnet pytorch".

I found This AlexNet Page on the PyTorch.org Site. It has directions on how to load a pretrained AlexNet model.

import torch
model = torch.hub.load('pytorch/vision:v0.10.0', 'alexnet', pretrained=True)
model.eval()
Downloading: "https://github.com/pytorch/vision/archive/v0.10.0.zip" to /root/.ca
Downloading: "https://download.pytorch.org/models/alexnet-owt-7be5be79.pth" to /r

0%| | 0.00/233M [00:00<?, ?B/s]

AlexNet(
  (features): Sequential(
    (0): Conv2d(3, 64, kernel_size=(11, 11), stride=(4, 4), padding=(2, 2))
    (1): ReLU(inplace=True)
    (2): MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)
    (3): Conv2d(64, 192, kernel_size=(5, 5), stride=(1, 1), padding=(2, 2))
    (4): ReLU(inplace=True)
    (5): MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)
    (6): Conv2d(192, 384, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (7): ReLU(inplace=True)
    (8): Conv2d(384, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (9): ReLU(inplace=True)
    (10): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (11): ReLU(inplace=True)
    (12): MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)
  )
  (avgpool): AdaptiveAvgPool2d(output_size=(6, 6))
  (classifier): Sequential(
    (0): Dropout(p=0.5, inplace=False)
    (1): Linear(in_features=9216, out_features=4096, bias=True)
    (2): ReLU(inplace=True)
    (3): Dropout(p=0.5, inplace=False)
    (4): Linear(in_features=4096, out_features=4096, bias=True)
    (5): ReLU(inplace=True)
    (6): Linear(in_features=4096, out_features=1000, bias=True)
  )
)
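
One quick way to appreciate the size of this model is to count its parameters. A small sketch (AlexNet comes in at roughly 61 million):

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params:,} parameters")  # roughly 61 million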

According to the linked page above, and as we discussed previously, pre-trained models expect input images of a set size, and they also need to be normalized in the same way, i.e., mini-batches of 3-channel RGB images of shape (3 x H x W), where H and W are expected to be at least 224. The images have to be loaded into a range of [0, 1] and then normalized using mean = [0.485, 0.456, 0.406] and std = [0.229, 0.224, 0.225].

If that did not make sense to you, take some time to learn what it means to "normalize" an image, that is, its pixel values. It involves some basic statistical manipulation of your images' pixel values. We are data scientists. Conditioning training data and model input data is what we do! 😎
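
Here is a minimal sketch of the math that normalization performs, done by hand on a fake image (a standard score per channel: subtract the mean, divide by the standard deviation):

import torch

img = torch.rand(3, 224, 224)  # fake RGB image already scaled to [0, 1]
mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)
normalized = (img - mean) / std  # the same per-channel math transforms.Normalize applies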

Let's do something with this now.

# Download an example image from the pytorch website
import urllib.request

url, filename = (
    "https://github.com/pytorch/hub/raw/master/images/dog.jpg",
    "dog.jpg")

# The original hub snippet wrapped this in a try/except with a bare except
# around a legacy URLopener call; in Python 3, urlretrieve alone does the job.
urllib.request.urlretrieve(url, filename)
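
To actually look at the downloaded image in a notebook (assuming a Colab/Jupyter environment), one option is:

from PIL import Image
from IPython.display import display

display(Image.open("dog.jpg"))  # renders the dog photo inline in the notebook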

That was easy! And that is a Samoyed Husky! My wife had one years ago. They have amazingly kind
personalities, AND their coats get VERY thick! If they sleep in the snow outside, good luck finding one.

Well how nice! The torchvision module gives us some nice routines to normalize and condition the
data from our dog image - NICE!
from PIL import Image
from torchvision import transforms

input_image = Image.open(filename)
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(
        mean=[0.485, 0.456, 0.406],
        std=[0.229, 0.224, 0.225]),
])
input_tensor = preprocess(input_image)
input_batch = input_tensor.unsqueeze(0)  # create a mini-batch as expected by the model
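
A quick sanity check on the shape never hurts:

print(input_batch.shape)  # torch.Size([1, 3, 224, 224])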

# move the input and model to GPU for speed if available
if torch.cuda.is_available():
    input_batch = input_batch.to('cuda')
    model.to('cuda')

with torch.no_grad():
    output = model(input_batch)

# Tensor of shape 1000, with confidence scores over ImageNet's 1000 classes
print(output[0])

# The output has unnormalized scores. To get probabilities, run a softmax on it.
probabilities = torch.nn.functional.softmax(output[0], dim=0)
print(probabilities)
tensor([-1.6531e+00, -4.3505e+00, -1.8172e+00, -4.2143e+00, -3.1914e+00,
         3.4163e-01,  1.0877e+00,  5.9350e+00,  8.0425e+00, -7.0243e-01,
        ...
         2.8983e+00, -2.8059e+00, -3.2593e+00,  8.9075e-01,  2.3235e+00])
tensor([6.8397e-09, 4.6082e-10, 5.8043e-09, 5.2806e-10, 1.4687e-09, 5.0271e-08,
        1.0601e-07, 1.3505e-05, 1.1112e-04, 1.7697e-08, 1.3936e-08, 1.9445e-08,
        ...
        2.1595e-09, 1.3723e-09, 8.7056e-08, 3.6479e-07])

That generated some visually large tensors. How do we make use of this output? We need the class names from ImageNet.

NOTE: Even though I am doing this from a Google Colab notebook, we need to do a terminal operation. When we need to go out to the terminal of this Linux virtual machine that Google is kindly sharing with us, we prefix each command line operation with the ! symbol. That tells the notebook cell that the line is a terminal operation, not Python. It is cleanest not to mix terminal commands and Python code in the same cell.

# Download ImageNet labels
!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt
--2022-04-07 14:03:06-- https://raw.githubusercontent.com/pytorch/hub/master/ima
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.108.13
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.1
HTTP request sent, awaiting response... 200 OK
Length: 10472 (10K) [text/plain]
Saving to: ‘imagenet_classes.txt’

imagenet_classes.tx 100%[===================>] 10.23K --.-KB/s in 0s

2022-04-07 14:03:06 (74.0 MB/s) - ‘imagenet_classes.txt’ saved [10472/10472]

Nice! This code, which we got from The PyTorch AlexNet Page, put the file imagenet_classes.txt in our Colab files directory. Now we can use data from that file together with our PyTorch AlexNet outputs to determine whether our alexnet instance predicted a dog.

# Read the categories
with open("imagenet_classes.txt", "r") as f:
    categories = [s.strip() for s in f.readlines()]

# Show top categories per image
top5_prob, top5_catid = torch.topk(probabilities, 5)
for i in range(top5_prob.size(0)):
    print(categories[top5_catid[i]], top5_prob[i].item())

Samoyed 0.7244770526885986
wallaby 0.13937804102897644
Pomeranian 0.05874986574053764
Angora 0.022829849272966385
Arctic fox 0.012450155802071095
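
If all we want is the single best guess, a one-liner does it (same idea as topk, just argmax):

print(categories[torch.argmax(probabilities).item()])  # Samoyed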

Oh nice! Are you encouraged? I hope so. Wait! That's clearly a Samoyed Husky. Why isn't the confidence 1.0? Well, that's what we get with classification. The softmax we ran at the end spreads probability mass across all 1,000 classes, so no single class ever receives exactly 1.0. But consider this. The Samoyed probability is almost 0.6 greater than the next most likely prediction, which is a hilarious guess. I think this is very good, personally.
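
Here is a tiny numerical sketch of that point, using made-up scores:

import torch

scores = torch.tensor([8.0, 2.0, 1.0, 0.5])
print(torch.softmax(scores, dim=0))
# tensor([0.9961, 0.0025, 0.0009, 0.0006]) - the winner dominates but is never exactly 1.0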

Summary
I hope this helped you to get a good start with PyTorch. Why am I reviewing PyTorch? For many
reasons. I'll share more about why along this journey. Until then, I hope you can continue to follow this
Git Repo on DagsHub while I grow these PyTorch exercises.

Until next time.

Thom
