Brain Decoding Lab#

In this notebook, we will investigate the brain decoding stage as a mini problem. Brain decoding interprets the information encoded in brain activity to produce various outputs. These outputs can come from classification tasks, such as human action recognition, or from generative tasks, such as decoding brain images and recovering the input stimuli.
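As a toy illustration of the classification flavor of decoding, the sketch below trains a nearest-centroid decoder on synthetic "parcel activity" (random data, not the datasets used in this lab; the two-class setup and all names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic parcel-level activity for two stimulus classes:
# 40 trials x 20 parcels, with a small class-dependent mean shift.
n_trials, n_parcels = 40, 20
labels = np.repeat([0, 1], n_trials // 2)
signal = np.where(labels[:, None] == 1, 0.8, -0.8)
X = signal + rng.normal(size=(n_trials, n_parcels))

# Nearest-centroid decoder: assign each trial to the closer class mean.
centroids = np.stack([X[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(
    ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1), axis=1
)
accuracy = (pred == labels).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

Real decoding pipelines use cross-validated classifiers on held-out trials, but the principle is the same: predict the stimulus class from a vector of brain features.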

Before we begin, we need to install and import some essential libraries.

%%capture
! pip install nilearn
! pip install h5py
! pip install nibabel
! pip install torch_geometric
! pip install pandas==1.3.5
# Import libs
import numpy as np
import nibabel as nib
from nilearn import plotting
from nilearn.image import new_img_like
import h5py
import nilearn
from nilearn.input_data import NiftiLabelsMasker
import pandas as pd
import pickle
from collections import Counter
import warnings
warnings.filterwarnings(action='once')
import nilearn.connectome

Friends Dataset#

The Friends dataset is a collection of brain activity recordings from people watching the Friends TV series. It is hosted by the Institut universitaire de gériatrie de Montréal (IUGM). To visualize it, we use from nilearn import plotting to plot the anatomical brain image and the parcellation.

from nilearn import plotting


Brain = '/content/drive/MyDrive/BrainHack_2024_2/sub-03/sub-03_desc-preproc_T1w.nii.gz'
plotting.plot_img(Brain, title='Brain Image')

parcellation3 = '/content/drive/MyDrive/BrainHack_2024_2/sub-03/sub-03_task-friends_space-T1w_atlas-Ward_desc-400_dseg.nii.gz'
plotting.plot_img(parcellation3, title='Friends Dataset Parcellation')

Things Dataset#

The Things dataset is a collection of fMRI brain activity recorded for different object categories. Here, we map the \(\beta\) scores from each voxel onto the 3D brain volume and then compute the average within each parcel. The first image depicts the \(\beta\) scores mapped onto the brain voxels, while the second illustrates the average \(\beta\) values across brain parcels. It's important to note that the \(\beta\) scores are the coefficients of a linear model fitted to each voxel's fMRI signal.
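That last point can be made concrete with a tiny least-squares fit. The synthetic design matrix and voxel signal below are illustrative only; the actual betas in this dataset come from a GLMdenoise-style pipeline, but the underlying idea is the same:

```python
import numpy as np

rng = np.random.default_rng(0)

# Design matrix: 100 time points x 3 stimulus regressors (e.g. convolved
# event onsets), plus an intercept column.
n_t, n_reg = 100, 3
X = np.column_stack([rng.normal(size=(n_t, n_reg)), np.ones(n_t)])

# Simulate one voxel's fMRI time series with known true betas.
true_betas = np.array([1.5, -0.7, 0.3, 2.0])
y = X @ true_betas + 0.1 * rng.normal(size=n_t)

# The beta scores are the coefficients of the linear model fitted per voxel.
betas, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated betas:", np.round(betas, 2))
```

Repeating this fit independently for every voxel yields one beta per regressor per voxel, which is what gets unmasked into a 3D beta map below.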

import numpy as np
import nibabel as nib
import nilearn
from nilearn import plotting
from nilearn.image import new_img_like
import h5py
from nilearn.input_data import NiftiLabelsMasker
import pandas as pd
atlas_filename = '/content/drive/MyDrive/BrainHack_2024_2/sub-03/sub-03_task-friends_space-T1w_atlas-Ward_desc-400_dseg.nii.gz'
atlas_img = nib.load(atlas_filename)
atlas_data = atlas_img.get_fdata()

mask_file = '/content/drive/MyDrive/BrainHack_2024_2/sub-03/sub-03_task-things_space-T1w_label-brain_desc-unionNonNaN_mask.nii'
mask = nilearn.image.load_img(mask_file)
#print(mask)

file_path = '/content/drive/MyDrive/BrainHack_2024_2/sub-03/sub-03_task-things_space-T1w_model-fitHrfGLMdenoiseRR_stats-imageBetas_desc-zscore_statseries.h5'
# Open the HDF5 file
data = []
beta_scores = []
label_name = ''
with h5py.File(file_path, 'r') as f:
    print("Keys in the HDF5 file:", list(f.keys()))
    print(len(list(f.keys())))
    label_name = 'acorn_01b'
    #session_type = 'ses-001_task-s01e02a_timeseries'
    latent_type = 'betas'
    if label_name in f:
        data = f[label_name]
        print(f"Data in {label_name}:")
        print(data.keys())
        print(f"Data in {latent_type}:")
        print(data[latent_type])
        beta_scores = data[latent_type][:]
    else:
        print(f"Dataset {label_name} not found in the file.")


print('beta_scores shape', beta_scores.shape)
# Compare the masked beta vector length (one value per voxel) with the 3D atlas shape
print('ATLAS shape', atlas_data.shape)

unmasked_beta = nilearn.masking.unmask(beta_scores, mask, order='F')

# Create a new Nifti image using the beta map
beta_map_img = unmasked_beta
# Plot the image
plotting.plot_img(beta_map_img, title='Beta Scores Mapped to Brain Atlas \n label:'+ label_name,cmap='magma')
plotting.show()

labels_masker = NiftiLabelsMasker(labels_img=atlas_img)
region_signals = labels_masker.fit_transform(beta_map_img)
print("Shape of region signals: ", region_signals.shape)
average_signals = np.mean(region_signals, axis=0)

# Load the atlas data to get region labels
atlas_data = atlas_img.get_fdata()

unique_regions = np.unique(atlas_data)[1:]

region_signal_df = pd.DataFrame({
    'Region': unique_regions,
    'Average Signal': average_signals
})

print(region_signal_df)

average_signal_map = np.zeros(atlas_data.shape)

for i, region in enumerate(unique_regions):
    average_signal_map[atlas_data == region] = average_signals[i]

average_signal_img = nib.Nifti1Image(average_signal_map, atlas_img.affine, atlas_img.header)

# Plot the average signal map
plotting.plot_img(average_signal_img, title='Average Signal Map by Region', black_bg=True, cmap='magma')
plotting.show()
Keys in the HDF5 file: ['acorn_01b', 'acorn_02n', 'acorn_03s', 'acorn_04s', 'acorn_05s', 'acorn_06s', 'airbag_01b', 'airbag_02s', 'airbag_03s', 'airbag_04s', 'airbag_05s', 'airbag_06s', 'aircraft_carrier_01b', 'aircraft_carrier_02s', ...]
(output truncated: one key per stimulus, named as a concept plus an exemplar suffix)
'joystick_06s', 'jukebox_01n', 'jukebox_02s', 'jukebox_03n', 'jukebox_04s', 'jukebox_05s', 'jukebox_06s', 'kale_01b', 'kale_02s', 'kale_03s', 'kale_04s', 'kale_05s', 'kale_06s', 'kangaroo_01b', 'kangaroo_02s', 'kangaroo_03n', 'kangaroo_04s', 'kangaroo_05s', 'kangaroo_06n', 'kazoo_01s', 'kazoo_02s', 'kazoo_03s', 'kazoo_04s', 'kazoo_05s', 'kazoo_06s', 'kebab_01b', 'kebab_02n', 'kebab_03n', 'kebab_04n', 'kebab_05n', 'kebab_06n', 'kettle_01b', 'kettle_02s', 'kettle_03n', 'kettle_04n', 'kettle_05s', 'kettle_06s', 'key_01b', 'key_02s', 'key_03s', 'key_04s', 'key_05s', 'key_06s', 'kimono_01b', 'kimono_02s', 'kimono_03s', 'kimono_04s', 'kimono_05s', 'kimono_06s', 'kite_01b', 'kite_02s', 'kite_03s', 'kite_04s', 'kite_05s', 'kite_06s', 'knife_01b', 'knife_02s', 'knife_03s', 'knife_04s', 'knife_05s', 'knife_06s', 'knot_01b', 'knot_02s', 'knot_03s', 'knot_04s', 'knot_05n', 'knot_06s', 'koala_01b', 'koala_02s', 'koala_03s', 'koala_04s', 'koala_05s', 'koala_06s', 'ladder_01b', 'ladder_02s', 'ladder_03s', 'ladder_04s', 'ladder_05s', 'ladder_06s', 'ladle_01b', 'ladle_02s', 'ladle_03s', 'ladle_04s', 'ladle_05s', 'ladle_06s', 'lamp_01b', 'lamp_02s', 'lamp_03s', 'lamp_04s', 'lamp_05s', 'lamp_06s', 'lampshade_01b', 'lampshade_02s', 'lampshade_03s', 'lampshade_04s', 'lampshade_05s', 'lampshade_06s', 'lantern_01b', 'lantern_02s', 'lantern_03s', 'lantern_04s', 'lantern_05s', 'lantern_06s', 'lasagna_01b', 'lasagna_02s', 'lasagna_03s', 'lasagna_04s', 'lasagna_05s', 'lasagna_06s', 'lawnmower_01b', 'lawnmower_02s', 'lawnmower_03s', 'lawnmower_04n', 'lawnmower_05n', 'lawnmower_06s', 'lectern_01b', 'lectern_02s', 'lectern_03s', 'lectern_04s', 'lectern_05s', 'lectern_06s', 'leek_01b', 'leek_02s', 'leek_03s', 'leek_04s', 'leek_05s', 'leek_06s', 'leggings_01b', 'leggings_02s', 'leggings_03s', 'leggings_04s', 'leggings_05s', 'leggings_06s', 'lego_01b', 'lego_02s', 'lego_03s', 'lego_04s', 'lego_05s', 'lego_06s', 'lemon_01b', 'lemon_02s', 'lemon_03s', 'lemon_04s', 'lemon_05s', 'lemon_06s', 
'lemonade_01b', 'lemonade_02s', 'lemonade_03s', 'lemonade_04s', 'lemonade_05s', 'lemonade_06s', 'leopard_01b', 'leopard_02s', 'leopard_03s', 'leopard_04s', 'leopard_05s', 'leopard_06s', 'lettuce_01b', 'lettuce_02n', 'lettuce_03n', 'lettuce_04n', 'lettuce_05n', 'lettuce_06n', 'life_jacket_01b', 'life_jacket_02s', 'life_jacket_03s', 'life_jacket_04s', 'life_jacket_05s', 'life_jacket_06s', 'lightbulb_01b', 'lightbulb_02s', 'lightbulb_03s', 'lightbulb_04s', 'lightbulb_05s', 'lightbulb_06s', 'lime_01b', 'lime_02s', 'lime_03n', 'lime_04s', 'lime_05s', 'lime_06s', 'limousine_01s', 'limousine_02s', 'limousine_03s', 'limousine_04s', 'limousine_05s', 'limousine_06s', 'lingerie_01b', 'lingerie_02s', 'lingerie_03s', 'lingerie_04s', 'lingerie_05s', 'lingerie_06s', 'lion_01b', 'lion_02s', 'lion_03s', 'lion_04s', 'lion_05s', 'lion_06s', 'lip_balm_01b', 'lip_balm_02s', 'lip_balm_03s', 'lip_balm_04s', 'lip_balm_05s', 'lip_balm_06s', 'lizard_01b', 'lizard_02n', 'lizard_03n', 'lizard_04n', 'lizard_05n', 'lizard_06n', 'llama_01b', 'llama_02s', 'llama_03s', 'llama_04n', 'llama_05s', 'llama_06s', 'lobster_01b', 'lobster_02n', 'lobster_03s', 'lobster_04s', 'lobster_05s', 'lobster_06s', 'lock_01b', 'lock_02s', 'lock_03s', 'lock_04s', 'lock_05s', 'lock_06s', 'locker_01b', 'locker_02s', 'locker_03s', 'locker_04s', 'locker_05s', 'locker_06s', 'lollipop_01b', 'lollipop_02s', 'lollipop_03s', 'lollipop_04s', 'lollipop_05s', 'lollipop_06n', 'macadamia_01b', 'macadamia_02s', 'macadamia_03s', 'macadamia_04s', 'macadamia_05s', 'macadamia_06n', 'magnifying_glass_01b', 'magnifying_glass_02s', 'magnifying_glass_03s', 'magnifying_glass_04s', 'magnifying_glass_05s', 'magnifying_glass_06s', 'mailbox_01b', 'mailbox_02s', 'mailbox_03s', 'mailbox_04s', 'mailbox_05s', 'mailbox_06s', 'man_01b', 'man_02s', 'man_03s', 'man_04s', 'man_05s', 'man_06s', 'manatee_01b', 'manatee_02s', 'manatee_03s', 'manatee_04s', 'manatee_05s', 'manatee_06s', 'mandolin_01s', 'mandolin_02s', 'mandolin_03n', 'mandolin_04s', 
'mandolin_05s', 'mandolin_06s', 'mango_01b', 'mango_02s', 'mango_03s', 'mango_04s', 'mango_05s', 'mango_06s', 'manhole_01b', 'manhole_02n', 'manhole_03s', 'manhole_04n', 'manhole_05n', 'manhole_06s', 'map_01s', 'map_02s', 'map_03s', 'map_04s', 'map_05s', 'map_06s', 'marijuana_01b', 'marijuana_02s', 'marijuana_03s', 'marijuana_04s', 'marijuana_05s', 'marijuana_06n', 'marshmallow_01b', 'marshmallow_02s', 'marshmallow_03s', 'marshmallow_04s', 'marshmallow_05s', 'marshmallow_06s', 'mascara_01b', 'mascara_02s', 'mascara_03s', 'mascara_04s', 'mascara_05s', 'mascara_06s', 'mask_01b', 'mask_02s', 'mask_03s', 'mask_04s', 'mask_05s', 'mask_06s', 'mask_affine', 'mask_array', 'mast_01b', 'mast_02s', 'mast_03s', 'mast_04s', 'mast_05s', 'mast_06s', 'match_01b', 'match_02s', 'match_03s', 'match_04s', 'match_05s', 'match_06s', 'meat_01b', 'meat_02s', 'meat_03s', 'meat_04s', 'meat_05s', 'meat_06s', 'meat_grinder_01b', 'meat_grinder_02s', 'meat_grinder_03s', 'meat_grinder_04s', 'meat_grinder_05s', 'meat_grinder_06s', 'meatball_01b', 'meatball_02s', 'meatball_03s', 'meatball_04s', 'meatball_05s', 'meatball_06s', 'melon_01b', 'melon_02s', 'melon_03s', 'melon_04s', 'melon_05s', 'melon_06s', 'microphone_01b', 'microphone_02n', 'microphone_03s', 'microphone_04s', 'microphone_05s', 'microphone_06s', 'microscope_01b', 'microscope_02s', 'microscope_03s', 'microscope_04s', 'microscope_05s', 'microscope_06s', 'milk_01b', 'milk_02s', 'milk_03s', 'milk_04s', 'milk_05s', 'milk_06s', 'milkshake_01b', 'milkshake_02s', 'milkshake_03s', 'milkshake_04s', 'milkshake_05s', 'milkshake_06s', 'mirror_01b', 'mirror_02s', 'mirror_03s', 'mirror_04s', 'mirror_05s', 'mirror_06s', 'missile_01b', 'missile_02s', 'missile_03s', 'missile_04s', 'missile_05s', 'missile_06s', 'mistletoe_01b', 'mistletoe_02s', 'mistletoe_03s', 'mistletoe_04s', 'mistletoe_05n', 'mistletoe_06s', 'mitten_01b', 'mitten_02s', 'mitten_03s', 'mitten_04s', 'mitten_05s', 'mitten_06s', 'monkey_01b', 'monkey_02s', 'monkey_03s', 'monkey_04s', 
'monkey_05s', 'monkey_06s', 'moose_01b', 'moose_02s', 'moose_03s', 'moose_04s', 'moose_05s', 'moose_06s', 'mop_01b', 'mop_02s', 'mop_03s', 'mop_04s', 'mop_05s', 'mop_06s', 'mosquito_01b', 'mosquito_02s', 'mosquito_03s', 'mosquito_04s', 'mosquito_05s', 'mosquito_06s', 'mosquito_net_01b', 'mosquito_net_02s', 'mosquito_net_03s', 'mosquito_net_04s', 'mosquito_net_05s', 'mosquito_net_06s', 'moth_01b', 'moth_02n', 'moth_03n', 'moth_04n', 'moth_05n', 'moth_06n', 'motorcycle_01b', 'motorcycle_02s', 'motorcycle_03s', 'motorcycle_04s', 'motorcycle_05s', 'motorcycle_06s', 'mouse1_01b', 'mouse1_02s', 'mouse1_03s', 'mouse1_04s', 'mouse1_05s', 'mouse1_06s', 'mousetrap_01b', 'mousetrap_02s', 'mousetrap_03s', 'mousetrap_04n', 'mousetrap_05s', 'mousetrap_06n', 'mousse_01b', 'mousse_02s', 'mousse_03s', 'mousse_04s', 'mousse_05s', 'mousse_06s', 'muffin_01b', 'muffin_02n', 'muffin_03n', 'muffin_04n', 'muffin_05n', 'muffin_06n', 'mushroom_01b', 'mushroom_02n', 'mushroom_03n', 'mushroom_04n', 'mushroom_05n', 'mushroom_06n', 'nacho_01b', 'nacho_02s', 'nacho_03s', 'nacho_04s', 'nacho_05s', 'nacho_06n', 'nail_polish_01b', 'nail_polish_02s', 'nail_polish_03s', 'nail_polish_04s', 'nail_polish_05s', 'nail_polish_06s', 'napkin_ring_01b', 'napkin_ring_02s', 'napkin_ring_03s', 'napkin_ring_04s', 'napkin_ring_05s', 'napkin_ring_06s', 'necklace_01b', 'necklace_02s', 'necklace_03s', 'necklace_04n', 'necklace_05s', 'necklace_06s', 'needle_01s', 'needle_02s', 'needle_03s', 'needle_04s', 'needle_05n', 'needle_06s', 'nest_01b', 'nest_02s', 'nest_03s', 'nest_04s', 'nest_05s', 'nest_06s', 'net_01b', 'net_02s', 'net_03s', 'net_04s', 'net_05s', 'net_06s', 'noodle_01s', 'noodle_02s', 'noodle_03s', 'noodle_04s', 'noodle_05s', 'noodle_06s', 'nose_01b', 'nose_02s', 'nose_03s', 'nose_04s', 'nose_05s', 'nose_06s', 'oar_01b', 'oar_02s', 'oar_03s', 'oar_04s', 'oar_05s', 'oar_06s', 'oatmeal_01s', 'oatmeal_02s', 'oatmeal_03s', 'oatmeal_04s', 'oatmeal_05s', 'oatmeal_06s', 'octopus_01b', 'octopus_02s', 'octopus_03s', 
'octopus_04s', 'octopus_05s', 'octopus_06s', 'odometer_01b', 'odometer_02s', 'odometer_03s', 'odometer_04s', 'odometer_05s', 'odometer_06s', 'okra_01b', 'okra_02n', 'okra_03s', 'okra_04s', 'okra_05s', 'okra_06s', 'omelet_01b', 'omelet_02s', 'omelet_03s', 'omelet_04n', 'omelet_05s', 'omelet_06s', 'onion_01b', 'onion_02n', 'onion_03n', 'onion_04s', 'onion_05s', 'onion_06s', 'otter_01b', 'otter_02s', 'otter_03s', 'otter_04n', 'otter_05s', 'otter_06n', 'overalls_01s', 'overalls_02s', 'overalls_03s', 'overalls_04s', 'overalls_05s', 'overalls_06s', 'oyster_01b', 'oyster_02n', 'oyster_03n', 'oyster_04s', 'oyster_05s', 'oyster_06s', 'pacifier_01s', 'pacifier_02s', 'pacifier_03s', 'pacifier_04s', 'pacifier_05s', 'pacifier_06s', 'padlock_01b', 'padlock_02n', 'padlock_03s', 'padlock_04n', 'padlock_05s', 'padlock_06s', 'paint_01b', 'paint_02s', 'paint_03s', 'paint_04s', 'paint_05s', 'paint_06s', 'paintbrush_01b', 'paintbrush_02s', 'paintbrush_03s', 'paintbrush_04s', 'paintbrush_05s', 'paintbrush_06s', 'pan_01b', 'pan_02s', 'pan_03s', 'pan_04s', 'pan_05s', 'pan_06s', 'pancake_01b', 'pancake_02n', 'pancake_03s', 'pancake_04s', 'pancake_05s', 'pancake_06s', 'panda_01b', 'panda_02s', 'panda_03s', 'panda_04s', 'panda_05s', 'panda_06s', 'pantyhose_01b', 'pantyhose_02s', 'pantyhose_03s', 'pantyhose_04s', 'pantyhose_05s', 'pantyhose_06s', 'papaya_01b', 'papaya_02s', 'papaya_03s', 'papaya_04s', 'papaya_05s', 'papaya_06n', 'paper_01b', 'paper_02s', 'paper_03s', 'paper_04s', 'paper_05s', 'paper_06s', 'paperclip_01s', 'paperclip_02s', 'paperclip_03s', 'paperclip_04s', 'paperclip_05s', 'paperclip_06s', 'parachute_01b', 'parachute_02s', 'parachute_03s', 'parachute_04s', 'parachute_05s', 'parachute_06s', 'parsley_01b', 'parsley_02n', 'parsley_03s', 'parsley_04s', 'parsley_05s', 'parsley_06s', 'pastry_01b', 'pastry_02s', 'pastry_03n', 'pastry_04s', 'pastry_05s', 'pastry_06s', 'pea_01s', 'pea_02s', 'pea_03s', 'pea_04s', 'pea_05s', 'pea_06s', 'peach_01b', 'peach_02n', 'peach_03s', 'peach_04s', 
'peach_05s', 'peach_06s', 'peanut_01b', 'peanut_02n', 'peanut_03s', 'peanut_04s', 'peanut_05s', 'peanut_06n', 'pear_01b', 'pear_02s', 'pear_03s', 'pear_04s', 'pear_05s', 'pear_06s', 'pecan_01b', 'pecan_02s', 'pecan_03s', 'pecan_04s', 'pecan_05s', 'pecan_06s', 'peeler_01b', 'peeler_02s', 'peeler_03s', 'peeler_04s', 'peeler_05s', 'peeler_06s', 'pencil_01s', 'pencil_02s', 'pencil_03s', 'pencil_04s', 'pencil_05n', 'pencil_06n', 'pencil_sharpener_01b', 'pencil_sharpener_02s', 'pencil_sharpener_03s', 'pencil_sharpener_04s', 'pencil_sharpener_05s', 'pencil_sharpener_06s', 'penguin_01b', 'penguin_02s', 'penguin_03s', 'penguin_04s', 'penguin_05s', 'penguin_06s', 'pepper1_01b', 'pepper1_02s', 'pepper1_03s', 'pepper1_04s', 'pepper1_05s', 'pepper1_06s', 'petal_01b', 'petal_02s', 'petal_03s', 'petal_04s', 'petal_05s', 'petal_06s', 'phone_01b', 'phone_02s', 'phone_03s', 'phone_04s', 'phone_05s', 'phone_06s', 'piano_01b', 'piano_02n', 'piano_03n', 'piano_04n', 'piano_05n', 'piano_06n', 'pickle_01b', 'pickle_02s', 'pickle_03s', 'pickle_04s', 'pickle_05s', 'pickle_06n', 'pie_01b', 'pie_02s', 'pie_03s', 'pie_04s', 'pie_05s', 'pie_06s', 'pig_01b', 'pig_02s', 'pig_03s', 'pig_04s', 'pig_05s', 'pig_06s', 'pigeon_01b', 'pigeon_02s', 'pigeon_03s', 'pigeon_04s', 'pigeon_05s', 'pigeon_06s', 'pill_01b', 'pill_02s', 'pill_03s', 'pill_04s', 'pill_05s', 'pill_06s', 'pillow_01b', 'pillow_02s', 'pillow_03s', 'pillow_04s', 'pillow_05s', 'pillow_06s', 'pineapple_01b', 'pineapple_02n', 'pineapple_03n', 'pineapple_04s', 'pineapple_05s', 'pineapple_06s', 'ping-pong_table_01b', 'ping-pong_table_02s', 'ping-pong_table_03s', 'ping-pong_table_04s', 'ping-pong_table_05s', 'ping-pong_table_06s', 'pinwheel_01b', 'pinwheel_02s', 'pinwheel_03s', 'pinwheel_04s', 'pinwheel_05s', 'pinwheel_06s', 'pistachio_01b', 'pistachio_02s', 'pistachio_03s', 'pistachio_04s', 'pistachio_05s', 'pistachio_06n', 'pizza_01b', 'pizza_02s', 'pizza_03s', 'pizza_04s', 'pizza_05s', 'pizza_06s', 'plastic_film_01b', 'plastic_film_02s', 
'plastic_film_03s', 'plastic_film_04s', 'plastic_film_05s', 'plastic_film_06s', 'plate_01b', 'plate_02s', 'plate_03s', 'plate_04s', 'plate_05s', 'plate_06s', 'platypus_01b', 'platypus_02s', 'platypus_03s', 'platypus_04s', 'platypus_05s', 'platypus_06s', 'playing_card_01b', 'playing_card_02s', 'playing_card_03s', 'playing_card_04s', 'playing_card_05s', 'playing_card_06s', 'pliers_01s', 'pliers_02s', 'pliers_03s', 'pliers_04s', 'pliers_05s', 'pliers_06s', 'plum_01b', 'plum_02s', 'plum_03s', 'plum_04s', 'plum_05n', 'plum_06s', 'pocket_01b', 'pocket_02s', 'pocket_03s', 'pocket_04s', 'pocket_05s', 'pocket_06s', 'pocketknife_01b', 'pocketknife_02s', 'pocketknife_03s', 'pocketknife_04s', 'pocketknife_05s', 'pocketknife_06s', 'pogo_stick_01s', 'pogo_stick_02s', 'pogo_stick_03s', 'pogo_stick_04s', 'pogo_stick_05s', 'pogo_stick_06s', 'polaroid_01b', 'polaroid_02s', 'polaroid_03s', 'polaroid_04s', 'polaroid_05s', 'polaroid_06s', 'pom-pom_01b', 'pom-pom_02s', 'pom-pom_03s', 'pom-pom_04s', 'pom-pom_05s', 'pom-pom_06s', 'pomegranate_01b', 'pomegranate_02s', 'pomegranate_03s', 'pomegranate_04s', 'pomegranate_05s', 'pomegranate_06s', 'popcorn_01b', 'popcorn_02n', 'popcorn_03s', 'popcorn_04s', 'popcorn_05s', 'popcorn_06s', 'popsicle_01b', 'popsicle_02s', 'popsicle_03s', 'popsicle_04s', 'popsicle_05s', 'popsicle_06s', 'porcupine_01b', 'porcupine_02s', 'porcupine_03s', 'porcupine_04s', 'porcupine_05s', 'porcupine_06s', 'possum_01b', 'possum_02s', 'possum_03s', 'possum_04s', 'possum_05s', 'possum_06s', 'poster_01s', 'poster_02s', 'poster_03s', 'poster_04s', 'poster_05s', 'poster_06s', 'pot_01b', 'pot_02s', 'pot_03s', 'pot_04s', 'pot_05s', 'pot_06s', 'potato_01b', 'potato_02n', 'potato_03n', 'potato_04s', 'potato_05n', 'potato_06n', 'pothole_01b', 'pothole_02s', 'pothole_03n', 'pothole_04n', 'pothole_05s', 'pothole_06s', 'powder_01b', 'powder_02s', 'powder_03s', 'powder_04s', 'powder_05s', 'powder_06s', 'praying_mantis_01b', 'praying_mantis_02s', 'praying_mantis_03s', 
'praying_mantis_04s', 'praying_mantis_05s', 'praying_mantis_06s', 'pretzel_01b', 'pretzel_02s', 'pretzel_03s', 'pretzel_04s', 'pretzel_05s', 'pretzel_06s', 'prism_01b', 'prism_02s', 'prism_03s', 'prism_04s', 'prism_05s', 'prism_06s', 'projector_01b', 'projector_02s', 'projector_03s', 'projector_04s', 'projector_05s', 'projector_06s', 'puck_01b', 'puck_02s', 'puck_03s', 'puck_04s', 'puck_05s', 'puck_06s', 'pudding_01b', 'pudding_02s', 'pudding_03s', 'pudding_04s', 'pudding_05s', 'pudding_06s', 'pumpkin_01b', 'pumpkin_02s', 'pumpkin_03s', 'pumpkin_04s', 'pumpkin_05s', 'pumpkin_06s', 'punch1_01s', 'punch1_02s', 'punch1_03s', 'punch1_04s', 'punch1_05s', 'punch1_06s', 'punching_bag_01b', 'punching_bag_02s', 'punching_bag_03s', 'punching_bag_04s', 'punching_bag_05s', 'punching_bag_06s', 'quill_01b', 'quill_02s', 'quill_03s', 'quill_04s', 'quill_05s', 'quill_06s', 'rabbit_01b', 'rabbit_02s', 'rabbit_03s', 'rabbit_04n', 'rabbit_05s', 'rabbit_06s', 'raccoon_01b', 'raccoon_02s', 'raccoon_03s', 'raccoon_04n', 'raccoon_05s', 'raccoon_06n', 'racket_01b', 'racket_02s', 'racket_03s', 'racket_04s', 'racket_05s', 'racket_06s', 'radiator_01b', 'radiator_02n', 'radiator_03s', 'radiator_04s', 'radiator_05s', 'radiator_06s', 'radish_01b', 'radish_02s', 'radish_03n', 'radish_04n', 'radish_05s', 'radish_06n', 'raft_01b', 'raft_02s', 'raft_03s', 'raft_04s', 'raft_05s', 'raft_06s', 'railing_01b', 'railing_02s', 'railing_03s', 'railing_04s', 'railing_05s', 'railing_06s', 'raspberry_01b', 'raspberry_02n', 'raspberry_03n', 'raspberry_04s', 'raspberry_05s', 'raspberry_06s', 'rat_01b', 'rat_02s', 'rat_03s', 'rat_04s', 'rat_05s', 'rat_06s', 'rattle_01b', 'rattle_02s', 'rattle_03s', 'rattle_04s', 'rattle_05s', 'rattle_06s', 'razor_01b', 'razor_02s', 'razor_03s', 'razor_04n', 'razor_05s', 'razor_06s', 'ready_meal_01b', 'ready_meal_02s', 'ready_meal_03s', 'ready_meal_04s', 'ready_meal_05s', 'ready_meal_06s', 'record_01b', 'record_02s', 'record_03s', 'record_04s', 'record_05s', 'record_06s', 
'refrigerator_01b', 'refrigerator_02s', 'refrigerator_03s', 'refrigerator_04s', 'refrigerator_05s', 'refrigerator_06s', 'retainer_01b', 'retainer_02s', 'retainer_03s', 'retainer_04s', 'retainer_05s', 'retainer_06s', 'revolver_01s', 'revolver_02s', 'revolver_03s', 'revolver_04s', 'revolver_05s', 'revolver_06s', 'rhinoceros_01b', 'rhinoceros_02s', 'rhinoceros_03s', 'rhinoceros_04s', 'rhinoceros_05s', 'rhinoceros_06s', 'rhubarb_01b', 'rhubarb_02n', 'rhubarb_03s', 'rhubarb_04s', 'rhubarb_05s', 'rhubarb_06s', 'ribbon_01b', 'ribbon_02s', 'ribbon_03s', 'ribbon_04s', 'ribbon_05s', 'ribbon_06s', 'rice_01b', 'rice_02s', 'rice_03s', 'rice_04s', 'rice_05s', 'rice_06s', 'rifle_01s', 'rifle_02n', 'rifle_03n', 'rifle_04n', 'rifle_05s', 'rifle_06s', 'ring_01b', 'ring_02s', 'ring_03s', 'ring_04s', 'ring_05s', 'ring_06s', 'road_sign_01b', 'road_sign_02s', 'road_sign_03s', 'road_sign_04s', 'road_sign_05s', 'road_sign_06s', 'robot_01b', 'robot_02s', 'robot_03n', 'robot_04s', 'robot_05s', 'robot_06s', 'rocking_chair_01b', 'rocking_chair_02s', 'rocking_chair_03s', 'rocking_chair_04s', 'rocking_chair_05s', 'rocking_chair_06s', 'roll_01b', 'roll_02s', 'roll_03s', 'roll_04s', 'roll_05s', 'roll_06s', 'rollerblade_01b', 'rollerblade_02s', 'rollerblade_03s', 'rollerblade_04s', 'rollerblade_05s', 'rollerblade_06s', 'roof_rack_01b', 'roof_rack_02s', 'roof_rack_03s', 'roof_rack_04s', 'roof_rack_05s', 'roof_rack_06s', 'rug_01b', 'rug_02s', 'rug_03s', 'rug_04s', 'rug_05s', 'rug_06s', 'ruler_01b', 'ruler_02s', 'ruler_03s', 'ruler_04s', 'ruler_05s', 'ruler_06s', 'saddle_01s', 'saddle_02s', 'saddle_03s', 'saddle_04s', 'saddle_05s', 'saddle_06s', 'saltshaker_01b', 'saltshaker_02s', 'saltshaker_03s', 'saltshaker_04s', 'saltshaker_05s', 'saltshaker_06s', 'sandpaper_01b', 'sandpaper_02s', 'sandpaper_03s', 'sandpaper_04s', 'sandpaper_05s', 'sandpaper_06s', 'satellite_dish_01b', 'satellite_dish_02s', 'satellite_dish_03s', 'satellite_dish_04s', 'satellite_dish_05s', 'satellite_dish_06s', 'scaffolding_01b', 
'scaffolding_02s', 'scaffolding_03n', 'scaffolding_04s', 'scaffolding_05s', 'scaffolding_06s', 'scallion_01b', 'scallion_02s', 'scallion_03s', 'scallion_04s', 'scallion_05s', 'scallion_06s', 'scanner_01b', 'scanner_02s', 'scanner_03s', 'scanner_04s', 'scanner_05s', 'scanner_06s', 'scarecrow_01b', 'scarecrow_02s', 'scarecrow_03s', 'scarecrow_04s', 'scarecrow_05s', 'scarecrow_06s', 'scissors_01s', 'scissors_02s', 'scissors_03s', 'scissors_04s', 'scissors_05s', 'scissors_06s', 'scone_01b', 'scone_02s', 'scone_03s', 'scone_04s', 'scone_05s', 'scone_06s', 'scoreboard_01b', 'scoreboard_02s', 'scoreboard_03s', 'scoreboard_04s', 'scoreboard_05s', 'scoreboard_06s', 'scorpion_01b', 'scorpion_02s', 'scorpion_03n', 'scorpion_04s', 'scorpion_05s', 'scorpion_06s', 'screwdriver_01s', 'screwdriver_02s', 'screwdriver_03s', 'screwdriver_04s', 'screwdriver_05s', 'screwdriver_06s', 'sea_urchin_01b', 'sea_urchin_02s', 'sea_urchin_03s', 'sea_urchin_04s', 'sea_urchin_05s', 'sea_urchin_06s', 'seagull_01b', 'seagull_02s', 'seagull_03s', 'seagull_04s', 'seagull_05s', 'seagull_06s', 'seesaw_01b', 'seesaw_02s', 'seesaw_03s', 'seesaw_04s', 'seesaw_05s', 'seesaw_06s', 'sewing_kit_01b', 'sewing_kit_02s', 'sewing_kit_03s', 'sewing_kit_04s', 'sewing_kit_05s', 'sewing_kit_06s', 'shaving_cream_01b', 'shaving_cream_02s', 'shaving_cream_03s', 'shaving_cream_04s', 'shaving_cream_05s', 'shaving_cream_06s', 'sheep_01b', 'sheep_02s', 'sheep_03s', 'sheep_04n', 'sheep_05s', 'sheep_06s', 'shell2_01b', 'shell2_02s', 'shell2_03s', 'shell2_04s', 'shell2_05s', 'shell2_06s', 'shield_01b', 'shield_02s', 'shield_03n', 'shield_04s', 'shield_05s', 'shield_06s', 'ship_01b', 'ship_02s', 'ship_03s', 'ship_04s', 'ship_05s', 'ship_06s', 'shirt_01s', 'shirt_02s', 'shirt_03s', 'shirt_04s', 'shirt_05s', 'shirt_06s', 'shoe_01s', 'shoe_02s', 'shoe_03s', 'shoe_04s', 'shoe_05s', 'shoe_06s', 'shovel_01b', 'shovel_02n', 'shovel_03n', 'shovel_04n', 'shovel_05s', 'shovel_06s', 'shower_01b', 'shower_02s', 'shower_03s', 'shower_04s', 
'shower_05n', 'shower_06s', 'shower_cap_01b', 'shower_cap_02s', 'shower_cap_03s', 'shower_cap_04s', 'shower_cap_05s', 'shower_cap_06s', 'shredder_01b', 'shredder_02s', 'shredder_03s', 'shredder_04s', 'shredder_05s', 'shredder_06s', 'shrimp_01b', 'shrimp_02s', 'shrimp_03s', 'shrimp_04s', 'shrimp_05s', 'shrimp_06n', 'sickle_01b', 'sickle_02s', 'sickle_03s', 'sickle_04s', 'sickle_05s', 'sickle_06s', 'silverware_01b', 'silverware_02s', 'silverware_03s', 'silverware_04s', 'silverware_05s', 'silverware_06s', 'sim_card_01b', 'sim_card_02s', 'sim_card_03s', 'sim_card_04s', 'sim_card_05s', 'sim_card_06s', 'sink_01b', 'sink_02s', 'sink_03s', 'sink_04s', 'sink_05s', 'sink_06s', 'ski_pole_01b', 'ski_pole_02s', 'ski_pole_03s', 'ski_pole_04s', 'ski_pole_05s', 'ski_pole_06s', 'skunk_01b', 'skunk_02s', 'skunk_03s', 'skunk_04s', 'skunk_05s', 'skunk_06s', 'sleeping_bag_01b', 'sleeping_bag_02s', 'sleeping_bag_03s', 'sleeping_bag_04s', 'sleeping_bag_05s', 'sleeping_bag_06s', 'slingshot_01b', 'slingshot_02s', 'slingshot_03s', 'slingshot_04s', 'slingshot_05s', 'slingshot_06s', 'sloth_01b', 'sloth_02s', 'sloth_03s', 'sloth_04s', 'sloth_05s', 'sloth_06s', 'smoke_alarm_01b', 'smoke_alarm_02s', 'smoke_alarm_03s', 'smoke_alarm_04s', 'smoke_alarm_05s', 'smoke_alarm_06s', 'snail_01b', 'snail_02s', 'snail_03n', 'snail_04s', 'snail_05s', 'snail_06s', 'snake_01b', 'snake_02s', 'snake_03s', 'snake_04s', 'snake_05s', 'snake_06s', 'snorkel_01b', 'snorkel_02s', 'snorkel_03s', 'snorkel_04s', 'snorkel_05s', 'snorkel_06s', 'snowboard_01b', 'snowboard_02s', 'snowboard_03s', 'snowboard_04s', 'snowboard_05s', 'snowboard_06s', 'snowplow_01b', 'snowplow_02s', 'snowplow_03s', 'snowplow_04s', 'snowplow_05n', 'snowplow_06n', 'sock_01b', 'sock_02s', 'sock_03s', 'sock_04s', 'sock_05s', 'sock_06s', 'soda_fountain_01s', 'soda_fountain_02s', 'soda_fountain_03s', 'soda_fountain_04s', 'soda_fountain_05s', 'soda_fountain_06s', 'solar_panel_01b', 'solar_panel_02s', 'solar_panel_03s', 'solar_panel_04s', 
'solar_panel_05s', 'solar_panel_06s', 'sombrero_01b', 'sombrero_02s', 'sombrero_03s', 'sombrero_04s', 'sombrero_05s', 'sombrero_06s', 'sonogram_01b', 'sonogram_02s', 'sonogram_03s', 'sonogram_04s', 'sonogram_05s', 'sonogram_06s', 'soy_sauce_01s', 'soy_sauce_02s', 'soy_sauce_03s', 'soy_sauce_04s', 'soy_sauce_05s', 'soy_sauce_06s', 'space_shuttle_01b', 'space_shuttle_02s', 'space_shuttle_03s', 'space_shuttle_04s', 'space_shuttle_05s', 'space_shuttle_06s', 'spareribs_01b', 'spareribs_02s', 'spareribs_03s', 'spareribs_04s', 'spareribs_05s', 'spareribs_06s', 'spatula_01b', 'spatula_02s', 'spatula_03s', 'spatula_04s', 'spatula_05s', 'spatula_06s', 'speaker_01s', 'speaker_02s', 'speaker_03s', 'speaker_04s', 'speaker_05s', 'speaker_06s', 'spider_01b', 'spider_02s', 'spider_03s', 'spider_04s', 'spider_05s', 'spider_06s', 'spinach_01b', 'spinach_02s', 'spinach_03s', 'spinach_04s', 'spinach_05s', 'spinach_06s', 'spoon_01b', 'spoon_02s', 'spoon_03s', 'spoon_04s', 'spoon_05n', 'spoon_06s', 'spout_01b', 'spout_02s', 'spout_03s', 'spout_04s', 'spout_05s', 'spout_06s', 'sprinkler_01b', 'sprinkler_02n', 'sprinkler_03s', 'sprinkler_04s', 'sprinkler_05s', 'sprinkler_06s', 'squeegee_01b', 'squeegee_02s', 'squeegee_03s', 'squeegee_04s', 'squeegee_05s', 'squeegee_06s', 'squirrel_01b', 'squirrel_02s', 'squirrel_03s', 'squirrel_04s', 'squirrel_05n', 'squirrel_06s', 'stair_01b', 'stair_02s', 'stair_03s', 'stair_04s', 'stair_05s', 'stair_06s', 'stalagmite_01b', 'stalagmite_02s', 'stalagmite_03s', 'stalagmite_04s', 'stalagmite_05s', 'stalagmite_06s', 'stamp2_01b', 'stamp2_02s', 'stamp2_03s', 'stamp2_04s', 'stamp2_05s', 'stamp2_06s', 'stapler_01b', 'stapler_02s', 'stapler_03s', 'stapler_04s', 'stapler_05s', 'stapler_06s', 'starfish_01b', 'starfish_02s', 'starfish_03s', 'starfish_04s', 'starfish_05s', 'starfish_06s', 'steak_01b', 'steak_02s', 'steak_03s', 'steak_04s', 'steak_05s', 'steak_06s', 'steamroller_01b', 'steamroller_02n', 'steamroller_03n', 'steamroller_04n', 'steamroller_05n', 
'steamroller_06s', 'stethoscope_01b', 'stethoscope_02s', 'stethoscope_03s', 'stethoscope_04s', 'stethoscope_05s', 'stethoscope_06n', 'stiletto_01b', 'stiletto_02s', 'stiletto_03s', 'stiletto_04s', 'stiletto_05s', 'stiletto_06s', 'stopwatch_01b', 'stopwatch_02s', 'stopwatch_03s', 'stopwatch_04s', 'stopwatch_05s', 'stopwatch_06s', 'stove1_01b', 'stove1_02s', 'stove1_03s', 'stove1_04s', 'stove1_05s', 'stove1_06s', 'strainer_01b', 'strainer_02s', 'strainer_03s', 'strainer_04s', 'strainer_05s', 'strainer_06s', 'strawberry_01b', 'strawberry_02n', 'strawberry_03n', 'strawberry_04n', 'strawberry_05n', 'strawberry_06n', 'streetlight_01b', 'streetlight_02s', 'streetlight_03s', 'streetlight_04s', 'streetlight_05s', 'streetlight_06n', 'submarine_01b', 'submarine_02s', 'submarine_03s', 'submarine_04s', 'submarine_05s', 'submarine_06n', 'subway_01b', 'subway_02s', 'subway_03s', 'subway_04s', 'subway_05s', 'subway_06s', 'suit_01b', 'suit_02s', 'suit_03s', 'suit_04s', 'suit_05s', 'suit_06s', 'sundae_01b', 'sundae_02s', 'sundae_03s', 'sundae_04s', 'sundae_05s', 'sundae_06s', 'sundial_01b', 'sundial_02s', 'sundial_03s', 'sundial_04s', 'sundial_05s', 'sundial_06s', 'sunroof_01b', 'sunroof_02s', 'sunroof_03s', 'sunroof_04s', 'sunroof_05s', 'sunroof_06s', 'sushi_01b', 'sushi_02s', 'sushi_03s', 'sushi_04s', 'sushi_05s', 'sushi_06s', 'suspenders_01b', 'suspenders_02s', 'suspenders_03s', 'suspenders_04s', 'suspenders_05s', 'suspenders_06s', 'sweatsuit_01s', 'sweatsuit_02s', 'sweatsuit_03s', 'sweatsuit_04s', 'sweatsuit_05s', 'sweatsuit_06s', 'sword_01b', 'sword_02s', 'sword_03s', 'sword_04s', 'sword_05s', 'sword_06s', 'syringe_01b', 'syringe_02s', 'syringe_03s', 'syringe_04s', 'syringe_05s', 'syringe_06s', 'syrup_01s', 'syrup_02s', 'syrup_03s', 'syrup_04s', 'syrup_05s', 'syrup_06s', 't-shirt_01b', 't-shirt_02s', 't-shirt_03s', 't-shirt_04s', 't-shirt_05s', 't-shirt_06s', 'tab_01b', 'tab_02s', 'tab_03s', 'tab_04s', 'tab_05s', 'tab_06s', 'table_01b', 'table_02s', 'table_03s', 'table_04s', 
'table_05s', 'table_06s', 'tablecloth_01b', 'tablecloth_02s', 'tablecloth_03s', 'tablecloth_04s', 'tablecloth_05s', 'tablecloth_06s', 'taco_01b', 'taco_02s', 'taco_03s', 'taco_04s', 'taco_05s', 'taco_06n', 'tadpole_01b', 'tadpole_02s', 'tadpole_03s', 'tadpole_04s', 'tadpole_05s', 'tadpole_06s', 'tag_01b', 'tag_02s', 'tag_03s', 'tag_04s', 'tag_05s', 'tag_06s', 'taillight_01b', 'taillight_02s', 'taillight_03s', 'taillight_04s', 'taillight_05s', 'taillight_06s', 'tamale_01b', 'tamale_02s', 'tamale_03s', 'tamale_04n', 'tamale_05s', 'tamale_06s', 'tarp_01b', 'tarp_02s', 'tarp_03s', 'tarp_04s', 'tarp_05s', 'tarp_06s', 'tattoo_01b', 'tattoo_02s', 'tattoo_03s', 'tattoo_04s', 'tattoo_05s', 'tattoo_06s', 'teapot_01b', 'teapot_02s', 'teapot_03s', 'teapot_04s', 'teapot_05s', 'teapot_06s', 'telescope_01b', 'telescope_02s', 'telescope_03n', 'telescope_04s', 'telescope_05n', 'telescope_06s', 'television_01b', 'television_02n', 'television_03s', 'television_04s', 'television_05s', 'television_06s', 'tent_01b', 'tent_02s', 'tent_03s', 'tent_04s', 'tent_05n', 'tent_06s', 'thermometer_01b', 'thermometer_02s', 'thermometer_03s', 'thermometer_04s', 'thermometer_05s', 'thermometer_06s', 'thermos_01b', 'thermos_02s', 'thermos_03s', 'thermos_04s', 'thermos_05s', 'thermos_06s', 'thermostat_01b', 'thermostat_02s', 'thermostat_03s', 'thermostat_04s', 'thermostat_05s', 'thermostat_06s', 'thimble_01b', 'thimble_02s', 'thimble_03n', 'thimble_04n', 'thimble_05n', 'thimble_06n', 'thumbtack_01b', 'thumbtack_02s', 'thumbtack_03s', 'thumbtack_04s', 'thumbtack_05s', 'thumbtack_06s', 'tiara_01b', 'tiara_02s', 'tiara_03s', 'tiara_04s', 'tiara_05s', 'tiara_06s', 'tie_01b', 'tie_02s', 'tie_03s', 'tie_04s', 'tie_05s', 'tie_06s', 'tiger_01b', 'tiger_02s', 'tiger_03s', 'tiger_04s', 'tiger_05s', 'tiger_06s', 'tiramisu_01b', 'tiramisu_02s', 'tiramisu_03s', 'tiramisu_04s', 'tiramisu_05s', 'tiramisu_06s', 'toast_01b', 'toast_02s', 'toast_03s', 'toast_04s', 'toast_05s', 'toast_06s', 'toaster_01b', 'toaster_02s', 
'toaster_03s', 'toaster_04s', 'toaster_05s', 'toaster_06s', 'toilet_01b', 'toilet_02s', 'toilet_03s', 'toilet_04s', 'toilet_05s', 'toilet_06s', 'toilet_paper_01b', 'toilet_paper_02s', 'toilet_paper_03s', 'toilet_paper_04s', 'toilet_paper_05s', 'toilet_paper_06s', 'tomato_01b', 'tomato_02s', 'tomato_03s', 'tomato_04s', 'tomato_05s', 'tomato_06s', 'tongs_01s', 'tongs_02s', 'tongs_03s', 'tongs_04s', 'tongs_05s', 'tongs_06s', 'tongue_01s', 'tongue_02s', 'tongue_03s', 'tongue_04s', 'tongue_05s', 'tongue_06s', 'toolbox_01b', 'toolbox_02s', 'toolbox_03s', 'toolbox_04s', 'toolbox_05s', 'toolbox_06s', 'toothpick_01b', 'toothpick_02s', 'toothpick_03s', 'toothpick_04s', 'toothpick_05s', 'toothpick_06s', 'tortilla_01b', 'tortilla_02s', 'tortilla_03s', 'tortilla_04s', 'tortilla_05s', 'tortilla_06s', 'toucan_01b', 'toucan_02s', 'toucan_03s', 'toucan_04s', 'toucan_05s', 'toucan_06s', 'touchpad_01b', 'touchpad_02s', 'touchpad_03s', 'touchpad_04s', 'touchpad_05s', 'touchpad_06s', 'towel_01b', 'towel_02s', 'towel_03s', 'towel_04s', 'towel_05s', 'towel_06s', 'towel_rack_01b', 'towel_rack_02s', 'towel_rack_03s', 'towel_rack_04s', 'towel_rack_05s', 'towel_rack_06s', 'tractor_01b', 'tractor_02n', 'tractor_03s', 'tractor_04s', 'tractor_05n', 'tractor_06s', 'train_01b', 'train_02s', 'train_03s', 'train_04s', 'train_05s', 'train_06s', 'trap_01b', 'trap_02s', 'trap_03s', 'trap_04s', 'trap_05s', 'trap_06s', 'trashcan_01b', 'trashcan_02s', 'trashcan_03s', 'trashcan_04s', 'trashcan_05s', 'trashcan_06s', 'tray_01s', 'tray_02s', 'tray_03s', 'tray_04s', 'tray_05s', 'tray_06s', 'treadmill_01b', 'treadmill_02s', 'treadmill_03s', 'treadmill_04s', 'treadmill_05s', 'treadmill_06s', 'tree_01b', 'tree_02s', 'tree_03s', 'tree_04s', 'tree_05s', 'tree_06s', 'tree_trunk_01b', 'tree_trunk_02s', 'tree_trunk_03s', 'tree_trunk_04s', 'tree_trunk_05s', 'tree_trunk_06s', 'trowel_01s', 'trowel_02s', 'trowel_03s', 'trowel_04s', 'trowel_05s', 'trowel_06s', 'truck_01b', 'truck_02s', 'truck_03s', 'truck_04n', 
'truck_05s', 'truck_06s', 'trumpet_01s', 'trumpet_02s', 'trumpet_03s', 'trumpet_04s', 'trumpet_05s', 'trumpet_06s', 'tumbleweed_01b', 'tumbleweed_02n', 'tumbleweed_03s', 'tumbleweed_04s', 'tumbleweed_05s', 'tumbleweed_06s', 'turtle_01b', 'turtle_02s', 'turtle_03s', 'turtle_04s', 'turtle_05n', 'turtle_06s', 'tuxedo_01b', 'tuxedo_02s', 'tuxedo_03s', 'tuxedo_04s', 'tuxedo_05s', 'tuxedo_06s', 'tweezers_01b', 'tweezers_02s', 'tweezers_03s', 'tweezers_04s', 'tweezers_05s', 'tweezers_06s', 'typewriter_01b', 'typewriter_02s', 'typewriter_03s', 'typewriter_04s', 'typewriter_05s', 'typewriter_06s', 'ukulele_01b', 'ukulele_02s', 'ukulele_03s', 'ukulele_04s', 'ukulele_05s', 'ukulele_06s', 'umbrella_01b', 'umbrella_02s', 'umbrella_03s', 'umbrella_04s', 'umbrella_05s', 'umbrella_06s', 'undershirt_01s', 'undershirt_02s', 'undershirt_03s', 'undershirt_04s', 'undershirt_05s', 'undershirt_06s', 'uniform_01b', 'uniform_02s', 'uniform_03s', 'uniform_04s', 'uniform_05s', 'uniform_06s', 'urinal_01b', 'urinal_02s', 'urinal_03s', 'urinal_04s', 'urinal_05s', 'urinal_06s', 'valve_01b', 'valve_02s', 'valve_03s', 'valve_04s', 'valve_05s', 'valve_06s', 'vase_01b', 'vase_02s', 'vase_03s', 'vase_04s', 'vase_05s', 'vase_06s', 'velcro_01b', 'velcro_02s', 'velcro_03s', 'velcro_04s', 'velcro_05s', 'velcro_06s', 'videocassette_01b', 'videocassette_02n', 'videocassette_03s', 'videocassette_04n', 'videocassette_05n', 'videocassette_06s', 'viewfinder_01s', 'viewfinder_02s', 'viewfinder_03s', 'viewfinder_04s', 'viewfinder_05s', 'viewfinder_06s', 'violin_01b', 'violin_02s', 'violin_03n', 'violin_04s', 'violin_05s', 'violin_06s', 'waffle_iron_01b', 'waffle_iron_02s', 'waffle_iron_03s', 'waffle_iron_04s', 'waffle_iron_05s', 'waffle_iron_06s', 'walker1_01b', 'walker1_02s', 'walker1_03s', 'walker1_04s', 'walker1_05s', 'walker1_06s', 'wall_01b', 'wall_02s', 'wall_03s', 'wall_04s', 'wall_05s', 'wall_06s', 'wallet_01b', 'wallet_02s', 'wallet_03s', 'wallet_04s', 'wallet_05s', 'wallet_06s', 'wallpaper_01b', 
'wallpaper_02s', 'wallpaper_03s', 'wallpaper_04s', 'wallpaper_05s', 'wallpaper_06s', 'walnut_01b', 'walnut_02s', 'walnut_03n', 'walnut_04n', 'walnut_05s', 'walnut_06n', 'walrus_01b', 'walrus_02n', 'walrus_03s', 'walrus_04s', 'walrus_05s', 'walrus_06s', 'wand_01s', 'wand_02s', 'wand_03s', 'wand_04s', 'wand_05n', 'wand_06s', 'warthog_01b', 'warthog_02s', 'warthog_03s', 'warthog_04n', 'warthog_05s', 'warthog_06s', 'washboard_01b', 'washboard_02s', 'washboard_03s', 'washboard_04s', 'washboard_05s', 'washboard_06s', 'wasp_01b', 'wasp_02s', 'wasp_03s', 'wasp_04s', 'wasp_05s', 'wasp_06s', 'watch_01b', 'watch_02s', 'watch_03s', 'watch_04s', 'watch_05s', 'watch_06s', 'water_fountain_01b', 'water_fountain_02s', 'water_fountain_03s', 'water_fountain_04s', 'water_fountain_05s', 'water_fountain_06s', 'weasel_01b', 'weasel_02s', 'weasel_03n', 'weasel_04s', 'weasel_05n', 'weasel_06s', 'whale_01b', 'whale_02s', 'whale_03s', 'whale_04s', 'whale_05s', 'whale_06s', 'wheat_01b', 'wheat_02s', 'wheat_03n', 'wheat_04s', 'wheat_05n', 'wheat_06s', 'wheelbarrow_01b', 'wheelbarrow_02s', 'wheelbarrow_03s', 'wheelbarrow_04s', 'wheelbarrow_05s', 'wheelbarrow_06s', 'wheelchair_01b', 'wheelchair_02s', 'wheelchair_03n', 'wheelchair_04n', 'wheelchair_05s', 'wheelchair_06s', 'whip_01b', 'whip_02s', 'whip_03s', 'whip_04s', 'whip_05s', 'whip_06s', 'whisk_01b', 'whisk_02s', 'whisk_03s', 'whisk_04s', 'whisk_05s', 'whisk_06s', 'wig_01s', 'wig_02s', 'wig_03s', 'wig_04s', 'wig_05s', 'wig_06s', 'windowsill_01b', 'windowsill_02s', 'windowsill_03s', 'windowsill_04s', 'windowsill_05s', 'windowsill_06s', 'wine_01b', 'wine_02s', 'wine_03s', 'wine_04s', 'wine_05s', 'wine_06s', 'wire_01s', 'wire_02s', 'wire_03s', 'wire_04s', 'wire_05s', 'wire_06s', 'wire_cutters_01b', 'wire_cutters_02s', 'wire_cutters_03s', 'wire_cutters_04s', 'wire_cutters_05s', 'wire_cutters_06s', 'wolf_01b', 'wolf_02s', 'wolf_03s', 'wolf_04s', 'wolf_05n', 'wolf_06n', 'woman_01b', 'woman_02s', 'woman_03s', 'woman_04s', 'woman_05s', 'woman_06s', 
'wooden_leg_01b', 'wooden_leg_02s', 'wooden_leg_03s', 'wooden_leg_04s', 'wooden_leg_05s', 'wooden_leg_06s', 'worm_01b', 'worm_02s', 'worm_03s', 'worm_04n', 'worm_05n', 'worm_06s', 'wrapping_paper_01b', 'wrapping_paper_02s', 'wrapping_paper_03s', 'wrapping_paper_04s', 'wrapping_paper_05s', 'wrapping_paper_06s', 'wrench_01b', 'wrench_02s', 'wrench_03s', 'wrench_04s', 'wrench_05s', 'wrench_06s', 'yo-yo_01s', 'yo-yo_02s', 'yo-yo_03s', 'yo-yo_04s', 'yo-yo_05s', 'yo-yo_06s', 'zebra_01b', 'zebra_02s', 'zebra_03s', 'zebra_04s', 'zebra_05s', 'zebra_06s', 'zucchini_01b', 'zucchini_02s', 'zucchini_03s', 'zucchini_04s', 'zucchini_05s', 'zucchini_06n']
4322
Data in acorn_01b:
<KeysViewHDF5 ['betas', 'blanks', 'num_reps']>
Data in betas:
<HDF5 dataset "betas": shape (206359,), type "<f4">
beta_scores shape (206359,)
ATLAS shape (71, 91, 71)
_images/244c3f142d700f5a42a61240d72395b07c8d0d5c5d6f84ee9c5f7b03d5c59218.png
Shape of region signals:  (1, 400)
     Region  Average Signal
0       1.0       -0.112586
1       2.0       -0.004418
2       3.0        0.045564
3       4.0       -0.077944
4       5.0       -0.290445
..      ...             ...
395   396.0       -0.250721
396   397.0       -0.225610
397   398.0       -0.357176
398   399.0       -0.359945
399   400.0       -0.221157

[400 rows x 2 columns]
_images/a8bc3d66c04df3bf9c4be522785a94f3f7174b78f5fd927ac402f4ba4b3e1a45.png

Friends Dataset Connectome#

Here, we plot the connectome of the Friends dataset to obtain the connectivity map between different brain parcels. The function nilearn.connectome.ConnectivityMeasure computes the connectome, and the reorder=True argument in plotting.plot_matrix reorders the matrix so that correlated brain regions appear as clusters.

import h5py
import warnings
warnings.filterwarnings(action='once')
import numpy as np
import nilearn.connectome

file_path = '/content/drive/MyDrive/BrainHack_2024_2/sub-03/sub-03_task-friends_space-T1w_atlas-Ward_desc-400_timeseries.h5'
# Open the HDF5 file and stack all runs of one session into X
X = np.zeros((0, 400))  # start empty so no zero rows pad the time series
with h5py.File(file_path, 'r') as f:
    # Print all keys in the file
    print("Keys in the HDF5 file:", list(f.keys()))
    print(len(list(f.keys())))
    # Access a specific session
    dataset_name = 'ses-030'
    if dataset_name in f:
        data = f[dataset_name]
        for ss_type in data.keys():
            print(f"Data in {dataset_name}:")
            print(data.keys())
            print(f"Data in {ss_type}:")
            print(data[ss_type])
            X = np.concatenate((X, np.array(data[ss_type])), axis=0)
    else:
        print(f"Dataset {dataset_name} not found in the file.")

# Estimate the connectome and save it for PyTorch to load
print('session SHAPE', X.shape)
corr_measure = nilearn.connectome.ConnectivityMeasure(kind="correlation")
conn = corr_measure.fit_transform([X])[0]

n_regions_extracted = X.shape[-1]
title = 'Correlation between %d regions' % n_regions_extracted

print('Correlation matrix shape:',conn.shape)

# First plot the matrix
from nilearn import plotting
display = plotting.plot_matrix(conn, vmax=1, vmin=-1,
                               colorbar=True, title=title,reorder=True, labels= np.arange(1,401))
Keys in the HDF5 file: ['ses-001', 'ses-002', 'ses-003', 'ses-004', 'ses-005', 'ses-006', 'ses-007', 'ses-008', 'ses-009', 'ses-010', 'ses-011', 'ses-012', 'ses-013', 'ses-014', 'ses-015', 'ses-016', 'ses-017', 'ses-018', 'ses-019', 'ses-020', 'ses-021', 'ses-022', 'ses-023', 'ses-024', 'ses-025', 'ses-026', 'ses-027', 'ses-028', 'ses-029', 'ses-030', 'ses-031', 'ses-032', 'ses-033', 'ses-034', 'ses-035', 'ses-036', 'ses-037', 'ses-038', 'ses-039', 'ses-040', 'ses-041', 'ses-042', 'ses-043', 'ses-044', 'ses-045', 'ses-046', 'ses-047', 'ses-048', 'ses-049', 'ses-050', 'ses-052', 'ses-053', 'ses-054', 'ses-055', 'ses-056', 'ses-057', 'ses-058', 'ses-059', 'ses-060', 'ses-061', 'ses-062', 'ses-063', 'ses-065', 'ses-066', 'ses-067', 'ses-070', 'ses-071', 'ses-072', 'ses-074', 'ses-075', 'ses-076', 'ses-077', 'ses-078', 'ses-079', 'ses-080', 'ses-081', 'ses-082', 'ses-083']
78
Data in ses-030:
<KeysViewHDF5 ['ses-030_task-s03e06a_timeseries', 'ses-030_task-s03e06b_timeseries', 'ses-030_task-s03e07a_timeseries', 'ses-030_task-s03e07b_timeseries']>
Data in ses-030_task-s03e06a_timeseries:
<HDF5 dataset "ses-030_task-s03e06a_timeseries": shape (470, 400), type "<f4">
Data in ses-030:
<KeysViewHDF5 ['ses-030_task-s03e06a_timeseries', 'ses-030_task-s03e06b_timeseries', 'ses-030_task-s03e07a_timeseries', 'ses-030_task-s03e07b_timeseries']>
Data in ses-030_task-s03e06b_timeseries:
<HDF5 dataset "ses-030_task-s03e06b_timeseries": shape (470, 400), type "<f4">
Data in ses-030:
<KeysViewHDF5 ['ses-030_task-s03e06a_timeseries', 'ses-030_task-s03e06b_timeseries', 'ses-030_task-s03e07a_timeseries', 'ses-030_task-s03e07b_timeseries']>
Data in ses-030_task-s03e07a_timeseries:
<HDF5 dataset "ses-030_task-s03e07a_timeseries": shape (467, 400), type "<f4">
Data in ses-030:
<KeysViewHDF5 ['ses-030_task-s03e06a_timeseries', 'ses-030_task-s03e06b_timeseries', 'ses-030_task-s03e07a_timeseries', 'ses-030_task-s03e07b_timeseries']>
Data in ses-030_task-s03e07b_timeseries:
<HDF5 dataset "ses-030_task-s03e07b_timeseries": shape (467, 400), type "<f4">
session SHAPE (2327, 400)
Correlation matrix shape: (400, 400)
/usr/local/lib/python3.10/dist-packages/nilearn/connectome/connectivity_matrices.py:507: DeprecationWarning: The default strategy for standardize is currently 'zscore' which incorrectly uses population std to calculate sample zscores. The new strategy 'zscore_sample' corrects this behavior by using the sample std. In release 0.13, the default strategy will be replaced by the new strategy and the 'zscore' option will be removed. Please use 'zscore_sample' instead.
  covariances_std = [
_images/e028605bea1ce7a1a24a8a92ce91c6db8e4258b4a0f364962509c17891ce9cca.png

Cutting Percentile#

As we saw in the previous cell, the connectome is a dense graph (its adjacency matrix is almost fully populated). Therefore, to represent our network, we keep only the top 10th percentile of its weight distribution. Below, we first plot the distribution of the connectome weights, and then plot the thresholded connectome in the next image. As you can see, the resulting graph is sparse and retains the most informative clusters in different regions.

import numpy
import matplotlib.pyplot as plt

percentile = numpy.percentile(conn.ravel(), q=90)
print(percentile)
print(np.sum(conn > 0))
plt.hist(conn.ravel())
plt.title('Distribution of the Friends Dataset Connectome')
conn[conn < percentile] = 0
display = plotting.plot_matrix(conn, vmax=1, vmin=-1,
                               colorbar=True, title='Top 10th percentile of the weights',
                               reorder=True, labels=np.arange(1, 401))
0.21622188185501867
82910
_images/94bc183d15be2fb17aae55e9dc445913c3f91574a39dfc5836801dc9eebf0445.png _images/5ad49a060ff3b2fc242cab2ec8ada404e8ba4949f6c665b9ba347a3d2d1bc35f.png
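The thresholding step above can be checked on a toy symmetric matrix (a small sketch with made-up data, not the actual connectome):

```python
import numpy as np

# Toy symmetric "connectome": 6 x 6 with zero diagonal
rng = np.random.default_rng(0)
a = rng.normal(size=(6, 6))
conn_toy = (a + a.T) / 2           # symmetrize
np.fill_diagonal(conn_toy, 0)

# Keep only weights at or above the 90th percentile, as in the cell above
thr = np.percentile(conn_toy.ravel(), q=90)
sparse_toy = np.where(conn_toy >= thr, conn_toy, 0.0)

# Only roughly 10% of the entries survive, and the matrix stays symmetric
print(np.count_nonzero(sparse_toy), "of", conn_toy.size, "weights kept")
```

Because the threshold is applied elementwise to a symmetric matrix, the surviving edges still come in symmetric pairs.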

Graph Construction#

After cleaning up the connectome, we can build our graph by calling make_group_graph. Please note that the implementation in the cell below is inspired by here; however, it has been slightly modified to a pure adjacency-matrix reconstruction for the purposes of this notebook.

import sys
sys.path.append('../src')
sys.path.append('/content/drive/MyDrive/BrainHack_2024_2/Code')
#from graph_construction import make_group_graph


import numpy as np
import torch
import torch_geometric as tg

def _make_undirected(mat):
    """
    Takes an input adjacency matrix and makes it undirected (symmetric).

    Parameters
    ----------
    mat: array
        Square adjacency matrix.
    """
    if mat.shape[0] != mat.shape[1]:
        raise ValueError("Adjacency matrix must be square.")

    sym = (mat + mat.transpose()) / 2
    if len(np.unique(mat)) == 2:  # if the graph was unweighted, re-binarize
        return np.ceil(sym)
    return sym  # otherwise return the averaged weights

def graph_quantile(mat, self_loops=False, symmetric=True):
    """
    Parameters
    ----------
    mat: array
        Square adjacency matrix.

    Returns
    -------
    array
        Symmetric adjacency matrix (optionally with the diagonal zeroed).
    """

    if mat.shape[0] != mat.shape[1]:
        raise ValueError("Adjacency matrix must be square.")
    dim = mat.shape[0]
    is_directed = not (mat == mat.transpose()).all()
    if is_directed:
        raise ValueError(
            "Input adjacency matrix must be undirected (matrix symmetric)!"
        )

    # absolute correlation
    #mat = np.abs(mat)
    adj = np.copy(mat)
    if not self_loops:
        np.fill_diagonal(adj, 0)
    if symmetric:
        adj = _make_undirected(adj)
    return adj

def make_group_graph(connectomes, self_loops=False, symmetric=True):
    """
    Parameters
    ----------
    connectomes: list of array
        List of connectomes in n_roi x n_roi format, connectomes must all be the same shape.
    self_loops: bool, default=False
        Whether or not to keep self loops in the graph; if set to False, the resulting
        adjacency matrix has zeros along the diagonal.
    symmetric: bool, default=True
        Whether or not to return a symmetric adjacency matrix. In cases where a node is in
        the neighbourhood of another node that is not its neighbour, the connection strength
        between the two will be halved.
    """
    if connectomes[0].shape[0] != connectomes[0].shape[1]:
        raise ValueError("Connectomes must be square.")

    # Group-average connectome, made undirected
    avg_conn = np.array(connectomes).mean(axis=0)
    avg_conn = np.round(avg_conn, 6)
    avg_conn_k = graph_quantile(
        avg_conn, self_loops=self_loops, symmetric=symmetric
    )

    # Format matrix into graph for torch_geometric
    adj_sparse = tg.utils.dense_to_sparse(torch.from_numpy(avg_conn_k))
    return tg.data.Data(edge_index=adj_sparse[0], edge_attr=adj_sparse[1])

# make a graph for the subject
graph = make_group_graph([conn], self_loops=False, symmetric=True)

Random Graph#

Here we also build a random graph whose weights follow \(w\sim N(0,1)\), as a baseline for comparing the message-passing procedure on the connectome versus a random graph. The first image depicts the weight distribution, and the second one represents the connectome of the random graph.

import numpy as np

print(conn.shape)
rand_conn = np.random.normal(0, 1, size=conn.shape)
percentile = np.percentile(rand_conn.ravel(), q=90)
print(percentile)
print(np.sum(rand_conn > 0))
plt.hist(rand_conn.ravel())
plt.title('Distribution of the Random Gaussian Connectome')
rand_conn[rand_conn < percentile] = 0
graph_rnd = make_group_graph([_make_undirected(rand_conn)], self_loops=False, symmetric=True)
(400, 400)
1.2818425145007177
80097
_images/7a23da69782f28cb65955453e197c4c7f12745425ffc3d5452303344568a1452.png
from nilearn import plotting
display = plotting.plot_matrix(rand_conn, vmax=1, vmin=-1,
                               colorbar=True, title='Top 10th percentile of the weights',
                               reorder=True, labels=np.arange(1, 401))
_images/6be03b4a024e7745640e23314b15e750c7a9870d6ce40d720fbe87548c20e5ae.png

In this cell we simply load the dataset with its generalized categories for visualization. For this notebook, we use the eight-category version.

import pickle

with open('/content/drive/MyDrive/Top_5_Label_Data.pkl', 'rb') as f:
    top_5_label_data = pickle.load(f)

with open('/content/drive/MyDrive/Top_8_Label_Data.pkl', 'rb') as f:
    top_8_label_data = pickle.load(f)

import numpy as np
combined_label_data = {}

for key, value in top_8_label_data.items():
    beta_parcel = value['beta_parcel']
    label = value['label']

    if label not in combined_label_data:
        combined_label_data[label] = beta_parcel
    else:
        combined_label_data[label] = np.vstack((combined_label_data[label], beta_parcel))
print(combined_label_data['animal'].shape)
concat_bold = combined_label_data
(336, 400)
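The label-combining loop above can be sketched on toy data (the entry names, shapes, and labels below are hypothetical, not taken from the actual pickle):

```python
import numpy as np

# Hypothetical per-stimulus entries, mimicking the structure of top_8_label_data
entries = {
    'acorn_01b':  {'beta_parcel': np.ones((1, 4)),      'label': 'plant'},
    'tiger_01b':  {'beta_parcel': np.zeros((1, 4)),     'label': 'animal'},
    'walrus_01b': {'beta_parcel': np.full((1, 4), 2.0), 'label': 'animal'},
}

combined = {}
for value in entries.values():
    label, beta = value['label'], value['beta_parcel']
    if label not in combined:
        combined[label] = beta
    else:
        # stack all beta maps sharing a label into one (n_stimuli, n_roi) array
        combined[label] = np.vstack((combined[label], beta))

print({k: v.shape for k, v in combined.items()})
```

This is exactly how the (336, 400) array for 'animal' above arises: 336 stimuli of that label, each contributing one 400-parcel beta map.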

Train, Validation, Test#

In this section, we cover the dataset manipulation and preprocessing stages for GNN training.

Split Dataset#

Here we split the dataset into three sections: train, validation, and test. As a first step, the dataset is transformed into the standard format as implemented here.

import os
import pandas as pd

data_dir = os.path.join('..', 'data')
categories = concat_bold.keys()
# split the data by time window size and save to file
window_length = 1
dic_labels = {name: i for i, name in enumerate(categories)}

# set output paths
split_path = os.path.join(data_dir, 'split_win/')
if not os.path.exists(split_path):
    os.makedirs(split_path)
out_file = os.path.join(split_path, '{}_{:04d}.npy')
out_csv = os.path.join(split_path, 'labels.csv')

label_df = pd.DataFrame(columns=['label', 'filename'])
for label, ts_data in concat_bold.items():
    ts_duration = len(ts_data)
    ts_filename = f"{label}_seg"
    valid_label = dic_labels[label]

    # Split the timeseries
    rem = ts_duration % window_length
    n_splits = int(np.floor(ts_duration / window_length))

    ts_data = ts_data[:(ts_duration - rem), :]

    for j, split_ts in enumerate(np.split(ts_data, n_splits)):
        ts_output_file_name = out_file.format(ts_filename, j)

        split_ts = np.swapaxes(split_ts, 0, 1)
        np.save(ts_output_file_name, split_ts)

        curr_label = {'label': valid_label, 'filename': os.path.basename(ts_output_file_name)}
        # DataFrame.append is deprecated; use pd.concat instead
        label_df = pd.concat([label_df, pd.DataFrame([curr_label])], ignore_index=True)

print(label_df)
label_df.to_csv(out_csv, index=False)
/usr/local/lib/python3.10/dist-packages/pandas/core/dtypes/cast.py:1846: DeprecationWarning: np.find_common_type is deprecated.  Please use `np.result_type` or `np.promote_types`.
See https://numpy.org/devdocs/release/1.25.0-notes.html and the docs for more information.  (Deprecated NumPy 1.25)
  return np.find_common_type(types, [])  # type: ignore[arg-type]
     label                       filename
0        0            animal_seg_0000.npy
1        0            animal_seg_0001.npy
2        0            animal_seg_0002.npy
3        0            animal_seg_0003.npy
4        0            animal_seg_0004.npy
...    ...                            ...
1309     7  sports equipment_seg_0091.npy
1310     7  sports equipment_seg_0092.npy
1311     7  sports equipment_seg_0093.npy
1312     7  sports equipment_seg_0094.npy
1313     7  sports equipment_seg_0095.npy

[1314 rows x 2 columns]
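The trim-and-split logic of the cell above can be illustrated on a toy time series (using a hypothetical window_length of 4 instead of the 1 used in this notebook):

```python
import numpy as np

# Toy time series: 10 timepoints x 3 ROIs
ts_data = np.arange(10 * 3).reshape(10, 3)
window_length = 4

rem = ts_data.shape[0] % window_length        # 2 leftover timepoints get trimmed
n_splits = ts_data.shape[0] // window_length  # 2 full windows remain
ts_data = ts_data[:ts_data.shape[0] - rem, :]

# Split into equal windows, then swap axes so each is (n_roi, window_length)
windows = [np.swapaxes(w, 0, 1) for w in np.split(ts_data, n_splits)]
print([w.shape for w in windows])
```

With window_length = 1, as above, every single timepoint becomes its own (400, 1) sample, which matches the dataset shapes printed further down.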
# split dataset
import sys
sys.path.append('/content/drive/MyDrive/BrainHack_2024_2/Code')
from gcn_windows_dataset import TimeWindowsDataset

random_seed = 0

train_dataset = TimeWindowsDataset(
    data_dir=split_path,
    partition="train",
    random_seed=random_seed,
    pin_memory=True,
    normalize=True,
    shuffle=True)

valid_dataset = TimeWindowsDataset(
    data_dir=split_path,
    partition="valid",
    random_seed=random_seed,
    pin_memory=True,
    normalize=True,
    shuffle=True)

test_dataset = TimeWindowsDataset(
    data_dir=split_path,
    partition="test",
    random_seed=random_seed,
    pin_memory=True,
    normalize=True,
    shuffle=True)

print("train dataset: {}".format(train_dataset))
print("valid dataset: {}".format(valid_dataset))
print("test dataset: {}".format(test_dataset))
train dataset: 919*(torch.Size([400, 1]), ())
valid dataset: 263*(torch.Size([400, 1]), ())
test dataset: 132*(torch.Size([400, 1]), ())
import torch
from torch.utils.data import DataLoader

batch_size = 10

torch.manual_seed(random_seed)
train_generator = DataLoader(train_dataset, batch_size=batch_size, shuffle=True)
valid_generator = DataLoader(valid_dataset, batch_size=batch_size, shuffle=True)
test_generator = DataLoader(test_dataset, batch_size=batch_size, shuffle=True)
train_features, train_labels = next(iter(train_generator))
print(f"Feature batch shape: {train_features.size()}; mean {torch.mean(train_features)}")
print(f"Labels batch shape: {train_labels.size()}; mean {torch.mean(torch.Tensor.float(train_labels))}")
Feature batch shape: torch.Size([10, 400, 1]); mean 1.2636184543168838e-08
Labels batch shape: torch.Size([10]); mean 2.9000000953674316

Building the Graph Convolutional Neural Network Model#

Here we use ChebNet, one of the well-known GCN models, for graph neural network training. The network consists of three ChebConv layers and three fully connected layers followed by a softmax. gcn refers to the model based on the Friends dataset connectome, and gcn_rnd refers to the model based on the random graph.
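Since gcn_model.py itself is not shown in this notebook, here is the standard ChebNet convolution that a ChebConv layer computes (the textbook formulation, not code taken from gcn_model): with \(K\) hops,

\( y = \sum_{k=0}^{K-1} \theta_k \, T_k(\tilde{L}) \, x, \qquad \tilde{L} = \frac{2L}{\lambda_{\max}} - I, \)

where \(L\) is the graph Laplacian built from the adjacency matrix constructed above, \(\theta_k\) are the learnable filter coefficients, and the Chebyshev polynomials follow the recurrence \(T_0(x) = 1\), \(T_1(x) = x\), \(T_k(x) = 2x\,T_{k-1}(x) - T_{k-2}(x)\). Each layer thus aggregates information from nodes up to \(K-1\) edges away in the connectome.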

from gcn_model import GCN

gcn = GCN(graph.edge_index,
          graph.edge_attr,
          n_roi=X.shape[1],
          batch_size=batch_size,
          n_timepoints=window_length,
          n_classes=len(categories))
model = gcn.double()
from gcn_model import GCN

gcn_rnd = GCN(graph_rnd.edge_index,
          graph_rnd.edge_attr,
          n_roi=X.shape[1],
          batch_size=batch_size,
          n_timepoints=window_length,
          n_classes=len(categories))
model_rnd = gcn_rnd.double()
import sklearn.metrics

def train_loop(dataloader, model, loss_fn, optimizer):
    size = len(dataloader.dataset)

    for batch, (X, y) in enumerate(dataloader):
        pred_list = []
        y_list = []
        X = X.double()

        # Compute prediction and loss
        pred = model(X)
        loss = loss_fn(pred, y)

        # Backpropagation
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        loss, current = loss.item(), batch * dataloader.batch_size

        correct = (pred.argmax(1) == y).type(torch.float).sum().item()
        correct /= X.shape[0]
        pred_list.append(pred.argmax(1).numpy())
        y_list.append(y.numpy())
        pred_concat = np.concatenate(pred_list, axis=0)
        y_concat = np.concatenate(y_list, axis=0)
        cheb_f1 = sklearn.metrics.f1_score(y_concat, pred_concat, average='weighted')
        if (batch % 10 == 0) or (current == size):
            print(f"#{batch:>5};\ttrain_loss: {loss:>0.3f};\ttrain_accuracy:{(100*correct):>5.1f}%\t\t[{current:>5d}/{size:>5d}]\tF1_score(weighted):{cheb_f1}")


def valid_test_loop(dataloader, model, loss_fn):
    size = len(dataloader.dataset)
    loss, correct = 0, 0
    pred_list = []
    y_list = []
    with torch.no_grad():
        for X, y in dataloader:
            X = X.double()
            pred = model.forward(X)
            loss += loss_fn(pred, y).item()
            correct += (pred.argmax(1) == y).type(torch.float).sum().item()
            pred_list.append(pred.argmax(1).numpy())
            y_list.append(y.numpy())

    loss /= size
    correct /= size
    pred_concat = np.concatenate(pred_list, axis=0)
    y_concat = np.concatenate(y_list, axis=0)
    return loss, correct,y_concat, pred_concat
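The weighted F1-score used in the loops above can be checked on a toy example (hypothetical labels, not data from this notebook):

```python
import numpy as np
import sklearn.metrics

# Hypothetical imbalanced 3-class labels: class 0 has the most samples
y_true = np.array([0, 0, 0, 0, 1, 1, 2])
y_pred = np.array([0, 0, 1, 0, 1, 1, 2])

# 'weighted' averages the per-class F1 scores weighted by class support,
# so the score reflects the class imbalance rather than treating all
# classes equally (as 'macro' would)
f1_weighted = sklearn.metrics.f1_score(y_true, y_pred, average='weighted')
f1_macro = sklearn.metrics.f1_score(y_true, y_pred, average='macro')
print(f"weighted F1: {f1_weighted:.3f}, macro F1: {f1_macro:.3f}")
```

Here the per-class F1 scores are 6/7, 4/5, and 1; 'weighted' combines them with weights 4/7, 2/7, and 1/7 respectively.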

Graph Training#

Here is the graph training procedure, run for 25 epochs. As the number of samples per class is imbalanced, both the F1-score and the accuracy are printed in the network output.

loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(gcn.parameters(), lr=1e-4, weight_decay=5e-4)

epochs = 25
for t in range(epochs):
    print(f"Epoch {t+1}/{epochs}\n-------------------------------")
    train_loop(train_generator, gcn, loss_fn, optimizer)
    loss, correct,y_concat,pred_concat = valid_test_loop(valid_generator, gcn, loss_fn)
    print(f"Valid metrics:\n\t avg_loss: {loss:>8f};\t avg_accuracy: {(100*correct):>0.1f}%")
Epoch 1/25
-------------------------------
#    0;	train_loss: 1.870;	train_accuracy: 20.0%		[    0/  919]	F1_score(weighted):0.1650793650793651
#   10;	train_loss: 1.907;	train_accuracy: 30.0%		[  100/  919]	F1_score(weighted):0.13846153846153847
#   20;	train_loss: 2.140;	train_accuracy: 10.0%		[  200/  919]	F1_score(weighted):0.16
#   30;	train_loss: 1.643;	train_accuracy: 50.0%		[  300/  919]	F1_score(weighted):0.4493506493506493
#   40;	train_loss: 1.839;	train_accuracy: 50.0%		[  400/  919]	F1_score(weighted):0.41666666666666663
#   50;	train_loss: 1.939;	train_accuracy: 30.0%		[  500/  919]	F1_score(weighted):0.20606060606060606
#   60;	train_loss: 2.067;	train_accuracy: 20.0%		[  600/  919]	F1_score(weighted):0.08888888888888889
#   70;	train_loss: 1.717;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.4036363636363637
#   80;	train_loss: 1.751;	train_accuracy: 20.0%		[  800/  919]	F1_score(weighted):0.06666666666666668
#   90;	train_loss: 1.667;	train_accuracy: 50.0%		[  900/  919]	F1_score(weighted):0.38461538461538464
Valid metrics:
	 avg_loss: 0.200916;	 avg_accuracy: 28.9%
Epoch 2/25
-------------------------------
#    0;	train_loss: 1.802;	train_accuracy: 40.0%		[    0/  919]	F1_score(weighted):0.2833333333333333
#   10;	train_loss: 1.971;	train_accuracy: 30.0%		[  100/  919]	F1_score(weighted):0.17333333333333334
#   20;	train_loss: 2.116;	train_accuracy: 10.0%		[  200/  919]	F1_score(weighted):0.02857142857142857
#   30;	train_loss: 1.816;	train_accuracy: 40.0%		[  300/  919]	F1_score(weighted):0.32
#   40;	train_loss: 1.643;	train_accuracy: 50.0%		[  400/  919]	F1_score(weighted):0.33999999999999997
#   50;	train_loss: 2.310;	train_accuracy:  0.0%		[  500/  919]	F1_score(weighted):0.0
#   60;	train_loss: 1.906;	train_accuracy: 40.0%		[  600/  919]	F1_score(weighted):0.325
#   70;	train_loss: 1.523;	train_accuracy: 60.0%		[  700/  919]	F1_score(weighted):0.5
#   80;	train_loss: 1.844;	train_accuracy: 50.0%		[  800/  919]	F1_score(weighted):0.33999999999999997
#   90;	train_loss: 1.979;	train_accuracy: 20.0%		[  900/  919]	F1_score(weighted):0.13333333333333333
Valid metrics:
	 avg_loss: 0.184546;	 avg_accuracy: 35.4%
Epoch 3/25
-------------------------------
#    0;	train_loss: 1.606;	train_accuracy: 50.0%		[    0/  919]	F1_score(weighted):0.38
#   10;	train_loss: 1.761;	train_accuracy: 40.0%		[  100/  919]	F1_score(weighted):0.35151515151515145
#   20;	train_loss: 1.575;	train_accuracy: 50.0%		[  200/  919]	F1_score(weighted):0.4142857142857143
#   30;	train_loss: 1.570;	train_accuracy: 40.0%		[  300/  919]	F1_score(weighted):0.24
#   40;	train_loss: 1.846;	train_accuracy: 50.0%		[  400/  919]	F1_score(weighted):0.3476190476190476
#   50;	train_loss: 1.328;	train_accuracy: 70.0%		[  500/  919]	F1_score(weighted):0.5805555555555555
#   60;	train_loss: 1.773;	train_accuracy: 50.0%		[  600/  919]	F1_score(weighted):0.3916666666666666
#   70;	train_loss: 1.494;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.34090909090909094
#   80;	train_loss: 1.739;	train_accuracy: 30.0%		[  800/  919]	F1_score(weighted):0.21666666666666665
#   90;	train_loss: 1.678;	train_accuracy: 40.0%		[  900/  919]	F1_score(weighted):0.26666666666666666
Valid metrics:
	 avg_loss: 0.170490;	 avg_accuracy: 39.9%
Epoch 4/25
-------------------------------
#    0;	train_loss: 1.166;	train_accuracy: 60.0%		[    0/  919]	F1_score(weighted):0.5854545454545454
#   10;	train_loss: 1.467;	train_accuracy: 60.0%		[  100/  919]	F1_score(weighted):0.54
#   20;	train_loss: 1.707;	train_accuracy: 40.0%		[  200/  919]	F1_score(weighted):0.27999999999999997
#   30;	train_loss: 1.442;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.5166666666666668
#   40;	train_loss: 1.336;	train_accuracy: 40.0%		[  400/  919]	F1_score(weighted):0.31
#   50;	train_loss: 1.164;	train_accuracy: 60.0%		[  500/  919]	F1_score(weighted):0.5238095238095237
#   60;	train_loss: 0.944;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.5878787878787878
#   70;	train_loss: 1.152;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.36
#   80;	train_loss: 1.889;	train_accuracy: 30.0%		[  800/  919]	F1_score(weighted):0.21428571428571433
#   90;	train_loss: 1.591;	train_accuracy: 60.0%		[  900/  919]	F1_score(weighted):0.4512820512820513
Valid metrics:
	 avg_loss: 0.163098;	 avg_accuracy: 43.0%
Epoch 5/25
-------------------------------
#    0;	train_loss: 1.524;	train_accuracy: 40.0%		[    0/  919]	F1_score(weighted):0.3333333333333333
#   10;	train_loss: 1.068;	train_accuracy: 70.0%		[  100/  919]	F1_score(weighted):0.62
#   20;	train_loss: 0.834;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.6714285714285715
#   30;	train_loss: 1.253;	train_accuracy: 70.0%		[  300/  919]	F1_score(weighted):0.6333333333333333
#   40;	train_loss: 1.141;	train_accuracy: 70.0%		[  400/  919]	F1_score(weighted):0.7066666666666668
#   50;	train_loss: 1.499;	train_accuracy: 40.0%		[  500/  919]	F1_score(weighted):0.29333333333333333
#   60;	train_loss: 1.715;	train_accuracy: 30.0%		[  600/  919]	F1_score(weighted):0.21428571428571433
#   70;	train_loss: 1.052;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.44000000000000006
#   80;	train_loss: 0.832;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.86
#   90;	train_loss: 1.514;	train_accuracy: 30.0%		[  900/  919]	F1_score(weighted):0.22499999999999995
Valid metrics:
	 avg_loss: 0.149733;	 avg_accuracy: 43.3%
Epoch 6/25
-------------------------------
#    0;	train_loss: 1.095;	train_accuracy: 70.0%		[    0/  919]	F1_score(weighted):0.6761904761904762
#   10;	train_loss: 0.952;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.72
#   20;	train_loss: 0.953;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.5809523809523809
#   30;	train_loss: 1.257;	train_accuracy: 50.0%		[  300/  919]	F1_score(weighted):0.4666666666666666
#   40;	train_loss: 1.088;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.7971428571428572
#   50;	train_loss: 0.936;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.7166666666666666
#   60;	train_loss: 1.041;	train_accuracy: 50.0%		[  600/  919]	F1_score(weighted):0.3333333333333333
#   70;	train_loss: 1.502;	train_accuracy: 40.0%		[  700/  919]	F1_score(weighted):0.36666666666666664
#   80;	train_loss: 0.885;	train_accuracy: 70.0%		[  800/  919]	F1_score(weighted):0.6504761904761904
#   90;	train_loss: 1.392;	train_accuracy: 50.0%		[  900/  919]	F1_score(weighted):0.5833333333333333
Valid metrics:
	 avg_loss: 0.144413;	 avg_accuracy: 49.0%
Epoch 7/25
-------------------------------
#    0;	train_loss: 0.698;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.75
#   10;	train_loss: 0.582;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.8
#   20;	train_loss: 0.860;	train_accuracy: 80.0%		[  200/  919]	F1_score(weighted):0.75
#   30;	train_loss: 1.139;	train_accuracy: 70.0%		[  300/  919]	F1_score(weighted):0.6933333333333334
#   40;	train_loss: 1.417;	train_accuracy: 30.0%		[  400/  919]	F1_score(weighted):0.37499999999999994
#   50;	train_loss: 0.994;	train_accuracy: 60.0%		[  500/  919]	F1_score(weighted):0.5666666666666667
#   60;	train_loss: 1.368;	train_accuracy: 40.0%		[  600/  919]	F1_score(weighted):0.4
#   70;	train_loss: 1.092;	train_accuracy: 60.0%		[  700/  919]	F1_score(weighted):0.5566666666666666
#   80;	train_loss: 0.913;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.8428571428571427
#   90;	train_loss: 0.782;	train_accuracy: 70.0%		[  900/  919]	F1_score(weighted):0.5916666666666666
Valid metrics:
	 avg_loss: 0.142569;	 avg_accuracy: 49.8%
Epoch 8/25
-------------------------------
#    0;	train_loss: 0.947;	train_accuracy: 60.0%		[    0/  919]	F1_score(weighted):0.5642857142857143
#   10;	train_loss: 0.722;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7555555555555556
#   20;	train_loss: 0.740;	train_accuracy: 80.0%		[  200/  919]	F1_score(weighted):0.7888888888888889
#   30;	train_loss: 1.105;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.5233333333333333
#   40;	train_loss: 0.820;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.7555555555555555
#   50;	train_loss: 0.799;	train_accuracy: 60.0%		[  500/  919]	F1_score(weighted):0.54
#   60;	train_loss: 0.812;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.7066666666666668
#   70;	train_loss: 0.710;	train_accuracy: 70.0%		[  700/  919]	F1_score(weighted):0.7047619047619047
#   80;	train_loss: 1.178;	train_accuracy: 60.0%		[  800/  919]	F1_score(weighted):0.5261904761904762
#   90;	train_loss: 0.866;	train_accuracy: 70.0%		[  900/  919]	F1_score(weighted):0.625
Valid metrics:
	 avg_loss: 0.139619;	 avg_accuracy: 48.3%
Epoch 9/25
-------------------------------
#    0;	train_loss: 0.875;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.8095238095238095
#   10;	train_loss: 0.986;	train_accuracy: 60.0%		[  100/  919]	F1_score(weighted):0.6
#   20;	train_loss: 0.923;	train_accuracy: 60.0%		[  200/  919]	F1_score(weighted):0.51
#   30;	train_loss: 1.320;	train_accuracy: 70.0%		[  300/  919]	F1_score(weighted):0.6599999999999999
#   40;	train_loss: 0.842;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.7545454545454545
#   50;	train_loss: 0.536;	train_accuracy: 90.0%		[  500/  919]	F1_score(weighted):0.8971428571428571
#   60;	train_loss: 1.039;	train_accuracy: 50.0%		[  600/  919]	F1_score(weighted):0.5285714285714287
#   70;	train_loss: 0.512;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.8955555555555555
#   80;	train_loss: 0.535;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9333333333333333
#   90;	train_loss: 1.404;	train_accuracy: 30.0%		[  900/  919]	F1_score(weighted):0.3933333333333333
Valid metrics:
	 avg_loss: 0.130783;	 avg_accuracy: 52.9%
Epoch 10/25
-------------------------------
#    0;	train_loss: 0.547;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8971428571428571
#   10;	train_loss: 0.759;	train_accuracy: 70.0%		[  100/  919]	F1_score(weighted):0.5866666666666667
#   20;	train_loss: 0.446;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.509;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.9028571428571428
#   40;	train_loss: 0.485;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8571428571428571
#   50;	train_loss: 0.655;	train_accuracy: 70.0%		[  500/  919]	F1_score(weighted):0.7133333333333333
#   60;	train_loss: 0.272;	train_accuracy: 90.0%		[  600/  919]	F1_score(weighted):0.9
#   70;	train_loss: 0.547;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.7116883116883116
#   80;	train_loss: 0.908;	train_accuracy: 70.0%		[  800/  919]	F1_score(weighted):0.6666666666666666
#   90;	train_loss: 0.431;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.134313;	 avg_accuracy: 52.1%
Epoch 11/25
-------------------------------
#    0;	train_loss: 0.502;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.8099999999999999
#   10;	train_loss: 0.546;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7666666666666667
#   20;	train_loss: 0.767;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.6571428571428571
#   30;	train_loss: 0.624;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.8777777777777779
#   40;	train_loss: 0.494;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.7666666666666666
#   50;	train_loss: 0.261;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.494;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.8666666666666666
#   70;	train_loss: 0.566;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.7533333333333333
#   80;	train_loss: 0.364;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.960;	train_accuracy: 60.0%		[  900/  919]	F1_score(weighted):0.5833333333333333
Valid metrics:
	 avg_loss: 0.146181;	 avg_accuracy: 49.8%
Epoch 12/25
-------------------------------
#    0;	train_loss: 0.731;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.7999999999999999
#   10;	train_loss: 0.345;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.419;	train_accuracy: 80.0%		[  200/  919]	F1_score(weighted):0.8333333333333333
#   30;	train_loss: 0.909;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.5571428571428572
#   40;	train_loss: 0.568;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8933333333333333
#   50;	train_loss: 0.262;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.449;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.8
#   70;	train_loss: 1.692;	train_accuracy: 40.0%		[  700/  919]	F1_score(weighted):0.31
#   80;	train_loss: 0.395;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9428571428571428
#   90;	train_loss: 0.359;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.8666666666666668
Valid metrics:
	 avg_loss: 0.134197;	 avg_accuracy: 57.0%
Epoch 13/25
-------------------------------
#    0;	train_loss: 0.513;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.76
#   10;	train_loss: 0.307;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.278;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.8904761904761905
#   30;	train_loss: 0.565;	train_accuracy: 70.0%		[  300/  919]	F1_score(weighted):0.7
#   40;	train_loss: 0.345;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8666666666666666
#   50;	train_loss: 0.147;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.262;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.642;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.76
#   80;	train_loss: 0.584;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.86
#   90;	train_loss: 0.244;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.152547;	 avg_accuracy: 54.4%
Epoch 14/25
-------------------------------
#    0;	train_loss: 0.343;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.9
#   10;	train_loss: 0.551;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7904761904761904
#   20;	train_loss: 0.282;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.335;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.86
#   40;	train_loss: 0.416;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8904761904761905
#   50;	train_loss: 0.198;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.231;	train_accuracy: 90.0%		[  600/  919]	F1_score(weighted):0.9
#   70;	train_loss: 0.107;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.531;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.7933333333333332
#   90;	train_loss: 0.331;	train_accuracy: 80.0%		[  900/  919]	F1_score(weighted):0.7733333333333333
Valid metrics:
	 avg_loss: 0.152705;	 avg_accuracy: 54.8%
Epoch 15/25
-------------------------------
#    0;	train_loss: 0.094;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.340;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7971428571428572
#   20;	train_loss: 0.271;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.9444444444444444
#   30;	train_loss: 0.524;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.7971428571428572
#   40;	train_loss: 0.227;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.9
#   50;	train_loss: 0.489;	train_accuracy: 90.0%		[  500/  919]	F1_score(weighted):0.9444444444444444
#   60;	train_loss: 0.303;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.412;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.7333333333333333
#   80;	train_loss: 0.464;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9400000000000001
#   90;	train_loss: 0.193;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.160881;	 avg_accuracy: 51.0%
Epoch 16/25
-------------------------------
#    0;	train_loss: 0.321;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8999999999999998
#   10;	train_loss: 0.135;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.762;	train_accuracy: 60.0%		[  200/  919]	F1_score(weighted):0.6166666666666667
#   30;	train_loss: 0.268;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.114;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.550;	train_accuracy: 70.0%		[  500/  919]	F1_score(weighted):0.7404761904761904
#   60;	train_loss: 0.132;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.384;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.8666666666666668
#   80;	train_loss: 0.269;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.8083333333333333
#   90;	train_loss: 0.327;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.8666666666666666
Valid metrics:
	 avg_loss: 0.145256;	 avg_accuracy: 57.8%
Epoch 17/25
-------------------------------
#    0;	train_loss: 0.097;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.365;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.9400000000000001
#   20;	train_loss: 0.597;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.6761904761904761
#   30;	train_loss: 0.519;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.7533333333333333
#   40;	train_loss: 0.379;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.9428571428571428
#   50;	train_loss: 0.155;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.246;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.358;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.8400000000000001
#   80;	train_loss: 0.331;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.875
#   90;	train_loss: 0.450;	train_accuracy: 80.0%		[  900/  919]	F1_score(weighted):0.7333333333333334
Valid metrics:
	 avg_loss: 0.153717;	 avg_accuracy: 54.4%
Epoch 18/25
-------------------------------
#    0;	train_loss: 0.338;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.8
#   10;	train_loss: 0.112;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.296;	train_accuracy: 80.0%		[  200/  919]	F1_score(weighted):0.7571428571428571
#   30;	train_loss: 0.336;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.8
#   40;	train_loss: 0.738;	train_accuracy: 60.0%		[  400/  919]	F1_score(weighted):0.625
#   50;	train_loss: 0.167;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.076;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.175;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.067;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.393;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9333333333333332
Valid metrics:
	 avg_loss: 0.151509;	 avg_accuracy: 56.3%
Epoch 19/25
-------------------------------
#    0;	train_loss: 0.383;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.7666666666666666
#   10;	train_loss: 0.363;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.9
#   20;	train_loss: 0.083;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.326;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.9333333333333333
#   40;	train_loss: 0.549;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.7833333333333333
#   50;	train_loss: 0.116;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.064;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.184;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.9333333333333332
#   80;	train_loss: 0.109;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.513;	train_accuracy: 80.0%		[  900/  919]	F1_score(weighted):0.8400000000000001
Valid metrics:
	 avg_loss: 0.172529;	 avg_accuracy: 52.9%
Epoch 20/25
-------------------------------
#    0;	train_loss: 0.160;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.9428571428571428
#   10;	train_loss: 0.400;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.9333333333333332
#   20;	train_loss: 0.027;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.439;	train_accuracy: 70.0%		[  300/  919]	F1_score(weighted):0.7
#   40;	train_loss: 0.184;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8904761904761905
#   50;	train_loss: 0.068;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.087;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.190;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.9066666666666666
#   80;	train_loss: 0.365;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.8904761904761905
#   90;	train_loss: 0.188;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.86
Valid metrics:
	 avg_loss: 0.174843;	 avg_accuracy: 56.7%
Epoch 21/25
-------------------------------
#    0;	train_loss: 0.170;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.188;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.8933333333333333
#   20;	train_loss: 0.155;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.121;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.025;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.529;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.8333333333333333
#   60;	train_loss: 0.132;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.148;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.027;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.290;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9
Valid metrics:
	 avg_loss: 0.163132;	 avg_accuracy: 54.4%
Epoch 22/25
-------------------------------
#    0;	train_loss: 0.224;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.045;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.107;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.052;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.337;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.8
#   50;	train_loss: 0.042;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.028;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.273;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.8971428571428571
#   80;	train_loss: 0.327;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9428571428571428
#   90;	train_loss: 0.110;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.185141;	 avg_accuracy: 56.3%
Epoch 23/25
-------------------------------
#    0;	train_loss: 0.335;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8971428571428571
#   10;	train_loss: 0.111;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.278;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.888888888888889
#   30;	train_loss: 0.108;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.8545454545454545
#   40;	train_loss: 0.044;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.069;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.074;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.117;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.145;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9066666666666666
#   90;	train_loss: 0.262;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9444444444444444
Valid metrics:
	 avg_loss: 0.187326;	 avg_accuracy: 51.7%
Epoch 24/25
-------------------------------
#    0;	train_loss: 0.265;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.7971428571428572
#   10;	train_loss: 0.187;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7666666666666667
#   20;	train_loss: 0.052;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.121;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.86
#   40;	train_loss: 0.257;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.76
#   50;	train_loss: 0.148;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.188;	train_accuracy: 90.0%		[  600/  919]	F1_score(weighted):0.86
#   70;	train_loss: 0.027;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.521;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.8904761904761905
#   90;	train_loss: 0.260;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9
Valid metrics:
	 avg_loss: 0.188785;	 avg_accuracy: 52.1%
Epoch 25/25
-------------------------------
#    0;	train_loss: 0.109;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8933333333333333
#   10;	train_loss: 0.472;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.76
#   20;	train_loss: 0.252;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.8666666666666666
#   30;	train_loss: 0.046;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.177;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.9066666666666666
#   50;	train_loss: 0.045;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.042;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.166;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.9
#   80;	train_loss: 0.484;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.7666666666666668
#   90;	train_loss: 0.102;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.212603;	 avg_accuracy: 51.7%
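Note that in the log above, training accuracy climbs toward 100% while the validation loss bottoms out around epoch 9 and then rises, a typical sign of overfitting. One common remedy is patience-based early stopping on the validation loss. The helper below is a minimal sketch (not part of the lab code; the function name and `patience` parameter are our own) showing how the stopping epoch could be chosen from the recorded validation losses:

```python
def early_stop_epoch(valid_losses, patience=3):
    """Return the 1-based epoch at which training would stop: the first
    epoch after the validation loss has failed to improve for `patience`
    consecutive epochs (or the last epoch if it never stops improving)."""
    best = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(valid_losses, start=1):
        if loss < best:
            # New best validation loss: reset the patience counter.
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            # No improvement for `patience` epochs: stop here.
            return epoch
    return len(valid_losses)

# A loss curve that improves for three epochs, then worsens:
print(early_stop_epoch([0.20, 0.15, 0.13, 0.14, 0.15, 0.16]))  # -> 6
```

In practice you would also keep a copy of the model weights from the best epoch and restore them when stopping, rather than using the final weights.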

Random Graph Training#

The same training procedure described for the initial graph is applied to the random-graph model as well.

loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(gcn_rnd.parameters(), lr=1e-4, weight_decay=5e-4)

epochs = 25
for t in range(epochs):
    print(f"Epoch {t+1}/{epochs}\n-------------------------------")
    train_loop(train_generator, gcn_rnd, loss_fn, optimizer)
    loss_rnd, correct_rnd, y_concat_rnd, pred_concat_rnd = valid_test_loop(valid_generator, gcn_rnd, loss_fn)
    print(f"Valid metrics:\n\t avg_loss: {loss_rnd:>8f};\t avg_accuracy: {(100*correct_rnd):>0.1f}%")
Epoch 1/25
-------------------------------
#    0;	train_loss: 2.075;	train_accuracy:  0.0%		[    0/  919]	F1_score(weighted):0.0
#   10;	train_loss: 1.980;	train_accuracy: 30.0%		[  100/  919]	F1_score(weighted):0.13846153846153847
#   20;	train_loss: 2.105;	train_accuracy:  0.0%		[  200/  919]	F1_score(weighted):0.0
#   30;	train_loss: 1.712;	train_accuracy: 20.0%		[  300/  919]	F1_score(weighted):0.17222222222222222
#   40;	train_loss: 2.068;	train_accuracy: 20.0%		[  400/  919]	F1_score(weighted):0.06666666666666668
#   50;	train_loss: 1.928;	train_accuracy: 20.0%		[  500/  919]	F1_score(weighted):0.06666666666666668
#   60;	train_loss: 1.978;	train_accuracy: 10.0%		[  600/  919]	F1_score(weighted):0.01818181818181818
#   70;	train_loss: 1.838;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.37948717948717947
#   80;	train_loss: 1.936;	train_accuracy: 20.0%		[  800/  919]	F1_score(weighted):0.16
#   90;	train_loss: 2.314;	train_accuracy: 10.0%		[  900/  919]	F1_score(weighted):0.08
Valid metrics:
	 avg_loss: 0.202870;	 avg_accuracy: 31.9%
Epoch 2/25
-------------------------------
#    0;	train_loss: 2.111;	train_accuracy: 30.0%		[    0/  919]	F1_score(weighted):0.22222222222222224
#   10;	train_loss: 1.752;	train_accuracy: 30.0%		[  100/  919]	F1_score(weighted):0.23272727272727273
#   20;	train_loss: 1.875;	train_accuracy: 40.0%		[  200/  919]	F1_score(weighted):0.2461538461538461
#   30;	train_loss: 1.983;	train_accuracy: 30.0%		[  300/  919]	F1_score(weighted):0.18333333333333335
#   40;	train_loss: 1.938;	train_accuracy: 30.0%		[  400/  919]	F1_score(weighted):0.24
#   50;	train_loss: 1.991;	train_accuracy: 10.0%		[  500/  919]	F1_score(weighted):0.02222222222222222
#   60;	train_loss: 2.020;	train_accuracy: 30.0%		[  600/  919]	F1_score(weighted):0.13846153846153847
#   70;	train_loss: 2.094;	train_accuracy: 20.0%		[  700/  919]	F1_score(weighted):0.06666666666666668
#   80;	train_loss: 2.055;	train_accuracy: 20.0%		[  800/  919]	F1_score(weighted):0.06666666666666668
#   90;	train_loss: 1.925;	train_accuracy: 30.0%		[  900/  919]	F1_score(weighted):0.33333333333333326
Valid metrics:
	 avg_loss: 0.193808;	 avg_accuracy: 37.6%
Epoch 3/25
-------------------------------
#    0;	train_loss: 1.793;	train_accuracy: 50.0%		[    0/  919]	F1_score(weighted):0.38461538461538464
#   10;	train_loss: 1.724;	train_accuracy: 50.0%		[  100/  919]	F1_score(weighted):0.33999999999999997
#   20;	train_loss: 2.342;	train_accuracy: 20.0%		[  200/  919]	F1_score(weighted):0.06666666666666668
#   30;	train_loss: 1.898;	train_accuracy: 30.0%		[  300/  919]	F1_score(weighted):0.2666666666666666
#   40;	train_loss: 1.653;	train_accuracy: 50.0%		[  400/  919]	F1_score(weighted):0.41666666666666663
#   50;	train_loss: 2.016;	train_accuracy: 10.0%		[  500/  919]	F1_score(weighted):0.05714285714285715
#   60;	train_loss: 1.391;	train_accuracy: 60.0%		[  600/  919]	F1_score(weighted):0.6461538461538461
#   70;	train_loss: 1.822;	train_accuracy: 40.0%		[  700/  919]	F1_score(weighted):0.2857142857142857
#   80;	train_loss: 1.706;	train_accuracy: 30.0%		[  800/  919]	F1_score(weighted):0.1619047619047619
#   90;	train_loss: 1.683;	train_accuracy: 40.0%		[  900/  919]	F1_score(weighted):0.27999999999999997
Valid metrics:
	 avg_loss: 0.174868;	 avg_accuracy: 39.5%
Epoch 4/25
-------------------------------
#    0;	train_loss: 1.209;	train_accuracy: 60.0%		[    0/  919]	F1_score(weighted):0.4533333333333333
#   10;	train_loss: 1.105;	train_accuracy: 70.0%		[  100/  919]	F1_score(weighted):0.5878787878787878
#   20;	train_loss: 1.809;	train_accuracy: 50.0%		[  200/  919]	F1_score(weighted):0.3333333333333333
#   30;	train_loss: 1.962;	train_accuracy: 20.0%		[  300/  919]	F1_score(weighted):0.06666666666666668
#   40;	train_loss: 1.198;	train_accuracy: 60.0%		[  400/  919]	F1_score(weighted):0.48
#   50;	train_loss: 1.298;	train_accuracy: 60.0%		[  500/  919]	F1_score(weighted):0.45714285714285713
#   60;	train_loss: 1.649;	train_accuracy: 30.0%		[  600/  919]	F1_score(weighted):0.1542857142857143
#   70;	train_loss: 1.309;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.33928571428571425
#   80;	train_loss: 1.393;	train_accuracy: 50.0%		[  800/  919]	F1_score(weighted):0.4542857142857143
#   90;	train_loss: 1.553;	train_accuracy: 40.0%		[  900/  919]	F1_score(weighted):0.3333333333333333
Valid metrics:
	 avg_loss: 0.160056;	 avg_accuracy: 44.5%
Epoch 5/25
-------------------------------
#    0;	train_loss: 1.552;	train_accuracy: 50.0%		[    0/  919]	F1_score(weighted):0.45
#   10;	train_loss: 1.273;	train_accuracy: 50.0%		[  100/  919]	F1_score(weighted):0.42380952380952375
#   20;	train_loss: 1.778;	train_accuracy: 30.0%		[  200/  919]	F1_score(weighted):0.29333333333333333
#   30;	train_loss: 1.636;	train_accuracy: 20.0%		[  300/  919]	F1_score(weighted):0.13333333333333333
#   40;	train_loss: 1.126;	train_accuracy: 70.0%		[  400/  919]	F1_score(weighted):0.6555555555555556
#   50;	train_loss: 1.268;	train_accuracy: 60.0%		[  500/  919]	F1_score(weighted):0.5166666666666666
#   60;	train_loss: 0.934;	train_accuracy: 60.0%		[  600/  919]	F1_score(weighted):0.4904761904761904
#   70;	train_loss: 1.685;	train_accuracy: 30.0%		[  700/  919]	F1_score(weighted):0.22666666666666666
#   80;	train_loss: 1.124;	train_accuracy: 60.0%		[  800/  919]	F1_score(weighted):0.4533333333333333
#   90;	train_loss: 1.792;	train_accuracy: 20.0%		[  900/  919]	F1_score(weighted):0.10857142857142858
Valid metrics:
	 avg_loss: 0.151992;	 avg_accuracy: 45.6%
Epoch 6/25
-------------------------------
#    0;	train_loss: 1.424;	train_accuracy: 60.0%		[    0/  919]	F1_score(weighted):0.5333333333333333
#   10;	train_loss: 1.054;	train_accuracy: 50.0%		[  100/  919]	F1_score(weighted):0.36499999999999994
#   20;	train_loss: 1.478;	train_accuracy: 60.0%		[  200/  919]	F1_score(weighted):0.5666666666666667
#   30;	train_loss: 0.585;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 1.658;	train_accuracy: 50.0%		[  400/  919]	F1_score(weighted):0.5066666666666666
#   50;	train_loss: 1.394;	train_accuracy: 40.0%		[  500/  919]	F1_score(weighted):0.45
#   60;	train_loss: 1.206;	train_accuracy: 50.0%		[  600/  919]	F1_score(weighted):0.45555555555555555
#   70;	train_loss: 1.032;	train_accuracy: 50.0%		[  700/  919]	F1_score(weighted):0.45555555555555555
#   80;	train_loss: 1.032;	train_accuracy: 70.0%		[  800/  919]	F1_score(weighted):0.6333333333333334
#   90;	train_loss: 2.063;	train_accuracy: 40.0%		[  900/  919]	F1_score(weighted):0.26666666666666666
Valid metrics:
	 avg_loss: 0.146751;	 avg_accuracy: 47.5%
Epoch 7/25
-------------------------------
#    0;	train_loss: 0.938;	train_accuracy: 50.0%		[    0/  919]	F1_score(weighted):0.5271428571428571
#   10;	train_loss: 1.285;	train_accuracy: 40.0%		[  100/  919]	F1_score(weighted):0.35
#   20;	train_loss: 0.942;	train_accuracy: 60.0%		[  200/  919]	F1_score(weighted):0.6000000000000001
#   30;	train_loss: 1.150;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.5599999999999999
#   40;	train_loss: 0.934;	train_accuracy: 60.0%		[  400/  919]	F1_score(weighted):0.6
#   50;	train_loss: 1.357;	train_accuracy: 50.0%		[  500/  919]	F1_score(weighted):0.4303030303030303
#   60;	train_loss: 0.896;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.6222222222222222
#   70;	train_loss: 0.816;	train_accuracy: 70.0%		[  700/  919]	F1_score(weighted):0.6904761904761905
#   80;	train_loss: 0.803;	train_accuracy: 60.0%		[  800/  919]	F1_score(weighted):0.6333333333333333
#   90;	train_loss: 1.310;	train_accuracy: 50.0%		[  900/  919]	F1_score(weighted):0.48
Valid metrics:
	 avg_loss: 0.142249;	 avg_accuracy: 48.3%
Epoch 8/25
-------------------------------
#    0;	train_loss: 0.872;	train_accuracy: 60.0%		[    0/  919]	F1_score(weighted):0.5571428571428572
#   10;	train_loss: 1.798;	train_accuracy: 40.0%		[  100/  919]	F1_score(weighted):0.32380952380952377
#   20;	train_loss: 0.905;	train_accuracy: 60.0%		[  200/  919]	F1_score(weighted):0.480952380952381
#   30;	train_loss: 0.798;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.6
#   40;	train_loss: 0.887;	train_accuracy: 70.0%		[  400/  919]	F1_score(weighted):0.7166666666666666
#   50;	train_loss: 0.745;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.8333333333333333
#   60;	train_loss: 1.023;	train_accuracy: 60.0%		[  600/  919]	F1_score(weighted):0.6428571428571429
#   70;	train_loss: 0.675;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.8
#   80;	train_loss: 1.941;	train_accuracy: 40.0%		[  800/  919]	F1_score(weighted):0.36190476190476184
#   90;	train_loss: 0.801;	train_accuracy: 60.0%		[  900/  919]	F1_score(weighted):0.58
Valid metrics:
	 avg_loss: 0.149695;	 avg_accuracy: 44.9%
Epoch 9/25
-------------------------------
#    0;	train_loss: 0.793;	train_accuracy: 70.0%		[    0/  919]	F1_score(weighted):0.6142857142857143
#   10;	train_loss: 1.502;	train_accuracy: 60.0%		[  100/  919]	F1_score(weighted):0.64
#   20;	train_loss: 0.344;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.8666666666666666
#   30;	train_loss: 1.144;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.5833333333333333
#   40;	train_loss: 0.797;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.8333333333333333
#   50;	train_loss: 0.882;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.7238095238095237
#   60;	train_loss: 0.753;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.7266666666666667
#   70;	train_loss: 1.210;	train_accuracy: 60.0%		[  700/  919]	F1_score(weighted):0.525
#   80;	train_loss: 0.822;	train_accuracy: 70.0%		[  800/  919]	F1_score(weighted):0.6222222222222222
#   90;	train_loss: 0.824;	train_accuracy: 70.0%		[  900/  919]	F1_score(weighted):0.6404761904761904
Valid metrics:
	 avg_loss: 0.130839;	 avg_accuracy: 52.9%
Epoch 10/25
-------------------------------
#    0;	train_loss: 0.847;	train_accuracy: 50.0%		[    0/  919]	F1_score(weighted):0.4
#   10;	train_loss: 0.407;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.8904761904761905
#   20;	train_loss: 0.967;	train_accuracy: 60.0%		[  200/  919]	F1_score(weighted):0.6000000000000001
#   30;	train_loss: 1.174;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.47428571428571437
#   40;	train_loss: 0.795;	train_accuracy: 60.0%		[  400/  919]	F1_score(weighted):0.6344444444444445
#   50;	train_loss: 0.621;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.533;	train_accuracy: 90.0%		[  600/  919]	F1_score(weighted):0.8666666666666666
#   70;	train_loss: 1.092;	train_accuracy: 70.0%		[  700/  919]	F1_score(weighted):0.6238095238095237
#   80;	train_loss: 0.655;	train_accuracy: 70.0%		[  800/  919]	F1_score(weighted):0.625
#   90;	train_loss: 0.811;	train_accuracy: 60.0%		[  900/  919]	F1_score(weighted):0.6833333333333333
Valid metrics:
	 avg_loss: 0.132440;	 avg_accuracy: 52.1%
Epoch 11/25
-------------------------------
#    0;	train_loss: 1.085;	train_accuracy: 60.0%		[    0/  919]	F1_score(weighted):0.6900000000000001
#   10;	train_loss: 0.795;	train_accuracy: 50.0%		[  100/  919]	F1_score(weighted):0.54
#   20;	train_loss: 0.537;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.9044444444444444
#   30;	train_loss: 0.842;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.5777777777777778
#   40;	train_loss: 1.002;	train_accuracy: 60.0%		[  400/  919]	F1_score(weighted):0.6
#   50;	train_loss: 0.464;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.8428571428571429
#   60;	train_loss: 0.742;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.6928571428571428
#   70;	train_loss: 0.527;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.9444444444444444
#   80;	train_loss: 1.350;	train_accuracy: 50.0%		[  800/  919]	F1_score(weighted):0.39999999999999997
#   90;	train_loss: 0.489;	train_accuracy: 80.0%		[  900/  919]	F1_score(weighted):0.7833333333333333
Valid metrics:
	 avg_loss: 0.141297;	 avg_accuracy: 50.6%
Epoch 12/25
-------------------------------
#    0;	train_loss: 0.472;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.9066666666666666
#   10;	train_loss: 0.808;	train_accuracy: 70.0%		[  100/  919]	F1_score(weighted):0.7133333333333333
#   20;	train_loss: 0.836;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.7
#   30;	train_loss: 0.786;	train_accuracy: 60.0%		[  300/  919]	F1_score(weighted):0.64
#   40;	train_loss: 0.435;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.499;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.7266666666666667
#   60;	train_loss: 1.176;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.7333333333333333
#   70;	train_loss: 0.579;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.7666666666666666
#   80;	train_loss: 1.135;	train_accuracy: 70.0%		[  800/  919]	F1_score(weighted):0.7
#   90;	train_loss: 0.465;	train_accuracy: 80.0%		[  900/  919]	F1_score(weighted):0.7733333333333334
Valid metrics:
	 avg_loss: 0.137995;	 avg_accuracy: 50.6%
Epoch 13/25
-------------------------------
#    0;	train_loss: 0.308;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.9095238095238095
#   10;	train_loss: 0.440;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7545454545454545
#   20;	train_loss: 0.499;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.6666666666666666
#   30;	train_loss: 0.513;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.8955555555555555
#   40;	train_loss: 0.667;	train_accuracy: 70.0%		[  400/  919]	F1_score(weighted):0.6416666666666666
#   50;	train_loss: 0.750;	train_accuracy: 70.0%		[  500/  919]	F1_score(weighted):0.6833333333333333
#   60;	train_loss: 0.469;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.8266666666666665
#   70;	train_loss: 0.906;	train_accuracy: 70.0%		[  700/  919]	F1_score(weighted):0.6266666666666667
#   80;	train_loss: 0.669;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.7833333333333333
#   90;	train_loss: 0.475;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.144964;	 avg_accuracy: 52.1%
Epoch 14/25
-------------------------------
#    0;	train_loss: 0.462;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.8666666666666666
#   10;	train_loss: 0.587;	train_accuracy: 70.0%		[  100/  919]	F1_score(weighted):0.6166666666666666
#   20;	train_loss: 0.461;	train_accuracy: 80.0%		[  200/  919]	F1_score(weighted):0.7833333333333333
#   30;	train_loss: 0.421;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.9333333333333332
#   40;	train_loss: 0.642;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.8333333333333333
#   50;	train_loss: 0.452;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.8444444444444444
#   60;	train_loss: 0.517;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.7504761904761904
#   70;	train_loss: 0.780;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.832857142857143
#   80;	train_loss: 0.679;	train_accuracy: 60.0%		[  800/  919]	F1_score(weighted):0.6066666666666667
#   90;	train_loss: 0.555;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9
Valid metrics:
	 avg_loss: 0.144589;	 avg_accuracy: 51.0%
Epoch 15/25
-------------------------------
#    0;	train_loss: 0.404;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8904761904761905
#   10;	train_loss: 0.408;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.9333333333333332
#   20;	train_loss: 0.897;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.719047619047619
#   30;	train_loss: 0.090;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.685;	train_accuracy: 70.0%		[  400/  919]	F1_score(weighted):0.735
#   50;	train_loss: 0.589;	train_accuracy: 90.0%		[  500/  919]	F1_score(weighted):0.9
#   60;	train_loss: 0.483;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.8266666666666665
#   70;	train_loss: 0.652;	train_accuracy: 70.0%		[  700/  919]	F1_score(weighted):0.6266666666666667
#   80;	train_loss: 0.548;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.7666666666666667
#   90;	train_loss: 0.685;	train_accuracy: 70.0%		[  900/  919]	F1_score(weighted):0.7333333333333333
Valid metrics:
	 avg_loss: 0.149080;	 avg_accuracy: 51.3%
Epoch 16/25
-------------------------------
#    0;	train_loss: 0.277;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.551;	train_accuracy: 70.0%		[  100/  919]	F1_score(weighted):0.7
#   20;	train_loss: 0.546;	train_accuracy: 70.0%		[  200/  919]	F1_score(weighted):0.6599999999999999
#   30;	train_loss: 0.665;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.8
#   40;	train_loss: 0.889;	train_accuracy: 70.0%		[  400/  919]	F1_score(weighted):0.6928571428571428
#   50;	train_loss: 0.518;	train_accuracy: 90.0%		[  500/  919]	F1_score(weighted):0.8933333333333333
#   60;	train_loss: 0.665;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.65
#   70;	train_loss: 0.842;	train_accuracy: 70.0%		[  700/  919]	F1_score(weighted):0.775
#   80;	train_loss: 0.270;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.546;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.86
Valid metrics:
	 avg_loss: 0.155328;	 avg_accuracy: 51.0%
Epoch 17/25
-------------------------------
#    0;	train_loss: 0.762;	train_accuracy: 80.0%		[    0/  919]	F1_score(weighted):0.8
#   10;	train_loss: 0.274;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.8666666666666666
#   20;	train_loss: 0.410;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.9111111111111111
#   30;	train_loss: 0.299;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.8571428571428571
#   40;	train_loss: 0.246;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.597;	train_accuracy: 70.0%		[  500/  919]	F1_score(weighted):0.6583333333333332
#   60;	train_loss: 0.552;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.65
#   70;	train_loss: 0.224;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.643;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.7488888888888889
#   90;	train_loss: 0.078;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.147788;	 avg_accuracy: 51.0%
Epoch 18/25
-------------------------------
#    0;	train_loss: 0.099;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.154;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.189;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.345;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.9
#   40;	train_loss: 0.404;	train_accuracy: 80.0%		[  400/  919]	F1_score(weighted):0.8
#   50;	train_loss: 0.224;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.671;	train_accuracy: 70.0%		[  600/  919]	F1_score(weighted):0.7233333333333334
#   70;	train_loss: 0.258;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.8904761904761905
#   80;	train_loss: 0.166;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.364;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9333333333333333
Valid metrics:
	 avg_loss: 0.159669;	 avg_accuracy: 52.9%
Epoch 19/25
-------------------------------
#    0;	train_loss: 0.189;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8666666666666666
#   10;	train_loss: 0.194;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.470;	train_accuracy: 80.0%		[  200/  919]	F1_score(weighted):0.788888888888889
#   30;	train_loss: 0.537;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.846153846153846
#   40;	train_loss: 0.372;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.9
#   50;	train_loss: 0.351;	train_accuracy: 90.0%		[  500/  919]	F1_score(weighted):0.86
#   60;	train_loss: 0.275;	train_accuracy: 90.0%		[  600/  919]	F1_score(weighted):0.9095238095238095
#   70;	train_loss: 0.335;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.9066666666666666
#   80;	train_loss: 0.296;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.76
#   90;	train_loss: 0.681;	train_accuracy: 80.0%		[  900/  919]	F1_score(weighted):0.8
Valid metrics:
	 avg_loss: 0.177310;	 avg_accuracy: 51.3%
Epoch 20/25
-------------------------------
#    0;	train_loss: 0.482;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.9
#   10;	train_loss: 0.348;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.7333333333333333
#   20;	train_loss: 0.194;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.165;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.447;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8904761904761905
#   50;	train_loss: 0.090;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.067;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.088;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.244;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.280;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9
Valid metrics:
	 avg_loss: 0.168451;	 avg_accuracy: 51.0%
Epoch 21/25
-------------------------------
#    0;	train_loss: 0.117;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.334;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.8444444444444444
#   20;	train_loss: 0.127;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.245;	train_accuracy: 80.0%		[  300/  919]	F1_score(weighted):0.8066666666666666
#   40;	train_loss: 0.164;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8555555555555555
#   50;	train_loss: 0.464;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.8
#   60;	train_loss: 0.456;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.85
#   70;	train_loss: 0.205;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.86
#   80;	train_loss: 0.063;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.290;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9428571428571428
Valid metrics:
	 avg_loss: 0.181029;	 avg_accuracy: 51.7%
Epoch 22/25
-------------------------------
#    0;	train_loss: 0.150;	train_accuracy:100.0%		[    0/  919]	F1_score(weighted):1.0
#   10;	train_loss: 0.246;	train_accuracy: 90.0%		[  100/  919]	F1_score(weighted):0.9
#   20;	train_loss: 0.152;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.300;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.9
#   40;	train_loss: 0.154;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.8571428571428571
#   50;	train_loss: 0.607;	train_accuracy: 80.0%		[  500/  919]	F1_score(weighted):0.7238095238095237
#   60;	train_loss: 0.137;	train_accuracy: 90.0%		[  600/  919]	F1_score(weighted):0.8904761904761905
#   70;	train_loss: 0.533;	train_accuracy: 80.0%		[  700/  919]	F1_score(weighted):0.8066666666666666
#   80;	train_loss: 0.270;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9333333333333333
#   90;	train_loss: 0.630;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9
Valid metrics:
	 avg_loss: 0.205397;	 avg_accuracy: 54.8%
Epoch 23/25
-------------------------------
#    0;	train_loss: 0.163;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.8666666666666668
#   10;	train_loss: 0.190;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.505;	train_accuracy: 90.0%		[  200/  919]	F1_score(weighted):0.9095238095238095
#   30;	train_loss: 0.128;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.9
#   40;	train_loss: 0.085;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.048;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.113;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.156;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.9400000000000001
#   80;	train_loss: 0.084;	train_accuracy:100.0%		[  800/  919]	F1_score(weighted):1.0
#   90;	train_loss: 0.132;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.191645;	 avg_accuracy: 52.5%
Epoch 24/25
-------------------------------
#    0;	train_loss: 0.347;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.9028571428571428
#   10;	train_loss: 0.584;	train_accuracy: 80.0%		[  100/  919]	F1_score(weighted):0.77
#   20;	train_loss: 0.245;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.156;	train_accuracy: 90.0%		[  300/  919]	F1_score(weighted):0.86
#   40;	train_loss: 0.040;	train_accuracy:100.0%		[  400/  919]	F1_score(weighted):1.0
#   50;	train_loss: 0.064;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.136;	train_accuracy:100.0%		[  600/  919]	F1_score(weighted):1.0
#   70;	train_loss: 0.223;	train_accuracy: 90.0%		[  700/  919]	F1_score(weighted):0.8666666666666668
#   80;	train_loss: 0.340;	train_accuracy: 80.0%		[  800/  919]	F1_score(weighted):0.7999999999999999
#   90;	train_loss: 0.185;	train_accuracy: 90.0%		[  900/  919]	F1_score(weighted):0.9428571428571428
Valid metrics:
	 avg_loss: 0.199888;	 avg_accuracy: 52.1%
Epoch 25/25
-------------------------------
#    0;	train_loss: 0.293;	train_accuracy: 90.0%		[    0/  919]	F1_score(weighted):0.86
#   10;	train_loss: 0.053;	train_accuracy:100.0%		[  100/  919]	F1_score(weighted):1.0
#   20;	train_loss: 0.024;	train_accuracy:100.0%		[  200/  919]	F1_score(weighted):1.0
#   30;	train_loss: 0.037;	train_accuracy:100.0%		[  300/  919]	F1_score(weighted):1.0
#   40;	train_loss: 0.216;	train_accuracy: 90.0%		[  400/  919]	F1_score(weighted):0.9
#   50;	train_loss: 0.081;	train_accuracy:100.0%		[  500/  919]	F1_score(weighted):1.0
#   60;	train_loss: 0.515;	train_accuracy: 80.0%		[  600/  919]	F1_score(weighted):0.8
#   70;	train_loss: 0.039;	train_accuracy:100.0%		[  700/  919]	F1_score(weighted):1.0
#   80;	train_loss: 0.199;	train_accuracy: 90.0%		[  800/  919]	F1_score(weighted):0.9121212121212119
#   90;	train_loss: 0.130;	train_accuracy:100.0%		[  900/  919]	F1_score(weighted):1.0
Valid metrics:
	 avg_loss: 0.193945;	 avg_accuracy: 53.2%

Other methods#

To compare the accuracy of our method against baselines, we use an SVM and a DummyClassifier, both implemented in scikit-learn. We then print the F1 score and accuracy of each classifier for comparison, and plot confusion matrices for a more detailed analysis.

import torch
from torch.utils.data import DataLoader, TensorDataset
from sklearn.model_selection import train_test_split
import sklearn.metrics
import numpy as np
from sklearn import svm
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score
# Extract data from DataLoader
X_list = []
y_list = []

for batch, (X, y) in enumerate(train_generator):
    X_list.append(X.numpy())
    y_list.append(y.numpy())

for batch, (X, y) in enumerate(valid_generator):
    X_list.append(X.numpy())
    y_list.append(y.numpy())

for batch, (X, y) in enumerate(test_generator):
    X_list.append(X.numpy())
    y_list.append(y.numpy())

# Concatenate all batches
X = np.concatenate(X_list, axis=0).squeeze(-1)
y = np.concatenate(y_list, axis=0)

# Check shapes
print(f"Shape of X: {X.shape}")
print(f"Shape of y: {y.shape}")

# Split data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42, stratify=y)

# Initialize scikit-learn models
svm_model = svm.SVC()
dummy_model = DummyClassifier(strategy="most_frequent")

# Train and evaluate SVM model
svm_model.fit(X_train, y_train)
svm_predictions = svm_model.predict(X_test)
svm_accuracy = accuracy_score(y_test, svm_predictions)
print(f"SVM Accuracy: {svm_accuracy}")

# Train and evaluate DummyClassifier model
dummy_model.fit(X_train, y_train)
dummy_predictions = dummy_model.predict(X_test)
dummy_accuracy = accuracy_score(y_test, dummy_predictions)
print(f"Dummy Classifier Accuracy: {dummy_accuracy}")


print('F1 Score')
dummy_f1 = sklearn.metrics.f1_score(y_test, dummy_predictions, average='weighted')
svm_f1 = sklearn.metrics.f1_score(y_test, svm_predictions, average='weighted')
# y_concat / pred_concat (and the _rnd variants) hold the ChebNet
# labels and predictions collected in the evaluation cells above
cheb_f1 = sklearn.metrics.f1_score(y_concat, pred_concat, average='weighted')
cheb_f1_rnd = sklearn.metrics.f1_score(y_concat_rnd, pred_concat_rnd, average='weighted')
print('dummy_f1', dummy_f1)
print('svm_f1', svm_f1)
print('Chebnet_f1', cheb_f1)
print('Chebnet_f1_rnd', cheb_f1_rnd)
Shape of X: (1314, 400)
Shape of y: (1314,)
SVM Accuracy: 0.5399239543726235
Dummy Classifier Accuracy: 0.25475285171102663
F1 Score
dummy_f1 0.10344509736144718
svm_f1 0.46028242034325456
Chebnet_f1 0.5279546214189761
Chebnet_f1_rnd 0.5251744354486851
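As a reminder of what `average='weighted'` reports: it is the per-class F1 averaged with each class's support (number of true samples) as the weight. A minimal sketch on made-up labels (the arrays below are illustrative, not from the dataset):

```python
import numpy as np
from sklearn.metrics import f1_score

# Toy labels, purely illustrative
y_true = np.array([0, 0, 0, 1, 1, 2])
y_pred = np.array([0, 0, 1, 1, 1, 2])

per_class = f1_score(y_true, y_pred, average=None)   # one F1 per class
support = np.bincount(y_true)                        # class frequencies
weighted = np.average(per_class, weights=support)    # support-weighted mean

assert np.isclose(weighted, f1_score(y_true, y_pred, average='weighted'))
print(weighted)  # 5/6 ≈ 0.8333
```

This is why the Dummy baseline's weighted F1 (0.10) is so much lower than its accuracy (0.25): predicting only the majority class yields F1 = 0 for every other class.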
# Plotting confusion matrices
from sklearn import svm
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.metrics import confusion_matrix, ConfusionMatrixDisplay
from matplotlib import pyplot as plt
import pandas as pd
# Map each numeric label to the task-name prefix of its first file
label_dict = {}
df_label = pd.read_csv('../data/split_win/labels.csv')
for _, row in df_label.iterrows():
    label = row['label']
    filename_prefix = row['filename'].split('_')[0]
    if label not in label_dict:
        label_dict[label] = filename_prefix

cm = confusion_matrix(y_test, dummy_predictions, labels=dummy_model.classes_)
disp = ConfusionMatrixDisplay(confusion_matrix=cm,
                              display_labels=label_dict.values())

disp.plot()
plt.title("Dummy")

cm = confusion_matrix(y_test, svm_predictions, labels=svm_model.classes_)
disp = ConfusionMatrixDisplay(confusion_matrix=cm,
                              display_labels=label_dict.values())

disp.plot()
plt.title("SVM")


cm = confusion_matrix(y_concat, pred_concat)
disp = ConfusionMatrixDisplay(confusion_matrix=cm,
                              display_labels=label_dict.values())

disp.plot()
plt.title("ChebNet")


cm = confusion_matrix(y_concat_rnd, pred_concat_rnd)
disp = ConfusionMatrixDisplay(confusion_matrix=cm,
                              display_labels=label_dict.values())

disp.plot()
plt.title("ChebNet_Random")
_images/e8c9a408eba03c6d64262216064dc8350b3ac2ed3e648dc9b715e46586e88d83.png _images/2c221a72ac90bf0bd5d8039b3708b59086005c877d5b5eecc2645a81b7c94384.png _images/faa16ea8ff56e5937fb1d7d02755f324b22845b6d6734fac5d621b8d923c02f6.png _images/1b11df0ce6cc8e7710b10d7cce284aff42f165f2248b85f5a6c03a2de1341269.png
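Because the class supports are unequal here, the raw counts in these confusion matrices can be hard to compare across classifiers. Passing `normalize='true'` to `confusion_matrix` makes each row sum to 1, so the diagonal reads directly as per-class recall. A small sketch on toy predictions (not the dataset's):

```python
from sklearn.metrics import confusion_matrix

# Toy predictions, purely illustrative
y_true = [0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 1, 1, 1, 2]

# normalize='true' divides each row by that class's support,
# so cm[i, i] is the recall of class i
cm = confusion_matrix(y_true, y_pred, normalize='true')
print(cm.round(2))
# rows sum to 1; diagonal entries are 0.67, 1.0, 1.0
```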

Discussion#

As the results above show, the accuracies of the GCN trained on the brain graph and the GCN trained on a random graph (gcn_rnd) differ only slightly. This could mean either that the training and prediction procedure is largely independent of the underlying graph structure, or that the dataset's classes are too coarse-grained for the graph to matter. Nevertheless, both trained networks clearly improve on the baseline methods, the SVM and the Dummy Classifier.
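One way to check whether the small gap between the graph-based model and its random-graph control is meaningful would be McNemar's test on the two prediction vectors (e.g. `pred_concat` vs. `pred_concat_rnd` against `y_concat`). Below is a minimal self-contained sketch using an exact binomial test on the discordant pairs; the toy arrays at the bottom are illustrative only:

```python
import numpy as np
from scipy.stats import binom

def mcnemar_exact(y_true, pred_a, pred_b):
    """Exact McNemar test: do two classifiers err on
    significantly different sets of samples?"""
    a_ok = pred_a == y_true
    b_ok = pred_b == y_true
    n01 = int(np.sum(a_ok & ~b_ok))   # A right, B wrong
    n10 = int(np.sum(~a_ok & b_ok))   # A wrong, B right
    n = n01 + n10
    if n == 0:
        return n01, n10, 1.0
    # two-sided exact p-value under H0: discordant pairs split 50/50
    p = 2 * binom.cdf(min(n01, n10), n, 0.5)
    return n01, n10, min(p, 1.0)

# Toy example: A is always right, B is wrong on 5 of 10 samples
y = np.zeros(10, dtype=int)
a = np.zeros(10, dtype=int)
b = np.array([1] * 5 + [0] * 5)
print(mcnemar_exact(y, a, b))
```

A large p-value on the real predictions would support the reading above that the underlying graph contributes little to the decoding accuracy.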