
Python : Keras : pretrained model : transfer learning : usage, examples, and methods

얇은생각 2020. 2. 9. 07:30
%matplotlib inline
import matplotlib.pyplot as plt
from keras.applications import vgg16, inception_v3, resnet50, mobilenet
from keras.preprocessing.image import load_img
from keras.preprocessing.image import img_to_array
from keras.applications.imagenet_utils import decode_predictions

import numpy as np

 

Import the required libraries.


vgg_model = vgg16.VGG16(weights='imagenet')

filename = 'squid.jpg'

org = load_img(filename, target_size=(224, 224))  # load_img's target_size is (height, width)

img = img_to_array(org)
plt.imshow(np.uint8(img))


(Figure: the loaded image)

 

This time we will try out VGG16, a model that has already been trained on ImageNet. We load an arbitrary image file and then preprocess it so that the pretrained model can be applied.


x = np.expand_dims(img, axis=0)  # add the batch dimension
print(x.shape)

x = vgg16.preprocess_input(x)
print(x)
 
pred = vgg_model.predict(x)
print(pred)

label = decode_predictions(pred)
print(label)

"""
(1, 224, 224, 3)
[[[[ 5.0609970e+00 -1.5778999e+01 -1.5680000e+01]
   [ 9.0609970e+00 -1.0778999e+01 -1.2680000e+01]
   [ 1.4060997e+01 -4.7789993e+00 -1.0680000e+01]
   ...
   [-5.9390030e+00 -2.7778999e+01 -4.4680000e+01]
   [-2.9390030e+00 -2.9778999e+01 -4.2680000e+01]
   [-5.9390030e+00 -3.2778999e+01 -4.5680000e+01]]

  [[ 1.0609970e+00 -1.8778999e+01 -1.8680000e+01]
   [ 4.0609970e+00 -1.4778999e+01 -1.6680000e+01]
   [ 7.0609970e+00 -1.0778999e+01 -1.6680000e+01]
   ...
   [-9.3900299e-01 -2.2778999e+01 -3.8680000e+01]
   [ 2.0609970e+00 -2.4778999e+01 -3.7680000e+01]
   [ 6.0997009e-02 -2.6778999e+01 -3.9680000e+01]]

  [[-7.9390030e+00 -2.4778999e+01 -2.5680000e+01]
   [-4.9390030e+00 -2.1778999e+01 -2.3680000e+01]
   [ 6.0997009e-02 -1.4778999e+01 -2.1680000e+01]
   ...
   [ 2.0609970e+00 -2.0778999e+01 -3.4680000e+01]
   [ 5.0609970e+00 -2.1778999e+01 -3.4680000e+01]
   [ 2.0609970e+00 -2.4778999e+01 -3.7680000e+01]]

  ...

  [[ 4.6060997e+01  4.2221001e+01  4.9320000e+01]
   [ 5.0060997e+01  5.8221001e+01  8.2320000e+01]
   [ 2.8060997e+01  6.3221001e+01  9.2320000e+01]
   ...
   [ 9.8060997e+01  7.0221001e+01  6.1320000e+01]
   [ 1.0306100e+02  7.1221001e+01  5.9320000e+01]
   [ 1.0606100e+02  7.4221001e+01  6.2320000e+01]]

  [[-1.7939003e+01  1.2210007e+00  2.9320000e+01]
   [-3.7939003e+01 -7.7899933e-01  4.4320000e+01]
   [-2.7939003e+01  3.4221001e+01  7.9320000e+01]
   ...
   [ 8.9060997e+01  6.3221001e+01  5.4320000e+01]
   [ 1.0306100e+02  7.2221001e+01  5.8320000e+01]
   [ 1.0806100e+02  7.7221001e+01  6.3320000e+01]]

  [[-6.1939003e+01 -4.3778999e+01 -5.6800003e+00]
   [-3.4939003e+01  7.2210007e+00  6.7320000e+01]
   [-6.0939003e+01  6.2210007e+00  5.8320000e+01]
   ...
   [ 6.0060997e+01  3.5221001e+01  2.6320000e+01]
   [ 8.9060997e+01  5.9221001e+01  4.3320000e+01]
   [ 9.6060997e+01  6.6221001e+01  5.0320000e+01]]]]
[[9.75964021e-09 1.01916342e-07 1.02424993e-08 1.39559475e-09
  4.43180825e-09 2.50196450e-07 1.32926743e-07 1.90917649e-06
  ... (1000 class probabilities in total; the two largest,
  6.98797822e-01 and 2.45443851e-01, correspond to the top
  predictions decoded below) ...
  8.35005631e-06 2.44487524e-06 9.00957268e-04 1.24571940e-07]]
Downloading data from https://storage.googleapis.com/download.tensorflow.org/data/imagenet_class_index.json
40960/35363 [==================================] - 0s 2us/step
[[('n02356798', 'fox_squirrel', 0.6987978), ('n02361337', 'marmot', 0.24544385), ('n01530575', 'brambling', 0.028110141), ('n02326432', 'hare', 0.008878902), ('n02325366', 'wood_rabbit', 0.0075778076)]]
"""

 

We now feed the preprocessed image to the loaded model. Note that vgg16.preprocess_input converts the RGB array to BGR and subtracts the ImageNet per-channel means, which is why the printed tensor contains negative values. The top prediction is fox_squirrel at about 70% (0.699), so the model classifies the image correctly.
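
As a compact recap, the whole pipeline above can be wrapped into one helper. This is a minimal sketch; the function name classify_image is ours, not from the original code:

def classify_image(path, model=vgg_model):
    # Load the image at the 224x224 resolution VGG16 expects
    img = img_to_array(load_img(path, target_size=(224, 224)))
    # Add the batch dimension: (224, 224, 3) -> (1, 224, 224, 3)
    batch = np.expand_dims(img, axis=0)
    # VGG16 preprocessing: RGB -> BGR plus per-channel mean subtraction
    batch = vgg16.preprocess_input(batch)
    # Return the top-5 (class_id, class_name, probability) tuples
    return decode_predictions(model.predict(batch), top=5)[0]

print(classify_image('squid.jpg'))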


vgg_model.summary()

"""
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_6 (InputLayer)         (None, 224, 224, 3)       0         
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792      
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 224, 224, 64)      36928     
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 112, 112, 64)      0         
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 112, 112, 128)     73856     
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 112, 112, 128)     147584    
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 56, 56, 128)       0         
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 56, 56, 256)       295168    
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 28, 28, 256)       0         
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 28, 28, 512)       1180160   
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 14, 14, 512)       0         
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0         
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
fc1 (Dense)                  (None, 4096)              102764544 
_________________________________________________________________
fc2 (Dense)                  (None, 4096)              16781312  
_________________________________________________________________
predictions (Dense)          (None, 1000)              4097000   
=================================================================
Total params: 138,357,544
Trainable params: 138,357,544
Non-trainable params: 0
_________________________________________________________________
"""

 

The summary of the loaded model is shown above, so we can see exactly which layers VGG16 is built from and how many parameters each contains.
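
For reference, intermediate activations (the bottleneck features) can be pulled out of the loaded model with a sub-model. This is a sketch that reuses the fc2 layer name from the summary above and the preprocessed batch x from earlier:

from keras.models import Model

# Sub-model that stops at the 4096-dimensional fc2 layer
feature_extractor = Model(inputs=vgg_model.input,
                          outputs=vgg_model.get_layer('fc2').output)
features = feature_extractor.predict(x)
print(features.shape)  # (1, 4096)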


for layer in vgg_model.layers:
    layer.trainable = False

vgg_model.summary()

"""
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_6 (InputLayer)         (None, 224, 224, 3)       0         
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792      
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 224, 224, 64)      36928     
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 112, 112, 64)      0         
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 112, 112, 128)     73856     
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 112, 112, 128)     147584    
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 56, 56, 128)       0         
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 56, 56, 256)       295168    
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 28, 28, 256)       0         
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 28, 28, 512)       1180160   
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 14, 14, 512)       0         
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0         
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
fc1 (Dense)                  (None, 4096)              102764544 
_________________________________________________________________
fc2 (Dense)                  (None, 4096)              16781312  
_________________________________________________________________
predictions (Dense)          (None, 1000)              4097000   
=================================================================
Total params: 138,357,544
Trainable params: 0
Non-trainable params: 138,357,544
_________________________________________________________________
"""

 

Now we set each layer's trainable attribute to False so that the loaded model's weights stay frozen during training. We then attach additional layers after the last layer and train only those; in other words, the classification is performed on top of the bottleneck features.

Training a model this way is called transfer learning. By reusing the many pretrained models that are already available, you can cut down on training time and solve problems in your own domain much faster.
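
As a concrete sketch of this recipe (the head layers, the class count num_classes, and the optimizer below are illustrative assumptions, not part of the original post), a new classifier can be attached to the frozen convolutional base like this:

from keras.models import Model
from keras.layers import Dense, Flatten
from keras.optimizers import Adam

num_classes = 10  # illustrative: replace with your own number of classes

# Reload VGG16 without its ImageNet classifier head
base = vgg16.VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
for layer in base.layers:
    layer.trainable = False  # freeze the base; only the new head is trained

# New classification head on top of the bottleneck features
h = Flatten()(base.output)
h = Dense(256, activation='relu')(h)
out = Dense(num_classes, activation='softmax')(h)

model = Model(inputs=base.input, outputs=out)
model.compile(optimizer=Adam(), loss='categorical_crossentropy', metrics=['accuracy'])
# model.fit(...) with your own labeled images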
