I want to apply machine learning to a classification problem on radar sensing data: the features of two attributes, amplitude and phase, are extracted separately with 1D CNNs and then merged at the end of the model.
Keras provides the Functional API, which makes it straightforward to build a model of this kind.
Reference: Keras documentation, "The Functional API" by fchollet (keras.io).
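Before the full model, here is a minimal sketch of the Functional API pattern used in this post: two named inputs, a small convolutional branch for each, and a concatenate layer before a softmax head. The sequence length L, layer sizes, and pooling choice are placeholders for illustration only, not the values used in the actual model below.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

L = 128  # placeholder sequence length

# Two named inputs; the names are what model.fit() uses to match the input dicts.
amp_in = keras.Input(shape=(L, 1), name="amplitude")
phs_in = keras.Input(shape=(L, 1), name="phase")

# One small branch per input.
a = layers.Conv1D(16, 3, activation="relu", padding="same")(amp_in)
a = layers.GlobalAveragePooling1D()(a)
p = layers.Conv1D(16, 3, activation="relu", padding="same")(phs_in)
p = layers.GlobalAveragePooling1D()(p)

# Merge the two branches and attach a softmax classification head.
x = layers.concatenate([a, p])
out = layers.Dense(4, activation="softmax", name="material_output")(x)

toy_model = keras.Model(inputs=[amp_in, phs_in], outputs=[out])
toy_model.summary()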
Referring to the guide above, I built the model shown below.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


def VGG_branch(X, Y, test_X, test_Y, cp_filepath, EPOCH=100):
    # X is expected to be shaped (samples, 2, length), with amplitude at index 0
    # and phase at index 1 along the second axis.
    amp_input = keras.Input(shape=(X.shape[2], 1), name="amplitude")
    phs_input = keras.Input(shape=(X.shape[2], 1), name="phase")

    # Amplitude branch: VGG-style stacks of Conv1D layers with max pooling in between.
    amp_features = layers.Conv1D(64, 3, activation='relu', padding='same')(amp_input)
    amp_features = layers.Conv1D(64, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.MaxPool1D(2)(amp_features)
    amp_features = layers.Conv1D(128, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.Conv1D(128, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.MaxPool1D(2)(amp_features)
    amp_features = layers.Conv1D(256, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.Conv1D(256, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.Conv1D(256, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.Conv1D(256, 3, activation='relu', padding='same')(amp_features)
    amp_features = layers.MaxPool1D(2)(amp_features)
    print(amp_features.shape)
    amp_features = layers.Flatten()(amp_features)

    # Phase branch: same architecture as the amplitude branch, with separate weights.
    phs_features = layers.Conv1D(64, 3, activation='relu', padding='same')(phs_input)
    phs_features = layers.Conv1D(64, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.MaxPool1D(2)(phs_features)
    phs_features = layers.Conv1D(128, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.Conv1D(128, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.MaxPool1D(2)(phs_features)
    phs_features = layers.Conv1D(256, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.Conv1D(256, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.Conv1D(256, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.Conv1D(256, 3, activation='relu', padding='same')(phs_features)
    phs_features = layers.MaxPool1D(2)(phs_features)
    print(phs_features.shape)
    phs_features = layers.Flatten()(phs_features)

    # Merge the two branches and classify into the 4 material classes.
    x = layers.concatenate([amp_features, phs_features], axis=-1)
    x = layers.Dense(4096, activation='relu')(x)
    x = layers.Dense(4096, activation='relu')(x)
    material_output = layers.Dense(4, activation='softmax', name='material_output')(x)

    model = keras.Model(inputs=[amp_input, phs_input], outputs=[material_output])
    model.summary()
    keras.utils.plot_model(model, "./multi_input_and_output_model.png", show_shapes=True)

    model.compile(optimizer='adam',
                  loss=tf.keras.losses.CategoricalCrossentropy(),
                  metrics=['accuracy'])

    # Save the best model (by validation accuracy) during training.
    checkpoint_filepath = cp_filepath
    callback = tf.keras.callbacks.ModelCheckpoint(
        filepath='./model/' + checkpoint_filepath,
        monitor='val_accuracy',
        mode='max',
        save_best_only=True,
        save_weights_only=False,
    )

    model.fit(
        {"amplitude": X[:, 0, :], "phase": X[:, 1, :]},
        {"material_output": Y},
        epochs=EPOCH,
        validation_data=({"amplitude": test_X[:, 0, :], "phase": test_X[:, 1, :]},
                         {"material_output": test_Y}),
        callbacks=[callback],
    )
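For reference, a usage sketch under assumed data shapes: X and test_X shaped (samples, 2, length) with amplitude and phase along the second axis, and Y / test_Y one-hot encoded over the 4 material classes. The random arrays, sizes, and the checkpoint name "best_model" are placeholders for illustration, not the data from this post.

import numpy as np
from tensorflow import keras

# Dummy data only for illustration: 200 training / 40 test samples, sequence length 256.
N_train, N_test, L, n_classes = 200, 40, 256, 4
X = np.random.rand(N_train, 2, L).astype("float32")   # axis 1: 0 = amplitude, 1 = phase
Y = keras.utils.to_categorical(np.random.randint(n_classes, size=N_train), n_classes)
test_X = np.random.rand(N_test, 2, L).astype("float32")
test_Y = keras.utils.to_categorical(np.random.randint(n_classes, size=N_test), n_classes)

# If your tf.keras version complains about the missing channel axis, add it explicitly:
# X = X[..., np.newaxis]; test_X = test_X[..., np.newaxis]

VGG_branch(X, Y, test_X, test_Y, cp_filepath="best_model", EPOCH=5)

# The ModelCheckpoint callback saved the best model (by val_accuracy) under ./model/,
# so it can be reloaded later for inference (SavedModel format in TF 2.x; newer Keras
# versions may require a .keras or .h5 extension on the checkpoint path).
best_model = keras.models.load_model("./model/best_model")
pred = best_model.predict({"amplitude": test_X[:, 0, :], "phase": test_X[:, 1, :]})
print(pred.shape)  # (N_test, 4) class probabilities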