Figure source 1:

Early Warning Model of Wind Turbine Front Bearing Based on Conv1D and LSTM | IEEE Conference Publication | IEEE Xplore

Figure source 2:

Understanding 1D and 3D Convolution Neural Network | Keras | by Shiva Verma | Towards Data Science

 


1. ๋ฐ์ดํ„ฐ์…‹ ๊ฐ€์ •

0. ๊ฐ€์†๋„๊ณ„ ๋ฐ์ดํ„ฐ์…‹ ๊ฐ€์ •

Batch size : 100000

Sequence : 10

Feature : 3 (x-axis, y-axis, z-axis)

 

Dataset shape : (100000, 10, 3) = (Batch size, Sequence, Feature) = (B, S, F)
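Under these assumptions, a placeholder dataset with exactly this shape can be sketched in PyTorch; the random values here are only a stand-in for real accelerometer readings:

```python
import torch

# Hypothetical stand-in for real accelerometer data: (Batch, Sequence, Feature).
B, S, F = 100000, 10, 3
dataset = torch.randn(B, S, F)  # random values in place of x/y/z-axis readings

print(dataset.shape)  # torch.Size([100000, 10, 3])
```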

 


2. ๋ชจ๋ธ ๊ตฌ์„ฑ

1. ๋ชจ๋ธ ๊ตฌ์„ฑ

1. Conv1D

  CNN์€ convolution layer, pooling layer, fully connected layer๋กœ ์ฃผ๋กœ ๊ตฌ์„ฑ๋œ๋‹ค. ๊ทธ ์ค‘ convolution layer์™€ pooling layer๋Š” ๋‘ ๊ฐœ์˜ ํŠน์ˆ˜ ์‹ ๊ฒฝ๋ง ๋ ˆ์ด์–ด๋กœ ์ฃผ๋กœ ์œ ํšจ ํŠน์ง• ์ถ”์ถœ์„ ๋‹ด๋‹นํ•œ๋‹ค. ์›๋ณธ ๋ฐ์ดํ„ฐ์—์„œ ๋ฒกํ„ฐ๋ฅผ ์ถ”์ถœํ•˜๊ณ  ์›๋ณธ ๊ธฐ๋Šฅ์˜ ๊ณต๊ฐ„์  ์ •๋ณด๋ฅผ ๋งˆ์ด๋‹ํ•  ์ˆ˜ ์žˆ๋‹ค. ๊ฐ€์†๋„๊ณ„์™€ ๊ฐ™์€ 1์ฐจ์› ๋ฐ์ดํ„ฐ๋ฅผ 1์ฐจ์› ์ปจ๋ณผ๋ฃจ์…˜ ์‹ ๊ฒฝ๋ง(Conv1D)์„ ์‚ฌ์šฉํ•˜์—ฌ ์„œ๋กœ ๋‹ค๋ฅธ ๋ณ€์ˆ˜๋ฅผ ๊ฒฐํ•ฉํ•˜๊ณ  ๋ณ€์ˆ˜ ๊ฐ„์˜ ๊ณต๊ฐ„์  ์ƒ๊ด€ ๊ด€๊ณ„๋ฅผ ์ถ”์ถœํ•œ๋‹ค.

 

Figure 2. Conv1D

  As shown in Figure 2, Conv1D extracts spatial correlations by sliding a kernel along a single dimension.
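This sliding behavior can be sketched with PyTorch's nn.Conv1d. Note that PyTorch expects (Batch, Channels, Length) input, so the three accelerometer axes play the role of channels; the batch size of 8 below is an arbitrary choice for illustration:

```python
import torch
import torch.nn as nn

# A length-3 kernel slides along the sequence axis; padding=1 keeps the length.
conv = nn.Conv1d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1)

x = torch.randn(8, 3, 10)  # (B, C, S): 8 samples, 3 channels (axes), 10 steps
y = conv(x)

print(y.shape)  # torch.Size([8, 16, 10]) -- sequence length preserved
```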

 

 

2. LSTM

  LSTM์€ ์‹œ๊ณ„์—ด ๋ฐ์ดํ„ฐ๋ฅผ ์ฒ˜๋ฆฌํ•˜๊ธฐ ์œ„ํ•œ ๊ณ ์ „์ ์ธ ๋”ฅ ๋Ÿฌ๋‹ ๋„คํŠธ์›Œํฌ์ด๋‹ค. ์ˆœํ™˜ ์‹ ๊ฒฝ๋ง์ด ๊ธด ์‹œ๊ณ„์—ด์„ ์–ด๋Š ์ •๋„ ์ฒ˜๋ฆฌํ•  ๋•Œ ๊ธฐ์šธ๊ธฐ ์†Œ์‹ค(Vanishing gradient) ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๋Š” ์ˆœํ™˜ ์‹ ๊ฒฝ๋ง์˜ ๋ณ€ํ˜•์ž…๋‹ˆ๋‹ค. ์žฅ๊ธฐ ๋ฐ ๋‹จ๊ธฐ ๊ธฐ์–ต ๋„คํŠธ์›Œํฌ์˜ ์…€ ๊ตฌ์กฐ๋Š” ๊ทธ๋ฆผ 3๊ณผ ๊ฐ™์ด ๋ง๊ฐ ๊ฒŒ์ดํŠธ, ์ž…๋ ฅ ๊ฒŒ์ดํŠธ ๋ฐ ์ถœ๋ ฅ ๊ฒŒ์ดํŠธ๊ฐ€ ์žˆ๋‹ค.

Figure 3. LSTM cell structure
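The input/output shapes of an LSTM layer can be sketched with nn.LSTM, using the same configuration (input_size=32, hidden_size=50, batch_first=True) as the model code below; the batch size of 8 is an arbitrary choice for illustration:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=32, hidden_size=50, num_layers=1, batch_first=True)

x = torch.randn(8, 10, 32)        # (B, S, F): 8 samples, 10 steps, 32 features
output, (hidden, cell) = lstm(x)

print(output.shape)  # torch.Size([8, 10, 50]) -- hidden state at every step
print(hidden.shape)  # torch.Size([1, 8, 50])  -- final step, one per layer
```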

 

3. Conv1D + LSTM

  Conv1D + LSTM ๋ชจ๋ธ์€ ๊ทธ๋ฆผ 1๊ณผ ๊ฐ™์ด Conv1D ๊ธฐ๋ฐ˜์˜ ํŠน์ง• ์œตํ•ฉ ๋ ˆ์ด์–ด, LSTM ๊ธฐ๋ฐ˜ ์‹œ๊ณ„์—ด ์˜ˆ์ธก ๋ ˆ์ด์–ด, output layer๋กœ ๊ตฌ์„ฑ๋œ๋‹ค. Input layer์—๋Š” ๊ทธ๋ฆผ 0๊ณผ ๊ฐ™์€ ์‹œ๊ณต๊ฐ„์  ํŠน์„ฑํ–‰๋ ฌ์ด ์ž…๋ ฅ๋œ๋‹ค. ๊ฐ ๋ณ€์ˆ˜๋Š” CNN์— ์˜ํ•ด โ€‹โ€‹๊ฐ€์ค‘์น˜๊ฐ€ ๋ถ€์—ฌ๋˜๊ณ  ๋ณ€์ˆ˜ ๊ฐ„์˜ ์ •๋ณด๊ฐ€ ๊ฒฐํ•ฉ๋œ๋‹ค. ๊ณผ์ ํ•ฉ(Overfitting)์„ ํ”ผํ•˜๊ธฐ ์œ„ํ•ด dropout layer๊ฐ€ ๋„คํŠธ์›Œํฌ์— ์ถ”๊ฐ€๋ฉ๋‹ˆ๋‹ค. ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋Š” ๊ทธ๋ฆผ 4์™€ ๊ฐ™์ด ๊ตฌ์„ฑํ•œ๋‹ค.

 

4. ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ

 

๋ชจ๋ธ์„ ๊ตฌ์„ฑํ•˜๊ฒŒ ๋˜๋ฉด ์•„๋ž˜์˜ ์ฝ”๋“œ์™€ ๊ฐ™์ด ๊ตฌํ˜„ํ•  ์ˆ˜ ์žˆ๋‹ค. 

import torch.nn as nn


class Conv1d_LSTM(nn.Module):
    def __init__(self, in_channel=3, out_channel=1):
        super(Conv1d_LSTM, self).__init__()
        self.conv1d_1 = nn.Conv1d(in_channels=in_channel,
                                out_channels=16,
                                kernel_size=3,
                                stride=1,
                                padding=1)
        self.conv1d_2 = nn.Conv1d(in_channels=16,
                                out_channels=32,
                                kernel_size=3,
                                stride=1,
                                padding=1)
        
        self.lstm = nn.LSTM(input_size=32,
                            hidden_size=50,
                            num_layers=1,
                            bias=True,
                            bidirectional=False,
                            batch_first=True)
        
        self.dropout = nn.Dropout(0.5)

        self.dense1 = nn.Linear(50, 32)
        self.dense2 = nn.Linear(32, out_channel)

    def forward(self, x):
        # Raw x shape : (B, S, F) => (B, 10, 3)
        
        # Shape : (B, F, S) => (B, 3, 10)
        x = x.transpose(1, 2)
        # Shape : (B, F, S) == (B, C, S) // C = channel => (B, 16, 10)
        x = self.conv1d_1(x)
        # Shape : (B, C, S) => (B, 32, 10)
        x = self.conv1d_2(x)
        # Shape : (B, S, C) == (B, S, F) => (B, 10, 32)
        x = x.transpose(1, 2)
        
        self.lstm.flatten_parameters()
        # Output shape : (B, S, H) // H = hidden_size => (B, 10, 50)
        _, (hidden, _) = self.lstm(x)
        # hidden shape : (num_layers, B, H); [-1] takes the last layer => (B, 50)
        x = hidden[-1]
        
        # Shape : (B, H) => (B, 50)
        x = self.dropout(x)
        
        # Shape : (B, 32)
        x = self.dense1(x)
        # Shape : (B, O) // O = output => (B, 1)
        x = self.dense2(x)

        return x

 
