What Is Inception ResNetV2 and What Does Its Network Structure Look Like?
In this post we look at what Inception ResNetV2 is and how its network structure is put together. The walkthrough below gives both the explanation and a Keras implementation, so readers who need it can use it as a reference.
Inception ResNetV2 is a strengthened version of Inception ResNetV1, and the two structures differ only slightly. If you want to learn about Inception ResNetV1, have a look at my other blog post; the neural network used in facenet is exactly Inception ResNetV1.
Neural network learning: facenet explained in detail, with a Keras implementation
Source code download
Inception-ResNetV2 and Inception-ResNetV1 share the same overall backbone layout.
Its structure is quite interesting!
The backbone structure of the entire network is shown in the figure.
You can see that it is made up of several important parts:
1. Stem
2. Inception-resnet-A
3. Inception-resnet-B
4. Inception-resnet-C
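For orientation, the full implementation at the end of this post stacks these pieces, plus two Reduction blocks that shrink the feature map between stages; the shapes below are taken from the comments in that code:

input                         299 x 299 x 3
Stem                          -> 35 x 35 x 192
Mixed 5b                      -> 35 x 35 x 320
10 x Inception-resnet-A       -> 35 x 35 x 320
Reduction-A                   -> 17 x 17 x 1088
20 x Inception-resnet-B       -> 17 x 17 x 1088
Reduction-B                   -> 8 x 8 x 2080
10 x Inception-resnet-C       -> 8 x 8 x 2080
1x1 conv (conv_7b)            -> 8 x 8 x 1536
GlobalAveragePooling + Dense  -> class probabilities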
In Inception-ResNetV2 the input size is 299x299x3. After the input, the stem performs: three convolutions -> max pooling -> two convolutions -> max pooling -> four branches -> concatenation. The Python implementation is as follows:
input_shape = [299, 299, 3]
img_input = Input(shape=input_shape)

# Stem block: 299,299,3 -> 35 x 35 x 192
x = conv2d_bn(img_input, 32, 3, strides=2, padding='valid')
x = conv2d_bn(x, 32, 3, padding='valid')
x = conv2d_bn(x, 64, 3)
x = MaxPooling2D(3, strides=2)(x)
x = conv2d_bn(x, 80, 1, padding='valid')
x = conv2d_bn(x, 192, 3, padding='valid')
x = MaxPooling2D(3, strides=2)(x)

# Mixed 5b (Inception-A block): 35 x 35 x 192 -> 35 x 35 x 320
branch_0 = conv2d_bn(x, 96, 1)
branch_1 = conv2d_bn(x, 48, 1)
branch_1 = conv2d_bn(branch_1, 64, 5)
branch_2 = conv2d_bn(x, 64, 1)
branch_2 = conv2d_bn(branch_2, 96, 3)
branch_2 = conv2d_bn(branch_2, 96, 3)
branch_pool = AveragePooling2D(3, strides=1, padding='same')(x)
branch_pool = conv2d_bn(branch_pool, 64, 1)
branches = [branch_0, branch_1, branch_2, branch_pool]
x = Concatenate(name='mixed_5b')(branches)
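If you want to check the 299 -> 35 spatial reduction in the stem by hand, the standard output-size formulas are enough. Here is a minimal sketch (the helper out_size is only for illustration, it is not part of the model code):

def out_size(n, k, s=1, padding='valid'):
    # Spatial output length of a conv / pooling layer along one dimension.
    if padding == 'same':
        return -(-n // s)            # ceil(n / s)
    return (n - k) // s + 1          # 'valid' padding

n = 299
n = out_size(n, 3, 2)                # conv 3x3, stride 2, valid -> 149
n = out_size(n, 3)                   # conv 3x3, valid           -> 147
n = out_size(n, 3, padding='same')   # conv 3x3, same            -> 147
n = out_size(n, 3, 2)                # max pool 3x3, stride 2    -> 73
n = out_size(n, 1)                   # conv 1x1, valid           -> 73
n = out_size(n, 3)                   # conv 3x3, valid           -> 71
n = out_size(n, 3, 2)                # max pool 3x3, stride 2    -> 35
print(n)                             # 35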
The Inception-resnet-A block has four branches:
1. The input is passed through unchanged (the residual path).
2. One 1x1 convolution with 32 channels.
3. One 1x1 convolution with 32 channels, followed by one 3x3 convolution with 32 channels.
4. One 1x1 convolution with 32 channels, followed by one 3x3 convolution with 48 channels and one 3x3 convolution with 64 channels.
The outputs of branches 2, 3 and 4 are concatenated and passed through one more 1x1 convolution (to restore the channel count), and the result is added to branch 1. In essence this is a residual structure.
The implementation code is as follows:
# Excerpt from inception_resnet_block for block_type='block35' (Inception-resnet-A);
# scale, block_type, block_idx and activation are arguments of that function.
branch_0 = conv2d_bn(x, 32, 1)
branch_1 = conv2d_bn(x, 32, 1)
branch_1 = conv2d_bn(branch_1, 32, 3)
branch_2 = conv2d_bn(x, 32, 1)
branch_2 = conv2d_bn(branch_2, 48, 3)
branch_2 = conv2d_bn(branch_2, 64, 3)
branches = [branch_0, branch_1, branch_2]
block_name = block_type + '_' + str(block_idx)
mixed = Concatenate(name=block_name + '_mixed')(branches)
up = conv2d_bn(mixed, K.int_shape(x)[3], 1, activation=None, use_bias=True, name=block_name + '_conv')
x = Lambda(lambda inputs, scale: inputs[0] + inputs[1] * scale,
           output_shape=K.int_shape(x)[1:],
           arguments={'scale': scale},
           name=block_name)([x, up])
if activation is not None:
    x = Activation(activation, name=block_name + '_ac')(x)
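The Lambda layer above computes the scaled residual sum out = x + scale * up before the final activation; in the full code below, scale is 0.17 for the A blocks, 0.1 for the B blocks and 0.2 for the C blocks. A minimal NumPy sketch of just that arithmetic (shapes chosen to match the 35x35x320 stage, purely for illustration):

import numpy as np

scale = 0.17  # value used for the Inception-resnet-A blocks in the full code below
x = np.random.rand(1, 35, 35, 320).astype('float32')    # identity / residual path
up = np.random.rand(1, 35, 35, 320).astype('float32')   # 1x1 conv over the concatenated branches
out = x + scale * up                                     # what the Lambda layer computes
print(out.shape)                                         # (1, 35, 35, 320)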
The Inception-resnet-B block has three branches:
1. The input is passed through unchanged (the residual path).
2. One 1x1 convolution with 192 channels.
3. One 1x1 convolution with 128 channels, followed by one 1x7 convolution with 160 channels and one 7x1 convolution with 192 channels.
The outputs of branches 2 and 3 are concatenated and passed through one more 1x1 convolution, and the result is added to branch 1. Again, this is a residual structure.
The implementation code is as follows:
# Excerpt from inception_resnet_block for block_type='block17' (Inception-resnet-B).
branch_0 = conv2d_bn(x, 192, 1)
branch_1 = conv2d_bn(x, 128, 1)
branch_1 = conv2d_bn(branch_1, 160, [1, 7])
branch_1 = conv2d_bn(branch_1, 192, [7, 1])
branches = [branch_0, branch_1]
block_name = block_type + '_' + str(block_idx)
mixed = Concatenate(name=block_name + '_mixed')(branches)
up = conv2d_bn(mixed, K.int_shape(x)[3], 1, activation=None, use_bias=True, name=block_name + '_conv')
x = Lambda(lambda inputs, scale: inputs[0] + inputs[1] * scale,
           output_shape=K.int_shape(x)[1:],
           arguments={'scale': scale},
           name=block_name)([x, up])
if activation is not None:
    x = Activation(activation, name=block_name + '_ac')(x)
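The 1x7 followed by 7x1 pair covers roughly the same receptive field as a single 7x7 convolution but with far fewer weights. A quick back-of-the-envelope count (bias terms ignored, since conv2d_bn disables them; the single 7x7 alternative is hypothetical, only for comparison):

c_in, c_mid, c_out = 128, 160, 192
factorized = 1 * 7 * c_in * c_mid + 7 * 1 * c_mid * c_out   # 1x7 then 7x1
single_7x7 = 7 * 7 * c_in * c_out                           # one 7x7 conv with the same in/out channels
print(factorized, single_7x7)   # 358400 vs 1204224 -> roughly 3.4x fewer weights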
The Inception-resnet-C block has three branches:
1. The input is passed through unchanged (the residual path).
2. One 1x1 convolution with 192 channels.
3. One 1x1 convolution with 192 channels, followed by one 1x3 convolution with 224 channels and one 3x1 convolution with 256 channels.
The outputs of branches 2 and 3 are concatenated and passed through one more 1x1 convolution, and the result is added to branch 1. Once again, this is a residual structure.
The implementation code is as follows:
# Excerpt from inception_resnet_block for block_type='block8' (Inception-resnet-C).
branch_0 = conv2d_bn(x, 192, 1)
branch_1 = conv2d_bn(x, 192, 1)
branch_1 = conv2d_bn(branch_1, 224, [1, 3])
branch_1 = conv2d_bn(branch_1, 256, [3, 1])
branches = [branch_0, branch_1]
block_name = block_type + '_' + str(block_idx)
mixed = Concatenate(name=block_name + '_mixed')(branches)
up = conv2d_bn(mixed, K.int_shape(x)[3], 1, activation=None, use_bias=True, name=block_name + '_conv')
x = Lambda(lambda inputs, scale: inputs[0] + inputs[1] * scale,
           output_shape=K.int_shape(x)[1:],
           arguments={'scale': scale},
           name=block_name)([x, up])
if activation is not None:
    x = Activation(activation, name=block_name + '_ac')(x)
The complete implementation (including the Reduction blocks, the classification head and a small prediction demo) is as follows:

import warnings
import numpy as np

from keras.preprocessing import image
from keras.models import Model
from keras.layers import Activation, AveragePooling2D, BatchNormalization, Concatenate
from keras.layers import Conv2D, Dense, GlobalAveragePooling2D, GlobalMaxPooling2D, Input, Lambda, MaxPooling2D
from keras.applications.imagenet_utils import decode_predictions
from keras.utils.data_utils import get_file
from keras import backend as K

BASE_WEIGHT_URL = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.7/'


def conv2d_bn(x, filters, kernel_size, strides=1, padding='same', activation='relu', use_bias=False, name=None):
    # Convolution followed by BatchNormalization and an optional activation.
    x = Conv2D(filters, kernel_size, strides=strides, padding=padding, use_bias=use_bias, name=name)(x)
    if not use_bias:
        bn_axis = 1 if K.image_data_format() == 'channels_first' else 3
        bn_name = None if name is None else name + '_bn'
        x = BatchNormalization(axis=bn_axis, scale=False, name=bn_name)(x)
    if activation is not None:
        ac_name = None if name is None else name + '_ac'
        x = Activation(activation, name=ac_name)(x)
    return x


def inception_resnet_block(x, scale, block_type, block_idx, activation='relu'):
    # One Inception-resnet block; block_type selects A ('block35'), B ('block17') or C ('block8').
    if block_type == 'block35':
        branch_0 = conv2d_bn(x, 32, 1)
        branch_1 = conv2d_bn(x, 32, 1)
        branch_1 = conv2d_bn(branch_1, 32, 3)
        branch_2 = conv2d_bn(x, 32, 1)
        branch_2 = conv2d_bn(branch_2, 48, 3)
        branch_2 = conv2d_bn(branch_2, 64, 3)
        branches = [branch_0, branch_1, branch_2]
    elif block_type == 'block17':
        branch_0 = conv2d_bn(x, 192, 1)
        branch_1 = conv2d_bn(x, 128, 1)
        branch_1 = conv2d_bn(branch_1, 160, [1, 7])
        branch_1 = conv2d_bn(branch_1, 192, [7, 1])
        branches = [branch_0, branch_1]
    elif block_type == 'block8':
        branch_0 = conv2d_bn(x, 192, 1)
        branch_1 = conv2d_bn(x, 192, 1)
        branch_1 = conv2d_bn(branch_1, 224, [1, 3])
        branch_1 = conv2d_bn(branch_1, 256, [3, 1])
        branches = [branch_0, branch_1]
    else:
        raise ValueError('Unknown Inception-ResNet block type. '
                         'Expects "block35", "block17" or "block8", '
                         'but got: ' + str(block_type))

    block_name = block_type + '_' + str(block_idx)
    mixed = Concatenate(name=block_name + '_mixed')(branches)
    up = conv2d_bn(mixed, K.int_shape(x)[3], 1, activation=None, use_bias=True, name=block_name + '_conv')
    # Scaled residual connection: x + scale * up
    x = Lambda(lambda inputs, scale: inputs[0] + inputs[1] * scale,
               output_shape=K.int_shape(x)[1:],
               arguments={'scale': scale},
               name=block_name)([x, up])
    if activation is not None:
        x = Activation(activation, name=block_name + '_ac')(x)
    return x


def InceptionResNetV2(input_shape=[299, 299, 3], classes=1000):
    img_input = Input(shape=input_shape)

    # Stem block: 299,299,3 -> 35 x 35 x 192
    x = conv2d_bn(img_input, 32, 3, strides=2, padding='valid')
    x = conv2d_bn(x, 32, 3, padding='valid')
    x = conv2d_bn(x, 64, 3)
    x = MaxPooling2D(3, strides=2)(x)
    x = conv2d_bn(x, 80, 1, padding='valid')
    x = conv2d_bn(x, 192, 3, padding='valid')
    x = MaxPooling2D(3, strides=2)(x)

    # Mixed 5b (Inception-A block): 35 x 35 x 192 -> 35 x 35 x 320
    branch_0 = conv2d_bn(x, 96, 1)
    branch_1 = conv2d_bn(x, 48, 1)
    branch_1 = conv2d_bn(branch_1, 64, 5)
    branch_2 = conv2d_bn(x, 64, 1)
    branch_2 = conv2d_bn(branch_2, 96, 3)
    branch_2 = conv2d_bn(branch_2, 96, 3)
    branch_pool = AveragePooling2D(3, strides=1, padding='same')(x)
    branch_pool = conv2d_bn(branch_pool, 64, 1)
    branches = [branch_0, branch_1, branch_2, branch_pool]
    x = Concatenate(name='mixed_5b')(branches)

    # 10 x Inception-ResNet-A block: 35 x 35 x 320 -> 35 x 35 x 320
    for block_idx in range(1, 11):
        x = inception_resnet_block(x, scale=0.17, block_type='block35', block_idx=block_idx)

    # Reduction-A block: 35 x 35 x 320 -> 17 x 17 x 1088
    branch_0 = conv2d_bn(x, 384, 3, strides=2, padding='valid')
    branch_1 = conv2d_bn(x, 256, 1)
    branch_1 = conv2d_bn(branch_1, 256, 3)
    branch_1 = conv2d_bn(branch_1, 384, 3, strides=2, padding='valid')
    branch_pool = MaxPooling2D(3, strides=2, padding='valid')(x)
    branches = [branch_0, branch_1, branch_pool]
    x = Concatenate(name='mixed_6a')(branches)

    # 20 x Inception-ResNet-B block: 17 x 17 x 1088 -> 17 x 17 x 1088
    for block_idx in range(1, 21):
        x = inception_resnet_block(x, scale=0.1, block_type='block17', block_idx=block_idx)

    # Reduction-B block: 17 x 17 x 1088 -> 8 x 8 x 2080
    branch_0 = conv2d_bn(x, 256, 1)
    branch_0 = conv2d_bn(branch_0, 384, 3, strides=2, padding='valid')
    branch_1 = conv2d_bn(x, 256, 1)
    branch_1 = conv2d_bn(branch_1, 288, 3, strides=2, padding='valid')
    branch_2 = conv2d_bn(x, 256, 1)
    branch_2 = conv2d_bn(branch_2, 288, 3)
    branch_2 = conv2d_bn(branch_2, 320, 3, strides=2, padding='valid')
    branch_pool = MaxPooling2D(3, strides=2, padding='valid')(x)
    branches = [branch_0, branch_1, branch_2, branch_pool]
    x = Concatenate(name='mixed_7a')(branches)

    # 10 x Inception-ResNet-C block: 8 x 8 x 2080 -> 8 x 8 x 2080
    for block_idx in range(1, 10):
        x = inception_resnet_block(x, scale=0.2, block_type='block8', block_idx=block_idx)
    x = inception_resnet_block(x, scale=1., activation=None, block_type='block8', block_idx=10)

    # 8 x 8 x 2080 -> 8 x 8 x 1536
    x = conv2d_bn(x, 1536, 1, name='conv_7b')
    x = GlobalAveragePooling2D(name='avg_pool')(x)
    x = Dense(classes, activation='softmax', name='predictions')(x)

    inputs = img_input
    # Build the model
    model = Model(inputs, x, name='inception_resnet_v2')
    return model


def preprocess_input(x):
    # Scale pixel values from [0, 255] to [-1, 1]
    x /= 255.
    x -= 0.5
    x *= 2.
    return x


if __name__ == '__main__':
    model = InceptionResNetV2()

    fname = 'inception_resnet_v2_weights_tf_dim_ordering_tf_kernels.h5'
    weights_path = get_file(fname,
                            BASE_WEIGHT_URL + fname,
                            cache_subdir='models',
                            file_hash='e693bd0210a403b3192acc6073ad2e96')
    model.load_weights(weights_path)

    img_path = 'elephant.jpg'
    img = image.load_img(img_path, target_size=(299, 299))
    x = image.img_to_array(img)
    x = np.expand_dims(x, axis=0)
    x = preprocess_input(x)

    preds = model.predict(x)
    print('Predicted:', decode_predictions(preds))
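If you want to sanity-check the channel counts quoted in the comments above, the concatenations make them easy to verify by hand:

# Mixed 5b output channels: 96 + 64 + 96 + 64 (the four branches)
print(96 + 64 + 96 + 64)        # 320
# Reduction-A: 384 + 384 + the 320 channels carried through by the max-pool branch
print(384 + 384 + 320)          # 1088
# Reduction-B: 384 + 288 + 320 + the 1088 channels carried through by the max-pool branch
print(384 + 288 + 320 + 1088)   # 2080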