Acta Optica Sinica, 2021, 41(7): 0728003; published online 2021-04-11
Remote Sensing Image Mode Translation by Spatial Disentangled Representation Based GAN
Keywords: remote sensing; image translation; synthetic aperture radar; optical remote sensing image; cycle-consistent generative adversarial networks
Abstract

To address the large modal difference between synthetic aperture radar (SAR) images and optical remote sensing images, and the resulting difficulty of translating between them, we propose a cycle-consistent generative adversarial network (GAN) based on spatial disentangled representation, building on an existing spatially disentangled image translation framework. The model separates each image into style and content features using deeper network layers and skip connections; the content features are translated by learning a content mapping and are then combined with the target style features to produce the translated image. A PatchGAN discriminator strengthens the model's ability to generate fine image detail, and two added terms, a target error loss and a generation-reconstruction loss, restrict the translation task to a one-to-one mapping, reducing spurious added information and constraining the generator. Experiments on the SEN1-2, SARptical, and WHU-SEN-City datasets show that, compared with other image translation algorithms, the proposed method effectively translates between the two types of remote sensing images, producing results with high clarity, complete detail features, and strong realism.
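The loss design summarized in the abstract, cycle consistency constraining SAR→optical→SAR round trips plus a target error term that pins the mapping to one-to-one, can be sketched as follows. This is a minimal illustration under stated assumptions: the L1 distances, function names, and weights are illustrative, not the paper's exact formulation, and the generation-reconstruction and adversarial (PatchGAN) terms are omitted.

```python
import numpy as np

def l1(a, b):
    # Mean absolute error, a common pixel-level distance in image translation.
    return float(np.mean(np.abs(a - b)))

def cycle_loss(x_sar, x_opt, g_s2o, g_o2s):
    # Cycle consistency: translating SAR -> optical -> SAR (and the reverse)
    # should reconstruct the original input.
    return (l1(g_o2s(g_s2o(x_sar)), x_sar)
            + l1(g_s2o(g_o2s(x_opt)), x_opt))

def target_error_loss(x_src, y_tgt, g):
    # Target error (assumes paired data, as in SEN1-2 / WHU-SEN-City):
    # the translated image should match its paired ground truth in the
    # target mode, restricting the translation to a one-to-one mapping.
    return l1(g(x_src), y_tgt)

def total_loss(x_sar, x_opt, g_s2o, g_o2s, lam_cyc=10.0, lam_tgt=5.0):
    # Weighted sum of the two terms; weights are illustrative only.
    return (lam_cyc * cycle_loss(x_sar, x_opt, g_s2o, g_o2s)
            + lam_tgt * (target_error_loss(x_sar, x_opt, g_s2o)
                         + target_error_loss(x_opt, x_sar, g_o2s)))
```

With identity generators and a perfectly paired sample, every term vanishes, which is the intended fixed point of the one-to-one constraint.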
Zishuo Han, Chunping Wang, Qiang Fu, Bin Zhao. Remote Sensing Image Mode Translation by Spatial Disentangled Representation Based GAN[J]. Acta Optica Sinica, 2021, 41(7): 0728003.