Paper Title

Subsurface Depths Structure Maps Reconstruction with Generative Adversarial Networks

Paper Author

Ivlev, Dmitry

Paper Abstract

This paper describes a method for reconstructing detailed-resolution depth structure maps, of the kind usually obtained after 3D seismic surveys, from the data of 2D seismic depth maps. The method uses two algorithms based on the generative adversarial neural network architecture. The first algorithm, StyleGAN2-ADA, accumulates in the latent space of the neural network semantic images of mountainous terrain forms first, and then, with the help of transfer learning, in the ideal case the structural geometry of stratigraphic horizons. The second algorithm, the Pixel2Style2Pixel encoder, uses the semantic level of generalization of the first algorithm to learn to reconstruct original high-resolution images from their degraded copies (super-resolution technology). A methodological approach is demonstrated for transferring knowledge of the structural forms of stratigraphic horizon boundaries from well-studied areas to underexplored ones. Using the multimodal synthesis of the Pixel2Style2Pixel encoder, it is proposed to create a probabilistic depth space in which each point of the project area is represented by the density of a probabilistic depth distribution over equally probable reconstructed geological forms of structural images. The reconstruction quality was assessed on two blocks. With this method, credible detailed depth reconstructions, comparable in quality to 3D seismic maps, were obtained from 2D seismic maps.
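
To make the multimodal-synthesis step concrete, here is a minimal PyTorch sketch of the idea, not the authors' implementation: `ToyEncoder` and `ToyGenerator` are hypothetical stand-ins for trained pixel2style2pixel and StyleGAN2-ADA networks, and perturbing the inferred latent code is one assumed way of drawing equally probable reconstructions. Per-pixel statistics over the samples then realize the probabilistic depth space described in the abstract.

```python
import torch
import torch.nn as nn

class ToyEncoder(nn.Module):
    """Stand-in for a trained pSp-style encoder (hypothetical)."""
    def __init__(self, latent_dim=64):
        super().__init__()
        # Maps a 64x64 depth map to a single latent code.
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, latent_dim))

    def forward(self, x):
        return self.net(x)

class ToyGenerator(nn.Module):
    """Stand-in for a trained StyleGAN2-style generator (hypothetical)."""
    def __init__(self, latent_dim=64):
        super().__init__()
        # Decodes a latent code back to a 64x64 depth map.
        self.net = nn.Linear(latent_dim, 64 * 64)

    def forward(self, w):
        return self.net(w).view(-1, 1, 64, 64)

encoder, generator = ToyEncoder(), ToyGenerator()
degraded = torch.rand(1, 1, 64, 64)  # stands in for a low-detail 2D-seismic depth map

# Multimodal synthesis: perturb the inferred latent code N times to obtain
# N equally probable detailed reconstructions of the same area.
N = 32
with torch.no_grad():
    w = encoder(degraded)
    samples = torch.stack(
        [generator(w + 0.1 * torch.randn_like(w)) for _ in range(N)]
    )

# Per-pixel statistics approximate the "probabilistic depth space": every
# map point carries a depth distribution rather than a single depth value.
mean_depth = samples.mean(dim=0)
std_depth = samples.std(dim=0)
print(mean_depth.shape, std_depth.shape)  # torch.Size([1, 1, 64, 64]) twice
```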
