Colloquium
Abstract
Deep neural networks have been leveraged in surprising ways in the context of computational inverse problems and imaging over the past few years. In this talk, I will explain how deep nets can sometimes generate helpful virtual "deepfake" data that were not originally recorded, but which extend the reach of inversion in a variety of ways. I will discuss two examples from seismic imaging: 1) bandwidth extension, which helps to convexify the inverse problem, and 2) "physics swap", which helps to mitigate nuisance parameters. Joint work with Hongyu Sun, Pawan Bharadwaj, and Matt Li.


For future talks or to be added to the mailing list: www.math.uh.edu/colloquium