IGARSS11_FFBP_CSAR_v3.ppt
1. Octavio Ponce, Pau Prats, Marc Rodriguez-Cassola, Rolf Scheiber, Andreas Reigber (Microwave and Radar Institute (HR), German Aerospace Center): Processing of Circular SAR Trajectories with Fast Factorized Back-Projection
22. FFBP CSAR - Simulation. 700 m terrain, 4000 m height, 4500 m radius, 94 MHz bandwidth, L-band, 400 Hz PRF.
23. FFBP CSAR - Simulation. *Interpolator: Knab pulse of 21 points. BP vs FFBP. (Knab, IEEE TIT, 1979)
24. FFBP CSAR - Performance. Speed-up factor depends on system, geometry, hardware, and interpolator. For 25000 x 25000 pixels: BP CPU ~238 days, BP GPU ~3 days, FFBP CPU ~11 hrs, FFBP GPU ~3 hrs.
Circular SAR (CSAR) mode is the acquisition of data of a spotlighted target region over a 360° synthetic aperture. This geometry depends mainly on the platform height, flight radius, and aperture angle.
One of the biggest potentials of CSAR is its resolution, because it can be smaller than the wavelength. Here we can see a simulated impulse response of the stripmap SAR mode, which is basically a 2-D sinc. Its resolution depends on the spectral bandwidth in the range and azimuth directions.
In CSAR, on the other hand, the resolution depends mainly on the diameter of the ring, which means much higher resolution than stripmap in both dimensions. The thickness of this ring is the bandwidth of the system, and the ring has the same shape as the flight track. In the time domain, the 2-D impulse response is the difference of two Bessel functions.
The highest resolution in ground range is defined as lambda over four, depending on the depression angle. In this case, for L-band, the highest resolution would be 8 cm, which allows a reduction of the speckle, as we'll see later. In a typical stripmap case we get 1.5 m x 1 m; with the same system but a different geometry we can get a much higher resolution.
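The numbers above can be checked with a quick calculation. This is a minimal sketch, assuming an L-band carrier near 1.3 GHz and expressing the ground-range limit as lambda/(4·cos(depression)); both the carrier value and the exact formula are assumptions for illustration, not taken from the slides:

```python
import math

c = 3e8
fc = 1.3e9                      # assumed L-band carrier frequency
lam = c / fc                    # wavelength, ~0.23 m

for dep_deg in (0, 30, 45):     # illustrative depression angles
    res = lam / (4.0 * math.cos(math.radians(dep_deg)))
    print(f"{dep_deg:2d} deg -> {res * 100:.1f} cm ground-range limit")
```

With these assumptions, the 45° case lands at roughly the 8 cm figure quoted above.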
CSAR also has the potential for 3-D reconstruction, similar to computer-aided tomography. Conventional SAR cannot overcome the altitude ambiguity caused by two scattering centers that lie along a common wavefront but at different altitudes. The CSAR geometry resolves these height ambiguities through measurements made from different azimuth angles along the flight track.
This is a 3-D IRF of CSAR with one pass, focused from -5 m to 15 m. We can see that at the true height we get a peak with the maximum energy. At a wrong height we get a ring instead, which grows in size as we increase the distance from the true height.
The problem we were dealing with was the development of a focusing algorithm that is efficient in time (low computational burden), uses the real track to avoid approximations, takes the topography changes into account, and achieves high accuracy in amplitude and phase.
FFBP is based on the direct BP and was first developed for the spotlight mode. It solves the problem of the computational burden by splitting the full synthetic aperture of L pulses into smaller subapertures of Lm pulses and focusing several coarse images with the direct back-projection, but in polar coordinates. Afterwards, each pair of contiguous images is merged by a polar interpolation, increasing the resolution in the flight direction, until we reach the full synthetic aperture. At this point we can perform an interpolation from the polar to a Cartesian grid.
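The direct back-projection that FFBP factorizes can be illustrated with a toy simulation. This is only a sketch, not the real processor: all parameters (track radius, height, bandwidth, grid) are invented and far smaller than the campaign values, and the echoes are modeled as ideal range-compressed sincs:

```python
import numpy as np

c = 3e8
fc = 1.3e9                          # assumed L-band carrier
lam = c / fc
B = 100e6                           # assumed range bandwidth

# Circular track: 64 pulse positions on a small ring (toy geometry)
L = 64
phi = np.linspace(0, 2 * np.pi, L, endpoint=False)
radius, height = 100.0, 50.0
pos = np.stack([radius * np.cos(phi), radius * np.sin(phi),
                np.full(L, height)], axis=1)

# Point target at the scene origin; simulate range-compressed echoes
tgt = np.zeros(3)
ts = np.linspace(2 * 90 / c, 2 * 140 / c, 512)      # fast-time axis
R0 = np.linalg.norm(pos - tgt, axis=1)
pulses = (np.sinc(B * (ts[None, :] - 2 * R0[:, None] / c))
          * np.exp(-1j * 4 * np.pi * R0[:, None] / lam))

# Direct BP: every pulse is spread over every pixel of the ground grid
x = y = np.linspace(-10, 10, 41)
X, Y = np.meshgrid(x, y)
img = np.zeros(X.shape, complex)
for p, pc in zip(pulses, pos):
    R = np.sqrt((X - pc[0])**2 + (Y - pc[1])**2 + pc[2]**2)
    tau = 2 * R / c
    samp = np.interp(tau, ts, p.real) + 1j * np.interp(tau, ts, p.imag)
    img += samp * np.exp(1j * 4 * np.pi * R / lam)  # phase compensation

iy, ix = np.unravel_index(np.argmax(np.abs(img)), img.shape)
print(x[ix], y[iy])   # the peak should land at the target position
```

FFBP replaces the single loop over all L pulses with coarse polar images from Lm-pulse subapertures that are merged pairwise, which is where the computational saving comes from.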
Why is it done in polar coordinates? Because the sampling requirements are much more relaxed than in Cartesian coordinates, as shown in these images. This also helps to decrease the computational burden.
Suppose we have focused two images, with respect to q_1a and q_2a, which we would like to interpolate into q_1b. Our knowns are a_1b and r_1b, and we know the distance between the subaperture centers, d_d. Then, by the law of cosines, we can get a_1a and r_1a.
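A minimal sketch of this stripmap-style conversion, assuming the angle a_1b is measured from the baseline joining the two subaperture centers and that q_1a is offset from q_1b by d_d along that baseline (the conventions and names here are illustrative, not taken from the slides):

```python
import math

def polar_to_polar(r_1b, a_1b, d_d):
    """Map (r_1b, a_1b) given w.r.t. center q_1b into (r_1a, a_1a)
    w.r.t. center q_1a, assumed offset by d_d along the baseline.
    Angles measured from the baseline (illustrative convention)."""
    # law of cosines gives the new range directly
    r_1a = math.sqrt(r_1b**2 + d_d**2 - 2.0 * r_1b * d_d * math.cos(a_1b))
    # recover the angle unambiguously from Cartesian components
    px = r_1b * math.cos(a_1b) - d_d
    py = r_1b * math.sin(a_1b)
    a_1a = math.atan2(py, px)
    return r_1a, a_1a
```

The Cartesian step avoids the sign ambiguity of solving the triangle with an arcsine alone.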
This is a graphic example.
How do we perform the P2P interpolation? If we have focused an image with respect to q_1a that we want to interpolate to the new reference system q_1b, we cannot use the law of cosines as defined for stripmap, because we don't want the angle with respect to the distance vector between the two subaperture centers. Instead, we would like the angle with respect to the vector q_1a, as we define our system. To solve this, we project the known coordinates a_1b and r_1b to the ground using the DEM, followed by a rotation by b_deg and a translation with respect to q_1b. In that way we obtain the distance r_1a and the angle a_1a by means of the dot product of two vectors. Then we start interpolating contiguous images two by two…..
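The dot-product step can be sketched as follows; the function name and the choice of the subaperture center's own position vector as the angular reference are illustrative assumptions based on the description above:

```python
import numpy as np

def range_and_angle(ground_pt, center, ref_vec):
    """Range and aspect angle of a projected ground point w.r.t. a
    subaperture center, with the angle taken against a reference
    vector (here: the position vector of the center, as in the
    slide). Names and conventions are illustrative."""
    d = ground_pt - center
    r = np.linalg.norm(d)
    cos_a = np.dot(d, ref_vec) / (r * np.linalg.norm(ref_vec))
    a = np.arccos(np.clip(cos_a, -1.0, 1.0))  # clip guards rounding
    return r, a
```

The DEM projection and the rotation/translation into the q_1b frame would happen before this call.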
Now the full-resolution final grid of ~lambda/4 will be used until we reach Lmax pulses, in this case 45°, so we need to interpolate only 8 times using the full-resolution grid. A simple and fast solution.
We did a simulation of a circular flight with……
To measure the performance we use the direct BP as a reference. The results with simulated data indicate…… high accuracy in amplitude and phase. For this, the truncated-sinc + Knab pulse of 21 points was used.
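The interpolator can be sketched as a 21-point truncated-sinc kernel; note that the actual processor applies the Knab window (Knab, IEEE TIT, 1979) on top of this kernel, which is omitted here, so this is only the unwindowed baseline:

```python
import numpy as np

def truncated_sinc_interp(samples, fs, t_new, half_len=10):
    """Interpolate uniformly sampled data at arbitrary times t_new
    using a (2*half_len + 1)-point truncated-sinc kernel (21 points
    for half_len=10). The Knab window used in the real processor is
    omitted in this sketch."""
    n0 = np.round(t_new * fs).astype(int)   # nearest-sample centers
    out = np.zeros(len(t_new), dtype=samples.dtype)
    for k in range(-half_len, half_len + 1):
        n = np.clip(n0 + k, 0, len(samples) - 1)
        out += samples[n] * np.sinc(t_new * fs - n)
    return out
```

Windowing the kernel is what pushes the truncation error down far enough for phase-accurate focusing; the bare sinc above is only adequate for oversampled data.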
The direct BP and the FFBP were also parallelized on a GPU in order to accelerate the process even further. Here we see the speed-up factor of the direct BP and the FFBP with respect to the direct BP tested on the CPU. The direct BP on the GPU (black line) was 100 times faster, while the FFBP on the CPU (red line) shows an improvement of 500; but this is not all: the FFBP on the GPU improves this factor up to 1800. As an example, we took a matrix size of 25k x 25k pixels (1.5 x 1.5 km, 6 cm resolution): the BP on the CPU lasts ~238 days, while the BP on the GPU took ~3 days; the FFBP on the CPU improves this to ~11 hours, and on the GPU to ~3 hours. These results depend directly on the PRF, radius, hardware, and interpolator.
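The order of magnitude of these gains can be motivated by the usual operation counts: direct BP costs O(N^3) (N pulses, each spread over N^2 pixels), while factorized BP costs roughly O(N^2 log N). A back-of-the-envelope sketch (constants and hardware ignored, so this is a theoretical ratio, not the measured runtimes from the slide):

```python
import math

N = 25_000                        # pixels per side ~ pulses (illustrative)
bp_ops = N * N * N                # direct BP: every pulse touches every pixel
ffbp_ops = N * N * math.log2(N)   # FFBP: log-depth hierarchy of merges
print(f"theoretical speed-up ~ {bp_ops / ffbp_ops:.0f}x")
```

The ratio simplifies to N / log2(N), about 1.7e3 for N = 25 000; the measured 1800x GPU figure additionally folds in the hardware difference.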
We analyzed the real data acquired in a campaign with the E-SAR system in Kaufbeuren, Germany…..
The results comparing the BP with the FFBP indicate…… high accuracy in amplitude and phase.
This animation shows images that were focused in subapertures of 10° over the 360° full synthetic aperture. The polarimetric signature changes as a function of the aspect angle….. Smeared objects are moving targets. Resonance (flash fields). Saturation because of the high backscattered energy.
Lights on the runway. No speckle. Higher resolution.
It is possible to see through the canopy. Tree trunks appear as double bounce. Shape of the buildings. Fence as double bounce.
Defocused soil. Luneberg lens as double bounce. Importance of the DEM: inaccuracies of the DEM can be corrected with CSAR. Ring effect, as shown at the beginning.
3D IRF as shown in the first slides!
Tree trunk (Double bounce) focused at Height = 0m
In this way the measured signal in fast time and slow time, (t, phi), can be described by the convolution of the reflectivity function f(x, y, z) and the transmitted radar signal p(t).
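Written out, a common formulation of this model is (the notation below is assumed for illustration, not copied from the slides):

```latex
s(t,\varphi) \;=\; \iiint f(x,y,z)\; p\!\left(t - \frac{2\,R(x,y,z;\varphi)}{c}\right)\, \mathrm{d}x\,\mathrm{d}y\,\mathrm{d}z
```

where R(x, y, z; phi) is the distance from the antenna position at azimuth angle phi to the scatterer and c is the speed of light; back-projection inverts this model by summing the data along the same delay surfaces.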
In the left image it is possible to see an overview of the circular trajectory, and next to it the beam pattern of the L-band antenna during the flight. Commonly illuminated spot.
This is a sequence of P2P interpolations corresponding to subapertures from 1° to 45°. We can see that the spectrum starts to curve as we increase the size of the subaperture. This causes an increase in the number of samples needed in the range direction to avoid aliasing. Therefore it is not worth continuing to interpolate on polar grids, and an interpolation to the final Cartesian grid should be done before we reach the full synthetic aperture. This does not cancel the great improvement in terms of computational burden, since large blocks of data can be processed and the full-resolution Cartesian grid is used only a few times. Interpolation kernel.
For every polar-to-polar interpolation the grids have to be calculated.