# Analysis Of The RARMA Model

To develop the RARMA model, the CRTM was applied to the ECMWF analysis fields to compute the GPM/GMI satellite brightness temperatures at each channel. The ECMWF analysis, however, is a 0.125° × 0.125° product with 6 h temporal resolution, so the analysis fields had to be interpolated to the GMI footprint in both time and space. This was done with a simple scheme that interpolates the ECMWF atmospheric and surface analysis fields, in time and space, to the exact location of each GMI radiometric measurement. One should also remember that emissivity is very difficult to model over non-ocean surfaces; in particular, the radiometric simulation for surface-sensitive channels can therefore be less accurate.
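
The time–space interpolation described above can be sketched as follows: a minimal illustration assuming a regular latitude/longitude grid and two bracketing 6 h analyses. The function names, the toy grid, and the test field are hypothetical, not the paper's code:

```python
import numpy as np

def time_interp(field_t0, field_t1, t0, t1, t_obs):
    """Linear interpolation between two bracketing analysis times (hours)."""
    w = (t_obs - t0) / (t1 - t0)
    return (1.0 - w) * field_t0 + w * field_t1

def bilinear(field, lats, lons, lat, lon):
    """Bilinear spatial interpolation on a regular lat/lon grid."""
    i = int(np.clip(np.searchsorted(lats, lat) - 1, 0, lats.size - 2))
    j = int(np.clip(np.searchsorted(lons, lon) - 1, 0, lons.size - 2))
    fy = (lat - lats[i]) / (lats[i + 1] - lats[i])
    fx = (lon - lons[j]) / (lons[j + 1] - lons[j])
    return ((1 - fy) * (1 - fx) * field[i, j]
            + (1 - fy) * fx * field[i, j + 1]
            + fy * (1 - fx) * field[i + 1, j]
            + fy * fx * field[i + 1, j + 1])

# Toy 0.125-degree grid and a field varying linearly with latitude
lats = np.arange(-10.0, 10.125, 0.125)
lons = np.arange(100.0, 120.125, 0.125)
f0 = 280.0 + 0.1 * lats[:, None] + 0.0 * lons[None, :]
f1 = f0 + 1.0  # analysis valid 6 h later

# Interpolate to a hypothetical footprint at lat=2.3, lon=105.7, t=1.5 h
f_t = time_interp(f0, f1, t0=0.0, t1=6.0, t_obs=1.5)
val = bilinear(f_t, lats, lons, lat=2.3, lon=105.7)  # ~280.48 K here
```

A production scheme would additionally handle the poles, the dateline, and vertical profiles, but the two steps above (time weighting between analyses, then spatial weighting to the footprint) are the essence of the collocation.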

The results are shown as biases (GMI − CRTM TBs) before (unadjusted) and after (adjusted) applying the RARMA algorithm. In the figure, the bias histograms appear close to Gaussian. Nonetheless, a bias exists for all of the GMI channels; a comparatively small bias is seen on channel 12, which lies in the water vapor absorption line (183.31 ± 3 GHz). The peak of the distribution for every channel moves closer to zero after applying the RARMA algorithm. This figure should be read together with Table 2, which tabulates the mean radiance departures before and after the radiometric adjustment. The computed biases vary from channel to channel, as expected: biases of up to ~5 K are observed in some channels, most notably channels 3 and 4 (18 GHz). After applying the RARMA algorithm, however, a significant bias reduction is achieved for all channels. For example, the radiance departures for channels 3 and 4 are reduced from 3.26 and 5.02 K to 0.04 and −0.06 K, respectively.
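
The mean departures tabulated in Table 2 amount to a per-channel average of the GMI − CRTM differences over all matchups. A sketch with synthetic data follows; the offsets below merely mimic the ~3–5 K magnitudes quoted in the text and are not the paper's data:

```python
import numpy as np

def mean_departures(tb_gmi, tb_crtm):
    """Mean radiance departure (GMI - CRTM) per channel over all matchups.

    tb_gmi, tb_crtm: arrays of shape (n_obs, n_channels).
    """
    return (tb_gmi - tb_crtm).mean(axis=0)

# Synthetic matchups for two channels with offsets mimicking the quoted
# 3.26 K and 5.02 K departures (illustrative only, not the paper's data)
rng = np.random.default_rng(1)
tb_gmi = 260.0 + 15.0 * rng.random((1000, 2))
tb_crtm = tb_gmi - np.array([3.26, 5.02]) + rng.normal(0.0, 0.5, (1000, 2))
dep = mean_departures(tb_gmi, tb_crtm)  # close to [3.26, 5.02]
```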

## CONCLUSIONS

Radiometric biases are a well-known problem in satellite retrieval and data assimilation systems and thus require a bias correction step. In this work, a radiometric bias correction algorithm named RARMA has been presented. RARMA uses a model II regression method, specifically the reduced major axis (RMA) regression approach, to correct the radiometric biases. The CRTM was used as the radiative transfer model for simulating the brightness temperatures at the GMI frequencies. The RARMA algorithm appears to perform well in correcting the radiometric biases.
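
A reduced major axis fit can be sketched as follows: the RMA slope is the ratio of the two standard deviations, signed by the correlation, so both variables are treated as subject to error (unlike ordinary least squares, which assumes an error-free predictor). This is a generic illustration of the RMA technique, not the RARMA implementation; the fit direction (observed against simulated TBs) and the function names are assumptions:

```python
import numpy as np

def rma_fit(x, y):
    """Reduced major axis (model II) regression of y on x.

    slope = sign(r) * std(y) / std(x); the line passes through the means.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

def adjust(tb_sim, slope, intercept):
    """Map simulated TBs onto the observed radiometric scale."""
    return slope * tb_sim + intercept

# Synthetic single-channel check: simulated TBs with a 3 K cold offset
rng = np.random.default_rng(0)
tb_obs = 250.0 + 20.0 * rng.random(500)
tb_sim = tb_obs - 3.0 + rng.normal(0.0, 0.3, 500)
a, b = rma_fit(tb_sim, tb_obs)
residual = tb_obs - adjust(tb_sim, a, b)  # mean departure ~0 after adjustment
```

Because the fitted line passes through the sample means, the mean departure vanishes by construction on the fitting sample; the sign(r) factor keeps the slope's direction consistent when the correlation is negative.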

It must be stressed that the proposed bias correction is a static scheme: the coefficients are fixed for now. The satellite may, however, degrade over time, which could affect the calibration of the sensor. It will be interesting to examine, in the near future, how the derived coefficients continue to perform in correcting the radiometric biases; most likely, the RARMA coefficients will need to be updated. An alternative would be an adaptive scheme, in which the coefficients are updated regularly before each retrieval and assimilation run.
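
Such an adaptive scheme might, hypothetically, maintain a rolling window of recent matchups and re-fit the RMA coefficients before each run. A speculative sketch follows; the class and method names are invented and this is not part of RARMA:

```python
import numpy as np
from collections import deque

class AdaptiveRMA:
    """Hypothetical adaptive variant: keep a rolling window of recent
    (simulated, observed) TB matchups and re-fit the reduced major axis
    coefficients before each retrieval/assimilation run."""

    def __init__(self, window=10000):
        self.sim = deque(maxlen=window)
        self.obs = deque(maxlen=window)

    def accumulate(self, tb_sim, tb_obs):
        """Append new matchups; old ones fall off the window automatically."""
        self.sim.extend(np.ravel(tb_sim))
        self.obs.extend(np.ravel(tb_obs))

    def coefficients(self):
        """Re-fit the RMA slope and intercept on the current window."""
        x = np.fromiter(self.sim, dtype=float)
        y = np.fromiter(self.obs, dtype=float)
        r = np.corrcoef(x, y)[0, 1]
        slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
        return slope, y.mean() - slope * x.mean()

# Usage: accumulate matchups as orbits arrive, re-fit before each run
rng = np.random.default_rng(2)
obs = 250.0 + 20.0 * rng.random(3000)
sim = obs - 3.0 + rng.normal(0.0, 0.3, 3000)
corr = AdaptiveRMA(window=5000)
corr.accumulate(sim, obs)
a, b = corr.coefficients()
```

The bounded window would let the coefficients track slow sensor drift while forgetting stale matchups, at the cost of some sensitivity to the window length.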
