QA about images and catalogs


How can I check the PSF of coadd data?

For example, you can extract the PSF and save it as a FITS file with the following script:

# (x, y) is the patch-local pixel coordinate of interest
import lsst.afw.image as afwImage
import lsst.afw.geom as afwGeom

exp = afwImage.ExposureF("calexp-[filter]-[tract]-[patch].fits")
xy0 = exp.getXY0()  # offset of the patch within the tract
psf = exp.getPsf()
psfImage = psf.computeImage(afwGeom.Point2D(x + xy0.getX(), y + xy0.getY()))
psfImage.writeFits("out.fits")
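Once the PSF image is written out, you can sanity-check it without the LSST stack, for example by estimating its FWHM from the weighted second moments of the pixel array. A minimal numpy sketch (the function name and the synthetic Gaussian stamp below are illustrative, not part of the pipeline):

```python
import numpy as np

def moments_fwhm(img):
    """Estimate the FWHM (in pixels) of a PSF stamp from its weighted second moments."""
    y, x = np.indices(img.shape)
    total = img.sum()
    xc = (x * img).sum() / total
    yc = (y * img).sum() / total
    ixx = ((x - xc) ** 2 * img).sum() / total
    iyy = ((y - yc) ** 2 * img).sum() / total
    ixy = ((x - xc) * (y - yc) * img).sum() / total
    sigma = (ixx * iyy - ixy ** 2) ** 0.25   # determinant radius
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma

# Quick check on a synthetic circular Gaussian with sigma = 2 pixels:
y, x = np.indices((41, 41))
stamp = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / (2 * 2.0 ** 2))
print(moments_fwhm(stamp))
```

For a real PSF stamp, load "out.fits" with pyfits and pass the image array to the same function.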

There is an artificial pattern on a CORR or coadd image caused by a problem with a specific CCD

A temporary hardware problem can cause abnormal counts in a specific CCD or one of its channels, which produces an artificial pattern on the CORR or coadd image.

To avoid this effect, you need to mask the problematic pixels in the FLAT image under the CALIB directory. The following example masks the data with pyfits:

import pyfits

# These options are needed so that the file is saved as int16 when written back.
hdul = pyfits.open('FLAT-xxx.fits', uint=True, do_not_scale_image_data=False, scale_back=True)
data = hdul[2].data
for i in range(<problematic pixels>):
        data[<channel>] = 256
...
# The mask value comes from the NO_DATA or BAD flag.
# If "HIERARCH MP_NO_DATA = 8" appears in the header of the masked image,
# the value should be 2**8 = 256; if "MP_BAD = 0", it should be 2**0 = 1.

Then save the new FLAT data under the same name as the original and run reduceFrames.py again. When you execute reduceFrames.py, add "--config processCcd.isr.doBias=False processCcd.isr.doDark=False" to the command, because only the FLAT image is masked.


How can I get an object's size from the PSF in a catalog?

If the object is assumed to be a point source with a Gaussian profile, you can get its size from the meas catalog. The PSF information is stored in the "shape_sdss_psf" column.

# Using pyfits
import pyfits

catalog = pyfits.open("meas-*.fits")
moments = catalog[1].data['shape_sdss_psf']  # each row is [Ixx, Iyy, Ixy]

Ixx, Iyy, Ixy = moments[0]  # moments of the first object, for example
size = (Ixx*Iyy - Ixy**2)**(1./4)

Then the size should be converted from pixels to arcsec. If you need the FWHM, just multiply by the factor 2*math.sqrt(2*math.log(2)).
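Putting the pieces together, here is a small sketch of the size and FWHM computation. The pixel scale of 0.168 arcsec/pixel is the HSC value (an assumption here; check your instrument), and the function names are illustrative:

```python
import math

PIXEL_SCALE = 0.168  # arcsec/pixel for HSC (assumption; adjust for your camera)

def psf_size_arcsec(ixx, iyy, ixy):
    """Determinant radius of the Gaussian moments, converted to arcsec."""
    return (ixx * iyy - ixy ** 2) ** 0.25 * PIXEL_SCALE

def psf_fwhm_arcsec(ixx, iyy, ixy):
    """FWHM, assuming a Gaussian profile."""
    return psf_size_arcsec(ixx, iyy, ixy) * 2.0 * math.sqrt(2.0 * math.log(2.0))

# e.g. a circular PSF with Ixx = Iyy = 4 px^2, Ixy = 0 has sigma = 2 px:
print(psf_size_arcsec(4.0, 4.0, 0.0))  # 2 px * 0.168 = 0.336 arcsec
```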


How to make multiband catalogs with HSC SSP data?

The HSC Subaru Strategic Program (SSP) is led by the astronomical communities of Japan and Taiwan, and Princeton University. The first public data have been released (HSC SSP Data Release 1).

1. Preparation

Because the Pan-STARRS 1 catalog is used as the reference in SSP, you should use it for your processing. You also need to define tracts that are the same as the SSP data; please refer to "Define tract same as SSP".

Then download the following dataset and place each file in the appropriate directory:

  • rerun/[rerun]/deepCoadd/[filter]/[tract]/[patch]/calexp-[filter]-[tract]-[patch].fits
  • rerun/[rerun]/deepCoadd-results/[filter]/[tract]/[patch]/det-[filter]-[tract]-[patch].fits
  • rerun/[rerun]/deepCoadd/skyMap.pickle
  • rerun/[rerun]/schema/deepCoadd_det.fits

If there is no rerun/[rerun]/schema/deepCoadd_det.fits for the region you are interested in, please use the one from another region. For example, when you get calexp-HSC-Z-17272-5,0.fits from the SSP data release site, you have to place it in rerun/[rerun]/deepCoadd/HSC-Z/17272/5,0/.

If you already have multiband results in rerun/[rerun]/deepCoadd-results/[filter]/[tract]/[patch], the corresponding processing is skipped. Please delete or move these results, except for the det file.
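Clearing previous outputs while keeping the det files can be scripted; a sketch in Python (the function name, backup location, and the demo file names are made up for illustration):

```python
import fnmatch
import os
import shutil
import tempfile

def clear_multiband_outputs(patch_dir):
    """Move every file except det-*.fits out of a deepCoadd-results patch directory."""
    backup = patch_dir + ".bak"
    os.makedirs(backup)
    for name in os.listdir(patch_dir):
        if not fnmatch.fnmatch(name, "det-*.fits"):
            shutil.move(os.path.join(patch_dir, name), os.path.join(backup, name))

# Demo on a throwaway directory tree:
demo = tempfile.mkdtemp()
patch = os.path.join(demo, "5,0")
os.makedirs(patch)
for name in ("det-HSC-I-17272-5,0.fits", "meas-HSC-I-17272-5,0.fits"):
    open(os.path.join(patch, name), "w").close()
clear_multiband_outputs(patch)
print(sorted(os.listdir(patch)))  # only the det file remains
```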

If you create a new rerun directory for multiband processing with SSP data, you need to make a link named _parent that refers to the _mapper directory:

ln -s <_mapper directory> $home/hsc/rerun/<rerun>/_parent

2. Execution

When you execute the multiBand.py command, please add the following option:

multiBand.py $home/hsc --calib=$home/hsc/CALIB --rerun=<rerun name> --id tract=<tract> filter=HSC-I^HSC-G^HSC-Z --config measureCoaddSources.doPropagateFlags=False

# --config measureCoaddSources.doPropagateFlags: the default is True; it determines whether to match sources to CCD catalogs to propagate flags. In this case you don't have the individual CCD information, so set it to False.

If you see the error "Unequal schemas between input record and mapper", delete everything in rerun/[rerun]/schema/ except deepCoadd_det.fits.