SAOhdf5

class Data_Reduction.DSN.SAO.SAOhdf5(filename, clobber_0=True, clobber_center=True)

Bases: object

Container for data in SAOspec HDF5 data files.
Notes

SAO HDF5 File Structure

SAO spectra are acquired from each ROACH once every five seconds. Each of
these is a record. Records are grouped into scans. So a 2-min integration
will have 24 records for each of the four ROACH boards.

A typical dataset looks like this:
In [8]: data.values()
Out[8]:
[<HDF5 dataset "NSCANS": shape (1, 1), type "<f4">,
 <HDF5 dataset "SITEELEV": shape (1, 1), type "|S13">,
 <HDF5 dataset "SITELAT": shape (1, 1), type "|S13">,
 <HDF5 dataset "SITELONG": shape (1, 1), type "|S13">,
 <HDF5 dataset "obs_freq": shape (1, 1), type "<f4">,
 <HDF5 dataset "rest_freq": shape (1, 1), type "<f4">,
 <HDF5 dataset "v_ref": shape (1, 1), type "|S10">,
 <HDF5 dataset "vsys": shape (1, 1), type "<f4">,
 <HDF5 dataset "integ_time": shape (1, 4), type "<f4">,
 <HDF5 dataset "scan_number": shape (326, 1), type "<f4">,
 <HDF5 dataset "date_obs": shape (326, 1), type "|S10">,
 <HDF5 dataset "time_obs": shape (326, 1), type "|S10">,
 <HDF5 dataset "timestamp": shape (326, 1), type "<i8">,
 <HDF5 dataset "LST": shape (326, 1), type "|S13">,
 <HDF5 dataset "scan_duration": shape (326, 1), type "<f4">,
 <HDF5 dataset "observer": shape (326, 1), type "|S10">,
 <HDF5 dataset "source_name": shape (326, 1), type "|S10">,
 <HDF5 dataset "onsource": shape (326, 1), type "|b1">,
 <HDF5 dataset "current_azel": shape (326, 2), type "<f4">,
 <HDF5 dataset "offsets": shape (326, 2), type "<f4">,
 <HDF5 dataset "source_azel": shape (326, 2), type "|S12">,
 <HDF5 dataset "source_long_lat": shape (326, 2), type "|S13">,
 <HDF5 dataset "source_radec": shape (326, 2), type "|S13">,
 <HDF5 dataset "weather": shape (326, 6), type "<f4">,
 <HDF5 dataset "Tsys": shape (326, 4), type "<f4">,
 <HDF5 dataset "bandwidth": shape (326, 4), type "<f4">,
 <HDF5 dataset "mode": shape (326, 4), type "|S10">,
 <HDF5 dataset "pol": shape (326, 4), type "|S3">,
 <HDF5 dataset "spectraCh1": shape (326, 32768), type "<f4">,
 <HDF5 dataset "spectraCh2": shape (326, 32768), type "<f4">,
 <HDF5 dataset "spectraCh3": shape (326, 32768), type "<f4">,
 <HDF5 dataset "spectraCh4": shape (326, 32768), type "<f4">]
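These files are plain HDF5, so the listing above can be reproduced with
h5py. A minimal sketch; the filename is hypothetical:

import h5py

# Open the file read-only and list its datasets, as in Out[8] above.
# "sao_dataset.hdf5" is a hypothetical filename.
data = h5py.File("sao_dataset.hdf5", "r")
for dset in data.values():
    print(dset)
data.close()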
In order, the first set are parameters which do not change for the entire
dataset; integ_time is specified once for each backend. The remaining
parameters differ for each integration. Most of the remaining arrays hold
multiple values per record in an obvious way, such as one value per
five-second integration, or one value per DSProc (ROACH). For example, the
weather data associated with each nominally 5 sec record look like this:

In [49]: hdf.data['weather'].value
Out[49]:
array([[ 28.05555725, 946.18383789, 21.9917202 ,  9.88373375, 187., 0.],
       [ 28.05555725, 946.18383789, 21.24612045,  9.26599979, 157., 0.],
       [ 28.05555725, 946.18383789, 21.24612045,  9.26599979, 157., 0.],
       ...,
       [ 28.38888931, 945.9130249 , 19.21655655,  5.55959988,  42., 0.],
       [ 28.38888931, 945.9130249 , 18.9209938 ,  5.55959988,  44., 0.],
       [  0.        ,   0.        ,  0.        ,  0.        ,   0., 0.]],
      dtype=float32)
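A hedged sketch of reading the fixed parameters and the per-record arrays
with h5py, assuming the layout shown above (the filename is hypothetical):

import h5py

# Hedged sketch, assuming the dataset layout listed above.
# "sao_dataset.hdf5" is a hypothetical filename.
with h5py.File("sao_dataset.hdf5", "r") as hdf:
    obs_freq   = hdf["obs_freq"][0, 0]    # fixed for the whole dataset
    rest_freq  = hdf["rest_freq"][0, 0]   # fixed for the whole dataset
    integ_time = hdf["integ_time"][0]     # one value per ROACH backend, shape (4,)
    weather    = hdf["weather"][:]        # one row per record, shape (326, 6)
    tsys       = hdf["Tsys"][:]           # one value per record per ROACH, shape (326, 4)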
The records are associated with scans like this:

In [48]: hdf.data['scan_number'].value
Out[48]:
array([[ -1.],[ -1.],[ -1.],[ -1.],[ -1.],[ -1.],[ -1.],[ -1.],[ -1.],[ -1.],
       [  1.],[  1.],[  1.],[  1.],[  1.],[  1.],[  1.],[  1.],[  1.],[  1.],
       [  1.],[  1.],
       [ -1.],[ -1.],
       [  2.],[  2.],[  2.],[  2.],[  2.],[  2.],[  2.],[  2.],[  2.],[  2.],
       [  2.],[  2.],
       [ -1.],[ -1.],[ -1.],
       ...
       [ -1.],[ -1.],
       [ 19.],[ 19.],[ 19.],[ 19.],[ 19.],[ 19.],[ 19.],[ 19.],[ 19.],[ 19.],
       [ 19.],[ 19.],[ 19.],
       [ -1.],[ -1.],
       [ 20.],[ 20.],[ 20.],[ 20.],[ 20.],[ 20.],[ 20.],[ 20.],[ 20.],[ 20.],
       [ 20.],[ 20.],
       [ -1.],
       [  0.]], dtype=float32)

A scan number of -1 marks records taken between scans; the trailing 0 marks
the empty final record (compare the all-zero final row of the weather array).
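A hedged sketch of grouping record indices by scan with this array, skipping
the -1 and 0 marker rows (the filename is hypothetical):

import h5py
import numpy as np

# Hedged sketch: collect the record indices belonging to each scan,
# ignoring the -1 rows between scans and the trailing 0 row.
# "sao_dataset.hdf5" is a hypothetical filename.
with h5py.File("sao_dataset.hdf5", "r") as hdf:
    scan_number = hdf["scan_number"][:, 0].astype(int)

for scan in np.unique(scan_number[scan_number > 0]):
    records = np.nonzero(scan_number == scan)[0]
    print("scan %2d: %2d records" % (scan, len(records)))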
This class has an attribute 'container' similar to an SDFITS table. The
'spec' axes are:

- frequency - 32768 equally spaced frequencies in the topocentric frame
- R.A. - 1 J2000 coordinate
- declination - 1 J2000 coordinate
- pol - 2 for E,H or L,R
- time - N seconds since midnight for each record in a scan
- beam - 2 for 1,2
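Purely as an illustration of these axes (the actual axis order in SAOdataset
is not documented here and should be checked), indexing a spec array ordered
as listed might look like:

import numpy as np

# Illustration only: this axis order (freq, R.A., dec, pol, time, beam)
# simply follows the list above and is an assumption, not a documented layout.
nfreq, nra, ndec, npol, ntime, nbeam = 32768, 1, 1, 2, 24, 2
spec = np.zeros((nfreq, nra, ndec, npol, ntime, nbeam), dtype=np.float32)

# One spectrum: every frequency channel for pol 0, record 5, beam 1.
one_spectrum = spec[:, 0, 0, 0, 5, 1]
print(one_spectrum.shape)   # (32768,)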
Attributes:

- clobber_0 - set the 0th value to the 1st value if True
- clobber_center - set the 16384th value to the mean of the 16383rd and
  16385th values if True
- container - SAOdataset object in which the reformatted data are stored
- data - HDF5 file contents (temporary)
- filename - name of the HDF5 file
- logger - logging.Logger instance
- meta - metadata from the HDF5 file
- num_DSP - number of signal processors
- num_uniq - number of unique records for each scan, beam, and pol
- PROJID - identifier string for the project
- SITEELEV - elevation of the telescope (m)
- SITELAT - latitude of the telescope (deg)
- SITELONG - east longitude of the telescope (deg)
- spec - numpy array of spectra
- uniq_recs - dict with the number of unique records for each pol and beam

Methods:

- extract_valid_records - gets non-duplicate, non-zero records
- to_dataset - returns an SAOdataset instance (see the usage sketch below)
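A hedged usage sketch based on the attributes and methods listed above. The
filename is hypothetical, and whether to_dataset() calls
extract_valid_records() internally is not stated here, so the sketch calls
both explicitly:

from Data_Reduction.DSN.SAO import SAOhdf5

# Hedged sketch; "sao_dataset.hdf5" is a hypothetical filename.
hdf = SAOhdf5("sao_dataset.hdf5", clobber_0=True, clobber_center=True)
hdf.extract_valid_records()     # drop duplicate and empty records
dataset = hdf.to_dataset()      # SAOdataset suitable for pickling
print(hdf.num_DSP, hdf.num_uniq)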
Methods Summary

- extract_valid_records() - Remove duplicate and empty records from each scan.
- to_dataset() - Fill the container with data in a format suitable for
  pickling (saving to disk).
Methods Documentation

extract_valid_records()

Remove duplicate and empty records from each scan.

Returns a numpy array of valid records and the number of valid records for
each pol and beam.

to_dataset()

Fill the container with data in a format suitable for pickling (saving to
disk).

This method is used by the calling program which initializes this class to
put the data in a FITS-like format that can be saved in a pickle file.
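A hedged sketch of the save step described above, using the standard pickle
module; both filenames are hypothetical:

import pickle

from Data_Reduction.DSN.SAO import SAOhdf5

# Hedged sketch; "sao_dataset.hdf5" and "sao_dataset.pkl" are
# hypothetical filenames.
hdf = SAOhdf5("sao_dataset.hdf5")
dataset = hdf.to_dataset()                  # FITS-like SAOdataset
with open("sao_dataset.pkl", "wb") as pklfile:
    pickle.dump(dataset, pklfile)           # save to disk for later use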