Strategy: 20 questions
======================

The idea here is:
- to feed into the choice of the criteria that will be used by the Query Tool to select the next observation
- to design likely MSBs, which will influence how the data are reduced

Some of these issues were dealt with in the 2-year plan, and some are covered in the GCS and LAS draft strategy documents, but in most cases we now need more detail. In general, if you have an idea of what you need done, think also of how it might be implemented, in terms of a parameter specified in the MSB and a test made by the Query Tool. Hopefully the list below will ensure uniform coverage by each survey. Some points don't apply to particular surveys.

My feeling is that at this stage UKIDSS should concentrate on the 2-year plan, and forget anything beyond that, since it isn't yet approved. However, we should overfill the database by, say, 25%, i.e. plan as if we were looking at the first 2.5 years. This ensures that if the conditions are unusual (such as 3 months cloudy followed by 3 months clear), there will be no shortage of MSBs to observe at any time over the first two years of the survey.

It might be useful to know that, as a rough guide, it is anticipated WFCAM will be on the telescope (observing in earnest) for about 7 months in 2004, something like Feb, Mar, and Aug through Dec.

Also for reference:
- the layout of the arrays is now with 93% spacing, which means that the max. overlap between arrays is about 30", before losing overlap due to microstep/dither strategies
- revised sensitivity figures are on the web (technical section) [i) these are 0.4 mag less deep than previously assumed, because they are 2" aperture mags rather than psf-fit mags; ii) revised depths in ZYJH relative to K are now included]
- in thinking of MSBs, think in terms of 40 mins

At the WG meetings we worked under a few assumptions about the query tool:
- the query tool will give preference to fields that are rising
- the query tool will use a weighting scheme, where the parameters used include: a) seeing, corrected from the zenith value for its airmass dependence, b) sky brightness, again corrected for airmass dependence, and avoiding times when the sky is bright in particular filters (ZYJ at the start of the night)
- the timing of the beginning and end of the night is likely to be set by the visibility of guide stars

1. Survey layout
----------------

WFAU would like 'standard field centres'. This would just mean that all the pointings for a survey need to be set (which of course they have to be, in writing the MSBs), and then numbered. These numbers are a bit like the field numbers of the UK Schmidt surveys, and are useful for giving a numbering scheme for detected objects. Unlikely to be a problem.

2. Prioritisation
-----------------

What are your priorities for your survey in terms of filters and areas? This needs to be written down in some detail if you want something complicated. How important is it that partially completed areas are contiguous (does one specify the order in which MSBs are executed)? The idea is that prioritisation between the different surveys is a problem for the Survey Manager, who will use a simulator to ensure each survey gets approximately what it is owed.

3. Seeing
---------

For your survey, is there a max. seeing requirement?
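As a concrete illustration of the airmass correction of seeing assumed for the query tool above, here is a minimal sketch. It assumes the standard Kolmogorov-turbulence scaling (seeing proportional to airmass^0.6); the function names are purely illustrative, not part of any agreed tool.

```python
# Sketch of the airmass correction of seeing assumed for the Query
# Tool. The airmass**0.6 scaling is the standard Kolmogorov-turbulence
# result; function names here are illustrative only.

def seeing_at_airmass(zenith_seeing, airmass):
    """Predict the seeing at a given airmass from the zenith value."""
    return zenith_seeing * airmass ** 0.6

def zenith_seeing_from(observed_seeing, airmass):
    """Correct an observed seeing measurement back to the zenith."""
    return observed_seeing / airmass ** 0.6

# E.g. 0.6" zenith seeing degrades to about 0.7" by airmass 1.3:
print(round(seeing_at_airmass(0.6, 1.3), 2))
```

The query tool would apply the first correction when deciding whether a candidate field away from the zenith still meets a survey's max. seeing requirement.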
The seeing statistics indicate 3 regimes (take the boundaries between these as indicative, to within about 0.1"). What do you do if the seeing is in regime i, ii, or iii?

i) Good seeing: 0"-0.7" (about 70% of the time). Because of the undersampling, the survey speed is not greatly different across this range. Specifying max 0.7" for certain cases, e.g. in the Galactic Centre, would appear to be reasonable.

ii) Mediocre seeing: 0.7"-1.2" (about 20% of the time). Survey speed is reduced but the data are still useful. [For example, in mediocre seeing the LAS plan is to observe a particular declination band with longer integration time (twice as much?).]

iii) Bad seeing: >1.2". Probably wasting our time. Does anybody have a serious suggestion of how to use this time usefully, or should we just stop observing?

For more details see:
http://www.jach.hawaii.edu/JACpublic/UKIRT/news/Newsletter/issue10/UKnews.html#1

Seeing is wavelength dependent - how does this affect strategy? Should exposure time depend on the seeing/sky brightness combination?

4. Airmass
----------

Any airmass limit? Presumably not: zenith seeing will be airmass-scaled by the Query Tool in deciding whether to move to a field.

5. Filters
----------

What are the restrictions on sequencing with filters? Some surveys have suggested cycling JHK, JHK, etc. But this may be undesirable because flat fielding/sky subtraction is best done when you have a long sequence of frames taken in the same filter. In any case, note that the sky brightness declines very rapidly in the Z, Y and J bands over the first 3 hours of the night, so should we specify 'LOW SKY' for some fields, or just blank out this time for these filters? See the nice new plots of the variation of sky brightness over the night at:
http://www.jach.hawaii.edu/JACpublic/UKIRT/astronomy/sky/skies.html

How do you tie fields together? E.g. one night K is done, and we then want J. But it's then cloudy for a month.
On returning, another field is at better airmass, but we would now prefer to go back and do J in the first field, even though the query tool says do K in the nearby field. Query priorities have to change, depending on what has already been done.

6. Microstepping
----------------

Is 2x2 microstepping satisfactory, or are there any circumstances under which 3x3 microstepping would be required? Currently microstepping is seen as something which has a step size of a very few (<5) arcsec. The idea for 2x2 microstepping is N+0.5-pixel microsteps, repeated (after a small N-pixel offset), yielding 8 x 5 sec frames per pointing. Each image pixel in the final 4096x4096 frame for any array is the average of 2 x 5 sec observations on different array pixels (allowing elimination of most bad pixels). The microstep size should be small, so that variation of the pixel scale across the frame is not important and the data can be interlaced. [I'm getting hold of the details of the N+delta-pixel steps allowed by the instrument (it depends on the beat between the IR and optical (autoguider) pixel sizes), but I have an idea the step is larger for 3x3, which then eats into the array overlap size substantially.]

7. Dithering
------------

This is the big question for the DXS and UDS. What is the dithering sequence, needed to minimise systematics, that gets you to the required depth? Dithering should be treated separately from microstepping. Think of a single pointing producing a 4096x4096 frame; dithering then considers what small shifts to make between these frames. Since the frames may be taken at significantly different airmasses, atmospheric refraction means they can't be on the same pixel grid. So it doesn't make sense to try to combine the microstepping and dithering strategies into a complicated set of N+0.5-pixel shifts over a couple of hours. Do you, e.g., do a 3x3 dither on any pointing and then macrostep, or do you tile a large area and then repeat, shifting slightly (i.e. dithering) on each repeat? An issue here is the size of the largest object you want to measure an accurate luminosity profile for. The same strategy in all bands?

8. Depth or area
----------------

This is really an issue only for the DXS. Do you build up depth first, or area? If depth, then over what area (a single tile at a time would probably produce inferior data, because you don't benefit from the other areas in the sky subtraction process)?

9. Offset skies
---------------

A concrete proposal is needed for the GPS and GCS.

10. Repeats/Stacking
--------------------

What sort of requirement are the different surveys looking for? E.g. after one year stack all possible frames, or wait until the full depth is reached, or restack and generate catalogues every time a new frame comes in? For surveys requiring variability/p.m. info, what limits are to be put on time intervals?

11. Calibration
---------------

Calibration might consist of observing a calibration field in all of YJHK every two hours. Would this meet the requirements of your survey?

12. Non-photometric
-------------------

Weather conditions will probably fall into three categories. The query tool will know the current conditions, whether i) clear (~50%), ii) thin cirrus (~25%), or iii) thick cloud (~25%). The assumption is that for the UDS and DXS the MSBs for any field will include only a few that must have i), and that the remainder can work with ii) [most UDS and DXS data, of course, will be taken in i) however]. What requirements should be specified? For example, if you eventually need 50 exposures in one area over 2 years, do you specify a photometric frame every month? The LAS, GCS, and GPS, on the other hand, have to have a fraction of MSBs, with good RA distribution, that can be executed when conditions are ii) (for between 1/4 and 1/3 of their time?).
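As a rough sketch of how the query tool might match MSBs to the current weather category: tag each MSB with the worst conditions it can tolerate, and select only compatible ones. The tagging scheme and names here are hypothetical, not an agreed design.

```python
# Hypothetical sketch of query-tool weather matching: each MSB is
# tagged with the worst conditions it can tolerate (1 = clear,
# 2 = thin cirrus, 3 = thick cloud), and only MSBs whose tolerance
# is at least as bad as the current category are selectable.
CLEAR, THIN_CIRRUS, THICK_CLOUD = 1, 2, 3

def selectable(msbs, current_conditions):
    """Return the MSBs that can be executed in the current conditions."""
    return [m for m in msbs if m["worst_ok"] >= current_conditions]

msbs = [
    {"name": "UDS-photometric", "worst_ok": CLEAR},        # must be clear
    {"name": "DXS-stack",       "worst_ok": THIN_CIRRUS},  # cirrus OK
]
print([m["name"] for m in selectable(msbs, THIN_CIRRUS)])  # ['DXS-stack']
```

Under this sketch, specifying "a photometric frame every month" would just mean ensuring each field always has at least one pending MSB tagged CLEAR.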
Better stats will be forthcoming.

13. Macrostepping
-----------------

Any restrictions on how the areas are tiled? One way to specify this is to give every MSB a ranking, for example to ensure contiguous rather than patchy coverage.

14. Bright stars
----------------

What limits should be placed? The issues are remanence and ghosting - I'll try to get some figures from Mark. Are there any other issues?

15. Large objects
-----------------

What is the maximum angular size of object for which you expect standard procedures to work? (E.g. will galaxies in Virgo be a problem, requiring a separate observing strategy?)

16. Moon
--------

Any moon limitations? E.g. avoid short wavelengths in thin cirrus during bright moon.

17. Commissioning
-----------------

This is now largely irrelevant. The idea is that once the camera works, some two months of survey data will be taken, without a break. Nevertheless some small, especially urgent programmes could be accelerated.

18. Unresolved issues
---------------------

List the technical issues you need to know about before final decisions can be taken, e.g.:
- magnitude and timescale of remanence (seems very small)
- ghosting level
- minimum/best time coverage in a single band in order to ensure good sky subtraction
- array bad-pixel numbers/scale
- sky brightness variation with zenith distance

19. Data release
----------------

If there is time, come up with some concrete proposals on data release.

20. Integration time
--------------------

Is the 5s integration appropriate for your observations?
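Returning to the 2x2 microstepping of question 6, the interlacing step can be sketched as follows. This is a toy illustration only, assuming ideal half-pixel offsets and ignoring the pixel-scale variation that question 6 argues is negligible for small steps; in reality two such 4-frame sets (offset by N pixels) would be averaged to reject bad pixels.

```python
import numpy as np

def interlace_2x2(frames):
    """Interlace four N x N frames, taken at (0, 0), (0.5, 0), (0, 0.5)
    and (0.5, 0.5) pixel offsets, onto a 2N x 2N half-pixel grid.
    Which sub-pixel each frame fills depends on the sign convention of
    the offsets; the assignment below is one illustrative choice.
    """
    f00, f10, f01, f11 = frames
    n = f00.shape[0]
    out = np.empty((2 * n, 2 * n), dtype=float)
    out[0::2, 0::2] = f00   # unshifted frame: even rows, even cols
    out[0::2, 1::2] = f10   # half-pixel step in x fills odd columns
    out[1::2, 0::2] = f01   # half-pixel step in y fills odd rows
    out[1::2, 1::2] = f11   # step in both fills the remaining pixels
    return out

# Toy usage: four 2048x2048 frames would interlace to 4096x4096.
frames = [np.full((2, 2), v) for v in (1.0, 2.0, 3.0, 4.0)]
print(interlace_2x2(frames).shape)  # (4, 4)
```

The point of the small-step requirement in question 6 is visible here: interlacing simply slots pixels into a finer grid, which is only valid if the pixel scale is effectively constant over the microstep.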