Is there a way to get an optimal binsize a priori? This could be calculated analytically for a Gaussian resolution and a Gaussian background, but not for Milagrito. So he runs a Monte Carlo calculation for the Gaussian case and finds the binsize that gives the smallest probability for a random fluctuation. The result depends on the average background level: for high backgrounds the optimal binsize is smaller, running from 2.5 degrees up. In terms of the detector angular resolution, you want a square bin that is ~2.8 times the resolution on a side, or a round bin whose radius is 1.6 times the angular resolution. The curves are fairly flat near the optimum, but if you don't know the resolution well, you're better off going a little bigger. If you don't know where the source is, you also need to use a larger bin.
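The square-bin optimum can be checked with a minimal sketch in the high-background limit, where significance scales as (contained signal fraction)/(bin side). The Gaussian PSF, the scan range, and the proxy itself are my assumptions, not the actual Monte Carlo from the talk (which also handles the low-background Poisson regime, where the optimum shifts):

```python
import math

def significance_proxy(w_over_sigma):
    """Relative S/sqrt(B) for a square bin of side w centered on a source
    with a 2-D Gaussian PSF of width sigma, over a uniform background:
    contained fraction is erf(w/(2*sqrt(2)*sigma))**2 and B scales as w**2,
    so S/sqrt(B) is proportional to containment / w."""
    x = w_over_sigma
    return math.erf(x / (2.0 * math.sqrt(2.0))) ** 2 / x

# scan square bin sides from 1 to 5 sigma in steps of 0.01 sigma
best = max((w / 100.0 for w in range(100, 500)), key=significance_proxy)
print(best)   # optimum near 2.8 sigma, and the curve is flat around it
```

The scan lands near 2.8 sigma, consistent with the quoted square-bin rule of thumb; the flatness of the curve near the optimum is also visible in the scanned values.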

Then for Milagrito, he did the same procedure using the true (non-Gaussian) resolution function, using deleo/2 for the resolution, for 1/4 million events from the dataset containing the 6188 trigger. The deleo resolution is very strongly non-Gaussian, with a long tail to large angle differences, but this improves for higher nfit. The optimal binsize therefore also depends on the nfit cut, but for higher nfit you lose more data. He takes the nfit cut for each background expectation level that maximizes the signal significance: e.g., for nexp = 1, nfit = 30 and binsize = 2.9 degrees; for nexp = 150, nfit = 40 and binsize = 1.9 degrees.

Bottom line, for burst searches (low nexp) we should be using binsizes of 3 degrees on a side. For Mrk 501, where the nexp can be quite large, the optimal binsize was 0.9 degrees.

Comparing Milagrito to the Gaussian case, we get an optimum binsize that is smaller than that for the equivalent Gaussian (1.25 degree resolution), even though the actual angular resolution is worse.

WaveSearch - an independent search technique for cross-checking, improving sensitivity, and minimizing computer resources. It uses wavelets, analyzing in both direct and Fourier space to get position and shape features. The Mexican hat wavelet (the second derivative of a Gaussian) is optimal for finding peaks and is not affected by linear gradients in the background.

How to use it? Need to parameterize the noise as a function of scale size to define the detection thresholds, and to determine the angular resolution. The noise distribution changes with the scale size and with the variance of the background. Then you need to do a flat-fielding on the skymap, since the Mexican hat can ignore a linear ramp on the background, but is strongly affected by a non-linear background.
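The gradient immunity can be made concrete with a toy 1-D version (the real analysis is 2-D on a skymap; the scale, grid, and amplitudes below are made-up numbers): the Mexican hat has zero mean and is symmetric, so its inner product with a linear ramp vanishes, while a Gaussian bump gives a strong response.

```python
import numpy as np

def mexican_hat(x, a):
    """Mexican hat kernel (second derivative of a Gaussian, up to sign) at scale a."""
    u = (x / a) ** 2
    return (1.0 - u) * np.exp(-u / 2.0)

x = np.arange(-50.0, 51.0)
kernel = mexican_hat(x, a=5.0)

ramp = 0.3 * x + 10.0                              # linear background gradient
peak = 20.0 * np.exp(-(x ** 2) / (2 * 4.0 ** 2))   # Gaussian "source"

resp_ramp = float(np.dot(kernel, ramp))   # wavelet response at the map center
resp_peak = float(np.dot(kernel, peak))
print(resp_ramp, resp_peak)               # ramp response ~0, peak response large
```

A quadratic (non-linear) background term would not cancel this way, which is why the flat-fielding step below is needed.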

Application to Markarian 501. Standard parameters give a source sigma of 3.2 with nfit 40 and a round binsize of 0.9 degrees. Tuning the parameters can improve the sigma to 4.5. The wavelet analysis gets a similar significance, but its source is within 0.1 degree of the true position of Markarian 501, while the standard method finds the source much further from the true position. There is also another weak source to the north of Mrk 501. The Crab is a mess.

Can get upper limits for sources at known positions as well, fairly easily, and this technique does better than the standard one for diffuse or extended sources.

No significant sources were detected in an all-sky search, but three marginal sources were found, including Mrk 501.

Recommend producing daily binned maps online for all analysis projects!

Version 30 is Milagro; the last Milagrito version was 23, so this improvement is called 24. It makes the calibrations more complete, with an improved edge finder, filling in the gaps in the ADC and slewing calibrations, recentering the time distributions, maximum-likelihood filtering, and improved clock-error flagging.

Look at the calibration failures in version 23. There are sidebands, due to individual tubes, in the trailing-edge difference plot that is traditionally used for edge finding. Special cases were put in for these. Account for missing ADC data by using the calibration for surrogate tubes with similar characteristics. For back-terminated tubes, fill in the missing data by fitting to a quadratic curve. These fixes reduce the calibration failures dramatically, by a factor of 3 or so overall, with individual tube failure "spikes" eliminated altogether.

Does it help? Each of these improvements helps deleo, and they are all cumulative. From Monte Carlo, expect a 5% gain in significance from the larger number of calibrated tubes. The result on Mrk 501 is an offset that is larger than before (tipped the experiment?), with similar significance.

Use time sloshing for background subtraction (Cygnus pedigree) since the background is varying with time. A different application of the same method is to collect events in an hour-angle map as well as in a right-ascension map (direct integration).
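A toy sketch of the direct-integration idea (the event counts, flat hour-angle acceptance, and bin sizes below are all assumptions, not Milagrito numbers): within each 2-hour window, pair every event time with every hour angle seen in that window to predict the background distribution in right ascension.

```python
import numpy as np

# Toy background events whose sidereal time t and hour angle h are
# independent, as the method assumes for an isotropic background.
rng = np.random.default_rng(0)
n = 12_000
t = rng.uniform(0.0, 24.0, n)            # sidereal time (hours)
h = rng.uniform(-30.0, 30.0, n)          # hour angle (deg), flat toy acceptance
ra = (15.0 * t - h) % 360.0              # right ascension implied by (t, h)

ra_edges = np.arange(0.0, 361.0, 10.0)
observed, _ = np.histogram(ra, ra_edges)

# Direct integration: in each 2-hour window, combine the window's event
# rate with the window's hour-angle distribution to predict the RA map.
expected = np.zeros(len(ra_edges) - 1)
for lo in np.arange(0.0, 24.0, 2.0):
    sel = (t >= lo) & (t < lo + 2.0)
    if not sel.any():
        continue
    t_win, h_win = t[sel], h[sel]
    ra_pred = (15.0 * t_win[:, None] - h_win[None, :]).ravel() % 360.0
    hist, _ = np.histogram(ra_pred, ra_edges)
    expected += hist / len(h_win)        # normalize to the window's counts

print(observed.sum(), expected.sum())    # both equal the total event count
```

By construction the predicted map has the same total as the data, so a source would show up as a localized excess of `observed` over `expected`.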

Look at 6197 bins (1.5 degree square) of sky for DC sources between declinations of 10 and 60 degrees, with a time sloshing of 2 hours. Nothing there. One 3.8 sigma point, but the probability with 6197 trials is 36%. Look at day-scale structures with the same bins, and there is a 4.8 sigma event; the probability with 6197 x 350 (number of days) trials is 82%. Look for diffuse emission from the galactic plane, with a 5-hour time sloshing. Eliminate events with declination > 70 degrees, because the background there is very hard to estimate (unfortunately this takes a big bite out of low galactic latitude). The galaxy does not show up. Can look at this as a function of galactic longitude, and can set 3 sigma upper limits as a function of galactic longitude. Our points are already below Whipple's one point, and ours will get better.
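The post-trials percentages follow from the one-sided Gaussian tail probability and the number of bins; a quick check, assuming the trials are independent:

```python
import math

def post_trials(sigma, trials):
    """Chance of at least one upward fluctuation of >= sigma (one-sided
    Gaussian tail) somewhere among `trials` independent bins."""
    p = 0.5 * math.erfc(sigma / math.sqrt(2.0))   # pre-trials probability
    return 1.0 - (1.0 - p) ** trials

print(round(post_trials(3.8, 6197), 2))        # 0.36 for the DC search
print(round(post_trials(4.8, 6197 * 350), 2))  # 0.82 for the day-scale search
```

Both quoted numbers come out, so neither point is significant after trials.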

Made 3-d movies to study the interaction between the shower and the pond to conceptually understand what is going on. Our detector is thick, we look at group effects, and timing is critical.

Refraction of the shower front propagating into the water is strikingly evident in the movie, caused by particle generation. How does this affect the reconstruction? The result is a net offset of the reconstructed shower front, which differs between the reconstructions done separately in the two layers. Double-layer angle fitters have to take this into account. Refraction for gamma showers occurs deeper in the pond. Also, at higher energies the refraction is delayed somewhat, because the secondaries produced in the pond are gammas.

Looking at a vertical gamma shower, we see a thin-shelled bowl shape with a bright ring propagating through the pond. The amount of light in the ring compared to the bowl should be energy dependent. The bowl structure is due to multiple scattering, as can be verified by turning it off. Each particle entering the pond produces, through multiple scattering, a circular shell of Cherenkov light. The ensemble produced by a shower front gives a first front of Cherenkov light that propagates at the speed of light in water (the bottom of the bowl), while the ring is a second front, which propagates at the speed of light in water times the cosine of the critical angle. This second front suffers angle-dependent smearing.
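The two front speeds are easy to make concrete; here n = 1.33 for water and the reading of "critical angle" as the Cherenkov angle for beta ~ 1 (so cos = 1/n) are my assumptions:

```python
# Speeds of the two light fronts seen in the movie.
C = 299_792_458.0             # m/s, speed of light in vacuum
n = 1.33                      # refractive index of water (assumed)

v_bowl = C / n                # first front: bottom of the bowl
v_ring = (C / n) * (1.0 / n)  # second front: the ring, slower by cos(theta_c)
print(v_bowl / C, v_ring / C) # ~0.752 and ~0.565 of c
```

So the ring lags the bottom of the bowl by roughly a quarter of the in-water light speed, which is why it is still visible propagating outward well after the shower front has passed.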

The tail in the observed Milagro timing distribution is given by the bowl spreading out.

...

*I missed some stuff here due to a dentist's appointment - all
the talks about repairing Milagro, and Isabel Leonor's talk "What are
the Odds".*

*...*

I came in on this late; heard something about 5 sigma.

How well do we know the position of this burst? Get ~0.4 degrees. Likelihood with number of signal events maximizes between 12 and 23 events. We had 18 events.

Binned Likelihood - easier to compute, but not as sensitive. Not done yet.

Several million simulations at zenith angles 21-23 degrees, from 10 GeV to 50 TeV, at discrete energies. Assume the burst has a particular spectrum and cutoff energy and see what you get, in both triggers and scaler rate. The absence of a signal in the scaler results can exclude a burst with a cutoff below 200 GeV, or with a spectrum softer than an index of 3.5.

Fluence can be calculated as a function of cutoff; it gets high for low cutoffs, in the range of 10^{-4}, while the BATSE fluence was 10^{-7}. This is clearly a TeV-dominated burst, and it must therefore be quite close. The total energy in the burst is small. There are a couple of galaxies near the error circle, one with a known redshift of a few percent.

Low-threshold scaler channels (60 of them) are read out once per second. The high threshold is formed from an "or" of 16 signals. Look at 1000 seconds around the trigger time, with 4 channels taken out because of light leaks, leaving 56, which are then combined. The histogram of low-threshold rates is 3.5 times as wide as the square root of the mean of the distribution. No scaler signal is seen at the time of the GRB.

Suggest opening the binsize from 1.6 degrees to 2.0 degrees to account for the position uncertainty. There are 3 possible flares during the burst: T_{0}, T_{0} + 32 s, and T_{0} + 782 s. The most significant is the first, at a log probability close to -8; the second and third are close to -4. The latter two flares become more significant if the binsize is opened up.

Calculation of the final probability: randomly draw 2195 events from the background distribution, and find out how often you get a log probability less than -8. But does this allow the background level to fluctuate, or the event rate to change over short timescales? Out of these 465,000 thrown events, 13 are this probable or lower, so the final probability using the binned analysis is 10^{-3}. Using maximum likelihood instead, the trials factor is much smaller, and therefore the final probability is more like 10^{-5}.
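The thrown-event procedure can be sketched as a scaled-down toy. The 2195 events per throw is the number from the talk; the bin count, the number of pseudo-experiments, the loose -4 threshold (instead of -8, so the toy finishes quickly), and the per-bin Poisson approximation are all illustrative assumptions:

```python
import math
import random

random.seed(42)

N_EVENTS = 2195     # events drawn per pseudo-experiment (from the talk)
N_BINS = 100        # illustrative number of search bins (assumed)
N_EXPTS = 2000      # illustrative pseudo-experiments (the talk used 465,000)
THRESHOLD = -4.0    # illustrative log10-probability cut (the talk used -8)

def poisson_sf(k, mu):
    """P(N >= k) for a Poisson mean mu, by summing the lower tail."""
    term, cdf = math.exp(-mu), 0.0
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)
    return max(1.0 - cdf, 1e-300)

mu = N_EVENTS / N_BINS
hits = 0
for _ in range(N_EXPTS):
    counts = [0] * N_BINS
    for _ in range(N_EVENTS):
        counts[random.randrange(N_BINS)] += 1
    # per-bin probability of the most significant bin in this throw
    if math.log10(poisson_sf(max(counts), mu)) <= THRESHOLD:
        hits += 1

frac = hits / N_EXPTS   # how often background alone looks this improbable
print(frac)
```

The fraction of throws beating the threshold is the trials-corrected probability; note that this toy, like the procedure described above, holds the background level fixed between throws.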

Determine the proton spectrum, compare the Milagrito results to the worldwide neutron monitor network, and search for the highest-energy particles: does a cutoff exist? What about large-angle muons? What energies are we looking at?

Best fit spectral index is ~ 6.9 with indeterminate errors; worldwide neutron monitor network sees 5.5 at lower energies. Suggests a cutoff or rollover in the spectrum between 10 and 880 GeV. This uses the 100-pmt trigger, but only as an upper limit.

Do we really see protons > 10 GeV? We certainly see > 4 GeV because of the geomagnetic rigidity cutoff. Large-angle muons also imply high energies. If the angle is 83 degrees, a muon loses 11 GeV going through the atmosphere, so the protons have to be at least that energetic (they also lose 10-20 GeV in triggering 100 pmts), probably 30-40 GeV. During the event rise, 77% of events are fit and 23% are not fit, which could come from high-angle muons. During the event decline, the not-fit contribution rises to 46%. The not-fit events kick in at about the time the patch 7 bump occurs.

There are a lot of measurements of this event. Several papers in
the ACE special edition of GRL have already dealt with the Nov.
6^{th} 1997 event, and we should have our results out there
as well.

There were two big events that week, on the 4^{th} and the
6^{th}. A shock traverses the earth on the 6^{th},
not at the same time as our event, which complicates things a little.
ACE sees heavy ions (e.g. O, Fe) increasing around the same time as
we see our SEP event. They see these heavy ions increase also on the
Nov 4^{th} event. For the Nov 6^{th} event, there is
a very large enhancement in the ^{3}He/^{4}He ratio,
indicative of an acceleration process that occurs very deep in the
solar corona (~1000 km above the photosphere?). Such a large
enhancement is unusual for large events like this one was. The
ionization state is also higher for high energy particles. Fe has
charge states up to 18-20, indicating it comes from a temperature of
10^{7} K or more at the highest energies. So there is a large
high-Z contribution from this event at high energies, that may be
affecting the Milagro results - this needs to be examined.

Two branches in the calibration plot present problems for the fitting routine. What's going on? All phototubes see this. There seems to be a discontinuity around filter wheel position 21, which is repeatable, and has a magnitude of around 4 nanoseconds in the sense that the higher light intensity is recorded late. Is this something happening in the phototubes? Or in the triggering logic in the laser shack?

The N_{2} gas does not meet purity requirements. The laser
is dead, it is not lasing. Possibly an electrical problem? Need to
repair it, need a laser expert!! This is the second time we have said
this, but nothing has been done. It's not even clear, however, that a
laser expert could do anything about it because of the existing
safety plans, which do not allow anyone to open the box.

Title is "Observation of TeV Emission from Mrk501 with a Water Cherenkov Detector". Author list? Editorial Board recommends putting the technicians on. Time sloshing done 30 times instead of 10 times.

Cull some figures, get another draft in a couple of weeks, should submit before the ICRC.

Submit to Nature or ApJ Lett describing just the 970417 event. A very rough draft exists, with the sky map, the spectrum plot and the light curve, but is not yet a first draft.

Dave is skeptical, though many of his concerns have been alleviated. This partly reminds him of the Crystal Ball upsilon-minus particle in 1984, which got a 5 sigma result that was later found to be bogus. The point is, for the burst event, moving to version 24 diminishes the significance of the signal. If you think you have improved the analysis and a result goes away, that's a strong sign that the result may be bogus.

The discussion wound around to the position that we should prepare a paper for ApJ Letters with a circumspect tone, "Evidence For ...", giving the 6188 result and how it was arrived at, and mentioning the 53 bursts for which no emission was detected.