Nice work. Most of the logic behind it makes sense. My first reaction was that a reported excess of weekend snowfall is a smoking gun, since there is no meteorological reason for it. But it is important to check against outside data, because over just a couple of seasons it can happen by chance. We can all probably remember seasons when the storms were nicely timed for our weekend trips, and other years when all the big dumps seemed to land on Monday or Tuesday.
There are places like Mammoth where it is NOT good for business for the storms to hit on the weekends. In 1982-83, when it snowed every weekend from Jan. 15 to May 15, skier visits were much reduced from 1981-82, which had three massive storm cycles in November, early January, and late March/early April but was sunny most of the weekends. Both seasons were in full operation by Thanksgiving and open into July.
Somewhat interesting, since Mammoth was the sample area at the end of that paper, used to illustrate the new iPhone reports on SkiReport.com. About a year ago a suspicious ongoing upward trend in Mammoth's snow reporting was finally explained. The Mammoth historical data was corrected to use patrol plot data rather than the sum of marketing ski reports. The marketing reports over the past 10-15 years had run higher than in earlier years, while the corrected patrol reports were well in line with the prior data. The excesses were inconsistent, of course: the bad years matched, but the big years 2004-05 and 2005-06 broke the records from 1983 even though the base depths by observation weren't quite as much. With the patrol corrections, 1982-83 has been restored to its rightful record status.
The marketing excesses were traced by some of the locals to the biggest storms. I had caught a couple of these myself but did not know it happened often enough to affect season totals. A good example was the New Year's 2006 storm, during which I was stranded in Bishop between ski days. The storm lasted 31 hours and was reported at the time as 95 inches, but patrol data shows 70.

The reason for these discrepancies, touched upon in the Dartmouth paper, is that marketing departments can update more than once a day, even though they usually do not. If a past-24-hour total is reported more than once a day, it can sometimes result in double counting that gets picked up by website aggregators like OnTheSnow.com and SkiReport.com. The Dartmouth people did not think this effect was significant in their analysis, though in the specific case of Mammoth we now know it can be. I believe the Mammoth errors were occasional and innocent, and they now use the patrol data for monthly historical totals. It turns out I had downloaded patrol daily records from 1984-2006, and they checked out exactly with the new revisions.
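To make the double-counting mechanism concrete, here is a minimal sketch with entirely made-up numbers (a hypothetical 48-hour storm at 1 inch per hour; no real Mammoth or aggregator data). An aggregator that simply sums every posted "past 24 hours" figure will count the overlapping windows twice whenever the resort posts more than once a day:

```python
# Hypothetical storm: 48 hours at 1 inch/hour, so the true storm total is 48 in.
hourly = [1] * 48

# Marketing posts a "snow in the past 24 hours" figure every 12 hours.
posting_times = [12, 24, 36, 48]

# Each posting sums the trailing 24-hour window (clamped at storm start).
reports = [sum(hourly[max(0, t - 24):t]) for t in posting_times]
# reports -> [12, 24, 24, 24]

true_total = sum(hourly)   # 48 inches
naive_sum = sum(reports)   # 84 inches: overlapping windows counted twice

print(true_total, naive_sum)
```

With once-a-day postings the windows don't overlap and the sum comes out right; it is only the extra intraday postings of a rolling 24-hour total that inflate a naively aggregated season figure.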