This is completely observational and not meant as a serious critique, but I think it raises an interesting point. One normalizing stat we have all become comfortable with, rightly or wrongly, is OPS+. It makes a lot of sense to adjust stats to account for the competition and for the effect a home park has on the creation and prevention of runs.

One thing Cubs fans are generally aware of is that Wrigley's park effect usually depends on the wind. Sure, a little bit of foul territory is now seats as compared with the past, and there might be a slightly different height to the grass, but conventional wisdom is that the real effect of Wrigley is whether the wind turns HRs into cans o' corn, or Russell Branyan pop-ups into basket shots. Those seem to me to be the three primary factors that would go into most parks' effects, with some parks having a different #1 factor than wind (e.g. humidity in Arizona and Colorado, the marine layer in California).

We have just seen one of the flukiest years in Wrigley history. As usual, the wind almost always blew in during April and May, but on those hot and humid days when we are all accustomed to gales blowing out to left or center, the wind has blown in, for whatever reason. I'm not sure there are any 'wind in/wind out' sources to back this up; it just seems that way. Perhaps Truff has a comment on whether my observations match up with actual weather patterns.

So if the wind has been blowing in at a higher rate this year than in years past, how on earth is Wrigley's park factor favoring offense? Shouldn't the park-effect stat jibe with the phenomenon that is the primary factor determining park effect? And if it doesn't, shouldn't we question any stat that normalizes based on park factors determined purely by statistical analysis, since there may be sample-size issues or some variable missing from the equation? Or is it just a case of observation and conventional wisdom being incorrect?
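For what it's worth, here is a minimal sketch of the basic runs-based park factor that stats like OPS+ lean on, just to make the sample-size worry concrete. The function name and the season totals are made up for illustration (not actual Cubs numbers); real implementations also regress toward the mean and average over multiple seasons precisely because one year of wind can swing the ratio.

```python
def park_factor(home_rs, home_ra, home_games, road_rs, road_ra, road_games):
    """Basic one-year, runs-based park factor.

    Compares total run scoring (both teams) per game at home vs. on the
    road for the same club. Values above 1.0 suggest the park favors
    offense; below 1.0, pitching.
    """
    home_rate = (home_rs + home_ra) / home_games
    road_rate = (road_rs + road_ra) / road_games
    return home_rate / road_rate

# Hypothetical totals for one 162-game season, purely for illustration:
pf = park_factor(home_rs=420, home_ra=400, home_games=81,
                 road_rs=360, road_ra=380, road_games=81)
print(round(pf, 3))  # prints 1.108
```

Note that a single season is only 81 home games, so a stretch of unusual weather (like a summer of wind blowing in) can move this number meaningfully, which is exactly why one-year factors are treated with suspicion and multi-year averages are preferred.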