There are plenty of sentences about Jeff Locke that could begin with the phrase "It's no secret that." It's no secret that Jeff Locke was going to have to come back to earth from the ridiculous numbers he put up before the All-Star break, when he was 8-2 with a 2.15 ERA. It's no secret that Locke has been pretty terrible since the break, lasting just 30 2/3 innings over his six starts and putting up an obscene 5.52 ERA. What's interesting about Locke lies in between those statements: just how good is he? Why has his big league success looked so different from his minor league success? Can what's happened to him since the break be explained as regression, or is something else wrong with him?
The word "regression" gets bandied about a lot in baseball discussions, but I feel like it's worth really discussing before we get into what's happening with Locke. Everyone's favorite regression scenario is that of a coin flip. Each time you flip a coin, there's a 50% chance it will land heads up and a 50% chance it will land tails up. This doesn't mean that your coin tosses will alternate between heads and tails, it means that the more times you flip your coin, the better the odds are that you'll have a 50/50 split of your two outcomes. Let's imagine that you flip a coin ten times and it lands heads up all ten times. The concept of regression doesn't mean that the next ten tosses will be tails, and it doesn't mean that the next 90 tosses will go 50/40 in favor of tails to even the first ten heads tosses out. It means that every coin toss still carries the same 50/50 odds as the first ten, and that over time those strange first ten results will blend into the background. 10 heads and 0 tails looks remarkable, but 1,005 heads and 995 tails does not.
The coin toss is a bit of an oversimplification, though. A standard linear regression models the relationship between a dependent variable and an independent variable.
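For the mechanically curious, here's what that looks like in practice, as a minimal Python sketch; the velocity and strikeout-rate numbers are invented for illustration and are not Locke's actual data:

```python
import numpy as np

# Hypothetical data: fastball velocity (mph) as the independent
# variable, strikeout rate as the dependent variable.
velocity = np.array([89.0, 90.5, 91.2, 92.8, 93.5, 94.1])
k_rate   = np.array([0.15, 0.17, 0.18, 0.21, 0.22, 0.24])

# Ordinary least squares fit: k_rate ~ slope * velocity + intercept
slope, intercept = np.polyfit(velocity, k_rate, deg=1)
print(f"slope={slope:.4f}, intercept={intercept:.4f}")

# The fitted line predicts the dependent variable from the independent one.
print(f"predicted K-rate at 92 mph: {slope * 92 + intercept:.3f}")
```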