Inspiration Information: Analytics, What’s in it for the Players?

[youtube http://www.youtube.com/watch?v=0xLXmdnL1P8&w=420&h=315]

An interesting sub-plot of the recent public analytics buy-in by NHL teams has revolved around the following series of questions: What about the players? What do they gain from analytics? Isn’t it more for GMs constructing their rosters and trying to take advantage of market inefficiencies, and for coaches trying to develop and put on-ice systems into practice? Won’t players get confused and/or take the wrong lesson from an education in analytics?

This line of thought, especially in Oilers’ circles, stems from a pair of player interviews.

Back in March, Ryan Rishaug of TSN sat down with Taylor Hall and had a chat about analytics [because tsn.ca is a hateful mess of an online resource, this video, which should be found here, is no longer available. Fortunately, this Tumblr post transcribed the crucial bits]. Here’s what Hall had to say,

The thing for a hockey player, if you’re an advanced stat guy and you’re describing them to a hockey player, you have to have some kind of end point, what does he have to do better to get this stat better. That’s the thing that I’m lost on, with Corsi and Fenwick and all this stuff, how do you improve a player, what do you tell him?… I know we have an advanced stat guy on our team… I’ve asked him, ‘So why is my Corsi not as good [this season]?’, and he didn’t really have an answer for me.

We often hear analytics referred to as a “tool.” And here, I think, you can see how players approach the matter. If they are listening and invested in the conversation (as Hall is here with his “advanced stat guy”), they want a clear idea of how the tool works and how to use it to get better. It’s clear that in Hall’s case he didn’t get much more than noise from his analytics guy.
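
For anyone who, like Hall, has been handed these terms without much explanation, here is a minimal sketch of what the numbers actually are. The definitions are the standard ones (Corsi counts every shot attempt, Fenwick drops blocked shots, both usually tallied at even strength); the game numbers and function names below are invented purely for illustration and are not any team’s actual tooling.

```python
# Minimal sketch of the on-ice shot-attempt metrics Hall mentions.
# Numbers are invented for illustration.

def corsi_for_pct(attempts_for, attempts_against):
    """Corsi For %: share of all shot attempts (goals, shots on goal, misses,
    blocked shots) taken by a player's team while he is on the ice."""
    return 100.0 * attempts_for / (attempts_for + attempts_against)

def fenwick_for_pct(unblocked_for, unblocked_against):
    """Fenwick For %: the same share, counting only unblocked attempts."""
    return 100.0 * unblocked_for / (unblocked_for + unblocked_against)

# Hypothetical night: 18 attempts for, 14 against with our player on the ice.
print(corsi_for_pct(18, 14))    # 56.25 -- above 50 means his team is out-attempting the opposition
print(fenwick_for_pct(13, 11))  # roughly 54.2
```

Nothing here answers Hall’s real question, which is what he should do differently, but it at least makes the stat legible: the number moves only when the attempt counts move.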

As it happens, Hall’s buddy, Jordan Eberle, recently offered his thoughts on the same matter:

[Question] The Oilers hired Tyler Dellow, a very well-known blogger and analytics guy. Dallas Eakins seem very open to using this as a tool. I’m wondering how much of that filters down to the players in terms of things you are told and things that might come from that universe that might affect some of the details of your game.

[Eberle] Well, even last year we were on top of it. We were getting chance sheets after games to see, you know, when you were on the ice how much was produced and how much was produced against you. So, obviously it’s a big part of the game now. If you look at the top teams that win every year, I believe their Corsi is over 52 or whatever it is. So, it’s starting to creep into the game. There’s obviously a lot of positivity and a lot of use for it. But, I think if you start thinking on the ice, you know, you’re shooting the puck and you’re thinking your Corsi is going up, I think that is where you have issues. But, there is definitely a lot of tools after the game you can use it for.

Here we get another wrinkle in the player’s perspective. Not only could analytics simply confuse a player with contextless noise, it might also do harm to a player’s game. If a player is told his Corsi is poor and that this is a problem, he may well interpret this message in the way Eberle suggests above:

I think if you start thinking on the ice, you know, you’re shooting the puck and you’re thinking your Corsi is going up, I think that is where you have issues.

There are a couple of issues here.

1) Players (and their agents) are savvy enough to recognize there is a direct correlation between their stats (traditional boxcars) and both their playing opportunities (linemates, assignments, power play, etc.) and their take-home pay. If “advanced stats” continue to become mainstream (say, in contract negotiations and arbitration hearings), it does seem possible some players will misinterpret the information and seek to run up the score in shot differential.

2) Just as with traditional boxcars, in an ideal situation you don’t want a player thinking about abstractions on the ice, i.e., you don’t want him “thinking” at all, let alone trying to mentally calculate shot metrics on the fly. Not only would such mental effort pull a player out of the “team game” (through a concern for his own stat sheet), but it would simply pull him out of the immediacy of the game itself.

Ideally, a player will internalize the situation-dependent systems put in place by his coaching staff. And, ideally, those systems will be informed by mutually reinforcing data points (shot metrics, zone entries/exits, match-ups, zone starts, etc.). However, the intricacies of how and why a given strategy (and the complex thought that subtends it) is put in place are largely inessential information for the practitioners in the immediacy of the game.

Let me clarify with an example ripped from Heidegger’s Being and Time (§§15-16). Heidegger makes a distinction between the “handiness” of practical behavior related to tools, like a hammer, and a “theoretical” understanding of those tools and associated behaviors. The upshot is that a tool, like a hammer, loses its handiness when looked at theoretically. In the midst of hammering, one isn’t thinking about the hammer, the nail, the house being built, the family one is trying to shelter, etc. In fact, thinking about all those things is likely to distract you from the work of hammering at hand. It is only when the tool becomes conspicuous, as when it is broken, that the broader world of the tool (its associations and ultimately its purpose in caring for the self) appears as something to think about.

In hockey, the idea is roughly the same. The moment a player steps out of the actual, immediate process of playing the game and views it theoretically, he is bound to perform poorly.

Clarifying the Arguments

1. The Learning Process

Recently, Jonathan Willis picked up on Eberle’s quote and added some insight he received on the matter from a hockey coach,

Talking informally to a high-level coach this summer, he mentioned that while he thought stats had an important place in the game, he didn’t think they were all that valuable to players and that it was up to the coaches to draw the right conclusions and distill them to their charges. Eberle’s comment about Corsi (“I think if you start thinking on the ice, you’re shooting the puck and you think your Corsi’s going up that’s where you have issues”) reinforces for me that raw numbers aren’t necessarily going to help a player with his game.

I think this is basically right. A post-game, contextless spreadsheet of numbers isn’t going to help a player much. At best, such a sheet would be a matter of passing interest on its way to padding the recycling bin. At worst, such a sheet would encourage bad habits (abandoning systems-play in order to rack up numbers) or over-thinking in on-ice situations.

However, off the ice, with the time available to contextualize this information, I can easily imagine a reasonable case being made that analytics does help players.

Primarily, I am thinking of a learning-reinforcement mechanism. If I am learning a complicated series of actions, like a hockey system, it is going to be vital that I participate in (that is, actively learn) those actions, i.e., actually go through the motions of the system in practice settings. However, humans aren’t simply machines (apologies to La Mettrie). As Taylor Hall suggests above, players have an interest in knowing why they are being asked to perform certain tasks. Athletes are hardly alone in this impulse. As a species we tend to balk at the idea that we simply “do what we are told.” We crave (and not out of mere insolence) rational grounds for our actions (which is different from saying humans always act rationally).

A simple introduction to the principles and uses of various analytics could easily be paired with an in-depth systems education. This kind of information (say, in reference to the importance of entering the offensive zone with possession instead of dumping the puck in and chasing after it), i.e., concepts derived from empirical findings, will give players a compelling reason to both learn and apply systems.

This is not to say players should form a study group or be given homework or anything remotely taxing. This is simply a matter of introducing players to the concepts and tools newly available. Moreover, it’s about giving them a sense of how these concepts and tools inform the systems they are being asked to put into practice.
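
To make that concrete, the kind of empirically grounded talking point I have in mind could be as simple as the tally below. The entry counts are invented and the carry-in versus dump-and-chase split is just the familiar zone-entry framing; the point is the shape of the argument a coach could show a player, not real data.

```python
# Hypothetical single-game zone-entry tally (invented numbers) illustrating
# why a staff might preach controlled entries over dump-and-chase.
entries = {
    "carry-in":       {"entries": 11, "shot_attempts": 7},
    "dump-and-chase": {"entries": 14, "shot_attempts": 4},
}

for kind, tally in entries.items():
    rate = tally["shot_attempts"] / tally["entries"]
    print(f"{kind}: {rate:.2f} shot attempts per entry")
```

A player shown that kind of breakdown alongside the system he is being asked to run has a reason, not just an instruction.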

2. The Psychology of Control

I’ve written about “luck” before. The crux of my argument: we tend to believe too strongly in the fallacy of subjective control over ourselves, others, and the environment, which leads us to dismiss the role luck plays in events.

Recently, however, a Twitter conversation between me and David Staples (in reference to Jason Strudwick and this post of his declaring the non-existence of “puck luck”) got me thinking about how analytics, especially insofar as it accounts for luck, can help players.

[Screenshot: Twitter exchange between the author and David Staples, September 3, 2014]

The argument Jason Strudwick is making is straightforward: there is no “puck luck”; players are responsible for their performance, good or bad.

David Staples is making a more nuanced argument. Whether puck luck exists or not, it is important for players, in games, to believe it does not. The idea is that a player will slacken his performance if he feels he is not in direct control of his actions and their results. This is a variation on the argument I made above concerning a player “thinking” during the game.

However, in this case, I think analytics regarding luck can have a psychological benefit both on and off the ice.

A player repeatedly faced with poor luck, but operating under the assumption that he is entirely in control of his situation, is very likely to do one of two things: 1) lose his competitive fire; 2) overcompensate for the poor results and break out of systems play. These outcomes seem at least as likely as the scenario in which introducing luck to a player depresses his competitive drive.

Let’s look at Tyler Seguin’s much-discussed 2012-2013 post-season. From the Strudwick perspective, Seguin simply lacked the competitive edge to get the job done. This is a narrative.

What really happened was that Seguin was second on his team in shots (70 in 21 games, one behind Bergeron) and, though a career 11.2% shooter through four NHL seasons, shot a lousy 1.4%. His team posted a 53.9% Fenwick For percentage in close-score situations with him on the ice.

If I’m Seguin’s coach, the last thing I want to do in this situation is tell him he’s failing. He’s not. He’s run up against a short stretch of bad luck over a small sample, and that will be ironed out in the long run.
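
To give a rough sense of how noisy a 70-shot, 21-game sample is, here is a crude simulation. It assumes, purely for illustration, that Seguin’s “true” conversion rate is his career 11.2% and that shots are independent coin flips, which is obviously a simplification; the point is the spread of outcomes, not a precise probability.

```python
# Crude sketch: how much does shooting percentage swing across 70-shot samples
# if a player's "true" conversion rate is a steady 11.2%? (Assumption for
# illustration only; real shots are not independent coin flips.)
import random

random.seed(1)
TRUE_SH_PCT = 0.112
SHOTS = 70
RUNS = 10_000

results = sorted(
    100.0 * sum(random.random() < TRUE_SH_PCT for _ in range(SHOTS)) / SHOTS
    for _ in range(RUNS)
)

print("median 70-shot shooting %:", round(results[RUNS // 2], 1))
print("middle 80% of runs fall between",
      round(results[RUNS // 10], 1), "and", round(results[9 * RUNS // 10], 1))
```

Even in this toy model the 70-shot percentages scatter several points either side of 11.2%; a run as cold as 1.4% is a rare draw, but the spread alone is a caution against reading 21 playoff games as a verdict on a player’s competitiveness.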

By introducing luck to players, I would argue, one ensures that they won’t get too high or too low about results outside their control. It will keep them focused and determined to perform better in the areas they can control and allow them to roll with the punches of luck-induced swings.

This, by the way, is exactly how Dallas Eakins believes players ought to be treated regarding luck. It’s exactly how he handled Yakupov last year.