I like the looks of that graph.
But seriously: it doesn't surprise me that much. The 'free to play' model contradicts TANSTAAFL (There Ain't No Such Thing As A Free Lunch). Developers (understandably) want to get paid for the work they do, so they want your money. Customers, on the other hand, either don't want to or aren't allowed to spend real money on things*. If anything, it's more a matter of at what point you call the people playing your game "gamers".
Think of it this way: AAA games have shareware versions, demos, gameplay videos, reviews, and so on. Are you considered a gamer if you play, watch or read any of those? No. With social games, it's much easier to just install and play them. Assuming they even require an installation.
If it were up to me, I'd use a simple definition of "gamer": did they pay anything for it?
I know it's not the best definition ever (it would make it pretty hard to be a "gamer" if you only play freeware and cracked games), but in this case, it would give a totally different view of this news.
While I agree with the OP that social game developers need to hold on to their player base... my disagreement is with what that player base IS. You really don't want to cater to people who aren't going to pay you no matter what, so just leave them out of the equation. Focusing on your customers means exactly that: those who only check you out for a free ride will be gone the moment they find something else that's free or easily hacked.
*You need to be 18 to be eligible for a credit card, and Google Wallet has no way to add money to it without one.