Monday, February 3, 2014

Moral Imperative of Big Data and Analytics



Big Data and Analytics are about discrimination or, in business parlance, segmentation. The idea with any marketing effort is to slice a universe of customers into discrete groupings, thereby allowing salespeople to differentiate and optimize their efforts accordingly. A rough sketch of what that slicing looks like in practice follows below.
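To make the idea concrete, here is a minimal sketch of segmentation, not any particular vendor's method: it clusters customers into discrete groups with k-means, using hypothetical recency, frequency, and monetary-value features. The field names and sample numbers are illustrative assumptions only.

```python
# Minimal segmentation sketch: slice a customer universe into discrete groups.
# The features (recency, frequency, monetary value) and the sample data
# are hypothetical assumptions, purely for illustration.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one customer:
# [days since last purchase, purchases per year, average order value]
customers = np.array([
    [5,  24, 120.0],
    [7,  18,  95.0],
    [60,  2,  30.0],
    [75,  1,  22.0],
    [30,  6,  55.0],
    [28,  8,  60.0],
])

# Cluster the customers into three discrete groupings (segments).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
segment_labels = kmeans.fit_predict(customers)

for row, segment in zip(customers, segment_labels):
    print(f"customer {row} -> segment {segment}")
```

A marketing team would then tailor offers and service levels per segment, which is exactly where the moral question in this post comes in.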

But isn't discrimination the scourge of civilized communities?  

In Big Data's Dangerous New Era of Discrimination, Michael Schrage aptly raises a moral question about what a purely analytic approach may do, and he offers several examples of fine-tooth-comb, data-driven segmentation.

It's an irony for social activists, I think. The stuff of discrimination goes like this: You may have had some experience, positive or negative, with a handful of people, and you generalize that experience into a conclusion about the particular group to which this handful belongs. Going forward, any other person you meet who belongs to this group immediately triggers your conclusions about what they are like and what they are not.

The essence of prejudice is making a prejudgment about people well before you've met them. Discrimination is the differential treatment, response, or inclusion of certain people based on that prejudgment. Either way, it is a flaw in inductive reasoning, the hallmark of which is unsystematic thinking and scant supporting evidence.

The irony for social activists is that companies have data to back up what may frankly end up being discriminatory business practices. For better or for worse, gay men may get preferentially better customer service, while African American women get the opposite, based on how profitable their respective segments actually are.

I see a missing piece in Schrage's otherwise very compelling argument. The technology behind Big Data and Analytics can often pinpoint details about particular individuals, that is, current or prospective customers. Depending on such details, a company may, and arguably should, serve a particular African American woman and sidestep a particular gay man, reversing the segment-level conclusion. In other words, technology can do a segmentation of one, even in real time, as, say, you navigate an e-commerce site.
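As a contrast to segment-level treatment, here is a hedged sketch of a "segmentation of one": scoring an individual visitor in real time from her own behavior rather than from her group membership. The feature names, weights, and threshold are illustrative assumptions, not a real e-commerce system.

```python
# "Segmentation of one" sketch: score each individual visitor in real time
# from their own session behavior, rather than applying a segment average.
# All feature names, weights, and the threshold below are illustrative assumptions.

def individual_score(session: dict) -> float:
    """Crude purchase-propensity score built from one visitor's own session data."""
    weights = {
        "pages_viewed": 0.05,
        "items_in_cart": 0.40,
        "past_purchases": 0.30,
        "minutes_on_site": 0.02,
    }
    return sum(weights[k] * session.get(k, 0) for k in weights)

def treatment(session: dict, threshold: float = 1.0) -> str:
    """Decide the service level from the individual's score, not the segment's."""
    return "priority service" if individual_score(session) >= threshold else "standard service"

# Two hypothetical visitors: their individual signals, not group membership, drive the decision.
visitor_a = {"pages_viewed": 12, "items_in_cart": 2, "past_purchases": 1, "minutes_on_site": 9}
visitor_b = {"pages_viewed": 3, "items_in_cart": 0, "past_purchases": 0, "minutes_on_site": 2}

print(treatment(visitor_a))  # -> priority service (score 1.88)
print(treatment(visitor_b))  # -> standard service (score 0.19)
```

The design point is simply that the decision rule operates on the individual's data, so two people from the same demographic segment can, and often will, receive different treatment.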

Nevertheless, Schrage duly prompts companies to reflect on their practices. How they navigate between business priorities, on the one hand, and a moral imperative, on the other, goes beyond Big Data and Analytics. Sure, they can formulate algorithms based on a more individualized segmentation, but how much will they account for the fact that people have needs and deserve respect?

Even if a particular individual belongs to a segment deemed unprofitable by Big Data and Analytics, should he or she be served anyway?

Thank you for reading, and let me know what you think!

Ron Villejo, PhD
