The Dangers of a Data-Driven World

David Sirota

If the recent political era has taught us anything, it is the enduring truth of George Santayana's aphorism about memory and repetition. Whether once again watching tax cuts fail to deliver a promised economic boost or witnessing more wars fail to deliver stability, we are reminded that "those who cannot remember the past are condemned to repeat it."

But then, as much as those haunting words are meant as a warning, technology today is coding Santayana's principle into society's operating system, as if mimicking history is an admirable objective. Indeed, whether it's movie studios, record companies, government intelligence agencies or corporate human resources departments, algorithms that use the past to predict — and create — the future are making more and more decisions.

For those employed in creative endeavors, it's comforting to believe that technology's use in the information economy begins and ends with the kind of straightforward processes (data entry, dictation, etc.) that require little cognitive analysis and even less artistic thinking. Yet, as Christopher Steiner shows in his mind-blowing new book "Automate This," algorithms taking into account past commercial successes are being deployed by the film and music industries to choose which movie and album proposals will be produced. What's more, an increasing number of the algorithms' selections have proven profitable.

Steiner also documents the Central Intelligence Agency's seeming preparation for a real-life version of the WOPR from the 1980s flick "War Games." Through grants to New York University and the Hoover Institution, the agency is trying to algorithmically quantify the history of past political and military decisions for the purpose of predicting — and perhaps eventually shaping — future events.

Then there is the realm of employment decisions. In the past, the job interview reigned supreme precisely because it was the arena where an employer could personally assess the key skills that cannot be documented on a CV.

Now, though, the Wall Street Journal reports, "For more and more companies, the hiring boss is an algorithm." Using performance data from past employees, these algorithms rank prospective hires on quantifiable data alone, wholly ignoring intangibles.
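To make the pattern concrete, here is a minimal, hypothetical sketch in Python of the kind of screening logic being described. It reproduces no real vendor's system; the feature names, weights and candidates are invented purely for illustration.

    # Hypothetical weights standing in for patterns mined from past employees'
    # performance records (invented numbers, for illustration only).
    PAST_HIRE_WEIGHTS = {
        "years_experience": 0.4,
        "tenure_at_last_job": 0.3,
        "test_score": 0.3,
    }

    def score(candidate):
        """Score a candidate on measurable features only; intangibles never enter."""
        return sum(weight * candidate.get(feature, 0.0)
                   for feature, weight in PAST_HIRE_WEIGHTS.items())

    candidates = [
        {"name": "resembles past hires", "years_experience": 10,
         "tenure_at_last_job": 6, "test_score": 70},
        {"name": "unconventional talent", "years_experience": 2,
         "tenure_at_last_job": 1, "test_score": 55},
    ]

    # Whoever looks most like yesterday's workforce always tops the ranking.
    for c in sorted(candidates, key=score, reverse=True):
        print(f"{c['name']}: {score(c):.1f}")

Because every input is a measurable trait of past hires, the candidate who least resembles yesterday's workforce can never win the ranking, whatever unmeasured strengths they bring.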

The upsides of this brave new world are obvious: institutions can replace human decision makers with machines, saving money and, in theory, getting the same or even better results when the algorithm is properly tuned.

That said, there are big downsides embodied in the difference between the theoretical and the actual.

Sure, pop culture industries may be able to use algorithms to produce more reliably lucrative films and music. But with those algorithms based on past successes, aren't they effectively making it less likely the industry will invest in a new wave of genre-busting and paradigm-shifting talents?

Certainly, CIA algorithms may be able to make predictions based on past events. But with the accelerating pace of global change, isn't there a big risk that such predictions will miss never-before-seen factors that change the whole geopolitical game?

No doubt, employers' algorithms about past workforces may help them replicate the same workforce they've had for years. But won't that result in passing over valuable out-of-the-box thinkers whose talents don't fit within an equation?

The answer to all these questions is a resounding "yes." That's because, as much as technology triumphalists and data utopians want us to believe otherwise, we live in a world of both the quantifiable and the incalculable. And no matter what you call the latter, that which cannot be measured remains a factor in all human endeavors.

Pretending it doesn't runs the risk of fulfilling Santayana's warning and, in the process, manufacturing an ever more homogeneous, destructive and unfair world.

David Sirota is a best-selling author of the new book "Back to Our Future: How the 1980s Explain the World We Live In Now." He co-hosts "The Rundown" on AM630 KHOW in Colorado. E-mail him at ds@davidsirota.com, follow him on Twitter @davidsirota or visit his website at www.davidsirota.com.

COPYRIGHT 2012 CREATORS.COM



Comments

A very interesting topic. There are many other dangers that technology brings about. It would be nice if the people in charge would learn from past mistakes.
Posted by: Chris McCoy, Fri Nov 30, 2012 9:12 AM