If Silicon Valley gets its way, privacy is the last thing you’ll be worried about

Ned Ryun
Fox News

With unending revelations of privacy-related data abuses by Silicon Valley’s largest and farthest-reaching companies – even Apple CEO Tim Cook is calling for reform – we must acknowledge there is far more at stake than individual privacy and data. Big Tech’s immoral and unethical behavior regarding your privacy will seem a mere annoyance compared to its ungoverned and reckless forays into artificial intelligence.

The troubling reality is that there is a deeply disconcerting lack of transparency and accountability as extremely wealthy, powerful, self-interested people – looking no further than adding many more billions of dollars to their personal balance sheets – are making decisions that will affect a great many of us. More disturbing still is the fact that in many cases they are envisioning a future that most Americans do not desire and for which we remain woefully unprepared.

Here is only one example: A 2016 Obama White House report estimated that 83 percent of all jobs that pay under $20 an hour can be, and likely will be, automated. A 2013 study by the University of Oxford showed that nearly 50 percent of all U.S. employment is automatable.

Why, then, are so many on the left hell-bent on raising the minimum wage to $15 an hour? Because make no mistake: this is a push to accelerate the replacement of low-skilled workers with automation. Period. By making low-skilled tasks more expensive for a human to complete, they are directly accelerating the movement of those jobs to automation – permanently.

To push for opening our immigration system to millions more low-skilled or unskilled laborers at the same time is insanity and raises serious questions about the impact this will have on our welfare systems.

But this is only one example of the disconnect between the reality of the impending mass automation our Big Tech masters envision for us and the impact it will have on real people and their everyday lives.

Automation is accelerated by data. This is an uncontroversial fact. Data is the rocket fuel, the nitrous oxide, for all of this. It allows computers and algorithms to rapidly gain more knowledge and begin to understand the environment around them. A quick review of Google Brain’s achievements in language translation or DeepMind’s dominant victory over the world’s best Go player, a game far more complex than chess, should serve as a clarion call to awaken to this new reality.

What prevented these rapid strides in machine learning and artificial intelligence (AI) in the past was the lack of data; there simply wasn’t enough to feed the machines. And part of that lack of data was a lack of devices to collect it.

All that changed with the advent of cheap smart home devices and the Internet of Things (IoT). We now live in a world filled with sensors, from Fitbits to Nest to Alexa, even smart light-bulb sockets. There are over 8 billion of these IoT devices in use today, with that number likely to exceed 20 billion in three years, creating an absolute firehose of data.

Most people don’t even consider where the data from their IoT devices is going, where it’s stored or how it’s being used. Perhaps more thought should be given to that on an individual level. Perhaps people should understand that for every good use regarding the optimization of their lives there can be a nefarious one.

But more importantly, perhaps our elected officials should require the Googles, Facebooks, and Amazons of the world to answer hard questions. Where is our data going? What will it be used for? What protections will an individual have to make sure that his or her data will not be used to support, or augment, or design, things that are anathema to that individual’s values and ideals?

Our experiment in self-government, which has produced the longest-lasting constitutional republic in the history of the world, is a far more fragile idea than perhaps we would like to admit. Our republic was designed with great precision by our founding fathers to deliver personal liberty to the individual, to free the individual from tyranny, and to allow the individual to make society better by pursuing his or her own path in his or her own way within the bounds of ordered liberty.

To sound the alarm on AI and automation is not to be a Luddite. It is to be concerned about and to seriously consider what lies ahead. And juxtaposed against this concern is a Silicon Valley, as it is currently constructed, with an unrepentant and specific vision of the future that involves mass automation and “singularity” – the belief that “amoral” computers will optimize our lives and create a utopia on earth.

Don’t believe me? Consider Masayoshi Son, the Japanese investor making billion-dollar investments toward that very goal. Most people won’t recognize the name, but he has every intention of succeeding. Son holds an almost religious belief that “computers will run the planet more intelligently than humans.” Some have estimated that the point where computers take over – the “singularity” – will arrive by 2040. Now, because of people like Son, some estimate it could come as early as 2030.

Consider the implications for jobs: by 2030, only 11 years from now, estimates suggest 400 million people globally will have to find new jobs and 800 million could be displaced by automation.

If left unexamined and unchecked, the future envisioned by people like Son risks becoming an oligarchy of a few elites in which privacy, freedom, personal liberty, and even free will are dead. We must have this conversation now, before the opportunity to give our children the freedom and liberty we inherited from our parents dies with barely a whimper on a digital battleground, without anyone lifting a finger.