Most people find visions of the future rather disturbing. The rise of artificial intelligence and virtual reality seems threatening. People can't help but fear unstoppable technological developments that could put their jobs or careers at risk and threaten their privacy and their sense of ownership and stewardship: think of advanced robotic systems poised to replace humans across industries, self-driving cars, and emotional robots, among others.

Humans like to be in control, and the advent of advanced technology has stirred fear in those who worry about losing that control. We often fear change, and we will do what we can to stay in our comfort zone. But technology always moves forward, even when some of us remain stuck in place.

In other words, technological advancement, much like change itself, is inevitable.

The Quest

Over the years, have humans ever stopped innovating or evolving? From devising easier ways to hunt prey in the wild to using computers for repetitive, time-consuming tasks, people want things done not only right but also conveniently and quickly. Humans have therefore learned to embrace technology, because it helps them improve their lifestyle and enhance their day-to-day experience.

Boon or Bane

Technology changes constantly. What is fastest today will be the slowest tomorrow, and what is most convenient now will be completely useless later on.

While some people will gladly embrace these changes and find new opportunities in them, others view these technological advances as real and imminent threats. Some see automation and artificial intelligence as disconcerting challenges to humanity's dominance.

As machines take on more and more of the attributes and skills once considered uniquely and exclusively human, people are beginning to question what consequences this radical technological transformation will bring.