
The Economic Benefits of Emerging Technologies

NEWS / 04.26.17 / Malcolm Harkins

The following address was given by Cylance Chief Security & Trust Officer Malcolm Harkins to the United States Senate in March 2017. We believe it's important enough to share with the public and start a dialogue, so that we can band together to find the solutions we so clearly need to secure our rapidly changing future.

The march of technology can be viewed as a succession of major waves, each lasting roughly 100 years (Rifkin 2013). Each wave has brought transformative benefits to society, but also significant challenges.

The first wave, starting in the 1760s, included steam power, railways, and early factories, as well as mass education and printing.

The second wave, starting roughly in the 1860s and continuing well past the mid-1900s, included automobiles, electricity, and mass production, and had an even bigger effect on society.

Version 1.0 (1760s): steam and coal; railways; factories; printing press; mass education

Version 2.0 (1860s): electric lights; communications; oil & gas; mass production; automobiles

Version 3.0 (1990s): the Internet; molecular biology; renewable energy; "smart" everything

The third wave began in the 1960s, with early computers, but only really gained momentum in the 1990s. It includes the Internet and smart “things”, molecular biology and genetic engineering, and renewable energy.

Arguably, this technology wave may have the broadest impact on society of any to date. Each previous wave lasted about 100 years, so history suggests that we are far from reaching the crest.

To provide some perspective: if we thought of this wave as a movie, we'd still be watching the opening credits.

The March of Technology: Version 3.0 and Beyond

The Internet of Things (IoT) has come upon us at a fast and furious pace. It is discussed and hyped constantly, but sometimes without a clear definition, so the phrase can mean different things to different people. A simple way to think about it is that any powered device will compute, communicate, and have an IP address, meaning it is connected to a network.

The Internet of Things allows devices to be sensed or controlled remotely across the Internet. This has created opportunities for more direct integration of the physical world into computer systems. When IoT is augmented with various sensors, we get what are often described as smart grids, smart homes, and smart cities.
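To make "sensed or controlled remotely" concrete, here is a minimal sketch (not from the address) of an IoT-style device in Python: it serves a simulated sensor reading over HTTP and accepts a remote command to flip its power state. The /status and /toggle endpoints are invented for illustration; a real device would read from actual hardware.

```python
# Minimal sketch of an IoT-style device: a networked "thing" that can be
# sensed (read) and controlled (toggled) remotely over HTTP.
import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

class DeviceHandler(BaseHTTPRequestHandler):
    powered_on = True  # remotely controllable state

    def do_GET(self):
        if self.path == "/status":
            # "Sensing": report the device's state and a simulated reading
            body = json.dumps({
                "powered_on": DeviceHandler.powered_on,
                "temperature_c": round(random.uniform(18.0, 24.0), 1),
            }).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def do_POST(self):
        if self.path == "/toggle":
            # "Controlling": flip the device state from anywhere on the network
            DeviceHandler.powered_on = not DeviceHandler.powered_on
            self.send_response(200)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), DeviceHandler).serve_forever()
```

Anything that can run a loop like this, from a thermostat to a refrigerator, becomes part of the IoT once it is reachable on a network.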

Each IoT device has an embedded computing system and is able to interoperate within the existing Internet infrastructure. Many estimates indicate that the IoT will consist of more than 50 billion devices by 2020; some estimates top 70 billion devices.

IoT devices can serve a wide variety of applications, from heart-monitoring implants and pacemakers, to biochip transponders on farm animals, to children's toys such as an Internet-connected Barbie doll.

Current market examples include home automation products such as Google Nest, which can provide control and automation of lighting, HVAC systems, and appliances such as washers/dryers, robotic vacuums, air purifiers, ovens, refrigerators, and freezers that use Wi-Fi for remote monitoring.

In November of 2016, Louis Columbus from Forbes wrote, “This year's series of Internet of Things and Industrial Internet of Things (IIoT) forecasts reflect a growing focus on driving results using sensor-based data and creating analytically-rich data sets. What emerges is a glimpse into where IoT and IIoT can deliver the most value, and that's in solving complex logistics, manufacturing, services, and supply chain problems.”

Figure 1: The Internet Of Things Heat Map 2016: Where IoT Will Have The Biggest Impact On Digital Business, by Michele Pelino and Frank E. Gillett, Jan 14, 2016 (Source: Forrester)

The Arrival of Quantum Computing

Quantum computing is also emerging quickly. In 2011, Microsoft created a Quantum Architectures and Computation Group with a mission to advance the understanding of quantum computing, its applications, and its implementation models.

In February 2017, Brian Krzanich, CEO of Intel, said during a question-and-answer session at the company's Investor Day that Intel was "investing heavily" in quantum computing. In March 2017, IBM announced that it plans to create the first commercially-minded universal quantum computer.

Today's computers work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren't limited to just two states. By harnessing and exploiting the laws of quantum mechanics to process information, a quantum computer encodes information in quantum bits, or "qubits," which can hold multiple states simultaneously.
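A rough way to see the difference (a simplified sketch, not from the address) is to model a single qubit as a two-element vector of amplitudes, as introductory quantum-computing texts do. The Python snippet below puts a qubit into an equal superposition of 0 and 1 and simulates measuring it:

```python
# Minimal sketch of a single qubit as a 2-element state vector.
# A classical bit is either |0> or |1>; a qubit can be a weighted
# superposition of both, collapsing to 0 or 1 only when measured.
import numpy as np

ket0 = np.array([1.0, 0.0])  # the classical-like state |0>

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
qubit = H @ ket0

# Measurement probabilities are the squared amplitudes
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5]: a 50/50 chance of reading 0 or 1

# Simulate 1,000 measurements of identically prepared qubits
samples = np.random.choice([0, 1], size=1000, p=probs)
print(np.bincount(samples))  # roughly 500 zeros and 500 ones
```

Because n qubits together hold amplitudes for all 2^n combinations at once, the state space a quantum computer works in grows exponentially with each qubit added, which is where the potential speedups come from.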

Quantum computing has the potential to be millions of times more powerful than today's most powerful supercomputers. Last year, a team of Google and NASA scientists found that a D-Wave quantum computer was 100 million times faster than a conventional computer on a specific class of problem.

Figure 2: Qubits Explained (Source: Universe Review)

This means that many difficult computational tasks long thought to be impossible (or "intractable") for classical computers may be achieved quickly and efficiently by quantum computing.

This type of leap forward in computing could allow not only faster analysis and computation across significantly larger data sets, but also a shorter time to discovery for many business, intelligence, and scientific challenges, including improving energy grids, protecting and encrypting data, simulating molecules, researching new materials, developing new drugs, and understanding economic catalysts.

Quantum computing can reduce time spent on physical experiments and scientific dead ends, resulting in lower costs and faster solutions that provide economic and societal benefit.

Forging Ahead With Blockchain

Blockchain, as many people know it, is the technology behind Bitcoin. A blockchain is a distributed database that maintains a continuously growing list of ordered records called blocks. Each block contains a timestamp and a link to the previous block. By design, blockchains are inherently resistant to modification of their data: once recorded, the data in a block cannot be altered retroactively.

A blockchain is thus an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way. The ledger itself can also be programmed to trigger transactions automatically.
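The tamper resistance comes from hash-linking: each block commits to the cryptographic hash of the block before it, so altering any earlier record breaks every link after it. Here is a minimal Python sketch of that mechanism (an illustration of the general idea, not Bitcoin's actual implementation):

```python
# Minimal sketch of the hash-linking that makes a blockchain tamper-evident.
import hashlib
import json
import time

def block_hash(block):
    # Hash the block's contents deterministically (excluding the hash itself)
    payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

# Build a small chain: each block links to the hash of the previous one
chain = [make_block("genesis", "0" * 64)]
for record in ("Alice pays Bob 5", "Bob pays Carol 2"):
    chain.append(make_block(record, chain[-1]["hash"]))

def is_valid(chain):
    # Recompute every hash and check every link
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

print(is_valid(chain))                    # True
chain[1]["data"] = "Alice pays Bob 500"   # attempt retroactive tampering
print(is_valid(chain))                    # False: the altered block's hash no longer matches
```

In a real blockchain the ledger is replicated across many independent nodes, so a tamperer would have to rewrite not just one chain but the majority of copies, which is what makes retroactive alteration impractical.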

The technology can work for almost every type of transaction involving value, including money, goods, and property. Its potential uses are wide-ranging: from collecting taxes, to managing medical records more effectively, to anything else that requires proving data provenance.

Figure 3: How a Blockchain Works (Source: WEFORUM.ORG)

Why AI Is the Smart Choice

Artificial intelligence (AI) is progressing rapidly, with everything from Siri to self-driving cars relying on it to automate specific tasks.

While there is a wide variety of definitions of AI, artificial intelligence today is properly known as narrow AI (or weak AI), in that it is designed to perform a narrow task (e.g., only facial recognition, only Internet searches, or only driving a car). However, the long-term goal of many researchers is to create general AI (or strong AI).

While narrow AI may outperform humans at whatever its specific task is, like playing chess or solving equations, general AI would outperform humans at nearly every cognitive task.

Machine learning (ML) is a branch of artificial intelligence and one of the most important technical approaches to it, forming the basis of many recent advances and commercial applications of AI.

Machine learning is a statistical process that starts with a body of data and tries to derive a rule or procedure that explains the data or can predict future data.

A simple way to describe how ML works is as follows: in traditional programming, you give the computer an input, let's say 1+1. The computer runs an algorithm created by a human to calculate the answer and returns the output; in this case, the output would be 2.

Here’s the crucial difference. In machine learning, you would instead provide the computer with the input AND the output (1+1=2). You’d then let the computer create an algorithm by itself that would generate the output from the input.

In essence, you’re giving the computer all the information it needs to learn for itself how to extrapolate an output from the input. In classrooms, it’s often stated that the goal of education is not so much to give a growing child all the answers, but to teach them to think for themselves. This is precisely how machine learning works.
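As a toy illustration of that idea (a sketch, not from the address), the Python snippet below gives a computer pairs of numbers and their known sums and lets it fit a rule by least squares. No one programs "addition"; the computer derives it from the examples, and the fitted rule then generalizes to inputs it never saw:

```python
# Machine learning in miniature: provide inputs AND outputs,
# and let the computer derive the rule that connects them.
import numpy as np

# Training data: pairs of numbers (inputs) and their sums (known outputs)
X = np.array([[1, 1], [2, 3], [5, 4], [10, 7], [0, 6]], dtype=float)
y = np.array([2, 5, 9, 17, 6], dtype=float)

# Least-squares fit of y = w1*a + w2*b: the "learning" step
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
print(weights.round(6))  # [1. 1.]: it has learned that the rule is addition

# The learned rule generalizes to an input it never saw
print(np.array([3.0, 8.0]) @ weights)  # 11.0
```

Real ML systems fit far richer models to far messier data, but the shape of the process is the same: examples in, rule out.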

AI has applications everywhere, from agriculture, with crop monitoring and GPS-enabled automated irrigation and harvesting systems, to the media and advertising industry, with new developments such as facial-recognition advertising.

Figure 4: The Global Robots and AI Market

Looking To The Future

With all this change come significant challenges, and we need the courage to build security features and tools into these innovative products in order to protect people. Innovation cannot come at the cost of public safety.

In future posts, we'll explore some of those challenges and discuss possible solutions that can grow with our emerging technologies.


Malcolm Harkins 
Cylance Chief Security & Trust Officer
Address to the United States Senate, March 2017.


About Malcolm Harkins

VP, Chief Security & Trust Officer at Cylance

As the global CISO at Cylance, Malcolm Harkins is responsible for all aspects of information risk and security, security and privacy policy, and for peer outreach activities to drive improvement across the world in the understanding of cyber risks and best practices to manage and mitigate those risks. Previously, he was Vice President and Chief Security and Privacy Officer at Intel Corp. In that role, Harkins was responsible for managing the risk, controls, privacy, security and other related compliance activities for all of Intel's information assets, products and services.