
Security Implications of the Accenture Technology Vision 2014

Accenture Technology Vision 2014

Every Business Is a Digital Business

FOREWORD

Large companies are making a concerted push to transform themselves from followers to leaders in digital. But does that include a plan for leadership in data security? Does it rally the organization’s security resources around the parts of the digital business that need it most? Is the push backed by a real security mind-set among the executive team? Those are the kinds of questions that inform this paper.

For business leaders everywhere, the next three years will be about accelerating the organization’s pace in the digital race—and its place in the new digital world. Backed by their deep resources, enormous scale and process discipline, global enterprises such as GE, Tesco, Disney, and Procter & Gamble are rewriting entire chapters of the digital playbook. The first movers are poised to take advantage of the many recent technology advances in ways that will upend the expectations of industry observers and consumers alike.

Those are the central messages in the Technology Vision 2014 report from Accenture Technology Labs. The report highlights six themes that reflect the shifts emerging now among the digital power brokers of tomorrow—themes that range from the push for architected resilience to the blurring between the digital and physical worlds. It provides a richly detailed view from which business leaders in every industry can draw insight, inspiration and excitement about where digital technologies can take their organizations.

Each of the themes comes with significant challenges, and each requires deeper perspectives to sharpen the focus on how security must evolve to help the business take advantage of digital disruption. In an environment where any organization can fall prey to cyber threats and where most find they are unprepared to deal with rapidly blurring enterprise boundaries, security organizations need to continue partnering with the business and IT if they are to actively manage the risks associated with the themes addressed in the Technology Vision 2014 report. Drawing from the extensive experience of the Accenture Technology Labs Security Research and Development Group, this paper examines the security implications of each of the report’s six themes and describes realistic ways in which companies can understand, adapt and thrive in a rapidly evolving threat landscape in the context of digital.

Our Security group’s ongoing advanced research actively investigates stealthy cyber defense tactics, algorithmic behavior modelling, and the resiliency benefits of Internet-connected devices and software-defined networks. If this document raises questions about any area of your organization’s security stance, we will be pleased to collaborate with you to work through them.

Sincerely,

Ryan LaSalle

[email protected]


Digital-Physical Blur: Engineering Trust

The physical world is coming online as objects, devices, and machines acquire more digital intelligence.


The Accenture Technology Vision 2014 report notes that this wave of intelligent interfaces allows more and more decisions to be made on the edge—at the point where digital and physical worlds converge. The decisions can be made exactly when and where they’re needed in informed, social, easy-to-use ways, allowing companies and governments to re-imagine the possibilities for engaging with their customers and citizens.

A few examples: smartphones have turned their owners into digitally augmented versions of themselves—able to catalog and quantify actions throughout the day and access, create, and share an astonishing array of pertinent information that can enable faster, better decisions. “Wearable” computers—fitness monitors such as Nike’s FuelBand and Adidas’s miCoach—give users the on-the-spot information needed to make decisions about running another lap or pushing for a personal best. Autonomous drones, once the sole province of the military, are being used by police precincts across the U.S., and devices such as Google Glass will add many more layers of intelligence to everyday experiences.

What’s emerging is more than just an Internet of “things”; it’s a new layer of connected intelligence that augments the actions of individuals, automates processes, and incorporates digitally empowered machines into our lives. These “cyber physical” systems sense their environments and respond appropriately in real time, enabling more informed decisions that can create competitive advantage.

Autonomous devices are not the only manifestation of the digital-physical blur. Given that so many employees now carry intelligent devices such as smartphones, they have the capability to sense and respond in unprecedented ways, and to make more significant business decisions on their own.

But the more that intelligence and consequent decision-making become decentralized, the greater the risks to data security and individuals’ privacy. Already, there are loud outcries about the threats to personal privacy created by Google Glass and by aerial drones, for instance. The rapid rise in the number of sensors means that offline data can be blended with online data, giving users larger and more visible digital footprints. And IPv6, the most recent version of the protocol used to assign IP addresses, is likely to make it easier to identify individual Internet-connected devices.

Furthermore, every new intelligent device on the edge extends the enterprise’s “attack surface,” adding another potential vulnerability to be exploited by hackers. Accenture’s Technology Vision 2012 report noted that not long ago, hackers penetrated a car’s main communication bus using the vehicle’s wireless tire-pressure monitoring system as the cyber-gateway. The manufacturing sector is perhaps the most vulnerable to such attacks; in one case, an Italian security researcher recently disclosed a security vulnerability in a control system used for yogurt production. At the time of the disclosure, another researcher was able to identify 250 vulnerable systems using only simple online search engines.


Accenture urges a range of measures to accommodate the digital-physical blur. To begin with, companies need to decide whether to opt for central administration of their devices and sensors using a platform such as that offered by Axeda.i Axeda’s M2M and IoT platform manages communication between connected devices and enables remote service and tracking. Companies can also opt for zero-touch configuration, which circumvents the need for a central administration to configure a device during deployment, and which sets up a trust relationship to the device. Trusted computing techniques such as secure boot, integrity measurement, and remote attestation help establish remote trust and enable secure software deployment and configuration.
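To make the attestation step concrete, here is a minimal sketch of the verifier-side logic, assuming a hypothetical per-device signing key and illustrative “golden” measurement values; a real deployment would verify a TPM-signed quote over platform configuration registers rather than an HMAC.

```python
import hashlib
import hmac

# Golden measurements the verifier expects for each boot component
# (hypothetical values, for illustration only).
EXPECTED_MEASUREMENTS = {
    "bootloader": hashlib.sha256(b"bootloader-v1.2").hexdigest(),
    "kernel":     hashlib.sha256(b"kernel-3.10-signed").hexdigest(),
    "agent":      hashlib.sha256(b"device-agent-2.0").hexdigest(),
}

ATTESTATION_KEY = b"per-device-secret"  # stands in for a TPM-protected key


def sign_quote(measurements: dict, nonce: bytes) -> str:
    """Device side: bind the reported measurements to the verifier's nonce."""
    payload = nonce + "".join(sorted(measurements.values())).encode()
    return hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()


def verify_device(measurements: dict, nonce: bytes, quote: str) -> bool:
    """Verifier side: check the quote signature first, then compare each
    reported measurement against the golden value before trusting the
    endpoint for software deployment or configuration."""
    expected_quote = sign_quote(measurements, nonce)
    if not hmac.compare_digest(expected_quote, quote):
        return False  # quote was not produced by the provisioned key
    return all(measurements.get(k) == v
               for k, v in EXPECTED_MEASUREMENTS.items())

# The verifier sends a fresh random nonce (e.g., secrets.token_bytes(16))
# with each challenge so recorded quotes cannot be replayed.
```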

Accenture also recommends securing the endpoint by validating endpoint identity and health. Security leaders should watch TrustMANET, a framework for building trust between devices in mobile ad-hoc networks (MANETs), where no central instances, such as access points, are involved. TrustMANET builds an architecture for the validation of trust evidence records of network participants; it reports malware, prevents the spread of malware, and allows a reaction to an attack. It bolsters users’ confidence in digital evidence by relying on trusted computing technology, where the Trusted Platform Module (TPM) serves as a root of trust in each system, and where the chain of trust is built from boot time up to the operating system.ii

In tandem, it is helpful to adopt the discipline of data minimization so that companies store and use only as much data as is necessary to ensure the functionality of their products or services. Minimization can be accomplished in several ways: data can be collected periodically or randomly, rather than constantly; or companies could take data samples from a representative percentage of products rather than collecting data from every product. Alternatively, companies might opt to collect only aggregated data in order to avoid gathering granular information about particular consumers.

For example, a smart utilities grid could collect aggregate data from an entire apartment building rather than obtaining individual data from each apartment—or from individual devices in each apartment. Individual device measurements may be used locally to optimize energy usage, but do not necessarily have to be collected by the smart grid company to optimize grid utility. Aggregation combined with deletion—that is, storing individual data for only as long as it takes to develop an aggregate computation—could allow for very accurate aggregation while ensuring a degree of anonymity for the consumers. Data retention periods should be restricted as well.
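A minimal sketch of this aggregate-then-delete pattern (function and field names are illustrative): the individual readings exist only long enough to compute the building-level figure, then are discarded.

```python
from statistics import mean


def report_building_usage(meter_readings: dict) -> dict:
    """Aggregate per-apartment readings into building-level figures, then
    discard the individual values so only the aggregate leaves the site."""
    summary = {
        "building_total_kwh": sum(meter_readings.values()),
        "apartment_count": len(meter_readings),
        "mean_kwh": mean(meter_readings.values()),
    }
    meter_readings.clear()  # deletion step: individual data is not retained
    return summary


usage = report_building_usage({"apt_1a": 12.4, "apt_1b": 9.7, "apt_2a": 15.1})
# Only the aggregate is transmitted to the grid operator.
```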

Accenture also urges that organizations work to empower consumers with regard to digital privacy. This starts with obtaining consumers’ consent before using their data for secondary purposes. For example, in the case of the smart grid noted above, consent would be needed for data collection and use outside of that necessary for provision of the service. Consent is a foundational privacy practice, and it mirrors the U.S. Federal Trade Commission’s recommendations on protecting consumer privacy.iii Importantly, for consent to be effective, companies must not make the use of a service conditional upon collecting data from the prospective users. Binary “take it or leave it” choices are unacceptable, especially in markets where consumers have very limited options.


Companies should also make their data collection practices as transparent as possible, and give consumers reasonable access to their own data. Many of the consumer benefits of the Internet of Things—reduced costs, time savings and increased convenience—call for providing consumers with access to their data and for understanding how the company will use that data to make decisions about its customers.

For instance, if the central database for a smart grid determines that, based on their consumption, some consumers will have their power switched off at certain times of the day, those consumers must be informed of that decision as well as the reasons for the change in their data classification. Transparency is a vital component of informed user choice. The European Union Data Protection Directive, for example, grants data subjects a “right of access” to the “logic involved in any automatic processing of data” concerning the consumer, “at least in the case of the automated decisions.”iv

The Digital-Physical Blur is connecting systems, people and data at an unprecedented rate, challenging security organizations to keep up. Engaging with the enterprise’s first forays into machine-to-machine communication, location-based services and the digital customer will give the security organization opportunities to put these practices into place and help those programs grow successfully.


From Workforce to Crowdsource: Everyone’s an Insider

More and more leading companies are starting to think in terms of an expanded workforce—one that is not on the payroll and that can deftly handle almost any task an organization needs done.


The individuals involved may be around the corner or on the other side of the world; what they have in common is the experience and expertise to solve the problem as well as the motivation to do so. This expanded workforce also offers real scale; it can be put to work on problems that may be too large or too expensive to solve internally.

The tasks involved may be as simple as data entry or as complex as industrial design. The individuals—the problem solvers—may work on a project or just part of one. They may be paid or they may compete for prizes. But whatever their incentives and spheres of interest, the unifying feature is that their contributions are made possible with digital technology—specifically cloud, social and collaboration tools.

There are good examples in Procter & Gamble’s and Eli Lilly’s use of Web-based innovation networks to augment and accelerate market research and product development. Similarly, GE and MasterCard are harnessing crowdsourcing companies such as Kaggle—a global network of computer scientists, mathematicians, and data scientists who compete to solve problems ranging from airline flight optimization to retail-store location selection.

New trust models

However, this new world of crowdsourcing comes with a host of security challenges. For example, every solver who participates is now an “insider” who is likely to have access to a portion of the enterprise’s intellectual property. So new trust models are needed—models that are likely to be different from the trust relationships developed with suppliers, for example, because of the often transient nature of crowdsourcing projects. At the very least, crowdsourcing can mean a huge expansion of identities to manage, calling for new approaches to identity assurance, such as ThreatMetrix’s Persona ID, which uses behavioral analysis techniques to compute a score that identifies high-risk users. Alternatively, enterprises can lighten their burden by offloading identity management to Identity-as-a-Service (IDaaS) providers, such as Okta.
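A minimal sketch of the idea behind behavioral risk scoring, with illustrative signals and weights rather than ThreatMetrix’s actual model: each suspicious behavior nudges a session’s score toward a step-up or block decision.

```python
def risk_score(session: dict) -> float:
    """Combine weighted behavioral signals into a 0-1 risk score.
    Signals and weights below are illustrative, not any vendor's model."""
    signals = {
        "new_device":        0.30,  # fingerprint never seen for this identity
        "geo_velocity":      0.25,  # impossible travel between logins
        "proxy_or_tor":      0.20,  # anonymizing network detected
        "odd_hours":         0.10,  # activity far outside the usual window
        "rapid_fire_access": 0.15,  # unusually fast access to many IP assets
    }
    score = sum(w for name, w in signals.items() if session.get(name))
    return min(score, 1.0)


def gate(session: dict) -> str:
    """Map the score to an access decision; thresholds are illustrative."""
    s = risk_score(session)
    if s >= 0.6:
        return "block"
    if s >= 0.3:
        return "step-up-authentication"
    return "allow"


gate({"new_device": True, "geo_velocity": True})  # 'step-up-authentication'
```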

Businesses that adopt a crowdsourcing model should implement more rigorous source code control policies in order to protect their intellectual property. They should also consider a zero-footprint workspace, where the workforce gets access to the information resources they need to perform their jobs but without being able to copy those resources. Web-based Integrated Development Environments (IDEs), such as Cloud9 and Codeanywhere, are particularly helpful in this regard. Another valuable tool is the Agent Desktop developed by LiveOps.i This browser-based desktop agent app, called LiveOps Engage, was built to provide call center agents with streamlined access to a number of applications to serve customers efficiently, while user actions and access to sensitive data in the cloud are continuously monitored to detect when agents may be writing down or otherwise copying customer-sensitive data.


Crowdsourcing security

Despite all the risks discussed above, crowdsourcing may also provide opportunities to make code more secure. For example, the US Defense Advanced Research Projects Agency (DARPA) has turned the search for software vulnerabilities into computer games. The initiative is reducing the workloads of the agency’s vulnerability analysts because gamers can identify potentially problematic chunks of code; the agency notifies managers of the software when errors are identified.iii The games are now reviewing open source programs in use by the Defense Department and other US government and commercial organizations.

Large software vendors are implicitly inviting “the crowd” to help test their products. For instance, in response to the recent furor about the U.S. National Security Agency’s use of U.S. citizens’ data, Microsoft decided to enhance the transparency of its software code, “making it easier for customers to reassure themselves that our products do not contain back doors.”iv With this smart move, Microsoft is not only strengthening trust relationships with its clients but also enhancing its product security and quality, because more pairs of eyes get to review its software code.

Security organizations need to engage with the business when it comes to crowdsourcing. Security can be a real enabler, extending and monitoring trust relationships with these transient contributors to the company’s most strategic efforts.

Product quality

Some IT leaders have voiced concern that the complexities of collaborative, crowdsourced work would lead to substandard product quality, but the opposite appears to be true. A study by Coverity finds that the average defect density in open source software (the number of defects per thousand lines of code in the top 45 active open source projects reviewed in 2011) was just 0.45,ii which is far better than Coverity’s benchmark of 1.0 for high-quality commercial software developed in-house.
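To put those defect-density figures in perspective, a quick back-of-the-envelope calculation (the project size is illustrative):

```python
def defect_density(defects: int, lines_of_code: int) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000)


# For a hypothetical 200,000-line codebase:
# at the open source average of 0.45 defects/KLOC, expect roughly 90 defects;
# at the 1.0 benchmark for high-quality commercial code, roughly 200.
open_source_estimate = 0.45 * (200_000 / 1000)   # 90.0
commercial_benchmark = 1.0 * (200_000 / 1000)    # 200.0
```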

Regardless of any product-quality advantages to crowdsourcing of software, however, companies should still guard their products against contributors who may have malicious intent—who could purposely introduce vulnerabilities into the software code or Trojans into the hardware components for kleptography, for instance. Businesses are urged to recommit to their quality control processes to ensure that there are no hidden backdoors in their software or hardware products. They need to shift resources from pure development toward representing user needs and governing the entire development process from a security and quality perspective.


Data Supply Chain: Securing the Data Economy

Enterprise data is vastly underutilized. Data ecosystems are complex and littered with data silos, limiting the value that organizations can get out of their own data by making it difficult to access.


Few companies have mastered the concepts at the foundation of modern data management—ideas such as the mobility and portability of data, its structure and velocity, data as a “saleable” product, and its valuation in open data exchanges.

To truly unlock that value, companies must start thinking about the entirety of the data supply chain, putting data into circulation by enabling it to flow easily and usefully through the entire organization—and eventually throughout their ecosystems of partners too. Yet the more that data is free to flow easily within and outside of an enterprise, the more at risk it becomes.

Ensuring the cyber hygiene of the data supply chain

From a security standpoint, the challenge with the data supply chain is that any link may be subject to a failure or to a breach. Therefore, it is crucial to have visibility all the way up and down the data supply chain. Companies may have visibility into how their data is being accessed on their own networks, but they may not have comparable visibility after the data leaves their premises.

Some large companies, such as financial institutions, are working to assess the security of their data supply chains by periodically requiring third-party data providers and consumers to fill out assessments of their compliance with various regulations. However, verifying the veracity of these self-reported assessments is difficult due to the sheer number of third-party data providers. Furthermore, such assessments may not accurately reflect the level of information security risk to which these partners may be exposed.

Therefore, businesses should seek other ways to measure their data supply chain risk and obtain better insights into the cyber hygiene and information risks of their data providers and consumers. Several security vendors have developed solutions that may be useful.

However, carefully selecting data providers and data users is not sufficient. Companies need to actively monitor their data supply chains for signs of data loss. Allure Security Technology is one vendor that offers a data loss alert service which informs companies when their sensitive data is accessed out of policy. The service uses so-called “beacon documents” that send a message in real time every time they are accessed. Such documents can be embedded in advance in the organization’s sensitive data to give companies a clear idea of when and where their data is accessed.
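A minimal sketch of how a beacon document can phone home, assuming a hypothetical alert host (alerts.example.com) and standard-library HTTP handling; commercial services such as Allure’s are far more sophisticated, but the principle is the same: a unique token embedded in the document triggers a logged callback whenever it is opened.

```python
import logging
import secrets
from http.server import BaseHTTPRequestHandler, HTTPServer

logging.basicConfig(level=logging.INFO)

BEACONS = {}  # token -> document id


def beacon_url(document_id: str, host: str = "alerts.example.com") -> str:
    """Mint a unique beacon for one document; embedding the returned URL in
    the document (e.g., as a remote image reference) makes any open event
    call home."""
    token = secrets.token_urlsafe(16)
    BEACONS[token] = document_id
    return "https://%s/b/%s.png" % (host, token)


class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        token = self.path.rsplit("/", 1)[-1].removesuffix(".png")
        doc = BEACONS.get(token)
        if doc:
            # In practice this event would raise an out-of-policy access alert.
            logging.info("beacon fired: document=%s from=%s agent=%s",
                         doc, self.client_address[0],
                         self.headers.get("User-Agent"))
        self.send_response(200)
        self.send_header("Content-Type", "image/png")
        self.end_headers()


# url = beacon_url("sensitive-report.docx")
# HTTPServer(("", 8080), BeaconHandler).serve_forever()
```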

Data in the Cloud

As more businesses move their data centers to the cloud, cloud services providers will be required to offer greater visibility into how their clients’ data is being accessed. Internally, too, companies are quickly adopting cloud services, with the average company using hundreds of such services. In general, companies are hard-pressed to assess their use of cloud services; for instance, they struggle to pinpoint what is being used by their employees, who exactly is accessing their data, and why.


It is not in the best interest of any company to claim ignorance about the data it has shared with a partner—no matter how temporary or solid that relationship may be. While this stance may help avoid breach notification, it is a risky defense. Cloud service risk management firms such as SkyHigh Networks enable customers to discover unauthorized cloud services and then to manage them.i They also rate these services in terms of security and maturity, which enables their clients to manage their own risks.

Guarding users’ privacy while monetizing data

As data applications cut across many different industries, IT leaders and their business colleagues need to better understand all the ways in which data can be used strategically and marketed. One of the most compelling areas is data monetization, where more and more enterprises see opportunities to generate revenues by proactively selling data. In other cases, companies are developing new products and services built on the massive datasets that they gather from customers, often complemented with data from their business partners.

The practice of selling data has long raised privacy concerns and will likely create more anxiety as companies learn more about their consumers than they should.

Businesses that move into data monetization need to be vigilant so that they neither compromise their customers’ experiences nor squander their trust. Until recently, companies that have sold or transferred customer data have been able to benefit from customers’ ignorance of how the data is used. But already, the practice requires more and more transparency about what data is collected, where it will be shared, and how it will be used. The U.S. Federal Trade Commission, in its latest report on protecting consumer privacy, has already called on companies to practice greater transparency and to disclose “details about their collection and use of consumers’ information.”ii, iii What is needed now are truly proactive and transparent privacy policies to protect customers and convince them that it is in their best interests to share their data.


Harnessing Hyperscale: Tuning Security for Performance

After at least a decade out of the spotlight, the hardware world is once again a hotbed of new development as demand soars for bigger, faster, lower-cost data centers. In fact, hardware is becoming a crucial consideration as businesses strive to “go digital,” as discussed in Accenture’s Technology Vision 2014 report.i


Innovations in hardware technologies such as low-power CPUs, non-volatile memory chips, solid-state data storage, and in-memory computing are set to yield significant benefits. These innovations are expected to augment the performance of every enterprise’s servers and data centers, enabling the next generation of infrastructure to support digital transformation.

As more and more businesses come to rely on supersized, super-scalable data centers—the “hyperscale” systems run by major cloud services providers such as Amazon and Google, as well as the growing number that companies are building themselves—their IT leadership teams increasingly need to understand the security implications of this strategic shift.

Concurrently, there is growing demand for scalability in digital services and the hardware that supports them. Resilience—the ability of IT systems to maintain wholly acceptable levels of operational performance during planned and unplanned disturbances—has a strong security remit. Enterprises now need IT architects who understand the resilience aspects of hyperscale systems as well as their identity, data protection, and interconnected trust capabilities.

Choosing between public, private and hybrid clouds

What do the new architectures look like for public, private and hybrid clouds? And what new trust models are emerging? To begin to answer those questions, IT leaders must ask themselves how much their organizations should rely on the public cloud. Although there is a strong economic argument in favor of using public cloud infrastructure, many enterprises are opting for the flexibility and greater control provided by the private cloud—that is, a full cloud implementation that runs entirely within the enterprise’s own infrastructure, using software such as OpenStack. However, Accenture observes that leading organizations are embracing the hybrid cloud.

Accenture recommends that IT leaders develop integrated controls in collaboration with their business colleagues to enable the enterprise to take advantage of the components of the hybrid cloud. IT leaders should also do more to grasp which cloud services are being used, and for what purposes, so they can make informed choices that support the business. And they should work to develop a security run-time architecture that lets them control data, service usage, auditability, and security event management in integrated ways with their cloud providers.


It’s also valuable to evaluate the array of security-as-a-service offerings now on the market. Among the categories of interest are IAMaaS (identity and access management), SIEMaaS (log monitoring and analysis), TMaaS (threat management), DRaaS (disaster recovery), CaaS (cryptography), ASTaaS (application security testing), and various bundles of these. These offerings enable enterprises to scale their security commensurate with their cloud usage. Also important: the application of analytics to enable rapid decisions about the effectiveness of the organization’s security controls (and security products) and to scale back those that don’t continue to meet necessary thresholds of effectiveness.

Securing data centers with hyperscale capabilities

Enterprises that choose to invest in their own hyperscale capabilities are finding it challenging to secure the data center without hampering its performance and even its core functionality: Can security keep up with the hyperscale hardware investments in speed and scale? Consider running analytics on data collected from millions of sensors: the question is whether security can prevent data loss at scale as the data sets grow ever larger.

The good news is that many vendors are trying to map better access controls to raw data and, in turn, to the analytics that are run on these data sets. Others are working to enable better integration with enterprise cryptography, logging and API management so that hyperscale systems can be audited.

Cisco is one of the vendors working hard to enable this integration. It has announced a new Application Centric Infrastructure (ACI) that is designed to integrate layer 4 through layer 7—and security, in particular—into data center environments.ii ACI gives administrators the flexibility to create simple and dynamic application and transaction chains, without regard to data center topology; ACI security solutions allow them to attach security directly to those processes as a service.

Because there is a performance penalty to providing such integration, IT leaders must size the capacity of their hyperscale systems properly. They must also keep data quality and reliability very much in mind, since security is needed to support the integrity of the data coming in and being stored.

Existing network security solutions, such as firewalls and network intrusion detection systems (NIDS), may not keep up with the velocity at which data is transferred to and from hyperscale systems. Scanning data packets against multiple signatures is a computationally intensive task: Snort, for instance, has been measured to saturate at 137 Mbps in its standard configuration.iii A common way to close the performance gap is to implement security functions in hardware. Reconfigurable hardware devices such as Field Programmable Gate Arrays (FPGAs) are becoming popular for implementing these security functions. They provide the flexibility of software and the speed of hardware: Arista, for instance, offers a new programmable switch that makes use of FPGAs for load balancing and line-rate firewalling.iv
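A toy illustration of why multi-signature scanning is so costly, and of the single-pass idea that dedicated hardware takes much further (the signatures and traffic below are synthetic):

```python
import re
import time

SIGNATURES = [b"cmd.exe", b"/etc/passwd", b"<script>",
              b"DROP TABLE", b"\x90\x90\x90\x90"]
# ~30 MB of synthetic 1500-byte "packets".
payloads = [bytes([i % 251]) * 1500 for i in range(20_000)]

# Naive approach: one pass over every packet per signature.
start = time.perf_counter()
naive_hits = sum(sig in pkt for pkt in payloads for sig in SIGNATURES)
naive_s = time.perf_counter() - start

# Single-pass approach: compile all signatures into one alternation so each
# packet is scanned once; multi-pattern engines and FPGA pipelines exploit
# the same idea at far greater scale.
combined = re.compile(b"|".join(re.escape(s) for s in SIGNATURES))
start = time.perf_counter()
fast_hits = sum(1 for pkt in payloads if combined.search(pkt))
fast_s = time.perf_counter() - start

print(f"naive: {naive_s:.2f}s  single-pass: {fast_s:.2f}s")
```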


SDN’s upcoming role in data centers

To secure data centers effectively, our experience indicates that enterprise IT departments must begin treating the network like an application. As such, there must be heightened monitoring of the network control plane, redundancy/failover of the control architecture, continued searches for man-in-the-middle attack opportunities, and protection of security certificates.

Software-defined networking (SDN) is getting closer to this model of network-as-an-application. The few early SDN deployments are in the large data centers of cloud service providers and large enterprises such as Google, NTT, AT&T, Verizon, Deutsche Telekom, BT, and China Mobile. The next wave is expected to show up in enterprise and data center networks, according to Infonetics Research.v “We’re already seeing significant use cases for SDN in the enterprise LAN, providing security and unification of wired and wireless networks, and enabling BYOD (bring your own device),” said Cliff Grossner, the research firm’s directing analyst for data center and cloud. At the same time, Dell and Red Hat are working together on OpenStack Neutron, a project to bring SDN capabilities to OpenStack, allowing the platform to offer networking as a service.vi

Non-volatile memory

Beyond hyperscale data centers, the hardware innovation surge has introduced persistent memory, which promises to be fast, non-volatile, and power-efficient. However, integrating persistent memory into systems poses several key challenges—with protection against offline memory scanning being the most important. Early research looked into adding a Memory Encryption Control Unit (MECU) to the system architecture in order to provide memory confidentiality when the system is suspended or across system reboots.vii

Researchers at North Carolina State University also proposed to incrementally encrypt memory using hardware to avoid the performance degradation caused by having the entire memory encrypted throughout runtime.viii But until such hardware-based encryption techniques are fully adopted by memory chip manufacturers, persistent sensitive data in the stack or heap will have to be protected in software. Application developers are urged to identify where they are relying on non-volatility, to avoid memory data leakage, and to explicitly remove sensitive data from their memory locations. This is going to require deeper integration between security architects, system architects and developers.
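A minimal sketch of explicit scrubbing, shown in Python for illustration: keep secrets in a mutable buffer and overwrite it in place once it is no longer needed. (Immutable str/bytes values cannot be wiped and may be copied by the runtime, which is why the sketch uses a bytearray; systems languages offer stronger guarantees via primitives such as memset_s or explicit_bzero.)

```python
def scrub(buf: bytearray) -> None:
    """Overwrite a sensitive buffer in place before releasing it, so the
    secret does not persist in non-volatile memory after the data is no
    longer needed."""
    for i in range(len(buf)):
        buf[i] = 0


key = bytearray(b"session-key-material")
# ... use the key ...
scrub(key)                          # key is now all zero bytes
assert all(b == 0 for b in key)
```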

How can security benefit from hyperscale investments?

While IT architects have to address many security challenges caused by the demand for scale, they should explore ways to take advantage of hyperscale systems to solve some of their security problems. For instance, hyperscale systems can be used in the future for cognitive computing, i.e., bringing reasoning to data. This can be applied to security problems to find new inferences in data based on known “good” and “bad” activity by applying neural networks and machine learning techniques.


IBM, for instance, is working on building a neuromorphic processor. Micron is also building hardware that can be used to create more advanced neural networks for cognitive computing. That is the future. But today, IBM is harnessing GPUs (graphics processing units) in a neural network-like fashion to improve intrusion detection systems. Today’s intrusion detection systems use either signature-based or anomaly-based detection techniques. Neural networks could combine the two approaches to strengthen the ability of the system to detect unusual deviations from the norm.ix
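A minimal sketch of that fusion idea, with illustrative hand-set weights standing in for what a trained network would learn: a known-bad signature match and an anomaly score reinforce each other.

```python
import math

# Illustrative weights; in practice these would be learned from labeled
# traffic with a neural network or logistic regression.
W_SIGNATURE, W_ANOMALY, BIAS = 4.0, 3.0, -5.0


def alert_probability(signature_hit: bool, anomaly_score: float) -> float:
    """Fuse a binary signature match with a 0-1 anomaly score via a
    logistic unit: strong evidence from either channel raises the alert
    probability, and weak evidence from both reinforces itself."""
    z = W_SIGNATURE * signature_hit + W_ANOMALY * anomaly_score + BIAS
    return 1 / (1 + math.exp(-z))


alert_probability(False, 0.2)   # ~0.01: quiet traffic
alert_probability(True, 0.0)    # ~0.27: known signature, normal behavior
alert_probability(True, 0.8)    # ~0.80: signature plus deviation -> alert
```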

The question is whether IT security professionals can successfully advocate for access to these hyperscale investments. Will enterprise leaders share those resources between the business and security practices? Will they invest in solutions that bring robust and scalable security to their own hyperscale systems, or will new hyperscale security vendors step up to unify security and data once and for all? Accenture contends that if security groups can win the trust of the business by helping to manage the risks of the digital business, they’ll gain access to a broader set of resources for their own use too.


Business of Applications: Protecting Against the Next Wave of Attacks

The way that businesses build software is changing: mimicking the shift in the consumer world, more and more organizations are rapidly moving from enterprise applications to apps, according to the Accenture Technology Vision 2014 report.


Eager for relief from some of their biggest pain points—especially their systems’ lack of agility—business leaders have been searching for software that is far nimbler than the legacy systems they’ve relied on for decades. Although there will always be big, complex enterprise software systems to support large organizations, there is a marked shift toward simpler, more modular, and more customized apps.

The push is coming from the accelerating pace of IT change: the more quickly businesses can create and launch new applications in today’s turbulent markets, the better they can innovate, collaborate, improve customer experiences, and enrich personal interactions. Users are raising the temperature too. Customers and employees are looking for consumer-grade experiences. In the workplace, they are pressing IT to give them the kinds of low-cost, accessible, and often intelligent apps that they use every day on their own mobile devices.

In response, enterprise development teams are recognizing the opportunity offered by APIs to improve business agility, innovation, and responsiveness, even with limited budgets for new app development. To create and manage curated sets of application programming interfaces (APIs), companies need to leverage API management and provisioning tools such as those from Apigee, Layer 7 and Mashery.

Integrating Security into the Agile API Development Process

However, with the new focus on agile development, there is a greater risk that application security requirements will not get the attention they need. We believe it is imperative to build security into the agile development and testing process from the start, and to provide the necessary developer education and training.

To do so, developers can make use of SDL Agile, a secure development lifecycle process adapted from Microsoft’s secure software delivery lifecycle (SDLC). To weave security design into agile development, the security requirements in SDL Agile are divided into three categories: (1) every-sprint requirements, which have to be completed on every sprint or cycle; (2) one-time requirements, which have to be completed once for every project; and (3) bucket requirements, which consist of all other requirements and are assigned to one of three buckets: Security Verification, Design Review, and Response Plans. To address bucket requirements, product teams have to complete at least one SDL requirement from each bucket of related tasks during each sprint.i
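A minimal sketch of a sprint gate that enforces the SDL Agile rule just described (the task names are illustrative): the sprint does not pass until the every-sprint requirements are scheduled and one requirement has been chosen from each of the three buckets.

```python
BUCKETS = ("Security Verification", "Design Review", "Response Plans")

EVERY_SPRINT = (                      # illustrative every-sprint requirements
    "run static analysis on changed code",
    "threat-model any new attack surface",
)


class SprintPlan:
    def __init__(self):
        self.scheduled = set()        # every-sprint requirements on the board
        self.bucket_tasks = {}        # bucket -> chosen requirement

    def schedule(self, requirement: str) -> None:
        self.scheduled.add(requirement)

    def pick_bucket_task(self, bucket: str, requirement: str) -> None:
        if bucket not in BUCKETS:
            raise ValueError(f"unknown bucket: {bucket}")
        self.bucket_tasks[bucket] = requirement

    def passes_sdl_gate(self) -> bool:
        """All every-sprint requirements scheduled, plus at least one
        requirement chosen from each of the three buckets."""
        return (set(EVERY_SPRINT) <= self.scheduled
                and all(b in self.bucket_tasks for b in BUCKETS))
```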

API developers are uniquely challenged to test for security. With solutions like Veracode’s APIs, they can fully automate security verification for the entire software portfolio. These APIs can be integrated with internal build and bug tracking systems, and used for dynamic and static testing to identify potential vulnerabilities. Veracode, which was listed last year on Forbes’ list of America’s most promising companies, also offers testing-as-a-service complete with role-based access control (RBAC), risk ratings and embedded analytics. Its APIs enable the testing and monitoring of vendor security applications, thus helping businesses gain insights into the security of their software supply chain.

Enterprise App Stores: Securing Data in the Cloud

When applications have been deployed, it is important for product managers to understand the specifics of how they are being used so that the managers can get a holistic view of each API’s effectiveness and then take appropriate corrective action if the API is underperforming from a business perspective. IT teams need much better traceability between applications, users and data to understand who is doing what and whether users’ actions are legitimate—necessitating richer instrumentation of apps and APIs.

IT leaders also need to consider the security implications across the entire application ecosystem. Given that the real power of apps lies in how they are combined to handle larger business tasks, it is critical for app developers to be able to track dependencies and isolate the impact of any change across a given application or application portfolio, so that changes do not disrupt the entire business task.

One software vendor, Netskope, has a product that can monitor and control access to specific actions in cloud applications. The solution is designed to provide visibility and role-based access control across all cloud applications for the enterprise. It can discover and analyze thousands of applications, allowing a deep dive not just into cloud application use, but into identification of users, devices, browsers, and even geographical locations. As an in-line service, Netskope allows enforcement via a policy-building tool. All the data available for analytics can be used to build policies, and the granularity in identifying cloud application actions is also available for use in a policy. For instance, the security organization can create a policy that disables file uploads in Facebook or Gmail, while other features of those apps remain functional, allowing very specific and granular access controls.
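A minimal sketch of that kind of action-level rule evaluation (the rule format is hypothetical, not Netskope’s actual policy syntax): one risky action is blocked while the rest of the application stays usable.

```python
POLICIES = [
    # First match wins: block a single risky action per app, allow the rest.
    {"app": "Facebook", "action": "upload", "decision": "block"},
    {"app": "Gmail",    "action": "upload", "decision": "block"},
    {"app": "*",        "action": "*",      "decision": "allow"},
]


def evaluate(app: str, action: str) -> str:
    """Return the decision of the first rule matching this app/action pair."""
    for rule in POLICIES:
        if rule["app"] in (app, "*") and rule["action"] in (action, "*"):
            return rule["decision"]
    return "allow"


evaluate("Facebook", "upload")  # 'block'
evaluate("Facebook", "post")    # 'allow': other features keep working
```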

As every digital business gets serious about the business of software, developers become the first line of defense for the enterprise. How they incorporate good security practices in their development, how data services and APIs enforce good practice, and how they instrument their applications for greater security visibility dramatically drive down the later cost of security remediation and the cost of breaches.


Architecting Resilience: Built to Survive

Downtime in data centers costs 41 percent more than it did just four years ago. That statistic alone underscores the argument for engineering resilience into IT systems from Day One.


The fact is, failure is a normal operating condition. It must be anticipated, accommodated and deliberately designed into IT systems. Today, the idea is no longer about designing for “five nines” (99.999 percent) uptime; it’s about supporting the nonstop business—24 hours a day, 365 days a year. If systems are to be as nonstop as businesses need them to be, they can no longer be designed just to specification or engineered to handle only particular incidents. They must be designed to keep working while under failure and under attack.

Architecting resilience starts with presuming that failure is a normal operating condition—that the enterprise will operate in a hostile environment, whether because of cyber-criminal attacks, system failures, or maintenance events. That pertains all the way down to the infrastructure “pipes,” and it calls for a deep understanding of exactly what the business requires to survive a disruptive event. Engineering for resilience means recognizing that the failure of a component under attack may cascade to other components; thus it is crucially important to build resilience into the connections that allow different systems to interoperate.

Recognizing the importance of resilient architectures, the UK’s intelligence agencies GCHQ and MI5 issued a letter to the UK’s banks in fall 2013 instructing them to put together 10-point plans for cyber resilience, given the vulnerability of the banking sector.

When it comes to security, architecting resilience implies striving for two system properties—robustness and adaptability—that are not necessarily synergistic with each other. Much of the security architecture work to date focuses on developing robustness against failures and disruptions and perimeter defense against attacks. But little work has gone into architecting for adaptability—that is, with the assumption that an intruder will succeed and that a system or service must continue functioning properly and securely while responding to the attack.

Architecting for resilience on three layers

Accenture urges enterprises to architect for resilience at three different layers: the network layer, the compute layer, and the business layer. At each layer, this calls for architecting for monitoring, diagnosis and remediation. Each layer merits a closer look.

At the network layer. Resilience is needed here to enable intelligent application delivery and fast security responses. This requires self-diagnosis and self-repair/self-healing capabilities, as well as proactive resilience to challenges ranging from disruptive operating conditions to cybercrime attacks. Distributed Denial of Service attacks in particular are becoming much more sophisticated and frequent; businesses need to be ready to surge through their partners to absorb the attacks whenever campaigns against them materialize.


Software-defined networking (SDN) has emerged as a technology for making the underlying communication network more secure and resilient. It allows network engineers and administrators to respond quickly to changing business conditions, including when the infrastructure is under attack. In such circumstances, SDN-based solutions such as Radware’s DefenseFlow and CloudFlare can help divert suspicious traffic to the closest mitigation device.

The network administrator can “shape” the traffic from the central controller without having to touch the physical switches, using the software to prioritize, de-prioritize or block traffic either globally or in varying degrees down to the individual packet level—all in programmatic, automated ways.
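A minimal sketch of what such a programmatic push looks like; the controller URL and payload shape are hypothetical, since each controller (OpenDaylight, ONOS and others) exposes its own flow-programming API.

```python
import json
import urllib.request

CONTROLLER = "http://sdn-controller.internal:8181"  # hypothetical endpoint


def push_flow_rule(src_cidr: str, priority: int, action: str) -> None:
    """Ask the central controller to reprioritize or drop matching traffic
    without touching any physical switch. The payload shape is illustrative."""
    rule = {
        "match": {"ipv4_src": src_cidr},
        "priority": priority,
        "action": action,   # e.g. "drop", "rate-limit:1mbps", "forward"
    }
    req = urllib.request.Request(
        f"{CONTROLLER}/flows",
        data=json.dumps(rule).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)


# Throttle a suspected DDoS source toward a scrubbing path, programmatically:
# push_flow_rule("203.0.113.0/24", priority=900, action="rate-limit:1mbps")
```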

One technology area that IT leaders should explore is the software-defined perimeter (SDP)—a framework that provides secure connectivity to cloud applications from any device. The SDP is intended to protect application infrastructure from network-based attacks. The framework, which incorporates NIST and OASIS security standards, was published in December 2013 by the Cloud Security Alliance (CSA).i SDP brings together standard security capabilities such as PKI, TLS, SAML, and XML as well as concepts such as federation, device attestation and geo-location. Connectivity in an SDP is based on a “need to know” model in which device posture and identity are verified before access to application infrastructure is granted. Application infrastructure is effectively “black,” with no visible DNS information or IP addresses that are the typical entry points for many common cyber-attacks.ii
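A minimal sketch of the need-to-know gate at the heart of that model (the specific posture and identity checks are illustrative, not the CSA specification):

```python
def grant_access(device: dict, user: dict, app: str) -> bool:
    """Need-to-know gate: verify device posture and user identity first;
    until both pass, the application stays dark (no DNS or IP exposed)."""
    posture_ok = (
        device.get("attested")              # device integrity verified
        and device.get("patched")           # required patch level met
        and not device.get("jailbroken")
    )
    identity_ok = (
        user.get("mfa_passed")
        and app in user.get("entitlements", ())  # explicit need-to-know grant
    )
    return bool(posture_ok and identity_ok)


grant_access(
    {"attested": True, "patched": True, "jailbroken": False},
    {"mfa_passed": True, "entitlements": ("payroll",)},
    "payroll",
)  # True: only now is a connection to the application brokered
```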

Security organizations can use SDP technology to expand the enterprise’s perimeter and spread out the attack surface to other groups that can help. The combination of SDN technology and the cloud can keep a “noisy” attack in the elastic environment, where it can be readily absorbed without impact on the company’s core business functions. Deception networks can also be deployed—using counterattack techniques or other solutions—to further expand the perceivable attack surface, thus diluting an attacker’s intelligence and forcing the attacker to spend more resources on trying to identify the real target.

At the compute layer. A resilient architecture requires resilient hardware—servers and storage systems that can recover quickly from a failure or an attack. Different server workloads call for different resilience techniques, but companies that have “always on” requirements should use architectures with more than one workload failover point. Resilience at this layer can be achieved through the cloud, where distributing data and compute cycles across data centers gives businesses portability of their compute services, geographically and potentially across various providers.

Resilience at the compute layer is also a key element of the Mission-oriented Resilient Clouds (MRC) program, launched in 2011 by the US Defense Advanced Research Projects Agency (DARPA) to “effectively build a ‘community health system’ for the cloud.”iii The MRC program is investigating technologies that would enable compute jobs to run under attack by developing distributed cloud defenses and dynamic trust models and by introducing diversity into the cloud.


Early results confirm the usefulness of Nash bargaining techniques. These techniques, which aim to maximize the product of utilities, can be used to rapidly adjust resource allocations while preserving critical mission expectations—tying business priorities to technology prioritization. The MRC program’s leaders hope it can lead to new libraries for scalable, high-performance, disruption-tolerant computation and new techniques to detect, isolate and respond to intrusions in cloud computing environments.
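A minimal sketch of the Nash bargaining idea at toy scale, with illustrative utilities and critical minimums: shares are chosen to maximize the product of each mission’s utility above its disagreement point, so no mission is starved to benefit another.

```python
from itertools import product
from math import prod


def nash_allocation(capacity: int, utilities: dict, floors: dict) -> dict:
    """Brute-force Nash bargaining over integer resource shares: maximize
    the product of each mission's utility gain above its disagreement point
    (its critical minimum)."""
    missions = list(utilities)
    best, best_value = None, -1.0
    for shares in product(range(capacity + 1), repeat=len(missions)):
        if sum(shares) != capacity:
            continue
        gains = [utilities[m](s) - floors[m] for m, s in zip(missions, shares)]
        if any(g <= 0 for g in gains):   # every mission must beat its floor
            continue
        value = prod(gains)
        if value > best_value:
            best, best_value = dict(zip(missions, shares)), value
    return best


# Two missions sharing 10 units; utility = units delivered, with critical
# minimums of 2 and 1 units respectively (illustrative numbers).
nash_allocation(
    10,
    {"surveillance": lambda s: s, "logistics": lambda s: s},
    {"surveillance": 2, "logistics": 1},
)  # {'surveillance': 5, 'logistics': 5}: maximizes (s - 2) * (l - 1) = 12
```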

At the business layer. Architecting resiliency at the business layer depends on being able to articulate the threats in business terms, not just at the technology layer. Leading organizations are able to translate threat outcomes into business outcomes so they can better understand the impact and prioritize their attention.

Security vendors such as Factonomy, Cytegic and InfusionPoints offer services and tools to analyze the potential impact of cyber-attacks. And Symantec is offering a new product, the Security Simulation Platform, which allows the enterprise to create a simulation environment and thus to conduct Game Day-like exercises in which different kinds of attacks are launched. Simulation scenarios like these can quickly expose the weak link in the security chain—business processes as well as IT components—and can help guide security architecture planning and security-related investments. These scenarios are also useful for training information security staff.

Many of the techniques described here are ready for use today and can be employed as part of an enterprise’s resiliency strategy. Other promising solutions are still emerging. Security organizations need to start addressing resiliency at all three layers and look to these emerging models and approaches for strategic input. Articulating a threat as a challenge to a business process or function, exercising the enterprise, and approaching the enterprise attack surface as a programmable asset will provide leap-ahead capabilities to organizations willing to architect for resiliency.


NOTES

Digital-Physical Blur

i "Axeda IoT Platform," Axeda, 2014. http://www.axeda.com/products/IoT-platform

ii "Integrating Trust Establishment into Routing Protocols of Today's MANETs," in 2013 IEEE Wireless Communications and Networking Conference (WCNC): Networks, Trusted Computing, 2013. http://www.trustedcomputing.eu/cms/wp-content/uploads/2013/05/TrustMANET_WCNC_2013_Proceedings.pdf

iii "Protecting Consumer Privacy in an Era of Rapid Change," Federal Trade Commission Report, March 2012. http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf

iv "EU Directive 95/46/EC—The Data Protection Directive," EU Data Protection Commissioner, 1995. https://dataprotection.ie/viewdoc.asp?DocID=93

From Workforce to Crowdsource

i "Rebooting Work in the Cloud," LiveOps Blog, February 19, 2013. http://www.liveops.com/blog/tag/workplace/

ii "Coverity Scan: 2011 Open Source Integrity Report," Coverity, 2011. http://www.coverity.com/library/pdf/coverity-scan-2011-open-source-integrity-report.pdf

iii VeriGames website. http://www.verigames.com

iv "Protecting Customer Data from Government Snooping," Microsoft Official Blog, December 4, 2013. http://blogs.technet.com/b/microsoft_blog/archive/2013/12/04/protecting-customer-data-from-government-snooping.aspx

Data Supply Chain

i "Big News Day on the Cloud Application Security Front," Forbes, January 28, 2014. http://www.forbes.com/sites/benkepes/2014/01/28/big-news-day-on-the-cloud-application-security-front/

ii "FTC Issues Final Commission Report on Protecting Consumer Privacy," Federal Trade Commission, March 26, 2012. http://www.ftc.gov/news-events/press-releases/2012/03/ftc-issues-final-commission-report-protecting-consumer-privacy

iii "F.T.C. Warns Data Firms on Selling Information," New York Times, May 7, 2013. http://www.nytimes.com/2013/05/08/business/ftc-warns-data-firms-on-selling-information.html?_r=0

Harnessing Hyperscale

i "Accenture Technology Vision 2014," Accenture, January 27, 2014. http://www.accenture.com/technologyvision

ii "Security Drives Major Transition in the Network and Data Center," Cisco blog, November 23, 2013. http://blogs.cisco.com/security/security-drives-major-transition-in-the-network-and-data-center/

iii "Snort—Lightweight Intrusion Detection for Networks," in Proceedings of LISA '99: 13th Systems Administration Conference, November 7-12, 1999. http://www.cse.wustl.edu/ANCS/2007/papers/p165.pdf

iv "Arista Outpaces Cisco Again with FPGA Switch," Network Computing, March 27, 2012. http://www.networkcomputing.com/next-gen-network-tech-center/arista-outpaces-cisco-again-with-fpga-sw/232700283

v "New Infonetics Report Projects Data Center and Enterprise SDN Market Will Top $3 Billion by 2017," Infonetics Research, December 9, 2013. http://www.infonetics.com/pr/2013/Data-Center-and-SDN-Market-Highlights.asp


vi "Neutron," OpenStack webpage: https://wiki.openstack.org/wiki/Neutron; "Dell and Red Hat to Co-Engineer Enterprise-Grade, OpenStack Private Cloud Solution," Dell press release, December 12, 2013. http://www.dell.com/learn/us/en/uscorp1/secure/2013-12-12-dell-cloud-red-hat-linux-openstack

vii "Defending Against Attacks on Main Memory Persistence," in Proceedings of the Annual Computer Security Applications Conference (ACSAC) 2008, December 8-12, 2008. http://www.cse.psu.edu/~asmith/pubs/2008/nvmm.pdf

viii "Memory Encryption Breakthrough Claimed by NC State Researchers," Network World, May 27, 2011. http://www.networkworld.com/news/2011/052711-nc-state-encryption.html

ix "Biologically Inspired: How Neural Networks Are Finally Maturing," ComputerWorld, December 17, 2013. http://www.computerworld.com/s/article/9244869/Biologically_inspired_How_neural_nets_are_finally_maturing

Business of Applications

i "SDL for Agile," Microsoft website. http://www.microsoft.com/security/sdl/discover/sdlagile.aspx

Architecting Resilience

i "Cloud Security Alliance Releases Software Defined Perimeter (SDP) Framework Details," Cloud Security Alliance Research News, December 5, 2013. https://cloudsecurityalliance.org/media/news/csa-software-defined-perimeter-details/

ii "Cloud Security Alliance Launches Secure Network Effort," Network Computing, November 21, 2013. http://www.networkcomputing.com/next-generation-data-center/servers/cloud-security-alliance-launches-secure/240164194

iii "DARPA's Mission-Oriented Resilient Clouds (MRC) Program," DARPA website. http://www.darpa.mil/Our_Work/I2O/Programs/Mission-oriented_Resilient_Clouds_(MRC).aspx


Copyright © 2014 Accenture All rights reserved.

Accenture, its logo, and High Performance Delivered are trademarks of Accenture.

ABOUT ACCENTURE

Accenture is a global management consulting, technology services and outsourcing company, with more than 293,000 people serving clients in more than 120 countries. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world’s most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. The company generated net revenues of US$28.6 billion for the fiscal year ended Aug. 31, 2013. Its home page is www.accenture.com.

CONTACTS

For more information

Ryan LaSalle
Global Managing Director—Accenture Security
[email protected]

Lisa O’Connor
Accenture Technology Labs Security R&D
[email protected]

Malek Ben Salem
Research Manager
[email protected]

www.accenture.com/technologyvision