Cognitive Era - IBM Systems Magazine

JANUARY 2016
SPECIAL REPORT:
Big Data for Analytics Services
IT Infrastructure for the
Cognitive Era
Helping IT leaders become better service providers
INSIDE
MSP TechMedia
220 S. 6th St., Suite 500,
Minneapolis, MN 55402
(612) 339-7571
Direct editorial inquiries to [email protected]
EDITORIAL
EXECUTIVE PUBLISHER
Diane Rowell
PUBLISHER
Doug Rock
EXECUTIVE EDITOR
Evelyn Hoover
COPY EDITOR
Holly Eamon
PRODUCTION
ART DIRECTOR
Jill Adler
PRODUCTION MANAGER
Tim Dallum
CIRCULATION COORDINATOR
Carin Russell
FULFILLMENT COORDINATOR
Valerie Asante
ADVERTISING/SALES
ASSOCIATE PUBLISHER
Mari Adamson-Bray
ACCOUNT EXECUTIVE, NORTHEAST, NORTHWEST & CANADA
Kathy Ingulsrud
ACCOUNT EXECUTIVE, SOUTHEAST, SOUTHWEST & ASIA PACIFIC
Nicole Johann
PROJECT MANAGER
Elizabeth Reddall
ACCOUNT EXECUTIVE, MIDWEST & EUROPE
Darryl Rowell
CIRCULATION
SALES AND MARKETING
DEVELOPMENT MANAGER
Katie Vosbeek
CIRCULATION DIRECTOR
Bea Jaeger
CIRCULATION MANAGER
Linda Holm
3 IBM PERSPECTIVE
IT analytics services for the cognitive era: are you ready?
4 AN INSIGHT-DRIVEN WORLD
API integration brings data analytics into the cognitive era
8 DRIVING BETTER OUTCOMES
In-place analysis enables faster, smarter predictive analytics
12 GAME-CHANGING ANALYTICS STARTS WITH INFRASTRUCTURE
Power Systems is a key driver in clients’ digital transformation
17 TRANSFORMING INFRASTRUCTURE
Organizations maximize storage to deliver faster insights
and cost savings
© Copyright 2016 by International Business Machines (IBM) Corporation. This magazine could contain technical inaccuracies or typographical errors. Also, illustrations contained herein may show prototype equipment. Your system configuration
may differ slightly. This magazine contains small programs that are furnished by IBM as simple examples to provide an
illustration. These examples have not been thoroughly tested under all conditions. IBM, therefore, cannot guarantee or
imply reliability, serviceability, or function of these programs. All programs contained herein are provided to you “AS IS.”
IMPLIED WARRANTIES OF MERCHANTABILITY, NON-INFRINGEMENT AND FITNESS FOR A PARTICULAR PURPOSE ARE
EXPRESSLY DISCLAIMED.
IBM, the IBM logo, and ibm.com are trademarks or registered trademarks of International Business Machines Corporation in the
United States, other countries, or both. If these and other IBM trademarked terms are marked on their first occurrence in this
information with an asterisk (*), these symbols indicate U.S. registered or common law trademarks owned by IBM at the time this
information was published. Such trademarks may also be registered or common law trademarks in other countries. A current list of
IBM trademarks is available on the Web at “Copyright and trademark information” (ibm.com/legal/copytrade.shtml).
AIX
DB2
Domino
i5/OS
IBM Watson
IBM z13
Power
POWER
POWER7
POWER7+
POWER8
PowerLinux
Power Systems
PureSystems
Rational
Smarter Planet
System i
System p
System Storage
System z
Tivoli
z/OS
z Systems
The following (marked with an *) are trademarks or registered trademarks of other companies: Intel, Itanium and Pentium
are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and other countries.
Java and all Java-based trademarks and logos are trademarks or registered trademarks of Oracle and/or its affiliates. Linear
Tape-Open, LTO and Ultrium are trademarks of HP, IBM Corp. and Quantum in the U.S. and other countries. Linux is a registered
trademark of Linus Torvalds in the United States, other countries, or both. Microsoft, Windows and Windows NT are trademarks
of Microsoft Corporation in the United States, other countries, or both. UNIX is a registered trademark of The Open Group in
the United States and other countries. Other product and service names might be trademarks of IBM or other companies.
All customer examples cited represent the results achieved by some customers who use IBM products. Actual environmental costs and performance characteristics will vary depending on individual customer configurations and conditions.
Information concerning non-IBM products was obtained from the products’ suppliers. Questions on their capabilities should
be addressed with the suppliers.
All statements regarding IBM’s future direction and intent are subject to change or withdrawal without notice and represent goals and objectives only. The articles in this magazine represent the views of the authors and are not necessarily
those of IBM.
Publications Agreement No. 40063731, Canadian Return Address, Pitney Bowes, Station A, PO Box 54, Windsor, Ontario
Canada N9A 6J5
[email protected]. Printed in the U.S.A.
IBM PERSPECTIVE
IT Analytics Services for the
Cognitive Era: Are You Ready?
Business innovation is happening at ever-increasing speeds as
IT leaders look for the latest and best capabilities to facilitate
their digital transformation in the cognitive era.
For service providers in the
age of hybrid cloud, cognitive
computing and real-time analytics
capabilities are two key drivers of
innovation. Given the breakneck
pace of business, companies that
want to compete have to be in
the fast lane—and that means
delivering intelligent insights in
real time, without delay. Through
cognitive computing, systems that
can sense, learn and adapt are
making faster analytics possible.
Cognitive systems go beyond
what traditional programming
can do by learning and adapting
as new information becomes
available. They enable computing
at an enormous scale and give
businesses the power to out-think
their competition.
By using the latest real-time analytics services and
locating data optimally for
analysis, companies can access
an unprecedented level of
business visibility and insights.
Stream processing addresses
data in motion—data coming
in at high speed that requires
high-throughput, low-latency
processing. In-transaction
analytics embed the analytic
capability in the transactions
themselves, exercising more
profitable business actions
during interactions with users.
Both stream processing and in-transaction analytics are helping
organizations derive better
business outcomes when the
action window is milliseconds.
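To make the idea of processing data in motion concrete, here is a minimal Python sketch of a sliding-window stream handler. It is illustrative only: the event fields, the two-second window and the alert threshold are assumptions, not a specific IBM streaming product.

```python
# Minimal sketch of stream processing on "data in motion": events are scored as
# they arrive and aggregated over a short sliding window, so an action can be
# taken within a millisecond-scale decision window.
# The event fields, window size and threshold are illustrative assumptions.
import time
from collections import deque

WINDOW_SECONDS = 2.0
window = deque()  # (timestamp, amount) pairs currently inside the window

def on_event(event):
    """Handle one incoming event without landing it in a database first."""
    now = time.time()
    window.append((now, event["amount"]))
    # Evict events that have slid out of the window.
    while window and now - window[0][0] > WINDOW_SECONDS:
        window.popleft()
    # A toy in-flight decision: flag a burst of high-value activity immediately.
    total = sum(amount for _, amount in window)
    if total > 10_000:
        act_in_real_time(event, total)

def act_in_real_time(event, windowed_total):
    print(f"alert: {windowed_total} seen in the last {WINDOW_SECONDS}s "
          f"(latest account: {event['account']})")

# Example feed; in practice events would arrive from a message bus or socket.
for e in [{"account": "A1", "amount": 6_000}, {"account": "A1", "amount": 5_500}]:
    on_event(e)
```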
The innovations happening
today in cognitive computing
and real-time analytics
drive infrastructure design
requirements. How do businesses
address and monetize the natural
resource of big data? Do we have
the infrastructure capabilities to
make real-time insights a reality?
Today’s big data comes
in massive quantities and a
variety of forms (i.e., structured,
unstructured, at rest and in
motion). To capture the largest
volume and variety of data,
businesses need data acquisition
tools that can ingest data rapidly.
Real-time ingestion makes real-time
decision making possible.
The ability to efficiently locate
data in the right place, at the right
cost, with the right protection, is
also critical for implementation of
analytics services.
Finally, delivering analytics
services depends upon servers
and storage that are designed for
insights: high-speed processors,
hardware accelerators, low-latency storage and a large
memory capacity.
Systems designed for data and
analytics can enable smarter,
faster business value from big
data and, ultimately, better
customer experiences that drive
business growth. IBM servers
and storage have the strength
to support today’s analytics
workloads; our infrastructure
capabilities are built to support a
new age of analytics computing.
Only one question remains:
is your business ready
for the cognitive era?
NIN LEI
Distinguished
Engineer, CTO,
Analytics and
Big Data,
IBM Systems
An Insight-Driven World
API integration brings data analytics into the cognitive era
All organizations are looking for ways to create exceptional customer experiences that will drive
business growth and generate loyalty—and data insights will be key to rewarding outcomes.
Successful IT leaders are providing enterprise-wide analytics services that enable real-time insight
from a broad and diverse set of big data sources. The ability to acquire and integrate external data with
internal operational data is a foundational step to provide relevant, contextual and insight-driven content to
users. As a result, businesses must master the API economy and bring the power of cognitive to their data.
In an insight-driven world,
data acquisition is a critical
infrastructure capability. By rapidly
capturing the largest volume and
variety of data, and then enabling
cognitive technologies that sense,
learn and adapt, companies can
put more of their data to work to
achieve their business goals.
Such technology trends are
opening doors to new ways
of doing business. They are
strengthening our ability to
optimize business processes,
propel innovation and revenue
growth, and ultimately offer
customers meaningful,
personalized interactions that
generate loyalty.
Cognitive businesses take
maximum advantage of their
data by:
• Integrating it in hybrid clouds
through the standards-based
API economy
• Incorporating new cognitive
capabilities
• Driving better processes and
decision making through
analytics insights
So, how do APIs and cognitive
technology make it possible to
integrate data and bring analytics
into the cognitive era to enable
your business to deliver compelling
consumer experiences and open
new revenue channels?
Connecting Data Sources
Through API Integration
To get the most out of a huge
volume and variety of data, and
identify and mine the right insights,
companies must integrate their
existing business applications
with the oceans of unstructured
data coming from external sources.
Hybrid cloud makes this possible
by bringing together systems
of record (e.g., data in existing
business applications such as ERP
and CRM systems) and systems of
engagement (e.g., external social,
mobile and Internet of Things
data). A hybrid cloud architecture
spans traditional IT, private cloud,
dedicated or shared public cloud,
and third-party services and data.
By adopting a hybrid approach,
organizations can expose their
business logic and apps as APIs,
opening up new possibilities for
integration and innovation.
An API is a set of routines, protocols and tools for building software applications that act as a technology glue to connect data and business systems through the cloud. APIs help developers design, build, deliver and manage composable services and business processes. In terms of business value, they ensure secure integration across enterprise IT and hybrid clouds in a way that unlocks new value.
The API economy enables much innovation today, helping companies expose their core business assets and data through APIs to a system of developers, customers or partners. To take advantage of this, you must adopt an API strategy as part of your overall digital transformation.
API integration makes organizational data available for analytics; this is where we find ways to leverage data for business growth and open up new possibilities. Standards-based API integration across the business ecosystem can help you monetize your data and provide rich customer experiences that are the basis of competitive advantage.
An enterprise service bus can enable systems and applications to communicate with each other and provide a platform for the API economy and analytics. Messaging middleware can speed the integration of diverse applications and business data across multiple platforms. Finally, API management platforms can help enterprises create, deploy and manage their APIs in a secure, controlled fashion. IBM Middleware provides all of these capabilities—and more.
Connecting data sources in a hybrid cloud, through middleware capabilities that support API integration, makes it possible for organizations to capitalize on the oceans of data available from internal and external sources.
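As a concrete illustration of exposing business data through an API, the following sketch uses Flask as a stand-in web framework; the endpoint, fields and sample record are hypothetical and are not drawn from any IBM product.

```python
# A minimal sketch of the pattern described above: a system-of-record lookup
# exposed as a REST API so that systems of engagement (mobile apps, partners)
# can integrate with it over a hybrid cloud. The endpoint and data are made up.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for data held in an existing business application (ERP/CRM).
CUSTOMERS = {
    "1001": {"name": "Acme Corp", "segment": "retail", "open_orders": 3},
}

@app.route("/api/v1/customers/<customer_id>", methods=["GET"])
def get_customer(customer_id):
    """Expose one business entity through a stable, documented API contract."""
    record = CUSTOMERS.get(customer_id)
    if record is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(record)

if __name__ == "__main__":
    app.run(port=8080)
```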
Bringing Cognitive
Power to Your Data
The connectivity and integration
made possible through the
API economy also create
a foundation for cognitive
business. As more APIs are
created, the ability to innovate
will depend largely on an
organization’s capacity to find
and work with the right ones—
and this is where cognitive can
help. Cognitive systems think,
learn and adapt to evolving
business contexts. They
enable companies to analyze
information and services to
decide which APIs they should
create and expose.
Applying cognitive APIs across organizational data will help drive decision making based on a richer understanding of all of your data sets. But harnessing cognitive technologies requires building an architecture that allows you to quickly and flexibly plug in cognitive capabilities.
Data acquisition is a key competency for creating IT analytics services that deliver insights, inform operational decisions and enable superior customer experience. You must capture the data quickly and enable cognitive APIs that sense and learn from it. Time is of the essence when it comes to data analysis, and cognitive APIs can speed a company’s ability to get data-driven answers.
IBM is offering new services and software that allow companies to get greater use of the API economy and cognitive technology. IBM API Harmony, announced in 2015, is a tool for building applications in the cognitive era. It uses technologies like intelligent mapping and graph technology to anticipate what a developer will require to build new apps, make recommendations on which APIs to use, show API relationships and identify what is missing. IBM DataPower Gateway is a security and integration platform that provides the API gateway to enforce API security and control, allowing businesses to expand the scope of their data and other IT assets to new channels.
The ways we can access and learn from big data in the cognitive era are expanding rapidly, and harnessing the power of cognitive APIs can inject greater intelligence into your business decisions.
Reinventing Processes for
Better Decisions
Success in the digital age
requires businesses to do
work differently because how
work gets done can redefine
the customer experience.
It’s not enough to integrate
data through APIs and adopt
cognitive computing; you
also have to reinvent the
processes your technology
supports to convert your digital
transformation into a true
transformation of the way your
organization operates.
Bringing cognitive
capabilities into business
operations can help
organizations better sense what
is happening and act quickly
and consistently. Cognitive
decision support tools can
learn from every interaction
with people, processes,
systems, data and devices—
thus adapting in an evolving
digital environment.
Cognitive business operations
enhance the expertise of a
workforce and influence the
following needs:
• How we find context and
understand the relationships
between data
• The application of predictive
analytics and business rules
• Real-time, continuous
delivery of data intelligence
IBM operational decision
management solutions
enable businesses to codify
their policies, practices and
regulations to manage decision
logic; empower business users
to take control of business logic;
and automate decision making
with context.
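To make the idea of codified decision logic concrete, here is a minimal Python sketch of named business rules applied automatically to a transaction. The rules and thresholds are invented for illustration and do not reflect IBM operational decision management's rule language.

```python
# A minimal sketch of codified decision logic: policies are written as small,
# named rules that business users could own, and every decision is taken
# automatically and consistently. Rules and thresholds are illustrative only.

RULES = [
    # (rule name, condition on the transaction, resulting action)
    ("high_value_review", lambda t: t["amount"] > 5_000,      "route_to_review"),
    ("loyal_customer",    lambda t: t["customer_years"] >= 5, "offer_discount"),
    ("default",           lambda t: True,                     "approve"),
]

def decide(transaction):
    """Return the first action whose rule matches, so decisions stay consistent."""
    for name, condition, action in RULES:
        if condition(transaction):
            return name, action

print(decide({"amount": 7_200, "customer_years": 2}))  # ('high_value_review', 'route_to_review')
print(decide({"amount": 120, "customer_years": 8}))    # ('loyal_customer', 'offer_discount')
```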
Cognitive operations result in
real business outcomes—fast,
seamless, insightful responses to
customer needs.
An Infrastructure
Designed for Insights
Evolving your infrastructure for
digital transformation can help
boost revenues, inspire customer
satisfaction and deliver greater
value to your users.
An infrastructure built for
the cognitive era offers agile,
continuous delivery across the
mix of cloud and traditional
IT that provides the best
performance, cost and risk
profile for the needs of your
business at any given time. It is
built on open standards, allows
you to build on your existing
IT investment and is flexible
enough to change at the speed of
digital business.
These are the hallmarks of a
hybrid cloud architecture designed
for data. Are you ready?
David Crozier is a technology
marketing expert with 20 years of
experience across the IT industry.
The digital revolution
has shifted the
IT infrastructure
conversation to a
strategic discussion
IBM Institute for Business Value (IBV) asked IT
executives about the challenges they face competing
in the emerging digital world. Read the results in the
IBV report “New Technology, New Mindset.”
To register for the IBV report or learn more, visit
ibm.com/systems/infrastructure-report
DRIVING BETTER
OUTCOMES
In-place analysis enables faster,
smarter predictive analytics
By Paul DiMarzio and Mythili Venkatakrishnan
What if you could always
anticipate customer
needs? In the cognitive era, delivering superior
customer service requires businesses to evolve their infrastructure for smarter analytics and
real-time response.
How can banks, for example,
prevent fraudulent transactions
without flagging legitimate ones
or slowing down the process,
thus increasing customer
confidence without sacrificing
satisfaction? Or how can retailers
provide exceptional point-of-sale experiences for consumers
by offering them personalized
suggestions instantly and in
context? Regardless of the
industry, customer experience
is the consumer battleground
today—and in the age of cognitive
business, real-time analytics is
at the heart of delivering great
customer service.
Organizations face numerous
challenges on this front: decades
of managing operations and
analysis as separate practices
have resulted in a massive
volume and variety of data
distributed over multiple,
disparate platforms. Analytics
solutions are often aligned to
specific architectures and tied to
inflexible programming models.
Ongoing manual intervention is
required to integrate data into
coherent analytics solutions.
All of these complications cost
businesses time—a key asset
they can’t afford to waste when
it comes to delivering real-time
customer service.
IT departments must become
the trusted service providers to
their organizations and ecosystem partners, and therefore
are innovating to address these
challenges with the latest technology solutions that deliver immediate insights, inform operational
decisions and enable superior
customer experience. In-place
data analysis through integrated
IT systems is facilitating faster,
smarter predictive capabilities and
helping organizations drive better
outcomes for their customers.
When every millisecond counts, an
infrastructure designed to handle
both operations and analysis can
make all the difference.
Speeding up Data Analysis
A single interaction can make or
break a customer’s experience
with your brand. In such an
environment, speed of response
can be a differentiator, and
integrating analytics services
into operational systems
is one way to design an
infrastructure for faster insights.
Through capabilities like
in-place predictive analytics,
organizations are identifying,
adapting and acting on their
data in real time. They are
predicting outcomes before they
happen, all while saving on the
cost and risk of moving data
from one place to another.
Predictive analytics helps
organizations better understand
customers, ensure good customer
experience and stand out against
the competition. So, how do you
deliver forward-looking analytics
services that are deeply embedded
into your operational systems?
Processing Data at the Point of Origin
First, let’s think about effective placement of data and applications. Traditionally, data residing in operational systems is copied, moved to a centralized location and then analyzed. The copying and transfer of data, however, run counter to the concept of real-time analytics because each step in the process takes time. Not only that, but moving highly sensitive data increases the risk of a security breach.
A new approach has evolved to address these challenges: processing data at the point of origin. “In-place analysis” simply means processing operational data where it resides instead of moving it to another location. This removes the time, cost and security risk involved in moving operational data from the equation, which makes it possible to speed up analytics and, ultimately, address customers more quickly.
A Hybrid System for More Agile Analysis
The latest hybrid technology is fusing transactional data and analytics in systems that make real time more real than ever before. The majority of high-value data enterprises possess today resides on the z Systems* platform, and several IBM z* solutions have evolved to address pressing analytics challenges.
IBM DB2* Analytics Accelerator for z/OS* is one such solution. It’s a single, integrated system combining transactional data and historical data for highly accelerated business analysis and reporting. By co-locating analytics and operations, businesses can gain more instantaneous insights to help them respond to customer wants and needs before they even happen. This sort of capability enables a retailer to offer a customer that in-context upsell or cross-sell, or a financial institution to mitigate risk by preventing fraudulent transactions. The DB2 Analytics Accelerator for z/OS is a hybrid platform built on the security, safety and reliability of z Systems. It enables in-place data analysis, saving organizations precious time and helping them drive smarter outcomes for their customers.
In-place analytics solutions for z Systems can do more than provide in-the-moment response for customers—they make the anticipation of customer needs a reality. This capability to process data at the point of origin means more embedded analytics instead of disconnected analytics. It enables organizations to stand out by knowing what is going to happen before it happens.
Predictive analytics services bring prediction to in-place data processing. Through advanced machine learning and statistical methods, predictive analytics helps businesses discover patterns and create models, which are then executed within the scope of a transaction for operational decision making.
In-database processing improves the responsiveness and accuracy of a predictive model so organizations can anticipate their customers’ wants, needs and motivations. Instead of wasting time on data transfer and complex reporting processes, companies can seek more automated, intelligent decisions.
For predictive analytics
services to deliver optimal
performance, the data must
be close to the analytics tool—
which brings us back to the
idea of in-place analysis. Any
distance between the data and
the decisions creates delays;
therefore, an integrated system
that combines mainframe
hardware and analytics software
with business processes can
be a key differentiator. New
capabilities of the IBM DB2
Analytics Accelerator enable
advanced predictive models to be
built from z/OS data with speed
and accuracy.
Companies whose operations
are built on IBM z Systems have
access to tools that make it
possible to perform advanced
analytics in the scope of a
z/OS transaction. IBM SPSS*
Modeler and SPSS Statistics are
predictive analytics software
solutions that work with new
capabilities of the IBM DB2
Analytics Accelerator for z/OS
to provide real-time analytics
through predictive modeling.
Once a predictive model
is built, the most effective
execution of that model involves
deploying it directly into the
transactional system. IBM
Business Partner Zementis has
made its predictive analytics
engine available on z/OS for this
purpose—Zementis for z Systems
is a Predictive Model Markup
Language (PMML) engine that
installs in the transactional
application and can execute a
predictive score in just a few
milliseconds. These capabilities
enable businesses to place the
analytics directly in line with
their transactions and data
to drive successful customer
interactions through real-time
predictive analytics services.
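As a rough illustration of scoring inside the transaction path, the following plain-Python sketch evaluates a pre-trained logistic model inline. The features, coefficients and threshold are assumptions, and this is not the Zementis or PMML API.

```python
# A minimal sketch of in-transaction scoring: a model trained offline is reduced
# to its coefficients and evaluated inline, in the same code path that processes
# the transaction, so scoring adds only a tiny amount of work per transaction.
# The model, features and threshold are made up for illustration.
import math

# Pre-trained logistic-regression fraud model (intercept + feature weights).
INTERCEPT = -4.0
WEIGHTS = {"amount_zscore": 1.3, "foreign_merchant": 0.9, "txns_last_hour": 0.4}

def fraud_score(features):
    """Probability-like score computed in place, with no hop to another system."""
    z = INTERCEPT + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def process_transaction(txn, features):
    score = fraud_score(features)
    if score > 0.8:  # threshold chosen for illustration
        return {"txn": txn, "decision": "hold_for_review", "score": score}
    return {"txn": txn, "decision": "approve", "score": score}

print(process_transaction("T-42", {"amount_zscore": 3.1,
                                   "foreign_merchant": 1,
                                   "txns_last_hour": 2}))
```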
Embedded Analytics for
Unparalleled Performance
and Insights
Businesses are looking for ways to
more easily gain insight from all of
their data; shifting toward a more
embedded analytics environment
can make a big difference in
delivering insights that change the
tide in customer experience. But
not all data is in IBM DB2 or on
the mainframe.
The data available to
organizations today comes
from both external systems of
engagement (e.g., social media
and mobile device data) and
internal systems of record
(e.g., transactional enterprise
data held in operational systems).
Both types of data are highly
valuable to organizations
seeking to understand customer
behavior and drive better
customer experiences, and the
greatest insight comes from
combining them.
In the past, applying analytics
services to multiple data sets
in different locations has been
complicated—especially given
the requirement for real-time
response. We need the ability
to analyze heterogeneous,
potentially distributed data
sources without moving the data.
For diverse data sources,
Apache Spark enables the
“democratization of data”—
helping companies use common
standards to gain faster insights.
Apache Spark is an analytics
framework and platform that
can help meet the demands
for faster analytics services by
offering a federated analytics
approach, meaning that data can
be analyzed in place for better
security and optimized speed.
Spark is an open-source cluster
computing framework with
in-memory processing and is
being enabled natively for z/OS
as well as Linux* on z Systems.
It is not reliant on a specific
file system or platform, and it
offers a unified programming
environment; support for diverse
programming languages; and
a standard framework for common analytic methodologies.
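A minimal PySpark sketch of this federated style follows; the file paths and column names are hypothetical, and the z/OS-specific connectors described below are not shown.

```python
# A minimal PySpark sketch of the "federated" idea above: one programming model
# reads from more than one kind of source and joins them for analysis, without
# first consolidating everything into a single warehouse. Paths and columns are
# hypothetical; platform-specific connectors are omitted.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("federated-sketch").getOrCreate()

# Two heterogeneous sources: transactional extracts and semi-structured events.
transactions = spark.read.parquet("/data/transactions")    # structured
clickstream = spark.read.json("/data/clickstream.json")    # semi-structured

insight = (
    transactions.join(clickstream, on="customer_id", how="inner")
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("spend"), F.count("page").alias("visits"))
    .orderBy(F.desc("spend"))
)
insight.show(10)
spark.stop()
```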
IBM Business Partner Rocket
Software has developed a new
product, Rocket Mainframe
Data Service for Apache Spark
on z/OS, to enable Spark to be
used with a variety of z/OS data
types. When most of the data that
is going to feed Spark analytics
resides on z/OS, running
Spark on z/OS with integrated
support from Rocket delivers
performance, security and
colocation advantages.
Spark is a platform with
rapidly growing adoption and
is giving us a fresh look at
how analytics are constructed
and performed in today’s
enterprises.
Spark on z Systems is another
capability that can help
organizations use the strengths
of transactional environments
and their high-value data
(structured, unstructured, at
rest or in motion) to offer faster,
smarter insights.
Make Real Time a Reality
To deliver exceptional
experiences that engage and
delight customers, businesses
need fast analytics services made
for the cognitive era. Capabilities
like in-place analysis, predictive
analytics services, and a
federated approach to data
analysis are all making real-time
results a reality—and they are all
available in the z Systems family.
Companies that want to
compete on the consumer
battleground are adopting a
strategy that analyzes data
in place for truly real-time
responsiveness. They are
building predictive intelligence
into their core operations so
they can gain easier insight and
anticipate customer needs. IBM
z Systems analytics services
allow each organization to add
insight-generating capabilities
to core operations at their own
pace. Are you ready to take
the next step in your digital
transformation and dive into the
era of cognitive business?
Paul DiMarzio is a mainframe strategist with nearly 30 years’ experience
with IBM focused on bringing new and
emerging technologies to the mainframe. He’s currently responsible for
developing and executing IBM’s worldwide z Systems big data and analytics
portfolio marketing strategy.
Mythili Venkatakrishnan is an
IBM senior technical staff member
and is the z Systems Architecture and
Technology lead for analytics.
Outthink your limits at
InterConnect 2016
With more than 2,000 sessions spanning the entire lifecycle
of IT, IBM InterConnect 2016 is the only conference capable
of equipping your organization with flexible cloud and mobile
solutions that are built for security, powered by cognitive and
infused with advanced analytics. Register today to start building
the foundation of your cognitive, customer-driven enterprise.
ibm.com/interconnect
Game-Changing Analytics Starts With Infrastructure
Power Systems is a key driver in pushing clients’ digital transformation
By Keshav Ranganathan
Data is the new basis of
competitive advantage
for businesses, and it is
driving digital transformation
for many companies today.
Trailblazing organizations are
using data to accelerate insights
and decision making. They are:
• Applying analytics across
many different data
sources inside and outside
the enterprise
• Capturing the time value of
data to help deliver real-time insight to improve
business decisions
• Gearing up for cognitive
computing—a paradigm
where systems themselves
can hypothesize, learn and
improve over time
As organizations become data-savvy and insight-driven, analytics
become mission critical. This
places additional emphasis on
the infrastructure needed to
deliver analytics services across
the enterprise.
Integrating
Data Environments
Insight-driven companies are
infusing analytics in everything
they do. From attracting, growing
and retaining customers to
optimizing operations while
countering fraud and threats
to transforming financial and
management processes—every
aspect of the business is improved
by analytics. To deliver game-changing analytics capabilities, you
must evolve your IT infrastructure
to address data and analytics
services. The infrastructure must
help with acquisition, placement
and management of all types of
data—structured, unstructured,
data at rest and in motion—and
provide compute capacity that
enables delivery of a range of
analytics services.
A first step is to seamlessly
integrate heterogeneous data
environments across your business.
Systems of record represent sources
of structured data in relational
databases and data warehouses
from ERP, CRM and financial
systems. Systems of engagement
represent unstructured data from
social media, surveys, public
records and sensors that helps you
understand customers, partners
and employees. These diverse data
types come together in systems
of insight, which enable delivery
of data and analytics services
to ensure that every decision,
interaction and process is fueled
by data and advanced analytics—
faster and in a more personalized
fashion (see Figure 1 on page 14).
Building on a
Solid Foundation
A high-performance infrastructure
that’s scalable for varying
workloads, highly available and
optimized for price performance
is critical. It provides seamless
integration of analytics services to
drive better business outcomes and
allows companies to build on what
they already have while adding new
capabilities as their needs grow.
IBM is leading innovation across
the infrastructure stack to deliver
analytics as a service for our clients.
We are helping IT leaders become
trusted service providers in the
cognitive era through hybrid cloud.
So where do you begin?
IBM offers a broad set of
analytics capabilities built on
the proven foundation of a single
platform, IBM Power Systems*.
The open, secure and flexible
Power Systems platform is
designed for big data; it has
massive I/O bandwidth to
deliver analytics in real time,
and it can provide the capabilities
needed to handle the varying
analytics initiatives each
business requires.
Differentiated Value
for Analytics
In 2014, IBM announced
POWER8*—the first microprocessor
designed for big data and analytics.
The POWER8 microprocessor offers
numerous advantages for big data
and analytics solutions: processing
capability with a large number
of processor threads; memory
capacity and bandwidth; cache
workspace; and the capability to
move information in and out of the
system at the rapid speeds required.
With these advantages, it delivers
levels of performance you need to
make decisions in real time, helping
you capitalize on the currency of
data by finding business insights
faster and more efficiently. The
Power Systems platform is designed
for big data—from operational to
computational to business and
cognitive solutions. The systems
are optimized for performance
and can scale to support
growing workloads.
The data and application layers
are key elements of a big data and
analytics architecture.
On the left side of Figure 2 (see
page 15) is the data layer—that is,
all of the data discussed earlier.
A variety of data-management
options are needed to effectively
handle the avalanche of data,
including NoSQL databases;
relational databases; analytics-optimized, in-memory, columnar
databases; Hadoop; and
data warehouses.
On the right side is the
application layer—the key
analytics workloads, everything
from cognitive solutions around
IBM Watson* to Cognos* and
SPSS* to industry solutions,
that are integrated into business
processes. IBM offers all of
this on a single platform with
Power Systems.
Improving
Operational Simplicity
IT budgets are dedicated
mostly to management of the
IT environment rather than to
delivering new capabilities to the
business. With Power Systems
innovations, we address the needs
of structured and unstructured
data management with solutions
that simplify delivery of data
and analytics services and are
optimized for price performance.
Key solutions include:
• Structured data. For
structured data, IBM DB2*
with BLU Acceleration*
on Power Systems brings
faster analytic queries and
reports processing as well as
operational simplicity. IBM
DB2 with BLU Acceleration
is optimized to take
advantage of simultaneous
multithreading in POWER*
processors; it automatically
detects and exploits larger
cache sizes and memory
bandwidth. BLU Acceleration
technology takes advantage
of multiple cores, providing
consistent performance for
FIGURE 1
a large number of users.
With BLU Acceleration
and the latest LC line for
Power Systems, we saw 2.03x
more query results per hour
per core versus Intel* Haswell
servers, at 36 percent lower
hardware cost of acquisition.
• Unstructured data—NoSQL
database. The explosive
growth of new mobile, social
and cloud applications creates
a need for lightning-fast
response at high data volumes.
A growing number of NoSQL
databases are supported and
optimized on Power Systems.
IBM Data Engine for NoSQL
is an integrated platform that
uses the Coherent Accelerator
Processor Interface (CAPI)
in POWER8 to improve
infrastructure density. The
result is lower hardware,
maintenance and energy costs
with minimal performance
impact. Data Engine for NoSQL
can deliver up to 56 TB of
extended memory with one
POWER8 processor-based
server with CAPI-attached
Flash system—and without
sacrificing performance. That
means faster read-write speeds
to address the millions of
records of unstructured data
that don’t fit into traditional
SQL data structures. Redis
Labs’ key-value pair NoSQL
database is supported
today, and we are on track
to deliver a broader set of
NoSQL databases (e.g.,
graph, document, columnar)
in 2016. (A minimal key-value sketch appears after this list.)
• Unstructured data—Hadoop
and Spark (IBM BigInsights*).
IBM Data Engine for Analytics
is a customizable, pre-integrated infrastructure
solution with integrated
software optimized for big data
and analytics workloads. It is
designed to help companies
speed insights on massive
amounts of data and has the
flexibility to grow as clients’
needs change. The IBM Data
Engine is ideal for workloads
such as Spark and Hadoop.
It provides flexible storage
and compute resources
that are easy to deploy and
align to specific business
requirements. This means you
can move or add more storage
and compute when and where
it’s needed. In addition,
the recently introduced
S812LC is a cost-optimized
big data server that delivers
superior performance and
throughput for Spark and
Hadoop workloads.
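Referenced from the NoSQL bullet above, here is a minimal key-value sketch using the open-source redis-py client. The host, port and keys are illustrative, and whether the values ultimately live in DRAM or in CAPI-attached flash behind the server is transparent to this code.

```python
# A minimal key-value sketch: from the application's point of view the store is
# just keys and values; the server's backing memory technology is transparent.
# Host/port and keys are illustrative.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Write and read session-style records at high volume.
r.set("session:42", "user=alice;cart=3")
r.expire("session:42", 3600)            # expire idle sessions after an hour
print(r.get("session:42"))

# A counter pattern common in mobile and social workloads.
r.incr("pageviews:home")
print(r.get("pageviews:home"))
```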
Together, these solutions
deliver data services designed
for capitalizing on data in the
cognitive era—to help businesses
derive real-time decisions and
increase speed of innovation.
Accelerate Insights With
Analytics Applications
Making the best decisions means
understanding what’s happening,
why it’s happening, what could
happen in the future and what you
need to do.
IBM Solution for Analytics–
Power Systems Edition delivers a
solution for business intelligence
and predictive analytics. It’s a
flexible and integrated service
that provides options to pre-load
and configure one or more IBM
analytics applications with data
warehouse acceleration on a
POWER8 processor-based server.
These analytics tools, alongside
many others in the IBM analytics
suite, help turn big data into
actionable insights to help
companies address customer
retention and growth, IT costs,
management of security risks,
counter-fraud techniques,
optimization of IT operations and
digital innovation opportunities.
Together, they are helping
IT leaders undergo digital
transformation and become
stronger service providers in the
age of hybrid cloud.
Infrastructure Matters
for Big Data and Analytics
At IBM, hardware and software
organizations work together
to optimize the entire solution
stack for data and analytics. The
suite of capabilities built on IBM
POWER8 can help businesses
develop a thorough plan for
addressing their big data and
analytics needs in the cognitive
era. The Power Systems family is
optimized for performance and
can scale to support demanding
and growing capabilities for
delivering data and analytics
services while controlling
cost. As a foundation of IBM’s
comprehensive big data
and analytics portfolio, the
Power Systems platform is a key
driver in pushing businesses
forward in the race toward
digital transformation.
IBM’s breadth and depth
of solutions for big data and
analytics is unmatched. The
company is committed to
delivering infrastructure that will
contribute to your success today
and evolve to meet your changing
needs in the future. Welcome to
the cognitive era!
Keshav Ranganathan is the
Power Systems analytics offering
manager at IBM.
FIGURE 2 The platform designed for big data & analytics
On-demand webcast: New
technology, new mindset
Featuring Forrester Research and the
IBM Institute for Business Value
How does infrastructure impact APIs, data and analytics, and
scalable environments? View this on-demand webcast to
find out how market leaders are defining best practices for IT
infrastructure amid the shifting IT mindset.
ibm.co/NewTechNewMindset
Transforming
Infrastructure
Organizations maximize storage to
deliver faster insights and cost savings
By Tom Sullivan and Yael Shani
How does Netflix predict which movies you want to watch or Plenty of Fish
foresee who you’re going to fall in love with? How can Coca-Cola be confident
that there’s enough product on the delivery truck to meet customer demand in
a given retail location? How does a hospital ensure it can quickly access a patient’s
full medical history and provide real-time analytics-based treatment?
It all begins with data. Data
is emerging as the world’s
newest resource for competitive
advantage. The massive volume
and variety of information being
collected from social media,
Web logs, retail transactions,
industrial equipment, smart
devices and more has the power
to transform the way we live,
work and make decisions.
Data presents enormous
opportunities for businesses
today—but only if you can make
sense of it.
IT analytics services can deliver
immediate insights to help
companies innovate, optimize
operations, save costs, improve
services, manage risks, counter
threats and fraud, make critical
decisions, and engage with their
customers where and when it
matters most. These insights are
leading digital transformation
in today’s cognitive era, but
companies must evolve their
infrastructure to benefit from them.
Once You Know, You
Can’t Unknow
In a recent study titled
“Becoming an Analytics-Driven
Organisation,” EY found that 81
percent of organizations agree
that data should be at the heart
of decision making, but only 31
percent have restructured their
operations to do it.
If most organizations already
recognize the importance of data
analytics, why have so few made
the necessary changes to their
infrastructure to support it?
The 2014 IBM Institute
for Business Value study
“Analytics—The Speed
Advantage” found that:
• 63 percent of organizations
realize a positive return on
analytic investments within
a year
• 69 percent of speed-driven
analytics organizations
created a significant
positive impact on
business outcomes
• 74 percent of respondents
anticipate the speed at
which executives expect
new data-driven insights
will continue to accelerate
Given these statistics,
businesses can’t afford not to
strengthen their infrastructures
to deliver analytics services.
As the value of data continues
to grow, current systems won’t
keep pace. The right tools—with
the right infrastructure behind
them—can help companies
master hybrid cloud for
digital transformation.
Real-Time Insights You
Can Afford
How do you reconcile budget
constraints with the business
imperative of improving your
analytics capabilities?
Decision makers need faster
insights—the right numbers in
hand in real time so they can make
the most informed choices. This
means efficiently locating data
in the right place and running
analytics at the most optimal
location—next to the data. But
budget limitations are one of the
most often-cited challenges to
implementing analytics solutions.
With 70 percent of IT budgets
spent on running existing IT
systems and operations, there’s
not a lot left. Breaking this
maintenance loop can set a
business apart from competitors.
The starting point to getting
faster insights while saving on
costs is optimizing your current
infrastructure. Better utilization
of current systems and storage
directly affects budget because
the gains in efficiency mean
an immediate reduction of
maintenance costs. This in
turn frees up more funds for
digital transformation.
Companies can see a quick
ROI on analytics solutions when
they start the analytics journey
this way. It allows them to create
new opportunities for revenue
growth without increasing the
cost burden.
Addressing Performance
for Analytics
Choices around architecture
and infrastructure—systems,
software and storage
technologies—are critical
to delivering differentiated
customer experiences.
Collecting and analyzing
data isn’t enough; real-time
response is the key to meeting
client expectations. You need
an infrastructure that has been
intentionally designed to provide
unparalleled performance in
interacting with your most
valuable asset—your data. Faster
infrastructure leads to faster
insights and the ability to make
better-informed decisions, which
can actually contribute money to
the bottom line.
Take storage as an example.
IT storage infrastructure is a lot
like a physical warehouse. If
a warehouse is half filled with
empty boxes, that space isn’t
effectively being used. Many
organizations buy more storage
capacity (more buildings), but a
significant portion of what they
already own is wasted space.
For data, software-defined
storage (SDS) capabilities can
help companies genuinely use
their space. Not only that, but
SDS can help them compress
what is in each “box”—a lot like
packing boxes more effectively
in a warehouse to reduce
wasted space.
If you double utilization of
the warehouse by removing
empty boxes and store twice as
much in each box, your storage
infrastructure utilization can
improve fourfold. The same is
true with data placement. If
you want to efficiently locate
data and applications in the
right place at the right cost,
you’ll have to use policy engines
and analytics-driven data
management and move data
between storage systems without
disrupting users or applications.
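A small worked example of that arithmetic follows, with illustrative starting numbers.

```python
# A worked example of the warehouse arithmetic above: doubling how much of the
# existing capacity holds useful data, and then storing that data at a 2:1
# compression ratio, yields roughly a fourfold improvement in effective
# capacity. The starting numbers are illustrative.
raw_capacity_tb = 500        # what was purchased
utilization_before = 0.40    # share of capacity holding useful data today
utilization_after = 0.80     # after reclaiming "empty boxes"
compression_ratio = 2.0      # logical data stored per physical TB

effective_before = raw_capacity_tb * utilization_before                    # 200 TB
effective_after = raw_capacity_tb * utilization_after * compression_ratio  # 800 TB

print(f"effective data capacity: {effective_before:.0f} TB -> "
      f"{effective_after:.0f} TB ({effective_after / effective_before:.1f}x)")
```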
Know What Your
Customer Wants
Right Now
IBM clients are already benefiting
from data analytics services.
They are optimizing data
economics by implementing
data management solutions and
other innovative technologies
such as virtualization and
compression. The significant
business outcomes they achieve
demonstrate the importance of
having scalable performance
to handle ever-growing
structured and unstructured data
volumes, and the importance
of accelerating business
applications to enhance time
to insight.
Have you ever thought about
what it takes to have an ice-cold
Coke waiting at your favorite shop
at any time on any day? Coca-Cola Bottling Co. Consolidated
had to uncover deeper insight
into customer demand in
order to make this happen. By
implementing IBM FlashSystem*
technology, Coca-Cola processed
20x more forecasting data
within the existing overnight
window and delivered deeper
demand insights 4x faster. This
allowed the company to match
manufacturing output with
product demand, reduce the
risk of over- or understocking,
enable earlier logistics planning,
increase profitability and make
sure you can get a Coke whenever
you want it!
In today’s world, users expect an application to respond in less than a second and to be available 24-7-365. If their
experience doesn’t match their
expectations, they will move
on. The dating website Plenty
of Fish, with over 60 million
users, copes with hundreds of
thousands of new images every
day. IBM FlashSystem enabled
Plenty of Fish to experience a
500x reduction in latency for
viewing images on its site. Faster
response times help the company
retain millions of customers.
Netflix is another company
that relies on real-time data
processing and analysis to
provide its service. Netflix
allows its viewers to stream
movies and TV shows online or
directly to their television screen
using Xbox, Wii, PlayStation and
many other devices. By locating
its data with IBM XIV* Storage
System, the company can
support approximately 300,000
sub-millisecond database
transactions per minute with
no downtime. This enables
it to offer a seamless, high-quality service and make movie
recommendations tailored to its
users’ preferences.
Free Data From Hardware
Constraints
In addition to its $1 billion investment in R&D for FlashSystem solutions, IBM is redefining storage
economics with new software
by committing more than
$1 billion over five years to
developing next-generation
technology and leading the way
in SDS. IBM Spectrum Storage*
technology transforms how
storage is deployed and
managed to deliver clear
value around simplified data
management, advanced data
protection and data retention,
unlimited data scalability, and
improved data economics.
In a perfect world, data would
be placed in the fastest storage
for analysis and then moved
to lower-cost storage when
not in use. IBM uses policy
engines and analytics-driven
data management to put data
in the right place automatically,
based on usage, and move
data between storage systems
without disrupting users or
applications. The result? Clients
can run big data and analytics
projects and environments with
faster performance at a lower
total cost.
IBM Spectrum Scale*,
for example, manages big
data—both structured and
unstructured—reducing the cost
of storage by up to 90 percent
with automatic policy-based
storage tiering that moves
data from flash through disk
to tape and cloud tiers to help
accelerate analytics of new
workloads (social and mobile
applications), allocating key
data to highest performing tiers
and lowering costs by moving
“cold” data to lower-cost tiers.
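To illustrate the decision such a policy makes, here is a plain-Python sketch of age-based tier placement. The tier names and thresholds are assumptions, and this is not Spectrum Scale's actual policy language.

```python
# A minimal sketch of the policy-based tiering decision described above: hot
# data stays on flash, cooler data moves down through disk to tape/cloud based
# on how recently it was accessed. Thresholds and tier names are illustrative.
TIERS = [
    ("flash",      7),     # accessed within the last 7 days
    ("disk",       90),    # accessed within the last 90 days
    ("tape_cloud", None),  # everything colder
]

def target_tier(days_since_access):
    for tier, max_age in TIERS:
        if max_age is None or days_since_access <= max_age:
            return tier

for name, age in [("q4_forecast.parquet", 2),
                  ("logs_2015_06.gz", 45),
                  ("archive_2012.tar", 900)]:
    print(f"{name}: place on {target_tier(age)}")
```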
For traditional workloads,
IBM Spectrum Virtualize* can
optimize current data storage
environments to new levels of
economic efficiency through
virtualization, automation and
compression technologies,
helping IBM Storwize* systems
achieve 3x greater performance
with as little as 5 percent of data
on flash. Insurance company
Prudential, for example,
improved its storage utilization
by 125 percent through
virtualization, allowing its
storage administrators to focus
on innovation.
The healthcare industry is
flooded with patient medical
data, and the data volume
is expected to grow as new
technologies make way for
new kinds of data collection.
Secure storage of this data—
gathered from patient histories,
genomic testing, digital
imaging, mobile devices and
more—poses a challenge for
healthcare systems. Not only is
the data enormous in quantity,
but it’s also sensitive personal
information. The University
of Pittsburgh Medical Center
(UPMC) recently partnered
with IBM to improve its data
storage following this paradigm.
IBM helped optimize UPMC’s
existing storage infrastructure
using SDS to address utilization
of UPMC’s existing storage
assets, dramatically reducing its
storage infrastructure costs by
72 percent. Data compression
and virtualization have allowed
UPMC to accelerate transmission
of patient medical data without
having to add storage capacity.
Instead of building another
massive data warehouse, UPMC
and IBM are working toward
the goal of “less storage, less
hardware, less space consumed,
all of which lead to lower costs.”
Most companies (63 percent)
realize a positive return on
investment from analytics within
one year—UPMC is no exception.
The organization has seen quick,
long-lasting results. The senior
vice president of the Information
Services Division at UPMC
estimates that by optimizing its
infrastructure UPMC has saved
$40 million on new storage in
the last decade.
Embrace New
Opportunities
In the cognitive era, companies
are flooded with data, but
data without the right analytic
capabilities is useless. To extract
the insights trapped in the
data, it needs to be efficiently
stored, managed, protected
and delivered with speed to the
right applications at the right
time. To derive cost savings,
revenue generation and business
growth from big data, businesses
must evolve their existing IT
infrastructure. Transforming the
economics of big data means
looking for data solutions that will
maximize efficiency and generate
real competitive advantage.
With the right approach to
storage, your business can easily
achieve faster insights and
cost savings.
Yael Shani has been an IBM marketing professional for 14 years and is
currently leads marketing for the IBM Storage Big Data & Analytics portfolio.
Thomas Sullivan is a best practices
expert with over 30 years of experience with data of all types.
TCONow reveals the enduring
economics of IBM FlashSystem
The total cost is the true cost of storage
In the past, the cost of enterprise data storage was expressed only in terms
of dollars per capacity. Now, IT decision-makers are taking a more inclusive
approach. Total cost of ownership (TCO) incorporates a variety of relevant cost
factors, including $/GB and operational costs such as electricity, HVAC and data
center floor space, plus the value of storage performance, which drives staff
productivity and increased CPU utilization, among many other benefits.
TCONow simply provides TCO comparisons
The new Web-based IBM TCONow tool provides quick, easy estimates of the
TCO savings offered by IBM FlashSystem compared to conventional disk-based
storage. Answer a few simple questions about your IT and business needs, and
TCONow will instantly calculate how much your business can save when you
deploy an IBM FlashSystem storage solution.
Get a customized quote detailing how you can lower your IT costs while gaining
all of the benefits of software-defined storage at the speed of flash.
Go to TCONow: www.cioview.com/FlashAnalysis/
Learn more about IBM FlashSystem:
ibm.com/systems/storage/flash