HANA, Exalytics Adoption Slow, Market Focuses on Hybrid DB Solution
321 Pacific Ave., San Francisco, CA 94111 | www.blueshiftideas.com
REPORT
May 8, 2012 Companies: ETR:SAP/SAP, FRA:SOW, IBM, INFA, MSFT, ORCL, QLIK, TDC
Guido Gualandi, [email protected]
Reverdy Johnson, [email protected], 415.364.3782
Summary of Findings
The in-memory database market continues to develop and has no
runaway winner. Sources reported numerous and varying client
needs and an equal number of database solutions.
Companies are avoiding single-system solutions that require significant human and capital resources, such as SAP AG’s (ETR:SAP/SAP) HANA and Oracle Corp.’s (ORCL) Exalytics. Gaining the most traction are hybrid systems, which join in-memory and traditional databases with improved hardware or software accelerators.
IBM Corp.’s (IBM) Netezza is a market leader. Netezza offers a mature, cheaper hybrid solution combining in-memory capabilities with IBM’s business intelligence software.
SAP has responded to slow HANA adoption by pushing Sybase ASE as a hybrid solution that provides speed and flexibility and, because clients’ existing staff can maintain it, does not require costly vendor support. One source chose Sybase ASE over Oracle’s Exadata based on price.
Exalytics received little praise, even from Oracle partners, who are unclear on Oracle’s strategy for its in-memory systems.
Apache Hadoop-based cloud solutions are preferred for
unstructured data, data mining, social listening and ETL. Open-
source solutions such as Hadoop are a possible threat to
traditional companies that work mainly on structured data.
Qlik Technologies Inc.’s (QLIK) QlikView and Tableau Software Inc. offer fast data exploration at a fraction of the price charged by SAP, Oracle and IBM. They will not overtake these larger players but will continue to gain traction.
Trend table (detail lost in extraction): Hybrid Solution Preferred Over Single-System In-Memory DB; Adoption of SAP HANA and ORCL Exalytics. Silos: In-Memory Database Users, Industry Specialists, Database Consultants.
Research Question:
Which is the best in-memory database to handle big data?
Silo Summaries
1) IN-MEMORY DATABASE USERS
These four sources panned HANA and Exalytics because of their prohibitive prices in tough economic times, but praised Netezza, Teradata, Oracle’s 11G and Sybase’s ASE. One source espoused Netezza’s speed, technology, data mining, analytics, and its cloud solution’s lower costs. Another source prefers ASE for its price/performance ratio and ease of use, and will test the in-memory database later this year. Maintenance of Netezza and ASE can be performed by current employees, but most companies do not have employees who are familiar with HANA or Exalytics, thus requiring costly support from SAP or Oracle. HANA was described as the most complete offering and is being highly marketed by SAP, as is Netezza by IBM. Oracle’s marketing efforts have been lackluster, limiting Exalytics’ potential. Users are confronted with limited budgets and resources and are unable to acquire an all-in-memory solution.
2) INDUSTRY SPECIALISTS
These three sources said HANA will not be widely
adopted anytime soon because it is expensive and only
good for structured data. The best scenario is a hybrid
solution with in-memory and a traditional database
used only when needed, thus reducing the cost of the
project. Teradata can easily handle large amounts of
data. Oracle and IBM also offer hybrid solutions, and
SAP is said to be moving in this direction with its Sybase
ASE offering. Oracle is ahead in the hybrid approach
using both traditional database and in-memory
features. Real-time computing helps companies make
sense of unstructured data in a reasonable amount of
time, with fine granularity and freedom. Most of this
work is done with Hadoop-based software in the cloud,
where traditional vendors such as SAP, Oracle and
Informatica are not needed. IBM is the established
leader in in-memory technology; its solutions are robust and flexible.
3) DATABASE CONSULTANTS
These four sources said a clear leader has not emerged
because the market has numerous solutions for just as
many different needs. Products are chosen on a case-
by-case basis, giving the best odds to the most flexible
and versatile solutions. Hybrid solutions with multiple
databases that increase hardware performance are
preferred, as are accelerators. SAP is offering Sybase
ASE in-memory functions to clients who do not want to
purchase HANA because of its high cost. IBM is a very
strong option with well-known and respected products.
Oracle does not seem to be marketing its in-memory
solutions enough.
Big Data: In-Memory Database Solutions
Background
Blueshift Research has written two reports on “big data.” Sources for our March 1 report on SAP said HANA was performing well in pilots, threatening Oracle’s Exadata in the short run, eliminating the need for Oracle databases, and lessening the long-term importance of servers. Our April 3 report on big data management and analysis tools found a fragmented market full of many options and varying data and storage needs at the enterprise level. Sources praised HANA’s platform with its positive proofs of concept and said it had the ability to take share from Oracle, especially among customers using both Oracle and SAP products.
CURRENT RESEARCH
In this next study, Blueshift will evaluate HANA, Exalytics, QlikView and other database solutions and determine which is best
positioned to handle big data and real-time computing. We employed our pattern mining approach to establish and interview
sources in four independent silos:
1) In-memory database users (4)
2) Industry specialists (3)
3) Database consultants (4)
4) Secondary sources (4)
We interviewed 11 primary sources, including five repeat sources, and included four of the most relevant secondary sources
focused on the growth of hybrid in-memory and pure in-memory solutions, the hype and feud between Oracle and SAP,
Hadoop’s growing position in big data, and positive user feedback on Oracle’s in-memory solutions.
Next Steps
Blueshift’s next report on big data will focus on leaders in analytical platforms that help companies make sense of the data collected in large databases, as well as on companies’ in-memory databases and cloud solutions. We will assess the different analytical platforms and learn if integrated vendors such as SAP and Oracle will hinder smaller companies like QlikView, Informatica and MicroStrategy Inc. (MSTR). Finally, we will observe Hadoop’s effects on traditional companies as unstructured data drives the big data explosion.
Silos
1) IN-MEMORY DATABASE USERS
These four sources panned SAP’s HANA and Oracle’s Exalytics because of their prohibitive prices in tough economic times, but praised IBM’s Netezza, Teradata Corp. (TDC), Oracle’s 11G and Sybase’s ASE. One source espoused Netezza’s speed, technology, data mining, analytics, and its cloud solution’s lower costs. Another source prefers ASE for its price/performance ratio and ease of use, and will test the in-memory database later this year. Maintenance of Netezza and ASE can be performed by current employees, but most companies do not have employees who are familiar with HANA or Exalytics, thus requiring support from SAP or Oracle and increasing total cost of ownership. HANA was described as the most complete offering and is being highly marketed by SAP, as is Netezza by IBM. Oracle’s marketing efforts have been lackluster, limiting Exalytics’ potential. Users are confronted with limited budgets and resources and are unable to acquire an all-in-memory solution. Some are testing areas of need for in-memory while other sources said the technology is too new to consider. In-memory computing’s real-time data mining and aggregation are useful to certain verticals such as retail. Other verticals like banking can use cheaper solutions such as accelerators or software functions that can run on server RAM, benefiting Teradata and IBM’s Cognos.
CIO for an international retail company; repeat source
IBM’s Netezza is a revolutionary product for its speed, technology, data mining, analysis, and cloud options. HANA is more
complete, but harder to implement and maintain and more expensive. Both IBM and SAP are putting a lot of marketing
muscle behind Netezza and HANA, respectively. Oracle is slower in marketing Exalytics and is losing ground. QlikView
offers a powerful and flexible business intelligence solution that is gaining ground, while Sybase may be a better option
than HANA from a total cost standpoint. The down economy is hurting trials and implementations.
“For our in-memory solution we decided to test HANA mainly because of our relationship with SAP. On top of
that, SAP is marketing HANA aggressively. They are really trying hard to sell it while Oracle does not do much at
the moment, at least from what I can see. They seem to have a worse reputation concerning in-memory, and it
appears they are behind. They never approached us to try to sell
Exalytics. So in the end, SAP wins.”
“We have two in-memory databases in our company. In one of our
branches, we have implemented Netezza, and we have been
impressed both by the speed and the technology itself. Data mining
and streaming capacity are really good, a revolution for a database I
would say.”
“Netezza is very good for all the possibilities of data mining and marketing analysis. And you don’t have to aggregate data; you can navigate freely. It is the best solution when you need to leave your data whole and you don’t want to preselect your options.”
“We also have a HANA proof of concept in another branch. It is being
done now, so I have no real conclusion yet.”
“It is not easy to select one in-memory database. We have Oracle
11G/Exadata, and we have SAP ERP and IBM software as well. The
three of them offer an in-memory solution. We tried IBM and are happy
with it. However, Netezza is mainly data restitution and is not quite like HANA, which is more complete.”
“There are several options to accelerate database usage. If you don’t need to mine data in the Web, you don’t need to aggregate large quantities of data. If you just need to do predefined analysis and you know what you are looking for, maybe in-memory is not for you.”
“If you need granularity, if you are working with unstructured data and can’t aggregate it, then in-memory is necessary. In fact, to aggregate data you need to structure it, and when you do that you lose information. Normal databases need structured data.”
“Airlines, retail, online retail, heavy CRM users can benefit from in-memory. They have too much info to
aggregate.”
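The point about aggregation discarding information can be sketched in a few lines of Python (illustrative data only, not from any source): once rows are rolled up by one attribute, questions about any other attribute can no longer be answered from the aggregate, while the raw, granular rows still support them.

```python
from collections import defaultdict

# Raw, granular sales records (hypothetical data).
raw = [
    {"category": "shoes", "city": "Paris",  "amount": 120},
    {"category": "shoes", "city": "Berlin", "amount": 80},
    {"category": "bags",  "city": "Paris",  "amount": 200},
]

def aggregate(rows, key):
    """Roll rows up into totals keyed by a single attribute."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key]] += row["amount"]
    return dict(totals)

# A traditional warehouse stores a pre-aggregated view...
by_category = aggregate(raw, "category")   # {'shoes': 200, 'bags': 200}

# ...but a new question (sales by city) cannot be recovered from
# by_category alone. With the raw rows kept available — the in-memory
# approach described above — it can:
by_city = aggregate(raw, "city")           # {'Paris': 320, 'Berlin': 80}
```

This is why the sources tie in-memory to granularity: keeping the unaggregated rows queryable preserves answers that any fixed rollup throws away.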
“Right now, with a difficult economy, it is difficult for companies to embark on a long, difficult project such as moving all databases to in-memory or switching databases. There are many issues: You have to have the money for license and implementations. You have to have teams either internally or on the market. It is hard to find anybody using HANA today. Some solutions will be better off than others just because you can or can’t find experts.”
“QlikView is an in-memory BI tool, and it is quite similar to in-memory database functions. It is less expensive than SAP, Oracle or IBM.”
“QlikView has a future; they are flexible, powerful. I like their ‘business discovery’ approach. As with in-memory databases you don’t need to predefine information to find it. You can plug them in about anything you have. They will continue to be successful in BI.”
“SAP HANA is expensive, difficult to implement, and you have to buy SAP maintenance, so ROI is long. It is also difficult to find teams to work on it as it is so new. They might not have many clients ready to invest in the market. But SAP can wait; they have enough cash to wait.”
“It is going to be easier to purchase Sybase than HANA.”
“IBM is pushing Netezza really hard. I talked to them last week. They have the teams and the marketing. They also have the right approach to offer applications in the cloud. So if a company does not have enough money to invest upfront, they can purchase a service in the cloud.”
“SAP is attacking the in-memory market much more than Oracle. Oracle has a worse image. … They could be successful as they have a huge installed base. It will depend on their ability to market the products. … I don’t feel Oracle is aggressive enough on in-memory. Maybe their product is not really ready.”
IT project manager for a European bank
Sybase’s ASE in-memory database is superior because it delivers the best price/performance ratio. This source’s company will begin testing it by year’s end, driven by a desire to reduce input/output (I/O) time. Sybase ASE also is easy to install and can be maintained by the bank’s current staff. By contrast, HANA and Exalytics are much more expensive and require specialty maintenance that the bank cannot provide.
“We have two main databases, Oracle and Sybase, and many different applications that run on them. We need more speed in I/O operations, but our fragmentation actually helps our database work better. Simple in-memory solutions are good for us, especially for budget reasons. Expensive solutions from SAP HANA and Oracle Exadata and Exalytics are out of the question in this difficult economy.”
“We have no big problem with our database performance; however, we are willing to test Sybase ASE in-memory functions as they look to be the best on paper, especially the price/performance ratio.”
“We know SAP HANA and Oracle Exalytics and Exadata but we did not even plan to test them as they are out of range for cost reasons. Those products are really too expensive given the budget we have. On the other hand we want to reduce our I/O operation time and ASE has the right in-memory tool. Before the end of the year we will most likely test ASE in-memory and see if we can implement it. Right now we can’t because all IT people are too busy.”
“ASE in-memory is very interesting concerning I/O operations, and that
will be enough for us for the moment. As far as I know we have no
other in-memory project besides this one. We did speak with Oracle, but we soon realized that it was too
expensive.”
“With ASE in-memory we hope to reduce latency time in I/O operations so that many queries will be faster. ASE’s strengths are that we have it already, we have technicians for it, it has good TCO, and it will be easy to install.”
“SAP HANA and Oracle … we don’t have enough people who know them. They are expensive; maybe they have better performance but at what price? With new tools, it is also a question of resources. We do not have enough people available to test certain products; we have no SAP or Oracle specialists in-house.”
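The bank’s rationale — that serving queries from RAM removes disk I/O latency — can be illustrated with SQLite, which can host the same schema either in a file on disk or entirely in memory. This is a generic sketch of the principle, not Sybase ASE; the schema and data are hypothetical.

```python
import os
import sqlite3
import tempfile
import time

def make_db(conn):
    """Create and populate the same toy schema on any connection."""
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                     [(i, i * 10) for i in range(100)])
    conn.commit()

def timed_queries(conn, n=200):
    """Run n point lookups and return elapsed seconds."""
    cur = conn.cursor()
    start = time.perf_counter()
    for i in range(n):
        cur.execute("SELECT balance FROM accounts WHERE id = ?", (i % 100,))
        cur.fetchone()
    return time.perf_counter() - start

# Same schema, two placements: a file on disk vs. RAM (':memory:').
path = os.path.join(tempfile.mkdtemp(), "bank.db")
disk = sqlite3.connect(path)
mem = sqlite3.connect(":memory:")
for conn in (disk, mem):
    make_db(conn)

# Both answer identical queries; only the storage medium differs,
# which is exactly the trade the source is weighing for ASE in-memory.
t_disk, t_mem = timed_queries(disk), timed_queries(mem)
```

On a warm OS page cache the gap may be small for a toy table; the point is that the application code is unchanged while the I/O path moves to RAM, which mirrors what the source hopes to gain from ASE’s in-memory option.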
Database manager for a large banking group
This source uses a database accelerator instead of an in-memory database and said the solution works satisfactorily in most cases. He said the group has no need to accelerate all queries and that the cost of an in-memory database like HANA is still too high because all support is done directly by SAP.
“We have no problem with our database performance; nobody is complaining. We recently implemented Oracle 11G. Because not all our system is on it, the database is idle most of the time. We don’t have an immediate need to use any in-memory solution.”
“In another department of our group, they had some speed needs and they decided to purchase SAP’s BW Accelerator. They are quite happy with it.”
“If data volume increases, we might reassess, maybe at year-end, but from what I can see we will be fine for two to three years. Human Resources is still not in the database, but there is plenty of room. We have no I/O problems. Our disks are fast, so we will be happy with what we have.”
“There is a big difference between companies that have big and old ERP implementations and us, where everything is new. New systems are fast even if they don’t have any in-memory databases.”
“Since the important queries are no more than 20, an accelerator really does the job. Already we have implementation problems with the BW accelerator; I can’t imagine what we would go through with HANA. It still costs too much, and since SAP is giving all the support, it would be even more costly.”
IT architect for an aerospace company; repeat source
This large company uses Teradata as a database and does not need an in-memory database. As needed, the company
can turn to ad hoc solutions that do not require an accelerator or specific software from large providers such as SAP or
Oracle.
“I do have numerous business intelligence dashboards within my scope. Hardly a week goes by without a
proposal for a new one, but OLAP [online analytical processing] and the cutting-edge technologies supporting it
such as in-memory databases are not used in these areas of my main focus.”
“We use Teradata data warehouses and [IBM’s] Cognos for our business intelligence reporting solutions. Our engineering and manufacturing transaction systems use Oracle and [Microsoft Corp.’s/MSFT] SQL Server databases as standards. We don’t do a lot of OLAP-type work in my end of the business. I was able to find references to Netezza and [Cognos’] TM1 within the company, but only references, and I was not able to find evidence of how or if they are being used.”
“I haven’t seen any requests to utilize these platforms come across my desk for a design review. This tells me that these products are not part of an active development or deployment in the engineering or production operations functions for the company. That doesn’t mean they aren’t being used by finance or marketing, or some other part of the company that is outside my scope.”
“We are pushing Teradata and Cognos to their limits and have more than a few of the technical reps from both
companies monitoring and tuning our daily operations. This is for operational business intelligence needs and
not OLAP work.”
“I am not aware of any activity afoot to make use of these in-memory databases. The one example of an in-
memory database I did see was an in-house development effort by one of our R&D groups based on massively
parallel but relatively generic server hardware. I did not keep track of what happened to this effort after the
production operations review board decided not to fund the development further.”
2) INDUSTRY SPECIALISTS
These three sources said HANA will not be widely adopted anytime soon because it is expensive and only good for structured
data; one source said HANA is only 10 times faster than its competition and is not worth its price. The best scenario is a
hybrid solution with in-memory and a traditional database used only when needed, thus reducing the cost of the project.
Teradata can easily handle large amounts of data. Oracle and IBM also offer hybrid solutions, and SAP is said to be moving in
this direction with its Sybase ASE offering. Oracle is ahead in the hybrid approach using both traditional database and in-
memory features. Real-time computing helps companies make sense of unstructured data in a reasonable amount of time,
with fine granularity and freedom. Most of this work is done with Hadoop-based software in the cloud, where traditional
vendors such as SAP, Oracle and Informatica are not needed. Also, SAP, IBM, Informatica and Oracle tools will clean and
analyze data brought to the enterprise, but QlikView and Tableau offer cheaper alternatives if BI functions mainly are needed.
IBM is the established leader in in-memory technology; its solutions are robust and flexible.
BI analyst for a technology consulting and benchmarking company; repeat source
The solution to big data is several coexisting systems. SAP will have to market HANA better because companies are
addressing big data through several databases rather than a single system. HANA‟s strengths are in structured data and
data cleanup, but its price is a deterrent.
“It looks like appliances like HANA are not the best to work with big data. For long text and unstructured data,
server farms and Hadoop will be a better, cheaper tool. Cloud products, able to expand and shrink, seem to be
good solutions. Outsourcing the big data cleaning to the cloud might be the best idea.”
“Appliances such as HANA will be good for specific needs and regular structured data, partially originated in big data cleanup. I can imagine the following scenario: Big data cleaned in Hadoop in a server farm in the cloud. Data is sent to an in-memory database for fast analysis; data is stored in a regular database. Companies will have to decide rules for which data to send to the regular database for storage, which to in-memory and which to the cloud.”
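The routing rules this scenario describes amount to a small decision table: unstructured data goes to Hadoop in the cloud for cleanup, hot structured data to the in-memory tier, the rest to the regular database. A minimal Python sketch, where tier names and record fields are hypothetical and only illustrate the shape of such rules:

```python
def route(record):
    """Pick a storage tier for one record, per simple (hypothetical) rules."""
    if not record.get("structured", True):
        return "hadoop-cloud"        # long text / unstructured: clean in Hadoop
    if record.get("hot", False):
        return "in-memory"           # needed for real-time analysis
    return "relational"              # everything else: cheap durable storage

records = [
    {"id": 1, "structured": False},                  # e.g. social-media text
    {"id": 2, "structured": True, "hot": True},      # today's sales figures
    {"id": 3, "structured": True, "hot": False},     # last year's archive
]
tiers = [route(r) for r in records]
# tiers == ['hadoop-cloud', 'in-memory', 'relational']
```

In practice the rules would key on data volume, latency requirements and cost per gigabyte, but the hybrid architecture the sources favor reduces to exactly this kind of per-record (or per-table) dispatch.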
“I was trying to figure out which solution is the fastest to manage big
data. I saw a test, and HANA was only 10 times faster than other
solutions. That is not that much considering the cost of the product.
You can obtain that with better hardware.”
“While Exalytics only does analytics, HANA use can be much broader,
but so far I have not seen HANA used in mission-critical environments.”
“HANA can scale well and is expensive at the beginning, but as you use
it more its price decreases. Some companies think you need to have
HANA in control of the whole data lifecycle. That would add a big value,
especially in retail where you can figure out in real time where the
product is from production to warehouse to cash register. But given the
economy, companies are not ready to pay any price. One company in
Germany asked SAP for a quote for an implementation over five years,
and the price was €217 million compared to €120 [million] for Oracle.
The high price was mainly due to data volume increase.”
“It does not look like SAP is marketing HANA very well. They have made
a series of mistakes. When Oracle markets Exalytics, it has a demo
with Oracle BI fast reporting. SAP is known to have used Tableau
instead of its own Business Objects. Also, it seems unlikely that clients will be buying HANA at such high prices.
There is a disconnect somewhere.”
“In the end I think solutions to big data will be a hybrid. Several databases will coexist, some for speed and some for storage. SAP clients will go with SAP products, and companies who are not 100% SAP will consider all the other solutions. Each case is different; some will need in-memory computing, some just an accelerator such as QlikView, which is limited but enough for a specific solution at a departmental level.”
In-memory computing expert, consultant and blogger
A black-and-white solution for in-memory computing does not exist. A lot depends on the user’s needs. HANA is a good but expensive solution for speeding up BW and working with structured data coming from ERP. Oracle, QlikView, Tableau and IBM offer more flexible solutions for data mining and using unstructured data. SAP is working toward a hybrid solution that provides more flexibility, a lower price, and use of in-memory only as needed. Oracle already offers this type of product. Adding to the expense of Oracle and SAP databases is the human capital required to serve and maintain the machines. IBM has a leading in-memory competitor that does not require as much maintenance as others and is winning bids as a result.
“In-memory solutions generally provide real-time data exploration, fine detail on the data and offer new business
possibilities. However, there is a gap between marketing and reality.
For example, a lot needs to be done on data flux aggregation and
modeling when you have an in-memory solution.”
“We can take the case of a company needing an in-memory solution on an ERP with data flux logic. With HANA, data are preconfigured and SAP offers HR, sales and BW premodeled data fluxes. In those cases, HANA can be used more easily. To have preconfigured data tables is good, but if you go outside the given models, things get difficult. You have to create your own models, which can be a huge amount of work. BW’s strong point is that it is already business-content oriented. This is its strength compared to competition.”
“Take the case of a company needing reporting on nonstructured data in an ETL environment, with data tables from different environments. If the need is to navigate the data, do tables on the fly, and data mining in real time, HANA or BW could be cumbersome as a tool and too expensive for data exploration.”
“SAP still has progress to make in certain contexts. IBM and Oracle have better possibilities as they are more flexible. … Clients will not go massively to HANA but will look for easier and cheaper solutions initially.”
“SAP is moving to offer a hybrid solution in-memory/traditional database with Sybase ASE that will give more flexibility in terms of cost and deployment. The acceleration is noticeable, especially on the ERP with lots of data to deal with, and in the case of both large reporting from the ECC and BW. Today BW has shown its limits, and reporting is not always fluid. HANA brings that fluidity.”
“Oracle has flexible tools and is ahead of SAP in the hybrid approach to in-memory. Oracle does in-memory when
needed; SAP HANA uses in-memory all the time, using the big weapon when it is not needed. Oracle is strong
when in the presence of a heterogeneous environment, less dependent on the ERP. Oracle is more subtle and
costs less. Their Hyperion solution works very fast, even without in-memory because they have a good cube
management. However, you need a very good DB specialist if you want to get the most out of the Oracle offer,
and those specialists are not found everywhere in the market.”
“IBM is marketing their solutions very hard. They have Netezza and they have TM1 that works in-memory, as well as other products. TM1 does not need an appliance; it works in the server RAM. They are very fast and powerful, probably the best technology today. They recently won a bid at a Fortune 1000 company against SAP and Oracle for in-memory computing with TM1.”
“For pure BI, there is also the alternative of using QlikView and Tableau. Even if they are not as complete and powerful as SAP, IBM and Oracle, they can offer a very good solution to navigate data in real time at a fraction of the cost of the other three.”
“Companies that are choosing in-memory solutions are faced with two additional problems. One is the cost in a
difficult economy, and the other is human resources. Oracle resources are difficult to find in Europe. Since HANA
is new, there is a lack of experts on the database. … IBM is probably the best; they are expensive, but they have
a large number of teams available.”
Vice president of product management for a software company specializing in performance applications
The source described in-memory data management solutions that are faster and cheaper than in-memory databases. Some allow applications to bring terabytes of high-value business data into local memory on the servers on which the application is running, and access it with microsecond latency. They make optimal use of the hardware, delivering the best price/performance ratio.
“HANA is an appliance database and does not scale out very easily.”
“Most of our clients use traditional databases such as Oracle because they are the most used in general.
However, we are database-agnostic. A lot of applications we work with are not classical ERPs but more online
transactional software applications.”
“We are much cheaper than in-memory databases; one gigabyte starts at $500, and we only put high-value data
in the system so that you make the best use of the RAM you chose to use.”
“Growth areas we see include mobile, and high-velocity data sources where we do real-time stream processing.”
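The approach this source describes — keeping only high-value data in server RAM while the rest stays in the backing database — can be sketched as a bounded, admission-controlled cache with least-recently-used eviction. This is an illustrative Python sketch of the general idea, not the vendor’s actual product; the class, thresholds and data are hypothetical.

```python
from collections import OrderedDict

class HotDataCache:
    """Keep only 'high-value' records in a bounded in-RAM cache (LRU eviction).
    A sketch of the idea the source describes, not any vendor's product."""

    def __init__(self, capacity, min_value):
        self.capacity = capacity      # max records held in RAM
        self.min_value = min_value    # admission threshold ("high-value" only)
        self._data = OrderedDict()

    def put(self, key, record, value):
        if value < self.min_value:    # low-value data stays in the backing DB
            return False
        self._data[key] = record
        self._data.move_to_end(key)
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)   # evict the least recently used
        return True

    def get(self, key):
        if key in self._data:
            self._data.move_to_end(key)      # refresh recency on a hit
            return self._data[key]
        return None                          # miss: fall back to the database

cache = HotDataCache(capacity=2, min_value=100)
cache.put("a", {"v": 1}, value=500)   # admitted
cache.put("b", {"v": 2}, value=50)    # rejected: below threshold
cache.put("c", {"v": 3}, value=300)   # admitted
cache.put("d", {"v": 4}, value=400)   # admitted; evicts "a" (LRU)
```

The admission threshold is what keeps the economics sensible at a per-gigabyte RAM price: only data whose access latency is worth that price ever occupies memory.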
3) DATABASE CONSULTANTS
These four sources said a clear leader has not emerged because the market has numerous solutions for just as many
different needs. The environment is rapidly changing and products are chosen on a case-by-case basis, giving the best odds
to the most flexible and versatile solutions. Hybrid solutions with multiple databases that increase hardware performance are
preferred, as are accelerators. SAP is offering Sybase ASE in-memory functions to clients who do not want to purchase HANA
because of its high cost. In-memory solutions such as HANA might not be the best to work on big data, but they remain very
good at achieving speed and granularity within an ERP system. IBM is a very strong option with well-known and respected
products. Oracle does not seem to be marketing its in-memory solutions enough. SAP and Oracle solutions are technologically
comparable except in a few verticals.
Database expert with a large consulting company; repeat source
The market lacks a leading solution to handle big data because data issues and solutions vary widely. Big data needs require solutions such as Teradata and Hadoop. A traditional company wanting more speed will need HANA, but only on an as-needed basis. SAP will have difficulty selling HANA as a companywide database. IBM offers mature solutions. Oracle has a good database but lacks innovation. Meanwhile, Microsoft has emerged as a worthwhile alternative to SAP and Oracle.
“We can’t say there is a leading solution right now. Each case is different, and all of us need to work case by case. For example, with HANA it would be logical to use the normal system for 80% of the work and implement HANA only for what is needed. In fact, SAP is pushing their partners to come up with business process studies so that they can propose HANA when it is more valuable.”
“The in-memory solution might not be the best for big data. If you need
speed, in-memory is a good solution. If you need to work with variety,
social data, unstructured data, Hadoop is the solution. If your problem
is volume, you need a relational database such as Teradata or Exadata.
For real time the best solution is HANA, but you still need a database
like Sybase if you have big data.”
“HANA has an interesting interaction with BW, good speed, but we need
to isolate HANA in a specific scenario. It is not a good idea to migrate
all to HANA. … You can combine HANA with a traditional database. This
hybrid approach works better now.”
“You can‟t have just one database; all vendors are confronted with this.
In the future we will see multiple databases working together; there will
be three layers mainly. The first is Hadoop for unstructured data and
big data, the second will be an in-memory solution, the third will
continue to be the traditional relational database.”
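The three-layer architecture this source describes can be sketched as a simple routing rule. This is an illustrative Python sketch, not any vendor's product; the layer names and routing criteria are assumptions drawn from the quote.

```python
# Toy sketch of the three-layer architecture the consultant describes:
# route each workload to Hadoop, an in-memory store, or a relational DB.
# The criteria below are illustrative assumptions, not a vendor's rules.

def route_workload(data_kind: str, needs_real_time: bool) -> str:
    """Pick a storage layer for a workload."""
    if data_kind == "unstructured":      # logs, social data, documents
        return "hadoop"                  # layer 1: batch / big data
    if needs_real_time:
        return "in_memory"               # layer 2: a HANA-style store
    return "relational"                  # layer 3: traditional RDBMS

print(route_workload("unstructured", False))  # hadoop
print(route_workload("structured", True))     # in_memory
print(route_workload("structured", False))    # relational
```

In practice the three layers would feed one another (Hadoop output landing in the relational warehouse, hot subsets promoted to memory), which is the hybrid pattern the sources favor.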
“There will be choices to make according to companies' needs. For
large volumes a traditional choice is Teradata plus Hadoop. For a total
break with the past, you can get HANA plus Sybase for speed in the
SAP realm. IBM DB2 plus Hadoop and different in-memory products will continue to be a good choice.”
“Oracle Exadata is good for volume but lacks innovation on other fronts. They can lose some clients. Microsoft‟s
all-integrated solution for the midmarket will always be there.”
“It will be interesting to see HANA fight Netezza. And all of this might help Microsoft, which is now a valid
alternative to SAP and Oracle. With all these new products, game rules are changing, but we can‟t really say yet
who is going to win.”
CEO of a database vendor and consultant
SAP is marketing HANA and Sybase ASE as a potential hybrid solution and a strong, cheaper competitor to Oracle‟s
Exadata and Exalytics products. Still, enterprises will need a few years to be convinced to change databases. New
database customers will be more likely to adopt HANA and ASE. ASE offers the ability to tap into in-memory functions
without the cost and commitment of a dedicated in-memory database.
“We started selling Sybase at the beginning when it was still Sybase SQL Server, when it was identical to
Microsoft SQL Server. They have evolved differently.”
“ASE can do extreme transaction processing systems and supports tens of thousands of concurrent users with
ultra-fast, nonstop performance on cost-effective, standards-based platforms. It has nothing less than Oracle or
DB2 and can surely be an alternative for SAP users.”
“We have just started with SAP and are doing a training course next
week on how to sell the product. It is going to be a slow start. New
customers might adopt ASE quickly, but I don‟t think existing SAP
clients who are mainly on Oracle will change databases. It is a big step
to take.”
“We hope HANA can be adopted soon, but at the moment we do not
see any implementation in our area. We have to wait a couple of years
before we see massive adoption.”
“As per Oracle compared to ASE, there is little difference, which is why
SAP wants to market ASE: to replace Oracle.”
“ASE has two problems. One is marketing, as it has not been marketed
well and is not known in the market. The second problem is that most
implementations are in financials and telco. ASE has less experience in
verticals such as retail where Oracle has more customers.”
“ASE has some advantages compared to Oracle: One is that it is easier
to find resources and train resources for ASE because it is so similar to
SQL Server. The second advantage is that ASE is compatible with all SAP applications and all SAP applications
are and will be compatible with ASE, including the in-memory functions.”
“ASE has in-memory functions where it brings in-memory all frequent operations and queries. For those queries
there is a lot of speed. However, when you need to write down and load other queries, it might slow down a bit.
HANA is faster as it is all in-memory. Oracle works in a similar way to ASE, so again, no difference there besides
cost. ASE is cheaper than Exadata/Exalytics, and ASE clients don't need extra hardware or software.”
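The behavior this source describes resembles a query cache in front of a disk-backed store: frequent queries are served from memory, while other reads fall back to disk and can evict cached entries. A minimal Python sketch of that general pattern (illustrative only, not ASE internals):

```python
# Illustrative sketch (not ASE internals): a hybrid store that answers
# frequent queries from an in-memory cache and falls back to "disk".
from collections import OrderedDict

class HybridStore:
    def __init__(self, disk: dict, cache_size: int = 2):
        self.disk = disk                      # stand-in for the disk-resident DB
        self.cache = OrderedDict()            # in-memory layer, LRU-evicted
        self.cache_size = cache_size

    def query(self, key):
        if key in self.cache:                 # fast path: served from memory
            self.cache.move_to_end(key)
            return self.cache[key]
        value = self.disk[key]                # slow path: "disk" read
        self.cache[key] = value               # loading new data evicts old,
        if len(self.cache) > self.cache_size: # matching the source's caveat
            self.cache.popitem(last=False)    # about occasional slowdowns
        return value

store = HybridStore({"a": 1, "b": 2, "c": 3})
store.query("a")
store.query("b")
store.query("a")
print(list(store.cache))  # ['b', 'a'] -- 'a' is most recently used
```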
“Clients who don‟t need all of what an in-memory database offers will just be able to enable ASE in-memory
functions, achieving acceleration and spending little extra. It looks like it is a much easier step than purchasing
HANA or Exadata/Exalytics. SAP is pushing ASE a lot. It is in their plan to become database vendor No. 2 in the
world with ASE, not with HANA.”
“I do not know if ASE will cannibalize HANA; at the moment it is too early to say.”
IT director for a database integrator
HANA adoption is far off, and no market share changes have occurred or will
occur in the near future. HANA will stay marginal for a while because it is a costly
solution.
“In most cases database performance can be achieved by working on
the hardware, increasing disk access performance, only in a few cases
by using in-memory technology.”
“If there is a bottleneck at the database level for a few applications and
the problem is disk access, we first change disk technology, but if we
want all applications to go faster, then it can be justified to use HANA,
which is all in-memory. At that point there is no more difference
between applications.”
“Looking at customers‟ needs, I wonder if it is really necessary to have
such speed as HANA at this point. With an uncertain economy, I wonder
how many companies are willing to pay the price of an in-memory technology.”
“As far as transactional databases go, it is really difficult to compare DB2 to Oracle or others. Our job is not to
compare. We usually take what is there; companies don‟t want to change database usually. Same goes for the
adoption of an in-memory database or an accelerator: When we see a problem or a possible improvement, we
study the question. There is no global answer; there are only specific answers to the client‟s environment.”
“HANA is not mature right now; it is really marginal. However, in the future SAP will do as much business on
HANA as they do with the ECC [Enterprise Core Component].”
“HANA requires a very big investment that will happen only when the need is very strong.”
BI manager for a database integrator; repeat source
HANA is a leading in-memory database but is too expensive to be a
companywide database, at least for the next three years. It is more likely to be
deployed in part of an enterprise. HANA also has a complex implementation and
lacks both support resources and customers familiar with the solution. IBM's Netezza
is better known and works well, even if it has its own limitations.
“We have SAP HANA in a client site, and we are hoping to put it in
production this year. At the moment there are global negotiations
between the client and SAP.”
“HANA is excellent for working with big data and large amounts of data, but
if we try to harmonize 50 TB of data on HANA, I think it will be too
expensive. In this specific case, we should not try to migrate the whole
data warehouse to HANA. The cost of licenses and hardware and the
complexity of the implementations are big limitations.”
“HANA is in-memory all the time. … Clients still have plenty of questions
about replication and data safety, and we are not sure what to answer
as we haven‟t seen HANA working in a mission-critical environment. We
are still working on data replication and backup as we have a very fast
tool on one hand but slow systems on the other.”
“We have more experience with IBM Netezza, and it works well even if it is mainly data restitution.”
“HANA is a good answer if we want to work the fine details. We can analyze data with a detail and speed
unmatched by others, but we might not need to do that with all our data. I can see a HANA implementation that
works on a set of data while the rest sits in a normal database.”
“For several clients, a combination of a software accelerator such as BW and fast disk access can be enough.
Clients who have Teradata will not change databases, as Teradata is excellent at managing large volumes. It would not
make sense for them to move from Teradata to HANA.”
“HANA adoption for BI needs in specific parts of the enterprise can work soon, probably this year. HANA as a
company-wide solution will have to wait probably three years. Nobody will want to be the first to do that at the
price HANA costs—at least $1 million per TB of data.”
Secondary Sources
These four secondary sources discussed the growth of hybrid in-memory and pure in-memory solutions, the hype and feud
between Oracle and SAP, Hadoop‟s emerging position in big data, and positive user feedback on Oracle‟s in-memory
solutions.
April 16 ZDNet blog entry
Hybrid in-memory solutions are becoming more commonplace, while Oracle and SAP fight for leadership in pure in-
memory databases.
“Hybrid in-memory and disk have also become commonplace, especially amongst data warehousing systems
(e.g., Teradata, Kognitio), and more recently among the emergent class of advanced SQL analytic platforms
(e.g., Greenplum, Teradata Aster, IBM Netezza, HP Vertica, ParAccel) that employ smart caching in conjunction
with a number of other bells and whistles to juice SQL performance and scaling (e.g., flatter indexes, extensive
use of various data compression schemes, columnar table structures, etc.). Many of these systems are in turn
packaged as appliances that come with specially tuned, high-performance backplanes and direct attached
disk.”
“Pure in-memory databases are now going mainstream: Oracle and SAP are choosing in-memory as one of the
next places where they are establishing competitive stakes: SAP HANA vs. Oracle Exalytics. Both Oracle and SAP
for now are targeting analytic processing, including OLAP (by raising the size limits on OLAP cubes) and more
complex, multi-stage analytic problems that traditionally would have required batch runs (such as multivariate
pricing) or would not have been run at all (too complex, too much delay). More to the point, SAP is counting on
HANA as a major pillar of its stretch goal to become the #2 database player by 2015, which means expanding
HANA‟s target to include next generation enterprise transactional applications with embedded analytics.”
“The cost of storing entire databases in RAM is rather low, but the advantages are enormous. By storing the
entire database in RAM you are able to process millions of database transactions per second.”
“Massively Parallel Processing has existed for decades now, and the new wisdom is about instantly available
data, which is resident and available to the „massively parallel processors‟ for the many different tasks that the
data can be used for, by as many different users as the system can handle.”
May 1 InformationWeek article
SAP and Oracle are turning up the heat and the hype in the competition for superior in-memory databases. Oracle was
especially critical of HANA for its cost and the need to rewrite apps for its new database. SAP attempted to debunk Oracle's
claims, though questions remain about HANA's affordability.
“SAP has been confidently claiming that its Hana in-memory database will quickly steal database market share
that it took Oracle decades to win. It will start with SAP Business Warehouse (BW) deployments, the company
says, and by the end of this year, once Hana gains the ability to run core enterprise applications, Hana will start
invading the transactional database market.”
“Oracle‟s Larry Ellison and Safra Catz have missed few opportunities to discredit Hana in recent months. But
executive VP Thomas Kurian took the slams a level deeper on Friday with a one-hour Webinar clearly intended to
sow seeds of fear, uncertainty and doubt in the minds of would-be Hana customers. The session was billed as
an Exalytics seminar, but each point set up a contrast with Hana. Kurian claimed, among other things, that
SAP‟s product costs five times to 50 times more than Exalytics and that it doesn‟t support SQL (relational) or
MDX (multidimensional) query languages, requiring apps to be rewritten to run on the new database.”
“So what's the truth behind all these claims and counter claims? SAP database executive Steve Lucas posted a
blog on Monday rebutting most of Oracle's claims, and also I spent some time with SAP's CTO, Vishal Sikka, and
Gartner analyst Don Feinberg on some of the technical points, as I'll detail below.”
“SAP … is counting on Hana‟s in-memory performance to catapult it to a leading position in the database
market. The advantages of in-memory technology are well documented, but SAP has bigger ambitions for the
technology than any other vendor. With Hana, the goal is to support „transformational‟ business advantage, not
just faster queries.”
“SAP now has nearly a dozen of these new applications designed specifically for Hana, including sales and
operations planning, cash and liquidity management, and, for retail, trade-promotion management and
merchandising and assortment management. SAP has a couple dozen more apps in development. In some
cases apps fitting these descriptions already existed, but they‟ve been redesigned to take advantage of what
SAP calls „real, real-time.‟”
“SAP says the real payoff from Hana will be in transforming business
processes, not just accelerating queries. But we haven‟t seen enough
solid, real-world customer examples documenting transformed
business competitiveness.”
“The promise of Hana is compelling. … The reality check here is that
Hana won‟t be able to run core transactional SAP enterprise
applications until late this year at the earliest (it‟s already running the
BusinessOne service, in beta, but that‟s not core ERP). Even then, the
initial release will be in „ramp up‟ mode, meaning among a select group
of customers. General availability might take another six to eight
months.”
“Meanwhile, one of Oracle‟s key points about Exalytics is that it
improves the performance of existing applications—Oracle‟s entire
portfolio of apps as well as third-party apps—without any changes. Yes,
the downside of sticking with the status quo is that you‟ll have to keep paying for all those databases and
related infrastructure. But the upside is that you can keep running the old apps while giving them a performance
boost. That‟s another reason why SAP has to show that its new, Hana-based applications aren‟t just faster
(Exalytics can do that); they have to be better and deliver more business value.”
“At one point Kurian rattled off a litany of claims about Hana limitations that would make a would-be customer‟s
head spin. Let‟s lay a few of those to rest.
* Hana does support SQL and MDX.
* Hana does support parallel query execution. In fact, it supports massively parallel query execution.
* Hana does not support a bunch of stuff related to ROLAP and MOLAP—like indexes, aggregates, and
materialized views—because it does away with those artifacts entirely. It makes calculations instantly on the
fly—using the latest data and all available detail. SAP‟s analogy: Hana doesn‟t need a MOLAP hay loft or a
materialized view whip because it‟s not a horse and buggy; it‟s an automobile.
* Hana does support unstructured data analysis. In fact, the database‟s origins were in columnar text
processing, and SAP BusinessObjects has since added text-analysis capabilities that can be used in
conjunction with Hana.”
“Pricing Exalytics in a small data-mart scenario at around 500 gigabytes, Kurian said the hardware would be
$135,000 and the TimesTen database another $690,000 for a total of $825,000. (He did not mention Oracle
BI Foundation software, which is also required.)”
“By comparison, Kurian claimed SAP's cost would be $362,000 for hardware and $3.7 million for software.
SAP's Steve Lucas says Hana's cost in this scenario, including hardware and software, would be $500,000, but
by my calculation, using BW-on-Hana list prices (of $79,000 per 64-gigabyte unit, as reported here) and a 50%
database overhead allowance (which SAP calls for), the software cost alone would be north of $1.2 million. SAP
must be counting discounts and incentives it's throwing in to spur sales.”
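The article's back-of-envelope figure can be reproduced. Assuming the 50% overhead allowance means only half of licensed capacity is usable for data, so a 500 GB data mart needs 1,000 GB licensed (an interpretation; the article does not spell this out), the arithmetic runs:

```python
# Reproducing the article's HANA software cost estimate:
# $79,000 per 64 GB unit for BW-on-Hana; a 500 GB data mart with SAP's
# 50% database overhead allowance, read here as doubling the raw size.
import math

unit_price_usd = 79_000
unit_size_gb = 64
data_gb = 500

licensed_gb = data_gb * 2                        # 1,000 GB with the overhead
units = math.ceil(licensed_gb / unit_size_gb)    # 16 units of 64 GB
software_cost = units * unit_price_usd
print(units, software_cost)                      # 16 1264000
```

That yields $1,264,000 for software alone, matching the article's "north of $1.2 million."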
“The bottom line is that Oracle‟s claim that Hana costs 5-to-50-times more than Exalytics is exaggerated—in
large part because it‟s based on same-size deployments, when Hana will allow smaller deployments. But it‟s
also hard to believe SAP‟s sweeping statements about the affordability of DRAM-based systems and Hana
overall.”
April 11 ZDNet blog entry
An in-memory tipping point is fast approaching; Hadoop is well positioned and is being embraced by Microsoft.
“Yesterday, Microsoft‟s Dave Campbell, a Technical Fellow on the SQL Server team, posted to the SQL Server
team blog on the subject of in-memory database technology. Mary Jo Foley, our „All About Microsoft‟ blogger
here at ZDNet, provided some analysis on Campbell‟s thoughts in a post of her own. I read both, and realized
there‟s an important Big Data side to this story.”
“In his post, Campbell says in-memory is about to hit a tipping point and, rather than leaving that assertion
unsubstantiated, he provided a really helpful explanation as to why.”
“Campbell points out that there‟s been an interesting confluence in the database and computing space:
o Huge advances in transistor density (and, thereby, in memory capacity and multi-core ubiquity)
o As-yet untranscended limits in disk seek times (and access latency in general)”
“This combination of factors is leading—and in some cases pushing—the database industry to in-memory
technology. Campbell says that keeping things closer to the CPU, and avoiding random fetches from
electromechanical hard disk drives, are the priorities now. That means bringing entire databases, or huge
chunks of them, into memory, where they can be addressed quickly by processors.”
“Massively Parallel Processing (MPP) data warehouse appliances are Big Data products. A few of them use
columnar, in-memory technology. Campbell even said that columnstore indexes will be added to Microsoft's
MPP product soon. So MPP has already started to go in-memory.”
“Some tools that connect to Hadoop and provide analysis and data visualization services for its data
may use in-memory technology as well. Tableau is one example of a product that does this.”
“Databases used with Hadoop, like HBase, Cassandra and HyperTable, fall into the „wide column store‟ category
of NoSQL databases. While NoSQL wide column stores and BI column store databases are not identical, their
technologies are related. That creates certain in-memory potential for HBase and other wide column stores, as
their data is subject to high rates of compression.”
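The compression point is easy to illustrate: columnar storage keeps like values together, so simple schemes such as run-length encoding collapse the repeats. A toy Python example (illustrative only, not how HBase or any specific engine encodes data):

```python
# Toy illustration of why columnar data compresses well: a column of
# repeated values collapses under run-length encoding (RLE).
from itertools import groupby

def rle(column):
    """Run-length encode a column as [(value, run_length), ...]."""
    return [(value, len(list(group))) for value, group in groupby(column)]

column = ["US"] * 4 + ["DE"] * 3 + ["US"] * 2   # 9 stored values
print(rle(column))  # [('US', 4), ('DE', 3), ('US', 2)] -- 3 entries
```

Row-oriented storage interleaves values from different columns, breaking up these runs, which is why the column-store layout is the one that compresses so aggressively.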
“Hadoop's MapReduce approach to query processing, to some extent, combats disk latency through parallel
computation. This seems ripe for optimization, though. Making better use of multi-core processing within a node
in the Hadoop cluster is one way to optimize. I've examined that in a recent post as well.”
“Perhaps using in-memory technology in place of disk-based processing is another way to optimize Hadoop.
Perhaps we could even combine the approaches: Campbell points out in his post that the low latency of in-
memory technology allows for better utilization of multi-cores.”
“No matter what, MapReduce, powerful though it is, leaves some low hanging fruit for the picking. The
implementation of in-memory technology might be one such piece of fruit. And since Microsoft has embraced
Hadoop, maybe it will take a run at making it happen.”
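For readers unfamiliar with the MapReduce pattern the post discusses, a toy single-process word count shows the map, shuffle, and reduce phases. Real Hadoop distributes these phases across nodes and disks, which is where the latency trade-offs above arise; this sketch only illustrates the shape of the computation.

```python
# Toy word count in the MapReduce style: map emits (word, 1) pairs,
# shuffle groups pairs by key, reduce sums each group. Hadoop runs the
# map and reduce phases in parallel across cluster nodes.
from collections import defaultdict

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big memory"])))
print(counts)  # {'big': 2, 'data': 1, 'memory': 1}
```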
May 3 InformationWeek article
Oracle‟s Exalytics received praise from users.
“Exalytics, Oracle Endeca, and analytic apps were centers of attention late last month when Oracle showcased
recent and upcoming products at its annual analyst conference in San Francisco.”
“Oracle claims an 18X performance boost to existing BI and planning apps. An in-memory summary advisor
recommends which data should be stored in-memory, based on usage statistics. I‟m waiting to test drive the
new release and appliance, but the in-memory summary advisor seems to be a differentiator, particularly if it is
self optimizing. The integration with Exadata, via a fast InfiniBand connection, for direct access to the larger data
warehouse, is another differentiator, but I‟d like to see how seamlessly the data moves from Exadata to
Exalytics.”
“Exalytics was initially touted as both an in-memory and visual-discovery solution. To be sure, there are some
new visual discovery capabilities in the latest release of OBIEE when deployed with Exalytics. In addition, Oracle
now has Endeca Information Discovery in its portfolio. Oracle acquired Endeca in October, primarily for its MDEX
engine. Best known for its e-commerce and e-retail search capabilities, Endeca brings the simplicity of search to
BI.”
“The Endeca software, which is not widely adopted, competes to some extent with capabilities available from
QlikTech‟s QlikView, SAP BusinessObjects Explorer, and Information Builders Magnify. The MDEX engine is most
different in its combination of in-memory and columnar storage, supporting analysis of both structured content
and textual, semi-structured content stored in comments, documents, and social media.”
“Oracle‟s BI analytic applications have been a Trojan horse for selling OBIEE to Oracle E-business Suite, JD
Edwards, and PeopleSoft customers. The analytic applications are expansive, bringing ETL, data models,
dashboards and reports, and best practices to a number of functional areas (such as financial and human
resources) and industries (such as retail and government). They‟re built on OBIEE, so customers who buy the
apps must also purchase OBIEE, and thus may potentially leverage that BI platform for other BI initiatives.
Oracle has continued to improve its depth of coverage in the analytic applications, most recently adding Asset
Management and Manufacturing Process, as well as support for SAP data sources.”
“The highpoint of the event was the customer panel, which,
unfortunately, was under non-disclosure, meaning I can‟t name names
of customers. This panel was one of the best I have heard in my ten
years as an analyst, in part because the customers delivered genuine
criticism—which you don‟t hear in vendor-organized panels—but also
because they did it with humor. The constructive criticism made the
kudos for Oracle all the more believable. One OBIEE customer who had
deployed Endeca said, „Oracle bought an amazing technology. I‟m not
sure they yet realize how good it is.‟”
“A key theme of the event, and a rallying cry for Oracle‟s strategy, is to
simplify the IT experience to power extreme innovation. I get the
rallying cry: IT can‟t keep pace with business demand with difficult-to-
integrate and -deploy technology, but rarely has the goal of „helping IT‟
been inspirational to the business.”
“I suspect part of this shift in emphasis is related to Oracle‟s 2010
acquisition of Sun; that hardware now accounts for almost 20% of
Oracle‟s $35 billion in annual revenues. Not surprising then, a fair bit
of the keynotes were devoted to talking about the bits and bytes of
Oracle‟s engineered systems.”
“As a BI expert today, I felt somewhat like the car buyer who simply
wants to drive that sleek car, not dissect the engine. While hardware has not been a high-growth industry of late,
the difference is that Oracle‟s focus is on systems that power business analytics, for which most market
watchers cite double-digit growth.”
“To that point, Oracle showcased numerous customer success stories for Exadata. As one customer put it, „we
read all the glossy brochures, and [Exadata] has lived up to the hype.‟”
The Author(s) of this research report certify that all of the views expressed in the report accurately reflect their personal views about any and all of the subject securities
and that no part of the Author(s) compensation was, is or will be, directly or indirectly, related to the specific recommendations or views in this report. The Author does not
own securities in any of the aforementioned companies.
OTA Financial Group LP has a membership interest in Blueshift Research LLC. OTA LLC, an SEC registered broker dealer subsidiary of OTA Financial Group LP, has both
market making and proprietary trading operations on several exchanges and alternative trading systems. The affiliated companies of the OTA Financial Group LP, including
OTA LLC, its principals, employees or clients may have an interest in the securities discussed herein, in securities of other issuers in other industries, may provide bids and
offers of the subject companies and may act as principal in connection with such transactions. Craig Gordon, the founder of Blueshift, has an investment in OTA Financial
Group LP.
© 2012 Blueshift Research LLC. All rights reserved. This transmission was produced for the exclusive use of Blueshift Research LLC, and may not be reproduced or relied
upon, in whole or in part, without Blueshift‟s written consent. The information herein is not intended to be a complete analysis of every material fact in respect to any
company or industry discussed. Blueshift Research is a trademark owned by Blueshift Research LLC.