Recent Blog Posts

Mashery named a Leader in Gartner Application Services Governance Magic Quadrant

Gartner published its first Application Services Governance Magic Quadrant (MQ) in 2013. Mashery, the company that invented API management, was named a Leader in that original MQ, and was reaffirmed as a Leader in Gartner’s recently updated Application Services Governance Magic … Read more >

The post Mashery named a Leader in Gartner Application Services Governance Magic Quadrant appeared first on API Management.

Read more >

Reliable, Real-Time, Results: The Powerful Combination of SAP HANA* and Intel® Xeon® Processor E7 v3 Family

Intel recently introduced the latest Intel® Xeon® processor E7 v3 family. Last year, as part of our partnership with SAP and our ongoing commitment to improve business technology, we worked together to dramatically improve performance and query analysis for the SAP HANA* platform running on the Intel Xeon processor E7 v2 family.


Now, we’re transitioning to the new generation of Xeon processors. The performance gains and new features just keep getting better, helping employees to work smarter, faster, and make critical decisions in a real-time environment.


New to the Intel Xeon processor E7 family is Intel® Transactional Synchronization Extensions (Intel® TSX), a set of hardware improvements that make multi-core programming easier. To get the most from high core counts and multi-threading capabilities, Intel TSX allows the processor to determine dynamically whether threads need to serialize through lock-protected critical sections, and to perform that serialization only when required.


In conjunction with SAP HANA SPS9*, Intel TSX enabled on the Intel Xeon processor E7 v3 family delivered transactions over twice as fast as previous-generation processors, for a total of 6x more transactions per minute.





The Intel Xeon processor E7 v3 family also includes more than 40 RAS features to keep mission-critical systems, such as your SAP environment, always up and running. Features such as Intel® Run Sure Technology help servers bounce back from a larger range of errors without impacting the operating system. Your server firmware or operating system is given rein over memory management to reduce costs. And server downtime is minimized with multiple rank sparing, so you can include a second rank for dynamic failovers. DDR4 recovery cuts down on memory errors that could lead to a system crash.


All of these improvements to the Intel Xeon processor E7 v3 family, plus the latest SAP HANA platform, deliver business insights faster to improve business-critical decision making.


To learn more about how SAP HANA running on servers built with the Intel Xeon processor E7 v3 family can drastically improve the way you process and analyze data, read the full solution brief.


Follow Tim at @TimIntel for the latest Intel news.

Read more >

Getting the most out of your Big Data and Analytics tools: Data Cleansing, Integration, and Visualization


At the start of this series on Big Data and analytics, I mentioned that Intel derived US$351 million in value from its analytics programs during 2014. While access to tools such as Cloudera Hadoop is very important, the best analytic tools in the world are not useful unless you can feed them correct, valid, and relevant data. And even if you do get that data and are able to analyze it, the analysis still is not useful unless the results reach the people who can take action based on them.


Ashok Agarwal talks about some of these issues in his blog post on connected data. Fifteen years of data stored in some 4,000 spreadsheets and other sources were integrated into a single database through an 18-month program of data cleansing and consolidation. That data, when fed into a decision support system, yielded US$264 million in increased revenue.
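To make the mechanics concrete, here is a minimal sketch of the kind of cleansing and key-based consolidation such a program involves. The column names, formats, and conflict rule are invented for illustration; the actual Intel program is not publicly documented at this level.

```python
from datetime import datetime

def clean_record(raw):
    """Normalize one raw spreadsheet row: trim text, canonicalize dates and currency."""
    return {
        "customer_id": raw["customer_id"].strip().upper(),
        "date": datetime.strptime(raw["date"].strip(), "%m/%d/%Y").date().isoformat(),
        "revenue": round(float(raw["revenue"].replace("$", "").replace(",", "")), 2),
    }

def consolidate(sources):
    """Merge many per-spreadsheet row lists into one deduplicated dataset."""
    merged = {}
    for rows in sources:
        for raw in rows:
            rec = clean_record(raw)
            # Last write wins on the (customer_id, date) key; a real pipeline
            # needs a smarter conflict rule, but the dedup keying is the point.
            merged[(rec["customer_id"], rec["date"])] = rec
    return sorted(merged.values(), key=lambda r: (r["customer_id"], r["date"]))

# Two "spreadsheets" with inconsistent formatting and one duplicate row.
sheet_a = [{"customer_id": " c001 ", "date": "3/14/2012", "revenue": "$1,200.50"}]
sheet_b = [{"customer_id": "C001", "date": "3/14/2012", "revenue": "$1,200.50"},
           {"customer_id": "C002", "date": "1/02/2013", "revenue": "980"}]

dataset = consolidate([sheet_a, sheet_b])  # two records: the duplicate collapses
```

The hard part at Intel's scale is not this per-row logic but doing it across 4,000 sources with conflicting schemas, which is why the program took 18 months.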


Feeding data into decision support systems is not the only way to get value from it. Ashok also mentioned how that data will be coupled with an advanced data visualization tool called the Info Wall. The Info Wall was designed so that executives can easily manipulate the historical data, through touch, to make actionable decisions that improve the business.



I had a chance to experiment with the Info Wall. The paper mentions hosting multiple data sources, and these can be visualized simultaneously, like the many graphs shown to the right. Other sources can include television; I was able to open a feed from a television channel with my hands. The title of the white paper on the Info Wall is “Collaborative Visual Data Analysis Enables Faster, Better Decisions,” and the Info Wall is definitely capable of being used collaboratively. Multiple people can manipulate data sources at the same time, and they have plenty of room, as the Info Wall takes up one entire wall of a conference room. The Info Wall will gain more capability in the future, as people from multiple sites will be able to collaborate.


You can find out more about the capabilities of the Info Wall and the data cleansing and integration efforts needed to make it work in this white paper.

Read more >

Intel & Cisco Make News at NFV World Congress 2015

By John Healy, General Manager SDN Division, Intel Corporation


I had a chance to chat onstage with Dave Ward, Cisco’s CTO of Engineering and Chief Architect, during my keynote at this year’s NFV World Congress event.


I wanted to have him there to help me announce that Cisco has joined the Intel Network Builders ecosystem – a new milestone in our companies’ long and fruitful relationship.


The goal of Network Builders is to foster alliances between companies helping to move the NFV/SDN and open standards technology forward.  Cisco joins more than 130 other vendors and service provider members of Network Builders, several of whom were demonstrating collaborative solutions at the show.


This announcement is an example of how Intel and Cisco are working together as we both embrace the new, more open networking environment that is driven by open standards, and the rapid adoption of SDN and NFV technology.


In our conversation, Dave said that the decision to join Network Builders was made because he believes in a new approach to SDN and that industry initiatives are critical in moving technology forward. The goals of the Network Builders program are aligned with the importance Cisco places on open and interoperable solutions that are standards based. Cisco is looking to expand its interaction and joint efforts with other software and hardware vendors in the new NFV, SDN, cloud and orchestration “value stack” to collaborate on opportunities and challenges.


One of the company’s first Network Builder activities was working with us in a technology demonstration of how its Network Service Header technology, combined with Intel 100GbE, can provide advanced, high-performance intra-data center service chaining. The demo was a reprise of a very successful presentation that was made at the Mobile World Congress 2015 event in Barcelona.


Dave also reminded me of some of the other ways that Cisco and Intel are working together on open source and standards-based initiatives including Open Platform NFV, OpenStack, Open Daylight, Open vSwitch, and other standards work with the IETF.  A great example of the impact of this collaboration is our joint work on enhancing Open vSwitch performance, which is really improving network traffic flows and policy-based capabilities, and helping customers realize more agile and instantaneous virtual network function deployment.


Another area of mutual cooperation is policy-driven networks. When I speak to service providers, they want NFV solutions that will drive down cost in the network, expand their service delivery agility, and offer the service reliability, in terms of service level agreements and QoS, that customers have come to trust.

Together, Cisco and Intel have jointly driven adoption of new policy technologies in OpenStack and Open Daylight. When completed, this will allow service providers to build out a lower-cost, datacenter-like infrastructure to better support SLAs and QoS.


There are lots of global communications opportunities and challenges today, especially across cloud, telecom and enterprise applications. Having leaders such as Cisco interact with start-ups, service providers and other industry leading firms through Network Builders is vital to unlocking the value in this industry transformation.


I appreciated Dave taking the time to join me at NFV World Congress and look forward to Cisco’s expanded contribution to Network Builders.

Read more >

Use Data To Support Arguments, Not Arguments To Support Data

The concept of “better-informed” decisions is distinctly different from the concept of “better” decisions: the former is generally a choice, whereas the latter often results from an action. Better-informed leaders don’t always make better decisions, but better decisions almost always start with better-informed leaders. Business intelligence (BI) can be the framework that enables organizations of all sizes to make faster, better-informed business decisions.


BI Should Play a Role in Better-Informed Decisions


This same principle equally applies to individuals such as better-informed patients or better-informed consumers. Ultimately, when the final decision lies with us (humans), we either choose to ignore the data or choose to use it in our decision making—assuming, of course, that it exists and we can trust it. However, even the best implementations of BI solutions can neither change nor prevent uninformed or less-informed decisions if we choose to ignore the data.


Typically, “data-entangled decisions” are those in which data can potentially be used for analysis, in contrast to decisions driven purely by our emotional states or desires. Most business decisions are data-entangled decisions. In these, existing or new data can play an important role, unlike in a personal decision such as when to go to sleep. A data-entangled decision generally follows three main phases when a business question, challenge, or opportunity presents itself. BI, if designed and implemented effectively, should support all three phases.


Phase One: Reaction


In the reaction phase, the initial course is fashioned out of an immediate reaction to a threat or an opportunity. Typically, some preliminary figures are accompanied by known assumptions that form the initial direction. In this early stage, the initial data is still “entangled,” and only the requirement for additional information can be outlined. In some cases, however, the decision may already be made, and if so, the effort to gather additional data for further analysis becomes a futile exercise.


Phase Two: Validation


Additional data produces opportunities for in-depth analysis, which should eventually lead to actionable insight. But these results need to be validated first using some type of critical thinking. Moreover, who validates the results is as critical as how it’s done.


Just as we don’t ask programmers to validate their own code, we don’t ask analysts or managers to validate their own conclusions of data. If available or feasible, objective methods that can remove assumptions or personal deductions from this phase provide the fastest and clearest path to actionable insight.


Phase Three: Execution


The execution phase is where the final decision will be made and the use of data will be completely up to the person in charge of the decision. There are three possibilities before the final decision is made and action is taken:

  1. The conclusion is supported by data, and we choose to take it into account for our decision.
  2. The conclusion is supported by data, and we choose to ignore it.
  3. The conclusion isn’t or can’t be supported by data, and we are left to our own judgment to make the decision.


In business, better-informed decisions often start with a strong appetite for data, followed by a healthy dose of skepticism for it. If available, our collective insight becomes the guiding light for our decisions enhanced by data. In the absence of it—when we are left to decide by ourselves—we seek wisdom in our own experiences to fill the void where we can’t find or rely on data.


Bottom Line


The bottom line is, we need to use data to support our arguments instead of using arguments to support our data. And BI, if designed and implemented effectively, should be the framework that supports all of this by enabling us to make faster, better-informed decisions at all levels of our organization. This, in turn, helps us drive growth and profitability.


Where do you see the biggest challenge in making better-informed decisions?


Connect with me on Twitter (@KaanTurnali) and LinkedIn.


This story originally appeared on the SAP Analytics Blog.

Read more >

How Caesars Entertainment Cut Big Data Processing Time from 6 Hours to 45 Minutes

Happy customers are the lifeblood of the entertainment industry. But before you can make customers, and potential customers, happy, you’ve got to understand what they want. For Caesars Entertainment, that meant putting together a big data analytics system that could handle a new, expanded set of customer data for its hotels, shows, and shopping venues, with faster processing and better results.


Expanded Data Environment

To improve customer segmentation and build more effective marketing campaigns, Caesars needed to expand its customer data analysis to include both unstructured and semi-structured data. It was also important to speed up processing for analytics and marketing campaign management.

Caesars built a new data analytics environment based on Cloudera’s Distribution Including Apache Hadoop (CDH) software running on servers equipped with the Intel® Xeon® processor E5 family. The new system reduced processing time for key jobs from 6 hours to just 45 minutes and expanded Caesars’ capacity to more than 3 million records processed per hour. It also enables fine-grained segmentation to improve marketing results and improves security for meeting Payment Card Industry (PCI) and other key security standards.
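To illustrate what fine-grained segmentation means in practice, here is a toy sketch of a segmentation rule applied to customer records. The field names, thresholds, and segment labels are invented for illustration; Caesars' actual pipeline runs as distributed jobs on CDH rather than as a single Python script.

```python
from collections import Counter

def segment(customer):
    """Assign a fine-grained marketing segment from simple behavioral features.
    Thresholds and field names are hypothetical."""
    spend = customer["annual_spend"]
    channel = "mobile" if customer["mobile_bookings"] > customer["desk_bookings"] else "desk"
    if spend >= 10_000:
        tier = "high-roller"
    elif spend >= 1_000:
        tier = "regular"
    else:
        tier = "casual"
    return f"{tier}/{channel}"

customers = [
    {"annual_spend": 15_000, "mobile_bookings": 9, "desk_bookings": 1},
    {"annual_spend": 2_500, "mobile_bookings": 0, "desk_bookings": 4},
    {"annual_spend": 300, "mobile_bookings": 2, "desk_bookings": 0},
]

# Count customers per segment; in Hadoop terms, segment() is the map step
# and the Counter plays the role of the reduce step.
counts = Counter(segment(c) for c in customers)
```

Crossing more behavioral dimensions multiplies the number of segments, which is exactly why processing millions of records per hour matters for campaign targeting.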

Reaching New Customers

The new environment makes it easier for Caesars to reach out to younger customers, who are likely to prefer using smart phones or tablets to get their information. Caesars’ new mobile app lets customers use their mobile devices to check rates and make reservations. That data goes to the Hadoop environment for analysis in real time, where Caesars can use it to fine-tune its operations. Caesars can even use this data to tailor mobile responses and offers on the fly based on factors like the customer’s preferences and location.

Creating Personalized Marketing

With faster, better analysis of all data types, Caesars can now create and deliver personalized marketing, reaching a new generation of customers and winning their loyalty.

You can take a look at the Caesars Entertainment solution here or read more about it here. To explore more technology success stories, visit or follow us on Twitter.

Read more >

A New Order of All-Flash Hyper-Converged Storage: Atlantis HyperScale™ with Intel® Solid-State Drives

The software-defined storage (SDS) appliance concept of hyper-convergence is an attractive alternative to traditional Storage Area Network (SAN) and Network Attached Storage (NAS) for businesses of all sizes, from small and medium to large. Hyper-converged infrastructure is popular right now. So what is hyper-converged storage, and why should you care?


A hyper-converged storage system allows IT to manage compute, storage, and virtualization resources as a single integrated system through a common tool set. The resulting system, often referred to as an appliance, consists of a server, storage, networking, and a hypervisor with a management framework. Hyper-converged appliances can be expanded through the addition of nodes to the base unit to suit the compute and storage needs of a business, in a manner known as scale-out.


Hyper-converged scale-out storage differs from the older scale-up approach. In a scale-up system, compute capacity stays fixed as storage is added, while in a scale-out system, new compute nodes can be added as the need for compute and storage arises. Scale-up storage has often been cost prohibitive and often lacks the random I/O performance (IOPS) needed by virtualized workloads. The scale-out approach is a more efficient use of hardware resources, as it moves the data closer to the processor. When scale-out is combined with solid-state drive (SSD) storage, it offers far lower latency, better throughput, and increased flexibility to grow with your business. Scale-out is commonly used for virtualized workloads, private cloud, databases, and many other business applications.


Atlantis Computing introduced a new all-flash hyper-converged appliance that extends the concept of software-defined scale-out storage to the cloud. Atlantis HyperScale™, a turn-key hyper-converged appliance, delivers all-flash performance storage based on the Intel® SSD Data Center Family for enterprise-wide applications. What is different is that HyperScale™, based on Atlantis USX, pools existing enterprise SAN, NAS, and DAS storage and accelerates its performance through the use of Intel SSDs. By abstracting the storage, USX delivers virtual storage volumes to enterprise applications. It further provides a context-aware data service that performs deduplication and IO acceleration in real time for quality of service, even when using public cloud services.
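Storage deduplication of this kind is commonly built on content addressing: data is split into blocks, and each unique block is stored once under its hash. The following is a toy sketch of that general technique, not the Atlantis USX implementation, which is proprietary and also performs compression and IO acceleration.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks; real systems may use variable-size chunking

class DedupStore:
    """Toy content-addressed block store illustrating inline deduplication:
    each unique block is stored once, keyed by its SHA-256 digest."""

    def __init__(self):
        self.blocks = {}  # digest -> block bytes, stored exactly once

    def write(self, data: bytes) -> list:
        """Split data into blocks, store unseen ones, and return the digest
        list that serves as the volume's logical-to-physical map."""
        digests = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            d = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(d, block)  # skip blocks already stored
            digests.append(d)
        return digests

    def read(self, digests: list) -> bytes:
        """Reassemble the logical volume from its digest map."""
        return b"".join(self.blocks[d] for d in digests)

store = DedupStore()
# Four logical blocks, but only two distinct contents.
volume = b"A" * BLOCK_SIZE * 3 + b"B" * BLOCK_SIZE
recipe = store.write(volume)
```

Here four logical blocks consume only two physical blocks, and the round trip through `read` reproduces the original volume, which is the property that lets a dedup layer sit transparently under applications.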


The Intel SSD Data Center Family holds the key to the HyperScale™ all-flash appliance. The Intel® SSD is designed for read- and write-intensive storage workloads with fast, consistent performance for smooth data center operation. The reliability, data integrity, and cost-effectiveness of the storage volumes in the HyperScale™ appliance help protect your data with enterprise-class features at a reasonable cost. The architecture of Intel’s SSDs ensures that the entire read and write path and logical-block address (LBA) mapping have data protection and error correction. Many enterprise workloads depend not only on reliable data, but also on consistency in how quickly that data can be accessed. Consistent latency, quality of service, and bandwidth, no matter what background activities are happening on the drive, are the basis of the Intel SSD Data Center Family. Rigorous testing ensures a highly reliable SSD with consistent performance.


Today, the HyperScale™ all-flash hyper-converged appliance introduces a new order of scale-out storage. The turn-key appliance can eliminate scalability and performance bottlenecks and allow computing and storage capacity to grow with the needs of the business.

Read more >

Nurses Week 2015: Nurses are Superheroes


Today marks the start of Nurses Week, so here at Intel we want to celebrate with you what we consider to be the real superheroes of healthcare by saying #ThankYouNurses. Throughout the next seven days we’ll be bringing you some fantastic blogs which demonstrate our appreciation for nurses while highlighting how technology is helping to deliver best-ever patient care.


I want you to help us to say thank you to nurses across the world by tweeting your appreciation using the hashtag #ThankYouNurses throughout the week. If you have received some fantastic care from a nurse or want to highlight the work of a great colleague I’d love to help you say thanks by sharing your stories on Twitter – so tag a nurse, @intelhealth and say #ThankYouNurses with us.


As a nurse practitioner I’ve seen up close how technology is truly revolutionising nursing through enhanced mobility, whether that be having access to the latest medical records when visiting a patient at home or helping a patient to understand their condition better by displaying an injury on a tablet at the bedside in a hospital. 


We are also working with Microsoft in Health to highlight all that nurses do, so please check out some of the great nursing stories there too. Nurses Week ends on a high with International Nurses Day on May 12th, so let’s keep the conversation going throughout the week.


I very much look forward to sharing our collective appreciation for nurses with you.

Read more >

Open Storage Management: Critical to SDI Delivery

Anyone involved in the business of data storage knows we have many serious challenges on our hands: an explosion of unstructured data, ever-tougher compliance and regulatory issues, and increasing storage complexity.


In addition, storage is one of the roadblocks that impede the journey to cloud. Why? Today’s storage solutions lack interoperability, have different management consoles, and do not scale well. This historical reality for enterprise storage creates an enormous management challenge for data center operators implementing cloud environments. Without fundamental changes, these problems will be magnified in the data centers of tomorrow.


To move forward into the world of software-defined infrastructure and cloud data centers, IT teams need an open, intelligent, and flexible framework for dynamically managing storage resources. These resources will be from different vendors with different characteristics: They will include open and proprietary products; they will utilize different protocols; they will have different levels of performance and reliability. With enterprise demand focused on the choice of best-in-breed storage solutions across a wide array of usage scenarios, the requirement for interoperable solutions is only going to grow more acute.


At Intel, we are committed to working with our industry partners and the broader storage ecosystem to create this open framework—and clear the path to software-defined storage (SDS).


From Intel’s perspective, SDS is really about bringing cloud benefits to storage, including auto-provisioning, self-service models, and single-pane-of-glass management. Historically, these benefits have been elusive because of the lack of standards, interoperability, and common management across the wide and expanding range of storage systems.


While our full vision for an SDS framework will have to wait for another post, one of the key elements is a central control plane that unifies storage management and enables the orchestration of storage resources.


A key enabler of the new SDS architecture is a single-pane-of-glass management control plane.




There is, however, a big caveat here: In order for the SDS control plane to fully resolve management challenges, it must be able to interface with and control a large variety of storage systems—which brings us back to the need for open APIs that drive interoperable systems. No single storage provider can solve this problem; this is a challenge that can be addressed only by a broad community working together to deliver common storage standards.
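The shape of such a control plane can be sketched as a common driver contract behind a single management surface. The class and method names below are invented for illustration; they are not any vendor's actual API, and a real control plane (like the ViPR Controller mentioned below) handles discovery, pooling, and policy as well.

```python
class StorageDriver:
    """Hypothetical common interface that each vendor backend implements.
    An open-API control plane programs heterogeneous arrays through one
    contract like this instead of per-vendor consoles."""

    def provision(self, name: str, size_gb: int) -> dict:
        raise NotImplementedError

class VendorADriver(StorageDriver):
    """Stand-in for one vendor's adapter; real drivers would call the
    array's own management API here."""

    def provision(self, name, size_gb):
        return {"backend": "vendor-a", "volume": name, "size_gb": size_gb}

class ControlPlane:
    """Single pane of glass: routes volume requests to registered backends."""

    def __init__(self):
        self.backends = {}

    def register(self, tier: str, driver: StorageDriver):
        self.backends[tier] = driver

    def create_volume(self, tier: str, name: str, size_gb: int) -> dict:
        return self.backends[tier].provision(name, size_gb)

cp = ControlPlane()
cp.register("gold", VendorADriver())
vol = cp.create_volume("gold", "db01", 500)
```

The open-API question is about standardizing the `StorageDriver` contract so any vendor can plug in, which is exactly the problem no single provider can solve alone.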


This is where Intel—with its long history of driving standards, enabling open APIs, working with and developing communities, and cultivating open source capabilities where they make sense—can help. Intel is very supportive of open approaches that enable simplified management and interoperability to benefit end users. Our goal is to work with the entire storage ecosystem to support the development of open APIs that enable interoperability and simplify storage infrastructure management.


With these thoughts in mind, we support the just-announced Project CoprHD initiative to create an open source version of the EMC ViPR Controller. The project makes the code for the ViPR Controller, including all of the storage automation and control functionality, open for community-driven development. We think moves like this are a step in the right direction.


If you’d like to contribute to the industry’s efforts to drive open approaches to SDS, please connect with us.

Read more >

Could Your Old PCs be Putting Your Business at Risk?

Here’s an example: An employee receives an email—apparently from a legitimate source—asking him to update an account password. As requested, he enters his old password and then types in a new one. Unfortunately, that’s all it takes for a hacker to steal $200,000 from his small business’s bank account, much of it unrecoverable. It’s a simple but extremely costly mistake, and it could happen to anyone. In fact, over the past few years, it’s been happening a lot. Did you know that some of the biggest security breaches in recent memory—including attacks on Sony, Target, and JPMorgan Chase—started with a phishing email sent to an employee?


If you own a business, you’re at risk. No matter how diligent you and your employees are about security, mistakes can happen. And the results can be disastrous. Virus protection and other software solutions—though useful and necessary—only get you so far, especially if your business is using PCs that are more than two years old. The problem is that software-only security solutions from even a few years ago can’t keep up with today’s cybercriminals and are not sufficient to protect your devices and vital business data.


So what can you do to stay safe? Don’t rely on software alone. You need your hardware to do the heavy lifting.


What You Can Do to Make Your Business More Secure


New desktops with at least 4th generation Intel Core processors have hardware-enhanced security features that allow hardware and software to work together, protecting your business from malware and securing the important, private data and content you create and share. Features such as Intel® Identity Protection Technology (Intel® IPT), Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), and others are crucial to making your business more secure.


With hackers working around the clock to identify the next potential victim, it’s more important than ever for you to prioritize security. Read the new Intel white paper to learn more about what’s at risk, five new hardware-enhanced security features that help combat cybercrime, and why replacing your pre-2013 PC is a smart move.


In the meantime, join the conversation using #IntelDesktop, and get ready to rediscover the desktop.

This is the fifth installment of the Desktop World Tech Innovation Series.


To view more posts within the series click here: Desktop World Series

Read more >

Moore’s Law: exponential opportunity for education and empowerment

“I looked back to the beginning of the technology I considered fundamental—the planar transistor—and noticed that the [number of components] had about doubled every year. And I just did a wild extrapolation saying it’s going to continue to double every … Read more >

The post Moore’s Law: exponential opportunity for education and empowerment appeared first on CSR@Intel.

Read more >

Analytics: The New Frontier for Business Competitiveness

Not that long ago, data analysis focused mainly on looking backward to understand things that happened in the past. Today, thanks to Moore’s Law and the resultant advances in computing and memory technologies, analytics can now tell us what is happening in real time and help us predict what will happen in the days to come.


This landmark shift in our ability to extract value out of data is a key enabler for the new digital service economy. In this new era, an organization’s competitive edge increasingly hinges on its ability to turn an avalanche of data into actionable insights that improve operations and guide the creation of essential new products and services.


This isn’t an opportunity that is limited to Web 2.0 businesses or high-tech powerhouses. The opportunity for pervasive analytics and insights spans virtually all industries—from healthcare to transportation, from banking to manufacturing.


With powerful analytics solutions, physicians can diagnose illnesses faster and create personalized treatment plans. Retailers can better understand buying behaviors to stock up on the products people are most likely to need. Car manufacturers can use predictive failure analysis to make repairs proactively—before customers find themselves stuck on the side of the road.


While these examples are diverse, they all share a common central focus: the combination of big data and high-powered computing solutions with sophisticated technologies like in-memory analytics that accelerate time to insight. And this is where the latest generation of Intel® Xeon® processors enters the picture.


The new Intel® Xeon® processor E7 v3 family is designed to accelerate real-time analytics on enormous datasets of multi-terabyte and even petabyte scale. With up to 20 percent more cores, threads, cache, and system bandwidth than previous-generation processors, the Intel Xeon processor E7 v3 family makes fast work of complex, high-volume transactions and queries.


In addition, we’ve added an expanded memory footprint to support in-memory analytics—one of the keys to gaining immediate insights from big data. We’ve also added sophisticated technologies like Intel® Advanced Vector Extensions to boost simulation performance, Intel® Transactional Synchronization Extensions (Intel® TSX) to accelerate OLTP performance, and Intel® Run Sure Technology to support mission-critical uptime and advanced data integrity.


Let’s consider a couple of real-life examples of the potential to put powerful analytics and simulation tools to work in conjunction with Intel Xeon processors to help organizations extract value from big data in real time:


  • FarmLogs uses analytics tools and high-powered computing solutions to help farmers make their land more productive. It achieves this goal by putting sensors on farm machines, connecting the machines to the Internet, and analyzing data streams in real time. With an instant view of how different areas of their fields are performing, farmers can adjust seed, fertilizer, and other variables to best match the field conditions. This helps them avoid waste and improve the productivity of the field.


  • Pacific Northwest Seismic Network uses big data analytics to provide the public and others with early warnings about earthquakes and ground motions. The organization is now working to develop the ability to warn people about earthquakes before the shaking has reached them—to give them precious seconds, or maybe even a minute, to protect themselves and those around them.
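The kind of real-time sensor stream analysis the FarmLogs example describes can be caricatured as a rolling statistic per field zone, with an alert when the window drifts past a threshold. The zone names, the moisture metric, and the thresholds below are invented for illustration:

```python
from collections import defaultdict, deque

WINDOW = 5  # readings kept per zone in the rolling window

class FieldMonitor:
    """Toy streaming monitor: maintain a rolling average of sensor readings
    per field zone and flag zones that drift below a target value."""

    def __init__(self, target: float):
        self.target = target
        # Each zone gets a bounded deque, so old readings age out automatically.
        self.windows = defaultdict(lambda: deque(maxlen=WINDOW))

    def ingest(self, zone: str, moisture: float) -> bool:
        """Add one reading; return True if the zone's rolling average
        now falls below the target and needs attention."""
        w = self.windows[zone]
        w.append(moisture)
        return sum(w) / len(w) < self.target

monitor = FieldMonitor(target=30.0)
readings = [("north", 28.0), ("north", 27.5), ("south", 35.0)]
alerts = [zone for zone, m in readings if monitor.ingest(zone, m)]
```

A production pipeline would run logic like this over a message stream from thousands of sensors, but the per-zone rolling window is the core idea behind the "instant view" described above.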


There is a broader theme to consider here: Data-driven insights can improve the human condition—whether it’s increasing food production, saving lives when earthquakes strike, or meeting some other goal that can be achieved only with real-time insights gleaned from massive amounts of data.


At a business level, data analytics are increasingly tied to competitiveness. In fact, we believe that within three to five years data analytics will become the No. 1 data center application in terms of importance to the business. In virtually every industry, organizations will change their processes to take advantage of analytics to improve operations, products, and personalization.


To enable this transformation, Intel is working to democratize actionable insights by making analytics and simulation tools easy to deploy, easy to use, and efficient to run. And, as always, we are working with a broad ecosystem to provide open solutions and flexible tools to meet diverse customer needs.


For a deeper dive into the capabilities of the Intel® Xeon® processor E7 v3 family, visit

Read more >

A Foundation for Real-Time Insights via Analytics

For those of us who follow technology, it seems we constantly hear that big data will change life as we know it. While this is an interesting perspective, it misses an important point. Data, by itself, is not a valuable asset for an organization. The industry-transforming power of data comes from insights derived from analytics on that data – all driven by powerful computing systems.


To stay competitive in today's world, businesses across all industries are increasingly weaving intelligence solutions into critical, real-time processes. This is a game changer that can help businesses improve decision making, achieve better results, connect more closely with customers, and remap the ways they deliver value. In fact, entire industries are being disrupted by the ability to harness the power of advanced analytics for business advantage.


Despite this promise, customers face many hurdles in fully harnessing the capabilities of advanced analytics. Common struggles include trust, speed, and scale: Can I trust the data and analysis? Do I have the performance for real-time decisions? Can I scale and unify massive datasets? It's worth noting that just 27 percent of executives say their big data projects are successful, and 65 percent cite determining value as their biggest barrier to adoption.


As we launch the new Intel® Xeon® processor E7 v3 family, we tackle these customer challenges head on – delivering the world's fastest system for real-time analytics** on a trusted, mission-critical, highly scalable platform. For the most complete information on performance records, see the full list.



Servers based on the new Intel Xeon processor E7 v3 family provide exceptional performance and scalability for real-time analytics operating on core business data. These servers offer the industry's largest memory capacity per socket (12TB in an 8-socket system) to support in-memory data analytics, which improves analytics performance by moving data closer to the processors.


With 20 new world-record performance awards, the Intel Xeon processor E7 v3 platform showcases the benefits of the Haswell microarchitecture: 20% more cores, 20% more last-level cache, support for up to 32 sockets, and unique features like Intel® Transactional Synchronization Extensions (Intel® TSX). Intel TSX helps provide up to 6X online transaction processing (OLTP) performance for optimized database solutions via a mechanism that accelerates multi-threaded workloads by dynamically exposing otherwise hidden parallelism***. The Xeon E7 v3 family also delivers the trusted platform that mission-critical applications require, with added cryptographic performance via new instructions such as Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI) and Intel® Advanced Vector Extensions 2 (Intel® AVX2), and with support for Intel® Run Sure Technology for maximum uptime.
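Intel TSX itself is a hardware feature programmed through processor instructions, but the lock-elision idea it implements – run the critical section speculatively, and serialize only when another thread actually conflicts – can be sketched in software. The sketch below uses a version stamp to detect conflicts; it is a conceptual illustration, not how TSX is actually used:

```python
import threading

class ElidedLockCounter:
    """Software sketch of the lock-elision idea behind Intel TSX:
    do the work speculatively outside the lock, commit only if no
    other thread intervened (detected via a version stamp), and
    fall back to ordinary locking when a conflict is seen."""

    def __init__(self):
        self._lock = threading.Lock()
        self._version = 0
        self._value = 0

    def add(self, delta):
        seen = self._version
        speculative = self._value + delta      # work done outside the lock
        with self._lock:
            if self._version == seen:          # no conflict: commit
                self._value = speculative
                self._version += 1
                return
        with self._lock:                       # conflict: redo serialized
            self._value += delta               # (the "transaction abort" path)
            self._version += 1

counter = ElidedLockCounter()
for _ in range(1000):
    counter.add(1)
print(counter._value)  # → 1000
```

In uncontended runs every update takes the fast commit path, which is exactly the case TSX optimizes: threads that would not have conflicted no longer pay the full serialization cost.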


So what kind of use cases can all of this computing horsepower provide to the enterprise? I’d like to share a couple of interesting examples of big data analytics powered by Intel Xeon processor based systems.


Nippon Paint is one of Asia's leading paint and coatings companies, with 57 manufacturing facilities and operations spanning 15 countries and regions. The company has deployed SAP HANA to accelerate social media analytics. With sophisticated real-time analytics, it can capture consumer behaviors and preferences more accurately and quickly. Among other benefits, the insights gained through analytics help Nippon Paint create targeted marketing campaigns, improve customer interactions, and develop products that meet emerging customer requirements.


In another example, a power company used SAS Analytics software to analyze data gathered every 15 minutes from 38,000 smart meters in seven cities to predict the amount of electricity needed at certain times. This forecasting is important because electricity cannot be stored—it needs to be produced at the correct levels when it is needed. With the ability to frequently analyze smart meter readings, the utility improved its forecast accuracy by 9 percent, for a savings of $9 million.
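The core of such a forecasting workflow can be sketched very simply: predict the next interval's demand from recent readings, then score accuracy against what actually happened. The moving-average model and the numbers below are illustrative stand-ins for the utility's (unspecified) SAS models and data:

```python
def forecast_next(readings, window=4):
    """Naive moving-average forecast of the next interval's demand."""
    return sum(readings[-window:]) / window

def mape(actuals, forecasts):
    """Mean absolute percentage error, a common forecast-accuracy metric."""
    errors = [abs(a - f) / a for a, f in zip(actuals, forecasts)]
    return 100 * sum(errors) / len(errors)

# Interval demand readings in MWh (invented numbers, not the utility's data).
history = [410, 395, 420, 430, 415, 440, 455]

# Roll through the history: forecast each interval from the ones before it.
forecasts = [forecast_next(history[:i]) for i in range(4, len(history))]
actuals = history[4:]
print(round(mape(actuals, forecasts), 1))  # → 4.1
```

The business logic follows directly: a percentage-point improvement in a metric like MAPE translates into less over- or under-production, which is where the $9 million in savings comes from.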


A common thread to all of these stories is the use of the Intel Xeon processor based platform as the foundation for better business intelligence. As your data and analytics workloads continue to grow, these powerful servers can help you keep pace with growth while turning big data into big opportunities for your business.


For a closer look at the new Intel® Xeon® processor E7 v3 product families, visit





Intel, the Intel logo, and Xeon are trademarks of Intel Corporation in the United States and other countries.

* Other names and brands may be claimed as the property of others.

**World’s fastest system for analytics claim based on servers using the Intel® Xeon® processor E7-8890 v3 being the performance leader of SAP BW-EML* @ 1B scalable records and 2B records as of 5 May 2015.  For the most complete information, see

***Up to 6x business processing application performance improvement claim based on SAP* OLTP internal in-memory workload measuring transactions per minute (tpm) on SuSE* LINUX Enterprise Server 11 SP3. For configuration and result details, see

Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors.
Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.  For more complete information visit


Read more >

Building Right from the Start for Big Data with Cisco Unified Computing System

By Todd Brannon, Director of Product Marketing, Unified Computing



At the upper end of our UCS server portfolio we feature systems that deliver the large memory footprints and core-counts that performance-intensive workloads demand.  Today’s announcement of the Intel® Xeon® processor E7 v3 Family brings powerful new capabilities to this class of system.


Why is this important?


Our customers are striving to become intelligence-driven throughout their operations in order to create a perpetual and renewable competitive edge. Taking a long-term view in choosing the right infrastructure is essential; here are two reasons why:


  1. You never hear about a big data environment getting smaller.  Massive increases in data volume mean these environments will inevitably grow, and for many, this will mean continuously expanding clusters of hundreds or thousands of servers.
  2. Data is the lifeblood of the digital enterprise.  As the use of big data becomes pervasive in day-to-day decision-making, the performance and predictability of these computing platforms will become increasingly critical to the success of the business.  Choose partners you can trust.


IT departments need an infrastructure that is designed for deployment and operation at scale.  Down at the server level, particularly for scale-up workloads, they will always need more horsepower, and it can't come with additional power and cooling load.  Cisco UCS and the new Intel Xeon processor E7 v3 family deliver on both of these vectors.


Traditional servers, essentially designed as stand-alone devices, aren't built with the needs of these new big-data environments in mind.  This is where Cisco UCS and our Integrated Infrastructure for big data come in.  We're bringing customers a platform optimized for long-term success because of its unified design, inherent scalability, advanced reliability, and robust security.  By abstracting the identity and configuration of individual servers and managing exclusively through policy constructs, UCS allows IT teams to manage up to 10,000 servers as a single pool of resources.  Many in the industry are still focused on the issues of density and power, which most customers consider basic computing table stakes today.  UCS is designed to optimize the most important resource in the data center: people's time.
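The policy abstraction described above can be sketched in a few lines: identity and configuration live in a shared policy object rather than on any individual machine. This is a heavily simplified toy model of the idea, not Cisco's actual UCS Manager API; the profile fields and values are invented:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ServiceProfile:
    """A policy object: server identity and configuration live here,
    not on any individual server."""
    name: str
    firmware: str
    vlan: int

@dataclass
class Server:
    serial: str
    profile: Optional[ServiceProfile] = None  # stateless until stamped

def apply_profile(pool: List[Server], profile: ServiceProfile) -> None:
    """Stamp every stateless server in the pool with one shared policy."""
    for server in pool:
        server.profile = profile

pool = [Server(serial=f"SRV-{i:05d}") for i in range(10_000)]
web_tier = ServiceProfile(name="web-tier", firmware="3.1(2b)", vlan=120)
apply_profile(pool, web_tier)

# One policy edit is reflected across all 10,000 servers at once.
web_tier.vlan = 240
print(sum(1 for s in pool if s.profile.vlan == 240))  # → 10000
```

The point of the sketch is the time savings: an administrator edits one policy, not 10,000 server configurations.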


Intel and Cisco have a special partnership because UCS gives customers the most operationally effective platform to harness the performance and efficiency Xeon delivers.  The new class of systems we're releasing today can scale to 72 processors and a 6TB memory footprint, ideal for the latest generation of in-memory database workloads as well as traditional ERP applications and databases. Cisco was the first server vendor to publish results on the new TPCx-HS benchmark, and it's our intention to continue to lead the industry in performance for these workloads.


Disruption and opportunity will continue to accelerate in the realm of analytics.  Cisco and Intel are partnering to build the platforms customers need for the long haul.  To stay up to date, please follow us on Twitter at #CiscoUCS and #CiscoDC.

Read more >


By Nick Winkworth, Product Marketing – Compute Platform, Hitachi Data Systems



If, like me, you have lived in the “technology bubble” for any length of time, you have probably become accustomed to, perhaps even blasé about, the incredibly rapid – and ever accelerating – pace of change in our industry.


Today, everything we touch seems to be generating data, from the heart rate sensor on your smart watch to the thermostat on your wall to the jet engine on the airplane that takes you on your next vacation or business trip – to say nothing of all the videos, blog posts, tweets and emails we all generate every day.


The big question that companies like Hitachi are now starting to ask is “how can all this data improve our lives and make this planet a better place to live?” But before we can even begin to answer that question, we must build the infrastructure to capture, store, and process that data…


And that’s no easy task.


This is not something that can be done from a standing start. The companies that will succeed started out on this long road many years ago. They have learned from each new generation of technology; they have added, changed, tweaked, and improved along the way to get to where they are now – which is just the starting place for the next step.


Sure, there’s the occasional “giant leap”, but if you look closer you will inevitably see a longer story of incremental change, prior inventions and ideas that provided a foundation.



Hitachi HIPAC MK 1 (1957)


Hitachi joined the information age in the late 1950s with one of the earliest electronic computers, the HIPAC-MK1. Since then, the company has learned how to meet the needs of an incredibly diverse range of customers around the globe in many industries, how to consistently deliver high quality, and how to scale to the highest capacities demanded by some of the world’s biggest organizations.


Today, our enterprise storage products, servers, and converged infrastructure solutions are built on that strong foundation – and a key part of that foundation is Intel’s processor technology.


With the introduction of the new Intel Xeon E7 v3 Processor Family this week, we take another step forward together, propelled not by a single breakthrough but by improvements built upon many years of hard work and customer feedback.





The new Intel Xeon E7 v3 Processor Family allows Hitachi to carry forward support for the unique innovations it has developed and advanced over many years, such as native Logical Partitions (LPARs) and the ability to scale a blade’s processor, memory, and I/O capacity incrementally by combining blades.




Without this legacy of continuous innovation it would be impossible to keep up with the exponentially increasing demand for capacity and performance – of which today’s market is just the tip of the iceberg.


New SAP Solutions deliver over 40% more performance at 30% lower price


This ever-increasing demand does not come with an ever-increasing budget, of course, and this week’s introduction of the Intel Xeon E7 v3 Processor Family plays a critical role in Hitachi Data Systems’ ability to deliver big data analytics solutions for customers such as SPAR Austria Group at a price point that stays within budget, even as needs grow. Like many of our customers, SPAR depends on rapid analysis of an increasing volume of unstructured data – a workload that has been running for the past year on Hitachi Compute Blade 2000 systems with Intel Xeon E7 processors and Hitachi VSP storage platforms.


Together with Intel’s announcement, we are introducing the next generation in that lineage: SAP HANA scale-up solutions based on our new CB520X B2 blades (powered by the new Intel Xeon E7 v3 Processor Family), along with our new CB2500 blade chassis and the next generation of (Intel-powered) Virtual Storage Platform (VSP) technology. Lab testing has shown that the new solutions deliver 46% more performance (measured as data loading speed) at a 30% lower price, compared to the previous generation of UCP solutions for SAP HANA.
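For a rough sense of how those two figures combine, 46% more performance at 30% lower price compounds into roughly a 2x price-performance gain:

```python
# Relative price-performance vs. the previous generation:
# (performance ratio) / (price ratio)
perf_gain = 1.46   # 46% more performance (data loading speed)
price = 0.70       # 30% lower price
price_performance = perf_gain / price
print(round(price_performance, 2))  # → 2.09
```

In other words, each budget dollar buys about twice the data-loading throughput it did on the prior generation of UCP solutions.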


Why is this so important? Social Innovation


One thing that sets Hitachi and Hitachi Data Systems apart is our vision. Hitachi’s goal is to deliver solutions and services to make societies safer, smarter and healthier. This is what we call Social Innovation.


You can read more about Social Innovation in Hu Yoshida’s blog, or at this website.


As announced last week at Connect 2015, Hitachi Data Systems is bringing to market a number of purpose-built solutions to address a variety of Social Innovation challenges across many industries. These solutions combine the power of connected devices and technologies – or the Internet of Things (IoT) – with operational technology (OT), machine-to-machine (M2M) and advanced data analytics, and best-of-breed IT infrastructure, all in a unified, fully integrated stack.


Underlying all these solutions are analytics platforms built on best of breed infrastructure including Hitachi storage, servers, networking, content platforms, and converged solutions – all built on a foundation of Intel technology.


Today’s Intel Xeon E7 v3 Processor Family announcement strengthens Hitachi’s infrastructure portfolio and maintains the momentum of innovation, from both Hitachi and Intel, that is required to ensure that the demands of big data and Social Innovation solutions can be met today and in the future.

Read more >