Recent Blog Posts

Intel & Cisco Make News at NFV World Congress 2015

By John Healy, General Manager SDN Division, Intel Corporation

 

I had a chance to chat onstage with Dave Ward, Cisco’s CTO of Engineering and Chief Architect, during my keynote at this year’s NFV World Congress event.

 

I wanted to have him there to help me announce that Cisco has joined the Intel Network Builders ecosystem – a new milestone in our companies’ long and fruitful relationship.

 

The goal of Network Builders is to foster alliances between companies helping to move the NFV/SDN and open standards technology forward.  Cisco joins more than 130 other vendors and service provider members of Network Builders, several of whom were demonstrating collaborative solutions at the show.

 

This announcement is an example of how Intel and Cisco are working together as we both embrace the new, more open networking environment that is driven by open standards, and the rapid adoption of SDN and NFV technology.

 

In our conversation, Dave said that Cisco decided to join Network Builders because he believes in a new approach to SDN and sees industry initiatives as critical to moving technology forward. The goals of the Network Builders program are aligned with the importance Cisco places on open, interoperable, standards-based solutions. Cisco is looking to expand its joint efforts with other software and hardware vendors in the new NFV, SDN, cloud, and orchestration “value stack” to collaborate on shared opportunities and challenges.

 

One of the company’s first Network Builders activities was working with us on a technology demonstration of how its Network Service Header (NSH) technology, combined with Intel 100GbE, can provide advanced, high-performance intra-data center service chaining. The demo was a reprise of a very successful presentation made at the Mobile World Congress 2015 event in Barcelona.
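For readers unfamiliar with NSH: it is a compact header, defined in the IETF’s Service Function Chaining work (later standardized as RFC 8300), that travels in front of the original packet and carries the service path identifier and service index used to steer traffic through a chain. As a rough, illustrative sketch of the wire format (not of Cisco’s implementation, which the post doesn’t detail), here is how the base and service path headers pack into eight bytes:

```python
import struct

def pack_nsh(spi: int, si: int, md_type: int = 2, next_proto: int = 3) -> bytes:
    """Pack an NSH base header plus service path header (RFC 8300 layout).

    spi: 24-bit Service Path Identifier selecting the service chain.
    si:  8-bit Service Index, decremented by each service function.
    md_type 2 = variable-length metadata; next_proto 3 = Ethernet.
    """
    ver, o_bit, ttl = 0, 0, 63      # not an OAM packet, default TTL per RFC 8300
    length = 2                      # total header length in 4-byte words
    word0 = (ver << 30) | (o_bit << 29) | (ttl << 22) | (length << 16) \
            | (md_type << 8) | next_proto
    word1 = ((spi & 0xFFFFFF) << 8) | (si & 0xFF)
    return struct.pack("!II", word0, word1)  # network byte order

header = pack_nsh(spi=42, si=255)   # first hop of service path 42
assert len(header) == 8
```

Each service function decrements the service index before forwarding, so the header itself records how far along the chain a packet has traveled.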

 

Dave also reminded me of some of the other ways that Cisco and Intel are working together on open source and standards-based initiatives, including the Open Platform for NFV (OPNFV), OpenStack, OpenDaylight, Open vSwitch, and other standards work with the IETF.  A great example of the impact of this collaboration is our joint work on enhancing Open vSwitch performance, which is improving network traffic flows and policy-based capabilities, and helping customers realize more agile and near-instantaneous virtual network function deployment.

 

Another area of mutual cooperation is policy-driven networks. When I speak to service providers, they want NFV solutions that will drive down network cost, expand their service delivery agility, and offer the service reliability, in terms of service-level agreements (SLAs) and QoS, that customers have come to trust.

Together, Cisco and Intel have driven adoption of new policy technologies in OpenStack and OpenDaylight. When this work is complete, it will allow service providers to build out a lower-cost, datacenter-like infrastructure that better supports SLAs and QoS.

 

There are lots of global communications opportunities and challenges today, especially across cloud, telecom and enterprise applications. Having leaders such as Cisco interact with start-ups, service providers and other industry-leading firms through Network Builders is vital to unlocking the value in this industry transformation.

 

I appreciated Dave taking the time to join me at NFV World Congress and look forward to Cisco’s expanded contribution to Network Builders.

Read more >

Use Data To Support Arguments, Not Arguments To Support Data

The concept of “better-informed” decisions is distinctly different from the concept of “better” decisions—the former is generally a choice, whereas the latter often results from an action. Better-informed leaders don’t always make better decisions, but better decisions almost always start with better-informed leaders. Business intelligence (BI) can be the framework that enables organizations of all sizes to make faster, better-informed business decisions.

 

BI Should Play a Role in Better-Informed Decisions

 

This same principle equally applies to individuals such as better-informed patients or better-informed consumers. Ultimately, when the final decision lies with us (humans), we either choose to ignore the data or choose to use it in our decision making—assuming, of course, that it exists and we can trust it. However, even the best implementations of BI solutions can neither change nor prevent uninformed or less-informed decisions if we choose to ignore the data.

 

“Data-entangled decisions” are those in which data can potentially inform the analysis, as opposed to decisions driven purely by our emotional states or desires. Most business decisions are data-entangled: existing or new data can play an important role in them, unlike in a purely personal decision such as when to go to sleep. A data-entangled decision generally moves through three main phases when a business question, challenge, or opportunity presents itself. BI, if designed and implemented effectively, should support all three phases.

 

Phase One: Reaction

 

In the reaction phase, the initial course is fashioned out of an immediate reaction to a threat or an opportunity. Typically, some preliminary figures are accompanied by known assumptions that form the initial direction. In this early stage, initial data is still “entangled” and only the requirement for additional information can be outlined. In some cases, however, the decision may already have been made, and if so, gathering additional data for further analysis becomes a futile exercise.

 

Phase Two: Validation

 

Additional data produces opportunities for in-depth analysis, which should eventually lead to actionable insight. But these results need to be validated first using some type of critical thinking. Moreover, who validates the results is as critical as how it’s done.

 

Just as we don’t ask programmers to validate their own code, we don’t ask analysts or managers to validate their own conclusions drawn from the data. Where available and feasible, objective methods that remove assumptions or personal deductions from this phase provide the fastest and clearest path to actionable insight.

 

Phase Three: Execution

 

The execution phase is where the final decision will be made and the use of data will be completely up to the person in charge of the decision. There are three possibilities before the final decision is made and action is taken:

  1. The conclusion is supported by data, and we choose to take it into account for our decision.
  2. The conclusion is supported by data, and we choose to ignore it.
  3. The conclusion isn’t or can’t be supported by data, and we are left to our own judgment to make the decision.

 

In business, better-informed decisions often start with a strong appetite for data, followed by a healthy dose of skepticism about it. When data is available, our collective insight, enhanced by that data, becomes the guiding light for our decisions. In its absence—when we are left to decide on our own—we seek wisdom in our own experiences to fill the void.

 

Bottom Line

 

The bottom line is, we need to use data to support our arguments instead of using arguments to support our data. And BI, if designed and implemented effectively, should be the framework that supports all of this by enabling us to make faster, better-informed decisions at all levels of our organization. This, in turn, helps us drive growth and profitability.

 

Where do you see the biggest challenge in making better-informed decisions?

 

Connect with me on Twitter (@KaanTurnali) and LinkedIn.

 

This story originally appeared on the SAP Analytics Blog.

Read more >

How Caesars Entertainment Cut Big Data Processing Time from 6 Hours to 45 Minutes

Happy customers are the lifeblood of the entertainment industry. But before you can make customers (and potential customers) happy, you’ve got to understand what they want. For Caesars Entertainment, that meant putting together a big data analytics system that could handle a new, expanded set of customer data for its hotels, shows, and shopping venues, with faster processing and better results.

 

Expanded Data Environment

To improve customer segmentation and build more effective marketing campaigns, Caesars needed to expand its customer data analysis to include both unstructured and semi-structured data. It was also important to speed up processing for analytics and marketing campaign management.

Caesars built a new data analytics environment based on Cloudera’s Distribution Including Apache Hadoop (CDH) software running on servers equipped with the Intel® Xeon® processor E5 family. The new system reduced processing time for key jobs from 6 hours to just 45 minutes and expanded Caesars’ capacity to more than 3 million records processed per hour. It also enables fine-grained segmentation to improve marketing results and strengthens security to help meet Payment Card Industry (PCI) and other key security standards.
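The case study doesn’t detail Caesars’ actual jobs, but to give a flavor of how records flow through a CDH-era pipeline at this scale, here is a minimal, hypothetical Hadoop Streaming mapper (the CSV field positions are invented for illustration) that emits per-customer spend for a reducer to total up during segmentation:

```python
#!/usr/bin/env python
# Minimal Hadoop Streaming mapper: read CSV transaction records from
# stdin and emit (customer_id, spend) pairs; a reducer then sums spend
# per customer for segmentation. Field positions are hypothetical.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split(",")
    if len(fields) < 5:
        continue                 # skip malformed records
    customer_id, spend = fields[0], fields[4]
    try:
        float(spend)             # validate the amount before emitting
    except ValueError:
        continue
    print("%s\t%s" % (customer_id, spend))
```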

Reaching New Customers

The new environment makes it easier for Caesars to reach out to younger customers, who are likely to prefer using smart phones or tablets to get their information. Caesars’ new mobile app lets customers use their mobile devices to check rates and make reservations. That data goes to the Hadoop environment for analysis in real time, where Caesars can use it to fine-tune its operations. Caesars can even use this data to tailor mobile responses and offers on the fly based on factors like the customer’s preferences and location.

Creating Personalized Marketing

With faster, better analysis of all data types, Caesars can now create and deliver personalized marketing, reaching a new generation of customers and winning their loyalty.

You can take a look at the Caesars Entertainment solution here or read more about it here. To explore more technology success stories, visit www.intel.com/itcasestudies or follow us on Twitter.

Read more >

A New Order of All-Flash Hyper-Converged Storage: Atlantis HyperScale™ with Intel® Solid-State Drives

The software-defined storage (SDS) appliance concept of hyper-convergence is an attractive alternative to traditional Storage Area Network (SAN) and Network Attached Storage (NAS) systems for small, medium, and large businesses alike. Hyper-converged infrastructure is popular right now. So what is hyper-converged storage, and why should you care?

 

A hyper-converged storage system allows IT to manage compute, storage and virtualization resources as a single integrated system through a common tool set. The resulting system, often referred to as an appliance, consists of a server, storage, networking, and a hypervisor with a management framework. Hyper-converged appliances can be expanded through the addition of nodes to the base unit to suit the compute and storage needs of a business, in a manner known as scale-out.

 

Hyper-converged scale-out storage differs from the older scale-up approach. In a scale-up system, compute capacity stays fixed as storage is added, while in a scale-out system, new compute nodes can be added as the need for compute and storage arises. Scale-up storage has often been cost-prohibitive and often lacks the random I/O performance (IOPS) needed by virtualized workloads. The scale-out approach makes more efficient use of hardware resources, as it moves the data closer to the processor. When scale-out is combined with solid-state drive (SSD) storage, it offers far lower latency, better throughput, and increased flexibility to grow with your business. Scale-out is commonly used for virtualized workloads, private cloud, databases, and many other business applications.
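To make the distinction concrete, here is a toy model (purely illustrative, not any vendor’s product) of the scale-out property described above: every node added contributes compute and storage together, so the two stay in balance as the cluster grows.

```python
class ScaleOutCluster:
    """Toy model of scale-out: each node added contributes both compute
    and storage, unlike scale-up, where storage grows behind a fixed
    set of controllers."""

    def __init__(self):
        self.nodes = []

    def add_node(self, cores=24, storage_tb=10):
        self.nodes.append((cores, storage_tb))

    @property
    def capacity(self):
        cores = sum(c for c, _ in self.nodes)
        tb = sum(t for _, t in self.nodes)
        return cores, tb

cluster = ScaleOutCluster()
for _ in range(4):
    cluster.add_node()
print(cluster.capacity)  # (96, 40): compute has scaled with storage
```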

 

Atlantis Computing introduced a new all-flash hyper-converged appliance that extends the concept of software-defined scale-out storage to the cloud. Atlantis HyperScale™, a turn-key hyper-converged appliance, delivers all-flash performance storage based on the Intel® SSD Data Center Family for enterprise-wide applications. What is different is that HyperScale™, based on Atlantis USX, pools existing enterprise SAN, NAS, and DAS storage and accelerates its performance through the use of Intel SSDs. By abstracting the storage, USX delivers virtual storage volumes to enterprise applications. It further provides a context-aware data service that performs deduplication and IO acceleration in real time for quality of service, even when using public cloud services.
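Atlantis doesn’t publish USX internals, but inline deduplication in general works by content-addressing: hash each incoming block, store unique blocks only once, and let volumes hold references. A toy Python sketch of that general technique:

```python
import hashlib

class DedupStore:
    """Toy content-addressed block store illustrating inline dedup:
    identical blocks are written once and shared by reference."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}    # SHA-256 digest -> block bytes (stored once)
        self.volumes = {}   # volume name -> ordered list of digests

    def write(self, volume, data):
        refs = self.volumes.setdefault(volume, [])
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # dedup happens here
            refs.append(digest)

    def read(self, volume):
        return b"".join(self.blocks[d] for d in self.volumes[volume])
```

Two volumes written with the same operating-system image would consume the space of one copy; real products layer IO acceleration, compression, and persistent metadata on top of this basic idea.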

 

The Intel SSD Data Center Family holds the key for the HyperScale™ all-flash appliance. The Intel® SSD is designed for read- and write-intensive storage workloads with fast, consistent performance for smooth data center operation. The reliability, data integrity, and cost-effectiveness of the storage volumes in the HyperScale™ appliance help protect your data with enterprise-class features at a reasonable cost. The architecture of Intel’s SSDs ensures that the entire read and write path and logical-block address (LBA) mapping have data protection and error correction. Many enterprise workloads depend not only on reliable data, but on consistency in how quickly that data can be accessed. Consistent latency, quality of service, and bandwidth, no matter what background activities are happening on the drive, are the basis of the Intel SSD Data Center Family. Rigorous testing ensures a highly reliable SSD with consistent performance.

 

Today, the HyperScale™ all-flash hyper-converged appliance introduces a new order of scale-out storage. The turn-key appliance can eliminate scalability and performance bottlenecks and allows compute and storage capacity to grow with the needs of the business.

Read more >

Nurses Week 2015: Nurses are Superheroes

 

Today marks the start of Nurses Week, so here at Intel we want to celebrate, with you, what we consider to be the real superheroes of healthcare by saying #ThankYouNurses. Throughout the next seven days we’ll be bringing you some fantastic blogs that demonstrate our appreciation for nurses while highlighting how technology is helping to deliver best-ever patient care.

 

I want you to help us to say thank you to nurses across the world by tweeting your appreciation using the hashtag #ThankYouNurses throughout the week. If you have received some fantastic care from a nurse or want to highlight the work of a great colleague I’d love to help you say thanks by sharing your stories on Twitter – so tag a nurse, @intelhealth and say #ThankYouNurses with us.

 

As a nurse practitioner I’ve seen up close how technology is truly revolutionising nursing through enhanced mobility, whether that be having access to the latest medical records when visiting a patient at home or helping a patient to understand their condition better by displaying an injury on a tablet at the bedside in a hospital. 

 

We are also working with Microsoft in Health to highlight all that nurses do at www.microsoft.com/nurses so please check out some of the great nursing stories there too. Nurses Week ends on a high with International Nurses Day on May 12th so let’s keep the conversation going throughout the week.

 

I very much look forward to sharing our collective appreciation for nurses with you.

Read more >

Open Storage Management: Critical to SDI Delivery

Anyone involved in the business of data storage knows we have many serious challenges on our hands: an explosion of unstructured data, ever-tougher compliance and regulatory issues, and increasing storage complexity.

 

In addition, storage is one of the roadblocks that impede the journey to cloud. Why? Today’s storage solutions lack interoperability, have different management consoles, and do not scale well. This historical reality for enterprise storage creates an enormous management challenge for data center operators implementing cloud environments. Without fundamental changes, these problems will be magnified in the data centers of tomorrow.

 

To move forward into the world of software-defined infrastructure and cloud data centers, IT teams need an open, intelligent, and flexible framework for dynamically managing storage resources. These resources will be from different vendors with different characteristics: They will include open and proprietary products; they will utilize different protocols; they will have different levels of performance and reliability. With enterprise demand focused on the choice of best-in-breed storage solutions across a wide array of usage scenarios, the requirement for interoperable solutions is only going to grow more acute.

 

At Intel, we are committed to working with our industry partners and the broader storage ecosystem to create this open framework—and clear the path to software-defined storage (SDS).

 

From Intel’s perspective, SDS is really about bringing cloud benefits to storage, including auto-provisioning, self-service models, and single-pane-of-glass management. Historically, these benefits have been elusive because of the lack of standards, interoperability, and common management across the wide and expanding range of storage systems.

 

While our full vision for an SDS framework will have to wait for another post, one of the key elements is a central control plane that unifies storage management and enables the orchestration of storage resources.

 

A key enabler of the new SDS architecture is a single-pane-of-glass management control plane.

 


 

There is, however, a big caveat here: In order for the SDS control plane to fully resolve management challenges, it must be able to interface with and control a large variety of storage systems—which brings us back to the need for open APIs that drive interoperable systems. No single storage provider can solve this problem; this is a challenge that can be addressed only by a broad community working together to deliver common storage standards.
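As a sketch of what such an open control plane could look like (hypothetical interfaces, not any real API), the essential idea is a single driver contract that every vendor implements, so provisioning logic never needs to know which array backs a given pool:

```python
from abc import ABC, abstractmethod

class StorageDriver(ABC):
    """Hypothetical contract a vendor driver would implement so one
    control plane can manage heterogeneous storage systems."""

    @abstractmethod
    def create_volume(self, name: str, size_gb: int) -> str:
        """Create a volume and return its backend identifier."""

    @abstractmethod
    def attach(self, volume_id: str, host: str) -> None:
        """Expose the volume to the named host."""

class ControlPlane:
    """Single pane of glass: one provisioning path across vendors."""

    def __init__(self):
        self.drivers = {}   # pool name -> vendor driver

    def register(self, pool: str, driver: StorageDriver) -> None:
        self.drivers[pool] = driver

    def provision(self, pool: str, name: str, size_gb: int, host: str) -> str:
        driver = self.drivers[pool]        # vendor chosen by pool, not by caller
        volume_id = driver.create_volume(name, size_gb)
        driver.attach(volume_id, host)
        return volume_id
```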

 

This is where Intel—with its long history of driving standards, enabling open APIs, working with and developing communities, and cultivating open source capabilities where they make sense—can help. Intel is very supportive of open approaches that enable simplified management and interoperability to benefit end users. Our goal is to work with the entire storage ecosystem to support the development of open APIs that enable interoperability and simplify storage infrastructure management.

 

With these thoughts in mind, we support the just-announced Project CoprHD initiative to create an open source version of the EMC ViPR Controller. The project makes the code for the ViPR Controller, including all of the storage automation and control functionality, open for community-driven development. We think moves like this are a step in the right direction.

 

If you’d like to contribute to the industry’s efforts to drive open approaches to SDS, please connect with us.

Read more >

Could Your Old PCs be Putting Your Business at Risk?

Here’s an example: An employee receives an email—apparently from a legitimate source—asking him to update an account password. As requested, he enters his old password and then types in a new one. Unfortunately, that’s all it takes for a hacker to steal $200,000 from his small business’s bank account, much of it unrecoverable. It’s a simple but extremely costly mistake, and it could happen to anyone. In fact, over the past few years, it’s been happening a lot. Did you know that some of the biggest security breaches in recent memory—including attacks on Sony, Target, and JPMorgan Chase—started with a phishing email sent to an employee?

 

If you own a business, you’re at risk. No matter how diligent you and your employees are about security, mistakes can happen. And the results can be disastrous. Virus protection and other software solutions—though useful and necessary—only get you so far, especially if your business is using PCs that are more than two years old. The problem is that software-only security solutions from even a few years ago can’t keep up with today’s cybercriminals and are not sufficient to protect your devices and vital business data.

 

So what can you do to stay safe? Don’t rely on software alone. You need your hardware to do the heavy lifting.

 

What You Can Do to Make Your Business More Secure

 

New desktops with at least 4th generation Intel Core processors have hardware-enhanced security features that allow hardware and software to work together, protecting your business from malware and securing the important, private data and content you create and share. Features such as Intel® Identity Protection Technology (Intel® IPT), Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), and others are crucial to making your business more secure.
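To see why AES-NI matters to everyday software: mainstream crypto libraries detect and use these instructions automatically, so applications get hardware-accelerated encryption without code changes. A small example with the OpenSSL-backed Python cryptography package, which uses AES-NI when the CPU provides it:

```python
# Authenticated AES-256-GCM encryption. The OpenSSL backend beneath the
# 'cryptography' package uses the CPU's AES-NI instructions when they
# are available; the application code is identical either way.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce, unique per message
ciphertext = aesgcm.encrypt(nonce, b"quarterly payroll data", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"quarterly payroll data"
```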

 

With hackers working around the clock to identify the next potential victim, it’s more important than ever for you to prioritize security. Read the new Intel white paper to learn more about what’s at risk, five new hardware-enhanced security features that help combat cybercrime, and why replacing your pre-2013 PC is a smart move.

 

In the meantime, join the conversation using #IntelDesktop, and get ready to rediscover the desktop.


This is the fifth installment of the Desktop World Tech Innovation Series.

 

To view more posts within the series click here: Desktop World Series

Read more >

Moore’s Law: exponential opportunity for education and empowerment

“I looked back to the beginning of the technology I considered fundamental—the planar transistor—and noticed that the [number of components] had about doubled every year. And I just did a wild extrapolation saying it’s going to continue to double every … Read more >

The post Moore’s Law: exponential opportunity for education and empowerment appeared first on CSR@Intel.

Read more >

Analytics: The New Frontier for Business Competitiveness

Not that long ago, data analysis focused mainly on looking backward to understand things that happened in the past. Today, thanks to Moore’s Law and the resultant advances in computing and memory technologies, analytics can tell us what is happening in real time and help us predict what will happen in the days to come.

 

This landmark shift in our ability to extract value out of data is a key enabler for the new digital service economy. In this new era, an organization’s competitive edge increasingly hinges on its ability to turn an avalanche of data into actionable insights that improve operations and guide the creation of essential new products and services.

 

This isn’t an opportunity that is limited to Web 2.0 businesses or high-tech powerhouses. The opportunity for pervasive analytics and insights spans virtually all industries—from healthcare to transportation, from banking to manufacturing.

 

With powerful analytics solutions, physicians can diagnose illnesses faster and create personalized treatment plans. Retailers can better understand buying behaviors to stock up on the products people are most likely to need. Car manufacturers can use predictive failure analysis to make repairs proactively—before customers find themselves stuck on the side of the road.

 

While these examples are diverse, they all share a common central focus: the combination of big data and high-powered computing solutions with sophisticated technologies like in-memory analytics that accelerate time to insight. And this is where the latest generation of Intel® Xeon® processors enters the picture.

 

The new Intel® Xeon® processor E7 v3 family is designed to accelerate real-time analytics on enormous datasets at multi-terabyte and even petabyte scale. With up to 20 percent more cores, threads, cache, and system bandwidth than previous-generation processors, the Intel Xeon processor E7 v3 family makes fast work of complex, high-volume transactions and queries.

 

In addition, we’ve added an expanded memory footprint to support in-memory analytics—one of the keys to gaining immediate insights from big data. We’ve also added sophisticated technologies like Intel® Advanced Vector Extensions to boost simulation performance, Intel® Transactional Synchronization Extensions (Intel® TSX) to accelerate OLTP performance, and Intel® Run Sure Technology to support mission-critical uptime and advanced data integrity.
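As a small illustration of why vector extensions matter for analytics: numeric libraries such as NumPy push elementwise math into compiled, SIMD-friendly kernels, which is exactly where instruction sets like Intel AVX pay off. A toy comparison (actual speedups vary with hardware and library build):

```python
import time
import numpy as np

n = 2_000_000
prices = np.random.rand(n)
quantities = np.random.rand(n)

t0 = time.perf_counter()
total_loop = sum(p * q for p, q in zip(prices, quantities))  # scalar loop
t1 = time.perf_counter()
total_vec = float(np.dot(prices, quantities))   # one vectorized kernel call
t2 = time.perf_counter()

assert abs(total_loop - total_vec) < 1e-6 * total_vec
print(f"loop: {t1 - t0:.3f}s  vectorized: {t2 - t1:.4f}s")
```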

 

Let’s consider a couple of real-life examples of the potential to put powerful analytics and simulation tools to work in conjunction with Intel Xeon processors to help organizations extract value from big data in real time:

 

  • FarmLogs uses analytics tools and high-powered computing solutions to help farmers make their land more productive. It achieves this goal by putting sensors on farm machines, connecting the machines to the Internet, and analyzing data streams in real time. With an instant view of how different areas of their fields are performing, farmers can adjust seed, fertilizer, and other variables to best match the field conditions. This helps them avoid waste and improve the productivity of the field (a minimal sketch of this kind of streaming analysis follows this list).

 

  • Pacific Northwest Seismic Network uses big data analytics to provide the public and others with early warnings about earthquakes and ground motions. The organization is now working to develop the ability to warn people about earthquakes before the shaking has reached them—to give them precious seconds, or maybe even a minute, to protect themselves and those around them.
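Neither example’s underlying pipeline is described in detail; purely as a flavor of this kind of streaming analysis (field names and thresholds are invented), a rolling-window monitor for sensor readings might look like:

```python
from collections import defaultdict, deque

class ZoneMonitor:
    """Toy real-time aggregator: keep a rolling window of readings per
    field zone and flag zones drifting from a target moisture level."""

    def __init__(self, window=20, target=0.35, tolerance=0.05):
        self.target, self.tolerance = target, tolerance
        self.readings = defaultdict(lambda: deque(maxlen=window))

    def ingest(self, zone, moisture):
        buf = self.readings[zone]
        buf.append(moisture)
        avg = sum(buf) / len(buf)
        if abs(avg - self.target) > self.tolerance:
            return "zone %s: average moisture %.2f, adjust irrigation" % (zone, avg)
        return None                 # within tolerance, nothing to do

monitor = ZoneMonitor()
print(monitor.ingest("NW-4", 0.22))  # flags the dry zone immediately
```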

 

There is a broader theme to consider here: Data-driven insights can improve the human condition—whether it’s increasing food production, saving lives when earthquakes strike, or meeting some other goal that can be achieved only with real-time insights gleaned from massive amounts of data.

 

At a business level, data analytics are increasingly tied to competitiveness. In fact, we believe that within three to five years data analytics will become the No. 1 data center application in terms of importance to the business. In virtually every industry, organizations will change their processes to take advantage of analytics to improve operations, products, and personalization.

 

To enable this transformation, Intel is working to democratize actionable insights by making analytics and simulation tools easy to deploy, easy to use, and efficient to run. And, as always, we are working with a broad ecosystem to provide open solutions and flexible tools to meet diverse customer needs.

 

For a deeper dive into the capabilities of the Intel® Xeon® processor E7 v3 family, visit intel.com/ITcenter.

Read more >

A Foundation for Real-Time Insights via Analytics

For those of us who follow technology, it seems we constantly hear that big data will change life as we know it. While this is an interesting perspective, it misses an important point. Data, by itself, is not a valuable asset for an organization. The industry-transforming power of data comes from insights derived from analytics on that data – all driven by powerful computing systems.

 

To stay competitive in today’s world, business intelligence solutions are increasingly being woven into critical, real-time business processes across all industries. This is a game changer that can help businesses improve decision making, achieve better results, better connect with customers, and remap the ways they deliver value. In fact, entire industries are being disrupted by the ability to harness the power of advanced analytics to business advantage.

 

Despite this promise, customers face many hurdles in fully harnessing the capabilities of advanced analytics.  Common struggles include trust, speed, and scale.  Can I trust the data and analysis?  Do I have the performance for real-time decisions?  Can I scale and unify massive datasets?  It’s worth noting that just 27 percent of executives say their big data projects are successful, and 65 percent cite determining value as their biggest barrier to adoption.

 

As we launch the new Intel® Xeon® processor E7 v3 family, we tackle these customer challenges head on – delivering the world’s fastest system for real-time analytics** on a trusted, mission-critical, highly scalable platform.  For the most complete information on performance records, see the full list.

 

 

Servers based on the new Intel Xeon processor E7 v3 family provide exceptional performance and scalability for real-time analytics operating on core business data. These servers offer the industry’s largest memory per socket (12 TB in an eight-socket system) to support in-memory data analytics, which improves analytics performance by moving data closer to the processors.

 

With 20 new world-record performance results, the Intel Xeon processor E7 v3 platform showcases the benefits of the Haswell microarchitecture: 20% more cores, 20% more last-level cache, up to 32 sockets, and unique features like Intel® Transactional Synchronization Extensions (Intel® TSX). Intel TSX helps provide up to 6X online transaction processing (OLTP) performance for optimized database solutions, via a mechanism that accelerates multi-threaded workloads by dynamically exposing otherwise hidden parallelism***. Xeon E7 v3 also delivers the trusted platform that mission-critical applications require, including added cryptographic performance via new instructions like Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI) and Intel® Advanced Vector Extensions 2 (Intel® AVX2), and supports Intel® Run Sure Technology for maximum uptime.

 

So what kind of use cases can all of this computing horsepower provide to the enterprise? I’d like to share a couple of interesting examples of big data analytics powered by Intel Xeon processor based systems.

 

Nippon Paint is one of Asia’s leading paint and coatings companies, with 57 manufacturing facilities and operations spanning 15 countries and regions. The company has deployed SAP HANA to accelerate social media analytics. With sophisticated real-time analytics, the company can capture consumer behaviors and preferences more accurately and quickly. Among other benefits, the insights gained through analytics help Nippon Paint create targeted marketing campaigns, improve customer interactions, and develop products that meet emerging customer requirements.

 

In another example, a power company used SAS Analytics software to analyze data gathered every 15 minutes from 38,000 smart meters in seven cities to predict the amount of electricity needed at certain times. This forecasting is important because electricity cannot be stored—it needs to be produced at the correct levels when it is needed. With the ability to frequently analyze smart meter readings, the utility improved its forecast accuracy by 9 percent, for a savings of $9 million.
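The post doesn’t describe the utility’s actual model, but a useful mental model for interval-level load forecasting is a naive seasonal baseline: predict each 15-minute slot from the average of the same slot on previous days. A toy sketch:

```python
from collections import defaultdict

def seasonal_baseline(history):
    """Naive load forecast: average each 15-minute interval over past days.

    history: iterable of (interval, kwh) pairs, where interval is the
    slot index 0..95 within a day (96 slots of 15 minutes each).
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for interval, kwh in history:
        sums[interval] += kwh
        counts[interval] += 1
    return {i: sums[i] / counts[i] for i in sums}

forecast = seasonal_baseline([(0, 410.0), (0, 395.0), (1, 388.0)])
print(forecast[0])   # 402.5 kWh expected in the first slot of the day
```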

 

A common thread running through these stories is the use of the Intel Xeon processor-based platform as the foundation for better business intelligence. As your data and analytics workloads continue to grow, these powerful servers can help you keep pace with growth while turning big data into big opportunities for your business.

 

For a closer look at the new Intel® Xeon® processor E7 v3 product families, visit intel.com/xeonE7.

 

 

 

 

Intel, the Intel logo, and Xeon are trademarks of Intel Corporation in the United States and other countries.

* Other names and brands may be claimed as the property of others.

**World’s fastest system for analytics claim based on servers using the Intel® Xeon® processor E7-8890 v3 being the performance leader of SAP BW-EML* @ 1B scalable records and 2B records as of 5 May 2015.  For the most complete information, see http://www.intel.com/E7v3records.

***Up to 6x business processing application performance improvement claim based on SAP* OLTP internal in-memory workload measuring transactions per minute (tpm) on SuSE* LINUX Enterprise Server 11 SP3. For configuration and result details, see http://www.intel.com/E7v3records.

Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors.
Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.  For more complete information visit
http://www.intel.com/performance

Configurations: http://www.intel.com/E7v3records.

Read more >

Building Right from the Start for Big Data with Cisco Unified Computing System

By Todd Brannon, Director of Product Marketing, Unified Computing

 

 

At the upper end of our UCS server portfolio we feature systems that deliver the large memory footprints and core counts that performance-intensive workloads demand.  Today’s announcement of the Intel® Xeon® processor E7 v3 family brings powerful new capabilities to this class of system.

 

Why is this important?

 

Our customers are striving to become intelligence-driven throughout their operations in order to create a perpetual and renewable competitive edge. Taking a long-term view in choosing the right infrastructure is essential; here are two reasons why:

 

  1. You never hear about a big data environment getting smaller.  Massive increases in data volume mean these environments will inevitably grow, and for many, this will mean continuously expanding clusters of hundreds or thousands of servers.
  2. Data is the lifeblood of the digital enterprise.  As the use of big data becomes pervasive and critical to day-to-day decision-making, the performance and predictability of these computing platforms will become increasingly paramount to the success of the business.  Choose partners you can trust.

 

IT departments need an infrastructure that is designed for deployment and operation at scale.  Down at the server level, particularly for scale-up workloads, they will always need more horsepower, and it can’t come at the cost of additional power and cooling load.  Cisco UCS and the new Intel Xeon processor E7 v3 family lineup deliver on both of these vectors.

 

Traditional servers, essentially designed as stand-alone devices, aren’t built with the needs of these new big-data environments in mind.  This is where Cisco UCS and our Integrated Infrastructure for big data come in.  We’re bringing customers a platform optimized for long-term success because of its unified design, inherent scalability, advanced reliability, and robust security.  By abstracting the identity and configuration of individual servers and managing exclusively through policy constructs, UCS allows IT teams to manage up to 10,000 servers as a single pool of resources.  Many in the industry are still focused on the issues of density and power, which most customers consider basic computing table stakes today.  UCS is designed to optimize the most important resource in the data center: people’s time.
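Conceptually, this policy-based approach separates a server’s identity and configuration from the physical hardware. The toy sketch below (an illustration of the idea, not the UCS API) shows the core move: define the policy once, then stamp it onto any number of servers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ServiceProfile:
    """Toy stand-in for a policy object: identity and configuration are
    defined once, independent of any physical server."""
    boot_order: tuple
    vlan: int
    firmware: str

def apply_profile(profile, servers):
    # One policy drives configuration for the whole pool, instead of
    # each server being configured (and drifting) by hand.
    return {server: profile for server in servers}

gold = ServiceProfile(boot_order=("san", "lan"), vlan=110, firmware="2.2.3")
fleet = apply_profile(gold, ["blade-%d" % i for i in range(1, 5)])
print(len(fleet), "servers configured from one policy")
```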

 

Intel and Cisco have a special partnership because UCS gives customers the most operationally effective platform to harness the performance and efficiency Xeon delivers.  The new class of systems we’re releasing today can scale to 72 processors and a 6TB memory footprint, ideal for the latest generation of in-memory database workloads as well as traditional ERP applications and databases. Cisco was the first server vendor to publish results on the new TPCx-HS benchmark, and it’s our intention to continue to lead the industry in terms of performance for these workloads.

 

Disruption and opportunity will continue to accelerate in the realm of analytics.  Cisco and Intel are partnering to build the platforms customers need to build for the long haul.  To stay up to date, please follow us on Twitter at #CiscoUCS and #CiscoDC.

Read more >

Momentum

By Nick Winkworth, Product Marketing – Compute Platform, Hitachi Data Systems

 

 

If, like me, you have lived in the “technology bubble” for any length of time, you have probably become accustomed, perhaps even blasé, about the incredibly rapid – and ever accelerating – pace of change in our industry.

 

Today, everything we touch seems to be generating data, from the heart rate sensor on your smart watch to the thermostat on your wall to the jet engine on the airplane that takes you on your next vacation or business trip – to say nothing of all the videos, blog posts, tweets and emails we all generate every day.

 

The big question that companies like Hitachi are now starting to ask is “how can all this data improve our lives and make this planet a better place to live?” But before we can even begin to answer that question, we must build the infrastructure to capture, store and process that data…

 

And that’s no easy task.

 

This is not something that can be done from a standing start. The companies that will succeed started out on this long road many years ago. They have learned from each new generation of technology; they have added, changed, tweaked and improved along the way to get to where they are now, which is just the starting place for the next step.

 

Sure, there’s the occasional “giant leap”, but if you look closer you will inevitably see a longer story of incremental change, prior inventions and ideas that provided a foundation.

 


Hitachi HIPAC MK 1 (1957)

 

Hitachi joined the information age in the late 1950s with one of the earliest electronic computers, the HIPAC-MK1. Since then, the company has learned how to meet the needs of an incredibly diverse range of customers around the globe in many industries, how to consistently deliver high quality, and how to scale to the highest capacities demanded by some of the world’s biggest organizations.

 

Today, our enterprise storage products, servers and converged infrastructure solutions are built on that strong foundation, and today a key part of that foundation is Intel’s processor technology.

 

With the introduction of the new Intel Xeon E7v3 Processor Family this week, we take another step forward together, propelled not by a single breakthrough but by improvements built upon many years of hard work and customer feedback.

 


EVOLUTION OF HITACHI INTEL BASED BLADE SERVERS

 

The new Intel Xeon E7v3 Processor Family allows Hitachi to carry forward the unique innovations that have been developed and advanced over many years, such as support for native Logical Partitions (LPARs) and the ability to scale a blade’s processor, memory and IO capacity incrementally by combining blades.

 


 

Without this legacy of continuous innovation it would be impossible to keep up with the exponentially increasing demand for capacity and performance for which today’s market is just the tip of the iceberg.

 

New SAP Solutions deliver over 40% more performance at 30% lower price

 

This ever-increasing demand does not come with an ever-increasing budget, of course, and this week’s introduction of the Intel Xeon E7v3 Processor Family plays a critical role in Hitachi Data Systems’ ability to deliver big data analytics solutions for customers such as SPAR Austria Group at a price point that stays within budget, even as needs grow. Like many of our customers, SPAR depends on rapid analysis of an increasing volume of unstructured data, which for the past year has been running on Hitachi Compute Blade 2000 systems with Intel Xeon E7 processors and Hitachi VSP storage platforms.

 

Together with Intel’s announcement, we are introducing the next generation in that lineage: SAP HANA scale-up solutions based on our new CB520X B2 blades (powered by Intel’s new Intel Xeon E7v3 Processor Family), along with our new CB2500 blade chassis and the next generation of (Intel-powered) Virtual Storage Platform (VSP) technology. Lab testing has shown that the new solutions deliver 46% more performance (measured as data loading speed) at 30% lower price, compared to the previous generation of UCP solutions for SAP HANA.

 

Why is this so important? Social Innovation

 

One thing that sets Hitachi and Hitachi Data Systems apart is our vision. Hitachi’s goal is to deliver solutions and services to make societies safer, smarter and healthier. This is what we call Social Innovation.

 

You can read more about Social Innovation in Hu Yoshida’s blog, or at this website.

 

As announced last week at Connect 2015, Hitachi Data Systems is bringing to market a number of purpose-built solutions to address a variety of Social Innovation challenges across many industries. These solutions combine the power of connected devices and technologies – or the Internet of Things (IoT) – with operational technology (OT), machine-to-machine (M2M) and advanced data analytics, and best-of-breed IT infrastructure, all in a unified, fully integrated stack.

 

Underlying all these solutions are analytics platforms built on best-of-breed infrastructure including Hitachi storage, servers, networking, content platforms, and converged solutions – all built on a foundation of Intel technology.

 

Today’s Intel Xeon E7 v3 Processor Family announcement strengthens Hitachi’s infrastructure portfolio and maintains the momentum of innovation, from both Hitachi and Intel, that is required to ensure that the demands of big data and Social Innovation solutions can be met today and in the future.

Read more >

The Power Behind Social Innovation, Software Defined Architecture and the Internet of Things with Hitachi and Intel

By Hu Yoshida, CTO, Hitachi Data Systems

 

 

Last week in Las Vegas, Hitachi and Hitachi Data Systems held our Connect 2015 event, where we announced new offerings around Social Innovation, Software Defined Infrastructure and the Internet of Things. This week we are participating at Sapphire in Orlando, where we are announcing new solutions to support real-time analysis of big data with SAP HANA.

 

One common thread that ties together all these new Hitachi solutions is the power of Intel technology.

 

As Hitachi transforms from a provider of industry-leading data storage technologies to a global provider of solutions that address some of the most challenging problems on the planet, we are delighted to be able to work closely with Intel to keep up with our customers’ requirements to process the exponentially accelerating volumes of data that will come from machine data, networks, videos and patient health records, to name just a few sources.


 

New SAP Solutions deliver over 40% more performance at 30% lower price

 

This week’s introduction of the Xeon E7v3 family of processors is another step in a long legacy of innovation that has enabled Hitachi Data Systems to solve big data analytics problems for customers such as SPAR Austria Group, whose business depends on rapid analysis of very large volumes of unstructured data running on Hitachi Compute Blade 2000 systems and VSP storage platforms.

 

Together with Intel’s announcement, we are introducing the next generation in that lineage: SAP HANA scale-up solutions based on our new CB520X B2 blades (powered by Intel’s new Xeon E7v3 processors), along with our new CB2500 blade chassis and the next generation of (Intel powered) Virtual Storage Platform (VSP) technology. Lab testing has shown that the new solutions deliver 46% more performance (measured data loading speed) at 30% lower price, compared to the previous generation of UCP solutions for SAP HANA.


Hitachi Compute Blade 2500 with Xeon E7v3 powered CB520X B2 Blades

 

The new Intel Xeon processor – along with the rest of Intel’s product portfolio – is critical to Hitachi Data Systems’ ability to create the Social Innovation solutions to address the next generation of problems that face our companies, cities and global infrastructure.

 

The goal of Social Innovation is to deliver new solutions and services to make societies safer, smarter and healthier. Hitachi Data Systems is developing purpose-built solutions for a variety of markets that combine the power of connected devices and technologies – or the Internet of Things (IoT) – with operational technology (OT), machine-to-machine (M2M) and advanced data analytics, and best-of-breed IT infrastructure, all in a unified, fully integrated stack.


New Solution and Services for Social Innovation

 

The following are some of the solutions that Hitachi Data Systems announced last week that are ready for market:

 

 

Hitachi Visualization for Public Safety

 

This solution provides situational awareness for law enforcement professionals by integrating multiple types of data from cameras, sensors, emergency dispatch and social media.

 

Hitachi Live Insight for Telecom

 

Hitachi Live Insight for Telecom offers network analytics specifically designed to help communication service providers and their customers enhance network services using real-time insight.

 

Hitachi Live Insight for IT Operations

 

This cloud-based M2M analytics solution helps customers achieve optimal performance and availability from their IT infrastructure and gain operational intelligence at the lowest total cost of ownership (TCO).

 

Hitachi Clinical Repository for connected health

 

HCR for connected health empowers healthcare professionals with sophisticated data analytics tools and proven delivery methods to optimize patient care. It provides a multipurpose data repository where all clinical and nonclinical data can be stored, backed up, preserved and retrieved on a single, integrated platform.

 

Hitachi Live Insight Center of Excellence

 

Hitachi Live Insight Center of Excellence helps organizations confidently and swiftly test, customize and deploy advanced data analytics solutions, applications, platforms, and integrated solutions to support new business initiatives using a single point of coordination across various Hitachi and third-party resources.

 

You can find more information about Social Innovation here.

 

Software Defined Infrastructure

 

Underlying all these analytics platforms is best-of-breed software-defined infrastructure. In addition to our expanded portfolio of software-defined storage, content platforms, and converged solutions, we have expanded our midrange storage and the range of our converged solutions, and released a new hyper-converged solution for data lake and scale-out applications.

 

Intel’s Xeon E7v3 announcement enhances our infrastructure story for servers and carries forward Hitachi’s unique innovations in server technology such as x86 LPARs, automated blade failover and incremental scaling from 2- to 8-socket systems. Together with our leading storage products, these features allow our solutions to deliver the very large memory and IO capabilities that these big data analytics and Social Innovation solutions demand.

 

Intel is a key contributor to all of our announcements in Social Innovation and Software Defined Infrastructure. Intel’s innovation in processor technology, combined with Hitachi Data Systems’ innovations in storage, servers and converged infrastructure design for scale-up and scale-out – together with real time analytics – will enable us to deliver on our goal for Social Innovation, to make societies that are smarter, safer, and healthier.

 

For more information on Hitachi Data Systems Connect 2015 announcements please see the following resources on social innovation and software defined infrastructure.

Read more >

New Insights Through Real-Time Analytics

By David Suh, Executive Director, Lenovo EBG Solutions & Ecosystem

 

 

New sensors, new devices, the Internet of Things. Social media creating new channels of communication. Zettabytes of digital content created every year. We’re talking about big data. Companies in all industries are taking advantage of big data to improve their business results. How do you turn big data into big-time results for your business? Through analytics. Analytics workloads are becoming ever more pervasive among mainstream corporations. The level of investment in analytics confirms an increased awareness of its benefits in delivering business results. According to IDC, the worldwide business analytics market was $37.7 billion in 2013 and is expected to grow to $59.2 billion by 2018.

 

Real-Time Results

 

Traditional analytics often involves a batch process based on data from last week, last month or even last quarter. However, with increasing competitive demands, companies are looking for ways to make better decisions based on the latest data. The ever-increasing performance of modern processors means data can be filtered, analyzed and turned into business insight in a matter of moments. For example, if your company is running a marketing campaign, wouldn’t it be useful to know in real time how customers are responding and adjust the campaign to maximize results?

 

With real-time analytics, reports are updated continuously, allowing companies to stop guessing. Experts develop an appropriate starting point and focus on what to measure.  Systems are then adapted, mixing expert intuition with the reality of the situation, as shown by real data from real customer behavior. The ability to adapt in minutes — sometimes seconds — can translate directly into business success.

 

Analytics-Driven Success

 

Success is driven by increased relevance of information to users. Mass personalization of web sites allows advertising to be targeted based on user behavior. Response rates improve. Fraud detection improves dramatically as new sources of information are leveraged in real time. Losses decrease. Healthcare facilities combine data from multiple sources and support the rapid decisions necessary in critical situations. Patient outcomes improve.

 

The ability to handle streaming data from multiple sources and deliver key insights in real time places unique pressures on the design of a computing system. The best insights are developed when larger data sets can be analyzed. Delivering sub-second response times requires intense compute power and low latency. Ingesting large volumes of data from multiple sources requires high scalability. Perhaps most important, systems critical to delivering key business insights must provide leadership availability.

 

Real-Time Analytics Tools

 

The tools to handle this onslaught of data rely on in-memory computing technology, holding the data entirely within main memory, rather than on solid-state or traditional hard drives. To deliver on the promise of real-time analytics, a computing system has to be designed to meet those requirements. System x X6 8-socket servers from Lenovo can support up to 12TB of memory and have self-healing RAS features that ensure extremely high availability. In a recent ITIC 2014-2015 Survey, System x was top-ranked in x86 server reliability.
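As a toy illustration of the in-memory idea (not any vendor’s engine): when columns live in RAM, an aggregation is a memory scan rather than a series of disk reads.

```python
import numpy as np

class InMemoryColumnStore:
    """Toy in-memory column store: columns are held in RAM as NumPy
    arrays, so filters and aggregations scan memory, not disk."""

    def __init__(self, **columns):
        self.cols = {name: np.asarray(values) for name, values in columns.items()}

    def filtered_sum(self, value_col, where_col, equals):
        mask = self.cols[where_col] == equals    # vectorized filter
        return float(self.cols[value_col][mask].sum())

store = InMemoryColumnStore(
    region=np.array(["east", "west", "east"]),
    revenue=np.array([120.0, 80.0, 200.0]),
)
print(store.filtered_sum("revenue", "region", "east"))  # 320.0
```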

 

The System x X6 family contains the Intel® Xeon® E7 4800/8800 v3 family of processors and delivers up to 56 percent more compute performance than previous generations. The latest X6 servers support both DDR3 and DDR4 memory and accommodate a variety of flash technologies.

 

Lenovo also has a large ISV ecosystem spanning thousands of applications. Lenovo combines its hardware infrastructure with ISV software to form analytics solutions that take the guesswork and lengthy time frames out of solution deployment. Clients recognize this benefit, as reinforced in the most recent customer satisfaction survey from Technology Business Research (TBR). TBR made note of the end-to-end solution capabilities of Lenovo System x servers and ranked System x first in customer satisfaction.

 

For more information on System x X6 servers and real-time analytics solutions, visit Lenovo.com.

Read more >

Shifting the Boundaries: Introducing the Dell PowerEdge R930

By Ashley Gorakhpurwalla, VP and GM, Dell Servers Solution

 

 

At the pace of today’s business transformation, workloads continue to evolve – datasets get bigger, and stakeholders want more information and faster time to insight. But is the infrastructure evolving with the workloads? You’d be surprised at the number of enterprises that would answer “no” to that question due to cost, system uptime, and other challenges.


Enter the Dell PowerEdge R930, our most powerful server specifically designed for the most demanding enterprise applications such as in-memory databases, enterprise resource planning (ERP), and online transaction processing (OLTP). The R930, powered by Intel’s new Xeon E7 v3 family of processors, can flexibly scale to optimize transactions and operations while reducing latency, allowing customers to maximize application performance, accelerate workloads, protect mission-critical and data-intensive applications and reduce deployment time by 10x.

 

This ability to do what was once reserved for proprietary RISC servers and mainframes would never be possible without the continuing evolution of the microprocessor as well as Dell and Intel’s commitment to gather feedback from customers, who inspire us to design solutions that solve real-world business problems at the frontlines of IT.

 

At the heart of this shift in the boundary of what’s possible in computing is a fun fact: since the PowerEdge portfolio was introduced some 20 years ago, the x86 server market has grown more than 600 percent, while the UNIX market has been on a constant decline – shrinking 70 percent between 2000 and 2013(1). Even so, the RISC and mainframe markets present a $9.1 billion (USD) addressable market in 2015(2). With the PowerEdge R930 powered by Intel, these RISC customers can migrate from UNIX to Linux with ease and move to a more innovative, future-ready data center and reduce business risk (pun completely intended).

 

Along with Intel, Dell has been seeing more and more customers make that switch in order to take advantage of new technologies such as in-memory databases. Ameco Beijing, an aircraft maintenance, repair and operations leader, reports that by using these four-socket rack servers it has been able to reduce total cost of ownership by nearly 50 percent, achieve 99.99 percent system availability and improve SAP ERP performance by 3.5 times.

 

As Dell, Intel and their customers continue to push the boundaries on the next frontier of computing, we invite you to take a closer look at the PowerEdge R930 or share any feedback about Dell Servers by following us on Twitter.

 

 

 

 

(1) According to IDC Server Tracker

(2) According to IDC’s WW Server Workloads 2014 Model

Read more >
