Anyone involved in the business of data storage knows we have many serious challenges on our hands: an explosion of unstructured data, ever-tougher compliance and regulatory issues, and increasing storage complexity.
In addition, storage is one of the roadblocks that impede the journey to cloud. Why? Today’s storage solutions lack interoperability, have different management consoles, and do not scale well. This historical reality for enterprise storage creates an enormous management challenge for data center operators implementing cloud environments. Without fundamental changes, these problems will be magnified in the data centers of tomorrow.
To move forward into the world of software-defined infrastructure and cloud data centers, IT teams need an open, intelligent, and flexible framework for dynamically managing storage resources. These resources will be from different vendors with different characteristics: They will include open and proprietary products; they will utilize different protocols; they will have different levels of performance and reliability. With enterprise demand focused on the choice of best-in-breed storage solutions across a wide array of usage scenarios, the requirement for interoperable solutions is only going to grow more acute.
At Intel, we are committed to working with our industry partners and the broader storage ecosystem to create this open framework—and clear the path to software-defined storage (SDS).
From Intel’s perspective, SDS is really about bringing cloud benefits to storage, including auto-provisioning, self-service models, and single-pane-of-glass management. Historically, these benefits have been elusive because of the lack of standards, interoperability, and common management across the wide and expanding range of storage systems.
While our full vision for an SDS framework will have to wait for another post, one of the key elements is a central control plane that unifies storage management and enables the orchestration of storage resources.
A key enabler of the new SDS architecture is a single-pane-of-glass management control plane.
There is, however, a big caveat here: In order for the SDS control plane to fully resolve management challenges, it must be able to interface with and control a large variety of storage systems—which brings us back to the need for open APIs that drive interoperable systems. No single storage provider can solve this problem; this is a challenge that can be addressed only by a broad community working together to deliver common storage standards.
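The idea of one control plane fronting many heterogeneous storage systems can be sketched as a provider-agnostic driver interface. The class and method names below are illustrative assumptions for this post, not part of any actual SDS standard or the ViPR Controller API:

```python
from abc import ABC, abstractmethod

class StorageDriver(ABC):
    """Hypothetical adapter each storage vendor would implement so a single
    control plane can manage heterogeneous arrays through one open API."""

    @abstractmethod
    def provision(self, size_gb: int, tier: str) -> str:
        """Create a volume and return its ID."""

class VendorADriver(StorageDriver):
    """Stand-in for one vendor's back end; real drivers would call array APIs."""
    def __init__(self):
        self.volumes = {}

    def provision(self, size_gb, tier):
        vol_id = f"vendor-a-{len(self.volumes)}"
        self.volumes[vol_id] = {"size_gb": size_gb, "tier": tier}
        return vol_id

class ControlPlane:
    """Single pane of glass: routes requests to whichever driver is registered."""
    def __init__(self):
        self.drivers = {}

    def register(self, name, driver):
        self.drivers[name] = driver

    def provision(self, backend, size_gb, tier="gold"):
        return self.drivers[backend].provision(size_gb, tier)

plane = ControlPlane()
plane.register("vendor-a", VendorADriver())
vol = plane.provision("vendor-a", size_gb=100)
```

The point of the sketch is the shape of the contract: as long as every vendor implements the same small interface, the control plane stays vendor-neutral.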
This is where Intel—with its long history of driving standards, enabling open APIs, working with and developing communities, and cultivating open source capabilities where they make sense—can help. Intel is very supportive of open approaches that enable simplified management and interoperability to benefit end users. Our goal is to work with the entire storage ecosystem to support the development of open APIs that enable interoperability and simplify storage infrastructure management.
With these thoughts in mind, we support the just-announced Project CoprHD initiative to create an open source version of the EMC ViPR Controller. The project makes the code for the ViPR Controller, including all of the storage automation and control functionality, open for community-driven development. We think moves like this are a step in the right direction.
If you’d like to contribute to the industry’s efforts to drive open approaches to SDS, please connect with us.
Here’s an example: An employee receives an email—apparently from a legitimate source—asking him to update an account password. As requested, he enters his old password and then types in a new one. Unfortunately, that’s all it takes for a hacker to steal $200,000 from his small business’s bank account, much of it unrecoverable. It’s a simple but extremely costly mistake, and it could happen to anyone. In fact, over the past few years, it’s been happening a lot. Did you know that some of the biggest security breaches in recent memory—including attacks on Sony, Target, and JPMorgan Chase—started with a phishing email sent to an employee?
If you own a business, you’re at risk. No matter how diligent you and your employees are about security, mistakes can happen. And the results can be disastrous. Virus protection and other software solutions—though useful and necessary—only get you so far, especially if your business is using PCs that are more than two years old. The problem is that software-only security solutions from even a few years ago can’t keep up with today’s cybercriminals and are not sufficient to protect your devices and vital business data.
So what can you do to stay safe? Don’t rely on software alone. You need your hardware to do the heavy lifting.
What You Can Do to Make Your Business More Secure
New desktops with at least 4th generation Intel Core processors have hardware-enhanced security features that allow hardware and software to work together, protecting your business from malware and securing the important, private data and content you create and share. Features such as Intel® Identity Protection Technology (Intel® IPT), Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), and others are crucial to making your business more secure.
With hackers working around the clock to identify the next potential victim, it’s more important than ever for you to prioritize security. Read the new Intel white paper to learn more about what’s at risk, five new hardware-enhanced security features that help combat cybercrime, and why replacing your pre-2013 PC is a smart move.
In the meantime, join the conversation using #IntelDesktop, and get ready to rediscover the desktop.
This is the fifth installment of the Desktop World Tech Innovation Series.
To view more posts within the series, click here: Desktop World Series
“I looked back to the beginning of the technology I considered fundamental—the planar transistor—and noticed that the [number of components] had about doubled every year. And I just did a wild extrapolation saying it’s going to continue to double every …”
The post Moore’s Law: exponential opportunity for education and empowerment appeared first on CSR@Intel.
Not that long ago, data analysis focused mainly on looking backward to understand things that happened in the past. Today, thanks to Moore’s Law and the resultant advances in computing and memory technologies, analytics can now tell us what is happening in real time and help us predict what will happen in the days to come.
This landmark shift in our ability to extract value out of data is a key enabler for the new digital service economy. In this new era, an organization’s competitive edge increasingly hinges on its ability to turn an avalanche of data into actionable insights that improve operations and guide the creation of essential new products and services.
This isn’t an opportunity that is limited to Web 2.0 businesses or high-tech powerhouses. The opportunity for pervasive analytics and insights spans virtually all industries—from healthcare to transportation, from banking to manufacturing.
With powerful analytics solutions, physicians can diagnose illnesses faster and create personalized treatment plans. Retailers can better understand buying behaviors to stock up on the products people are most likely to need. Car manufacturers can use predictive failure analysis to make repairs proactively—before customers find themselves stuck on the side of the road.
While these examples are diverse, they all share a common central focus: the combination of big data and high-powered computing solutions with sophisticated technologies like in-memory analytics that accelerate time to insight. And this is where the latest generation of Intel® Xeon® processors enters the picture.
The new Intel® Xeon® processor E7 v3 family is designed to accelerate real-time analytics on enormous datasets at multi-terabyte and even petabyte scale. With up to 20 percent more cores, threads, cache, and system bandwidth than previous-generation processors, the Intel Xeon processor E7 v3 family makes fast work of complex, high-volume transactions and queries.
In addition, we’ve added an expanded memory footprint to support in-memory analytics—one of the keys to gaining immediate insights from big data. We’ve also added sophisticated technologies like Intel® Advanced Vector Extensions to boost simulation performance, Intel® Transactional Synchronization Extensions (Intel® TSX) to accelerate OLTP performance, and Intel® Run Sure Technology to support mission-critical uptime and advanced data integrity.
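Why in-memory analytics is one of the keys to immediate insight can be illustrated with a deliberately simple sketch: once the working set lives entirely in memory, a query is just a single pass over data structures, with no disk I/O on the query path. The table size and schema below are invented for illustration:

```python
import random
import time

# Synthetic transaction table held entirely in memory (a list of dicts),
# standing in for an in-memory store; schema and row count are illustrative.
random.seed(0)
rows = [{"region": random.choice(["NA", "EMEA", "APAC"]),
         "amount": random.uniform(1, 500)} for _ in range(100_000)]

def revenue_by_region(table):
    """Aggregate in one in-memory pass -- no disk I/O on the query path."""
    totals = {}
    for r in table:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

start = time.perf_counter()
totals = revenue_by_region(rows)
elapsed = time.perf_counter() - start
```

A real in-memory database adds columnar layouts, compression, and vectorized execution on top of this basic idea, but the latency win comes from the same place: the data is already next to the processor when the question arrives.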
Let’s consider a couple of real-life examples of the potential to put powerful analytics and simulation tools to work in conjunction with Intel Xeon processors to help organizations extract value from big data in real time:
- FarmLogs uses analytics tools and high-powered computing solutions to help farmers make their land more productive. It achieves this goal by putting sensors on farm machines, connecting the machines to the Internet, and analyzing data streams in real time. With an instant view of how different areas of their fields are performing, farmers can adjust seed, fertilizer, and other variables to best match field conditions. This helps them avoid waste and improve the productivity of the field.
- Pacific Northwest Seismic Network uses big data analytics to provide the public and others with early warnings about earthquakes and ground motions. The organization is now working to develop the ability to warn people about earthquakes before the shaking has reached them—to give them precious seconds, or maybe even a minute, to protect themselves and those around them.
There is a broader theme to consider here: Data-driven insights can improve the human condition—whether it’s increasing food production, saving lives when earthquakes strike, or meeting some other goal that can be achieved only with real-time insights gleaned from massive amounts of data.
At a business level, data analytics are increasingly tied to competitiveness. In fact, we believe that within three to five years data analytics will become the No. 1 data center application in terms of importance to the business. In virtually every industry, organizations will change their processes to take advantage of analytics to improve operations, products, and personalization.
To enable this transformation, Intel is working to democratize actionable insights by making analytics and simulation tools easy to deploy, easy to use, and efficient to run. And, as always, we are working with a broad ecosystem to provide open solutions and flexible tools to meet diverse customer needs.
For a deeper dive into the capabilities of the Intel® Xeon® processor E7 v3 family, visit intel.com/ITcenter.
For those of us who follow technology, it seems we constantly hear that big data will change life as we know it. While this is an interesting perspective, it misses an important point. Data, by itself, is not a valuable asset for an organization. The industry-transforming power of data comes from insights derived from analytics on that data, all driven by powerful computing systems.
To stay competitive in today’s world, business intelligence solutions are increasingly being woven into critical, real-time business processes across all industries. This is a game changer that can help businesses improve decision making, achieve better results, better connect with customers, and remap the ways they deliver value to their customers. In fact, entire industries are being disrupted by the ability to harness the power of advanced analytics to business advantage.
Despite this promise, customers face many hurdles in fully harnessing the capabilities of advanced analytics. Common struggles include trust, speed, and scale. Can I trust the data and analysis? Do I have the performance for real-time decisions? Can I scale and unify massive datasets? It’s worth noting that just 27 percent of executives say their big data projects are successful, and 65 percent cite determining value as their biggest barrier to adoption.
As we launch the new Intel® Xeon® processor E7 v3 family, we tackle these customer challenges head on – delivering the world’s fastest system for real-time analytics** on a trusted, mission-critical, highly scalable platform. For the most complete information on performance records, see the full list.
Servers based on the new Intel Xeon processor E7 v3 family provide exceptional performance and scalability for real-time analytics operating on core business data. These servers offer the industry’s largest memory per socket (12TB in 8S) to support in-memory data analytics, which improves analytics performance by moving data closer to the processors.
With 20 new world record performance awards, the Intel Xeon processor E7 v3 platform showcases the benefits of the Haswell microarchitecture: 20% more cores, 20% more last-level cache, up to 32 sockets, and unique features like Intel® Transactional Synchronization Extensions (Intel® TSX). Intel TSX helps provide up to 6X online transaction processing (OLTP) performance for optimized database solutions via a mechanism that accelerates multi-threaded workloads by dynamically exposing otherwise hidden parallelism***. The platform also delivers the trusted foundation required for mission-critical applications, including added cryptographic performance via instructions such as Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI) and Intel® Advanced Vector Extensions 2 (Intel® AVX2), and supports Intel® Run Sure Technology for maximum uptime.
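The "hidden parallelism" that TSX exploits in hardware can be pictured in software. TSX speculatively runs critical sections in parallel and only serializes them when they actually touch the same data. This pure-Python sketch emulates the same effect by hand with per-bucket ("striped") locks instead of one global lock; it is a conceptual analogy, not how TSX itself is programmed:

```python
import threading

# One global lock would serialize every update. Striped locks let updates
# to different keys proceed concurrently -- the software analogue of TSX
# letting non-conflicting critical sections run in parallel.
NUM_STRIPES = 8
stripes = [threading.Lock() for _ in range(NUM_STRIPES)]
counters = {}

def add(key, delta):
    lock = stripes[hash(key) % NUM_STRIPES]  # only same-stripe keys contend
    with lock:
        counters[key] = counters.get(key, 0) + delta

def worker(key, n=1000):
    for _ in range(n):
        add(key, 1)

threads = [threading.Thread(target=worker, args=(k,))
           for k in ("alpha", "beta", "gamma", "delta")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

TSX achieves this transparently: the code takes what looks like one coarse lock, and the hardware discovers at run time that most executions never conflict.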
So what kinds of use cases does all of this computing horsepower enable for the enterprise? I’d like to share a couple of interesting examples of big data analytics powered by Intel Xeon processor-based systems.
Nippon Paint is one of Asia’s leading paint and coatings companies, with 57 manufacturing facilities and operations spanning 15 countries and regions. They have deployed SAP HANA to accelerate social media analytics. With sophisticated real-time analytics, the company can capture consumer behaviors and preferences more accurately and quickly. Among other benefits, the insights gained through analytics help Nippon Paint create targeted marketing campaigns, improve customer interactions, and develop products that meet emerging customer requirements.
In another example, a power company used SAS Analytics software to analyze data gathered every 15 minutes from 38,000 smart meters in seven cities to predict the amount of electricity needed at certain times. This forecasting is important because electricity cannot be stored—it needs to be produced at the correct levels when it is needed. With the ability to frequently analyze smart meter readings, the utility improved its forecast accuracy by 9 percent, for a savings of $9 million.
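The scale of the smart-meter scenario is easy to quantify from the figures in the text: 38,000 meters each reporting every 15 minutes. The per-reading size below is an assumption added for illustration, not a figure from the utility:

```python
# Back-of-the-envelope volume for the smart-meter scenario described above:
# 38,000 meters, one reading per meter every 15 minutes.
meters = 38_000
readings_per_hour = 60 // 15              # 4 readings per meter per hour
readings_per_day = meters * readings_per_hour * 24

# Assume ~100 bytes per reading (timestamp, meter ID, kWh) -- an assumption
# for sizing purposes only.
bytes_per_reading = 100
gb_per_year = readings_per_day * 365 * bytes_per_reading / 1e9
```

Even this modest deployment produces roughly 3.6 million readings per day; the analytics value lies in processing each 15-minute batch before the next one arrives.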
A common thread to all of these stories is the use of the Intel Xeon processor based platform as the foundation for better business intelligence. As your data and analytics workloads continue to grow, these powerful servers can help you keep pace with growth while turning big data into big opportunities for your business.
For a closer look at the new Intel® Xeon® processor E7 v3 product families, visit intel.com/xeonE7.
Intel, the Intel logo, and Xeon are trademarks of Intel Corporation in the United States and other countries.
* Other names and brands may be claimed as the property of others.
**World’s fastest system for analytics claim based on servers using Intel® Xeon® processor E7-8890 v3 being the performance leader of SAP BW-EML* @ 1B scalable records and 2B records as of 5 May 2015. For the most complete information, see http://www.intel.com/E7v3records.
***Up to 6x business processing application performance improvement claim based on SAP* OLTP internal in-memory workload measuring transactions per minute (tpm) on SuSE* LINUX Enterprise Server 11 SP3. For configuration and result details, see http://www.intel.com/E7v3records.
Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors.
Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products. For more complete information visit http://www.intel.com/performance.
By Todd Brannon, Director of Product Marketing, Unified Computing
At the upper end of our UCS server portfolio we feature systems that deliver the large memory footprints and core-counts that performance-intensive workloads demand. Today’s announcement of the Intel® Xeon® processor E7 v3 Family brings powerful new capabilities to this class of system.
Why is this important?
Our customers are striving to become intelligence-driven throughout their operations in order to create a perpetual and renewable competitive edge. Taking a long-term view in choosing the right infrastructure is essential; here are two reasons why:
- You never hear about a big data environment getting smaller. Massive increases in data volume mean these environments will inevitably grow, and for many, this will mean continuously expanding clusters of hundreds or thousands of servers.
- Data is the lifeblood of the digital enterprise. As the use of big data becomes pervasive and critical to day-to-day decision-making, the performance and predictability of these computing platforms will become increasingly paramount to the success of the business. Choose partners you can trust.
IT departments need an infrastructure that is designed for deployment and operation at scale. Down at the server level, particularly for scale-up workloads, they will always need more horsepower, and that horsepower can’t come with additional power and cooling load. Cisco UCS and the new Intel Xeon processor E7 v3 family deliver on both of these vectors.
Traditional servers, essentially designed as stand-alone devices, aren’t built with the needs of these new big data environments in mind. This is where Cisco UCS and our Integrated Infrastructure for big data come in. We’re bringing customers a platform optimized for long-term success because of its unified design, inherent scalability, advanced reliability, and robust security. By abstracting the identity and configuration of individual servers and managing exclusively through policy constructs, UCS allows IT teams to manage up to 10,000 servers as a single pool of resources. Many in the industry are still focused on the issues of density and power, which most customers consider basic computing table stakes today. UCS is designed to optimize the most important resource in the data center: people’s time.
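The policy-construct idea can be sketched in a few lines: server identity and configuration live in a template, and physical servers are stateless slots the template is bound to. This is a hypothetical illustration of the pattern, not Cisco's actual service-profile API; all names below are invented:

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    """Template holding identity/config that would otherwise be baked
    into each individual server (illustrative fields only)."""
    name: str
    vlan: int
    boot_order: tuple = ("san", "lan")

@dataclass
class Server:
    """A stateless physical slot; it has no identity until a profile binds."""
    slot: int
    profile: ServiceProfile = None

def apply_profile(pool, profile, count):
    """Bind one template to many servers in a single policy operation."""
    free = [s for s in pool if s.profile is None][:count]
    for s in free:
        s.profile = profile
    return free

pool = [Server(slot=i) for i in range(100)]
web = ServiceProfile(name="web-tier", vlan=10)
bound = apply_profile(pool, web, count=20)
```

The operational win is that scaling out or replacing a failed node becomes "bind the profile to another slot" rather than reconfiguring a machine by hand, which is what lets one team manage thousands of servers.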
Intel and Cisco have a special partnership because UCS gives customers the most operationally effective platform to harness the performance and efficiency Xeon delivers. The new class of systems we’re releasing today can scale to 72 processors and a 6TB memory footprint, ideal for the latest generation of in-memory database workloads as well as traditional ERP applications and databases. Cisco was the first server vendor to publish results on the new TPCx-HS benchmark, and it’s our intention to continue to lead the industry in performance for these workloads.
Disruption and opportunity will continue to accelerate in the realm of analytics. Cisco and Intel are partnering to build the platforms customers need to build for the long haul. To stay up to date, please follow us on Twitter at #CiscoUCS and #CiscoDC.
By Nick Winkworth, Product Marketing – Compute Platform, Hitachi Data Systems
If, like me, you have lived in the “technology bubble” for any length of time, you have probably become accustomed to, perhaps even blasé about, the incredibly rapid – and ever accelerating – pace of change in our industry.
Today, everything we touch seems to be generating data, from the heart rate sensor on your smart watch to the thermostat on your wall to the jet engine on the airplane that takes you on your next vacation or business trip – to say nothing of all the videos, blog posts, tweets and emails we all generate every day.
The big question that companies like Hitachi are now starting to ask is “how can all this data improve our lives and make this planet a better place to live?” But before we can even begin to answer that question, we must build the infrastructure to capture, store and process that data…
And that’s no easy task.
This is not something that can be done from a standing start. The companies that will succeed started out on this long road many years ago. They have learned from each new generation of technology; they have added, changed, tweaked and improved along the way to get to where they are now – which is just the starting place for the next step.
Sure, there’s the occasional “giant leap”, but if you look closer you will inevitably see a longer story of incremental change, prior inventions and ideas that provided a foundation.
Hitachi HIPAC MK 1 (1957)
Hitachi joined the information age in the late 1950s with one of the earliest electronic computers, the HIPAC-MK1. Since then, the company has learned how to meet the needs of an incredibly diverse range of customers around the globe in many industries, how to consistently deliver high quality, and how to scale to the highest capacities demanded by some of the world’s biggest organizations.
Today, our enterprise storage products, servers and converged infrastructure solutions are built on that strong foundation, and today a key part of that foundation is Intel’s processor technology.
With the introduction of the new Intel Xeon E7 v3 Processor Family this week, we take another step forward together, propelled not by a single breakthrough, but by improvements built upon many years of hard work and customer feedback.
EVOLUTION OF HITACHI INTEL BASED BLADE SERVERS
The new Intel Xeon E7 v3 Processor Family allows Hitachi to carry forward support for the unique innovations that have been developed and advanced over many years, such as native Logical Partitions (LPARs) and the ability to scale a blade’s processor, memory, and IO capacity incrementally by combining blades.
Without this legacy of continuous innovation it would be impossible to keep up with the exponentially increasing demand for capacity and performance for which today’s market is just the tip of the iceberg.
New SAP Solutions deliver over 40% more performance at 30% lower price
This ever-increasing demand does not come with an ever-increasing budget, of course, and this week’s introduction of the Intel Xeon E7 v3 Processor Family plays a critical role in Hitachi Data Systems’ ability to deliver big data analytics solutions for customers such as SPAR Austria Group at a price point that stays within budget, even as needs grow. Like many of our customers, SPAR’s business depends on rapid analysis of an increasing volume of unstructured data, which for the past year has been running on Hitachi Compute Blade 2000 systems with Intel Xeon E7 processors and Hitachi VSP storage platforms.
Together with Intel’s announcement, we are introducing the next generation in that lineage: SAP HANA scale-up solutions based on our new CB520X B2 blades (powered by Intel’s new Intel Xeon E7v3 Processor Family), along with our new CB2500 blade chassis and the next generation of (Intel powered) Virtual Storage Platform (VSP) technology. Lab testing has shown that the new solutions deliver 46% more performance (measured data loading speed) at 30% lower price, compared to the previous generation of UCP solutions for SAP HANA.
Why is this so important? Social Innovation
One thing that sets Hitachi and Hitachi Data Systems apart is our vision. Hitachi’s goal is to deliver solutions and services to make societies safer, smarter and healthier. This is what we call Social Innovation.
As announced last week at Connect 2015, Hitachi Data Systems is bringing to market a number of purpose-built solutions to address a variety of Social Innovation challenges across many industries. These solutions combine the power of connected devices and technologies – or the Internet of Things (IoT) – with operational technology (OT), machine-to-machine (M2M) and advanced data analytics, and best-of-breed IT infrastructure, all in a unified, fully integrated stack.
Underlying all these solutions are analytics platforms built on best of breed infrastructure including Hitachi storage, servers, networking, content platforms, and converged solutions – all built on a foundation of Intel technology.
Today’s Intel Xeon E7 v3 Processor Family announcement strengthens Hitachi’s infrastructure portfolio and maintains the momentum of innovation, from both Hitachi and Intel, that is required to ensure that the demands of big data and Social Innovation solutions can be met today and in the future.
The Power Behind Social Innovation, Software Defined Architecture and the Internet of Things with Hitachi and Intel
By Hu Yoshida, CTO, Hitachi Data Systems
Last week in Las Vegas, Hitachi and Hitachi Data Systems held our Connect 2015 event where we announced new offerings around Social Innovation, Software Defined Infrastructure and the Internet-of-Things. This week we are participating at Sapphire in Orlando where we are announcing new solutions to support real time analysis of big data with SAP HANA.
One common thread that ties together all these new Hitachi solutions is the power of Intel technology.
As Hitachi transforms from a provider of industry-leading data storage technologies to a global provider of solutions that address some of the most challenging problems on the planet, we are delighted to work closely with Intel to keep up with our customers’ requirements to process the exponentially accelerating volumes of data that will come from machine data, networks, videos, and patient health records, to name just a few sources.
New SAP Solutions deliver over 40% more performance at 30% lower price
This week’s introduction of the Xeon E7v3 family of processors is another step in a long legacy of innovation which has enabled Hitachi Data Systems to solve big data analytics problems for customers such as SPAR Austria Group whose business depends on rapid analysis of very large unstructured data running on Hitachi Compute Blade 2000 systems and VSP storage platforms.
Together with Intel’s announcement, we are introducing the next generation in that lineage: SAP HANA scale-up solutions based on our new CB520X B2 blades (powered by Intel’s new Xeon E7v3 processors), along with our new CB2500 blade chassis and the next generation of (Intel powered) Virtual Storage Platform (VSP) technology. Lab testing has shown that the new solutions deliver 46% more performance (measured data loading speed) at 30% lower price, compared to the previous generation of UCP solutions for SAP HANA.
Hitachi Compute Blade 2500 with Xeon E7v3 powered CB520X B2 Blades
The new Intel Xeon processor – along with the rest of Intel’s product portfolio – is critical to Hitachi Data Systems’ ability to create the Social Innovation solutions to address the next generation of problems that face our companies, cities and global infrastructure.
The goal of Social Innovation is to deliver new solutions and services to make societies safer, smarter and healthier. Hitachi Data Systems is developing purpose-built solutions for a variety of markets that combine the power of connected devices and technologies – or the Internet of Things (IoT) – with operational technology (OT), machine-to-machine (M2M) and advanced data analytics, and best-of-breed IT infrastructure, all in a unified, fully integrated stack.
New Solution and Services for Social Innovation
The following are some of the solutions that Hitachi Data Systems announced last week that are ready for market:
Hitachi Visualization for Public Safety
This solution provides situational awareness for law enforcement professionals by integrating multiple types of data from cameras, sensors, emergency dispatch, and social media.
Hitachi Live Insight for Telecom
Hitachi Live Insight for Telecom offers enhanced network analytics specifically designed to help communication service providers and their customers enhance network services using real-time insight.
Hitachi Live Insight for IT Operations
This cloud-based M2M analytics solution helps customers achieve optimal performance and availability from their IT infrastructure and gain operational intelligence at the lowest total cost of ownership (TCO).
Hitachi Clinical Repository for connected health
HCR for connected health empowers healthcare professionals with sophisticated data analytics tools and proven delivery methods to optimize patient care. It provides a multipurpose data repository where all clinical and nonclinical data can be stored, backed up, preserved and retrieved on a single, integrated platform.
Hitachi Live Insight Center of Excellence
Hitachi Live Insight Center of Excellence helps organizations confidently and swiftly test, customize and deploy advanced data analytics solutions, applications, platforms, and integrated solutions to support new business initiatives using a single point of coordination across various Hitachi and third-party resources.
You can find more information about Social Innovation here.
Software Defined Infrastructure
Underlying all these analytics platforms is best-of-breed software-defined infrastructure. In addition to our expanded portfolio of software-defined storage, content platforms, and converged solutions, we have broadened our midrange storage line, extended the range of our converged solutions, and released a new hyper-converged solution for data lake and scale-out applications.
Intel’s Xeon E7 v3 announcement enhances our infrastructure story for servers and carries forward Hitachi’s unique innovations in server technology, such as x86 LPARs, automated blade failover, and incremental scaling from 2- to 8-socket systems. Together with our leading storage products, these features allow our solutions to deliver the very large memory and IO capabilities that big data analytics and Social Innovation solutions demand.
Intel is a key contributor to all of our announcements in Social Innovation and Software Defined Infrastructure. Intel’s innovation in processor technology, combined with Hitachi Data Systems’ innovations in storage, servers and converged infrastructure design for scale-up and scale-out – together with real time analytics – will enable us to deliver on our goal for Social Innovation, to make societies that are smarter, safer, and healthier.
By David Suh, Executive Director, Lenovo EBG Solutions & Ecosystem
New sensors, new devices, the Internet of Things. Social media creating new channels of communication. Zettabytes of digital content created every year. We’re talking about big data. Companies in all industries are taking advantage of big data to improve their business results. How do you turn big data into big-time results for your business? Through analytics. Analytics workloads are becoming increasingly pervasive among mainstream corporations, and the level of investment confirms a growing awareness of their benefits in delivering business results. According to IDC, the worldwide business analytics market was $37.7 billion in 2013 and is expected to grow to $59.2 billion by 2018.
Traditional analytics often involves a batch process based on data from last week, last month or even last quarter. However, with increasing competitive demands, companies are looking for ways to make better decisions based on the latest data. The opportunity delivered by the ever-increasing performance of modern processors means data can be filtered, analyzed and turned into business insight in a matter of moments. For example, if your company is running a marketing campaign, wouldn’t it be useful to know in real time how customers are responding and adjust the campaign to maximize results?
With real-time analytics, reports are updated continuously, allowing companies to stop guessing. Experts develop an appropriate starting point and focus on what to measure. Systems are then adapted, mixing expert intuition with the reality of the situation as shown by real data from real customer behavior. The ability to adapt in minutes — sometimes seconds — can translate directly into business success.
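As a miniature sketch of what “updated continuously” means in practice, the class below maintains a running response rate for a hypothetical marketing campaign, updating it per incoming event instead of recomputing a batch report. The class name and simulated event stream are invented for illustration:

```java
// Minimal sketch of a continuously updated metric: each incoming event
// updates the running totals, so the "report" is always current rather
// than recomputed in a nightly batch job.
public class CampaignMonitor {
    private long impressions = 0;
    private long responses = 0;

    // Called once for every event as it streams in.
    public void record(boolean responded) {
        impressions++;
        if (responded) responses++;
    }

    // Current response rate, available at any moment.
    public double responseRate() {
        return impressions == 0 ? 0.0 : (double) responses / impressions;
    }

    public static void main(String[] args) {
        CampaignMonitor monitor = new CampaignMonitor();
        // Simulated event stream: true = customer responded.
        boolean[] events = {true, false, false, true, false};
        for (boolean event : events) {
            monitor.record(event);
        }
        System.out.println("Response rate: " + monitor.responseRate());
    }
}
```

A real system would feed `record` from a message queue or stream processor, but the principle is the same: the aggregate is always up to date the moment the last event arrives.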
Success is driven by increased relevance of information to users. Mass personalization of web sites allows advertising to be targeted based on user behavior. Response rates improve. Fraud detection improves dramatically as new sources of information are leveraged in real time. Losses decrease. Healthcare facilities combine data from multiple sources and support the rapid decisions necessary in critical situations. Patient outcomes improve.
The ability to handle streaming data from multiple sources and deliver key insights in real time places unique pressures on the design of a computing system. The best insights are developed when larger data sets can be analyzed. Delivering sub-second response time requires intense compute power and low latency. Ingesting large volumes of data from multiple sources requires high scalability. Perhaps most important, systems critical to delivering key business insights must provide leadership availability.
Real-Time Analytics Tools
The tools to handle this onslaught of data rely on in-memory computing technology, holding the data entirely within main memory, rather than on solid-state or traditional hard drives. To deliver on the promise of real-time analytics, a computing system has to be designed to meet those requirements. System x X6 8-socket servers from Lenovo can support up to 12TB of memory and have self-healing RAS features that ensure extremely high availability. In a recent ITIC 2014-2015 Survey, System x was top-ranked in x86 server reliability.
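To illustrate the in-memory idea in miniature, independent of any particular product, the sketch below holds the working set entirely in a heap data structure, so a query is a memory lookup rather than a disk round trip. The sales records and region names are invented for the example:

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of in-memory analytics: the data set lives entirely
// in main memory (here, a HashMap) and is aggregated as it is ingested,
// so a query touches RAM only, with no disk I/O on the query path.
public class InMemoryQuery {
    public static void main(String[] args) {
        // region -> total revenue, held in RAM
        Map<String, Double> revenueByRegion = new HashMap<>();

        // Ingest: a stream of (region, amount) records, aggregated on arrival.
        String[][] records = {
            {"east", "120.0"}, {"west", "75.5"}, {"east", "30.0"}
        };
        for (String[] r : records) {
            revenueByRegion.merge(r[0], Double.parseDouble(r[1]), Double::sum);
        }

        // The query is answered directly from memory.
        System.out.println("east=" + revenueByRegion.get("east"));
    }
}
```

In-memory platforms apply the same principle at terabyte scale, which is why large memory capacities such as the 12TB cited above matter: the bigger the data set that fits in RAM, the more of the analysis avoids storage latency entirely.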
The System x X6 family features the Intel® Xeon® E7 4800/8800 v3 family of processors and delivers up to 56 percent more compute performance than the previous generation. The latest X6 servers support both DDR3 and DDR4 memory and accommodate a variety of flash technologies.
Lenovo also has a large ISV ecosystem spanning thousands of applications. Lenovo combines its hardware infrastructure with ISV software to form analytics solutions that take the guesswork and lengthy time frames out of solution deployment. Clients recognize this benefit, as reinforced in the most recent customer satisfaction survey from Technology Business Research (TBR). TBR made note of the end-to-end solution capabilities of Lenovo System x servers and ranked System x first in customer satisfaction.
For more information on System x X6 servers and real-time analytics solutions, visit Lenovo.com.
By Ashley Gorakhpurwalla, VP and GM, Dell Servers Solution
At the pace of today’s business transformation, workloads continue to evolve – datasets get bigger, stakeholders want more information and faster time to insights. But is the infrastructure evolving with the workloads? You’d be surprised at the number of enterprises that would answer “no” to that question due to cost, system uptime, and other challenges.
Enter the Dell PowerEdge R930, our most powerful server specifically designed for the most demanding enterprise applications such as in-memory databases, enterprise resource planning (ERP), and online transaction processing (OLTP). The R930, powered by Intel’s new Xeon E7 v3 family of processors, can flexibly scale to optimize transactions and operations while reducing latency, allowing customers to maximize application performance, accelerate workloads, protect mission-critical and data-intensive applications and reduce deployment time by 10x.
This ability to do what was once reserved for proprietary RISC servers and mainframes would never be possible without the continuing evolution of the microprocessor as well as Dell and Intel’s commitment to gather feedback from customers, who inspire us to design solutions that solve real-world business problems at the frontlines of IT.
At the heart of this shift in the boundary of what’s possible in computing is a fun fact: since the PowerEdge portfolio was introduced some 20 years ago, the x86 server market has grown more than 600 percent, while the UNIX market has been on a constant decline – shrinking 70 percent between 2000 and 2013(1). Even so, the RISC and mainframe markets present a $9.1 billion (USD) addressable market in 2015(2). With the PowerEdge R930 powered by Intel, these RISC customers can migrate from UNIX to Linux with ease and move to a more innovative, future-ready data center and reduce business risk (pun completely intended).
Along with Intel, Dell has been seeing more and more customers make that switch in order to take advantage of new technologies such as in-memory databases. Ameco Beijing, an aircraft maintenance, repair and operations leader, describes that by using these four-socket rack servers they’ve been able to reduce total cost of ownership by nearly 50 percent, achieve 99.99 percent system availability and improve SAP ERP performance by 3.5 times.
As Dell, Intel and their customers continue to push the boundaries on the next frontier of computing, we invite you to take a closer look at the PowerEdge R930 or share any feedback about Dell Servers by following us on Twitter.
(1) According to IDC Server Tracker
(2) According to IDC’s WW Server Workloads 2014 Model
There are many situations where we want to use multiple programming environments to develop our IoT applications. For example:
We prefer to do image processing using OpenCV in C++ rather than doing it… Read more
There are multiple ways to write code for the Intel Edison (Arduino IDE, vi, emacs, SFTP, Intel XDK, etc.), here’s how you can do it from your browser!
Codebox is an open source web based IDE that… Read more
The First Rule of Programming Club is “NEVER use a variable before it has been initialized.” Case in point:
From time to time I’m asked, as part of my job, to optimize/parallelize some code. I had… Read more
Recently I participated in a panel of corporate “intrapreneurs” and was asked to comment on how we grow people to be good innovators at Intel. I tried to contribute a perspective that was different from the other panelists, who did … Read more >
Ever wish ordering a pizza was as simple as clicking a button? A team from Artefact here in Seattle brought the vision to life in the form of PizzaTime: PizzaTime is a fun(ctional), tongue-in-cheek prototype of a smart clock powered … Read more >
A powerful trend sweeping through today’s data centers can be summed up in a single word: containers. Container-based virtualization allows you to package applications and the required software libraries to create units of computing that are scalable and portable. Today, Nick Weaver shared Intel’s vision for containers and the mainstream democratization of sophisticated cloud computing at CoreOS Fest in San Francisco. If the level of engagement at the conference is any indication, I expect the industry to join us on the path toward broad-scale deployments of advanced, easy-to-consume, hyperscale technologies.
The container concept is a central feature of hyperscale cloud technology. Hyperscale technologies promise to make software developers more productive, data center infrastructure more efficient, and IT resources easier to deploy and consume. It is not hyperbole to say that broad proliferation of hyperscale cloud technology could have an impact on data centers that is similar to the impact that virtualization delivered years ago.
While it was once mainly in the domain of highly advanced data centers, the orchestrated container concept is now emerging as a viable open-source solution for mainstream data centers that want to operate with greater efficiency, flexibility, and performance. It will enable organizations to deploy highly efficient cloud technology that is on par with that used by the most sophisticated cloud service providers.
At Intel, we fully support the addition of container technologies to the mainstream data center, and are actively participating in the ecosystem to bring containers to data centers of all levels of sophistication. With this goal in mind, Intel is collaborating with CoreOS and the Kubernetes community to advance CoreOS’s Tectonic stack.
Tectonic is a commercial distribution of the combined CoreOS portfolio and Kubernetes. Kubernetes is a Google-led open source project for application scheduling. This combination makes Tectonic a unique offering that provides container software and scheduling in an integrated package. The Tectonic suite delivers a complete solution for businesses transitioning to a distributed, container-based software infrastructure across private, public, and hybrid clouds.
As part of this newly announced collaboration, we’re working with CoreOS and associated communities to make Tectonic more scalable. To help Tectonic reach customers as quickly as possible, we are enabling the development of easy-to-order, easy-to-consume appliances. We expect that this work will lead to ready-to-ship, hyperscale cloud systems that coincide with the future GA release of the Tectonic suite. We were excited to have two ecosystem partners, Supermicro and Redapt, signal their intent to work with CoreOS and Intel to bring Tectonic to market quickly. We were also pleased to collaborate with Supermicro to demonstrate this combination running on their hardware at CoreOS Fest today.
I caught up with CoreOS CEO Alex Polvi at the conference, and here’s what he had to say about this early demonstration of Tectonic in action: “Today Intel gave us a glimpse of the future. This is the beginning of a deep partnership to enable businesses to take advantage of containers, distributed systems, and the next generation of infrastructure.”
This collaborative effort builds on Google’s substantial investments in CoreOS and Kubernetes. In April, CoreOS announced that Google Ventures had invested $12 million in CoreOS and its efforts to bring Kubernetes to the enterprise.
Using features of the Intel platform, the Intel SDI-X team has innovated with CoreOS to overcome cluster scalability limits, while raising the bar on performance.
At Intel, we are excited to be a part of the broad ecosystem that is working to bring containers to the masses. This is a key component of our desire to see private and hybrid cloud computing grow significantly over the next few years.
For more information on the innovation Nick and his team are delivering, take a look at http://nickapedia.com/.
This week I had the honor of being an invited guest at the Hitachi innovation forum. A panel of executives opened the conference with a theme of innovation, inclusivity, and sustainability. The Chairman and CEO, Hiroaki Nakanishi, showcased the sizeable breadth of global products and industries of this massive conglomerate and emphasized how their solutions contribute meaningfully to the lives of people every day. After highlighting their strategy to push further and do more to improve mobility, energy, water, and healthcare for people around the world, he dared the attendees, including customers, industry experts and business partners, to ask questions and give their thoughts.
It was a quiet and reserved audience. Not every day does the leader of a $93B company ask for unscripted questions in a public venue in front of hundreds of attendees and media.
For those who know me or read my blogs, you are likely aware that I am neither shy nor inclined to simply play nice at marketing events. I am very passionate about cybersecurity and live for moments when we as an industry can constructively discuss security, the impact it holds on our world, and the role technology companies have in protecting our tech-rich future.
So I broke the silence, stood up, and asked the Hitachi executive panel a straightforward no-nonsense question:
Hitachi solutions are connecting and enriching the lives of people around the world. Your vision is truly inspiring. But with such power and innovation comes new risks which put in jeopardy the security, privacy, and safety of your devices, solutions, and customers. What is Hitachi’s strategy and commitment to secure and protect products and people in the future?
Before I get to their response, let me say I truly enjoy sitting down with business leaders from around the world and discussing cybersecurity issues. In almost all cases, executives have the best of intentions but are well groomed by their marketing and legal teams to recite vetted, scripted statements about how they operate securely, put customers’ privacy first, and work hard to protect against the nebulous ‘hackers’ of the world. This is the blasé norm and the expected, underwhelming safe response. I don’t expect anything different and I am rarely impressed.
But standing in front of this panel, I was surprised.
Mr. Nakanishi paused, took a deep breath, and gave one of the best answers I have heard from any industry-leading executive. He first commented that it is an important question, and a facet his company understands is becoming more significant. With a genuine tone of concern and deliberate focus, he then answered that they are striving to achieve harmony. His viewpoint was that Hitachi incorporates security, privacy, and safety in balance with customers’ expectations of value, usability, and functionality. He went on to discuss how these expectations change over time. He committed that his company will strive for continuous incremental progress to remain ‘harmonized’ with market needs. The Chief Executive for the Americas, Jack Domme, followed up with a supporting position: seeking the right balance of security to enable solutions in ways that protect but do not undermine their value.
Wow. Harmony and balance. This is the most pragmatic position for cybersecurity. Whether we as customers consciously know it or not, this is what the market truly desires. We must look past the peaks and valleys of security theatre, to see we all want technology to make our lives better and there is an optimal level of security which must align to ‘protect to enable’ in order to make it a reality.
I am always wary of the slick sales and marketing answers, so common nowadays, that paint unrealistic positions and make overinflated commitments to artificially smooth over public concerns. None of that nonsense was present.
Well done Hitachi. Thank you for recognizing the challenges, describing a realistic strategy, and talking in a direct and honest way. Nobody has security figured out or solved all of the problems, but you have successfully recognized the right factors to tune for the very best results now and in the future. You have earned this cybersecurity strategist’s respect.
IT Peer Network: My Previous Posts
May 5, 2015 9 AM – 10 AM PACIFIC TIME
Senior VP & GM
Intel Data Center Group
Join us for this live, interactive… Read more
President Obama recently unveiled the Precision Medicine Initiative — a bold new enterprise to revolutionize medicine and generate the scientific evidence needed to move the concept of precision medicine into everyday clinical practice. The million-dollar question, or multi-million-dollar question, is: how do we make this mainstream?
The emerging platform will be an amalgamation of data from payers, clinics, EHRs, images, laboratories, contract research organizations, and pharma, plus an analytics tool to make sense of all this data. Then, to accelerate innovation and foster collaboration, we need tools that make all this valuable data we have amassed available to clinicians, researchers and bioinformatics specialists so they can practice their art.
Partnering with the Multiple Myeloma Research Foundation (MMRF), GenoSpace is leveraging Intel® AES-NI technology to deliver high-performance encryption that ensures the security and privacy of patient data, along with the analytics MMRF requires to further its mission of accelerating the pace of treating and curing multiple myeloma and changing the paradigm of how all cancer research is conducted.
The GenoSpace architecture is hosted on Amazon Web Services (AWS), which provides flexibility and scalability for its developers and customers. To ensure the utmost security for this public cloud implementation, GenoSpace takes a ground-up approach to encryption. Its solutions gather all of the data that will be subject to analysis and layer encryption on top of it, safeguarding the confidentiality of sensitive healthcare data stored on AWS or traveling over the Internet. This adds an important extra measure of protection to AWS’s built-in security features.
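As a rough illustration of this layered approach — not GenoSpace’s actual implementation — the sketch below uses the standard javax.crypto API to encrypt data before it would be handed to any external store, then verifies the round trip on retrieval. The record contents and class name are invented for the example:

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

// Application-layer ("layered") encryption: the bytes are encrypted
// before they ever reach cloud storage, so confidentiality does not
// depend solely on the storage provider's own controls.
public class LayeredEncryption {
    public static void main(String[] args) throws Exception {
        byte[] patientRecord =
            "sample-genomic-record".getBytes(StandardCharsets.UTF_8);

        // Generate a 128-bit AES key and a random IV.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);

        // Encrypt before handing the bytes to any external store.
        Cipher enc = Cipher.getInstance("AES/CBC/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ciphertext = enc.doFinal(patientRecord);

        // Decrypt on retrieval; only the key holder can read the data.
        Cipher dec = Cipher.getInstance("AES/CBC/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] roundTrip = dec.doFinal(ciphertext);

        System.out.println("round-trip ok: "
            + Arrays.equals(patientRecord, roundTrip));
    }
}
```

Because the key never leaves the application’s control, data at rest in the cloud and data in transit are both opaque to anyone without it — the “extra measure of protection” described above.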
Recently, GenoSpace evaluated the benefits of Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI), a silicon-based instruction set that accelerates encryption on Intel® Xeon® processors, which GenoSpace uses to process data. Meeting its customers’ performance and usability demands was a key objective for GenoSpace, given the amount of encryption and decryption that occurs when its software is used for analytics. To determine how the query response time of its population analytics application would be affected by encryption and by the hardware encryption acceleration that Intel® AES-NI provides, GenoSpace ran a series of tests focused on measuring the performance of encrypting and decrypting stored data.
The key findings of this test revealed that Intel® AES-NI-enhanced encryption had a markedly positive influence on the performance of the GenoSpace Population Analytics application.
- Provider library choice significantly impacts results. The choice of encryption provider library and AES mode had the largest impact on performance. While Bouncy Castle showed no appreciable improvement with Intel® AES-NI, the NSS library with Intel® AES-NI enabled performed more than 78 percent faster than Bouncy Castle, making it the obvious choice for encryption. For decryption, NSS was approximately 96 percent faster than Bouncy Castle and 90 percent faster than SunJCE. With respect to AES modes, ECB, the simplest algorithm, outperformed the other modes. However, because ECB is less secure than the other modes, and given the sensitivity of healthcare data, it is generally not appropriate for healthcare applications. For the best combination of performance and security, the test results implied that CBC with the NSS provider library should be used, as it had the shortest routine time.
- Intel® AES-NI significantly decreases the impact of increasing key length. Typically, increasing the length of the AES encryption key (which functions much like a password) to strengthen security also increases encryption/decryption time. As key length increases, one expects a near-linear increase in encrypt/decrypt times. But the study showed that by using NSS with Intel® AES-NI, the impact of doubling the key length was reduced twenty-fold.
- The benefits of Intel® AES-NI increase with the size of data sets. In Phase 2 of the study, where sample genomic data was used, GenoSpace found that enabling Intel® AES-NI improves request times by nearly 9 percent. In fact, as the size of the data sets scales up, there are even greater performance gains — an almost 14 percent improvement.
- With Intel® AES-NI, encryption had less impact on the application’s overall performance. GenoSpace concluded that with Intel® AES-NI, encryption can scale more efficiently than other operations, such as data serialization, sorting, and filtering.
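The key-length finding can be explored with a simple measurement like the one below. This is a hedged sketch using the JDK’s default provider rather than the NSS or Bouncy Castle libraries from the study, and the buffer size and iteration count are arbitrary choices; absolute numbers will vary by machine and JVM, but on processors where the JVM uses AES-NI the 256-bit slowdown is typically far smaller than the naive near-linear estimate:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Time AES/CBC encryption at two key lengths with the default provider,
// and report the relative slowdown from doubling the key length.
public class KeyLengthTiming {
    static long timeEncrypt(int keyBits, byte[] data) throws Exception {
        byte[] keyBytes = new byte[keyBits / 8];   // fixed test key (zeros)
        byte[] iv = new byte[16];                  // fixed test IV (zeros)
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE,
               new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(iv));
        long start = System.nanoTime();
        for (int i = 0; i < 200; i++) {
            c.doFinal(data);
        }
        return System.nanoTime() - start;
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[64 * 1024];
        timeEncrypt(128, data);                    // warm up the JIT first
        long t128 = timeEncrypt(128, data);
        long t256 = timeEncrypt(256, data);
        System.out.printf("256-bit vs 128-bit slowdown: %.2fx%n",
                          (double) t256 / t128);
    }
}
```

Fixed all-zero keys and IVs are used only to make the timing reproducible; they would be unacceptable in real code.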
Intel® AES-NI-enhanced encryption significantly enhances the performance and usability of the GenoSpace Population Analytics offering, which, in turn, results in increased user productivity and satisfaction with the overall solution. Enabling high-performance and secure solutions paves the way for healthcare organizations to embrace the use of genetic population analytics to significantly increase the effectiveness of research, healthcare, and disease treatment options.
While healthcare workers and researchers put these tools to work, they can be confident that Intel® AES-NI accelerated and hardened encryption can help mitigate serious security breaches.
What questions about encryption do you have?