
RECENT BLOG POSTS

Technology-enabled business transformation


 

 

By Mike Simons | Computerworld UK

 

Intel’s CeBIT presence highlighted everything from workplace transformation to wearables via the re-engineered data centre. Here is a brief overview of how the processor giant is innovating in these vital areas.

 

Workplace transformation

 

Wireless technologies caused a buzz at CeBIT this year, with WiGig (multi-gigabit-speed wireless communications) and Intel’s Pro WiDi technology laying the foundation for Workplace Transformation. (Workplace Transformation refers to changes in the workplace that will lead to higher employee productivity through things like wire-free working, mobility and collaboration.)

 

WiGig will enable workers to share large volumes of data quickly, videoconference in high definition, and stream TV-quality media, among other things. Ten times faster than Wi-Fi, WiGig could become fast enough to let users transfer the contents of a 25GB Blu-ray disc in less than a minute.
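As a rough sanity check on that claim (assuming a decimal 25 GB and a 60-second transfer window, which are not figures quoted beyond the sentence above), the arithmetic works out to a few gigabits per second, within WiGig’s multi-gigabit range:

```python
# Back-of-the-envelope check on the claim above: throughput needed to move
# a 25 GB Blu-ray disc in under a minute (60-second window assumed).
disc_size_gb = 25                 # gigabytes
transfer_time_s = 60              # one minute

required_gbps = disc_size_gb * 8 / transfer_time_s   # gigabits per second
print(f"Required throughput: {required_gbps:.1f} Gbps")  # ~3.3 Gbps

# For contrast, an assumed ~300 Mbps Wi-Fi link would need far longer.
wifi_mbps = 300
wifi_time_min = disc_size_gb * 8 * 1000 / wifi_mbps / 60
print(f"Same transfer over {wifi_mbps} Mbps Wi-Fi: ~{wifi_time_min:.0f} minutes")
```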

 

Intel Pro Wireless Display (Pro WiDi), on the other hand, lets you securely share content from tablets, PCs and Ultrabook devices on your conference room displays without wires. It’s another piece in the jigsaw of wireless working, with Intel promoting its ease of use and administration as well as its security features.

 

Also on display on Intel’s CeBIT stand, and promising to transform the workplace, were a number of new device form factor innovations from vendors including HP, Fujitsu and Dell.

 

These included 2-in-1 devices: notebooks that can be turned into tablets with a twist – or removal – of the lid, or via a detachable keyboard; super-thin-and-light tablets; phablets (smartphone-tablet hybrids); more powerful laptops; and all-in-ones: high-definition touch-screen computers with everything included.

 

As well as these new devices designed for workplace transformation, Intel demonstrated technologies aimed at helping businesses to create value from IoT – connected smart objects; analytics that make use of complex big data; and cloud technologies that will transform the enterprise.

 

Also on its stand, Intel showcased a development project with Germany’s E.on SE, part of the energy giant E.on, using ‘smart grid’ processors and technologies. These help power generators and consumers monitor usage, so utility firms can adjust their supply to meet consumption, cutting costs by saving energy.

 

Talking about IoT, Intel’s VP and GM EMEA, Christian Morales, said: “[Intel] were in embedded applications about 35 to 40 years ago. We were the first company to introduce embedded application microcontrollers. The big change is that those applications are becoming connected to the cloud and the Internet.”

 

Data centre transformation

 

As well as workplace transformation, data centre transformation was also a key theme at CeBIT. Intel showcased its Xeon server processors and its ability to build manageable cloud services on top of this core product.

 

For example, Intel has designed its Xeon E5-2600 v3 processors to be used in data centres built on a software-defined infrastructure (SDI) model, in which data centre resources such as compute, storage and networking can be managed and manipulated through software to get the most efficient usage.

 

Compared to a typical four-year-old server, server platforms based on the new Xeon E5-2600 v3 processors offer up to 8.7 times the performance, up to three times the virtual machine density, and three times the energy efficiency of the older systems, according to Intel.

 

Intel’s strategy with the latest Xeon processors is to encourage enterprises to use them to power their hybrid cloud infrastructures, and utilise the accompanying management tools and technologies. Among these are intelligent workload placement through automated provisioning; thin provisioning of storage; and tiered storage orchestration.

 

Alongside its Xeon processor developments, Intel is also making advances in rack-scale architecture using Silicon Photonics. This is a new approach to make optical devices out of silicon, using photons of light to move huge amounts of data at very high speeds – up to 100Gbps. This happens over a thin optical fibre at extremely low power, rather than using electrical signals over a copper cable.

 

Wearables

 

Other notable innovations that Intel showcased centred on wearable computers. For example, Intel showed the ProGlove on its stand. This sensor-based ‘smart glove’ can boost productivity in manufacturing jobs by enabling manual workers to work faster and, through scanning and sensing, to collect data that can be analysed for production management purposes. The ProGlove team won third place in the Intel Make It Wearable Challenge in November 2014.

 

Intel recently ran the contest to encourage entrepreneurs, universities and schoolchildren to design wearable computers, based on Intel’s Edison technology, that could be used for practical purposes. Among the entries were a wearable camera drone, and sensor-equipped items for pregnant mothers and parents of new-born babies.

 

Another exciting development on display was Intel RealSense 3D, based on 3D camera technology. This features the first integrated camera that sees more like humans do, with the system able to understand and respond to natural movement in three dimensions, so users can interact with the device through natural movements. In addition, 3D scans can be manipulated and altered, shared, or printed with a 3D printer.

 

The system works by using a conventional camera, an infrared camera, and an infrared laser projector. Together, the three lenses allow the device to infer depth by detecting infrared light that has bounced back from objects in front of it. This visual data, taken in combination with Intel RealSense motion-tracking software, creates a touch-free interface that responds to hand, arm, and head motions as well as facial expressions. The 3D technology also has the potential to be used for security purposes, providing additional biometric input such as face recognition.
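The article doesn’t spell out the depth maths, and this is not a description of RealSense internals, but camera systems of this kind commonly estimate distance by triangulating the same point seen from two viewpoints; a minimal sketch of that idea, with invented focal length, baseline and disparity values:

```python
# Illustrative triangulation only -- the numbers below are invented and are
# not Intel RealSense specifications.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Classic stereo relation: depth = focal_length * baseline / disparity."""
    return focal_length_px * baseline_m / disparity_px

focal_length_px = 600.0   # assumed focal length in pixels
baseline_m = 0.05         # assumed 5 cm between the two viewpoints
disparity_px = 12.0       # assumed pixel shift of a feature between views

print(f"Estimated depth: "
      f"{depth_from_disparity(focal_length_px, baseline_m, disparity_px):.2f} m")
```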

 

Also on show was HP’s Sprout computer, which uses the 3D technology. Although it’s targeted at consumers, employees are likely to find uses for it, with the vendor talking about parts manufacturing when Sprout is linked to a 3D printer. Dell also demoed its 3D-enabled Venue 8 7000 series tablet, based on the Intel Atom Z3500 processor. This super-thin device fits in a jacket pocket and will allow enterprises to seek new uses for 3D technology by making it mobile and comparatively cheap.

 

With its involvement in the workplace, the data centre and end-user computing, Intel used CeBIT to showcase the breadth of its innovation and its deep and broad reach inside enterprise computing.

 

This article was originally published on: http://www.computerworlduk.com/sponsored-article/it-business/3606336/technology-enabled-business-transformation/

Read more >

March 2015 Intel® Chip Chat Podcast Round-up

In March, we started off covering the future of next generation Non-Volatile Memory technologies and the Open Compute Project Summit, as well as the recent launch of the Intel® Xeon® Processor D-1500 Product Family. Throughout the second half of March we archived Mobile World Congress podcasts recorded live in Barcelona. If you have a topic you’d like to see covered in an upcoming podcast, feel free to leave a comment on this post!

 

Intel® Chip Chat:

  • The Future of High Performance Storage with NVM Express – Intel® Chip Chat episode 370: Intel Senior Principal Engineer Amber Huffman stops by to talk about the performance benefits enabled when NVM Express is combined with the Intel® Solid-State Drive Data Center Family for PCIe. She also describes the future of NVMe over fabrics and the coming availability of NVMe on the client side within desktops, laptops, 2-in-1s, and tablets. To learn more visit: http://www.nvmexpress.org/
  • The Intel® Xeon® Processor D-1500 Product Family – Intel® Chip Chat episode 371: John Nguyen, a Senior Product Manager at Supermicro, discusses the Intel® Xeon® Processor D-1500 Product Family launch and how Supermicro is integrating this new solution into their products today. He illustrates how the small footprint and low power capabilities of the Intel Xeon Processor D-1500 Product Family are facilitating the production of small department servers for the enterprise, as well as enabling small businesses to take advantage of Intel Xeon processor family performance. To learn more visit: www.supermicro.com/products/embedded/
  • Innovating the Cloud w/ Intel® Xeon® Processor D-1500 Product Family – Intel® Chip Chat episode 372: Nidhi Chappell, Entry Server and SoC Product Marketing Manager at Intel, stops by to announce the launch of the Intel® Xeon® Processor D-1500 Product Family. She illustrates how this is the first Xeon processor in a SoC form factor and outlines how the low power consumption, small form factor, and incredible performance of this solution will greatly benefit the network edge and further enable innovation in the telecommunications industry and the data center in general. To learn more visit: www.intel.com/xeond
  • Making the Open Compute Vision a Reality – Intel® Chip Chat episode 373: Raejeanne Skillern, General Manager of the Cloud Service Provider Organization within the Data Center Group at Intel, explains Intel’s involvement in the Open Compute Project and the technologies Intel will be highlighting at the 2015 Open Compute Summit in San Jose, California. She discusses the launch of the new Intel® Xeon® Processor D-1500 Product Family, as well as how Intel will be demoing Rack Scale Architecture and other solutions at the Summit that are aligned with OCP specifications.
  • The Current State of Mobile and IoT Security – Intel® Chip Chat episode 374: In this archive of a livecast from Mobile World Congress in Barcelona, Gary Davis (twitter.com/garyjdavis), Chief Consumer Security Evangelist at Intel Security stops by to talk about the current state of security within the mobile and internet of things industry. He emphasizes how vulnerable many wearable devices and smart phones can be to cybercriminal attacks and discusses easy ways to help ensure that your personal information can be protected on your devices. To learn more visit: www.intelsecurity.com or home.mcafee.com
  • Enabling Next Gen Data Center Infrastructure – Intel® Chip Chat episode 375: In this archive of a livecast from Mobile World Congress, Howard Wu, Head of Product Line for Cloud Hardware and Infrastructure at Ericsson, chats about the newly announced collaboration between Intel and Ericsson to launch a next generation data center infrastructure. He discusses how this collaboration, which is in part enabled by Intel® Rack Scale Architecture, is driving the optimization and scaling of cloud resources across private, public, and enterprise cloud domains for improved operational agility and efficiency. To learn more visit: www.ericsson.com/cloud
  • Rapidly Growing NFV Deployment – Intel® Chip Chat episode 376: In this archive of a livecast from Mobile World Congress, John Healy, Intel’s GM of the Software Defined Networking Division, stops by to talk about the current state of Network Functions Virtualization adoption within the telecommunications industry. He outlines how Intel is driving the momentum of NFV deployment through initiatives like Intel Network Builders, and how embracing the open source community with projects such as OPNFV is accelerating vendors’ ability to offer solutions targeted at network function virtualization.

 

Intel, the Intel logo, and Xeon are trademarks of Intel Corporation in the U.S. and/or other countries.

*Other names and brands may be claimed as the property of others.

Read more >

Accepting the CORAL Challenge—Where Collaboration and Innovation Meet

By Dave Patterson, President, Intel Federal LLC and Vice President, Data Center Group, Intel

 

 

The U.S. Department of Energy’s (DOE) CORAL program (Collaboration of Oak Ridge, Argonne and Lawrence Livermore National Laboratories) is impressive for a number of advanced technical reasons. But the recent award announcement to Intel has shone a spotlight on another topic I am very excited about: Intel Federal LLC.

 

Intel Federal is a subsidiary that enables Intel to contract directly and efficiently with the U.S. Government. Today we work with DOE across a range of programs that address some of the grand scientific and technology challenges that must be solved to achieve extreme scale computing. One such program is Intel’s role as a prime contractor in the Argonne Leadership Computing Facility (ALCF) CORAL program award.

 

Intel Federal is a collaboration center. We’re involved in strategic efforts that need to be orchestrated in direct relationship with the end users. This involves the engagement of diverse sets of expertise from Intel and our partners, ranging from providers of hardware to system software, fabric, memory, storage and tools. The new supercomputer being built for ALCF, Aurora, is a wonderful example of how we bring together talent from all parts of Intel in collaboration with our partners to realize unprecedented technical breakthroughs.

 

Intel’s approach to working with the government is unique – I’ve spent time in the traditional government contracting space, and this is anything but. Our work today is focused on understanding how Intel can best bring value through leadership and technology innovation to programs like CORAL.

 

But what I’m most proud of in helping bring Aurora to life is what this architectural direction, built on Intel’s HPC scalable system framework, represents in terms of close collaboration on innovation and technology. Involving many different groups across Intel, we’ve built excellent relationships with the team at Argonne to gather the competencies we need to support this monumental effort.

 

Breakthroughs in leading technology are built into Intel’s DNA. We’re delighted to be part of CORAL, a great program with far-reaching impact for science and discovery. It stretches us, redefines collaboration, and pushes us to take our game to the next level.  In the process, it will transform the HPC landscape in ways that we can’t even imagine – yet.

 

Stay tuned to CORAL, www.intel.com/hpc

 

 

 

 

 

© 2015, Intel Corporation. All rights reserved. Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries. *Other names and brands may be claimed as the property of others.

Read more >

Challenge for Health IT: Move Beyond Meaningful Use

As we all know, healthcare is a well-regulated and process-driven industry. The current timeline for new research and techniques to be adopted by half of all physicians in the United States is around 17 years. While many of these regulations and policies are created with the best of intentions, they are often designed around criteria that don’t have the patient in mind, but instead serve billing, reimbursement, and organizational efficiency. Rarely do we see them designed around the experience of, and interactions with, the patient.

 

The challenge for technology at the moment, especially for the physician, is how to move beyond the meaningful use criteria that the federal government has adopted.

 

Outdated record rules

 

We are currently working with medical record rules and criteria that are 20 years old, and trying to adapt and apply them to our electronic records. The medical record has become a repository for large amounts of wasted words and phrases that have little meaning to the physician/patient interaction. When I wade through a medical record, it is very difficult to find relevant information because of the meaningful use criteria and the structure of the record.

 

As a person involved in quality review, what I find more and more in electronic records is that it’s very easy to potentiate mistakes and errors. One part of the whole system that I find hard to justify is having the physician, one of the most costly members of the team, take time to ostensibly be a clerk, or scribe, and fill out the required records.

 

Disrupts visits

 

The problem we can identify with all of this, at least in the office visit portion, is that it disrupts the visit with the patient. It steers the conversation toward completing the clerical tasks required by the meaningful use criteria. And to me, there’s nothing more oppressive in this interaction than this clerical work, which only gets worse when it’s done electronically.

 

So if we look at this situation from the perspective of people (both the patient and physician), and how we can use electronic tools, we could rapidly be liberated from the oppression of regulatory interactions. It would be so easy, right now, to capture a patient’s activities and health to create a historical archive. This could be created in some template using video and audio technologies, and language dictation software, that could give the physician much more content about what is going on.

 

I say this after visiting the Center for Innovation team at the Mayo Clinic Scottsdale location, where they are conducting a wearables experiment in which the provider wears Google Glass during an office visit with a patient.

 

The experiment had a scribe in another room observing and recording the interaction through the Glass feed, both video and audio, to capture the visit and create the medical record. As I looked through the note that was put together, it was a good note. It met the requirements of the bureaucrats, but it missed the richness of the visit that I observed, and it missed what the patient needed. It missed the steps and instructions that the physician covered with the patient. There is no place to record this in the current setup.

 

Easy review access

 

Just think if that interaction was available, through a HIPAA compliant portal, for the patient and provider to access. When the patient goes home, and a few days later asks, “What did my doctor cover during my visit,” they would be able to watch and hear the conversation right there. They might have brochures and literature that was given to them, but imagine if they had access to that video and audio to replay and watch again.

 

It seems to me that we have the technology at hand to make this a viable reality.

 

The biggest challenge here is to convince certain parties, like the federal government and Medicare, that there is a better, more meaningful way to do this. Recalling who the decision makers were that designed these processes and regulations, we must work to change the design criteria from a compliance perspective to one where the needs of the patient come first.

 

That’s where I think we have the great opportunities and great challenges to turn this around. If we think for a minute, and decide to do away with all these useless meaningful use criteria, and instead say, “Let’s go back and think how we can make the experience better for the patient,” and leverage technologies to do just that, we would be much better off.

 

What questions do you have?

 

Dr. Douglas Wood is a practicing cardiologist and the Medical Director for the Mayo Clinic’s Center for Innovation.

Read more >

An Intel partnership that’s a win for you too

By Charlie Wuischpard, VP & GM High Performance Computing at Intel

 

Every now and then in business it all really comes together— a valuable program, a great partner, and an outcome that promises to go far beyond just business success. That’s what I see in our newly announced partnership with the Supercomputing Center of the Chinese Academy of Sciences. We’re collaborating to create an Intel Parallel Computing Center (Intel® PCC) in the People’s Republic of China. We expect our partnership with the Chinese Academy of Sciences to pay off in many ways.

 

Through working together to modernize “LAMMPS”, the world’s most broadly adopted molecular dynamics application, Intel and the Chinese Academy of Sciences will help researchers and scientists understand everything from physics and semiconductor design to biology, pharmaceuticals, DNA analysis and genetics, and ultimately aid in identifying cures for diseases.

 

The establishment of the Intel® PCC with the Chinese Academy of Sciences is an important step. The relationship grows from our ongoing commitment to cultivate our presence in China and to find and engage Chinese businesses and institutions that will collaborate to bring their talents and capabilities to the rest of the world. Their Supercomputing Center has been focused on operating and maintaining supercomputers and exploiting and supporting massively parallel computing since 1996. Their work in high performance computing, scientific computing, computational mathematics, and scientific visualization has earned national and international acclaim. And it has resulted in important advances in the simulation of large-scale systems in fields like computational chemistry and computational material science.

 

We understand that solving the biggest challenges for society, industry, and science requires a dramatic increase in computing efficiency. Many organizations leverage high performance computing to solve these challenges, but seldom realize they are only using a small fraction of the compute capability their systems provide. Taking advantage of the full potential of current and future hardware (i.e., cores, threads, caches, and SIMD capability) requires what we call “modernization”. We know building supercomputing centers is an investment; ensuring software fully exploits modern hardware helps maximize the impact of those investments. Customers will realize the greatest long-term benefit when they pursue modernization in an open, portable and scalable manner.
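To make the modernization point concrete, here is a minimal illustration (in Python with NumPy rather than the C or Fortran typically used in codes like LAMMPS, and with an arbitrary array size) of the gap between a naive scalar loop and a vectorised equivalent that lets optimised, SIMD-friendly kernels do the work:

```python
import time
import numpy as np

# Toy example of "modern code": replace a scalar element-by-element loop
# with a vectorised call so optimised, SIMD-friendly kernels do the work.
n = 1_000_000
x = np.random.rand(n)
y = np.random.rand(n)

start = time.perf_counter()
total = 0.0
for i in range(n):            # naive scalar loop
    total += x[i] * y[i]
loop_time = time.perf_counter() - start

start = time.perf_counter()
total_vec = np.dot(x, y)      # vectorised dot product
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.3f}s  vectorised: {vec_time:.5f}s  "
      f"results match: {np.isclose(total, total_vec)}")
```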

 

The goals of the Intel® PCC effort go beyond just creating software that takes advantage of hardware, all the way to delivering value to researchers and other users around the world. Much of our effort is training and equipping students, scientists, and researchers to write modern code that will ultimately accelerate discovery.

 

We look forward to our partnership with the Chinese Academy of Sciences and the great results to come from this new Intel® Parallel Computing Center. You can find additional information regarding this effort by visiting our Intel® PCC website.

Read more >

Data Center: The Future is Software Defined

It is a very exciting time for the information and communication technology (ICT) industry as it continues the massive transformation to the digital service, or “on demand”, economy.  Earlier today I had the pleasure of sharing Intel’s perspective and vision of the data center market at IDF15 in Shenzhen, and I can think of no place better than China to exemplify how the digital services economy is impacting people’s everyday lives.  In 2015, ICT spending in China will exceed $465 billion, comprising 43% of global ICT spending growth.  ICT is increasingly the means to fulfil business, public sector and consumer needs, and the rate at which new services are being launched and existing services are growing is tremendous.  The result is three significant areas of growth for data center infrastructure: continued build-out of cloud computing, HPC and big data.

 

Cloud computing provides on-demand, self-serve attributes that enable application developers to deliver new services to the markets in record time.  Software Defined Infrastructure, or SDI, optimizes this rapid creation and delivery of business services, reliably, with a programmable infrastructure.  Intel has been making great strides with our partners towards the adoption of SDI.  Today I was pleased to be joined by Huawei, who shared their efforts to enable the network transformation, and Alibaba, who announced their recent success in powering on Intel’s Rack Scale Architecture (RSA) in their Hangzhou lab.

 

Just as we know the future of the data center is software defined, the future of High Performance Computing is software optimized. IDC predicts that the penalties for neglecting the HPC software stack will grow more severe, making modern, parallel, optimized code essential for continued growth. To this end, today we announced that the first Intel® Parallel Computing Center in China has been established in Beijing to drive the next generation of high performance computing in the country.  Our success is also dependent on strong partnerships, so I was happy to have Lenovo onstage to share details on their new Enterprise Innovation Center focused on enabling our joint success in China.

 

As the next technology disruptor, big data has the ability to transform all industries.  For healthcare, through the use of big data analytics, precision medicine becomes a possibility, providing tremendous opportunities to advance the treatment of life-threatening diseases like cancer.  By applying the latest cloud, HPC and big data analytics technology and products, and working collectively as an industry, we can enable the sequencing of a whole genome, the identification of the fundamental genes that cause a cancer, and the means to block them through personalized treatment – all in one day by 2020.

 

Through our partnerships with China’s technology leaders we will collectively enable the digital service economy and deliver the next decade of discovery, solving the biggest challenges in society, industry and the sciences.

Read more >

BYOD in EMEA – Part 4 – Five Best Practice Tips for Bring Your Own Device in Healthcare

I’ve looked at many aspects of Bring Your Own Device in healthcare throughout this series of blogs, from the costs of getting it wrong to the upsides and downsides, and the effects on network and server security when implementing BYOD.

 

I thought it would be useful to distil my thoughts around how healthcare organisations can maximize the benefits of BYOD into 5 best practice tips. This is by no means an exhaustive list but provides a starting point for the no doubt lengthy conversations that need to take place when assessing the suitability of BYOD for an organisation.

 

If you’ve already implemented BYOD in your own healthcare organisation then do register and leave a comment below with your own tips – I know this community will appreciate your expertise.

 

Develop a Bring Your Own Device policy


It sounds like an obvious first step, doesn’t it? However, I’d like to stress the importance of getting the policy right from day one. Do your research with clinical staff, understand their technology and process needs, identify their workarounds and ask how you can make their job of patient care easier. Development of a detailed and robust BYOD policy may take much longer than anticipated, and don’t forget that acceptance and inclusion of frontline staff is key to its success. Alongside the nuts and bolts of security, it’s useful to explain the benefits to healthcare workers to get their trust, confidence and buy-in from the start.


Mobile Device Management

 

It’s likely that you have the network/server security aspect covered off under existing corporate IT governance. A key safeguard in implementing BYOD is Mobile Device Management (MDM), which should help meet your organisation’s specific security requirements. Some of these requirements may include restrictions on storing/downloading data onto the device, password authentication protocols and anti-virus/encryption software. Healthcare workers must also be given advice on what happens in the event of loss or theft of the mobile device, or when they leave the organisation, with respect to remote deletion of data and apps. I encourage you to read our Case Study on Madrid Community Health Department on Managing Mobile for a great insight into how one healthcare organisation is assessing BYOD.
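The detail will vary by organisation and MDM product, but the requirements listed above often boil down to a small set of enforced settings. A hypothetical sketch of what such a policy might capture, with illustrative field names rather than any particular vendor’s schema:

```python
# Hypothetical BYOD policy -- field names and values are illustrative only,
# not the schema of any particular MDM product.
byod_policy = {
    "min_os_version": {"android": "5.0", "ios": "8.0"},   # assumed baselines
    "require_passcode": True,
    "passcode_min_length": 6,
    "require_device_encryption": True,
    "allow_local_patient_data": False,      # no patient data stored on the device
    "remote_wipe_on_loss_or_exit": True,    # wipe corporate data when staff leave
    "blocked_app_categories": ["file_sharing", "unapproved_cloud_sync"],
}

def device_is_compliant(device: dict, policy: dict = byod_policy) -> bool:
    """Very simplified compliance check against the policy above."""
    return (device.get("encrypted", False)
            and device.get("passcode_length", 0) >= policy["passcode_min_length"])

print(device_is_compliant({"encrypted": True, "passcode_length": 6}))   # True
print(device_is_compliant({"encrypted": False, "passcode_length": 8}))  # False
```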


Make it Inclusive


For a healthcare organisation to fully enjoy the benefits of a more mobile and flexible workforce through BYOD, it needs to ensure that as many workers as possible (actually, I’d say all) can use their personal devices. It can be complex, but some simple stipulations in the BYOD policy, such as requiring the user to ensure that they have the latest operating system and app updates installed at all times, can help to mitigate some of the risk. I would also be conscious of the level of support an IT department can give, both in terms of resources (people) and knowledge of mobile operating systems. Ultimately, the most effective BYOD policies are device agnostic.


Plan for a Security Breach

 

The best BYOD policies plan for the worst, so that if the worst does happen it can be managed efficiently, effectively and have as little impact as possible on the organisation and patients. This requires creation of a Security Incident Response Plan. Planning for a security breach may prioritise fixing the weak link in the security chain, identifying the type and volume of data stolen and reporting the breach to a governmental department. For example, the Information Commissioner’s Office (ICO) in the UK advises that ‘although there is no legal obligation on data controllers to report breaches of security, we believe that serious breaches should be reported to the ICO.’


Continuing Assessment


From a personal perspective we all know how quickly technology is changing and improving our lives. Healthcare is no different and it’s likely that the tablet carried by a nurse today has more computing power than the desktop of just a couple of years ago. With this rapid change comes the need to continually assess a BYOD policy to ensure it meets the advances in hardware and software on a regular basis. The risk landscape is also constantly evolving as new apps are installed, new social media services become available, and healthcare workers innovate new ways of collaborating. Importantly though, I stress that the BYOD policy must also take into account the advances in the working needs and practices of healthcare workers. We’re seeing some fantastic results from improved mobility, security and ability to store and analyse large amounts of data across the healthcare spectrum. We cannot afford for this progress to be hindered by out-of-date policies. The policy is the foundation of the security and privacy practice. A good privacy and security practice enables faster adoption, use, and realisation of the benefits of new technologies.

 

I hope these best practice tips have given you food for thought. We want to keep this conversation about the benefits of a more mobile healthcare workforce going so do follow us on Twitter and share our blogs amongst your personal networks.

 

BYOD in EMEA series: Read Part Three

Join the conversation: Intel Health and Life Sciences Community

Get in touch: Follow us via @intelhealth

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Read more >

Virtual Care Technology Transforms Home Healthcare, Empowers Patients

Today, many healthcare organizations are experimenting with and implementing the art of virtual care. Innovation in technology is finally able to address the need to go beyond brick and mortar and drive “care anywhere” when it is needed. While technology is enabling providers to drive virtual care initiatives to increase quality of care, provide patients with more access, and improve patient empowerment, therein lies the question: how secure is the ecosystem to which more and more personal health information is being exposed?

 

Current Technology

 

First, let’s look at where we are currently. Healthcare is one of the most exciting industries today, thanks to digital technology and the industry and governments coming together to address some major pain points that existed for many decades. We are finally at a point where many of the “what if we could” ideas that clinicians and patients worldwide had can be realized. For example, many providers are driving initiatives around virtual care, including telehealth, and remote patient monitoring leveraging technology that can reside in patients’ homes.

 

In the future, payers may be able to use HIT and device information to drive big data and provide the optimal plans for patients in different demographics given the geographic region where they live, family history, and life habits. Last, but not least, patients are empowered with tools, devices, and information to proactively manage their own health the way that really makes sense, outside the hospital.

 

Wearables and Mobility

 

Simple forms of home monitoring have existed for years; however, today there is a big disruption in the market due to new form factors of clinical wearables and connectivity solutions, which are easier to use and have a greater ability to transfer and provide access to patient data. Smartphones and tablets have become an integral part of people’s lives and can serve as a tool for telehealth, as well as a hub for clinical patient information. This makes the implementation of virtual care much easier, giving patients access to cost-effective options and allowing them to manage their health more proactively.

 

At the same time, this proliferation of devices and data also increases the risk of data attack. Any point where data is collected, used, or stored can be at risk and needs to be secured. If the wearable devices collecting the data are outside the U.S. and this data is being uploaded to the cloud inside the U.S., the use of these wearables can represent trans-border data flow, which can be a significant concern, especially for countries with strong data protection laws such as those in the EU. We need to be more responsible about how data is captured, transmitted, and protected. At Intel, we provide security solutions, such as fast encryption, that integrate well into the user experience while reducing cost. We are working with our customers to develop the most effective solutions for data privacy and security.
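As one generic illustration of protecting a record like this at rest or in transit (a sketch using a standard authenticated-encryption primitive from the Python cryptography package, not a description of any specific Intel product):

```python
# Generic illustration of authenticated encryption for a patient record;
# this is not a description of any specific Intel security product.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, held in a key-management service
aesgcm = AESGCM(key)

record = b'{"patient_id": "12345", "reading": "bp 120/80"}'   # example payload
nonce = os.urandom(12)                                        # unique per message

ciphertext = aesgcm.encrypt(nonce, record, b"device-42")      # bind to a device ID

# Decryption fails loudly if the ciphertext or associated data was tampered with.
plaintext = aesgcm.decrypt(nonce, ciphertext, b"device-42")
assert plaintext == record
```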

 

Key Challenges

 

Overall, it is wonderful to see so many healthcare institutions driving virtual care. Care is definitely moving outside the traditional venues to new more natural settings closer to what patients need. However, this also exposes more patient health information to be outside the hospital walls and outside the walls of patients’ homes.

 

As such, at Intel, when we design a solution, we enable security in our core HW technology. And this provides differentiation in how the users experience security. To have a great experience, the end user should not be subjected to data breaches or other security incidents, and solutions need to be smarter about detecting user context and risks, and guiding the user to safer alternatives. Devices need to function reliably and be free of malware.

 

In addition, we are focused on driving consistent security performance across the compute continuum of care.

 

That brings us back to the original question: how secure is the ecosystem? Security will play a key role in ensuring a safe solution that providers, payers, and patients can all rely on. Security will also be key to enabling faster adoption of virtual care. Depending on the types of patient information collected, used, retained, disclosed, or shared, and how it is stored and disposed of, security can be designed to optimally protect privacy. It is a complex area to address, but given the value of health data, I am hopeful that organizations will start to design their virtual care solutions and ecosystems with security as one of the key pillars.

 

What questions do you have?

 

Kay Eron is General Manager Health IT & Medical Devices at Intel.

Read more >

The Promise of Genomic Medicine: Are We There Yet?

Next-generation sequencing (NGS), also known as high-throughput sequencing, is the catch-all term used to describe a number of different modern sequencing technologies, including Illumina (Solexa), Roche 454, Ion Torrent (Proton/PGM), and SOLiD. NGS has allowed us to sequence DNA and RNA much faster and more cheaply than the previously used Sanger sequencing, and has revolutionized the study of genomics and molecular biology.

 

The cost of genomic sequencing has also come a long way. From $3 billion to sequence the first human genome, the cost was about $100 million per genome in 2001, and as of January 2014 it is about $1,000. Compared to Moore’s law, which observes that computing power doubles every two years, the cost of sequencing a genome is falling five to 10 times annually.
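Those two rates can be put side by side with a quick calculation (taking the conservative 5x-per-year figure for sequencing and the two-year doubling for compute quoted above):

```python
# Compare a Moore's-law-style improvement (2x every two years) with the
# quoted fall in sequencing cost (5-10x per year; the conservative 5x is used).
years = 5
compute_gain = 2 ** (years / 2)        # ~5.7x over five years
sequencing_drop = 5 ** years           # 3,125x over five years at 5x/year

print(f"Over {years} years: compute improves ~{compute_gain:.1f}x, "
      f"sequencing cost falls ~{sequencing_drop:,}x")

# Sanity check against the figures in the text: ~$100M per genome in 2001
# down to ~$1,000 by 2014 is a ~100,000x reduction in 13 years.
print(f"2001-2014 reduction: {100_000_000 // 1_000:,}x")
```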

 

The issue now is computing power to analyze this data. Newer sequencers are now producing four times the data in half the time. Intel® technologies like Xeon® and Xeon Phi™, SSDs, 10/40 GbE networking solutions, Omni-Path fabric interconnect, and Intel Enterprise Edition for Lustre (IEEL), along with partners like Cloudera and Amazon Web Services, are helping to cut down the time for secondary analysis from weeks to hours.

 

Genomic information is now catalogued and used for advancing precision medicine. For example, genomic information from TCGA (The Cancer Genome Atlas) has led to developments and FDA approval for certain cancer treatments. Currently, there are about 34 FDA-approved targeted therapies, like Gleevec, which treats gastrointestinal stromal tumors by blocking tyrosine kinase enzymes. Though approved by the FDA in 2001, Gleevec was subsequently approved to treat 10 more types of cancer in 2011.

 

Technical Challenges

 

Sequencers are now producing four times more data in 50 percent less time, at about 0.5 TB/device/day. This is a lot of data. Newer modalities like 4-D imaging are now producing 2 TB/device/day. The majority of the software used for informatics and analytics is open source, and the market is very fragmented.
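In storage terms the arithmetic is straightforward: four times the data in half the time is an eight-fold jump in output rate, and even a small fleet of instruments adds up quickly (the ten-device fleet below is an assumption for illustration):

```python
# Simple arithmetic from the figures above; the fleet size is assumed.
rate_multiplier = 4 / 0.5            # 4x the data in half the time = 8x output rate
seq_tb_per_day = 0.5                 # per sequencer
imaging_tb_per_day = 2.0             # per 4-D imaging device

devices = 10                         # assumed small lab fleet
yearly_tb = devices * (seq_tb_per_day + imaging_tb_per_day) * 365

print(f"Output rate increase: {rate_multiplier:.0f}x")
print(f"{devices} devices -> ~{yearly_tb:,.0f} TB/year (~{yearly_tb / 1000:.1f} PB)")
```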

 

Once the data is generated, the burden of storing, managing, sharing, ingesting, and moving it has its own set of challenges.

 

Innovation in algorithms and techniques is outpacing what IT can support, thus requiring flexibility and agility in infrastructures.

 

Collaboration across international boundaries is an absolute necessity and that introduces challenges with security and access rights.

 

Finally, as genomics makes its way into clinics, regulations like HIPAA will kick in.

 

At the clinical level, you have barriers around the conservation and validity of the sample, validity and repeatability of laboratory results, novelty and interpretation of biomarkers, merging genomics data with clinical data, actionability and eventually changing the healthcare delivery paradigm.

 

There are too few clinical specialists and key healthcare professionals, like pharmacists, who are trained in clinical genomics. New clinical pathways and guidelines will have to be created. Systems will need to be put in place to increase transparency and accountability of different stakeholders of genomic data usage. Equality and justice need to be ensured and protection against discrimination needs to be put in place (GINA).

 

Reimbursement methods need to consider flexible pricing for tailored therapeutics responses along with standardization and harmonization (CPT codes).

 

Path Forward

 

Looking ahead, we need to develop a standardized genetic terminology (HL7, G4GH, eMERGE) and make sure EHRs support the ability to browse sequenced data. Current EHRs will need standards around communication, querying, storing, and compressing large volumes of data while interfacing with EHRs’ identifiable patient information.

 


 

Intel is partnering with Intermountain Health to create a new set of Clinical Decision Support (CDS) applications by combining clinical, genomic, and family health history data. The goal is to promote widespread use of CDS that will help clinicians/counselors in assessing risk and assist genetic counselors in ordering genetic tests.

 

The solution will be agnostic to data collection tools, scale to different clinical domains and other healthcare institutions, be standards based where they exist, work across all EHRs, leverage state-of-the-art technologies, and be flexible to incorporate other data sources (e.g., imaging data, personal device data).

 

What questions do you have?

 

Ketan Paranjape is the general manager of Life Sciences at Intel Corporation.

Read more >

Intel and Spanish Society of Family Medicine & Community Collaborate to Create Tablet with App Store Exclusively for Doctors



Improving care for patients is a common goal for our healthcare team and partners, so I’m really excited to be able to share the outcome of a collaborative project we’ve been working on with the Spanish Society of Family Medicine and Community (semFYC).

 

Together we have created a tablet featuring an app store exclusively for doctors. Meeting the needs of healthcare professionals with an easy-to-use mobile device combined with medical applications that have the endorsement of a scientifically-recognised body in semFYC is incredibly exciting for all involved and a step-change for the way GPs and physicians access the latest clinical information.

 

Josep Basora, President of semFYC, spoke to me about the tablet and app store created in partnership with Intel: “When I started to drive this project I wanted to facilitate the right information, at the appropriate place and by the authorised time. Mobility is one of the keys that defines the work of the current healthcare professional.”

 

“For a physician, the possibility to use applications that have the endorsement of a scientific society such as semFYC has real significance, as it has the full assurance that the tool used is supported by rigorous governance. This has certainly had a positive effect on both resource optimisation and improvement of patient service.”

 

semFYC brings together more than 17 Societies of Family Medicine and Community in Spain covering a total of 19,500 GPs with a focus on improving the knowledge and skills of its members. The app store, which exclusively features medical applications, automatically updates installed apps with the latest information around procedures and drugs, thus reducing the time GPs require to update their knowledge and consequently increasing the quality of patient care.

 

Take a look at the video above to find out more about the tablet and health app store created by Intel with semFYC.

 

Read more >

ODCA Cloud Adoption Survey: Trends in Public and Private Cloud Usage

In addition to the big data area, I have done and continue to do work with cloud computing.  Intel is a member of the Open Data Center Alliance (ODCA), an independent consortium of global IT leaders building and recommending a unified customer vision for data center requirements.  Late in 2014, the ODCA sent out a survey to its members on cloud adoption.  I answered that survey, and recently the ODCA published the survey results.  One finding is that there seems to be a strong preference for internal cloud solutions among ODCA members.  Why is that?  Is public cloud adoption really slowing?  What are the key issues with both public and private cloud adoption?

 

Overall, cloud adoption for ODCA members is on the increase for both public and private cloud, although private cloud is increasing at a much faster rate.  The ODCA survey highlighted the following top concerns: data security, regulatory issues, service reliability, and vendor lock-in.  For the public cloud, data security and regulatory issues are probably highest in priority.  Intel IT has created a cloud brokering function for deciding whether to land an application externally in the public cloud or in the internal cloud.  This function makes a decision based on factors like security requirements, control, and location.
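Intel IT’s actual decision logic isn’t published in the report, but a brokering function of that kind can be sketched as a simple rules check over the factors mentioned (security requirements, control, location); the rules and field names below are assumptions for illustration:

```python
# Illustrative cloud-brokering decision; the factors come from the paragraph
# above, but the rules and field names are assumptions.
def place_workload(app: dict) -> str:
    """Return 'private cloud' or 'public cloud' for an application profile."""
    if app.get("data_classification") in {"restricted", "regulated"}:
        return "private cloud"                  # security / regulatory driver
    if app.get("needs_full_stack_control", False):
        return "private cloud"                  # control driver
    if app.get("data_residency") not in (None, "any"):
        return "private cloud"                  # location driver
    return "public cloud"

print(place_workload({"data_classification": "public", "data_residency": "any"}))
print(place_workload({"data_classification": "regulated"}))
```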


 

A co-worker pointed out to me that the report seems to be IaaS-centric and that SaaS to the public Cloud is likely to grow.  I would agree, and the report also mentions this.  Opportunistically adopting SaaS Solutions is in Intel IT’s original Cloud Strategy, and today I see that public SaaS adoption continues to move ahead within Intel. The survey also points out key areas of interest to ODCA members, such as Software Defined Networking and hybrid cloud.  SDN is also an area of focus for Intel IT, while moving to hybrid cloud has been a strategic goal.

 

A few other highlights that I didn’t cover:

  • Since 2012, the number of respondents who have greater than 60% of their operations in an internal cloud has increased from 10% to 24%.
  • Organizations project both their internal and public cloud usage to double by 2016.
  • More than 80% of survey respondents are using or planning to use hybrid cloud solutions at some point in the future.

You can see the details by downloading the report.

Read more >

5 Drivers Fueling Growth in Consumer Health

A perfect storm of market conditions is forming that will likely propel consumer health near the top of many enterprise priority lists and justify its estimated 40 percent CAGR in 2015.

Intel has been the driving force behind the global technology revolution for more than 40 years, and we’ve seen the dramatic impact of technology on healthcare. Looking ahead, here are the five drivers that we see fueling growth in consumer health:

Payment Reform

 

One of the most important conditions is payment reform. As the basis for reimbursement shifts away from fee-for-service and toward quality-based outcomes in the U.S., providers will extend the continuum of care far beyond their hospitals to more accurately quantify value after discharge.

Data

 

One of the best ways to optimize care and demonstrate effectiveness is to implement a holistic approach for understanding a person’s status by deriving actionable data about her individually and continuously from multiple sources — including consumer devices.


Consumer Involvement

 

Consumer empowerment is also going to play a large role. It began with the shift from a business model that was traditionally B2B to one that was more B2C as commercial health insurers positioned themselves to personally engage millions of newly eligible customers. Now, consumer health solutions enable all payer organizations — private, public, employer — to promote healthy behaviors and timely preventative care that has been shown to reduce the occurrence of costly acute emergencies. Ultimately, consumers will have the ability to be more active in managing their own care, with the expectation of access to more of their health information anytime.


Baby Boomers

 

A demographic shift is also fueling this growth. Every day, 10,000 baby boomers celebrate their 65th birthday in the U.S., and that trend will continue until at least 2019. Unfortunately, 90 percent of them, with help from their family caregivers in some cases, are managing at least one chronic medical condition (860 million people worldwide). As telehealth becomes more widely adopted (and reimbursed), remote doctor consultations will increasingly rely on consumer health technologies to improve chronic disease management and ease the stress on a limited pool of primary care physicians.

 

Worldwide Approach

 

Many fast-growing emerging global markets, like China and India, are exhibiting strong appetites for consumer health solutions that can add value while supplementing recent government efforts to provide more efficient virtual care to their significant aging and rural populations. As more technology vendors from the region offer innovative products at very competitive price points, access and adoption will continue to climb at a healthy pace, contributing to notable growth of the consumer health market segment regionally and worldwide.

 

Of course, one of the biggest hurdles to overcome is alignment of priorities for all major stakeholders. You need a consumer-centered design, an evaluation of clinical workflow integration, and a way to measure the business impact of the goals.

 

What questions do you have? What other drivers do you see impacting consumer health?

 

Michael Jackson is General Manager, Consumer Health at Intel Corporation.

Read more >

Turning Raw Data into Smart Insights

We are creating data at an exponential rate. Yet, the data growth rate is not the biggest challenge for IT. The biggest challenge is that the need for useful information is growing faster than the data itself  — providing a perfect storm for IT professionals and a business imperative to turn raw data into smart insights. To understand this challenge, I’d like to explore the history and evolution of big data complexity.

 

In 2001, Gartner analyst Doug Laney explained the initial challenges of big data in his 3Vs model. As time progressed, others have embraced the 3Vs model and incorporated two more areas of emphasis for big data analytics: veracity and value.

 

Ben Rossi does a nice job of distilling the impact of these five terms into one: smart data. “The purpose of smart data (veracity and value) is to filter out the noise and retain only the valuable data, which can be effectively used by the enterprise decision makers to solve business problems.”


Actionable Insights

 

Today, big data technology unfortunately isn’t meeting the needs of most businesses. There are two reasons: we focus on the types of data rather than the use case — business insights — and we don’t look far enough ahead in our use cases, solving yesterday’s challenges rather than tomorrow’s. Michael Wu, chief scientist at Lithium Technologies, states that we are on a “maturity journey” when it comes to analytics and data visualizations. Understanding this evolution will help us better architect IT solutions today to extract the information and develop actionable insights for business decision makers.

 

There are three levels of analytics maturity that describe this progression:

 

  • Descriptive analytics (what happened): A summary report of historical data, usually seen in a dashboard. Most enterprise analytics today fall into this category. An example includes a report of business data offering insights into an organization’s financials, sales, or inventory.
  • Predictive analytics (what should happen): Makes predictions based on information that’s already available. An example includes financial services more accurately predicting future stock performance (noting that historical performance is not an indicator of future results).
  • Prescriptive analytics (what you should do today): Analytics that not only predict the future but also deliver insights that allow you to decide today what path you should take to optimize your results. Google’s self-driving car is an example of prescriptive analytics, since the car needs to make decisions based on predictions of future outcomes. This is the use case that business leaders in a variety of industries are seeking and what’s driving the need for big data analytics.
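A toy illustration of the first two levels on the same made-up data; a prescriptive system would add an optimisation step (“what should we do about it”) on top of the prediction:

```python
# Toy illustration of descriptive vs. predictive analytics on made-up monthly
# sales figures; prescriptive analytics would add an optimisation step on top.
import statistics

monthly_sales = [120, 132, 128, 141, 150, 158]   # assumed historical data

# Descriptive: summarise what happened.
print(f"mean={statistics.mean(monthly_sales):.1f}, latest={monthly_sales[-1]}, "
      f"growth={monthly_sales[-1] - monthly_sales[0]}")

# Predictive: fit a simple linear trend and project the next month
# (statistics.linear_regression requires Python 3.10+).
n = len(monthly_sales)
slope, intercept = statistics.linear_regression(range(n), monthly_sales)
print(f"projected next month: {intercept + slope * n:.0f}")
```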

 



Rossi’s concept of smart data enables intelligent insights when evaluated with a focus on prescriptive analytics. Wu summarizes nicely: “Big data technology won’t help you make bigger decisions … yet smart data can certainly help you make smarter decisions.”


Intel and Big Data Innovation

 

Extracting insights fast enough to support real-time business processes and decisions is critical, and companies are gathering, storing, and analyzing data they were never able to before. Intel understands the challenges and complexities facing IT professionals regarding the need to deliver high performance, cost-efficient big data solutions on a scalable, secure architecture.

 

As a result, Intel has joined forces with many industry leaders to enable enterprise solutions. SAP HANA, SAP Data Services, and SAP Business Objects provide solutions for real-time big data analytics using the Intel Distribution for Apache Hadoop software. Through these platforms, businesses can combine the performance of analytics with the scalability of Apache Hadoop, enabling a real-time analytics platform made to store, integrate, and analyze all business data.

 

Late last year, our CEO discussed a new Intel collaboration and equity investment with Cloudera aimed at bringing an enterprise-ready platform to the mainstream for impactful big data solutions.


The Big Data Maturity Journey

 

In summary, raw data is only useful when it’s used to add context-specific relevance, insights, and value to business operations. Smart data empowers the decision-making process by using analytics to achieve results that make sense to humans, not just machines. By making information actionable, we can make profitable decisions and solve problems in the process — and those are smart insights.

 

Join this conversation by connecting with me on Twitter at @chris_p_intel or by using #ITCenter.

Read more >

Bring Your Own Device in EMEA – Part 3 – It’s Not Just About Devices

I like to think of security as a chain, and like any other chain it is only as strong as its weakest link. In the case of security in healthcare, the chain consists of the network, the server and the device. Often the focus is overwhelmingly placed on the security of the device, but I argue that data is equally, if not more, at risk when it’s in transit as when it’s at rest. So, with that in mind, I wanted to take a look at some of the wider security considerations around Bring Your Own Device (BYOD).


Whenever I speak at events about security and healthcare my starting point is often that we must remember that the priority for healthcare professionals is patient care. Security cannot, and must not, compromise usability as we know this drives workarounds. Often these workarounds mean using personal devices in conjunction with what is more commonly known as ‘Bring Your Own Cloud’.


Bring Your Own Cloud

Bring Your Own Cloud (BYOC) primarily refers to the use of clouds that are not authorized by the healthcare organization to convey sensitive data. This often occurs through an individual using an app they downloaded onto a personal device. Many such apps have backend clouds as part of their overall solution. When sensitive data is entered into the app it gets sync’d to the cloud. Furthermore, this transfer can occur over networks that are not managed by the healthcare organization, making the transfer invisible to the healthcare organization. Of course, sensitive data in an unauthorized cloud can constitute a breach. In many cases these 3rd party clouds can be in different countries, making this transfer a trans-border data flow and can represent further non-compliance issues with data protection laws.


For example, imagine a nurse taking patient notes that need to be sent to a specialist such as a cardiologist. This should be done using a secure device with a secure wireless network and a secure solution approved by the organization for such a task. However, lack of usability, or cumbersome security around such solutions, or a slow or overly restrictive IT department can drive the use of BYOC approach instead. In a BYOC approach the nurse uses a personal app on a personal mobile device together with either unencrypted email, a file transfer app, or social media to send these for analysis by a specialist.


This introduces risks to both the confidentiality of the sensitive healthcare data and the integrity of the patient record, which is often not updated with information traveling in these “side clouds”, rendering it incomplete, inaccurate, or out of date. In the best case this can result in suboptimal healthcare, and in the worst case it could be a patient safety issue. The consequences to both patient and organisation of such risks can be severe. Here at Intel we have security solutions available to healthcare organisations which ensure that data is always secure, whether at rest or in transit, on the device or the organisation’s network. Our security solutions also use hardware-enhanced security to maximize performance and usability, mitigating the risk of cumbersome security driving the healthcare worker to resort to workarounds and BYOC.


Apps for Healthcare

One area where I’m seeing rapid change is in the development of apps for healthcare. I recently spoke to the Apps Alliance about the security challenges facing developers of healthcare apps, whether they are aimed at healthcare professionals or at consumers. These apps often make recording and analysing health information very easy, and in some cases they can enhance the relationship between patient and clinician.


Stealth IT

I’d also like to briefly take a look at what is often referred to as ‘Stealth IT’, also called ‘Shadow IT’. As with any form of workaround, the use of Stealth IT can be driven by an unresponsive or overly restrictive corporate IT department. One obvious example is a small team of researchers requiring additional server space to store data but perceiving the organisational process for providing such resources as slow and expensive. The consequence is the purchase of comparatively cheap and readily accessible server space from any number of easy-to-find companies on the web. I remind you of my earlier comments about knowing exactly how secure that server is, and in which country or continent it sits.


I like to think that a healthcare organisation looking to put a Bring Your Own Device policy in place appreciates the benefits and risks, but starts by understanding why a healthcare professional uses their own device, logs on to an unsecured network or purchases unauthorised server space. Only then will the organisation, the healthcare worker and the patient truly reap the benefits of BYOD.

 

 

David Houlding, MSc, CISSP, CIPP, is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Read more >

Harnessing the Internet of Things for Proactive Home Energy Management

Excuse the pun, but here in the UK, energy prices are hot news.

 

According to the Department of Energy and Climate Change, the prices of combined domestic gas and electricity bills have increased by 33 per cent in real terms since 2007. And, despite wholesale gas prices falling rapidly over the past 12 months, energy providers have been criticised for being slow to pass these savings on to customers. Of course, it’s not that simple. Energy providers have to buy their wholesale gas supplies months in advance and have their own costs to cover, meaning it’s not always possible to pass on these savings immediately and in full. Both parties are feeling the squeeze, not only in the UK, but all across Europe.

Energy-IoT.png

 

So what can be done to shake up the energy market and improve the situation for all?

 

Well, French company IJENKO thinks the answer lies in the Internet of Things (IoT).

 

“We have only reached the tip of the iceberg when it comes to realizing the positive impact of the Internet on our digital home life, in particular with regard to energy,” explains Serge Subiron, CEO and co-founder at IJENKO. “The Internet of Energy (IoE), as it is known, has the potential to connect the activities of utility providers and consumers in real time, enabling much more dynamic energy provision and consumption.”

 

IJENKO’s Home Management Solution allows energy suppliers to empower customers to become more efficient in their use of energy, enabling them to save on their energy bills and collectively influence the demand curve.

 

Remote Monitoring

 

Imagine being able to use your smartphone to check how much energy your heating system is consuming, and what this equates to in monetary terms. Imagine then that you’re able to remotely turn down your thermostat a degree or two from wherever you are. This greater visibility and control, made possible by the IoT, allows for a much more proactive, dynamic and efficient use of energy, not to mention lower bills if that is your end goal.

 

And these solutions are not the stuff of science fiction. They are possible today thanks to IJENKO’s Home Management Solution and the Intel® IoT Gateway, which extracts data from legacy systems around the home and securely connects them to next-generation intelligent infrastructure.
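As a simple illustration of the pattern, a home energy application might poll the gateway for the current meter reading and push back a new thermostat set-point. The sketch below is hypothetical: the gateway address, URL paths and JSON fields are invented for illustration and are not IJENKO’s or Intel’s actual API.

# Illustrative only: poll a home energy gateway and nudge the thermostat.
# The gateway URL and JSON fields are hypothetical, not a real product API.
import requests

GATEWAY = "http://home-gateway.local/api"   # hypothetical local gateway endpoint

def current_consumption_watts() -> float:
    """Read the instantaneous consumption reported by the smart meter."""
    reading = requests.get(f"{GATEWAY}/meter/current", timeout=5).json()
    return float(reading["watts"])

def set_thermostat(celsius: float) -> None:
    """Lower or raise the heating set-point remotely."""
    resp = requests.put(f"{GATEWAY}/thermostat/setpoint",
                        json={"celsius": celsius}, timeout=5)
    resp.raise_for_status()

if __name__ == "__main__":
    watts = current_consumption_watts()
    print(f"Heating is drawing {watts:.0f} W right now")
    if watts > 2500:            # arbitrary comfort/cost threshold
        set_thermostat(19.0)    # turn it down a degree or two from afar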

 

Smart Services

 

As well as improving the customer experience, the IJENKO solution presents utility providers with the opportunity to develop innovative smart services based on the interaction of a number of technologies, rather than one overarching standard. By adding real value for the customer, these services offer long-term stickiness and help utility providers stay in control of the customer experience.

 

All in all, it’s a win-win scenario.

 

To continue this conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

 

Jane Williams
Online Sales Development Manager
Intel Corporation

Read more >

Hardware and Software Solutions that Create a Loosely Coupled Partnership

Demand for efficiency, flexibility, and scalability continues to increase, and the data center must keep pace with the move to digital business strategies. Diane Bryant, senior vice president and general manager of Intel’s Data Center Group, recently stated, “We are in the midst of a bold industry transformation as IT evolves from supporting the business to being the business. This transformation and the move to cloud computing calls into question many of the fundamental principles of data center architecture.”

 

Those “fundamental principles of data center architecture” are on a collision course with the direction in which virtualization has led us. Virtualization, in conjunction with automation and orchestration, is leading us to Software Defined Infrastructure (SDI). The demands of SDI are driving new hardware developments that will open a whole new world of possibilities for running a state-of-the-art data center, eventually leaving legacy infrastructure behind. While we’re not quite there yet, as different stages still need to mature, the process has the power to transform the data center.

 

generic_server_room.jpg

 

Logical Infrastructure

 

SDI rebuilds the data center into a landing zone for new business capabilities. Instead of a collection of highly specialized components, it is a cohesive, comprehensive system that meets the demands of highly scalable, diverse workloads, from traditional applications to cloud-aware ones.

 

This movement to cloud-aware applications will drive the need for SDI: by virtualizing and automating the hardware that powers software platforms, infrastructure becomes more powerful, cost-effective, and efficient. Moving away from manual upkeep of individual resources also allows systems, storage, and network administrators to shift their focus to more important tasks instead of acting as “middleware” connecting these platforms.
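One way to picture the shift is as a change of interface: rather than an administrator configuring individual boxes, an application declares the resources it needs and the orchestration layer satisfies the request from pooled compute, storage, and networking. The sketch below is conceptual only; the WorkloadSpec fields and the request_capacity call are hypothetical stand-ins for whatever orchestration API an organization actually uses.

# Conceptual sketch of software-defined provisioning: the workload describes
# what it needs; the orchestrator decides which physical resources serve it.
# The WorkloadSpec fields and request_capacity() API are hypothetical.
from dataclasses import dataclass

@dataclass
class WorkloadSpec:
    name: str
    vcpus: int
    memory_gb: int
    storage_gb: int
    network_gbps: int

def request_capacity(spec: WorkloadSpec) -> dict:
    """Stand-in for an orchestration API call (e.g. an OpenStack or cloud SDK).

    A real SDI stack would pick hosts, carve out storage volumes and attach
    virtual networks automatically, then hand back connection details.
    """
    return {
        "workload": spec.name,
        "placement": "chosen-by-scheduler",   # no human picks the server
        "status": "provisioning",
    }

web_tier = WorkloadSpec("web-frontend", vcpus=8, memory_gb=32,
                        storage_gb=200, network_gbps=10)
print(request_capacity(web_tier))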

 

Organizations will be able to scale their infrastructure in support of the new business services and products, and bring them to market much more quickly with the power of SDI.

 

Hardware Still Matters

 

As the data center moves toward an SDI-driven future, CIOs should be cautious about assuming that hardware no longer matters. Hardware that works in conjunction with the software will be critical: it must ensure that the security and reliability of workloads are fully managed, and it must provide the telemetry and extensibility that allow specific capabilities to be optimized and controlled within the hardware itself.
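Telemetry is one concrete example of hardware staying in the loop: a scheduler that can read per-host temperatures, frequencies, or utilization can place and throttle workloads more intelligently. The snippet below is only a small illustration of reading such signals from sysfs on a Linux host; the exact paths vary by platform and are an assumption here.

# Small illustration of host telemetry an SDI scheduler might consume.
# The sysfs paths below are Linux-specific and vary between platforms.
from pathlib import Path
from typing import Optional

def read_int(path: str) -> Optional[int]:
    """Return the integer contents of a sysfs file, or None if it is absent."""
    p = Path(path)
    return int(p.read_text().strip()) if p.exists() else None

# Package temperature in millidegrees Celsius and current core frequency in kHz.
temp_mc = read_int("/sys/class/thermal/thermal_zone0/temp")
freq_khz = read_int("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq")

if temp_mc is not None:
    print(f"CPU temperature: {temp_mc / 1000:.1f} C")
if freq_khz is not None:
    print(f"cpu0 frequency:  {freq_khz / 1000:.0f} MHz")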

 

The Future of the Data Center Lies with SDI

 

Data centers must be agile, flexible, and efficient in this era of transformative IT. SDI allows us to achieve greater efficiency and agility by allocating resources according to organizational needs, application requirements, and infrastructure capabilities.

 

As Bryant concluded, “Anyone in our industry trying to cling to the legacy world will be left behind. We see the move to cloud services and software defined infrastructure as a tremendous opportunity and we are seizing this opportunity.”

 

To continue the conversation, please follow me on Twitter at @EdLGoldman or use #ITCenter.

Read more >

Hitting the Tradeshow Circuit with Red Rock Canyon – Intel’s New 100G Ethernet Technology

I had the opportunity to attend both Mobile World Congress and the Open Compute Project Summit this year, and we demonstrated Red Rock Canyon (RRC) at both venues. We disclosed RRC for the first time at Fall IDF in San Francisco last year. RRC is Intel’s new multi-host Ethernet controller silicon with integrated Ethernet switching resources.

The device contains multiple integrated PCIe interfaces along with Ethernet ports that can operate at up to 100G. The target markets include network appliances and rack scale architecture, which is why MWC and the OCP Summit were ideal venues to demonstrate the performance of RRC in these applications.

 

Mobile World Congress

This was my first time at MWC and it was an eye-opener: eight large exhibit halls in the middle of Barcelona, with moving walkways to shuttle you from one hall to the next, booths the size of two-story buildings, and 93,000 attendees, a record number according to the MWC website.

At the Intel booth, ours was one of several demonstrations of technology for network infrastructure. Our demo, entitled “40G/100GbE NSH Service Chaining in Intel ONP”, highlighted service function forwarding using Network Services Headers (NSH) on both the Intel XL710 40GbE controller and the Intel Ethernet 100Gbps DSI adapter that uses RRC switch silicon. In case you’re not familiar with NSH, it’s a new virtual network overlay industry initiative, driven by Cisco, that allows flows to be identified and forwarded to a set of network functions by creating a virtual network on top of the underlying physical network.

The demo, a collaboration with Cisco, uses an RRC NIC as a 100GbE traffic generator to send traffic to an Intel Sunrise Trail server, which receives the traffic at 100Gbps using another RRC 100GbE NIC card. Sunrise Trail then forwards 40Gbps of traffic to a Cisco switch, which in turn distributes the traffic to another Sunrise Trail server and a Cisco UCS server, both of which contain Intel® Ethernet XL710 Converged Network Adapters.

The main point of the demonstration is that the RRC NIC, the XL710 NIC and the Cisco switch can create a wire-speed service chain by forwarding traffic using destination information in the NSH header. For NFV applications, the NIC cards can also forward traffic to the correct VM based on this NSH information.
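For readers new to the mechanics, the NSH service path header carries a 24-bit Service Path Identifier (SPI) and an 8-bit Service Index (SI); each forwarder looks up (SPI, SI) to decide which service function receives the packet next, and the service function decrements the SI as it processes the flow. The sketch below shows that lookup in miniature; the forwarding table entries are invented for illustration and are not part of the demo described above.

# Minimal illustration of NSH-style service function forwarding: the service
# path header carries a 24-bit SPI and an 8-bit SI, and each forwarder maps
# (SPI, SI) to the next service function. Table entries are invented.
import struct

def pack_service_path_header(spi: int, si: int) -> bytes:
    """Pack SPI (24 bits) and SI (8 bits) into the 4-byte service path header."""
    return struct.pack("!I", ((spi & 0xFFFFFF) << 8) | (si & 0xFF))

def unpack_service_path_header(hdr: bytes) -> tuple:
    """Return (spi, si) from a packed service path header."""
    word, = struct.unpack("!I", hdr)
    return word >> 8, word & 0xFF

# Hypothetical forwarding table: (SPI, SI) -> next hop in the service chain.
FORWARDING_TABLE = {
    (10, 255): "vm-firewall",
    (10, 254): "vm-deep-packet-inspection",
    (10, 253): "vm-load-balancer",
}

hdr = pack_service_path_header(spi=10, si=255)
spi, si = unpack_service_path_header(hdr)
print(f"SPI={spi} SI={si} -> forward to {FORWARDING_TABLE[(spi, si)]}")
# After the firewall processes the packet it decrements SI to 254, and the
# next lookup sends the flow on to deep packet inspection.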

Network Function Virtualization (NFV) was a hot topic at MWC this year, and many customers from leading network service providers and OEMs came by our booth to see the demo. In some cases they were most interested in our 100GbE link, which I was told was one of the few demos of its kind at the show.

Another 100G Intel Ethernet demo was at the Ericsson booth, where the company announced its Project Athena, which demonstrated a 100GbE link using two RRC-based NIC cards. Athena is designed for hyperscale cloud data centers using Intel’s rack scale architecture framework.

 

Open Compute Project Summit

The very next week, I traveled to San Jose to attend the Open Compute Project Summit, where RRC was part of a demonstration of Intel’s latest software development platform for its rack scale architecture. OCP was a much smaller show, focused on optimizing rack architectures for hyperscale data centers. At last year’s conference, we demonstrated an RSA switch module using our Intel Ethernet Switch FM6000 along with four Intel 10GbE controller chips.

This year, we showed our new multi-host RSA module, which effectively integrates all of these components into a single device while providing 50Gbps of bandwidth to each server along with multiple 100GbE ports out of the server shelf. This RSA networking topology not only delivers a 4:1 cable reduction, it also enables flexible network topologies. We also demonstrated our new open source ONP Linux kernel driver, which will be upstreamed in 2015, consistent with our Open Network Platform strategy.

We had a steady stream of visitors to our booth, partly due to an excellent bandwidth performance demo.

After first disclosing RRC at IDF last year, we were pleased to show three demonstrations of its high-performance capabilities across MWC and the OCP Summit. It doesn’t hurt that these conferences are also targeted at two key market segments for RRC: network function virtualization and rack scale architecture.

We plan to officially launch RRC later this year, so stay tuned for much more information on how RRC can improve performance and/or reduce cost in these new market segments.

Read more >