Recent Blog Posts

Virtual Care Technology Transforms Home Healthcare, Empowers Patients

Today, many healthcare organizations are experimenting with and implementing virtual care. Innovation in technology is finally able to address the need to go beyond brick and mortar and deliver “care anywhere” when it is needed. While technology is enabling providers to drive virtual care initiatives that increase quality of care, give patients more access, and improve patient empowerment, one question remains: How secure is the ecosystem to which more and more personal health information is being exposed?

 

Current Technology

 

First, let’s look at where we are currently. Healthcare is one of the most exciting industries today, thanks to digital technology and to industry and governments coming together to address major pain points that have existed for decades. We are finally at a point where many of the “what if we could” ideas of clinicians and patients worldwide can be realized. For example, many providers are driving initiatives around virtual care, including telehealth and remote patient monitoring, leveraging technology that can reside in patients’ homes.

 

In the future, payers may be able to use health IT and device data to drive big data analytics and offer optimal plans to patients in different demographics, based on geographic region, family history, and lifestyle habits. Last, but not least, patients are empowered with the tools, devices, and information to proactively manage their own health in a way that really makes sense: outside the hospital.

 

Wearables and Mobility

 

Simple forms of home monitoring have existed for years; today, however, the market is being disrupted by new form factors for clinical wearables and connectivity solutions, which are easier to use and better able to transfer and provide access to patient data. Smartphones and tablets have become an integral part of people’s lives and can serve as a tool for telehealth, as well as a hub for clinical patient information. This makes the implementation of virtual care much easier, giving patients access to cost-effective options and allowing them to manage their health more proactively.

 

At the same time, this proliferation of devices and data also increases the risk of attack. Any point at which data is collected, used, or stored can be at risk and needs to be secured. If the wearable devices collecting the data are outside the U.S. and the data is being uploaded to a cloud inside the U.S., the use of these wearables can represent a trans-border data flow, which can be a significant concern, especially for countries with strong data protection laws such as those in the EU. We need to be more responsible about how data is captured, transmitted, and protected. At Intel, we provide security solutions, such as fast encryption, that integrate well into the user experience while keeping costs down. We are working with our customers to develop the most effective solutions for data privacy and security.
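The post doesn’t name specific encryption mechanisms, but as a rough sketch of what protecting a captured reading might look like, here is authenticated encryption (AES-256-GCM) using the Python cryptography package; the payload, associated data, and key handling are illustrative only, not any particular product’s design:

```python
# Illustrative sketch only: protecting a wearable reading with authenticated
# encryption (AES-256-GCM). Payload and key handling are hypothetical; a real
# deployment would pull the key from a secure key store, never generate inline.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

reading = b'{"patient": "example-123", "heart_rate": 72}'  # hypothetical payload
nonce = os.urandom(12)  # 96-bit nonce; must be unique per message under a key
ciphertext = aesgcm.encrypt(nonce, reading, b"device-42")  # b"device-42" = AAD

# The receiving service decrypts with the same key, nonce, and associated data;
# tampering with any of the three makes decryption fail.
assert aesgcm.decrypt(nonce, ciphertext, b"device-42") == reading
```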

 

Key Challenges

 

Overall, it is wonderful to see so many healthcare institutions driving virtual care. Care is definitely moving outside traditional venues to new, more natural settings closer to where patients need it. However, this also exposes more patient health information outside the hospital walls and outside the walls of patients’ homes.

 

As such, at Intel, when we design a solution, we build security into our core hardware technology, and this differentiates how users experience security. To have a great experience, the end user should not be subjected to data breaches or other security incidents; solutions need to be smarter about detecting user context and risks and about guiding the user to safer alternatives; and devices need to function reliably and be free of malware.

 

In addition, we are focused on driving consistent security performance across the compute continuum of care.

 

That brings us back to the original question: How secure is the ecosystem? Security will play a key role in ensuring a safe solution that providers, payers, and patients can all rely on. Security will also be key to enabling faster adoption of virtual care. Depending on the types of patient information collected, used, retained, disclosed, or shared, and on how it is stored and disposed of, security can be designed to optimally protect privacy. It is a complex area to address, but given the value of health data, I am hopeful that organizations will start to design their virtual care solutions and ecosystems with security as one of the key pillars.

 

What questions do you have?

 

Kay Eron is General Manager of Health IT & Medical Devices at Intel.

Read more >

The Promise of Genomic Medicine: Are We There Yet?

Next-generation sequencing (NGS), also known as high-throughput sequencing, is the catch-all term used to describe a number of modern sequencing technologies, including Illumina (Solexa), Roche 454, Ion Torrent (Proton/PGM), and SOLiD. These technologies allow us to sequence DNA and RNA much faster and more cheaply than the previously used Sanger sequencing, and they have revolutionized the study of genomics and molecular biology.

 

The cost of genomic sequencing has also come a long way. The first human genome cost roughly $3 billion to sequence; by 2001 the cost was about $100 million per genome, and as of January 2014 it is about $1,000. Compared to Moore’s Law, which observes that computing power doubles every two years, the cost of sequencing a genome is now falling five to ten times annually.
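As a quick sanity check on those figures, here is a back-of-the-envelope sketch (the averaging is mine, not from the post); it shows that even the average decline over 2001–2014 outpaced Moore’s Law, before the recent five-to-ten-fold annual drops:

```python
# Back-of-the-envelope: average annual decline in per-genome sequencing cost
# (2001-2014 figures cited above) versus the Moore's Law pace.
start_cost, end_cost = 100e6, 1_000.0   # ~$100M in 2001, ~$1,000 in early 2014
years = 2014 - 2001

seq_factor = (start_cost / end_cost) ** (1 / years)  # average annual decline
moore_factor = 2 ** (1 / 2)                          # 2x every two years

print(f"sequencing cost fell ~{seq_factor:.1f}x per year on average")  # ~2.4x
print(f"Moore's Law pace is ~{moore_factor:.2f}x per year")            # ~1.41x
```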

 

The issue now is computing power to analyze this data. Newer sequencers are producing four times the data in half the time. Intel® technologies like Xeon® and Xeon Phi™ processors, SSDs, 10/40 GbE networking solutions, Omni-Path fabric interconnect, and Intel Enterprise Edition for Lustre (IEEL), along with partners like Cloudera and Amazon Web Services, are helping to cut the time for secondary analysis from weeks to hours.

 

Genomic information is now catalogued and used to advance precision medicine. For example, genomic information from The Cancer Genome Atlas (TCGA) has led to the development and FDA approval of certain cancer treatments. Currently, there are about 34 FDA-approved targeted therapies, like Gleevec, which treats gastrointestinal stromal tumors by blocking tyrosine kinase enzymes. Though first approved by the FDA in 2001, its approval was extended to cover 10 more types of cancer in 2011.

 

Technical Challenges

 

Sequencers are now producing four times more data in 50 percent less time, at about 0.5 TB/device/day. This is a lot of data. Newer modalities like 4-D imaging are now producing 2 TB/device/day. The majority of the software used for informatics and analytics is open source, and the market is very fragmented.
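To put those rates in perspective, here is a quick back-of-the-envelope calculation (the fleet size is a hypothetical lab, not a figure from the post):

```python
# Back-of-the-envelope: annual raw output of a hypothetical sequencing fleet.
tb_per_device_per_day = 0.5   # rate cited above
devices = 20                  # hypothetical fleet size
annual_tb = tb_per_device_per_day * devices * 365

print(f"{annual_tb:,.0f} TB/year, ~{annual_tb / 1000:.1f} PB")  # 3,650 TB ≈ 3.7 PB
```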

 

Once the data is generated, the burden of storing, managing, sharing, ingesting, and moving it has its own set of challenges.

 

Innovation in algorithms and techniques is outpacing what IT can support, thus requiring flexibility and agility in infrastructures.

 

Collaboration across international boundaries is an absolute necessity and that introduces challenges with security and access rights.

 

Finally, as genomics makes its way into clinics, regulations like HIPAA will apply.

 

At the clinical level, you have barriers around the conservation and validity of the sample, the validity and repeatability of laboratory results, the novelty and interpretation of biomarkers, merging genomic data with clinical data, actionability, and eventually changing the healthcare delivery paradigm.

 

There are too few clinical specialists and key healthcare professionals, like pharmacists, who are trained in clinical genomics. New clinical pathways and guidelines will have to be created. Systems will need to be put in place to increase the transparency and accountability of the different stakeholders in genomic data usage. Equality and justice need to be ensured, and protection against discrimination needs to be put in place (as with GINA, the Genetic Information Nondiscrimination Act).

 

Reimbursement methods need to consider flexible pricing for tailored therapeutic responses, along with standardization and harmonization (e.g., CPT codes).

 

Path Forward

 

Looking ahead, we need to develop standardized genetic terminology (HL7, GA4GH, eMERGE) and make sure EHRs support the ability to browse sequenced data. We will also need standards around communicating, querying, storing, and compressing large volumes of sequence data while interfacing with the identifiable patient information in EHRs.
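As a rough illustration of where standardization efforts like HL7 are heading, here is a minimal sketch of a FHIR-style Observation resource carrying a genomic result. The field layout follows FHIR’s Observation, but the codes, patient reference, and value shown are illustrative placeholders, not a clinical profile:

```python
# Minimal sketch of an HL7 FHIR-style Observation for a genomic result.
# Structure follows FHIR's Observation resource; the specific codes,
# patient ID, and value are illustrative placeholders.
import json

variant_observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "69548-6",  # LOINC code for genetic variant assessment
            "display": "Genetic variant assessment",
        }]
    },
    "subject": {"reference": "Patient/example-123"},  # hypothetical patient
    "valueCodeableConcept": {"text": "Present"},
}

# A standardized payload like this is what would let an EHR query, store,
# and exchange sequencing results alongside identifiable patient data.
print(json.dumps(variant_observation, indent=2))
```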

 


 

Intel is partnering with Intermountain Healthcare to create a new set of Clinical Decision Support (CDS) applications by combining clinical, genomic, and family health history data. The goal is to promote widespread use of CDS tools that help clinicians assess risk and assist genetic counselors in ordering genetic tests.

 

The solution will be agnostic to data collection tools, scale to different clinical domains and other healthcare institutions, be standards-based where standards exist, work across all EHRs, leverage state-of-the-art technologies, and be flexible enough to incorporate other data sources (e.g., imaging data, personal device data).

 

What questions do you have?

 

Ketan Paranjape is General Manager of Life Sciences at Intel Corporation.

Read more >

Intel and Spanish Society of Family Medicine & Community Collaborate to Create Tablet with App Store Exclusively for Doctors



Improving care for patients is a common goal for our healthcare team and partners, so I’m really excited to be able to share the outcome of a collaborative project we’ve been working on with the Spanish Society of Family Medicine and Community (semFYC).

 

Together we have created a tablet featuring an app store exclusively for doctors. Meeting the needs of healthcare professionals with an easy-to-use mobile device, combined with medical applications endorsed by a scientifically recognised body like semFYC, is incredibly exciting for all involved and a step-change in the way GPs and physicians access the latest clinical information.

 

Josep Basora, President of semFYC, spoke to me about the tablet and app store created in partnership with Intel: “When I started to drive this project, I wanted to make the right information available in the right place at the right time, to authorised users. Mobility is one of the keys that defines the work of today’s healthcare professional.”

 

“For a physician, the possibility of using applications that have the endorsement of a scientific society such as semFYC has real significance, as it gives full assurance that the tool being used is supported by rigorous governance. This has certainly had a positive effect on both resource optimisation and improvement of patient service.”

 

semFYC brings together more than 17 Societies of Family Medicine and Community in Spain covering a total of 19,500 GPs with a focus on improving the knowledge and skills of its members. The app store, which exclusively features medical applications, automatically updates installed apps with the latest information around procedures and drugs, thus reducing the time GPs require to update their knowledge and consequently increasing the quality of patient care.

 

Take a look at the video above to find out more about the tablet and health app store created by Intel with semFYC.

 

Read more >

ODCA Cloud Adoption Survey: Trends in Public and Private Cloud Usage

In addition to my work in the big data area, I have done and continue to do work with cloud computing. Intel is a member of the Open Data Center Alliance (ODCA), an independent consortium of global IT leaders building and recommending a unified customer vision for data center requirements. Late in 2014, the ODCA sent out a survey to its members on cloud adoption. I answered that survey, and the ODCA has recently published the results. One finding is that there seems to be a strong preference for internal cloud solutions among ODCA members. Why is that? Is public cloud adoption really slowing? What are the key issues with both public and private cloud adoption?

 

Overall, cloud adoption among ODCA members is on the increase for both public and private cloud, although private cloud is increasing at a much faster rate. The ODCA survey highlighted the following top concerns: data security, regulatory issues, service reliability, and vendor lock-in. For the public cloud, data security and regulatory issues are probably the highest priorities. Intel IT has created a cloud brokering function for deciding whether to land an application externally in the public cloud or in the internal cloud. This function makes a decision based on factors like security requirements, control, and location.
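The post doesn’t describe how Intel IT’s brokering function is actually implemented; as a hedged sketch under my own assumptions, a simple policy-based broker might score each application against the factors named above (the factor names, thresholds, and rules here are entirely hypothetical):

```python
# Hypothetical cloud-brokering sketch based on the factors named above
# (security requirements, control, location). All rules are illustrative.
from dataclasses import dataclass

@dataclass
class Workload:
    security_level: int        # 1 (public data) .. 5 (regulated/sensitive)
    needs_tight_control: bool  # e.g., custom hardware or strict SLAs
    residency_satisfied: bool  # a public provider meets location constraints

def place(w: Workload) -> str:
    """Decide whether an application lands internally or in the public cloud."""
    if w.security_level >= 4 or w.needs_tight_control or not w.residency_satisfied:
        return "internal cloud"
    return "public cloud"

print(place(Workload(security_level=2, needs_tight_control=False,
                     residency_satisfied=True)))  # -> public cloud
```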


 

A co-worker pointed out to me that the report seems IaaS-centric and that SaaS in the public cloud is likely to grow. I would agree, and the report also mentions this. Opportunistically adopting SaaS solutions was part of Intel IT’s original cloud strategy, and today I see public SaaS adoption continuing to move ahead within Intel. The survey also points out key areas of interest to ODCA members, such as software-defined networking (SDN) and hybrid cloud. SDN is also an area of focus for Intel IT, while moving to hybrid cloud has been a strategic goal.

 

A few other highlights that I didn’t cover:

  • Since 2012, the number of respondents who have greater than 60% of their operations in an internal cloud has increased from 10% to 24%.
  • Organizations project both their internal and public cloud usage to double by 2016.
  • More than 80% of survey respondents are using or planning to use hybrid cloud solutions at some point in the future.

You can see the details by downloading the report.

Read more >

5 Drivers Fueling Growth in Consumer Health

A perfect storm of market conditions is forming that will likely propel consumer health near the top of many enterprise priority lists and justify its estimated 40 percent CAGR in 2015.

Intel has been the driving force behind the global technology revolution for more than 40 years, and we’ve seen the dramatic impact of technology on healthcare. Looking ahead, here are the five drivers that we see fueling growth in consumer health:

Payment Reform

 

One of the most important conditions is payment reform. As the basis for reimbursement shifts away from fee-for-service and toward quality-based outcomes in the U.S., providers will extend the continuum of care far beyond their hospitals to more accurately quantify value after discharge.

Data

 

One of the best ways to optimize care and demonstrate effectiveness is to take a holistic approach to understanding a person’s status, continuously deriving actionable data about that individual from multiple sources — including consumer devices.


Consumer Involvement

 

Consumer empowerment is also going to play a large role. It began with the shift from a business model that was traditionally B2B to one that was more B2C as commercial health insurers positioned themselves to personally engage millions of newly eligible customers. Now, consumer health solutions enable all payer organizations — private, public, employer — to promote healthy behaviors and timely preventative care that has been shown to reduce the occurrence of costly acute emergencies. Ultimately, consumers will have the ability to be more active in managing their own care, with the expectation of access to more of their health information anytime.


Baby Boomers

 

A demographic shift is also fueling this growth. Every day, 10,000 baby boomers celebrate their 65th birthday in the U.S., and that trend will continue until at least 2019. Unfortunately, 90 percent of them are managing at least one chronic medical condition, in some cases with help from family caregivers (worldwide, some 860 million people live with chronic conditions). As telehealth becomes more widely adopted (and reimbursed), remote doctor consultations will increasingly rely on consumer health technologies to improve chronic disease management and ease the stress on a limited pool of primary care physicians.

 

Worldwide Approach

 

Many fast-growing emerging global markets, like China and India, are exhibiting strong appetites for consumer health solutions that can add value while supplementing recent government efforts to provide more efficient virtual care to their significant aging and rural populations. As more technology vendors from the region offer innovative products at very competitive price points, access and adoption will continue to climb at a healthy pace, contributing to notable growth of the consumer health market segment regionally and worldwide.

 

Of course, one of the biggest hurdles to overcome is alignment of priorities for all major stakeholders. You need a consumer-centered design, an evaluation of clinical workflow integration, and a way to measure the business impact of the goals.

 

What questions do you have? What other drivers do you see impacting consumer health?

 

Michael Jackson is General Manager, Consumer Health at Intel Corporation.

Read more >

Barriers removed – simplified and accelerated firmware development

The spectrum of smart, connected devices is rapidly expanding from phones, tablets and e-readers to phablets, smart watches, and even robotic drones. Tens of billions of devices are expected by the end of the decade, and manufacturers must bring new … Read more >

The post Barriers removed – simplified and accelerated firmware development appeared first on Intel Software and Services.

Read more >

Turning Raw Data into Smart Insights

We are creating data at an exponential rate. Yet the data growth rate is not the biggest challenge for IT. The biggest challenge is that the need for useful information is growing faster than the data itself, creating a perfect storm for IT professionals and a business imperative to turn raw data into smart insights. To understand this challenge, I’d like to explore the history and evolution of big data complexity.

 

In 2001, Gartner analyst Doug Laney described the initial challenges of big data in his 3Vs model. As time progressed, others embraced the 3Vs model and incorporated two more areas of emphasis for big data analytics: veracity and value.

 

Ben Rossi does a nice job of distilling these five terms into one: smart data. “The purpose of smart data (veracity and value) is to filter out the noise and retain only the valuable data, which can be effectively used by the enterprise decision makers to solve business problems.”


Actionable Insights

 

Today, big data technology unfortunately isn’t meeting the needs of most businesses, for two reasons. First, we focus on the types of data rather than on the use case: business insights. Second, we don’t look far enough ahead in our use cases, solving yesterday’s challenges rather than tomorrow’s. Michael Wu, chief scientist at Lithium Technologies, states that we are on a “maturity journey” when it comes to analytics and data visualizations. Understanding this evolution will help us better architect IT solutions today to extract information and develop actionable insights for business decision makers.

 

There are three levels of analytics maturity that describe this progression:

 

  • Descriptive analytics (what happened): A summary report of historical data, usually seen in a dashboard. Most enterprise analytics today fall into this category. An example includes a report of business data offering insights into an organization’s financials, sales, or inventory.
  • Predictive analytics (what will happen): Makes predictions based on information that’s already available. An example includes financial services more accurately predicting future stock performance (noting that historical performance is not an indicator of future results).
  • Prescriptive analytics (what you should do today): Analytics that not only predict the future but also deliver insights that allow you to decide today what path to take to optimize your results. Google’s self-driving car is an example of prescriptive analytics, since the car must make decisions based on predictions of future outcomes. This is the use case that business leaders in a variety of industries are seeking and what’s driving the need for big data analytics. (A minimal illustration of the three levels follows below.)
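Here is that illustration: a toy Python sketch of the three levels against a made-up daily-sales series. The data and the stocking rule are invented; real predictive models would of course be far richer:

```python
# Toy sketch of the three maturity levels on a made-up daily-sales series.
sales = [100, 110, 120, 130, 140, 150]  # hypothetical history

# Descriptive: summarize what happened.
average = sum(sales) / len(sales)

# Predictive: fit a least-squares trend line and project the next day.
n = len(sales)
x_mean, y_mean = (n - 1) / 2, average
slope = (sum((x - x_mean) * (y - y_mean) for x, y in enumerate(sales))
         / sum((x - x_mean) ** 2 for x in range(n)))
forecast = y_mean + slope * (n - x_mean)

# Prescriptive: turn the prediction into an action (rule is invented).
action = "increase stock" if forecast > sales[-1] else "hold stock"

print(f"descriptive avg={average:.0f}, predictive next={forecast:.0f}, "
      f"prescriptive: {action}")  # avg=125, next=160, increase stock
```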

 



Rossi’s concept of smart data enables intelligent insights when evaluated with a focus on prescriptive analytics. Wu summarizes it nicely: “Big data technology won’t help you make bigger decisions … yet smart data can certainly help you make smarter decisions.”


Intel and Big Data Innovation

 

Extracting insights fast enough to support real-time business processes and decisions is critical, and companies are gathering, storing, and analyzing data they were never able to before. Intel understands the challenges and complexities facing IT professionals regarding the need to deliver high performance, cost-efficient big data solutions on a scalable, secure architecture.

 

As a result, Intel has joined forces with many industry leaders to enable enterprise solutions. SAP HANA, SAP Data Services, and SAP Business Objects provide solutions for real-time big data analytics using the Intel Distribution for Apache Hadoop software. Through these platforms, businesses can combine the performance of analytics with the scalability of Apache Hadoop, enabling a real-time analytics platform made to store, integrate, and analyze all business data.

 

Late last year, our CEO discussed a new Intel collaboration and equity investment with Cloudera aimed at bringing an enterprise-ready platform to the mainstream for impactful big data solutions.


The Big Data Maturity Journey

 

In summary, raw data is only useful when it is turned into context-specific relevance, insights, and value for business operations. Smart data empowers the decision-making process by using analytics to achieve results that make sense to humans, not just machines. By making information actionable, we can make profitable decisions and solve problems in the process — and those are smart insights.

 

Join this conversation by connecting with me on Twitter at @chris_p_intel or by using #ITCenter.

Read more >

Bring Your Own Device in EMEA – Part 3 – It’s Not Just About Devices

I like to think of security as a chain, and like any other chain it is only as strong as its weakest link. In the case of security in healthcare, the chain consists of the network, the server, and the device. Often the focus is placed overwhelmingly on the security of the device, but I argue that data is equally, if not more, at risk in transit as it is at rest. So, with that in mind, I wanted to take a look at some of the wider security considerations around Bring Your Own Device (BYOD).


Whenever I speak at events about security and healthcare my starting point is often that we must remember that the priority for healthcare professionals is patient care. Security cannot, and must not, compromise usability as we know this drives workarounds. Often these workarounds mean using personal devices in conjunction with what is more commonly known as ‘Bring Your Own Cloud’.


Bring Your Own Cloud

Bring Your Own Cloud (BYOC) primarily refers to the use of clouds that are not authorized by the healthcare organization to convey sensitive data. This often occurs when an individual uses an app downloaded onto a personal device. Many such apps have backend clouds as part of their overall solution; when sensitive data is entered into the app, it gets synced to the cloud. Furthermore, this transfer can occur over networks that are not managed by the healthcare organization, making it invisible to the organization. Of course, sensitive data in an unauthorized cloud can constitute a breach. In many cases these third-party clouds are in different countries, making the transfer a trans-border data flow that can raise further non-compliance issues with data protection laws.


For example, imagine a nurse taking patient notes that need to be sent to a specialist such as a cardiologist. This should be done using a secure device on a secure wireless network, with a secure solution approved by the organization for such a task. However, lack of usability, cumbersome security around such solutions, or a slow or overly restrictive IT department can drive the use of a BYOC approach instead, in which the nurse uses a personal app on a personal mobile device together with unencrypted email, a file transfer app, or social media to send the notes to a specialist for analysis.


This introduces risks to both the confidentiality of the sensitive healthcare data and the integrity of the patient record, which is often not updated with information traveling in these “side clouds”, rendering it incomplete, inaccurate, or out of date. At best this can result in suboptimal healthcare; at worst it could be a patient safety issue. The consequences of such risks for both patient and organisation can be severe. Here at Intel we have security solutions available to healthcare organisations that ensure data is always secure, whether at rest or in transit, on the device or on the organisation’s network. Our security solutions also use hardware-enhanced security to maximize performance and usability, mitigating the risk that cumbersome security drives the healthcare worker to resort to workarounds and BYOC.


Apps for Healthcare

One area where I’m seeing a lot of rapid change is in the development of apps for healthcare. I recently spoke to the Apps Alliance on the security challenges for developers of healthcare apps, whether they are aimed at healthcare professionals or consumers. These apps often make the recording and analysing of health information very easy and in some cases they can enhance the relationship between patient and clinician.


Stealth IT

I’d also like to take a brief look at what is often referred to as ‘Stealth IT’, also called ‘Shadow IT’. As with any form of workaround, the use of Stealth IT can be driven by an unresponsive or overly restrictive corporate IT department. One obvious example would be a small team of researchers requiring additional server space to store data but perceiving the organisational process for providing such resources as slow and expensive. The consequence is the purchase of comparatively cheap and accessible server space from any number of easy-to-find companies on the web. I remind you of my earlier comments about knowing exactly how secure the server is and in which country or continent it sits.


I like to think that a healthcare organisation looking to put a Bring Your Own Device policy in place appreciates both the benefits and the risks, but starts by understanding why a healthcare professional uses their own device, logs on to an unsecured network, or purchases unauthorised server space. Only then will the organisation, healthcare worker and patient truly reap the benefits of BYOD.

 

 

David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@davidhoulding)

Check out his previous posts

Read more >

Harnessing the Internet of Things for Proactive Home Energy Management

Excuse the pun, but here in the UK, energy prices are hot news.

 

According to the Department of Energy and Climate Change, the prices of combined domestic gas and electricity bills have increased by 33 per cent in real terms since 2007. And, despite wholesale gas prices falling rapidly over the past 12 months, energy providers have been criticised for being slow to pass these savings on to customers. Of course, it’s not that simple. Energy providers have to buy their wholesale gas supplies months in advance and have their own costs to cover, meaning it’s not always possible to pass on these savings immediately and in full. Both parties are feeling the squeeze – not only in the UK, but all across Europe.


 

So what can be done to shake up the energy market and improve the situation for all?

 

Well, French company IJENKO thinks the answer lies in the Internet of Things (IoT).

 

“We have only reached the tip of the iceberg when it comes to realizing the positive impact of the Internet on our digital home life, in particular with regard to energy,” explains Serge Subiron, CEO and co-founder at IJENKO. “The Internet of Energy (IoE), as it is known, has the potential to connect the activities of utility providers and consumers in real time, enabling much more dynamic energy provision and consumption.”

 

IJENKO’s Home Management Solution allows energy suppliers to empower customers to become more efficient in their use of energy, enabling them to save on their energy bills and collectively influence the demand curve.

 

Remote Monitoring

 

Imagine being able to use your smartphone to check how much energy your heating system is consuming, and what this equates to in monetary terms. Imagine then that you’re able to remotely turn down your thermostat a degree or two from wherever you are. This greater visibility and control, made possible by the IoT, allows for much more proactive, dynamic and efficient use of energy, not to mention lower bills if that is your end goal.
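The post doesn’t say which protocols IJENKO’s solution uses; purely as an illustrative sketch, a connected thermostat behind a home gateway might accept a new setpoint over a lightweight publish/subscribe protocol such as MQTT. The broker address, topic name, and payload schema below are hypothetical:

```python
# Hypothetical sketch: lowering a thermostat setpoint over MQTT.
# Broker host, topic name, and payload schema are invented for illustration.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("gateway.example.com", 1883)  # hypothetical home gateway/broker

# A gateway subscribed to this topic would apply the new setpoint.
client.publish("home/livingroom/thermostat/set",
               json.dumps({"setpoint_c": 19.5}), qos=1)
client.disconnect()
```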

 

And these solutions are not the stuff of science fiction. They are possible today thanks to IJENKO’s Home Management Solution and the Intel® IoT Gateway, which extracts data from legacy systems around the home and securely connects them to next-generation intelligent infrastructure.

 

Smart Services

 

As well as improving the customer experience, the IJENKO solution presents utility providers with the opportunity to develop innovative smart services based on the interaction of a number of technologies, rather than one overarching standard. By adding real value for the customer, these services offer long-term stickiness and help utility providers stay in control of the customer experience.

 

All in all, it’s a win-win scenario.

 

To continue this conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.

 

Jane Williams
Online Sales Development Manager
Intel Corporation

Read more >

Hardware and Software Solutions that Create a Loosely Coupled Partnership

Demand for efficiency, flexibility, and scalability continues to increase, and the data center must keep pace with the move to digital business strategies. Diane Bryant, Intel’s senior vice president and general manager of Intel’s Data Center Group, recently stated, “We are in the midst of a bold industry transformation as IT evolves from supporting the business to being the business. This transformation and the move to cloud computing calls into question many of the fundamental principles of data center architecture.”

 

Those “fundamental principles of data center architecture” are on a collision course with the direction virtualization has led us. Virtualization, in conjunction with automation and orchestration, is leading us to the Software Defined Infrastructure (SDI). The demand for SDI is driving new hardware developments, which will open a whole new world of possibilities for running a state-of-the-art data center and eventually leave our legacy infrastructure behind. While we’re not quite there yet, as different stages need to mature, the process has the power to transform the data center.

 


 

Logical Infrastructure

 

SDI rebuilds the data center into a landing zone for new business capabilities. Instead of comprising multiple highly specialized components, it is a cohesive and comprehensive system that meets all the demands placed on it by highly scalable, completely diversified workloads, from traditional workloads to cloud-aware applications.

 

This movement to cloud-aware applications will drive the need for SDI; by virtualizing and automating the hardware that powers software platforms, infrastructure will become more powerful, cost-effective, and efficient. This migration away from manual upkeep of individual resources will also allow systems, storage, and network administrators to shift their focus to more important tasks instead of acting as “middleware” to connect these platforms.
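As a conceptual sketch of what “virtualizing and automating the hardware” means in practice, SDI platforms typically reconcile a declared, desired state against the observed state and act on the difference. The resource names, quantities, and provisioning actions below are hypothetical placeholders, not any particular platform’s API:

```python
# Conceptual sketch of SDI's declare-and-reconcile model. Resource names,
# quantities, and the provisioning actions are hypothetical placeholders.
desired = {"web-tier-vms": 8, "db-storage-tb": 20}   # declared by the operator
observed = {"web-tier-vms": 5, "db-storage-tb": 20}  # reported by telemetry

def reconcile(desired: dict, observed: dict) -> None:
    """Drive the infrastructure toward the declared state automatically."""
    for resource, want in desired.items():
        have = observed.get(resource, 0)
        if have < want:
            print(f"provision {want - have} x {resource}")  # call platform API
        elif have > want:
            print(f"reclaim {have - want} x {resource}")
        else:
            print(f"{resource}: already in desired state")

reconcile(desired, observed)
# provision 3 x web-tier-vms
# db-storage-tb: already in desired state
```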

 

Organizations will be able to scale their infrastructure in support of the new business services and products, and bring them to market much more quickly with the power of SDI.

 

Hardware Still Matters

 

As the data center moves toward an SDI-driven future, CIOs should be cautious about thinking that hardware no longer counts. Hardware will be critical: it must work in conjunction with the software to ensure that the security and reliability of workloads are fully managed, and it must provide the telemetry and extensibility that allow specific capabilities to be optimized and controlled within the hardware.

 

The Future of the Data Center Lies with SDI

 

Data centers must be agile, flexible, and efficient in this era of transformative IT. SDI allows us to achieve greater efficiency and agility by allocating resources according to our organizational needs, application requirements, and infrastructure capabilities.

 

As Bryant concluded, “Anyone in our industry trying to cling to the legacy world will be left behind. We see the move to cloud services and software defined infrastructure as a tremendous opportunity and we are seizing this opportunity.”

 

To continue the conversation, please follow me on Twitter at @EdLGoldman or use #ITCenter.

Read more >

Hitting the Tradeshow Circuit with Red Rock Canyon – Intel’s New 100G Ethernet Technology

I had the opportunity to attend Mobile World Congress (MWC) and the Open Compute Project (OCP) Summit this year, and we demonstrated Red Rock Canyon (RRC) at both venues. We first disclosed RRC at Fall IDF in San Francisco last year. RRC is Intel’s new multi-host Ethernet controller silicon with integrated Ethernet switching resources.

The device contains multiple integrated PCIe interfaces along with Ethernet ports that can operate at up to 100G. The target markets include network appliances and rack scale architecture, which is why MWC and the OCP Summit were ideal venues to demonstrate the performance of RRC in these applications.

 

Mobile World Congress

This was my first time at MWC, and it was an eye-opener: eight large exhibit halls in the middle of Barcelona with moving walkways to shuffle you from one hall to the next, booths the size of two-story buildings, and 93,000 attendees – a record number according to the MWC website.

At the Intel booth, ours was one of several demonstrations of technology for network infrastructure. Our demo was entitled “40G/100GbE NSH Service Chaining in Intel ONP” and highlighted service function forwarding using network services headers (NSH) on both the Intel XL710 40GbE controller and the Intel Ethernet 100Gbps DSI adapter that uses RRC switch silicon. In case you’re not familiar with NSH, it’s a new virtual network overlay industry initiative driven by Cisco, which allows flows to be identified and forwarded to a set of network functions by creating a virtual network on top of the underlying physical network.

The demo was a collaboration with Cisco. It uses an RRC NIC as a 100GbE traffic generator to send traffic to an Intel Sunrise Trail server, which receives the traffic at 100Gbps using another RRC 100GbE NIC. Sunrise Trail then forwards 40Gbps worth of traffic to a Cisco switch, which, in turn, distributes the traffic to both another Sunrise Trail server and a Cisco UCS server, both of which contain Intel® Ethernet XL710 Converged Network Adapters.

The main point of the demonstration is that the RRC NIC, the XL710 NIC, and the Cisco switch can create a wire-speed service chain by forwarding traffic using destination information in the NSH header. For NFV applications, the NIC cards can also forward traffic to the correct VM based on this NSH information.
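To make “destination information in the NSH header” concrete, here is a minimal sketch of the service path header defined in the IETF NSH draft (a 24-bit Service Path Identifier plus an 8-bit Service Index) and how a device might map it to a next hop. The chain table and values are invented, and the base header is omitted for brevity:

```python
# Minimal sketch of NSH's service path header (per the IETF NSH draft):
# a 24-bit Service Path Identifier (SPI) and an 8-bit Service Index (SI).
# The base header is omitted and the chain table is invented for illustration.
import struct

def pack_service_path(spi: int, si: int) -> bytes:
    """Pack SPI (24 bits) and SI (8 bits) into the 4-byte service path header."""
    return struct.pack("!I", ((spi & 0xFFFFFF) << 8) | (si & 0xFF))

def unpack_service_path(header: bytes) -> tuple:
    (word,) = struct.unpack("!I", header)
    return word >> 8, word & 0xFF

# Each (SPI, SI) pair identifies the next function in the chain; a service
# function decrements SI before forwarding the packet onward.
chain = {(42, 255): "firewall-vm", (42, 254): "load-balancer-vm"}

spi, si = unpack_service_path(pack_service_path(spi=42, si=255))
print(f"SPI={spi} SI={si} -> forward to {chain[(spi, si)]}")  # firewall-vm
```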

Network Function Virtualization (NFV) was a hot topic at MWC this year, and we had many customers from leading network service providers and OEMs come by our booth to see the demo. In some cases they were more interested in our 100GbE link, which I was told was one of the only demos of this kind at the show.

Another 100G Intel Ethernet demo was at the Ericsson booth where they announced their project Athena, which demonstrated a 100GbE link using two RRC-based NIC cards. Athena is designed for hyperscale cloud data centers using Intel’s rack scale architecture framework.

 

Open Compute Project Summit

The very next week, I traveled to San Jose to attend the Open Compute Project Summit where RRC was part of a demonstration of Intel’s latest software development platform for its rack scale architecture. OCP was a much smaller show focused on the optimization of rack architectures for hyperscale data centers. At last year’s conference, we demonstrated an RSA switch module using our Intel Ethernet Switch FM6000 along with four Intel 10GbE controller chips.

This year, we showed our new multi-host RSA module that effectively integrates all of these components into a single device while at the same time providing 50Gbps of bandwidth to each server along with multiple 100GbE ports out of the server shelf. This RSA networking topology not only provides a 4:1 cable reduction, it also enables flexible network topologies. We also demonstrated our new open source ONP Linux kernel driver, which will be upstreamed in 2015, consistent with our Open Network Platform strategy.

We had a steady stream of visitors to our booth, thanks in part to an excellent bandwidth performance demo.

Having first disclosed RRC at IDF last year, we were pleased to give three demonstrations of its high-performance capabilities across MWC and the OCP Summit. It doesn’t hurt that these conferences are also targeted at two key market segments for RRC: network function virtualization and rack scale architecture.

We plan to officially launch RRC later this year, so stay tuned for much more information on how RRC can improve performance and/or reduce cost in these new market segments.

Read more >