Recent Blog Posts

Intel IoT Ecosystem Drives Transaction Innovation at Transact 2016

As the Internet of Things becomes a reality, Intel IoT is leading the industry in transforming and securing transactions. This was especially clear at Transact 2016 in Las Vegas, produced by the Electronic Transactions Association, the world’s largest payments industry … Read more >

The post Intel IoT Ecosystem Drives Transaction Innovation at Transact 2016 appeared first on IoT@Intel.

Read more >

5 Questions for Mark Caulfield, Chief Scientist, Genomics England

Mark Caulfield, FMedSci, is chief scientist and a board member at Genomics England, an organization that provides investment and leadership to increase genomic testing, research, and awareness. Caulfield is also director of the William Harvey Research Institute and was elected to the Academy of Medical Sciences in 2008. His particular areas of research are cardiovascular genomics and translational cardiovascular research and pharmacology. We recently sat down with him to discuss genomic sequencing and to get insight into a current research project.


Intel: What is the most exciting project you’re working on right now?


Caulfield: The 100,000 Genomes Project is a healthcare transformation program that reads through the entire DNA code using whole genome sequencing. That’s 3.3 billion letters that make you the individual you are. It gives insight into what talents you have as well as what makes you susceptible to disease. My research is focused on infectious disease, cancer, and rare inherited diseases. Technology can bring answers that are usable in the health system now across our 13 centers.


When studying rare disease, the optimal unit is a mother, a father, and an affected offspring. Including both parents allows the researcher to filter out rare variations in the genetic code that are unrelated to the disease, focusing in on a precise group. This project will result in more specific diagnoses for patients, a better understanding of disease, biological insights that may pave the way for new therapies, and a clearer picture of the journey of patients with cancer, rare disease, and infection.


Intel: How does this project benefit patients?


Caulfield: By building a picture of the entirety of the genome, or as much as we can read today, which is about 97.6 percent of your genome, we have a more comprehensive picture and a far greater chance of deriving healthcare benefits for patients. Cancer is essentially a disease of a disordered genome. With genomic sequencing, we can gain insights into what drove the tumor to occur in the first place, what drives its relapse, what drives its spread, and other outcomes. Most importantly, we can understand what drives response to therapy. We already have good examples of where cancer genotyping is making a real difference to therapy for patients.


Intel: What is the biggest hurdle?


Caulfield: Informed consent is essential to the future application of the 100,000 Genomes Project. It’s very hard to guarantee that you can absolutely secure data. I think it’s the responsibility of all medical professionals like myself in this age to be upfront about the risks to data access. Most patients understand these risks. We try to keep patient data as secure as is reasonably possible within present technological bounds.


Intel: What is crucial to the success of genomic sequencing?


Caulfield: We need big data partners and people who know how to analyze a large amount of data. We also need commercial partners that will allow us to get new medicines to patients as quickly as possible. That partnership, if articulated properly, is well received by people. Once we have this established, we can make strides in gaining and keeping public and patient trust, which is crucial to the success of genomic sequencing.


If you want public trust, you must fully inform patients about the plan, ensure their medical professionals understand that plan, and bring patients into the conversation. This allows patients and the public to shape your work. Sometimes in medicine we become a little remote from what the patient wants when, in actuality, this is their money. It should be their program, not mine.


Intel: What goal should researchers focus on?


Caulfield: With this large amount of data comes the need to process it as quickly as possible in order to provide helpful results for both the patient and the care team. Intel’s All In One Day initiative is an important goal because it accelerates the time from when a person enrolls in such a program to when they receive a diagnostic answer.


The goal is to get the turnaround as fast as possible. For example, if a patient has cancer, that person may have an operation where the cancer is removed. The patient would then need to heal. If chemotherapy were needed, it would be important to start it as quickly as possible. We have to use the best technology available so we can shrink the time from enrollment to answer.

Read more >

All In One Day by 2020 – A Progress Check


All In One Day by 2020 – the phrase encompasses our real ambition here at Intel to empower researchers to give clinicians the information they need to deliver a targeted treatment plan for patients in just one 24-hour period. I wanted to provide you with some insight into where we are today and what’s driving forward the journey to All In One Day by 2020.


Genomics Code Optimization


We have been working with industry-leading experts, and with commercial and open source authors of key genomic codes, for several years on code optimization to ensure that genome processing runs as fast as possible on Intel®-based systems and clusters. The result is a significant improvement in the speed of key genomic programs, which will help get sequencing and processing down to minutes. For example:


  • Intel has sped up the PairHMM kernel, a key piece of the HaplotypeCaller in GATK, by 970x, for an overall 1.8x increase in pipeline performance;
  • File compression for genomics files such as BAM and SAM has been accelerated by over 4x;
  • Python has been accelerated using Intel’s Math Kernel Library (MKL), producing a 15x speedup on a 16-core Haswell CPU;
  • Finally, the enhanced MKL, in conjunction with Intel’s Data Analytics Acceleration Library (DAAL), has made DAAL 100x faster than R for k-means clustering and 35x faster than Weka on Apriori.
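The DAAL k-means figure above refers to standard clustering of the kind used across genomics and analytics pipelines. As a rough illustration of the workload being accelerated, and emphatically not the DAAL implementation, here is a minimal Lloyd’s-style k-means in plain NumPy:

```python
import numpy as np

# A minimal Lloyd's k-means in plain NumPy -- an illustrative sketch of
# the workload that optimized libraries like DAAL accelerate, not the
# DAAL implementation itself.
def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # initialize centers by picking k distinct data points
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute each center as the mean of its assigned points
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

# Two well-separated synthetic blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(5, 0.1, (50, 2))])
centers, labels = kmeans(X, k=2)
```

Optimized implementations follow the same loop but vectorize the distance computation across cores and cache-friendly memory layouts, which is where the quoted speedups come from.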


You can find out more about Intel’s work in code optimization at our dedicated Optimized Genomics Code webpage.


Scalability for Success


As we see an explosion in the volume of available data, the importance of being able to scale a high-performance computing system becomes ever more critical to accelerating success. We have put forth the Intel® Scalable System Framework to guide the market on the optimal construction of an HPC solution that is multi-purpose, expandable, and scalable.


Combining the Scalable System Framework with optimized life sciences codes has resulted in a new, more flexible, scalable, and performant architecture. This reduces the need for purpose-built systems and instead offers an architecture that can span a variety of diverse workloads while offering increased performance.


Another key element of an architecture is the balance between three key factors: compute, storage, and fabric. Today we see the fruits of our work coming to life, for example, in a brilliant collaboration between TGen, Dell, and Intel which cut TGen’s RNA-Seq pipeline from 7 days to under 4 hours. TGen is successfully operating FDA-approved clinical trials, balancing research with the clinical treatment of pediatric oncology patients.


The intersection of our code optimization efforts and our Scalable System Framework work has also yielded two new products for genomics: one from Dell and another from Qiagen.


From a week to a day


It’s useful, I think, to see just how far we’ve come in the last four years as we look ahead to 2020. In 2012 it took a week to perform the informatics on a whole human genome in a cloud environment, going from raw sequence data to an annotated result. Today, the informatics time has decreased to just 1 day for whole genomes.


With the Dell and Qiagen reference architectures, which are based on optimized code and the Intel® Scalable System Framework, a throughput-based solution has been created. This means that, when fully loaded, these base systems will perform the informatics on ~50 whole genomes per day.


However, it is important to note that the genomes processed on these systems still take ~24 hours each to run; they are simply processed in a highly parallel manner. With a staggered start time of ~30 minutes between samples, a completed genome is produced approximately every 30 minutes. On the sequencing side, Illumina can process a 30x whole human genome in 27 hours using its “rapid-run mode”.


So, in 2016, we can sequence a whole genome and do the informatics processing in just over 2 days (51 hours: 27 hours of sequencing plus 24 hours of informatics). That’s only about a day longer than our ambition of All In One Day by 2020.
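The arithmetic above is easy to check. A quick sketch, using only the figures quoted in this post:

```python
# Back-of-the-envelope check of the numbers quoted above.
sequencing_h = 27      # Illumina rapid-run mode, 30x whole human genome
informatics_h = 24     # informatics time per genome on the reference systems
stagger_h = 0.5        # staggered start time between samples (~30 minutes)

# End-to-end time for a single genome: sequencing plus informatics.
total_h = sequencing_h + informatics_h
print(f"end-to-end for one genome: {total_h} hours")  # 51 hours

# Because runs are staggered, a fully loaded system completes one genome
# roughly every 30 minutes, giving the ~50 genomes/day throughput figure.
genomes_per_day = 24 / stagger_h
print(f"throughput: ~{genomes_per_day:.0f} genomes/day")  # ~48/day
```

The key distinction the sketch makes visible: latency per genome stays at ~51 hours, while throughput scales with how many staggered pipelines run in parallel.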


Three final points to keep in mind:


  1. There are steps in the All In One Day process that are outside of the sequencing and the informatics, such as the doctor’s visit, the sample preparation for sequencing, the genome interpretation, and the dissemination of results to the patient. These steps will add time to the 51 hours above.
  2. The reference architectures are highly scalable, meaning a larger system can do more genomes per day: 4 times the nodes produces 4 times the throughput.
  3. There are enhancements still to be made. For example, streaming the output from the sequencer to the informatics cluster, so that the informatics can start before the sequencing is finished, will further compress the total time towards our All In One Day goal.


I’m confident our ambitions will be realized.


Read more >

Can Zealous Security Cause Harm?


Good security is about balancing Risks, Costs, and Usability.  Too much or too little of each can be unhealthy and lead to unintended consequences.  We are entering an era where the risks of connected technology can exceed the inconveniences of interrupted online services or the release of sensitive data.  Failures can create life-safety issues and major economic impacts.  The modernization of healthcare, critical infrastructure, transportation, and defense industries is beginning to push the boundaries and directly impact people’s safety and prosperity.  Lives will hang in the balance and it is up to the technology providers, users, and organizations to ensure the necessary balance of security is present.


We are all cognizant of the risks in situations where insufficient security opens the door to exposure and the compromise of systems.  Vulnerabilities allow threats to undermine the availability of systems, confidentiality of data, and integrity of transactions.  On the other end of the spectrum, too much security can also cause serious issues.


A recent incident described how a piece of medical equipment crashed during a heart procedure due to an overly aggressive anti-virus scan setting.  The device, a Merge Hemo, is used to supervise heart catheterization procedures, in which doctors insert a catheter into blood vessels to diagnose various types of heart disease.  The module is connected to a PC that runs software to record and display data.  During a recent procedure, the application crashed when the security software began scanning for potential threats.  The patient remained sedated while the system was rebooted before the procedure could be completed.  Although the patient was not harmed, the misconfiguration of the PC security software caused an interruption during an invasive medical procedure.


Security is not an absolute.  There is a direct correlation between the increasing integration of highly connected and empowered devices, and the risks of elevated attack frequency with a greater severity of impacts.  The outcome of this particular situation was fortunate, but we should recognize the emerging risks and prepare to adapt as technology rapidly advances.


Striking a balance is important.  It may not seem intuitive, but yes, too much security can be a problem as well.  Protection is not free.  Benefits come with costs.  Security functions can create performance overhead, reduce productivity, and ruin the user experience.  Additionally, security can increase the overall cost of products and services.  These and other factors can create ripples in complex systems and result in unintended consequences.  We all agree security must be present, but the reality is that there must be an appropriate balance.  The key is to achieve an optimal level by tuning the risk management, cost, and usability aspects for any given environment and usage.




Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.

Read more >

Telemedicine Trends in Latin America

Telemedicine is gaining increased attention worldwide as a solution for improving access to care, improving quality of care, and lowering costs.


Much of Latin America faces a major challenge that could in part be addressed with telemedicine:  a shortage of providers, and large populations living in rural areas where access to physicians—particularly specialists—is lacking.


In my multiple visits to Latin America over the past two years, it is clear that while most countries in the region have used telemedicine to varying extents for many years, scalability remains a major goal.


Governments across Latin America are generally strong advocates of telemedicine, and are investing in the networks and infrastructure that will support this technology.


Below I highlight ways in which countries throughout the region are using or intend to use telemedicine, and what trends we might observe in the years ahead.



Brazil

In Brazil, telemedicine today is used strictly for provider-to-provider consultation, as physicians are not legally allowed to consult with patients over videoconference.


Telemedicine has been largely driven by the need to provide care virtually between specialists in urban centers to patients in remote areas, due to a lack of specialists in the rural areas.


The Brazilian government has long supported the use of telemedicine to provide better access and treatment to remote areas. Since 2006, it has facilitated two public initiatives, the Brazilian National Telehealth Network Program (launched by the MOH) and the RUTE Telemedicine University Network (launched by the Ministry of Science, Technology, and Innovation), both of which serve to deploy telemedicine across Brazil.


One of the first major initiatives started in 2006 in Parintins, a city of 100,000 located in the middle of the Amazon. With no roads to or from the city, the goal was to use telemedicine to enable communication between physicians in Parintins and specialists in Sao Paulo. Parintins partnered with private technology companies, including Intel, to build the necessary infrastructure (e.g., a WiMAX network). This telemedicine program continues to operate today and has informed other telemedicine efforts, including Brazil’s national telehealth program, Telessaude.


Another major initiative in Brazil is bringing intensive care unit (ICU) care to rural areas. The Brazilian MOH has initiated tele-ICU programs, and many hospitals in different regions are now connected to rural parts of the country. These tele-ICUs reduce the need to transport patients into a city for conditions such as heart attacks, strokes, and sepsis. Physicians in urban areas are able to use PTZ cameras to visually inspect the patient, and to collect and interpret vital signs in real-time. Cerner, in partnership with Brazilian companies Intensicare and IMFtec, has provided the technology and software for most of these virtual ICUs.


Mexico, Chile, Peru, and Argentina

In Mexico, the social security network provides healthcare to formal sector workers. The network is currently working with companies such as Lumed Health to expand telemedicine capabilities. In addition, telemedicine is being used between the U.S. and Mexico with health systems such as the Mayo Clinic and Massachusetts General conducting consultations with physicians in Mexico.


In Chile, the Ministry of Health has implemented a “Digital Health Strategy.” Its primary goal is likewise to address provider shortages and improve access to care in rural areas. There are currently several telemedicine projects and proofs of concept underway in Chile. AccuHealth, for example, is a Chilean company that provides tele-monitoring services specifically to bring home care to patients who suffer from chronic conditions. The company plans to expand to Mexico and Colombia in the near term.


In Peru, the government is spearheading efforts to build a fiber-optic network across the entire country. This infrastructure will be used to better support telemedicine services.

In Argentina, the government has worked with the MOH and the Ministry of Federal Planning, Public Investment and Services to promote telemedicine. This collaboration has culminated in the CyberHealth Project, which is focusing on the installation of fiber optics and upgrading hospitals to allow for videoconferencing. It aims to connect 325 healthcare institutions across the nation to enable remote consultations and sharing of expertise.


The Future of Telemedicine in Latin America

Telemedicine is being increasingly recognized as a solution to achieve more with less. In Latin America, it has great potential to address the fact that providers and health care resources are not distributed equally among the urban and rural populations.


The future of telemedicine in the region is promising. Governments are investing in and taking active roles in digitizing their health systems (e.g., implementation of electronic medical records, improving interoperability) along with building the infrastructure required to support telemedicine. The Pan American Health Organization (PAHO) has convened a meeting of the MOH leaders from several Latin American countries to discuss strategic plans for e-Health across the region. This collaboration, where protocols, guidelines, and best practices can be shared, will be increasingly important.


Intel Health & Life Sciences looks forward to continuing its partnerships with public and private entities across Latin America to continue these important efforts.

Read more >


Will Your Cloud be at Risk?

The Cloud is both compelling and alluring, offering benefits that entice many organizations into rapid adoption. The attractiveness of lower operational costs, powering new service offerings, and adaptability to cater to varying demands makes it almost irresistible to rush in. … Read more >

Intel Hosts Historic NSTAC Meeting

By: David Hoffman, Associate General Counsel and Global Privacy Officer This week Intel hosted the National Security Telecommunications Advisory Committee (NSTAC) at our headquarters in Santa Clara, California.  This was the first meeting for the NSTAC in Silicon Valley, which … Read more >

The post Intel Hosts Historic NSTAC Meeting appeared first on Policy@Intel.

Read more >

Nurses Week 2016: Technology To Make Your Job Easier

International Nurses Day is a time to say Thank You Nurses. Thank you for your hard work, thank you for your compassion and thank you for the endless care you give to patients. It’s this unwavering focus on patient care that we must keep in mind when developing and implementing technology for nurses both in the hospital and community. The most valuable technology we can give to nurses is that which is almost invisible to – yet improves – their workflow, simplifies complex tasks and enables them to deliver even better care – in essence, technology must make the job of a nurse easier. I want to take today, International Nurses Day, to highlight a couple of technologies which have the potential to deliver on all of the above.


Nursing goes Digital

I know from experience that the best decisions are made when a nurse has the most accurate and up-to-date information on a patient’s condition. And when that accurate information can be gathered and accessed in an intuitive and more natural interaction using technology it’s a win-win for nurses and patients.


I’m excited by the potential offered by Intel’s RealSense 3D camera, which can be found in a range of devices such as 2-in-1s that are already being used by nurses to record vital signs and access EMRs. For example, imagine being able to accurately track all 22 joints of a hand to assist with post-operative treatment following hand surgery.


For community nurses, mobility is key. Holding the most up-to-date information when visiting patients in the home ensures mistakes are kept to a minimum and all parties involved in the care of the patient, from community nurses to specialist clinician, can make evidence-based decisions. 2-in-1 devices help nurses to stay focused on the patient rather than reams of paperwork, while also helping patients better understand their condition and improving buy-in to treatment plans. The real benefits are in simplifying and speeding up those processes which ensures nurses deliver the best possible care.


Big Data for Nurses

When we think of Big Data it is all too easy to think just about genomics, but there are benefits which can clearly help nurses identify serious illness more quickly too. Take Cerner, for example, which has developed an algorithm that monitors vital information fed in real-time from the EMR. The data is analysed in real-time, and the algorithm identifies with a high degree of accuracy that a patient either is going to get, or already has, sepsis.
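To make the idea concrete, here is a deliberately simplified sketch of that kind of screening logic. This is not Cerner’s algorithm; it is a hypothetical illustration using classic SIRS-style screening criteria on a single set of vitals:

```python
# NOT Cerner's algorithm -- a minimal, hypothetical illustration of the
# idea: screen vital signs fed from the EMR and flag patients who meet
# simple sepsis-screening (SIRS-style) criteria for clinical review.
def sirs_criteria_met(temp_c, heart_rate, resp_rate, wbc_k):
    """Count how many SIRS criteria a set of vitals meets."""
    return sum([
        temp_c > 38.0 or temp_c < 36.0,  # abnormal temperature
        heart_rate > 90,                 # tachycardia
        resp_rate > 20,                  # tachypnea
        wbc_k > 12.0 or wbc_k < 4.0,     # abnormal white cell count (x10^9/L)
    ])

def flag_possible_sepsis(vitals):
    """Flag when two or more criteria are met, prompting clinical review."""
    return sirs_criteria_met(**vitals) >= 2

# Fever plus tachycardia meets two criteria, so this patient is flagged.
patient = {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc_k": 9.5}
print(flag_possible_sepsis(patient))
```

A production system works on continuous streams rather than single readings, and uses far richer models, but the core pattern is the same: rules or models watch the data so a nurse doesn’t have to.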


Clearly, given the speedy nature with which drugs must be administered, this Big Data solution is helping nurses to simply save lives by identifying at-risk patients and getting them the treatment they so desperately need. Watch this video to find out more about how Intel and Cloudera allow Cerner to provide a technology platform which has helped save more than 2,700 lives.


Intelligent Care

The rise of the Internet of Things in the healthcare sector is seeing an increasing use of sensors to help simplify tasks for nurses. For example, if sensors can monitor not only a patient’s vital signs but also track movement, such as frequency of toilet use, it not only frees up a nurse’s time for other tasks but also begins to build an archive of data which can be used at both the patient and population level.


In China the Intel Edison-based uSleepCare intelligent bed is able to record a patient’s vital signs such as rate and depth of breathing, heart-rate and HRV without the need for nurse intervention. There are positive implications for patient safety too, as sensors can track movements and identify when patients might fall out of bed, alerting nurses to the need for attention.

And when I think of moving towards a model of distributed care, this type of intelligent medical device can help the sick and elderly be cared for in the home too. WiFi and, in the future, 5G technologies, combined with sensors can help deliver the right patient information to the right nurse at the right time.


Investing in the Future

Having highlighted two examples of how technology can help nurses do an even better job for patients I think it’s important to recognise that we must also support nurses in using new technology. Solutions must be intuitive and seamlessly fit into existing workflows, but I recognise that training is needed. And training on new technologies should happen right from the start of nursing school and be a fundamental part of ongoing professional development.


While International Nurses Day is, of course, a time to reflect and say Thank You Nurses, I’m also excited about the future too.


Read more >

Achieving the Data Moon Shot: The Importance of Institutionalizing Open Data

By Cisco Minthorn, Director of Government Relations and Senior Counsel, Government and Policy Group The Commerce Data Advisory Council (CDAC) – the federal advisory committee created by U.S. Secretary of Commerce, Penny Pritzker in 2014, to convene experts from industry … Read more >

The post Achieving the Data Moon Shot: The Importance of Institutionalizing Open Data appeared first on Policy@Intel.

Read more >

Key Lessons from the 2016 Verizon Data Breach Incident Report


The annual Data Breach Incident Report (DBIR) is out and reinforcing the value of well-established cybersecurity practices.  The good folks at Verizon Enterprise have once again published one of the most respected annual reports in the security industry, the DBIR. 


The report sets itself apart with the author intentionally avoiding unreliable ‘survey’ data and instead striving to truly communicate what is actually happening across the cybersecurity breach landscape.  The perception of security typically differs greatly from reality, so this analysis provides some of the most relevant lessons for the field.


Report data is aggregated from real incidents that the company’s professional security services have responded to for external customers.  Additionally, a large number of security partners now also contribute data for the highly respected report.  Although this is not comprehensive across the industry, it does provide a unique and highly-valuable viewpoint, anchored in real incident response data.


Many of the findings support long-standing opinions on the greatest cybersecurity weaknesses and best practices.  Which is to say, I found nothing too surprising, and the report reinforces the current directions for good advice.



Key Report Findings

1. Human Weaknesses

30% of phishing messages were opened by their intended victims

12% of those targets took the next step and opened the malicious attachment or web link

2. Ransomware Rises

39% of crime-ware incidents were ransomware

3. Money for Data

95% of data breaches were motivated by financial gain

4. Attackers Sprint, Defenders Crawl

In 93% of data breaches, systems were compromised in minutes

83% of victims took more than a week to detect breaches

5. Most of the Risk is from a Few Vulnerabilities

85% of successful exploit traffic was attributed to the top 10 CVE vulnerabilities.  Although these figures are difficult to quantify and validate, it’s clear that the top vulnerabilities should be prioritized



Key Lessons to Apply

1. Train users.  Users with permissions and trust are still the weakest link.  Phishing continues to be a highly effective way for attackers to leverage poorly trained users into giving them access.

2. Protect financially-valuable data from confidentiality, integrity, and availability attacks.  Expect attacks and be prepared to respond and recover.

3. Speed up detection capabilities.  Defenders must keep pace with attackers.  When preventative controls fail, it is imperative to quickly detect the exploit and maneuver to minimize overall impact.

4. Patch top vulnerabilities in operating systems, applications, and firmware.  Patch quickly or suffer.  It is a race; treat it as such.  Prioritize the work based upon severity ranking.  Serious vulnerabilities should not languish for months or years!
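Lesson 4 in practice is largely a prioritization problem. A minimal sketch of a severity-ranked patch queue (the CVE IDs and scores below are made up for illustration):

```python
# Illustrative sketch of lesson 4: rank open vulnerabilities by severity
# so the worst get patched first. CVE IDs and scores are hypothetical.
vulns = [
    {"cve": "CVE-0000-0001", "cvss": 9.8, "days_open": 12},
    {"cve": "CVE-0000-0002", "cvss": 5.3, "days_open": 400},
    {"cve": "CVE-0000-0003", "cvss": 7.5, "days_open": 90},
]

# Highest severity first; among equal severities, oldest first, so that
# serious vulnerabilities never languish at the bottom of the queue.
patch_queue = sorted(vulns, key=lambda v: (-v["cvss"], -v["days_open"]))

for v in patch_queue:
    print(f'{v["cve"]}: CVSS {v["cvss"]}, open {v["days_open"]} days')
```

Real vulnerability management weighs more signals (exploit availability, asset exposure, business criticality), but the principle is the same: rank by risk, then work the list.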


This is just a quick review.  The report contains much more information and insights.

I recommend reading the Executive Summary or the full DBIR Report.




Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.

Read more >

Intel IoT Champions Industrial Internet of Things at Hannover Messe

At Hannover Messe, the world’s largest industrial and manufacturing trade fair, the Intel IoT ecosystem showcased everything from augmented reality industrial wearables like the DAQRI smart helmet, to Industry 4.0 factory automation and industrial asset management solutions. The event in … Read more >

The post Intel IoT Champions Industrial Internet of Things at Hannover Messe appeared first on IoT@Intel.

Read more >

Tweet Chat Review: The Growth of Connected Care

Last week I had the honor of moderating the weekly #HITsm (Health IT social media) chat on Twitter. This regular discussion about health IT issues is a wonderful forum for addressing what steps need to be taken to move healthcare technology forward on a number of fronts.


The topic of my chat was The Growth of Connected Care, and it focused on defining the terms, sharing trends, and identifying the characteristics of a successful connected care program. I enjoyed the banter and the great questions that came my way during the chat, and I learned quite a bit about the climate for overcoming obstacles to adopting connected care. You can see the transcript of the entire chat here.


To recap the conversation, below are the questions that were asked during the chat and my brief answers.



Connected care is a broad term – what does it mean?

Generally, connected care applies to leveraging technology to connect patients, providers, and caregivers. Increasingly, this is happening in real-time. Connected care extends care outside of the traditional hospital setting and moves healthcare from episodic events to more continuous care that is tailored specifically for the patient.


What market trends are driving connected care?

A few trends are driving connected care forward. First, new Internet of Things (IoT) technology, from devices to the datacenter, is making connected care possible for patients. Think about wearables and the massive amount of data that can be acquired to influence care; this is the cornerstone of connected care.


Second, payment models are changing from fee-for-service to value-based. As payment models change, patient retention becomes increasingly important for clinicians. This is the consumerization of healthcare, where the patient takes charge of their own health and care happens on a regular, ongoing basis.


Finally, healthcare technology investments in digital platforms have opened the opportunity to create and consume new data streams in real-time.


What technologies are enabling connected care?

For starters, big data technologies, both software and hardware, are enabling us to work with the high volume, variety, and velocity of connected care data. Wearables and sensors are also evolving, and newer devices are delivering more value in improved form factors.


What are characteristics of a successful connected care program?

Successful connected care programs have clear clinical and business goals, know the problems that need to be solved, have measurable outcomes and clear value propositions, and feature scalable architecture for data ingestion, storage, analysis, and visualization.


Programs must be patient-centric and look holistically at both patient and care team touch points throughout the continuum of care. They also need a strategy for transforming data into actionable/comprehensible insights delivered at the right time, to the right person. This is often overlooked – insights for providers or patient instructions get lost in poor visualization. This is why the UI/UX aspect of connected care is so critical.


Where is connected care headed, and what are some things to watch for?

Expect larger connected care programs with employers, payers, and care providers to reach consumers and tie engagement to financial outcomes. It will be interesting to see how employees respond and how the employer/employee relationship is re-written to include health-related activities.


Population health programs will go through a three-step evolution of understanding, predicting, and then preventing (UPP). Step one is simply understanding what data is available and identifying and filling gaps. The second stage of program maturity involves using data to begin predicting outcomes for specific populations. This stage involves iterating through models to improve specificity, both for target outcomes and for population boundaries.


The third stage is using the predictions to implement real programs that prevent target outcomes from occurring. This stage will partially rely on human-centered care delivery, but it will also push the boundaries of virtual medicine in response to access and delivery constraints that inevitably arise.
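The UPP progression above can be sketched in code. The sketch below is purely illustrative: the field names, the toy risk score, and the escalation threshold are assumptions for the example, not part of any specific program, and a real program would use a validated statistical model.

```python
# Illustrative sketch of the understand -> predict -> prevent (UPP) stages.
# Field names, coefficients, and thresholds are assumptions for the example.

def understand(records):
    """Stage 1: find records with gaps in the available data."""
    required = {"age", "bmi", "readmitted"}
    return [r for r in records if not required <= r.keys()]

def predict(record):
    """Stage 2: a toy readmission-risk score; real programs iterate
    through models to improve specificity."""
    score = 0.01 * record["age"] + 0.03 * max(record["bmi"] - 25, 0)
    return min(score, 1.0)

def prevent(records, threshold=0.5):
    """Stage 3: flag high-risk patients for a prevention program."""
    return [r for r in records if predict(r) >= threshold]

population = [
    {"age": 45, "bmi": 24, "readmitted": False},
    {"age": 72, "bmi": 33, "readmitted": True},
]
print(understand(population))  # [] -- no gaps in this toy data
flagged = prevent(population)  # only the higher-risk second patient
```

The point of the structure is that each stage feeds the next: gaps found in `understand` limit what `predict` can model, and `prevent` is only as good as the prediction threshold it acts on.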


On the downside, large data breaches look inevitable in the future as more devices allow for more attack vectors. The big unknown is how this will impact the industry and consumers.


What are some of the short- and long-term obstacles to adoption of connected care programs?

The business models for connected care are still evolving. New payment and reimbursement pathways are needed to create growth. Sustainable, long-term patient engagement is a challenge. Hopefully, healthcare will continue to look to industries that have pioneered techniques for data-driven high-touch consumer engagement (consumer goods, SaaS internet companies, etc.) and apply those learnings to developing new strategies to engage patients. Finally, federal and state regulation must continue to evolve because connected care operates across traditional geographic boundaries and models of care delivery.

Read more >

Will Your Cloud be at Risk?

The Cloud is both compelling and alluring, offering benefits that entice many organizations into rapid adoption. The attractiveness of lower operational costs, powering new service offerings, and adaptability to cater to varying demands makes it almost irresistible to rush in.


But caution should be taken.


Leveraging cloud technologies can offer tremendous opportunities, but it can also introduce new security problems and business risks.


These risks can include vulnerability to cyber-attacks, jeopardizing the confidentiality of data, and potentially undermining the integrity of transactions. Care must be taken to understand these challenges in order to properly design the environment and establish sustainable management processes to maintain a strong security posture. Information assurance is required.





How can you mitigate risks in the Cloud?

1. Be informed by understanding both the benefits and risks of cloud adoption.

2. Know the threats and types of attacks that put your cloud data and services at risk.

3. Establish practices to cover the Top 10 assurance categories for cloud.

4. Build a quality plan by leveraging expert resources.

5. Establish accountability across the lifecycle.

6. Don’t be afraid to ask. Nobody gets it right alone!


I recently presented strategic recommendations for cloud adoption to a community of application and infrastructure developers. The first step of the journey into the Cloud resides with teams pursuing the benefits and those accountable for maintaining the environment. It is important to follow a path of practical steps for cloud adoption in order to manage the risks while accessing the plethora of benefits. To be successful, teams must understand the security challenges, leverage available expertise and establish a comprehensive plan across the service lifecycle.



Interested in more?  Follow me on Twitter (@Matt_Rosenquist) and LinkedIn to hear insights and what is going on in cybersecurity.

Read more >

Forging an Open Path for SDI Stack Innovation

Intel was founded on a deep commitment to innovation, especially open, standards-driven innovation, which results in the kind of acceleration seen only when whole ecosystems come together to deliver solutions.  Today’s investment in CoreOS reflects this commitment, as data centers face an inflection point with the delivery of software defined infrastructure (SDI).  As we have many times in our industry’s history, we are piecing together many technology alternatives to form an open, standard path for SDI stack delivery.  At Intel, we understand the value that OpenStack has brought to the delivery of IaaS, but we also see the additive value of the containerized architectures found in many of the largest cloud providers today.  We view these two approaches as complementary, and their integration and adoption are critical to the broad proliferation of SDI.

This is why we announced a technology collaboration with CoreOS and Mirantis earlier this year to integrate OpenStack and Kubernetes, enabling OpenStack to run as a containerized pod within a Kubernetes environment. Inherent in this collaboration is a strong commitment across all parties to contribute the results directly upstream so that both communities may benefit. The collaboration brings the broad workload support and vendor capabilities of OpenStack together with the application lifecycle management and automation of Kubernetes in a single solution, providing an efficient path to solving two of the issues gating OpenStack proliferation today: stack complexity and convoluted upgrade paths.  Best of all, this work is being driven in a fully open source environment, reducing any risk of vendor lock-in.


Because software development and innovation like this is a critical part of Intel’s Cloud for All initiative, we tasked our best SDI engineers to work together with CoreOS to deliver the first ever live demonstration of OpenStack running as a service within Kubernetes at the OpenStack Summit.  To put this into perspective, our joint engineers were able to deliver a unified “Stackanetes” configuration approximately three weeks after our initial collaboration was announced. Three weeks is a short timeframe for such a major demo, and it highlights the power of using the right tools together. To say that this captured the attention of the OpenStack community would be an understatement, and we expect to integrate this workflow into the Foundation’s priorities moving forward.


The next natural step in our advancement of the Kubernetes ecosystem was our investment in CoreOS that we announced today.  CoreOS was founded on the principle of delivering GIFEE, or “Google Infrastructure for Everyone Else”, and its Tectonic solution integrates Kubernetes with the CoreOS Linux platform. Tectonic is an easy-to-consume hyperscale SDI stack. We’ve been working with CoreOS for more than a year on software optimization efforts focused on tuning Tectonic for underlying Intel Architecture features. Our collaboration on Kubernetes reflects a common viewpoint on the evolution of SDI software to support a wide range of cloud workloads that are efficient, open and highly scalable.  We’re pleased with this latest chapter in our collaboration and look forward to delivering more of our vision in the months ahead.

Read more >

Nurses Week 2016: When Will Avatars Join Nurses Week Celebrations?

Nurses Week is a great opportunity to celebrate all of the fantastic work we do for patients. I often find myself pausing at this time of the year to appreciate just how different – and in most cases better – our working practices, processes and outcomes are compared to just 10 or so years ago. Technology has been a great enabler in improving the workflow of nurses today, but I wanted to share some thoughts on the future of nursing in this blog and how we might be welcoming avatars and the world of virtual reality to Nurses Week celebrations in the near future.


Better Training, Overcoming Global Shortage of Nurses

There are challenges ahead for the nursing community, driven by many of the same factors affecting the entire healthcare ecosystem, ranging from an increasingly ageing population to pressure on budgets. When I met with nurses from across Europe in Brussels earlier this year at Microsoft in Health’s Empowering Health event, two key themes really came to the fore:

  • First, there was a call for improved training for nurses to help them better understand and benefit from technologies such as 2 in 1 tablets and advanced Electronic Medical Record systems;
  • Second, there was a discussion around which technologies might help address a potential global shortage of nurses in the future. A 2015 World Health Organisation report stated that ‘a fundamental mismatch exists between supply and demand in both the global and national health labour markets, and this is likely to increase due to prevalent demographic, epidemiologic and macroeconomic trends.’

Looking ahead, I see a real opportunity to integrate avatars and virtual reality into the nursing environment, which will not only train students to be better nurses but also deliver better patient care with improved workflows at the bedside.


Virtual Reality To Deliver Safe, Effective Teaching

Training is a fundamental part of a nurse’s development, and that rings true for both those in nursing school and more experienced nurses learning new technologies and procedures. Virtual reality technology can play a major role in helping nurses to better deal with a range of scenarios and technologies.


For example, if I want to teach a nurse how to perform a specific procedure using virtual reality, I’m able to present the trainee with an avatar on a screen that could be any combination of gender, height, weight and medical condition. And whilst the procedure is being undertaken I’m then able to trigger a wide range of responses from the avatar patient to help the nurse learn how to deal with different scenarios – all in a safe and controlled manner that can be monitored and assessed for post-session feedback.


Similarly, if a nurse is required to understand how to use a new piece of technology to improve their workflow, such as working with an upgrade to an EMR system on a 2 in 1 tablet, virtual reality can help too by simulating these new systems. In a virtual setting nurses are not only able to familiarise themselves with new processes but can provide feedback on issues around workflow before they are launched into a live patient environment.


If I think of how my training was delivered at nursing school, there was plenty of ‘chalk and board’-style teaching and a lot of time spent in a classroom using limited resources such as manikins. Today, any number of student nurses can learn remotely using virtual reality and avatar patients, reinforcing knowledge and improving workflows on a range of mobile devices. This is particularly useful for countries where educators are in short supply but nursing demand is high.

Avatars Ask ‘How Are You Feeling Today?’

A recurring question in my mind is how we can make better use of the fantastic expertise and knowledge of today’s nurses to continue delivering great care to patients. In the face of a nursing shortage, we should explore how avatars on a bedside screen or 2 in 1 device might take on some of the more routine daily tasks, such as asking patients whether they have any unusual symptoms.


Patient answers could be fed back into the EMR which would trigger either further questions from the avatar or, in more serious cases, an alert for intervention by a human nurse.
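The feedback loop described here, where a routine answer either prompts a follow-up question from the avatar or escalates to a human nurse, can be sketched as simple triage logic. The symptom names, follow-up questions, and escalation rules below are illustrative assumptions, not a clinical protocol.

```python
# Sketch of the avatar triage loop: patient answers either trigger a
# follow-up question or an alert for a human nurse. All symptom names
# and escalation rules are illustrative, not clinical guidance.

FOLLOW_UPS = {
    "pain": "On a scale of 1-10, how severe is the pain?",
    "nausea": "When did the nausea start?",
}
URGENT = {"chest pain", "shortness of breath"}

def triage(reported_symptoms):
    """Return the avatar's next action as an (action, detail) pair."""
    for symptom in reported_symptoms:
        if symptom in URGENT:
            # Serious case: record in the EMR and alert a human nurse.
            return ("alert_nurse", symptom)
    for symptom in reported_symptoms:
        if symptom in FOLLOW_UPS:
            # Routine case: the avatar asks a follow-up question.
            return ("ask", FOLLOW_UPS[symptom])
    return ("log", "no unusual symptoms")

print(triage(["nausea"]))              # ('ask', 'When did the nausea start?')
print(triage(["chest pain", "pain"]))  # ('alert_nurse', 'chest pain')
```

Note that urgent symptoms are checked before routine ones, so the avatar never delays a nurse alert by asking a follow-up question first.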

When I talk to my peers within healthcare there are some obvious and real concerns about the lack of emotion delivered by avatars. The rise of chat bots has made for interesting news recently and I see this kind of artificial intelligence combined with a great avatar experience delivering something approaching human emotions such as sympathy for routine tasks. We should also recognise that an avatar can potentially speak an unlimited number of languages too, helping all patients get a better understanding of their condition.


As nurses I hope our community embraces discussion and ideas around the use of virtual reality and avatars – I’ve talked through just a couple of scenarios where I see improvements to training and care delivery but I’d be interested to hear how you think they could help you do your job better. And perhaps one day in the near future, avatars will be celebrating Nurses Week with us too.


Read more >