Recent Blog Posts

Improving User Experience through Big Data


Enterprise IT users switch between a multitude of programs and devices every day. Inconsistencies between user interfaces can slow their productivity: users may enter the same information repeatedly or have to figure out what format data should take (for example, an employee might be specified by employee number, name, or e-mail address). On the application development side, user interface code may be written over and over again. One approach to solving these problems is to create a common User Experience (UX) framework that facilitates discussion and the production of shareable interface templates and code. Intel IT took on exactly that challenge, with the goals of increasing employee productivity by at least 25% and achieving 100% adoption. To create that unified enterprise UX framework, Big Data approaches were critical, as described in this white paper from IT@Intel.

 

To understand the requirements for the enterprise UX, two sources of data are available, but each has its own problems. Traditional UX research methods such as surveys, narratives, and observations typically produce unstructured data and often lack statistical significance. Usage data from logs comes in large volumes, and it puts user privacy at risk. Unstructured, varied, and voluminous data are a perfect fit for Big Data techniques. We used de-identification (also known as anonymization) to hide users’ personal information, and combined those techniques with Big Data to create a Cloudera Hadoop-based analysis platform.
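The white paper has the full details, but the core idea of de-identifying usage logs before they are loaded for analysis is simple enough to sketch. The snippet below is only an illustration under assumed field names and a keyed-hash (pseudonymization) approach; it is not Intel IT’s actual implementation.

import hashlib
import hmac

# Illustrative only: a keyed hash lets analysts count and correlate events
# per user without ever seeing the real identifier. The salt is hypothetical
# and would be kept away from the analysis cluster.
SECRET_SALT = b"keep-this-secret-outside-the-analysis-platform"

def de_identify(user_id: str) -> str:
    """Map a user identifier to a stable, non-reversible pseudonym."""
    return hmac.new(SECRET_SALT, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

def scrub_log_record(record: dict) -> dict:
    """Replace or drop direct identifiers in a raw usage-log record before
    it is loaded into the analysis platform. Field names are made up."""
    clean = dict(record)
    clean["user"] = de_identify(record["user"])
    clean.pop("email", None)        # drop fields not needed for UX analysis
    clean.pop("employee_id", None)
    return clean

# The pseudonym is consistent across records, so usage patterns can still be
# analyzed per (anonymous) user.
print(scrub_log_record({"user": "jdoe", "email": "jdoe@example.com", "app": "expense-tool"}))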

 

Using that analysis platform, Intel IT’s UX team created a single framework standard for all enterprise solutions, one that 60% of Intel IT’s staff can take advantage of. Data from the platform was also used to select and implement a new internal social platform, and it has since been used to analyze other aspects of user behavior, which we plan to cover in a future IT@Intel white paper.

 

In addition to the white paper, more detail on the development of the UX framework can be found in the following papers:

 

Regarding our use of de-identification/anonymization, we discussed our early explorations in this white paper, and the challenges of using de-identification in an enterprise setting are examined in more detail in this conference paper:

Read more >

Malware Trend Continues its Relentless Climb

Malware development remains as healthy as ever. The Intel Security Group’s August 2015 McAfee Labs Threat Report shows quarterly malware growth of 12% for the second quarter of 2015. In total, the count of known unique malware samples has reached a mesmerizing 433 million.

[Chart: 2015 Q3 total malware samples]

Oddly, this has become a very stable trend. For many years, the number of malware samples detected has grown at a relatively consistent rate of about 50% annually.

 

Which makes absolutely no sense! 

 

Cybersecurity is an industry of radical changes, volatile events, and chaotic metrics. The growth in users, devices, data, new technologies, and adaptive security controls, and the mix of attacks, differ every year. Yet the amount of malware being developed plods on with a consistent, predictable gain.

 

What is going on?

 

Well, colleagues, I believe we are witnessing a macro trend that reflects the natural equilibrium between symbiotic adversaries.

 

Let me jump off topic for a moment. Yes, cyber attackers and defenders have a symbiotic relationship. There, I said it. Without attacks, security would have no justification for existence; nobody would invest, and most, if not all, of the security we have today would not exist. Conversely, attackers need security to keep their potential victims healthy, online, and valuable as targets. Just as lions need a healthy herd to hunt if they are to avoid extinction, attackers need defenders to ensure computing continues to grow and stay relevant. If security were not there to hold everything together, attackers would decimate systems and, in short order, nobody would use them. The herd would disappear. So yes, a healthy electronic ecosystem has either a proper balance of predator and prey or a complete absence of both.

 

Back to this mind-boggling trend. I believe the steady growth of malware samples is the high-level manifestation of innumerable micro strategies and counter-tactics playing out against each other. As one group moves for an advantage, the other counters to ensure it is not defeated. This continues on many fronts, all the time. There is no clear winner, but no complete loser either. The players don’t consciously think this way; it is simply the nature of the symbiotic adversarial relationship.

I have a Malware Theory, and only time will tell whether it turns into a law or into dust. My theory is that malware sample counts will continue to increase steadily by about 50% annually, regardless of security or threat maneuvering, reflecting the adversarial equilibrium between attackers and defenders. Only something staggering, something that profoundly upsets the balance, will change that rate. If my theory is correct, we should break the half-billion mark in Q4 2015.
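As a back-of-envelope check on those numbers (my arithmetic, not figures from the McAfee Labs report), 12% quarterly growth compounds to roughly 57% per year, and 433 million samples growing at that quarterly rate cross the half-billion mark within two quarters:

# Rough projection assuming the reported 12% quarter-over-quarter growth continues.
total = 433e6              # known unique samples after Q2 2015
quarterly_growth = 0.12

annual_rate = (1 + quarterly_growth) ** 4 - 1
print(f"Implied annual growth: {annual_rate:.0%}")      # ~57%, in line with ~50% per year

for label in ["2015 Q3", "2015 Q4", "2016 Q1", "2016 Q2"]:
    total *= 1 + quarterly_growth
    print(f"{label}: ~{total / 1e6:.0f} million samples")
# Under this assumption the half-billion mark falls around Q4 2015.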

 

So I believe this trend is likely here to stay. It also provides important insights into our crazy industry and why we sit at this balance point.

 

Even in the face of new security technologies, innovative controls, and improved configurations, malware writers continue to invest in this method because it remains successful. Malware continues to be the preferred method for controlling and manipulating systems and accessing information. It just works. Attackers, if nothing else, are practical. Why strive to develop elaborate methods when malware gets the job done? (See my rants on the path of least resistance for more on understanding the threats.)

 

Defensive strategies are not slowing down malware growth.  This does not mean defensive tools and practices are worthless.  I suspect the innovation in security is keeping it in check somewhat, but not slowing it down enough to reduce the overall growth rates.  In fact, without continued investment we would likely be overrun.  We must remain vigilant in malware defense.

 

The rate of increase is a reflection of the overall efficacy of security. Malware must be generated at 150% of the previous year’s volume in order to compensate for security intervention and still achieve the desired success. Flooding defenders is only one strategy; attackers are also demanding higher-quality, feature-rich, smarter, and more timely weapons.

 

Malware must land somewhere in order to operate and do its dirty deeds. PCs, tablets, phones, servers, and cloud and VM hosting systems are all potential hosts, soon to be joined more prominently by droves of IoT devices. Therefore, endpoints will continue to be heavily targeted, and defense on this crucial battleground will continue to be hotly contested. Ignore anyone who claims host-based defenses are going away. Just the opposite, my friends.

 

At a rate of over three hundred thousand new unique samples created per day, I speculate that much of the malware is being generated automatically. Interestingly, on the defensive side, anti-malware companies are beginning to apply machine learning, community reporting, and peer validation to identify malicious code, and it is showing promise. But just wait. Malware writers can use the same kind of machine learning and community reporting to dynamically write code that either subverts detection or takes advantage of the time delays in verification. Malware code can quickly reinvent itself before it is verified and prosecuted. This should be an interesting arms race. Can the malware theory hold? Strangely, I suspect this battle, although potentially significant, may be exactly what the malware model anticipates. The malware metronome ticks on.
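For those wondering what “applying machine learning to identify malicious code” looks like in its simplest form, here is a minimal sketch: a classifier trained on static file features. The features, numbers, and labels are invented for illustration and do not represent any vendor’s product.

# Toy illustration of feature-based malware classification. Real systems use
# far richer static and behavioral features and vastly larger training sets.
from sklearn.ensemble import RandomForestClassifier

# Each row: [file size in KB, number of imported APIs, is_packed (0/1), entropy]
X_train = [
    [120,  45, 0, 5.1],   # benign
    [300, 210, 1, 7.8],   # malicious
    [ 95,  30, 0, 4.9],   # benign
    [410, 180, 1, 7.5],   # malicious
]
y_train = [0, 1, 0, 1]    # 0 = benign, 1 = malicious

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Score an unknown sample; in practice this verdict would feed into community
# reporting and peer validation rather than being trusted on its own.
unknown = [[350, 200, 1, 7.6]]
print("P(malicious) =", clf.predict_proba(unknown)[0][1])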

 

 

Connect with me:

Twitter: @Matt_Rosenquist

Intel IT Peer Network: My Previous Posts

LinkedIn: http://linkedin.com/in/matthewrosenquist

Read more >

SmartGrid Security: Q&A with Bob Radvanosky, Co-Founder, Infracritical

Bob Radvanosky is one of the world’s leading cyber security experts, with more than two decades of experience designing IT security solutions for the utility, homeland security, healthcare, transportation, and nuclear power industries. The author of nine books on cyber … Read more >

The post SmartGrid Security: Q&A with Bob Radvanosky, Co-Founder, Infracritical appeared first on Grid Insights by Intel.

Read more >

How Wearables are Impacting Healthcare

Read Part I of this blog series on wearables in healthcare


As I mentioned in the first part of this blog series, wearables have become more than a passing trend and are truly changing the way people and organizations think about managing health. I hear from many companies and customers who want to understand how the wearables market is impacting patient care as well as some of the changes taking place with providers, insurers, and employers. In the next several blogs, I’ll share some of their questions and my responses. Today’s question is:

 

What are some of the ways that wearables are impacting providers, payers, and employers as well as patients?

 

For providers, one example is a pilot that the Mayo Clinic did with Fitbit to track patients recovering from cardiac surgery. They were able to predict which of those patients would be discharged sooner than others based on their activity in the hospital. You can easily see how this use case could be extended outside the hospital, where you might use wearables to more accurately predict which patients are at the highest risk for hospital readmission. Readmission, of course, is a key quality metric that hospitals are incentivized to reduce.

 

On the payer side, organizations are using wearable devices to influence the behavior of their members, encourage a healthier lifestyle, and delay the onset of conditions like obesity and diabetes. Cigna has a program for their own employees where they identify individuals who may be at risk for diabetes. They created a wearables program that encouraged increased activity in those individuals’ daily lives, and it’s making a difference.

 

Gartner finds that over 2,000 corporate wellness programs have integrated wearables to track employees’ physical activity and incentivize them, sometimes financially, to lead a healthier lifestyle. BP rolled out a program with 14,000 employees: those who achieved 1 million steps (equivalent to roughly 500 miles for an average-size person) over the course of a year received a health plan premium reduction the following year.

 

Now, has anybody been able to aggregate enough wearable data for some serious predictive analytics, or is that down the road? I think that’s down the road; it will certainly be a while before it becomes mainstream. It will entail significant data integration and big data analytics. We’re looking to pull in multi-structured data from multiple distributed entities and repositories: data from electronic health records, health insurance claims, in some cases socioeconomic data, and all the new sensor data from wearables. If we can pull that continuous stream of patient-generated data into a repository and overlay it on more traditional payer and provider data, I suspect the accuracy of predictive models will improve significantly. We’ll be much better able to identify the high-risk patients who will benefit most from additional outreach by a provider organization.
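To make that concrete, here is a simplified sketch of the kind of data integration described above: wearable activity data overlaid on traditional claims features to score readmission risk. The data sources, column names, and model are hypothetical stand-ins, not any specific payer or provider system.

# Illustrative only: overlay patient-generated wearable data on claims/EHR
# features to flag patients for outreach. All fields are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Daily step counts streamed from wearables, averaged per patient.
wearables = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3, 3],
    "daily_steps": [5200, 4800, 900, 1100, 3000, 2800],
}).groupby("patient_id", as_index=False).mean()

# Traditional data: claims/EHR-derived features plus an outcome label.
clinical = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "age": [62, 71, 55],
    "prior_admissions": [0, 3, 1],
    "readmitted_30d": [0, 1, 0],
})

merged = clinical.merge(wearables, on="patient_id")
X = merged[["age", "prior_admissions", "daily_steps"]]
y = merged["readmitted_30d"]

model = LogisticRegression().fit(X, y)
merged["risk"] = model.predict_proba(X)[:, 1]
print(merged[["patient_id", "risk"]])   # higher scores suggest candidates for outreach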

 

What questions do you have?

 

In my next blog, I’ll look at the primary challenges companies are facing in collecting, analyzing, and sharing data generated by wearables.

Read more >

Intel Donates HPC Infrastructure to Pan-Cancer Analysis of Whole Genomes Project

We’re experiencing ever-increasing volumes of data within health and life sciences. If we were to sequence the ~14M new cancer patients worldwide[1] just once (tumor/normal pairs), it would require more than 5.6 exabytes of storage (and in reality we need to be able to sequence them multiple times during the course of treatment, using a variety of omics and analytics approaches). The technical challenges of big data are many, from managing and storing such large volumes of data to analysing hugely complex datasets. However, we must meet these challenges head-on, as the rewards are very real.
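As a rough check on where a figure like 5.6 exabytes comes from (my own arithmetic, not a number published by the project), it is consistent with roughly 400 GB of sequence data per patient for a tumor/normal pair:

# Back-of-envelope check of the storage estimate quoted above.
patients = 14e6            # ~14M new cancer patients worldwide per year
total_bytes = 5.6e18       # 5.6 exabytes

per_patient = total_bytes / patients
print(f"~{per_patient / 1e9:.0f} GB per patient (tumor/normal pair)")   # ~400 GB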

 

I’m pleased to tell you about a significant project that Intel is supporting to help overcome these types of challenges and assist in the drive to comprehensively analyse cancer genomes. Our HPC solutions are already helping organisations around the world deliver better healthcare and helping individuals overcome diseases such as cancer. And our relationship with the Pan-Cancer Analysis of Whole Genomes (PCAWG) project is helping scientists access and share analysis of more than 2,600 whole human genomes (5,200 matched tumor/normal pairs).

 

Scientific discovery can no longer operate in isolation; there is an imperative to collaborate internationally, working across petabytes of data and statistically significant patient cohorts. The PCAWG project is turning to the cloud to enhance access for all, which will bring significant advances in healthcare through collaborative research.

 

By working directly with industry experts to accelerate cancer research and treatment, Intel is at the forefront of the emerging field of precision medicine. Advanced biomarkers, predictive analytics and patient stratification, and therapeutic treatments tailored to an individual’s molecular profile: these hallmarks of precision medicine are undergoing rapid translation from research into clinical practice. Intel HPC and Big Data/Analytics technologies support high-throughput genomics research while delivering low-latency clinical results. Clinicians, together with patients, formulate individualized treatment plans informed by the latest scientific understanding.

 

For example, Intel HPC technology will accelerate the work of bioinformaticists and biologists at the German Cancer Research Centre (DKFZ) and the European Molecular Biology Laboratory (EMBL), allowing these organisations to share complex datasets more efficiently. Intel, Fujitsu, and SAP are helping to build the infrastructure and provide expertise to turn this complex challenge into reality.

 

The PCAWG project is in its second phase, which began with the uploading of genomic data to seven academic compute centres, creating what is in essence a super-cloud of genomic information. Currently, this ‘academic community cloud’ is analysing data to identify genetic variants, including cancer-specific mutations. And I’m really excited to see where the next phase takes us, as our technology will help over 700 ICGC scientists worldwide remotely access this huge dataset and perform secondary analysis to gain insight into their own specific cancer research projects.

 

This is truly ground-breaking work, made possible by great scientists utilising the latest high-performance big data technologies to deliver life-changing results. At Intel, it gives us great satisfaction to know that we are playing a part in furthering knowledge both in the wider genomics field and, more specifically, in better understanding cancer, which will lead to more effective treatments for everyone.

 

 


[1] http://www.cancerresearchuk.org/cancer-info/cancerstats/world/incidence/

Read more >

In Their Own Words: Intel Intern Juan Lopez Marcano Shares His Story

Juan Lopez Marcano was an Intel Scholar with the Platform Engineering group during the summer of 2015. He is currently earning a Master’s Degree in Electrical and Electronics Engineering at Virginia Polytechnic Institute and State University. If I could describe … Read more >

The post In Their Own Words: Intel Intern Juan Lopez Marcano Shares His Story appeared first on Jobs@Intel Blog.

Read more >

Next-Gen Shopping on Innovation Boulevard


I don’t know about you, but while I love being able to browse my favourite store’s latest range from the comfort of my sofa, the hands-on experience that I get from a visit to the store itself is also still very appealing. What’s great about today’s retail landscape is that we have the opportunity to do both. The way we try and buy items from our favourite brands is no longer dictated by the opening hours or stock levels in our local high street store.

 

While this is good news for the consumer, the battle is on for high street retailers. To entice as many shoppers as possible through their doors, retailers need to offer a totally unique shopping experience – something that will convince you and me to put down our tablets and head to the high street.

 

Personalized, anytime shopping on the streets of Antwerp

 

Digitopia, a digital retail solution provider in Belgium, is working with Intel to build devices and apps that retailers can use to create more compelling shopping experiences. By trialling different solutions in various retail environments on Antwerp’s most popular shopping street, Digitopia is helping retailers determine which technologies work best in each store scenario.

 

On Innovation Boulevard, as Digitopia has dubbed it, shoppers can turn their phone into a remote control to browse holidays on a large screen in the travel agent’s window. They can use an interactive fitting room in a fashion boutique to check for alternative colors and sizes of the outfits they are trying on. It’s even possible to order and pay for their cafe refreshments with a smartphone app rather than queuing up in the store. A large number of the solutions are powered by Intel technologies.

 

For shoppers, the retail experience is smoother and more personalized. Importantly, the technologies are also helping retailers to increase sales, offer new services and continue to interact with their customers when the shops are closed.

 

You can read more about the exciting retail experience that Digitopia has created in our new case study. My personal favorite is being able to book a holiday while walking between shops – what’s yours?


To continue this conversation, find me on LinkedIn or Twitter.

Read more >

5 Questions for Dr. Giselle Sholler, NMTRC

 

Giselle Sholler is the Chair of the Neuroblastoma and Medulloblastoma Translational Research Consortium (NMTRC) and the Director of the Hayworth Innovative Therapeutic Clinic at Helen DeVos Children’s Hospital. The NMTRC is a group of 15 pediatric hospitals across the U.S., plus the American University in Beirut, Lebanon, and Hospital La Timone in Marseilles, France. We sat down recently with Dr. Sholler to talk about the role of precision medicine in her work and how it impacts patients.


Intel: What are the challenges of pediatric oncology and how do you tackle those challenges?

 

Sholler: As a pediatric oncologist, one of the most challenging times is when we’re faced with a child who is not responding to standard therapy and we want to figure out how we can treat this patient. How can we bring hope to that family? A project that we are working on in collaboration with TGen, Dell and Intel has brought that hope to these families.

 

Intel: What is the program?

 

Sholler: When a child has an incurable pediatric cancer, we take a needle biopsy and send it to TGen, where the DNA and RNA sequencing occurs. When ready, that information comes back to the Consortium. Through a significant amount of analysis of the genomic information, we’re able to look at what drugs might target specific mutations or pathways. On a virtual tumor board, we have 15 hospitals across the U.S., and now two international hospitals in Lebanon and France, that come together and discuss the patient’s case with the bioinformatics team from TGen. Everyone works to understand that patient and, with the help of pharmacists, create an individualized treatment plan, so that the patient can have a therapy available that might produce a response in their tumor.

 

Intel: Why is precision medicine important?

 

Sholler: Precision medicine is about using the genomic data from a patient’s tumor to identify not only which drugs will work, but also which ones may not work on that patient’s specific cancer. With precision medicine, we can identify the right treatment for a patient. We’re not saying chemotherapy is bad, but for many of our patients chemotherapy attacks every rapidly dividing cell and leaves our children with a lot of long-term side effects. My hope for the future is that as we target patients more specifically with the correct medications, we can alleviate some of the side effects we’re seeing in our patients. Half our children with neuroblastoma have hearing loss and need hearing aids for the rest of their lives. They have heart, kidney, and liver conditions that we’d like to avoid in the future if we can.

 

Intel: How does the collaboration work to speed the process?

 

Sholler: The collaboration with Dell and Intel has been critical to making this entire project possible. The grant from Dell to fund this entire program over the last four years has been unparalleled in pediatric cancer. The computing power has also been vital to the success. Three years ago we were doing only RNA expression profiling and it took two months; now we’re doing RNA sequencing and complete DNA exomes, and it takes less than two weeks to get answers for our patients. A few years ago, data transfer and networking meant shipping hard drives. Now we can send a tumor sample from Lebanon to TGen, complete the sequencing in a few days, and have a report for the tumor board a few days after that. It’s just been amazing to see the speed and accuracy of profiling improve.

 

Intel: Anything else?

 

Sholler: Another very critical piece that Dell has helped provide is the physician portal. Physicians are able to work together across the country, and across the world, and have access to patient records. The database now has grown and grown. When we do see patients, we can also pull up previous patients with similar sequencing or similar profiles, or treated with similar drugs, and see what was used in treatment. And how did they do? What was the outcome? We’re learning more and more with every patient and it doesn’t matter where we live anymore. Everything’s virtual online. It’s just been incredible.

Read more >