Recent Blog Posts

The Big Challenges We Face in Genomics Today: A European Perspective

Recently I travelled to Oxford in the UK, Athens in Greece and Antalya in Turkey for a series of roundtables on the subject of genomics. While the audiences differed across the three events, the themes discussed had a lot in common, and I'd like to share some of them with you in this blog.

 

The event in Oxford, GenoFutureUK15, was a roundtable hosted by the Life Sciences team here at Intel that brought together academics from a range of European research institutions to discuss the future of genomics. I'm happy to say that the future is looking very bright indeed, as we heard many examples of fantastic research currently being undertaken.

 

Speeding up Sequencing

What really resonated across all of the events, though, was that the technical challenges we're facing in genomics are not insurmountable. On the contrary, we're making great progress in reducing the time it takes to sequence genomes. As just one example, I'd highly recommend looking at this work from our partners at Dell – using Intel® Xeon® processors, it has been possible to improve the efficiency and speed of paediatric cancer treatments.

 

In contrast to the technical aspects of genomics, the real challenges seem to come from what we call 'bench to bedside', i.e. how does the research translate to the patient? Issues around information governance, jurisdiction, intellectual property, data federation and workflow were all identified as key areas that are currently challenging both process and progress on the path to mainstream clinical use.

 

From Bench to Bedside

As somebody who spends a portion of my time each week working in a GP surgery, I want to be able to utilise some of the fantastic research outcomes to help deliver better healthcare to my patients. We need to move on from focusing on pockets of research and identify the low-hanging fruit to help us tackle chronic conditions, and we need to do this quickly.

 

Views were put forward on the implications of genomics' transition from research to clinical use, and much of the discussion centred on data storage and governance. There are clear privacy and security issues, but they are ones for which technology already has many of the solutions.

 

Training frontline staff to understand and make use of advances in genomics was a big talking point. It was pleasing to hear that clinicians in Germany would like more time to work with researchers and that this is something being actively addressed. The UK and France are also making strides to ensure that this training becomes embedded in the education of future hospital staff.

 

Microbiomics

Finally, the burgeoning area of microbiomics came to the fore at all three events. You may have spotted quite a lot of coverage in the news around faecal microbiota transplantation to help treat Clostridium difficile infection. Microbiomics throws up another considerable challenge, as the collective genomes of the human microbiota contain some 8 million protein-coding genes – around 360 times as many as the human genome. That's a 'very' Big Data challenge, but one we are looking forward to meeting head-on at Intel.

 

Leave your thoughts below on where you think the big challenges in genomics lie. How is technology helping you to overcome the challenges you face in your research? And, looking to the future, what do you need to help you perform ground-breaking research?

 

Thanks to the participants, contributors and organisers of Intel's GenoFutureUK15 roundtable in Oxford, UK, the event in Athens, Greece, and the HIMSS Turkey Educational Conference in Antalya, Turkey.

 

Read more >

The Johnny-Five Framework Gets A New Website, Adds SparkFun Support

The Johnny-Five robotics framework has made a big leap forward, migrating its primary point of presence away from creator Rick Waldron's personal GitHub account to a brand new website: Johnny-Five.io. The new website features enhanced documentation, sample code and links … Read more >

The post The Johnny-Five Framework Gets A New Website, Adds SparkFun Support appeared first on Intel Software and Services.

Read more >

The new scale of compute demands a new scale of analytics

With the proliferation of popular software-as-a-service (SaaS) offerings, the scale of compute has changed dramatically. The boundaries of enterprise IT now extend far beyond the walls of the corporate data center.

 

You might even say those boundaries are disappearing altogether. Where we once had strictly on-premises IT, we now have a highly customized and complex IT ecosystem that blurs the lines between the data center and the outside world.

 

When your business units are taking advantage of cloud-based applications, you probably don't know where your data is, what systems are running the workloads, or what sort of security is in place. You might not even have a view of delivered application performance, or whether it meets your service-level requirements.

 

This lack of visibility, transparency, and control is at once unsustainable and unacceptable. And this is where IT analytics enters the picture—on a massive scale.

 

To make a successful transition to the cloud, in a manner that keeps up with the evolving threat landscape, enterprise IT organizations need to leverage sophisticated data analytics platforms that can scale to hundreds of billions of events per day. That’s not a typo—we are talking about moving from analyzing tens of millions of IT events each day to analyzing hundreds of billions of events in the new enterprise IT ecosystem.
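To put that figure in perspective, here is a rough back-of-the-envelope calculation (a minimal sketch; the daily event count and the average event size are illustrative assumptions, not measured values):

```python
# Rough scale check for an IT analytics platform ingesting on the order of
# 100 billion events per day (an assumed, illustrative volume).
EVENTS_PER_DAY = 100e9      # assumed daily event count
AVG_EVENT_BYTES = 500       # assumed average size of one log/flow record

seconds_per_day = 24 * 60 * 60
events_per_second = EVENTS_PER_DAY / seconds_per_day
raw_bytes_per_day = EVENTS_PER_DAY * AVG_EVENT_BYTES

print(f"Sustained ingest rate: {events_per_second:,.0f} events/second")
print(f"Raw volume: {raw_bytes_per_day / 1e12:.0f} TB/day before compression")
```

Even with conservative assumptions, that is more than a million events per second, sustained around the clock—the kind of rate such a platform has to absorb continuously.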

 

This isn’t just a vision; this is an inevitable change for the IT organization. To maintain control of data, to meet compliance and performance requirements, and to work proactively to defend the enterprise against security threats, we will need to gain actionable insight from an unfathomable amount of data. We’re talking about data stemming from event logs, network devices, servers, security and performance monitoring tools, and countless other sources.

 

Take the case of security. To defend the enterprise, IT organizations will need to collect and sift through vast amounts of two types of contextual information (a simplified sketch follows the list below):

 

  • “In the moment” information on devices, networks, operating systems, applications, and locations where information is being accessed. The key here is to provide near-real-time, actionable information to policy decision and enforcement points (think of credit card companies’ fraud services).
  • “After the fact” information from event logs, raw security-related events, NetFlow and packet data, along with other indicators of compromise that can be correlated with other observable and collectable information.
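To make that concrete, the toy sketch below shows the basic idea of correlating the two streams. It is only an illustration: the event fields, IP addresses, and the KNOWN_BAD_IPS indicator set are hypothetical stand-ins, not a description of any real Intel product or data set.

```python
from datetime import datetime

# Hypothetical "after the fact" intelligence: indicators of compromise
# distilled from event logs, NetFlow data, and prior investigations.
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}

# Hypothetical "in the moment" access events arriving from devices,
# networks, and applications across the enterprise.
events = [
    {"time": datetime(2015, 4, 22, 9, 15), "user": "alice",
     "src_ip": "10.0.0.4", "resource": "payroll-db"},
    {"time": datetime(2015, 4, 22, 9, 16), "user": "alice",
     "src_ip": "203.0.113.7", "resource": "payroll-db"},
]

def correlate(stream, bad_ips):
    """Yield events whose source address matches a known indicator of compromise."""
    for event in stream:
        if event["src_ip"] in bad_ips:
            # A policy decision/enforcement point could now step up
            # authentication or block the session in near real time.
            yield event

for hit in correlate(events, KNOWN_BAD_IPS):
    print(f"{hit['time']}: suspicious access to {hit['resource']} "
          f"by {hit['user']} from {hit['src_ip']}")
```

In production, the same correlation has to run against billions of events and far richer indicator sets, which is exactly why platform scale matters.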

 

As we enter this brave new world for IT, it’s clear that we will need an analytics platform that will allow us to store and process data at an unprecedented scale. We will also need new algorithms and new approaches that will allow us to glean near real-time and historical insights from a constant flood of data.

 

In an upcoming post, I will look at some of the requirements for this new-era analytics platform. For now, let's just say we're gonna need a bigger boat.

 

Intel and the Intel logo are trademarks of Intel Corporation in the United States and other countries. * Other names and brands may be claimed as the property of others.

Read more >

Data Center Disruption: Grounded in Moore’s Law

Yesterday we celebrated the 50th anniversary of Moore's Law, the foundational model of computing innovation. While the past half-century of industry innovation based on Moore's Law is astounding, what's exciting today is that we're at the beginning of the next generation of information and communication technology architecture, enabling the move to the digital services economy. Nowhere is the opportunity greater than in the data center.


This evening, at the Code/Enterprise Series in San Francisco, I had the pleasure of sharing Intel’s perspective on the disruptive force the data center transformation will have on businesses and societies alike. Like no time before, the data center stands at the heart of technology innovation connecting billions of people and devices across the globe and delivering services to completely transform businesses, industries, and people’s lives.


To accelerate this vision, Intel is delivering a roadmap of products that enable the creation of a software-defined data center – a data center where the application defines the system. One area I'm particularly excited about is our work with the health care community to fundamentally change the experience of a cancer patient. Here, technology is used to speed up and scale the creation and application of precision medicine.


Our goal? By 2020, a patient can have her cancerous cells analysed through genome sequencing, compared to countless other sequences through a federated, trusted cloud, and a precision treatment created… all in one day.


We are also expanding our business focus into new areas where our technology can accelerate business transformation, a clear example being the network. Our recent announcements with Ericsson and Huawei highlight deep technical collaborations that will help the telco industry deliver new services to its end users, with greater network utilization through virtualization and new business models through the cloud. At the heart of this industry transformation are open, industry-standard solutions running on Intel architecture.


Transforming health care and re-architecting the network are just two examples of Intel harnessing the power of Moore’s Law to transform businesses, industries, and the lives of us all. 

Read more >

Small Business, Big Threat, Security Connected Solutions: How Innovation and the Framework Can Help Protect Small Businesses

On April 22 – right in the middle of “Cyber Week” in the U.S. House of Representatives – Steve Grobman, Intel Fellow and Intel Security’s CTO, will testify before the House Committee on Small Business to discuss how, from Intel’s … Read more >

The post Small Business, Big Threat, Security Connected Solutions: How Innovation and the Framework Can Help Protect Small Businesses appeared first on Policy@Intel.

Read more >

How to Achieve 39 Percent Faster Performance for Whole Genome Analysis

The transition toward next-generation, high-throughput genome sequencers is creating new opportunities for researchers and clinicians. Population-wide genome studies and profile-based clinical diagnostics are becoming more common and more cost-effective. At the same time, such high-volume and time-sensitive usage models put more pressure on bioinformatics pipelines to deliver meaningful results faster and more efficiently.

 

Recently, Intel worked closely with Seven Bridges Genomics' bioinformaticians to design an optimal genomics cluster building block for direct attachment to high-throughput, next-generation sequencers using the Intel Genomics Cluster solution. Though most use cases will involve variant calling against a known genome, more complex analyses can be performed with this system; a single 4-node building block is powerful enough to perform a full transcriptome analysis. As demands grow, additional building blocks can easily be added to a rack to support multiple next-generation sequencers operating simultaneously.

 

Verifying Performance for Whole Genome Analysis

To help customers quantify the potential benefits of the Intel Genomics Cluster solution, Intel and Seven Bridges Genomics ran a series of performance tests using the Seven Bridges Genomics software platform. Performance for a whole genome pipeline running on the test cluster was compared with the performance of the same software platform running on a 4-node public cloud cluster based on the previous-generation Intel® Xeon® processor E5 v2 family.

 

The subset of the pipeline used for the performance tests includes four distinct computational phases (a tool-level sketch follows the list):

 

  • Phase A: Alignment, deduplication, and sorting of the raw data reads
  • Phase B: Local realignment around indels
  • Phase C: Base quality score recalibration
  • Phase D: Variant calling and variant quality score recalibration
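To make those phases a little more concrete, here is one way the stages are commonly chained together using open-source tools (BWA, samtools, Picard, and GATK 3.x). This is an illustrative sketch only – not the Seven Bridges Genomics implementation or the exact configuration used in these tests – and the file names, reference data, and thread count are placeholders.

```python
import subprocess

THREADS = 28  # placeholder; set to the core count of your node

def run(cmd):
    """Run one pipeline stage and stop immediately if it fails."""
    print("+", cmd)
    subprocess.run(cmd, shell=True, check=True)

# Index-building steps (bwa index, samtools index, sequence dictionary
# creation) are omitted here for brevity.

# Phase A: alignment, deduplication, and sorting of the raw reads
run(f"bwa mem -t {THREADS} ref.fa sample_R1.fastq sample_R2.fastq > sample.sam")
run("samtools view -b sample.sam | samtools sort -o sample.sorted.bam -")
run("java -jar picard.jar MarkDuplicates I=sample.sorted.bam "
    "O=sample.dedup.bam M=dup_metrics.txt")

# Phase B: local realignment around indels (GATK 3.x tools)
run("java -jar GenomeAnalysisTK.jar -T RealignerTargetCreator -R ref.fa "
    "-I sample.dedup.bam -o realign.intervals")
run("java -jar GenomeAnalysisTK.jar -T IndelRealigner -R ref.fa "
    "-I sample.dedup.bam -targetIntervals realign.intervals -o sample.realigned.bam")

# Phase C: base quality score recalibration
run("java -jar GenomeAnalysisTK.jar -T BaseRecalibrator -R ref.fa "
    "-I sample.realigned.bam -knownSites dbsnp.vcf -o recal.table")
run("java -jar GenomeAnalysisTK.jar -T PrintReads -R ref.fa "
    "-I sample.realigned.bam -BQSR recal.table -o sample.recal.bam")

# Phase D: variant calling; variant quality score recalibration
# (VariantRecalibrator/ApplyRecalibration) would follow, using
# known-sites resources appropriate to the genome build.
run("java -jar GenomeAnalysisTK.jar -T HaplotypeCaller -R ref.fa "
    "-I sample.recal.bam -o sample.raw.vcf")
```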

 

The results of the performance tests were impressive. The Intel Genomics Cluster solution based on the Intel® Xeon® processor E5-2695 v3 completed a whole genome pipeline in just 429 minutes, versus 726 minutes for the cloud-based solution powered by the prior-generation Intel® Xeon® processor E5 v2 family.
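As a quick check of what that gap means in practice (using only the minute counts quoted above):

```python
# Wall-clock times reported above for the whole genome pipeline
new_cluster_minutes = 429    # Intel Xeon processor E5-2695 v3 based cluster
cloud_cluster_minutes = 726  # prior-generation Intel Xeon processor E5 v2 cloud cluster

saved_minutes = cloud_cluster_minutes - new_cluster_minutes
print(f"Time saved: {saved_minutes} minutes (~{saved_minutes / 60:.1f} hours)")
print(f"Speed-up:   {cloud_cluster_minutes / new_cluster_minutes:.2f}x")
```

That works out to 297 minutes saved – almost five hours per whole genome – which is the figure cited below.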

 

Based on these results, researchers and clinicians can potentially complete a whole genome analysis almost five hours sooner using the newer system. They can also use this 4-node system as a building block for constructing large, local clusters. With this strategy, they can easily scale performance to enable high utilization of multiple high-volume, next-generation sequencers.

 

For a more in-depth look at these performance tests, we will soon release a detailed abstract describing the workloads and system behavior in each phase of the analysis.

 

What questions do you have?

Read more >

Digitization of the Utility Industry Part I: The Impact of Moore’s Law

As Moore's Law turned 50 this past Sunday, it's amazing to see how much this 'law' is impacting the energy industry. Back in 1965, Gordon Moore extrapolated that computing would dramatically increase in power, and decrease in relative cost, at an … Read more >

The post Digitization of the Utility Industry Part I: The Impact of Moore’s Law appeared first on Grid Insights by Intel.

Read more >

Delivering Innovation & Technology for Retail

The Role of Technology in Retail: Is it enough?

My friend and colleague Jon Stine (@joncstine1) recently penned a blog regarding technology in retail. Jon has extensive retail industry and technology expertise and offers a great perspective on the role of technology in addressing the challenges retailers face. From my perspective, the challenge for retailers with store fronts is best summed up in a question: can you deliver a killer shopping experience – from sofa to store aisle? Sadly, many retailers are not able to answer yes to this question. The vast majority of retailers don't invest in innovative shopping solutions; many are content to follow the same old formula of price discounting, coupons and Sunday circulars – a well-proven formula that is just too hard to break from. However, it is a formula we know no longer fits the new connected consumer.

The evolution of the connected consumer has been highlighted in the popular press at great length for at least the last three years, yet many retailers have missed it. You know, the Millennial generation: the segment that will comprise 75% of the workforce by 2020 and command over $2.5T in purchasing power. This segment is always connected and values experiences as much as price – and because they are always connected, they never shop without their device. Yes, the evolution has been occurring for some time, and yes, Millennials are reshaping the shopping experience. What are you going to do about it?

Retailers are facing a strategic inflection point, which could mean an opportunity to prosper or a slow ride toward demise. At least that is my point of view. Jon argues a few points that are relevant to creating an innovative shopping experience.

Retailers:

  1. “Showrooming” is multidirectional (in-store and online) and it is here to stay.
  2. Leveraging big data can have a profound impact on your brand and the experience you deliver – it should be considered the starting point as you create a new shopper's journey.
  3. Security must become a strategic imperative for the way you conduct business – trust is won in drips and lost in buckets. Cybercriminals are well funded and profitable, and hacking will continue to be the new normal.

As mentioned in Jon's blog, retailers have long chosen to focus on maintaining their ongoing operations rather than investing for growth and innovation. Growth and innovation don't come cheap; in fact, they are more than a technology roadmap – they are a business strategy. Why do consumers flock to Amazon or any of the "new concept" stores? I argue it boils down to the experience. Amazon provides the ultimate in clienteling and sales assistance, and the new concept stores I had the privilege to tour in NYC during NRF 2015 offer innovative shopping experiences.

The stores we visited all offered unique and engaging shopping experiences.

Rebecca Minkoff – a connected fashion store, with technology envisioned and planned by eBay, that brings the online experience into the physical store through in-store interactive shopping displays. Once shoppers select the clothes they want to try on, they tap a button to have a sales associate bring the items to a dressing room.

Under Armour Brand House – a physical space designed to be a destination for shoppers. The strategy for the stores is about telling a story and engaging the shopper through storytelling. UA founder Kevin Plank is more interested in aligning the company's product communication and retail presentation than in anything else; his claim is that UA focuses 80% on storytelling and 20% on product – just the opposite of so many other product retailers.

Converse – yup, that old classic, the Chuck Taylor canvas shoe. Converse has been offering online customization for some time, but what if you want a unique shoe right away to wear to an event? Now you can visit a Converse store, select your favorite Chucks and set about creating your own personalized style.

Much as Amazon offers a unique shopping experience, these stores invested in delivering innovation. It wasn't a technology solution alone – it was a desire, from top to bottom, to give the shopper something unique and innovative.

Do you want help delivering growth and innovation in your retail environment? Intel isn't going to solve all of this on its own; we work with very talented fellow travelers who offer solutions to achieve growth and innovation.


Read more >