Recent Blog Posts

IT Accelerating Business Innovation Through Product Design

These have been the recent mandates for the Product Development IT team within Intel IT that I am a part of. We’ve been tasked with accelerating the development of Intel’s key System on Chip (SoC) platforms. We’ve been asked to be a key enabler of Intel’s growing software and services business. And we’ve been recognized as a model for employee engagement and cross-functional collaboration.

 

Much of this is new.

 

We’ve always provided the technology resources that facilitate the creation of world-class products and services. But the measures of success have changed. Availability and uptime are no longer enough. Today, it’s all about acceleration and transformation.

 

Accelerating at the Speed of Business

 

In many ways, we have become a gas pedal for Intel product development. We are helping our engineers design and deliver products to market faster than ever before. We are bringing globally distributed teams closer together with better communication and collaboration capabilities. And we are introducing new techniques and tools that are transforming the very nature of product design.

 

Dan McKeon, Vice President of Intel IT and General Manager of Silicon, Software and Services Group at Intel, recently wrote about the ways we are accelerating and transforming product design in the Intel IT Business Review.

 

The IT Product Development team, under Dan’s leadership, has enthusiastically embraced this new role. It allows us to be both a high-value partner and a consultant for the design teams we support at Intel. We now have a much better understanding of their goals, their pain points, and their critical paths to success—down to each job and workload. And we’ve aligned our efforts and priorities accordingly.

 

The results have been clear. We’ve successfully shaved weeks and months off of high-priority design cycles. And we continue to align with development teams to further accelerate and transform their design and delivery processes. Our goal in 2014 is to accelerate the Intel SoC design group’s development schedule by 12 weeks or more. We are sharing our best practices as we go, so please keep in touch.

 

To get the latest from Dan’s team on IT product development for faster time to market, download the Intel IT Business Review mobile app: http://itbusinessreview.intel.com/

Follow the conversation on Twitter: #IntelIT

Read more >

High Performance Computing in Today’s Personalized Medicine Environment

 

The goal of personalized medicine is to shift from a population-based treatment approach (i.e. all people with the same type of cancer are treated in the same way) to an approach where the care pathway with the best possible prognosis is selected based on attributes specific to a patient, including their genomic profile.

 

After a patient’s genome is sequenced, it is reconstructed from the read information and compared against a reference genome, and the variants are mapped; this determines what’s different about the patient as an individual, or how their tumor genome differs from their normal DNA. This process is often called downstream analytics (because it is downstream from the sequencing process).
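To make those downstream-analytics steps concrete, here is a minimal sketch of the reconstruct/compare/map flow, assuming the common open-source tools bwa, samtools, and bcftools are installed; the file paths are hypothetical, and real clinical pipelines add quality-control, deduplication, and annotation stages:

    # Sketch of the downstream analytics described above: align reads to a
    # reference genome, sort the alignment, and call the variants.
    import subprocess

    REF = "reference_genome.fa"    # reference genome (hypothetical path)
    READS = "patient_reads.fastq"  # raw reads from the sequencer (hypothetical path)

    def run(cmd):
        """Run one pipeline stage, raising an error if the stage fails."""
        subprocess.run(cmd, shell=True, check=True)

    # 1. Reconstruct: align the patient's reads against the reference genome
    #    (assumes the reference was indexed beforehand with `bwa index`).
    run(f"bwa mem {REF} {READS} > aligned.sam")

    # 2. Sort and index the alignment so variant callers can scan it efficiently.
    run("samtools sort -o aligned.sorted.bam aligned.sam")
    run("samtools index aligned.sorted.bam")

    # 3. Map the variants: record positions where the patient (or tumor)
    #    genome differs from the reference.
    run(f"bcftools mpileup -f {REF} aligned.sorted.bam | bcftools call -mv -o variants.vcf")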

 

Although the cost of sequencing has come down dramatically over the years (faster than Moore’s law in fact), the cost of delivering personalized medicine in a clinical setting “to the masses” is still quite high. While not all barriers are technical in nature, Intel is working closely with the industry to remove some of the key technical barriers in an effort to accelerate this vision:

 

  • Software Optimization/Performance: While the industry is doing genomics analytics on x86 architecture, much of the software has not been optimized to take advantage of the parallelization and instruction enhancements inherent in this platform (a toy illustration follows this list).
  • Storing Large Data Repositories: As you might imagine, genomic data is large, and with each new generation of sequencers, the amount of data captured increases significantly. Intel is working with the industry to apply the Lustre (highly redundant, highly scalable) file system in this domain.
  • Moving Vast Repositories of Data: Although (relatively) new technologies like Hadoop help the situation by “moving compute to the data,” sometimes you can’t get around the need to move a large amount of data from point A to point B. As it turns out, FTP isn’t an optimal way to move data when you are talking terabytes.
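To illustrate the first bullet, here is a toy Python sketch (not Intel code, and assuming NumPy is available) of the same computation written two ways: a scalar loop, and a vectorized expression that lets optimized native code paths, including SIMD instructions, do the work:

    # Count positions where a sample sequence differs from a reference,
    # with bases coded as small integers.
    import numpy as np

    rng = np.random.default_rng(0)
    reference = rng.integers(0, 4, size=10_000_000, dtype=np.int8)
    sample = reference.copy()
    sample[rng.integers(0, sample.size, 1_000)] = 0  # introduce some differences

    def count_mismatches_loop(a, b):
        """Unoptimized: one comparison per interpreter-level iteration."""
        n = 0
        for x, y in zip(a, b):
            if x != y:
                n += 1
        return n

    def count_mismatches_vectorized(a, b):
        """Optimized: the comparison runs in native, vectorizable code."""
        return int(np.count_nonzero(a != b))

    # Both functions return the same count; the vectorized version is
    # typically orders of magnitude faster on the same x86 hardware.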

 

I’ll leave you with this final thought: Genomics is not just for research organizations. It is moving quickly into the provider environment. Cancer research and treatment are leading the way in this area, and in a more generalized setting, there are more than 3,000 genomic tests already approved for clinical use. Today, this represents a great opportunity for healthcare providers to differentiate themselves from their competition… but in the not-too-distant future, providers who don’t have this capability will be left behind.

 

Have you started integrating genomics into your organization? Feel free to share your observations and experiences below.

 

Chris Gough is a lead solutions architect in the Intel Health & Life Sciences Group and a frequent blog contributor.

Find him on LinkedIn

Keep up with him on Twitter (@CGoughPDX)

Check out his previous posts

Read more >

Can the Coding Illini Return to the Parallel Universe Computing Challenge Finals Again?

Mike Bernhardt is the Community Evangelist for Intel’s Technical Computing Group.

 

When the Gaussian Elimination Squad from Germany indicated their interest in defending their 2013 Intel Parallel Universe Computing Challenge (PUCC) championship title, little did they know the first team to respond would be the one they faced in last year’s finals, the Coding Illini.

Last year’s Coding Illini team defeated Team Ohio in their first round and K2I18 (Rice University) in round two to advance to the finals. According to team captain Mike Showerman, this year’s team hopes to go all the way to the championship: “Coding Illini will be even fiercer this year and will take every opportunity to bring the title home.”

The Coding Illini will represent the National Center for Supercomputing Applications (NCSA) and the University of Illinois at Urbana–Champaign.

Similar to the inaugural PUCC held at SC13, the 2014 challenge will include an entertaining supercomputing trivia round followed by a parallel computing code challenge live and on stage in the Intel booth at SC14 in New Orleans. Teams from around the globe are expected to take part in the challenge again this year and may submit a PUCC interest form to express their desire to participate.

 

The 2013 Coding Illini included (left to right): Omar Padron (NCSA research programmer and a member of the Science and Engineering Applications Support team), Mike Showerman (team captain), Xiang Ni (computer science PhD student), Nikhil Jain (computer science PhD student), and Andriy Kot (post-doctoral research associate at NCSA).

Read more >

In search of the best way to train future security experts: notes from the 2014 Intel Developers Forum (IDF ’14)

By Claire Vishik, Intel’s Trust & Security Technology & Policy Director, and Scott Buck, Intel’s University Program Manager. IDF ’14 was about future technologies – powerful computing devices, the Internet of Things and mobile devices, and how computing is going to affect the everyday lives of … Read more >

The post In search of the best way to train future security experts: notes from the 2014 Intel Developers Forum (IDF ’14) appeared first on Policy@Intel.

Read more >

Taking Your Photography to the Next Level with Intel RealSense

Amateur and professional photographers alike have something new to be ecstatic about! Last week at the Intel Developer Forum, CEO Brian Krzanich talked about Intel® RealSense™ snapshot, an intelligent camera system for computing devices that captures depth information of … Read more >

The post Taking Your Photography to the Next Level with Intel RealSense appeared first on Technology@Intel.

Read more >

If 802.11ac Wi-Fi Is In Your iPhone, Shouldn’t It Also Be In Your Laptop?

Following in the steps of other major mobile device manufacturers, Apple has finally included 802.11ac wireless technology in the iPhone 6. Faster wireless connectivity is always a good thing, but if your phone’s Wi-Fi is faster than your laptop’s, you … Read more >

The post If 802.11ac Wi-Fi Is In Your iPhone, Shouldn’t It Also Be In Your Laptop? appeared first on Technology@Intel.

Read more >

The SDI Data Center of the Future Is Here… Now Let’s Distribute It More Evenly

The science-fiction writer William Gibson once observed, “The future is already here — it’s just not very evenly distributed.” The same could be said of today’s data centers.

 

On one hand, we have amazing new data centers being built by cloud service providers and the powerhouses of search, ecommerce and social media. These hyperscale data center operators are poised to deploy new services in minutes and quickly scale up to handle enormous compute demands. They are living in the future.

 

On the other hand, we have enterprises that are living with the data center architectures of an earlier era, a time when every application required its own dedicated stack of manually provisioned resources. These traditional enterprise data centers were built with a focus on stability rather than agility, scalability and efficiency—the things that drive cloud data centers.

 

Today, the weaknesses of legacy approaches are a growing source of pain for enterprises. While cloud providers enjoy the benefits that come with pooled and shared resources, traditional enterprises wrestle with siloed architectures that are resistant to change.

 

But there’s good news on the horizon. Today, advances in data center technologies and the rise of more standardized cloud services are allowing enterprise IT organizations to move toward a more agile future based on software-defined infrastructure (SDI) and hybrid clouds.

 

With SDI and the hybrid cloud approach, enterprise IT can now be managed independently of where the physical hardware resides. This fundamental transformation of the data center will enable enterprises to achieve the on-demand agility and operational efficiencies that have long belonged to large cloud service providers.

 

At Intel, we are working actively to deliver the technologies that will allow data centers to move seamlessly into the era of SDI and hybrid clouds. Here’s one example: The new Intel® Xeon® Processor E5 v3 family exposes a wide range of information on hardware attributes—such as security, power, thermals, trust and utilization—to the orchestration layer. With access to this information, the orchestration engine can make informed decisions on the best placement for workloads within a software-defined or cloud environment.
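As a thought experiment, here is a minimal Python sketch of what such telemetry-informed placement can look like; the host attributes, field names, and scoring rule are hypothetical illustrations of the idea, not the actual orchestration interface:

    # Score candidate hosts on exposed platform attributes and pick the
    # best fit for a workload.
    from dataclasses import dataclass

    @dataclass
    class Host:
        name: str
        utilization: float    # 0.0-1.0, from platform telemetry
        watts_headroom: int   # remaining power budget, in watts
        trusted_boot: bool    # platform trust attestation

    def place(needs_trust: bool, hosts: list) -> Host:
        """Pick the least-utilized host that meets the workload's constraints."""
        candidates = [h for h in hosts
                      if h.watts_headroom > 0 and (h.trusted_boot or not needs_trust)]
        return min(candidates, key=lambda h: h.utilization)

    fleet = [Host("rack1-n1", 0.72, 40, True),
             Host("rack1-n2", 0.31, 15, True),
             Host("rack2-n1", 0.18, 90, False)]
    print(place(needs_trust=True, hosts=fleet).name)  # -> rack1-n2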

 

And here’s another of many potential examples: The new Intel Xeon processors incorporate a Cache QoS Monitoring feature. This innovation helps system administrators gain the utilization insights they need to ward off resource-contention issues in cloud environments. Specifically, Cache QoS Monitoring identifies “noisy neighbors,” or virtual machines that consume a large amount of the shared resources within a system and cause the performance of other VMs to suffer.
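In spirit, noisy-neighbor detection boils down to comparing each VM’s share of the shared cache against a threshold. Here is a toy Python sketch, assuming hypothetical per-VM cache-occupancy readings; on real systems these readings come from the hardware monitoring interface, surfaced through tooling built on Cache QoS Monitoring:

    # Flag VMs that hold more than a threshold share of the shared cache.
    occupancy_kb = {"vm-web": 2_048, "vm-db": 18_432, "vm-batch": 3_072}

    def noisy_neighbors(samples, share_threshold=0.5):
        """Return the VMs whose cache occupancy exceeds the given share."""
        total = sum(samples.values())
        return [vm for vm, kb in samples.items() if kb / total > share_threshold]

    print(noisy_neighbors(occupancy_kb))  # -> ['vm-db']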

 

And that’s just the start. If space allowed, we could walk through a long list of examples of Intel technologies that are helping enterprise IT organizations move toward software-defined data centers and take advantage of hybrid cloud approaches.

 

This transformation, of course, takes more than new technologies. Bringing SDI and hybrid clouds to the enterprise requires extensive collaboration among technology vendors, cloud service providers and enterprises. With that thought in mind, Intel is working to enable a broad set of ecosystem players, both commercial and open source, to make the SDI vision real.

 

One of the key mechanisms for bringing this vast ecosystem together is the Open Data Center Alliance (ODCA), which is working to shape the future of cloud computing around open, interoperable standards. With more than 300 member companies spanning multiple continents and industries, the ODCA is uniquely positioned to drive the shift to SDI and seamless, secure cloud computing. There is no equivalent organization on the planet that can offer the value and engagement opportunity of ODCA.

 

Intel has been a part of the ODCA from the beginning. As an ODCA technology advisor, we gathered valuable inputs from the ecosystem regarding challenges, usage models and value propositions. And now we are pleased to move from an advisory role to that of a board member. In this new role, we will continue to work actively to advance the ODCA vision.

 

Our work with the ecosystem doesn’t stop there. Among other efforts, we’re collaborating on the development of Redfish, a specification for data center and systems management that delivers comprehensive functionality, scalability and security. The Redfish effort is focused on driving interoperability across multiple server environments and simplifying management, to allow administrators to speak one language and be more productive.
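To show what that “one language” looks like in practice, here is a minimal Python sketch that walks a Redfish service’s standard JSON model, assuming the requests library, a reachable management endpoint, and placeholder credentials; endpoint details vary by implementation:

    # Walk the Redfish service root to list systems and their power state.
    import requests

    BASE = "https://bmc.example.com"  # hypothetical management endpoint
    AUTH = ("admin", "password")      # placeholder credentials

    root = requests.get(BASE + "/redfish/v1/", auth=AUTH, verify=False).json()
    systems_uri = root["Systems"]["@odata.id"]  # e.g. /redfish/v1/Systems

    systems = requests.get(BASE + systems_uri, auth=AUTH, verify=False).json()
    for member in systems["Members"]:
        system = requests.get(BASE + member["@odata.id"], auth=AUTH, verify=False).json()
        # The same JSON model works across vendors: that is the
        # interoperability point made above.
        print(system.get("Name"), system.get("PowerState"))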

 

Efforts like this push us ever closer to next-generation data centers — and a future that is more evenly distributed.

 

 

For more follow me @PoulinPDX on Twitter.

Read more >

Part 1: The Changing Life of Modern Pharmaceutical Sales Professionals

Below is the second in a series of guest blogs from Dr. Peter J. Shaw, chief medical officer at QPharma Inc. Watch for additional posts from Dr. Shaw in the coming months.

 

With all the recent advances in tablet technology, the way pharmaceutical sales professionals interact with health care providers (HCPs), and in particular doctors, has changed. Most pharmaceutical companies now provide their sales teams with touch screen tablets as their main platform for information delivery. The day of paper sales aids, clinical reprints and marketing materials is rapidly fading. The fact is that doctors have less time to see sales professionals during their working day, and many institutions place increasing restrictions on access to doctors. The pharmaceutical industry therefore has to be ever more inventive and flexible in the way it approaches doctors and conveys the information they need to keep up-to-date on pharmaceutical, biotech and medical device advances.

 

  • How has this impacted the life of the sales professional?
  • How have pharmaceutical companies adapted to the changes?
  • To what extent has the use of mobile devices been adopted?
  • What impact has this had on the quality of the interaction with HCPs?
  • What are alternatives to the face-to-face doctor visit?
  • How have doctors received the new way of detailing using mobile technology?
  • What do doctors like/dislike about being detailed with a mobile device?
  • What does the future look like?
  • Are there any disadvantages to relying solely on mobile technology?

 

To answer some of these questions, and hopefully to generate a lively discussion on the future of mobile technology in the pharmaceutical sales world, I would like to share some facts and figures from recent research we conducted on the proficiency of sales reps using mobile devices in their interactions with HCPs, and the impact this has had on clinical and prescribing behaviors.

 

  • In tracking the use of mobile devices over the last three years, it is clear that their use by sales professionals varies widely.
  • Where sales reps have only the mobile device, they use it in just 7 to 35 percent of interactions with HCPs.
  • The use of mobile devices increases with the duration of the interaction with HCPs: the device is used in almost all calls lasting over 15-20 minutes.
  • Many reps do not use mobile devices in calls under 5 minutes. Often this is due to the non-interactive nature of the content, or the awkwardness of navigating through multiple required screens before arriving at information relevant to that particular HCP.
  • We have data to show that where the mobile device is very interactive and the sales rep is able to use it to open every call, the call with the doctor will be on average 5-7 minutes longer than if the device is not used.
  • In cases where doctors will take virtual sales calls, these calls are greatly enhanced if there is a two-way visual component. Any device used in virtual sales calls must have two-way video capability, as the HCP will expect to see something to back up the verbal content of the sales call.
  • Most doctors feel that the use of mobile technology in face-to-face calls enhances the interaction with sales reps, provided it is used as a means to visually back up the verbal communication in an efficient and direct manner.
  • Screen size is the main complaint we hear from HCPs. Most say that where the rep is presenting to more than one HCP, the screen needs to be bigger than the 10” on most of the devices reps currently use.

 

The mobile device is clearly here to stay. HCPs use these devices in their day-to-day clinical practice and now accept that sales professionals will use them as well. Where the mobile device is expected to be the sole means of information delivery, more work needs to go into designing the content and making it possible for the sales professional to navigate to the information that is relevant to that particular HCP. All aspects of the sales call need to be on the one device: information delivery, signature capture and validation for sample requests, and the ability to email clinical reprints immediately to the HCP are just the start.

 

In part 2, we will look at how sales reps are using mobile devices effectively and the lessons to be learned from three years of data tracking the use of these devices and the increasing acceptance of virtual sales calls.

 

What questions do you have?

 

Dr. Peter J. Shaw is chief medical officer at QPharma Inc. He has 25 years of experience in clinical medicine in a variety of specialties, 20 years’ experience in product launches and pharmaceutical sales training and assessment, and 10 years’ experience in post-graduate education.

Read more >

Delivering the Building Blocks for the Software-Defined Infrastructure at VMworld 2014

VMworld San Francisco is now in the rear view mirror, but what an excellent show it was—an opportunity to learn the latest through thought-provoking sessions and keynotes, and a chance to meet old and new friends. The buzz was all about the unprecedented level of ecosystem support for the software-defined infrastructure, and how it can serve as the foundation for an evolutionary path to hybrid cloud. The show also generated a lot of news: VMware, Intel and industry partners announced a series of technological breakthroughs that will make it easier—and more secure—than ever for businesses to achieve the benefits of virtualization and the hybrid cloud.

 

One of the big announcements at the show was the unveiling of VMware* EVO:RAIL, a turnkey, hyper-converged appliance that offers key VMware virtualization, compute, networking and storage software preinstalled on four Intel Xeon processor E5-2620 v2 nodes with 13 terabytes of flash-accelerated storage on Intel SSD S3700 drives. Because it is delivered as a pre-configured, ready-to-mount unit, EVO:RAIL makes it simpler for IT organizations to add infrastructure without complex integration work. However, the real promise of EVO:RAIL technology is as a building block for the most advanced forms of virtualization, leading toward a software-defined infrastructure. EVO:RAIL can help organizations quickly and affordably extend their data center into the cloud without the expense, complexity and time commitment of “build-your-own” alternatives.

 

Intel also announced a new security controller that makes it easier for businesses to protect data within software-defined environments. The Intel Security Controller, released by Intel® Security, integrates with existing security applications to virtualize security services and synchronize policy and enforcement across both physical and virtual systems within environments using VMware* NSX software. This allows security admins to use existing security management applications to span policies across their physical and virtual security infrastructures, boosting ROI and security for virtual workloads.

 

Though SAP had previously announced the general availability of SAP* HANA virtualization with VMware* vSphere 5.5 in production environments, the first chance most of us had to see it in action was at VMworld. Intel, SAP and VMware hosted a booth to demonstrate the technology and show how virtualization, in-memory computing and high-performance Intel® Xeon® E7 processors combine to extend SAP HANA to the cloud. With SAP HANA ready for the data center, the potential for real-time business analytics and an in-memory software-defined infrastructure just got more real.

 

 

Follow Tim Allen on Twitter at @TimIntel.

Read more >