Utilities are facing an increasingly complex power delivery environment, with the need to incorporate renewables generation, electric vehicles, and complex demand response programs into their service area operations. In response, many are expanding their smart grid strategies to take advantage …
In today’s digital world, consumers face a barrage of online phishing attacks, new forms of nasty malware, and the risk of virus-infected desktops like never before. Unfortunately, cyber criminals do not discriminate, and it’s very easy to fall victim to their scams.
But what if you could rest easy at night knowing all of your pictures, videos, and personal files are securely stored on a high-capacity, always-available desktop PC that stays safely in your home? [i],[ii] Here are a few ways that Intel Security is making this possible.
Built-In Protection for Stronger Security
At their core, Intel-based desktops build security in from the silicon up to help safeguard your files, online transactions, data, and identity on a device that can reside securely in your home. Desktop PCs running 6th gen Intel Core processors feature hardware-based technologies that protect against a wide range of malware attacks and exploits—and help keep your system and data free from hacking, viruses, and prying eyes.
As an added layer of support, the hardware-based security capabilities of Intel Identity Protection Technology can be found on more than 500 million PCs[iii] to support trusted device authentication. Now you can enjoy amazing computing experiences and more control over your personal content and information without worrying about the next Trojan horse.
Creating one strong password that you can remember is hard enough, but doing it for every single online account is almost impossible—until now. Many people use the same password everywhere, so it doesn’t take a skilled hacker to break into an account, just a good guesser.
“More than 90 percent of passwords today are weak, predictable, and ultimately crackable,” says Dave Singh, product marketer, Intel Client Computing Group. “What we’re trying to do is help consumers develop good security habits when they’re browsing and shopping online, and password managers make this very convenient by decreasing frustration to provide a better user experience on their PCs.”
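To see why so many passwords are predictable and crackable, consider a rough back-of-the-envelope entropy estimate. This is a simplified illustration only (not a model used by Intel or McAfee), and it overestimates the strength of dictionary words:

```python
import math
import string

def estimate_entropy_bits(password: str) -> float:
    """Rough entropy estimate: character-pool size raised to the length,
    expressed in bits. A simplistic model for illustration only."""
    pool = 0
    if any(c.islower() for c in password):
        pool += 26
    if any(c.isupper() for c in password):
        pool += 26
    if any(c.isdigit() for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)
    return len(password) * math.log2(pool) if pool else 0.0

# A short, all-lowercase password has far less entropy
# than a long one mixing character classes.
print(estimate_entropy_bits("sunshine"))                # roughly 38 bits
print(estimate_entropy_bits("C0rrect-Horse-Battery!"))  # well over 100 bits
```

Even with this crude model, the gap between a reused eight-letter word and a long mixed-character passphrase is dramatic, which is exactly the habit gap password managers help close.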
As one example, True Key comes preloaded on most Intel-based desktop PCs with McAfee LiveSafe software. Users can sync their data across Windows, Mac, Android, and iOS devices and import passwords from all browsers and competitors. Advanced multi-factor authentication (MFA) and biometric security make it easy to sign into any account. Choose at least two different factors (e.g., trusted device, face, email, master password, numeric pin, or fingerprint) and the app will verify your identity. For additional security, you can add more factors and make your profile even stronger. Basically, True Key can recognize you and sign you in—eliminating the need for passwords altogether.
“With so many different ways to log in and get to your personal content and information, password managers can really help increase productivity by saving time and headaches,” adds Singh.
Imagine being able to walk up to your PC and have one central app manage your mobile wallet, healthcare account, or hotel membership profile. You can now book travel, buy and ship gifts, or upload photos to the cloud more conveniently, all while staying better protected against malware.
Some password managers can also store wallet items—credit cards, addresses, memberships—and make it easy to “tap and pay” at checkout for secure online payments and transactions. Intel technologies feature fast, end-to-end data encryption to keep your information safe without slowing you down, with built-in hardware authentication to provide seamless protection for online transactions.
“Your high-capacity, always-available desktop can stay safely at your home with all your locally stored files, but you can securely access the information from other devices, including your smartphone,” Singh says.
“Paired with new Windows 10 sign-in options like Windows Hello, desktop computing is truly becoming more personal and secure. It really shows how digital security is advancing to work better together for the best home computing experience.”
So the next time you log into your home desktop PC, you can do it with a smile. Download the Flash Card for more tips on how to safeguard your digital security.
[i] Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No computer system can be absolutely secure. Check with your system manufacturer or retailer or learn more at intel.com.
[ii] Requires an Intel® Ready Mode Technology-enabled system or motherboard, a genuine Intel® processor, Windows* 7, Windows 8.1, or Windows 10 OS. Results dependent upon hardware, applications installed, Internet connectivity, setup and configuration.
[iii] True Key™ by Intel Security. Security White Paper 1.0. https://b.tkassets.com/shared/TrueKey-SecurityWhitePaper-v1.0-EN.pdf
By John Kincaide, Privacy and Security Policy Attorney at Intel. On May 11, 2016, the US Senate Judiciary Subcommittee on Privacy, Technology and the Law held a hearing on “Examining the Proposed FCC Privacy Rules”. The objective of the hearing …
The post US Senate Judiciary Committee Holds Hearings on the FCC’s Proposed Privacy Rules appeared first on Policy@Intel.
Intel Internet of Things (IoT) technology drove robust smart and connected retail solutions and interactive brand experiences at JDA FOCUS in Nashville this month. JDA’s annual global conference brings together more than 2,000 retail and manufacturing professionals to network, share …
The post Intel IoT Tech Drives Enhanced Store Intelligence Solutions at JDA Focus appeared first on IoT@Intel.
By James Hsu, Director of Technical Marketing at Citrix
One of the great experiences in our industry is to see products from different vendors—hardware and software—come together to solve real customer problems. That’s what’s been happening with Citrix and Intel for the last two years as we worked together to apply Intel Graphics Virtualization Technology (Intel GVT) to the Citrix XenServer virtualization platform. The result of that effort is Citrix XenServer 7.0, which we are announcing at Citrix Synergy 2016 in Las Vegas. It’s the first commercial hypervisor product to leverage Intel GVT-g, Intel’s virtual graphics processing unit that can power multiple VMs with one physical GPU. As well as announcing XenServer 7.0, Citrix is also announcing XenDesktop 7.9 offering industry-leading remote graphics delivery supported by Intel. Let me tell you what that does for users running graphics-intensive virtualized desktop applications, and then I’ll tell you how we used Intel GVT-g to do it.
Citrix XenApp and XenDesktop let you deliver virtualized desktops and applications hosted on a server to remote workstations. Many desktop applications—like computer-aided design and manufacturing apps and even accelerated Microsoft Office—require the high-performance graphics capabilities of a graphics processing unit (GPU). In XenDesktop 7.9, Citrix also added support for Intel Iris Pro graphics in the HDX 3D Pro remote display protocol.
Earlier versions of XenServer enabled Intel GPU capabilities on virtualized desktops in a pass-through mode that allocated the GPU to a single workstation. Now, XenServer 7.0 expands our customers’ options by using Intel GVT-g to virtualize access to the Intel Iris Pro Graphics GPU integrated onto select Intel Xeon processor E3 family products, allowing it to be shared by as many as seven virtual workstations.
With Intel GVT-g, each virtual desktop machine has its own copy of Intel’s native graphics driver, and the hypervisor directly assigns the full GPU resource to each virtual machine on a time-sliced basis. During its time slice, each virtual machine gets a dedicated GPU, but the overall effect is that a number of virtual machines share a single GPU. It’s an ideal solution in applications where high-end graphics are required but shared access is sufficient to meet needs. Using the Intel Xeon processor E3 family, small single-socket servers can pack a big graphics punch. It’s an efficient, compact design that enables a new scale-out approach to virtual application delivery. And it’s a cost-effective alternative to high-end workstations and servers with add-on GPU cards.
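The time-slicing idea can be sketched as a simple round-robin scheduler. This is an illustrative model only, not the actual GVT-g scheduler; the quantum length is an arbitrary assumption:

```python
from collections import deque

def gpu_schedule(vms, quantum_ms, total_ms):
    """Round-robin time slicing: during its quantum, each VM owns the
    full GPU; over time, all VMs share the one physical device."""
    queue = deque(vms)
    timeline = []
    elapsed = 0
    while elapsed < total_ms:
        vm = queue[0]
        queue.rotate(-1)          # move current VM to the back of the line
        timeline.append((elapsed, vm))
        elapsed += quantum_ms
    return timeline

# Seven virtual workstations sharing one physical GPU.
vms = [f"vm{i}" for i in range(1, 8)]
for start, vm in gpu_schedule(vms, quantum_ms=16, total_ms=16 * 7):
    print(f"{start:3d} ms -> {vm} owns the GPU")
```

Each VM sees a dedicated GPU during its slice, which is why a native graphics driver can run unmodified inside the guest.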
The advantages go beyond just cost efficiency. Providing shared access by remote users to server-based data and applications enhances worker productivity and improves collaboration. It also tightens security and enables compliance, because critical intellectual property, financial data, and customer information stays in the data center rather than drifting out to individual workstations and mobile devices. And security is further enhanced, because Intel Xeon processors contain Intel Trusted Execution Technology (Intel TXT) to let you create trusted computing pools. Intel TXT attests to the integrity and trust of the platform, assures nothing has been tampered with, and verifies that the platform is running the authorized versions of firmware and software when booting up.
At Citrix, our goal is to provide our customers with the computing experience they need to innovate and be productive—on a range of platforms and usage models and in a way that enhances the security of their business. And we want to give them the flexibility to access the computing resources they need anywhere, any time, and from any device. Our collaboration with Intel has let us deliver on that promise, and it lets us provide even more options for platform choice and deployment configurations. It’s been a great experience for us, and now it will enable a great experience for our mutual customers.
The shift from fee-for-service to fee-for-performance is changing the conversation around patient care. Reducing readmissions is one benchmark for analyzing the quality of care, and more discussion is happening around bringing telehealth into the mix to improve this metric.
Traditionally, when patients leave the clinical setting, interaction between the care team and the patient decreases. With telehealth and remote patient monitoring, technology allows the provider team to remain in contact with the patient to follow up on regimens and make sure instructions are followed. The result can be a shift in outcomes for the better.
To learn more about telehealth, we sat down with Fadesola Adetosoye from Dell Healthcare Services, who says telehealth allows patients to overcome challenges, like transportation issues, to obtain better primary care and stay in touch with clinicians following discharge.
Watch the video above and let us know what questions you have about telehealth. Is your organization using a telehealth strategy?
Developers – your HPC Ninja Platform is here! HPC developers worldwide have begun to participate in the Developer Access Program (DAP) – a bootstrap effort for early access to code development and optimization on the next generation Intel Xeon Phi processor. A key part of the program is the Ninja Developer Platform.
Several supercomputing-class systems are currently powered by the Intel Xeon Phi processor (code name Knights Landing, or KNL), a powerful many-core, highly parallel processor. KNL delivers massive thread parallelism, data parallelism, and memory bandwidth with improved single-thread performance and Intel Xeon processor binary compatibility in a standard CPU form factor.
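The data-parallel style KNL is built for can be illustrated in miniature with an ordinary multiprocessing sketch. This is a pattern illustration only; real KNL codes would use vectorized, threaded native code rather than Python:

```python
# Pattern sketch of data parallelism: split an array computation into
# chunks and run the chunks concurrently across workers.
from multiprocessing import Pool

def saxpy_chunk(args):
    """Compute a * x + y over one chunk of the data."""
    a, xs, ys = args
    return [a * x + y for x, y in zip(xs, ys)]

def parallel_saxpy(a, xs, ys, workers=4):
    step = max(1, len(xs) // workers)
    chunks = [(a, xs[i:i + step], ys[i:i + step])
              for i in range(0, len(xs), step)]
    with Pool(workers) as pool:
        results = pool.map(saxpy_chunk, chunks)  # chunks run in parallel
    return [v for chunk in results for v in chunk]

if __name__ == "__main__":
    print(parallel_saxpy(2.0, list(range(8)), [10] * 8))
```

The same divide-compute-combine structure, applied with wide vector units and dozens of hardware threads, is what the highly parallel codes mentioned above look like on KNL.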
In anticipation of KNL’s general availability, we, along with our partners, are bringing to market a developer access program that provides an ideal platform for code developers. Colfax, a valued Intel partner, is handling the program, which is already underway.
The Ninja Platform
Think of the Ninja Developer Platform as a stand-alone box that has a single bootable next-generation Intel Xeon Phi processor. Developers can start kicking the tires and getting a feel for the processor’s capabilities. They can begin developing the highly parallel codes needed to optimize existing and new applications.
As part of Intel’s Developer Access Program, the Ninja platform has everything you need in the way of hardware, software, tools, education, and support. It comes fully configured with memory, local storage, and CentOS 7.2, and includes a one-year license for Intel Parallel Studio XE tools and libraries. You can get to work immediately, whether you’re experienced with previous generations of Intel Xeon Phi coprocessors or new to the Intel Xeon Phi processor family.
Colfax has pulled out all the stops in designing the education and support resources including white papers, webinars, and how-to and optimization guides. Currently underway are a series of KNL webinars and hands-on workshops – see details at http://dap.xeonphi.com/#trg
Here is a quick look at the two platform options that are being offered by the Developer Access Program – both are customizable to meet your application needs.
Given the richness of the technology and the tools being offered along with the training and support resources, developers should find the process of transitioning to the latest Intel Xeon Phi processor greatly accelerated.
The Ninja Development Platform is particularly well suited to meet the needs of code developers in such disciplines as academia, engineering, physics, big data analytics, modeling and simulation, visualization and a wide variety of scientific applications.
The platform will cost ~$5,000 USD for the single node pedestal server with additional costs for customization. On the horizon is our effort to take this program global with Colfax and partners. Stay tuned for details in my next blog.
You can pre-order the Ninja Developer Platform now at http://www.xeonphideveloper.com.
By Kent Landfield, Director of Standards and Technology Policy, Intel. The Information Sharing and Analysis Organization Standards Organization (ISAO SO) held its Third Public Forum on May 18-19 in Anaheim, California. More than 100 participants from academia, government and industry …
Graphics virtualization and design collaboration took a step forward this week with the announcement of support for Intel Graphics Virtualization Technology-g (Intel® GVT-g) on the Citrix XenServer* platform.
Intel GVT-g running on the current generation graphics-enabled Intel Xeon processor E3 family, and future generations of Intel Xeon® processors with integrated graphics capabilities, will enable up to seven Citrix users to share a single GPU without significant performance penalties. This new support for Intel GVT-g in the Citrix virtualization environment was unveiled this week at the Citrix Synergy conference in Las Vegas.
A little bit of background on the technology: With Intel GVT-g, a virtual GPU instance is maintained for each virtual machine, with a share of performance-critical resources directly assigned to each VM. Running a native graphics driver inside a VM, without hypervisor intervention in performance-critical paths, optimizes the end-user experience in terms of features, performance and sharing capabilities.
All of this means that multiple users who need to work with and share design files can now collaborate more easily on the XenServer integrated virtualization platform, while gaining the economies that come with sharing a single system and benefiting from the security of working from a trusted compute pool enabled by Intel Trusted Execution Technology (Intel® TXT).
Intel GVT-g is an ideal solution for users who need access to GPU resources to work with graphically oriented applications but don’t require a dedicated GPU system. These users might be anyone from sales reps and product managers to engineers and component designers. With Intel GVT-g on the Citrix virtualization platform, each user has access to separate OSs and apps while sharing a single processor – a cost-effective solution that increases platform flexibility.
Behind this story is a close collaboration among Intel, Citrix, and the Xen open source community to develop and refine a software-based approach to virtualization in an Intel GPU and XenServer environment. It took a lot of people working together to get us to this point.
And now we’ve arrived at our destination. With the combination of Intel GVT-g, Intel Xeon processor-based servers with Intel Iris Pro Graphics, and Citrix XenServer, anywhere, anytime design collaboration just got a lot easier.
One of the most rewarding aspects of my work at Intel is seeing the new capabilities built in to Intel silicon that are then brought to life on an ISV partner’s product. It is this synergy between Intel and partner technologies where I see the industry and customers really benefit.
Two of the newer examples of this kind of synergy are made possible with Citrix XenServer 7.0—Supervisor Mode Access Prevention (SMAP) and Page Modification Logging (PML). Both capabilities are built in to the Intel Xeon processor E5 v4 family, but can only benefit customers when a server-virtualization platform is engineered to use them. Citrix XenServer 7.0 is one of the first server-virtualization platforms to do that with SMAP and PML.
Enhancing Security with Supervisor Mode Access Prevention (SMAP)
SMAP is not new in and of itself; Intel introduced SMAP for Linux on 3rd generation Xeon processors. SMAP is new to virtualization, though. Intel added SMAP code to the Xen hypervisor in the Xen Project, Citrix then worked with the code in Xen, and XenServer 7.0 makes SMAP a reality for server virtualization.
Figure 1: SMAP prevents the hypervisor from accessing the guests’ memory space other than when needed for a specific function
SMAP helps prevent malware from diverting operating-system access to malware-controlled user data, which helps enhance security in virtualized server environments. SMAP aligns with the Intel and Citrix partnership where Intel and Citrix regularly collaborate to help make a seamless, secure mobile-workspace experience a reality.
Improving Performance with Page Modification Logging (PML)
PML improves performance during live migrations between virtual server hosts. As with SMAP, PML capabilities are built in to the Intel Xeon processor E5 v4 family, and XenServer 7.0 is one of the first server-virtualization platforms to actually enable PML in a virtualized server environment.
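Conceptually, PML replaces repeated full scans of guest memory with a hardware-maintained log of just the pages that were modified. Here is a toy model of that idea; the class and method names are illustrative, not the actual hardware interface:

```python
# Conceptual sketch: during live migration, the hypervisor must re-copy
# pages the guest dirtied. With a modification log, it reads only the
# logged pages instead of scanning every page for changes.
class DirtyPageLog:
    """Toy model of a page-modification log (names are illustrative)."""
    def __init__(self):
        self.log = []

    def record_write(self, page_number):
        self.log.append(page_number)  # in PML, hardware appends on each write

    def drain(self):
        """Return and clear the set of pages that must be re-copied."""
        dirty = sorted(set(self.log))
        self.log.clear()
        return dirty

pml = DirtyPageLog()
for page in [42, 7, 42, 1024]:   # guest writes, some pages touched twice
    pml.record_write(page)
print(pml.drain())   # [7, 42, 1024]
print(pml.drain())   # [] -- nothing dirtied since the last pass
```

The CPU cycles the hypervisor would otherwise spend tracking writes are freed up for guest workloads, which is where the performance gain comes from.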
Figure 2: With PML, CPU cycles previously used to track guest memory-page writes during live migration are available for guest use instead
I haven’t gone into detail on SMAP or PML or how they work. Instead, I invite you to read about them and how they add to the already strong XenServer virtualization platform and Intel Xeon processor E5 family in the Intel and Citrix solution brief, “New Capabilities with Citrix XenServer and the Intel Xeon Processor E5 v4 Family.” I also invite you to follow me and my growing #TechTim community on Twitter: @TimIntel.
By Steve Sieron, Senior Alliance Marketing Manager at Citrix
Intel will be highly visible next week at Synergy as a Platinum Sponsor. They’ll be featuring a number of new solutions that showcase the broad technical, product, and marketing partnership with Citrix across networking, cloud, security, and graphics virtualization. And there’ll be an array of innovative Intel-based endpoint devices running XenApp and XenDesktop across Windows 10, Linux, and Chrome OS.
You won’t want to miss SYN121 on Wednesday May 25 from 4:30-5:15pm PDT in Murano 3204 for “Mobilize your Design Workforce: Delivering Graphical Applications on Both Private and Public Clouds.” This informative panel, hosted by Jim Blakley, Intel GM Visual Cloud Computing, will feature graphics industry experts, including Thomas Poppelgaard, Jason Dacanay from Gensler, Adam Jull from IMSCAD, and Citrix’s own “Mr. HDX,” Derek Thorslund.
Be sure to take advantage of Intel’s Ask the Experts Bar and daily tech talks, where you can network with a variety of industry experts. The tech talks will feature customers and industry experts along with Intel and Citrix product owners. Intel health care implementations will also be featured in customer presentations at the Citrix Booth Theatre from both LifeSpan and Allegro Pediatrics.
Visit these Interactive Demos and More in Intel Booth #870
Enhancing Netscaler Security and Performance with Intel Inside. Showcasing performance scaling and new security enhancements on Intel® Xeon® Processor based Netscaler MPX and SDX product families.
Intel® Solid State Drives (SSD) Enable a Secure Client. New endpoint security, storage technologies and capabilities with Citrix core product solutions.
Scaling XenDesktop with Atlantis USX and Intel SSD. Featuring Atlantis USX as a storage layer with Intel SSDs for XenDesktop. Offering a robust performance architecture and high density with lower implementation costs and ongoing maintenance OPEX compared to traditional VDI Solutions.
Intel® Graphics Virtualization on Citrix (Intel® GVT). Learn about the new Intel Xeon Processor E3 family with Intel® Iris™ Pro Graphics in the cloud and new graphics virtualization technologies and solutions powered by Citrix from leading OEM partners. Interact with ISV-certified rich and brilliant 3D apps on the Intel remote cloud and learn how integrated graphics offer a compelling alternative to add-in graphics cards. The technologies highlighted will include Intel GVT-d – direct deployment of Intel processor graphics running 3D apps and media as well as Intel GVT-g – shared deployment in a cloud-based environment, hosted remotely in a data center running Citrix on latest-gen Intel Xeon processor servers.
Intel Ecosystem Enables Citrix Across Synergy16
Of course, the broader Intel ecosystem will be on full display at Synergy, including the latest HP Moonshot m710 Series and Cisco M-Series offerings. These tools bring unmatched levels of price, performance and density in delivering graphics and rich apps to a wide range of professional users requiring access to apps with ever-increasing graphics capabilities. There will also be a broad array of Intel Xeon-based Netscalers running in the IBM Softlayer Cloud and across booths and learning labs throughout the event. Explore exciting Intel-based Storage solutions on Citrix with new offerings from partners such as Nutanix, Pure Storage and Atlantis. As always, Intel end points will be ubiquitous throughout Synergy and featured in many sponsor pavilions, including HPE, Google, Dell and Samsung.
Beyond being a technology leader and strategic partner, Intel will be supplying Intel Arduino boards for the Simply Serve program at Synergy, promoting STEM programs for Title 1 middle school students. A big thanks to Intel on behalf of both Citrix and the Southern Nevada United Way!
Citrix is pleased to welcome Intel to Synergy 2016. We encourage all attendees to stop by Booth #870 to meet the Intel team, watch customer presentations at the Intel Theatre and interact with innovative technology demos. Don’t forget to pull up your Synergy Mobile App to mark your calendar for SYN121, the Industry Expert Graphics Panel on Wed May 25 at 4:30pm in Murano 3204.
The Limitations of Security Data
We are constantly being bombarded by cybersecurity data, reports, and marketing collateral—and not all of this information should be treated equally. Security data inherently has limitations and biases, which result in varying value and relevance in how it should be applied. It is important to understand which is significant and how best to allow it to influence your decisions.
There is a tsunami of security metrics, reports, analyses, blogs, papers, and articles vying for attention. Sources range from reporters, researchers, professional security teams, consultants, dedicated marketing groups, and even security-operations people who are adding data, figures, and opinions to the cauldron. We are flooded with data and all those who have opinions on it.
It was not always this way. Over a decade ago, it was an information desert, where even speculations were rare. Making decisions driven by data has always been a good practice. Years ago, many advocates were working hard to convince the industry to share information. Even a drop is better than none. Most groups that were capturing metrics were too frightened or embarrassed to share. Data was kept secret by everyone while decision makers were clamoring for security insights based upon industry numbers, which simply were not available.
What Was the Result?
In the past, fear, uncertainty, and doubt ruled. People began to dread the worst, and unscrupulous security marketers took advantage, fanning the flames to sell products and snake oil. Those were dark times, rife with outlandish claims that software or appliance products could easily eradicate cyber threats. The market was riddled with magic boxes, silver-bullet software, and turnkey solutions promising to fix all security woes. I can remember countless salespeople asserting “we solve security” (at which point I stopped listening or kicked them out). The idea that you could flip a switch and make all the complex problems of compute security go away forever was what uninformed organizations wanted to hear, but it was simply unrealistic. Why customers chose to believe such nonsense, when neither the problem nor the effectiveness of potential solutions could be quantified, is beyond me, but many did. Trust in the security solutions industry was lost for a time.
Slowly, a trickle of informative sources began to produce reports and publish data. Such initiatives gained momentum with others joining in to share in limited amounts. It was a turning point. Armed with data and critical thinking, clarity and common sense began to take root. It was not perfect or quick, but the introduction of data from credible sources empowered security organizations to better understand the challenge and effective ways to maneuver against threats.
As the size of the market and competition grew, additional viewpoints joined the fray. Today, we are bombarded by all manner of cybersecurity information. Some of it is credible; some is not. There are several types of data being presented, ranging from speculation to hard research. Being well-informed is extremely valuable to decision makers. Now, the problem is figuring out how to filter and organize the data so one is not misled.
As part of my role as a cybersecurity strategist, I both publish information to the community and consume vast amounts of industry data. To manage the burden and avoid the risks of believing less-than-trustworthy information, I have a quick guide to help structure the process. It is burned into my mind as a set of filters and rules, but I am committing it to paper in order to share.
I categorize data into four buckets. These are: Speculation, Survey, Actuarial, and Research. Each has its pros and cons. The key to managing security data overload is to understand the limitations of each class, its respective value and its recommended usage.
For example, Survey data is the most unreliable, but does have value in understanding the fears and perceptions of the respondent community. Research data is normally very accurate but notoriously narrow in scope and may be late to the game. One of my favorites is Actuarial data. I am a pragmatic guy. I want to know what is actually happening so I can make my own conclusions. But there are limitations to Actuarial data as well. It tends to be very limited in size and scope, so you can’t look too far into it and it is a reflection of the past, which may not align to the future.
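For what it's worth, the four buckets and their suggested uses can be captured in a few lines. This is purely illustrative, a way of making the filtering structure concrete:

```python
# Illustrative only: tagging incoming security data with the four
# classes described above and each class's recommended usage.
from enum import Enum

class DataClass(Enum):
    SPECULATION = "best opinions"
    SURVEY = "pulse of industry perceptions"
    ACTUARIAL = "real events"
    RESEARCH = "deep analysis"

def recommended_use(kind: DataClass) -> str:
    """State what a given class of security data is good for."""
    return f"Use {kind.name.title()} data for: {kind.value}"

for kind in DataClass:
    print(recommended_use(kind))
```

Classifying a report before reading it forces you to apply the right discount: a survey tells you what people fear, not what actually happened.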
I hear lots of different complaints and criticisms when it comes to the validity, scope, intent, and usage of data. I personally have my favorites and those which I refuse to even read. Security data is notoriously difficult. There are so many limitations and biases that it is far easier to point out issues than to see the diamond in the rough. But data can be valuable if it is filtered, corrected for bias, and understood within its limitations. Don’t go in blind. Common sense must be applied. Have a consistent method and structure to avoid pitfalls and maximize the data available to help you manage and maintain an optimal level of security.
Below are a few examples, in my opinion, of credible cybersecurity data across the spectrum of different categories. Again keep in mind the limitations of each group and don’t make the mistake of using the information improperly! Look to Speculation for the best opinions, Survey for the pulse of industry perceptions, Actuarial for real events, and Research for deep analysis:
- 2016 Cybersecurity Threat Predictions from McAfee Labs
- $243 billion – $1 trillion. Potential cost of a single attack against the US Power Grid, per Lloyds Insurance
- ~$3 trillion. Aggregate economic impact of cybersecurity on technology trends through 2020, per the World Economic Forum 2014 report Risk and Responsibility in a Hyperconnected World
- $90 trillion. Cyber impact for one (worst-case) scenario affecting the global benefits of Information and Communications Technologies by 2030, per The Atlantic Council’s report estimate
- 55% CAGR. Growth of global IoT Security market 2016-2020, per researchandmarkets.com
- My 2016 Cybersecurity Predictions, in fact most of my blogs
- Threat Intelligence Sharing survey, McAfee Labs Threats Report, March 2016
- 20% jump in cybercrime in the UK since 2014 with nearly two-thirds of businesses expressing no confidence in the ability of law enforcement to deal with it, per PwC
- 25% of Americans believe they have experienced a data breach or cyber attack, per a Travelers survey
- 43% of organizations surveyed indicated increases in cybersecurity will drive the most technology spending. Source: 2016 ESG IT spending intentions research report
- 61% of CEOs believe cyber threats pose a danger to corporate growth, per a PwC survey
- 3 out of 5 Californians were victims of data breaches in 2015 according to the CA Attorney General in the 2016 California Data Breach Report
- ~35% of the US population. Top 10 Healthcare breaches of 2015, affected almost 35% of the US population. Source: Office of Civil Rights
- Data Breach Investigations Report (DBIR) annual report by Verizon
- 2016 Annual Security Report by Cisco
- 42 million new unique pieces of malware discovered in Q4 2015, bringing the total known samples to almost 500 million, per McAfee Labs Threat Report (March 2016, Malware section)
- Security Intelligence Report (SIR) bi-annual report by Microsoft
- $325M losses attributed to Cryptowall v3 ransomware, analysis from the Cyber Threat Alliance
- $13.1 billion. U.S. Government spends on cybersecurity in 2015. Source: FISMA report from OMB
- “Carbanak” advanced attack analysis by Kaspersky
By the way, yes, this very blog would be considered Speculation. Treat it as such.
By Lisa Malloy, director of Government Relations and Trade Policy for Intel. Trade agreements have long been a major contributor to improving U.S. competitiveness and economic growth. That’s certainly true for the Trans-Pacific Partnership (TPP), which will help U.S. businesses …
The post ITC Report Highlights TPP Benefits to the Digital Economy appeared first on Policy@Intel.
Big data analytics is one of the decade’s biggest buzz phrases. One place it can really deliver value for retailers is in data-driven decision making. Today’s retailers have a wealth of technology at their disposal to gather, analyze, and use …
The post Retailers Tap Into Data Analytics for Amazing Customer Experiences appeared first on IoT@Intel.
As the Internet of Things becomes a reality, Intel IoT is leading the industry in transforming and securing transactions. This was especially clear at Transact 2016 in Las Vegas, produced by the Electronic Transactions Association, the world’s largest payments industry …
The post Intel IoT Ecosystem Drives Transaction Innovation at Transact 2016 appeared first on IoT@Intel.
Mark Caulfield, FMedSci, is a chief scientist and board member at Genomics England, an organization which provides investment and leadership to increase genomic testing research and awareness. Caulfield is also the director of the William Harvey Research Institute and was elected to the Academy of Medical Sciences in 2008. His particular areas of research are Cardiovascular Genomics and Translational Cardiovascular Research and Pharmacology. We recently sat down with him to discuss genomic sequencing as well as insight into a current research project.
Intel: What is the most exciting project you’re working on right now?
Caulfield: The 100,000 Genomes Project is a healthcare transformation program that reads through the entire DNA code using whole genome sequencing. That’s 3.3 billion letters that make you the individual you are. It gives insight into what talents you have as well as what makes you susceptible to disease. My research is focused on infectious disease, rare inherited diseases, and cancer. Technology can bring answers that are usable in the health system now across our 13 centers.
When studying rare disease, the optimal unit is a mother, father and an affected offspring. The reason is that both parents allow the researcher to filter out rare variations that occur in the genetic code that are unrelated to the disease, focusing in on a precise group. This project will result in more specific diagnosis for patients, a better understanding of disease, biological insights which may pave the way for new therapies and a better understanding of the journey of patients with cancer, rare disease and infection.
Intel: How does this project benefit patients?
Caulfield: By building a picture of the entirety of the genome or as much as we can read today, which is about 97.6 percent of your genome, we have a more comprehensive picture and a far greater chance of deriving healthcare benefits for patients. Cancer is essentially a disease of disordered genome. With genomic sequencing, we can gain insights into what drove the tumor to occur in the first place, what drives its relapse, what drives its spread and other outcomes. Most importantly, we can understand what drives response to therapy. We already have good examples of where cancer genotyping is making a real difference to therapy for patients.
Intel: What is the biggest hurdle?
Caulfield: Informed consent is essential to the future application of the 100,000 Genomes Project. It’s very hard to guarantee that you can absolutely secure data. I think it’s the responsibility of all medical professionals like myself in this age to be upfront about the risks around data access. Most patients understand these risks. We try to keep patient data as secure as is reasonably possible within present technological bounds.
Intel: What is crucial to the success of genomic sequencing?
Caulfield: We need big data partners and people who know how to analyze a large amount of data. We also need commercial partners that will allow us to get new medicines to patients as quickly as possible. That partnership, if articulated properly, is well received by people. Once we have this established, we can make strides in gaining and keeping public and patient trust, which is crucial to the success of genomic sequencing.
If you want public trust, you must fully inform patients about the plan. Ensure their medical professionals understand that plan and that patients are brought into the conversation. This allows patients and the public to shape your work. Sometimes in medicine, we become a little remote from what the patient wants when, in actuality, this is their money. It should be their program, not mine.
Intel: What goal should researchers focus on?
Caulfield: With this large amount of data comes the need to process it as quickly as possible in order to provide helpful results for both the patient and care team. Intel’s All in One Day initiative is an important goal because it accelerates the time from when a person actually enrolls in such a program to receiving a diagnostic answer.
The goal is to get the turnaround as fast as possible. For example, if a patient has cancer, that person may have an operation where the cancer is removed. The patient would then need to heal. If chemotherapy were needed, it would be important to start it as quickly as possible. We have to use the best technology we have available so we can shrink the time from enrollment to answer.
As Internet services, television, and the Internet of Things transform the way we connect to and experience the world around us, a key trend in transformational connectivity is emerging: Connected home gateways. At INTX: The Internet and Television Expo in … Read more >
The post Puma 7 Home Gateway Leads the Way in Transformational Connectivity appeared first on IoT@Intel.
By Jennifer Mulveny, Director for Global Public Policy, Australia and New Zealand Australian Prime Minister Malcolm Turnbull recently released the Government’s Cyber Security Strategy, which reveals an attractive trifecta of what Intel considers to be smart security policy: relying on … Read more >
The post Keeping Australia Cyber-Safe and Open for Business appeared first on Policy@Intel.
All In One Day by 2020 – the phrase encompasses our real ambition here at Intel to empower researchers to give clinicians the information they need to deliver a targeted treatment plan for patients in just one 24-hour period. I wanted to provide you with some insight into where we are today and what’s driving forward the journey to All In One Day by 2020.
Genomics Code Optimization
We have been working with industry-leading experts, and with commercial and open source authors of key genomic codes, for several years on code optimization to ensure that genome processing runs as fast as possible on Intel®-based systems and clusters. The result is a significant improvement in the speed of key genomic programs, which will help get sequencing and processing down to minutes. For example:
- Intel has sped up the PairHMM kernel, a key piece of the HaplotypeCaller in GATK, by 970x, yielding an overall 1.8x improvement in pipeline performance;
- The acceleration of compression for genomics file formats such as BAM and SAM by over 4x;
- The acceleration of Python using Intel’s Math Kernel Library (MKL) producing a 15x speedup on a 16-core Haswell CPU;
- Finally, the enhanced MKL, used in conjunction with the Intel Data Analytics Acceleration Library (DAAL), has enabled DAAL to run k-means clustering 100x faster than R and Apriori 35x faster than Weka.
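The gap between the 970x kernel speedup and the 1.8x overall pipeline gain above is a classic illustration of Amdahl's law: accelerating one component only helps in proportion to the share of total runtime it occupies. The fraction used below is a hypothetical value chosen so the numbers line up with the figures quoted here, not a measured GATK profile.

```python
# Amdahl's law: overall speedup when only part of a pipeline is accelerated.
# The fraction_accelerated value (~44.5%) is an illustrative assumption chosen
# so that a 970x kernel speedup yields roughly the 1.8x pipeline gain above.

def amdahl_speedup(fraction_accelerated: float, kernel_speedup: float) -> float:
    """Overall speedup given the fraction of runtime that was accelerated."""
    return 1.0 / ((1.0 - fraction_accelerated) + fraction_accelerated / kernel_speedup)

overall = amdahl_speedup(fraction_accelerated=0.445, kernel_speedup=970)
print(f"Overall pipeline speedup: {overall:.2f}x")  # ~1.80x
```

The takeaway: even a near-infinite kernel speedup caps the pipeline gain at 1/(1 - fraction_accelerated), which is why optimization work must cover every stage of the pipeline, not just the hottest kernel.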
You can find out more about Intel’s work in code optimization at our dedicated Optimized Genomics Code webpage.
Scalability for Success
As we see an explosion in the volume of available data, the ability to scale a high performance computing system becomes ever more critical to accelerating success. We have put forth the Intel® Scalable System Framework to guide the market on the optimal construction of an HPC solution that is multi-purpose, expandable, and scalable.
Combining the Scalable System Framework with optimized life sciences codes has resulted in a new, more flexible, scalable, and performant architecture. This reduces the need for purpose-built systems and instead offers an architecture that can span a variety of diverse workloads while offering increased performance.
Another key element of an architecture is the balance among three factors: compute, storage, and fabric. Today we see the fruits of our work coming to life, for example, in a brilliant collaboration between TGen, Dell, and Intel which cut TGen’s RNA-Seq pipeline from 7 days to under 4 hours. TGen is successfully operating FDA-approved clinical trials, balancing research with the clinical treatment of pediatric oncology patients.
From a week to a day
It’s useful, I think, to see just how far we’ve come in the last four years as we look ahead to 2020. In 2012 it took a week to perform the informatics on a whole human genome in a cloud environment, going from raw sequence data to an annotated result. Today, the informatics time has decreased to just one day for whole genomes.
With the Dell and Qiagen reference architectures that are based on optimized code and the Intel® Scalable System Framework, a throughput-based solution has been created. This means that when fully loaded these base systems will perform the informatics on ~50 whole genomes per day.
However, it is important to note the genomes processed on these systems still take ~24 hours to run, but they are being processed in a highly parallel manner. If you use a staggered start time of ~30 minutes between samples, this results in a completed genome being produced approximately every 30 minutes. For the sequencing instrumentation, Illumina can process a 30x whole human genome in 27 hours using its “rapid-run mode”.
So, in 2016, we can sequence a whole genome and do the informatics processing in just over 2 days (51 hours consisting of 27 hours of sequencing + 24 hours of informatics time), that’s just ~1 day longer than our ambition of All In One Day by 2020.
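The timeline above reduces to simple arithmetic, sketched below with the figures quoted in this post (27 hours of sequencing, 24 hours of informatics, ~30-minute staggered starts); the constants are taken from the text, not from any published benchmark.

```python
# Back-of-the-envelope timing for the 2016 whole-genome pipeline described above.

SEQUENCING_HOURS = 27   # Illumina rapid-run mode, 30x whole human genome
INFORMATICS_HOURS = 24  # optimized informatics on the reference architectures
STAGGER_HOURS = 0.5     # a new sample enters the system every ~30 minutes

# End-to-end latency for a single genome: sequencing followed by informatics.
total_hours = SEQUENCING_HOURS + INFORMATICS_HOURS
print(f"End-to-end time per genome: {total_hours} hours")  # 51 hours

# Because samples are processed in parallel with staggered starts, a finished
# genome emerges every STAGGER_HOURS, giving the quoted ~50 genomes/day.
genomes_per_day = 24 / STAGGER_HOURS
print(f"Throughput when fully loaded: {genomes_per_day:.0f} genomes/day")
```

This makes the latency/throughput distinction in the text concrete: each genome still takes ~51 hours end to end, but the pipeline completes one roughly every half hour once it is fully loaded.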
Three final points to keep in mind:
- There are steps in the All In One Day process that are outside of the sequencing and the informatics, such as the doctor’s visit, the sample preparation for sequencing, the genome interpretation, and the dissemination of results to the patient. These steps will add time beyond the 51 hours above.
- The reference architectures are highly scalable, meaning a larger system can process more genomes per day: four times the nodes produce four times the throughput.
- There are enhancements still to be made. For example, streaming the output from the sequencer to the informatics cluster such that the informatics can be started before the sequencing is finished will further compress the total time towards our all-in-one-day goal.
I’m confident our ambitions will be realized.
Good security is about balancing Risks, Costs, and Usability. Too much or too little of each can be unhealthy and lead to unintended consequences. We are entering an era where the risks of connected technology can exceed the inconveniences of interrupted online services or the release of sensitive data. Failures can create life-safety issues and major economic impacts. The modernization of healthcare, critical infrastructure, transportation, and defense industries is beginning to push the boundaries and directly impact people’s safety and prosperity. Lives will hang in the balance and it is up to the technology providers, users, and organizations to ensure the necessary balance of security is present.
We are all cognizant of the risks in situations where insufficient security opens the door to exposure and the compromise of systems. Vulnerabilities allow threats to undermine the availability of systems, confidentiality of data, and integrity of transactions. On the other end of the spectrum, too much security can also cause serious issues.
A recent incident described how a piece of medical equipment crashed during a heart procedure due to an overly aggressive anti-virus scan setting. The device, a Merge Hemo, is used to supervise heart catheterization procedures, in which doctors insert a catheter into blood vessels to diagnose various types of heart disease. The module is connected to a PC that runs software to record and display data. During a recent procedure, the application crashed when the security software began scanning for potential threats. The patient remained sedated while the system was rebooted before the procedure could be completed. Although the patient was not harmed, the misconfiguration of the PC security software caused an interruption during an invasive medical procedure.
Security is not an absolute. As highly connected, empowered devices become more deeply integrated into our lives, attacks grow more frequent and their impacts more severe. The outcome of this particular situation was fortunate, but we should recognize the emerging risks and prepare to adapt as technology rapidly advances.
Striking a balance is important. It may not seem intuitive, but yes, too much security can be a problem as well. Protection is not free; benefits come with a cost. Security functions can add performance overhead, reduce productivity, and ruin users’ experiences. Additionally, security can increase the overall cost of products and services. These and other factors can create ripples in complex systems and result in unintended consequences. We all agree security must be present, but the reality is that there must be an appropriate balance. The key is to achieve an optimal level by tuning the risk management, cost, and usability aspects for any given environment and usage.