Recent Blog Posts

The Bleeding Edge of Medicine

Computer-Aided Engineering (CAE) has become pervasive in the design and manufacture of everything from jumbo jets to razor blades, transforming the product development process to produce more efficient, cost-effective, safe, and easy-to-use products. A central component of CAE is the ability to realistically simulate the physical behavior of a product in real-world scenarios, which greatly facilitates understanding and innovation.



Application of this advanced technology to healthcare has profound implications for society, promising to transform the practice of medicine from observation driven to understanding driven. However, lack of definitive models, processes and standards has limited its application, and development has remained fragmented in research organizations around the world.


Heart simulation invaluable

In January of 2014, Dassault Systèmes took the first step to change this and launched the “Living Heart Project” as a translational initiative to partner with cardiologists, researchers, and device manufacturers to develop a definitive, realistic simulation of the human heart. Through this accelerated approach, the first commercial model-centric, application-agnostic, multiphysics whole-heart simulation has been produced.


Since cardiovascular disease is the number one cause of morbidity and mortality across the globe, Dassault Systèmes saw the Living Heart Project as the best way to address the problem. Although there is a plethora of medical devices, drugs, and interventions, physicians face the problem of determining which device, drug, or intervention to use on which patient. Often, invasive procedures are needed to truly understand what is going on inside a patient.


CAE and the Living Heart Project will enable cardiologists to take an image (MRI, CT, etc.) of a patient’s heart and reconstruct it as a 3D model, thereby creating a much more personalized form of healthcare. The doctor can see exactly what is happening in the patient’s heart and make a more informed decision about how to treat that patient most effectively.


What questions do you have about computer aided engineering?


Karl D’Souza is a senior user experience specialist at Dassault Systèmes Simulia Corp.

Read more >

Ethernet Shows Its Role as Fabric Technology for High-End Data Centers at OCP Summit

March has been a big month for demonstrating the role of Intel® Ethernet in the future of several key Intel initiatives that are changing the data center.


At the start of the month we were in Barcelona at Mobile World Congress demonstrating the role of Ethernet as the key server interconnect technology for Intel’s Software Defined Infrastructure initiative; read my blog post on that event.


And just this week, Intel was in San Jose at the Open Compute Project Summit highlighting Ethernet’s role in Rack Scale Architecture, which is one of our initiatives for SDI.


RSA is a logical data center hardware architectural framework based on pooled and disaggregated computing, storage and networking resources from which software controllers can compose the ideal system for an application workload.
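The composition idea above can be sketched in a few lines of code. This is an illustrative model only: the resource pools, the workload shape, and the `compose_system` function are invented for this sketch and are not part of any Intel RSA API.

```python
# Hypothetical model of RSA-style composition: software picks just enough
# resources from disaggregated pools to build a logical system.
POOLS = {
    "compute": [{"id": f"cpu{i}", "cores": 16, "free": True} for i in range(8)],
    "storage": [{"id": f"ssd{i}", "tb": 4, "free": True} for i in range(8)],
    "network": [{"id": f"nic{i}", "gbps": 100, "free": True} for i in range(8)],
}

def compose_system(workload):
    """Allocate pooled resources matching a workload's declared needs."""
    system = {}
    for kind, count in workload.items():
        units = [u for u in POOLS[kind] if u["free"]][:count]
        if len(units) < count:
            raise RuntimeError(f"pool '{kind}' exhausted")
        for u in units:
            u["free"] = False          # claim the resource for this system
        system[kind] = [u["id"] for u in units]
    return system

# A software controller composes the ideal system for one workload:
web_tier = compose_system({"compute": 2, "storage": 1, "network": 2})
print(web_tier)
# {'compute': ['cpu0', 'cpu1'], 'storage': ['ssd0'], 'network': ['nic0', 'nic1']}
```

The point of the sketch is the decoupling: the pools outlive any one system, and the controller, not the chassis, decides how resources are combined.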


The use of virtualization in the data center is increasing server utilization levels and driving an insatiable need for more efficient data center networks. RSA’s disaggregated and pooled approach is an open, high-performance way to meet this need for data center efficiency.


In RSA, Ethernet plays a key role as the low-latency, high-bandwidth fabric connecting the disaggregated resources together and to other resources outside of the rack. The whole system depends on Ethernet providing a low-latency, high-throughput fabric that is also software controllable.


MWC was where we demonstrated Intel Ethernet’s software controllability through support for network virtualization overlays; and OCP Summit is where we demonstrated the raw speed of our Ethernet technology.


A little history is in order. RSA was first demonstrated at last year’s OCP Summit, and as a part of that, we revealed an integrated 10GbE switch module proof of concept that included a switch chip and multiple Ethernet controllers, removing the need for a NIC in the server.


This proof of concept showed how this architecture could disaggregate the network from the compute node.


At the 2015 show, we demonstrated a new design with our upcoming Red Rock Canyon technology, a single-chip solution that integrates multiple NICs into a switch chip. The chip delivered throughput of 50 Gbps between four Xeon nodes via PCIe, and multiple 100GbE connections between the server shelves, all with very low latency.


The features delivered by this innovative design provide performance optimized for RSA workloads. It’s safe to say that I have not seen a more efficient or higher-performance rack than this PoC; watch the video to see the performance for yourself.


Red Rock Canyon is just one of the ways we’re continuing to innovate with Ethernet to make it the network of choice for high-end data centers.

Read more >

NVM Express* Technology Goes Viral – From Data Center to Client to Fabrics

Amber Huffman, Sr Principal Engineer, Storage Technologies Group at Intel

For Enterprise, everyone is talking about “Cloud,” “Big Data,” and “Software Defined X,” the latest IT buzzwords. For consumers, the excitement is around 4K gaming and 4K digital content creation. At the heart of all this is a LOT of data. A petabyte of data used to sound enormous – now the explosion in data is being described in exabytes (1K petabytes) and even zettabytes (1K exabytes). The challenge is how to get fast access to the specific information you need in this sea of information.
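To put those unit prefixes in perspective, the scale jumps above can be checked with a few lines of arithmetic (decimal SI units assumed, as in the text):

```python
# Each step up the scale is a factor of 1,000.
PB = 10**15          # petabyte
EB = 1000 * PB       # exabyte   = 1,000 petabytes
ZB = 1000 * EB       # zettabyte = 1,000 exabytes

print(EB // PB)      # 1000      petabytes per exabyte
print(ZB // PB)      # 1000000   petabytes per zettabyte
```

So a single zettabyte is a million of the petabytes that used to sound enormous.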


NVM Express* (NVMe*) was designed for enterprise and consumer implementations, specifically to address this challenge and the opportunities created by the massive amount of data that businesses and consumers generate and devour.


NVMe is the standard interface for PCI Express* (PCIe*) SSDs. Other interfaces, like Serial ATA and SAS, were defined for mechanical hard drives, and these legacy interfaces are slow from both a throughput and a latency standpoint. NVMe jettisons this legacy and is architected from the ground up for non-volatile memory, enabling NVMe to deliver amazing performance and low latency. For example, NVMe delivers up to 6x the performance of state-of-the-art SATA SSDs1.


There are several exciting new developments in NVMe. In 2015, NVMe will be coming to client systems, delivering great performance at the low power levels required in 2-in-1s and tablets. The NVM Express Workgroup is also developing “NVMe over Fabrics,” which brings the benefits of NVMe across the data center and cloud over fabrics like Ethernet, Fibre Channel, InfiniBand*, and Omni-Path* Architecture.


NVM Express is the interface that will serve data center and client needs for the next decade. For a closer look at the new developments in NVMe, look Under the Hood with this video. Check out more information at



1Tests document performance of components on a particular test, in specific systems. Differences in hardware, software, or configuration will affect actual performance. Configuration: Performance claims obtained from data sheet, Intel® SSD DC P3700 Series 2TB, Intel® SSD DC S3700 Series: Intel Core i7-3770K CPU @ 3.50GHz, 8GB of system memory, Windows* Server 2012, IOMeter. Random performance is collected with 4 workers each with 32 QD. Configuration for latency: Intel® S2600CP server, Intel® Xeon® E5-2690v2 x2, 64GB DDR3, Intel® SSD DC P3700 Series 400GB, LSI 9207-8i, Intel® SSD DC S3700.


© 2015 Intel Corporation


Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

*Other names and brands may be claimed as the property of others.

Read more >

Transforming the Workplace for a New Generation of Workers

Workplace transformation is not a new concept. It’s a piece of our evolution. As new generations enter the workforce, they bring new expectations with them; what the workplace meant for one generation doesn’t necessarily fit with the next. Think about the way we work in 2015 versus the way we worked in, say, 2000.


In just 15 years, we’ve developed mobile technology that lets us communicate and work from just about anywhere. Robust mobile technologies like tablets and 2 in 1s enable remote workers to video conference and collaborate just as efficiently as they would in the office. As these technologies evolve, they change the way we think about how and where we work.



Working Better by Focusing on UX


Over the past decade, mobile technologies have probably had the most dramatic impact on how we work, but advances in infrastructure will pave the way for the next big shift. Wireless technologies have improved by leaps and bounds. Advances in wireless display (WiDi) and wireless gigabit (WiGig) technologies have created the very real possibility of a wire-free workplace. They drive evolution in a truly revolutionary way.


Consider the impact of something as simple as creating a “smart” conference room with a large presentation screen that automatically pairs with your 2 in 1 or other device, freeing you from adapters and cords. The meeting room could be connected to a central calendar and mark itself as “occupied” so employees always know which rooms are free and which ones are in use. Simple tweaks like this keep the focus on the content of meetings, not the distractions caused by peripheral frustrations.
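The "occupied" logic described above is simple enough to sketch. This is a toy model: the room name, calendar shape, and `room_status` function are invented for illustration, not any real smart-room API.

```python
# Hypothetical sketch: derive a room's status from a central calendar.
from datetime import datetime

def room_status(room, calendar, now):
    """Mark a room occupied if any booking covers the current time."""
    for booking in calendar.get(room, []):
        if booking["start"] <= now < booking["end"]:
            return "occupied"
    return "free"

calendar = {
    "Boardroom": [{"start": datetime(2015, 3, 20, 9),
                   "end":   datetime(2015, 3, 20, 10)}],
}

print(room_status("Boardroom", calendar, datetime(2015, 3, 20, 9, 30)))  # occupied
print(room_status("Boardroom", calendar, datetime(2015, 3, 20, 11)))     # free
```

In a real deployment the same check would run against the organization's shared calendar service and drive a display outside each room.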


The workstation is another transformation target. Wireless docking, auto-connectivity, and wireless charging will dramatically reduce clutter in the workplace. The powerful All-in-One PC with the Intel Core i5 processor will free employees from the tethers of their desktop towers. Simple changes like removing cords and freeing employees from their cubicles can have huge impacts for companies — and their bottom lines.


The Benefits of an Evolved Workplace


Creating the right workplace for employees is one of the most important things companies can do to give themselves an advantage. By investing in the right infrastructure and devices, businesses can maximize employee creativity and collaboration, enhance productivity, and attract and retain top talent. Evolving the workplace through technology can empower employees to do their best work with fewer distractions and frustrations caused by outdated technology.


If you’re interested in learning more about what I’ve discussed in this blog, tune in to the festivities and highlights from CeBit 2015.


To continue this conversation on Twitter, please use #ITCenter. And you can find me on LinkedIn here.

Read more >

#TechConnect Mar 4 Chat Recap: “Solid-State Drives for Data Center Storage”

Another great Tech Connect Chat occurred on Wednesday, March 4. Intel’s David Blunden, Alan Winscott, Suman Sarkar, and Zhdan Bybin led the discussion on “Solid-State Drives for Data Center Storage.” Here are some highlights from the hosts and participants: (1) M.2 Form Factors … Read more >

The post #TechConnect Mar 4 Chat Recap: “Solid-State Drives for Data Center Storage” appeared first on Technology Provider.

Read more >

The Behavioral Shift Driving Change in the World of Retail

Ready or Not, Cross-Channel Shopping Is Here to Stay


Of all the marketplace transitions that have swept through the developed world’s retail industry over the last five to seven years, the most important is the behavioral shift to cross-channel shopping.


The story is told in these three data points1:


  1. More than 60 percent of U.S. shoppers (and a higher share in the U.K.) regularly begin their shopping journey online.
  2. Online ratings and reviews have the greatest impact on shopper purchasing decisions, above friends and family, and have four to five times greater impact than store associates.
  3. Nearly 90 percent of all retail revenue is carried out in the store.


Retail today is face-to-face with a shopper who’s squarely at the intersection of e-commerce, an ever-present smartphone, and an always-on connection to the Internet.


Few retailers are blind to the big behavioral shift. Most brands are responding with strategic omni-channel investments that seek to erase legacy channel lines between customer databases, inventories, vendor lists, and promotions.



Channel-centric organizations are being trimmed, scrubbed, or reshaped. There’s even a willingness — at least among some far-sighted brands — to deal head-on with the thorny challenge of revenue recognition.


All good. All necessary.



Redefining the Retail Space


But, as far as I can tell, only a handful of leaders are asking the deeper question: what, exactly, is the new definition of the store?


What is the definition of the store when the front door to the brand is increasingly online?


What is the definition of the store when shoppers know more than the associates, and when the answer to the question of how and why becomes — at the point of purchase — more important than what and how much?


What is the definition of the store beyond digital? Or of a mash-up of the virtual and physical?


What is the definition — not of brick-and-mortar and shelves and aisles and four-ways and displays — but of differentiating value delivery?


This is a topic we’re now exploring through whiteboard sessions and analyst and advisor discussions. We’re hard at work reviewing the crucial capabilities that will drive the 2018 cross-brand architecture.


Stay tuned. I’ll be sharing my hypotheses (and findings) as I forge ahead.



Jon Stine
Global Director, Retail Sales

Intel Corporation


This is the second installment of a series on Retail & Tech. Click here to read Moving from Maintenance to Growth in Retail Technology.


1 National Retail Federation. “2015 National Retail Federation Data.” 06 January 2015.

Read more >

Can Technology Enable Viable Virtual Care?


I recently spoke to Mark Blatt, Intel’s Worldwide Medical Director, about whether virtual care can deliver outcomes equal to or better than face-to-face care. Across the world, ageing populations are stretching public health services to the limit. It’s impractical for everybody with a health problem to go to a hospital or clinic, taking up the valuable time of a limited number of doctors and nurses that could be better used elsewhere.


That’s why we believe virtual care is a trend that will increase markedly in the future. It isn’t something entirely new; in the past, my fellow medical professionals have found the telephone a valuable diagnostic tool. And while the telephone remains an important part of virtual care (today it is more often a mobile phone than a desk phone), used in isolation it can deliver only basic support.


So, what does the future hold for virtual care? Take a look at the video above to hear Mark’s thoughts and leave us your ideas too. I’d love to hear from you in the comments section below.

Read more >

Tightening up Intel SCS service account permissions for managing Intel AMT computer objects in Microsoft Active Directory

An enterprise customer wanted to enable Active Directory integration with Intel AMT on their large Intel vPro client estate. However, their security team wanted the permissions granted to the Intel SCS service account on the Organisational Unit (OU) where Intel AMT computer objects are stored (to support Kerberos) to be as restrictive as possible.


As defined in the Intel® Setup and Configuration Software User Guide, the permissions required for the SCS service account on the OU container are “Create Computer objects”, “Delete Computer objects” and “List content” (the last of these appears to be granted by default), plus full control on descendant computer objects. Full control was not acceptable, so …



… to support AMT maintenance tasks such as updating the password of the AD object representing the Intel AMT device and ensuring the Kerberos clock remains synchronised, the following explicit permissions are required on all descendant computer objects within the OU.
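On Windows, permissions like these are often applied with the built-in `dsacls.exe` tool. The sketch below only builds the command strings; the OU distinguished name, the service account, and the exact rights abbreviations are placeholders, so verify them against the Intel SCS User Guide and your own directory before running anything.

```python
# Hedged sketch: assemble dsacls commands for the restricted permission set.
# OU and SVC are hypothetical values, not taken from the article.
OU = "OU=AMT,DC=example,DC=com"      # OU holding the Intel AMT computer objects
SVC = "EXAMPLE\\svc_intel_scs"       # the Intel SCS service account

# On the OU itself: create/delete child Computer objects, plus list contents.
ou_cmd = f'dsacls "{OU}" /G "{SVC}:CCDC;computer" "{SVC}:LC"'

# On descendant computer objects (/I:S = inherit to sub-objects only):
# explicit read/write property rights instead of full control, to support
# password updates and Kerberos clock synchronisation.
child_cmd = f'dsacls "{OU}" /I:S /G "{SVC}:RP;;computer" "{SVC}:WP;;computer"'

print(ou_cmd)
print(child_cmd)
```

The key design point matches the customer's requirement: the service account gets only the create/delete and property rights it needs, and nothing is granted with full control.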


The customer’s security team was happier with these permissions, and they are now activating their Intel vPro systems to enable the powerful manageability and security capabilities that Intel Active Management Technology (AMT), available on Intel vPro technology platforms, provides.

Read more >

Emerging Technology Sectors Changing the IT-Business Landscape

Intel’s CIO Kim Stevenson is “…convinced that this is an exciting time as we enter a new era for Enterprise IT. Market leadership is increasingly being driven by technology in all industries, and a new economic narrative is being written that challenges business models that have been in place for decades.”


With enterprise pumping more funds into the industry than ever, Gartner projects that IT spending will reach $3.8 trillion this year. Gartner’s prediction indicates that while many of the traditional enterprise IT-focused areas — data center systems, devices, enterprise software, IT services, and telecom services — will continue to see increased investment, new areas are expected to emerge much faster.


As the business invests more in IT — whether it’s in these traditional focused areas or these new emergent areas — one thing is stable. The business is becoming more dependent on IT for both organizational efficiency and competitive value.


Let’s take a closer look at two of the emergent growth segments along with the challenges, opportunities, and value they create for this new era of business-IT relationships.


Security and the Internet of Things


Gartner projects an almost 30-fold increase in the number of installed IoT units (0.9 billion to 26 billion) between 2009 and 2020. The data collected from these devices is an essential component to future IT innovation; however, this technology comes with significant security and privacy risks that cannot be ignored. “Data is the lifeblood of IoT,” states Conner Forrest of ZDNet. “As such, your security implementation for IoT should center around protecting it.”


The potential for the IoT remains largely undefined and at risk, especially with 85 percent of devices still unconnected and security threats prevalent. The Intel IoT Platform was designed to address this business challenge. The Intel IoT Platform is an end-to-end reference model that creates a secure foundation for connecting devices and transferring data to the cloud. With this reference architecture platform, countless IoT solutions can be built and optimized with the advantages of scalable computing, security from device to cloud, and data management and analytics support.


The Enterprise Investing in Startups


2014 represented the biggest year in corporate venture group capital investment since 2000, and this trend is set to continue, according to a MoneyTree Report jointly conducted by PricewaterhouseCoopers LLP, the National Venture Capital Association, and Thomson Reuters.  What is interesting to me is the why. Organizations want and need a critical asset: creative talent.


As the term “innovation” runs rampant through the enterprise, CIOs know they must make changes in order to stay fresh and competitive. However, according to Kim Nash of CIO, 74 percent of CIOs find it hard to balance innovation and operational excellence, suggesting that a more powerful approach would be to acquire a startup to capture its talent, intelligence, and creative spirit.


While buying a startup is not in every organization’s wheelhouse, some businesses are providing venture capital to startups in order to tap into their sense of innovation. “By making such moves,” explains Nash, “non-IT companies gain access to brand new technology and entrepreneurial talent while stopping short of buying startups outright.”

Leadership Tips For IT Innovation


IT’s success in this new environment will not follow a pre-defined formula. In fact, it will rely on new skills and an evolving partnership between business and IT. For this reason, Intel partnered with The IT Transformation Institute to present the Transform IT Show. Transform IT is a web-based show that features in-depth interviews with business executives, IT leaders, and industry experts to shed light on what the future holds for business and the IT organizations that power them. Most importantly, the show highlights the advice for all future leaders on how to survive and thrive in the coming era of IT.


I hope you enjoy our guests and can apply the insights you gain from the Transform IT Show. Join this critical conversation by connecting with me on Twitter at @chris_p_intel or by using #TransformIT.

Read more >

Hardware Hacking with Rowhammer

Rowhammer represents a special case of vulnerability exploitation: it accomplishes something very rare by hacking the hardware itself.  It takes advantage of the physics happening at the nano level in a very specific architectural structure present in some designs of computer memory.  Rowhammer allows attackers to change bits of data in sections of memory they should not have access to. That may seem petty, but don’t underestimate how flipping bits at this level can result in tremendous risk.  Doing so could grant complete control of a system and bypass many security controls that exist to compartmentalize traditional malicious practices.  Rowhammer proves memory hardware can be manipulated directly.
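Why does one flipped bit matter so much? A conceptual illustration: in a page-table entry, a single bit often separates a read-only mapping from a writable one. Rowhammer flips bits physically in DRAM; the simplified layout below (present bit in position 0, writable bit in position 1, loosely modeled on x86 paging) just shows the consequence in software terms.

```python
# Simplified page-table-entry model: one bit flip changes page permissions.
PRESENT  = 1 << 0    # bit 0: mapping is valid
WRITABLE = 1 << 1    # bit 1: mapping allows writes

pte = PRESENT                # a present, read-only mapping: 0b01
assert not (pte & WRITABLE)  # writes to this page would fault

flipped = pte ^ WRITABLE     # one well-placed bit flip...
assert flipped & WRITABLE    # ...and the read-only page is now writable

print(bin(pte), "->", bin(flipped))
```

If the page in question maps kernel memory or the page tables themselves, that single bit is the difference between a contained process and full control of the system, which is exactly what the Project Zero exploits demonstrated.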


In the world of vulnerabilities there is a hierarchy, from easy to difficult to exploit and from trivial to severe in overall impact.  Technically, hacking data is easiest, followed by applications, operating systems, firmware and finally hardware.  This is sometimes referred to as the ‘stack’ because it is how systems are architecturally layered. 


The first three areas are software: very portable and dynamic across systems, but subject to great scrutiny by most security controls.  Trojans are a great example, where data becomes modified and can be easily distributed across networks.  Such manipulations are relatively exposed and easy to detect at many different points.  Applications can be maliciously written or infected to act in unintended ways, but pervasive anti-malware is designed to protect against such attacks and is constantly watchful.  Vulnerabilities in operating systems provide a means to hide from most security, open up a bounty of potential targets, and offer a much greater depth of control.  Knowing the risks, OS vendors constantly identify problems and send a regular stream of patches to shore up weaknesses, limiting the viability of continued exploitation by threats.  It is not until we get to firmware and hardware that most of the mature security controls drop away.


The firmware and hardware, residing beneath the software layers, tend to be more rigid and represent a significantly greater challenge to compromise and to scale attacks against.  However, success at the lower levels means bypassing most detection and remediation security controls, which live above, in the software.  Hacking hardware is very rare and intricate, but not impossible.  The level of difficulty tends to be a major deterrent, while the ample opportunities and ease that exist in the software layers are more than enough to keep hackers focused on easier exploits in pursuit of their objectives.


Attackers are moving down the stack.  There are tradeoffs to attacks at any level.  The easy vulnerabilities in data and applications yield far fewer benefits for attackers in the way of remaining undetected, persisting after actions are taken against them, and the overall level of control they can gain.  Most security products, patches, and services work at this level and have been adapted to detect, prevent, and evict software-based attacks.  Due to the difficulty and lack of obvious success, most vulnerability research doesn’t explore much in the firmware and hardware space.  This is changing.  It is only natural: attackers will seek to maneuver where security is not pervasive.


Rowhammer began as a theoretical vulnerability, one with potentially significant ramifications.  To highlight the viability, the highly skilled Google Project Zero team developed two exploits which showcased the reality of gaining kernel privileges.  The blog from Rob Graham, CEO of Errata Security, provides more information on the technical challenges and details.


Is Rowhammer an immediate threat?  Probably not.  Memory vendors have been aware of this issue for some time and have instituted new controls to undermine the current techniques.  But this shows at a practical level how hardware can be manipulated by attackers and at a theoretical level how this could have severe consequences which are very difficult to protect against.


As investments in offensive cyber capabilities from nations, organized crime syndicates, and elite hackers-for-hire continue to grow, new areas such as hardware vulnerabilities will be explored and exploited.  Rowhammer is a game-changer with respect to influencing the direction of vulnerability research.  It is breaking new ground which others will follow, eventually leading to broad hardware vulnerability research across the computing products that influence our daily lives.  Hardware and firmware hacking is part of the natural evolution of cybersecurity and therefore a part of our future we must eventually deal with.



Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts



Read more >

Five Questions about Transactive Energy for David Forfia

David Forfia is the Chairperson of the GridWise Architecture Council and Senior Director of Information Technology Services at the Electric Reliability Council of Texas (ERCOT), the Texas system operator that creates the wholesale market, manages reliability functions, and enables retail switching … Read more >

The post Five Questions about Transactive Energy for David Forfia appeared first on Energy.

Read more >