Managing Mobile and BYOD: Madrid Community Health Department

The Bring Your Own Device (BYOD) movement is booming. Tech Pro Research’s latest survey shows that 74 percent of organizations globally are either already using or planning to allow employees to bring their own devices to work.


Allowing employees to bring their own devices into the office for business use has helped companies cut hardware and service costs, increase flexibility and achieve greater productivity, but there are also inherent security and data protection risks.

According to the same Tech Pro Research study, security concerns were the primary barrier to BYOD adoption for a large majority (78 percent) of respondents, followed by IT support concerns (49 percent), lack of control over hardware (45 percent) and regulatory compliance issues (39 percent).


The cost of a data breach is often substantial. Data from the Ponemon Institute shows that in EMEA in 2014 the organisational cost of a breach was some £2.02m in UAE/Saudi Arabia, £2.21m in the United Kingdom and over £2.50m in Germany.


Of course these concerns and costs are understandable, but they needn’t be a showstopper.


Mobile risk analysis

Carrying out a thorough risk analysis of the impact of BYOD can help organizations better understand the associated security, management and compliance issues, and help them choose the mobility solution that best aligns with their strategies.


Madrid Community Health Department, the agency in charge of providing public health services in Madrid, found that increasing numbers of physicians and other staff were trying to access the corporate network from their own tablets and smartphones.


Rather than try to resist this rising tide, it called in an independent security expert to collaborate with its IT and legal teams to draw up a list of 18 security requirements its mobility strategy needed to meet.


A full list of these requirements can be found here: [ENG]/[ESP].


It then assessed the capability of three different scenarios in assuring compliance with these statements.


  • A tablet running a Windows 8.1 operating system (OS) managed by Mobile Device Management (MDM)
  • A tablet running an Android OS managed by MDM
  • A tablet running a Windows 8.1 OS managed as a normal PC


Managing Windows 8.1 tablets as a normal PC was shown to meet all 18 compliance statements, while managing Windows 8.1 and Android tablets with MDM met only eight and ten of the compliance statements respectively.
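The assessment amounts to a simple decision matrix, which can be sketched as follows (illustrative Python only; the scenario names and scores restate the findings above, while the 18 requirements themselves are in the linked list):

```python
# Scores restate the study's findings: how many of the 18 security
# requirements each management scenario was able to meet.
REQUIREMENTS_TOTAL = 18

scenarios = {
    "Windows 8.1 tablet managed as a normal PC": 18,
    "Android tablet managed by MDM": 10,
    "Windows 8.1 tablet managed by MDM": 8,
}

def best_scenario(scores: dict) -> str:
    """Return the scenario that meets the most compliance statements."""
    return max(scores, key=scores.get)

for name, met in sorted(scenarios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {met}/{REQUIREMENTS_TOTAL} requirements met")
```

The same pattern generalises: score each candidate solution against the organisation's own requirement list and let the totals drive the decision.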


Managing mobile as a PC

From this, Madrid Community Health Department was able to conclude that tablets running a Windows 8.1 OS offered greater flexibility, since they can be managed either with an MDM or as a normal PC.


However, managing Windows 8.1 tablets as normal enterprise PCs can cover most of the defined risks, provided the tablet is issued to the employee by Madrid Community Health Department as a normal PC.


For Madrid Community Health Department carrying out a full risk analysis showed that managing Windows 8.1 devices as a normal PC best aligns with its strategies.

If your organization is uncertain which management solution to choose, then a similar analysis could be the way to move you closer towards BYOD.



Is cloud destined to be purely public?


51 per cent of workloads are now in the cloud, time to break through that ceiling?



At this point, we’re somewhat beyond discussions of the importance of cloud. It’s been around for some time, just about every person and company uses it in some form and, for the kicker, 2014 saw companies place more computing workloads in the cloud (51 per cent) – through either public cloud or colocation – than they processed in house.


In just a few years we’ve moved from every server sitting in the same building as those accessing it, to a choice between private or public cloud, and the beginning of the IT Model du jour, hybrid cloud. Hybrid is fast becoming the model of choice, fusing the safety of an organisation’s private data centre with the flexibility of public cloud. However, in today’s fast paced IT world as one approach becomes mainstream the natural reaction is to ask, ‘what’s next’? A plausible next step in this evolution is the end of the permanent, owned datacentre and even long-term co-location, in favour of an infrastructure entirely built on the public cloud and SaaS applications. The question is will businesses really go this far in their march into the cloud? Do we want it to go this far?


Public cloud, of course, is nothing new to the enterprise and it’s not unheard of for a small business or start-up to operate solely from the public cloud and through SaaS services. However, few, if any, examples of large scale corporates eschewing their own private datacentres and co-location approaches for this pure public cloud approach exist.


For such an approach to become plausible in large organisations, CIOs need to be confident of putting even the most sensitive of data into public clouds. This entails a series of mentality changes that are already taking place in the SMB. The cloud based Office 365, for instance, is Microsoft’s fastest selling product ever. For large organisations, however, this is far from a trivial change and CIOs are far from ready for it.


The data argument


Data protectionism is the case in point. Data has long been a highly protected resource for financial services and legal organisations both for their own competitive advantage and due to legal requirements designed to protect their clients’ information. Thanks to the arrival of big data analysis, we can also add marketers, retailers and even sports brands to that list, as all have found unique advantages in the ability to mine insights from huge amounts of data.

This is at once an opportunity and a problem. More data means more accurate and actionable insights, but that data needs storing and processing and, consequently, an ever growing amount of server power and storage space. Today’s approach to this issue is the hybrid cloud: keep sensitive data primarily stored in a private data centre or co-located, and use public cloud as an overspill for processing, or as object storage, when requirements exceed the organisation’s existing capacity.
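That overspill policy can be sketched in a few lines (a minimal sketch with hypothetical names and a single capacity figure, not a real placement engine):

```python
# Hybrid-cloud placement sketch: sensitive data always stays private;
# non-sensitive data stays private while capacity lasts, then spills
# over to public cloud.
def place(sensitive: bool, size_gb: float,
          private_free_gb: float) -> tuple:
    """Return (target, remaining private capacity in GB)."""
    if sensitive:
        # Sensitive data is always kept on private infrastructure.
        return "private", private_free_gb - size_gb
    if size_gb <= private_free_gb:
        return "private", private_free_gb - size_gb
    # Overspill: non-sensitive data goes to public cloud when space runs out.
    return "public", private_free_gb
```

The pressure described below comes from the first branch: as sensitive data grows, the private side must keep expanding regardless of cost.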


The amount of data created and recorded each day is ever growing. In a world where data growth is exponential, the hybrid model will be put under pressure. Even organisations that keep only the most sensitive and mission critical data within their private data centres whilst moving all else to the cloud will quickly see data inflation. Consequently, they will be forced to buy ever greater numbers of servers and space to house their critical data at an ever growing cost, and without the flexibility of the public cloud.


In this light, a pure public cloud infrastructure starts to seem like a good idea – an infrastructure that can be instantly switched on and expanded as needed, at low cost. The idea of placing their most sensitive data in a public cloud, beyond their own direct control and security, however, will remain unpalatable to the majority of CIOs. Understandable when you consider research such as that released last year stating that only one in 100 cloud providers meets EU Data Protection requirements currently being examined in Brussels.


So, increasing dependence on the public cloud becomes a tug of war between a CIO’s data burden and their capacity for the perceived security risk of the cloud.


Cloud Creep


The process that may well tip the balance in this tug of war is cloud’s very own version of exposure therapy. CIOs are storing and processing more and more non critical data in the public cloud and, across their organisations, business units are independently buying in SaaS applications, giving them a taste of the ease of the cloud (from an end user point of view, at least). As this exposure grows, the public cloud and SaaS applications will increasingly prove their reliability and security whilst earning their place as invaluable tools in a business unit’s armoury. The result is a virtuous circle of growing trust of public cloud and SaaS services – greater trust means more data placed in the public cloud, which creates greater trust. Coupled with the ever falling cost of public cloud, eventually, surely, the perceived risks of the public cloud fall enough to make its advantages outweigh the disadvantages, even for the most sensitive of data?


Should it be done?


This all depends on a big ‘if’. Trust in the public cloud and SaaS applications will only grow if public cloud providers remain unhacked and SaaS data unleaked. This is a big ask in a world of weekly data breaches, but security is relative and private data centre leaks are rapidly becoming more common, or at least better publicised, than those in the public cloud. Sony Pictures’ issues arose from a malevolent force within its network, not its public cloud based data. It will take many more attacks such as these to convince CIOs that losing direct control of their data security and putting all that trust in their cloud provider is the most sensible option. Those attacks seem likely to come, however, and in the meantime, barring a major outage or truly headline making attack on it, cloud exposure is increasing confidence in public cloud.


At the same time, public cloud providers need to work to build confidence, not just passively wait for the scales to tip. Selecting a cloud service is a business decision, and any CIO will apply the diligence they would to any other supplier choice. Providers that fail to meet the latest regulation, aren’t visibly planning for the future or fail to convince on data privacy concerns and legislation will damage confidence in the public cloud and actively hold it back, particularly within large enterprises. Those providers that do build their way to becoming a trusted partner will, however, flourish and compound the ever growing positive effects of public cloud exposure.


As that happens, the prospect of a pure public cloud enterprise becomes more realistic. Every CIO and organisation is different, and will have a different tolerance for risk. This virtuous circle of cloud will tip organisations towards pure cloud approaches at different times, and every cloud hack or outage will set the model back different amounts in each organisation. It is, however, clear that, whether desirable right now or not, pure public cloud is rapidly approaching reality for some larger enterprises.


The Bleeding Edge of Medicine

Computer Aided Engineering (CAE) has become pervasive in the design and manufacture of everything from jumbo jets to razor blades, transforming the product development process to produce more efficient, cost effective, safe and easy to use products. A central component of CAE is the ability to realistically simulate the physical behavior of a product in real world scenarios, which greatly facilitates understanding and innovation.



Application of this advanced technology to healthcare has profound implications for society, promising to transform the practice of medicine from observation driven to understanding driven. However, lack of definitive models, processes and standards has limited its application, and development has remained fragmented in research organizations around the world.


Heart simulation invaluable

In January of 2014, Dassault Systèmes took the first step to change this and launched the “Living Heart Project” as a translational initiative to partner with cardiologists, researchers, and device manufacturers to develop a definitive realistic simulation of the human heart. Through this accelerated approach, the first commercial model-centric, application-agnostic, multi-physical whole heart simulation has been produced.


Since cardiovascular disease is the number one cause of morbidity and mortality across the globe, Dassault Systèmes saw the Living Heart Project as the best way to address the problem. Although there is a plethora of medical devices, drugs, and interventions, physicians face the problem of determining which device, drug, or intervention to use on which patient. Often, invasive procedures are needed to truly understand what is going on inside a patient.


CAE and the Living Heart Project will enable cardiologists to take an image (MRI, CT, etc.) of a patient’s heart and reconstruct it as a 3D model, thereby creating a much more personalized form of healthcare. The doctor can see exactly what is happening in the patient’s heart and make a more informed decision about how to treat that patient most effectively.


What questions do you have about computer aided engineering?


Karl D’Souza is a senior user experience specialist at Dassault Systèmes Simulia Corp.


Ethernet Shows Its Role as Fabric Technology for High-End Data Centers at OCP Summit

March has been a big month for demonstrating the role of Intel® Ethernet in the future of several key Intel initiatives that are changing the data center.


At the start of the month we were in Barcelona at Mobile World Congress demonstrating the role of Ethernet as the key server interconnect technology for Intel’s Software Defined Infrastructure initiative; read my blog post on that event.


And just this week, Intel was in San Jose at the Open Compute Project Summit highlighting Ethernet’s role in Rack Scale Architecture, which is one of our initiatives for SDI.


RSA is a logical data center hardware architectural framework based on pooled and disaggregated computing, storage and networking resources from which software controllers can compose the ideal system for an application workload.


The use of virtualization in the data center is increasing server utilization levels and driving an insatiable need for more efficient data center networks. RSA’s disaggregated and pooled approach is an open, high-performance way to meet this need for data center efficiency.


In RSA, Ethernet plays a key role as the low-latency, high bandwidth fabric connecting the disaggregated resources together and to other resources outside of the rack. The whole system depends on Ethernet providing a low-latency, high throughput fabric that is also software controllable.
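The composition idea can be illustrated with a toy model (hypothetical structures for illustration only, not Intel's actual RSA API): a software controller carves a logical system for a workload out of the rack's pooled resources.

```python
from dataclasses import dataclass

@dataclass
class Pools:
    """Disaggregated resources pooled at rack scale."""
    cpu_cores: int
    storage_tb: int
    nic_10gbe: int

def compose(pools: Pools, need_cores: int, need_tb: int, need_nics: int):
    """Compose a logical system from the pools, if capacity allows."""
    if (pools.cpu_cores >= need_cores and pools.storage_tb >= need_tb
            and pools.nic_10gbe >= need_nics):
        pools.cpu_cores -= need_cores
        pools.storage_tb -= need_tb
        pools.nic_10gbe -= need_nics
        return {"cores": need_cores, "storage_tb": need_tb, "nics": need_nics}
    return None  # insufficient pooled capacity
```

In a real rack the "subtraction" is a fabric operation: the controller binds compute, storage and network endpoints together over the low-latency Ethernet fabric, which is why the fabric's throughput and software controllability matter so much.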


MWC was where we demonstrated Intel Ethernet’s software controllability through support for network virtualization overlays; and OCP Summit is where we demonstrated the raw speed of our Ethernet technology.


A little history is in order. RSA was first demonstrated at last year’s OCP Summit, and as a part of that we revealed an integrated 10GbE switch module proof of concept that included a switch chip and multiple Ethernet controllers, removing the need for a NIC in the server.


This proof of concept showed how this architecture could disaggregate the network from the compute node.


At the 2015 show, we demonstrated a new design with our upcoming Red Rock Canyon technology, a single-chip solution that integrates multiple NICs into a switch chip. The chip delivered throughput of 50 Gbps between four Xeon nodes via PCIe, and multiple 100GbE connections between the server shelves, all with very low latency.


The features delivered by this innovative design provide performance optimized for RSA workloads. It’s safe to say that I have not seen a more efficient or higher-performance rack than this PoC; see the video of the performance.


Red Rock Canyon is just one of the ways we’re continuing to innovate with Ethernet to make it the network of choice for high-end data centers.


NVM Express* Technology Goes Viral – From Data Center to Client to Fabrics

Amber Huffman, Sr Principal Engineer, Storage Technologies Group at Intel

For Enterprise, everyone is talking about “Cloud,” “Big Data,” and “Software Defined X,” the latest IT buzzwords. For consumers, the excitement is around 4K gaming and 4K digital content creation. At the heart of all this is a LOT of data. A petabyte of data used to sound enormous – now the explosion in data is being described in exabytes (1K petabytes) and even zettabytes (1K exabytes). The challenge is how to get fast access to the specific information you need in this sea of information.


NVM Express* (NVMe*) was designed for enterprise and consumer implementations, specifically to address this challenge and the opportunities created by the massive amount of data that businesses and consumers generate and devour.


NVMe is the standard interface for PCI Express* (PCIe*) SSDs. Other interfaces like Serial ATA and SAS were defined for mechanical hard drives, and these legacy interfaces are slow, both from a throughput and a latency standpoint. NVMe jettisons this legacy and is architected from the ground up for non-volatile memory, enabling NVMe to deliver amazing performance and low latency. For example, NVMe delivers up to 6x the performance of state of the art SATA SSDs1.
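A back-of-the-envelope comparison shows the scale of the architectural difference: the AHCI interface used for SATA exposes a single command queue of 32 entries, while the NVMe specification allows up to 65,535 I/O queues with 65,536 commands each, letting many CPU cores issue I/O in parallel without contending for one queue.

```python
# Maximum outstanding commands per device under each interface
# (figures from the AHCI and NVMe specifications).
ahci_outstanding = 1 * 32            # one queue, 32 entries
nvme_outstanding = 65_535 * 65_536   # up to 64K queues x 64K entries

print(f"AHCI max outstanding commands: {ahci_outstanding}")
print(f"NVMe max outstanding commands: {nvme_outstanding:,}")
```

Deep, per-core queues are largely wasted on a mechanical drive with one head assembly, but they map naturally onto the massive internal parallelism of NAND flash.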


There are several exciting new developments in NVMe. In 2015, NVMe will be coming to client systems, delivering great performance at the low power levels required in 2-in-1s and tablets. The NVM Express Workgroup is also developing “NVMe over Fabrics,” which brings the benefits of NVMe across the data center and cloud over fabrics like Ethernet, Fibre Channel, InfiniBand*, and OmniPath* Architecture.


NVM Express is the interface that will serve data center and client needs for the next decade. For a closer look at the new developments in NVMe, look Under the Hood with this video. Check out more information at



1Tests document performance of components on a particular test, in specific systems. Differences in hardware, software, or configuration will affect actual performance. Configuration: Performance claims obtained from data sheet, Intel® SSD DC P3700 Series 2TB, Intel® SSD DC S3700 Series: Intel Core i7-3770K CPU @ 3.50GHz, 8GB of system memory, Windows* Server 2012, IOMeter. Random performance is collected with 4 workers each with 32 QD. Configuration for latency: Intel® S2600CP server, Intel® Xeon® E5-2690v2 x2, 64GB DDR3, Intel® SSD DC P3700 Series 400GB, LSI 9207-8i, Intel® SSD DC S3700.


© 2015 Intel Corporation


Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

*Other names and brands may be claimed as the property of others.


Transforming the Workplace for a New Generation of Workers

Workplace transformation is not a new concept. It’s a piece of our evolution. As new generations enter the workforce, they bring new expectations with them; what the workplace meant for one generation doesn’t necessarily fit with the next. Think about the way we work in 2015 versus the way we worked in, say, 2000.


In just 15 years, we’ve developed mobile technology that lets us communicate and work from just about anywhere. Robust mobile technologies like tablets and 2 in 1s enable remote workers to video conference and collaborate just as efficiently as they would in the office. As these technologies evolve, they change the way we think about how and where we work.



Working Better by Focusing on UX


Over the past decade, mobile technologies have probably had the most dramatic impact on how we work, but advances in infrastructure will pave the way for the next big shift. Wireless technologies have improved by leaps and bounds. Advances in wireless display (WiDi) and wireless gigabit (WiGig) technologies have created the very real possibility of a wire-free workplace. They drive evolution in a truly revolutionary way.


Consider the impact of something as simple as creating a “smart” conference room with a large presentation screen that automatically pairs with your 2 in 1 or other device, freeing you from adapters and cords. The meeting room could be connected to a central calendar and mark itself as “occupied” so employees always know which rooms are free and which ones are in use. Simple tweaks like this keep the focus on the content of meetings, not the distractions caused by peripheral frustrations.


The workstation is another transformation target. Wireless docking, auto-connectivity, and wireless charging will dramatically reduce clutter in the workplace. The powerful All-in-One PC with the Intel Core i5 processor will free employees from the tethers of their desktop towers. Simple changes like removing cords and freeing employees from their cubicles can have huge impacts for companies — and their bottom lines.


The Benefits of an Evolved Workplace


Creating the right workplace for employees is one of the most important things companies can do to give themselves an advantage. By investing in the right infrastructure and devices, businesses can maximize employee creativity and collaboration, enhance productivity, and attract and retain top talent. Evolving the workplace through technology can empower employees to do their best work with fewer distractions and frustrations caused by outdated technology.


If you’re interested in learning more about what I’ve discussed in this blog, tune in to the festivities and highlights from CeBit 2015.


To continue this conversation on Twitter, please use #ITCenter. And you can find me on LinkedIn here.


The Behavioral Shift Driving Change in the World of Retail

Ready or Not, Cross-Channel Shopping Is Here to Stay


Of all the marketplace transitions that have swept through the developed world’s retail industry over the last five to seven years, the most important is the behavioral shift to cross-channel shopping.


The story is told in these three data points1:


  1. 60 plus percent of U.S. shoppers (and a higher number in the U.K.) regularly begin their shopping journey online.
  2. Online ratings and reviews have the greatest impact on shopper purchasing decisions, above friends and family, and have four to five times greater impact than store associates.
  3. Nearly 90 percent of all retail revenue is carried out in the store.


Retail today is face-to-face with a shopper who’s squarely at the intersection of e-commerce, an ever-present smartphone, and an always-on connection to the Internet.


Few retailers are blind to the big behavioral shift. Most brands are responding with strategic omni-channel investments that seek to erase legacy channel lines between customer databases, inventories, vendor lists, and promotions.



Channel-centric organizations are being trimmed, scrubbed, or reshaped. There’s even a willingness — at least among some far-sighted brands — to deal head-on with the thorny challenge of revenue recognition.


All good. All necessary.



Redefining the Retail Space


But, as far as I can tell, only a handful of leaders are asking the deeper question: what, exactly, is the new definition of the store?


What is the definition of the store when the front door to the brand is increasingly online?


What is the definition of the store when shoppers know more than the associates, and when the answer to the question of how and why becomes — at the point of purchase — more important than what and how much?


What is the definition of the store beyond digital? Or of a mash-up of the virtual and physical?


What is the definition — not of brick-and-mortar and shelves and aisles and four-ways and displays — but of differentiating value delivery?


This is a topic we’re now exploring through whiteboard sessions and analyst and advisor discussions. We’re hard at work reviewing the crucial capabilities that will drive the 2018 cross-brand architecture.


Stay tuned. I’ll be sharing my hypotheses (and findings) as I forge ahead.



Jon Stine
Global Director, Retail Sales

Intel Corporation


This is the second installment of a series on Retail & Tech. Click here to read Moving from Maintenance to Growth in Retail Technology.


1 National Retail Federation. “2015 National Retail Federation Data.” 06 January 2015.


Can Technology Enable Viable Virtual Care?


I recently spoke to Mark Blatt, Intel’s Worldwide Medical Director, about whether virtual care can deliver outcomes equal to or better than face-to-face care. Across the world, ageing populations are stretching public health services to the limit. It’s impractical for everybody with a health problem to go to a hospital or clinic, taking up the valuable time of a limited number of doctors and nurses that could be better used elsewhere.


That’s why we believe virtual care is a trend that will increase markedly in the future. It isn’t something that is entirely new – in the past my fellow medical professionals have found the telephone a valuable diagnostic tool. And while it remains an important part of virtual care, the desk telephone (which is more commonly used in a mobile situation today), when used in isolation, can help to deliver only basic support.


So, what does the future hold for virtual care? Take a look at the video above to hear Mark’s thoughts and leave us your ideas too. I’d love to hear from you in the comments section below.


Tightening up Intel SCS service account permissions for managing Intel AMT computer objects in Microsoft Active Directory

An enterprise customer wanted to enable Active Directory integration with Intel AMT on their large Intel vPro client estate. However, their security team wanted the permissions granted to the Intel SCS service account on the Organisational Unit (OU) where Intel AMT computer objects are stored to support Kerberos to be as restrictive as possible.


As defined in the Intel® Setup and Configuration Software User Guide, the permissions for the SCS service account on the OU container are “Create Computer objects”, “Delete Computer objects” and “List content” (the latter appears to be granted by default), plus full control on descendant computer objects. Full control on descendant objects was not acceptable, so …



… to support AMT maintenance tasks such as updating the password of the AD object representing the Intel AMT device and ensuring the Kerberos clock remains synchronised, the following explicit permissions are required on all descendant computer objects within the OU.


The customer’s security team were happier with these permissions, and the customer is now activating its Intel vPro systems to enable the powerful manageability and security capabilities that Intel Active Management Technology, available on Intel vPro technology platforms, provides.


Emerging Technology Sectors Changing the IT-Business Landscape

Intel’s CIO Kim Stevenson is “…convinced that this is an exciting time as we enter a new era for Enterprise IT. Market leadership is increasingly being driven by technology in all industries, and a new economic narrative is being written that challenges business models that have been in place for decades.”


With enterprises pumping more funds into the industry than ever, Gartner projects that IT spending will reach $3.8 trillion this year. The prediction indicates that while many of the traditional enterprise IT-focused areas — data center systems, devices, enterprise software, IT services, and telecom services — will continue to see increased investment, new areas are expected to emerge much faster.


As the business invests more in IT — whether in these traditional focused areas or these new emergent areas — one thing remains constant: the business is becoming more dependent on IT for both organizational efficiency and competitive value.


Let’s take a closer look at two of the emergent growth segments along with the challenges, opportunities, and value they create for this new era of business-IT relationships.


Security and the Internet of Things


Gartner projects an almost 30-fold increase in the number of installed IoT units (0.9 billion to 26 billion) between 2009 and 2020. The data collected from these devices is an essential component to future IT innovation; however, this technology comes with significant security and privacy risks that cannot be ignored. “Data is the lifeblood of IoT,” states Conner Forrest of ZDNet. “As such, your security implementation for IoT should center around protecting it.”


The potential for the IoT remains largely undefined and at risk, especially with 85 percent of devices still unconnected and security threats prevalent. The Intel IoT Platform was designed to address this business challenge. The Intel IoT Platform is an end-to-end reference model that creates a secure foundation for connecting devices and transferring data to the cloud. With this reference architecture platform, countless IoT solutions can be built and optimized with the advantages of scalable computing, security from device to cloud, and data management and analytics support.


The Enterprise Investing in Startups


2014 represented the biggest year in corporate venture group capital investment since 2000, and this trend is set to continue, according to a MoneyTree Report jointly conducted by PricewaterhouseCoopers LLP, the National Venture Capital Association, and Thomson Reuters.  What is interesting to me is the why. Organizations want and need a critical asset: creative talent.


As the term “innovation” runs rampant through the enterprise, CIOs know they must make changes in order to stay fresh and competitive. However, according to Kim Nash of CIO, 74 percent of CIOs find it hard to balance innovation and operational excellence, suggesting that a more powerful approach would be to acquire a startup to capture its talent, intelligence, and creative spirit.


While buying a startup is not in every organization’s wheelhouse, some businesses are providing venture capital to startups in order to tap into their sense of innovation. “By making such moves,” explains Nash, “non-IT companies gain access to brand new technology and entrepreneurial talent while stopping short of buying startups outright.”

Leadership Tips For IT Innovation


IT’s success in this new environment will not follow a pre-defined formula. In fact, it will rely on new skills and an evolving partnership between business and IT. For this reason, Intel partnered with The IT Transformation Institute to present the Transform IT Show. Transform IT is a web-based show that features in-depth interviews with business executives, IT leaders, and industry experts to shed light on what the future holds for business and the IT organizations that power them. Most importantly, the show highlights the advice for all future leaders on how to survive and thrive in the coming era of IT.


I hope you enjoy our guests and can apply the insights you gain from the Transform IT Show. Join this critical conversation by connecting with me on Twitter at @chris_p_intel or by using #TransformIT.


Hardware Hacking with Rowhammer

Rowhammer represents a special case of vulnerability exploitation: it accomplishes something very rare by hacking the hardware itself.  It takes advantage of the physics happening at the nano level in a very specific architectural structure present in some designs of computer memory.  Rowhammer allows attackers to change bits of data in sections of memory they should not have access to. It may seem petty, but don’t underestimate how flipping bits at this level can result in tremendous risk.  Doing so could grant complete control of a system and bypass many security controls which exist to compartmentalize traditional malicious practices.  Rowhammer proves memory hardware can be manipulated directly.
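The disturbance effect can be illustrated with a toy simulation (a conceptual model only, not an exploit; real attacks hammer physical DRAM rows with rapid, uncached memory reads, and the threshold below is invented for illustration):

```python
FLIP_THRESHOLD = 100_000  # activations needed to disturb a neighbouring cell

class Dram:
    """Toy DRAM model: rows of bits, each accumulating disturbance
    from activations of adjacent rows."""
    def __init__(self, rows: int, cols: int):
        self.bits = [[1] * cols for _ in range(rows)]
        self.disturbance = [0] * rows

    def activate(self, row: int):
        """Reading a row 'activates' it, slightly disturbing its neighbours."""
        for neighbour in (row - 1, row + 1):
            if 0 <= neighbour < len(self.bits):
                self.disturbance[neighbour] += 1
                if self.disturbance[neighbour] >= FLIP_THRESHOLD:
                    self.bits[neighbour][0] ^= 1  # a bit in the victim flips
                    self.disturbance[neighbour] = 0

dram = Dram(rows=3, cols=8)
# Double-sided hammering: repeatedly activate the two aggressor rows the
# attacker CAN access, on either side of the victim row 1.
for _ in range(FLIP_THRESHOLD // 2):
    dram.activate(0)
    dram.activate(2)

print("victim row 1 bits:", dram.bits[1])  # bit 0 flipped, row 1 never written
```

The point of the model is the asymmetry: the attacker only ever touches rows 0 and 2, yet data changes in row 1, which may belong to another process or the kernel.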


In the world of vulnerabilities there is a hierarchy, from easy to difficult to exploit and from trivial to severe in overall impact.  Technically, hacking data is easiest, followed by applications, operating systems, firmware and finally hardware.  This is sometimes referred to as the ‘stack’ because it is how systems are architecturally layered. 


The first three areas are software and are very portable and dynamic across systems, but subject to great scrutiny by most security controls.  Trojans are a great example, where data becomes modified and can be easily distributed across networks.  Such manipulations are relatively exposed and easy to detect at many different points.  Applications can be maliciously written or infected to act in unintended ways, but pervasive anti-malware is designed to protect against such attacks and is constantly watchful.  Vulnerabilities in operating systems provide a means to hide from most security, open up a bounty of potential targets, and offer a much greater depth of control.  Knowing the risks, OS vendors are constantly identifying problems and sending a regular stream of patches to shore up weaknesses, limiting the viability of continued exploitation by threats.  It is not until we get to firmware and hardware that most of the mature security controls drop away.


Firmware and hardware, residing beneath the software layers, tend to be more rigid and represent a significantly greater challenge to compromise and scale attacks against.  However, success at the lower levels means bypassing most of the detection and remediation security controls, which live above in the software.  Hacking hardware is very rare and intricate, but not impossible.  The level of difficulty tends to be a major deterrent, while the ample opportunities and ease of exploitation in the software layers are more than enough to keep hackers pursuing easier exploits in pursuit of their objectives. 


Attackers are moving down the stack.  There are tradeoffs to attacks at any level.  The easy vulnerabilities in data and applications yield far fewer benefits for attackers in the way of remaining undetected, persisting after actions are taken against them, and the overall level of control they can gain.  Most security products, patches, and services work at this level and have been adapted to detect, prevent, and evict software-based attacks.  Due to the difficulty and lack of obvious success, most vulnerability research doesn’t explore much in the firmware and hardware space.  This is changing.  It is only natural that attackers will seek to maneuver where security is not pervasive. 


Rowhammer began as a theoretical vulnerability, one with potentially significant ramifications.  To highlight its viability, the highly skilled Google Project Zero team developed two exploits that demonstrated the reality of gaining kernel privileges.  A blog post from Rob Graham, CEO of Errata Security, provides more information on the technical challenges and details.


Is Rowhammer an immediate threat?  Probably not.  Memory vendors have been aware of this issue for some time and have instituted new controls to undermine the current techniques.  But this shows at a practical level how hardware can be manipulated by attackers and at a theoretical level how this could have severe consequences which are very difficult to protect against.


As investments in offensive cyber capabilities from nations, organized crime syndicates, and elite hackers-for-hire continue to grow, new areas such as hardware vulnerabilities will be explored and exploited.  Rowhammer is a game-changer with respect to influencing the direction of vulnerability research.  It is breaking new ground that others will follow, eventually leading to broad hardware vulnerability research across the computing products that influence our daily lives.  Hardware and firmware hacking is part of the natural evolution of cybersecurity, and therefore a part of our future we must eventually deal with.



Twitter: @Matt_Rosenquist

IT Peer Network: My Previous Posts




Desktop Tutorial: How to Optimize Clinician Workflow with All-In-One Devices

Technology is making huge advances in all spheres of life, especially in healthcare. Clinicians have a wider range of devices at their disposal and can choose the best device based on their needs. With increased connectivity, clinicians are able to turn to mobile devices for their portability and versatility, but for certain tasks that require a large screen size, plugged-in capability and high-performance power, all-in-one (AiO) desktop computers might be a better bet.


The right device for the right time

AiO computers are capable of performing multiple functions that require a huge amount of data, making them ideal for many situations in a healthcare setting. For instance:


  • Senior administrators can use the touch and voice commands, combined with large and immersive screens to quickly navigate through large data files or numerous reports. Plus, AiOs take up little precious room on the desk or at a shared station, and technologies like Intel® RealSense™ can use facial recognition as a sign-on for added protection in a multi-user environment.
  • Surgeons in the operating room can connect critical monitoring devices to new AiOs so that real-time data needed by anesthesiologists, nurses, and physicians, along with a patient’s full medical history, is consolidated onto one large screen during a procedure. This provides a more holistic view of the patient to make better operating room decisions.
  • Doctors and nurses can use an AiO to replace a bedside terminal to collaborate with patients on critical care questions. After they sign off, the desktop can also be used by the patient and family members as their big-screen TV, streaming music station, or voice-enabled Web browsing desktop.
  • Teleradiologists will appreciate the processing power and screen size of an AiO for examining X-rays and CAT scans in minute detail. With touch integration, they can rotate, enlarge, measure, and expand images without maxing out processing power as they would on a laptop. The sleek footprint also gives new AiOs better usability in tight spaces, such as patient examination rooms or the ER, and it’s easy to plug in a handheld microphone for direct dictation.
  • Ob-gyns can take advantage of new, low-cost technology such as the USB probes that plug into AiOs to become ultrasound machines. Imagine being able to save tens of thousands of dollars on stand-alone ultrasound equipment by making use of the powerful performance and features of new desktop computers.

Better workflow and security

Unlike mobile devices, where data can lag while it’s sent to and from the cloud, desktop systems connect directly to the network to streamline workflow because everything is updated in real time. This can be especially valuable in hospital areas where Wi-Fi is problematic or in rooms purposely built to block X-rays. To speed things up even further, no additional encryption is needed for data in flight or at rest, as would be required for a mobile wireless device.


Additionally, Intel® vPro™ technology allows these powerful devices to be easily and even remotely managed, which can be especially valuable for smaller clinics that don’t have a dedicated IT department. Lastly, AiOs offer more physical security—it’s difficult to walk off with a desktop after all.


What questions about desktop computers in healthcare do you have? Do you use both mobile and desktop in your healthcare environment?


Intel Gateway Solution Is at the Center of Internet of Things Deployments

The true potential of the Internet of Things (IoT) can only be reached when smart, embedded devices can interact and share data with the cloud, unlocking useful data that can provide new, invaluable insights to an organization. With enterprise demand for embedded IoT on the rise, however, organizations face growing challenges around fragmentation, interoperability, and security risk.


To combat this, Intel has created unique, fully integrated hardware and software building blocks designed to connect devices, aggregate information, analyze data locally, and open the communication channel so secure data can flow into the cloud.



Expediting Intelligent Solutions


Available since early 2014, Intel Gateway Solution is a scalable, flexible family of integrated options that include the Intel Quark SoC X1000, Intel Quark SoC X1020D, and Intel Atom E3826 processors. The Intel Gateway Solution has the ability to connect legacy and new systems, ensuring that data generated by devices can flow securely from the edge to the cloud.
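The edge-to-cloud pattern described above, aggregating and analyzing data locally so only useful results flow upstream, can be sketched in a few lines. This is a hypothetical illustration with made-up field names and thresholds, not the Intel Gateway software stack itself:

```python
# A minimal sketch of the gateway pattern: collect readings from local
# devices, analyze at the edge, and send only a compact summary upstream.
# Field names and the alert threshold are illustrative, not a real API.

from statistics import mean

def summarize(readings, alert_threshold=80.0):
    """Edge analytics: reduce raw samples to one record for the cloud."""
    return {
        "avg": round(mean(readings), 2),
        "max": max(readings),
        "alert": max(readings) > alert_threshold,
    }

raw = [71.2, 69.8, 84.5, 70.1]   # e.g., temperature samples from sensors
payload = summarize(raw)          # only this record crosses to the cloud
print(payload)                    # {'avg': 73.9, 'max': 84.5, 'alert': True}
```

The design point is bandwidth and latency: raw samples stay at the edge, while the cloud receives a small, analysis-ready record.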


“It’s really about bringing together all the critical elements and really accelerating our customer’s time to market,” states Adam Burns, director of IoT Solutions Group at Intel. “We’ve put all the security elements in there so they can start out with a secure system…we’ve got the application environment, so their investment is really focusing on building their value-added applications and services, not creating the wheel on a bunch of foundational building blocks.”


Intel Gateway Solution is commercially ready, integrated with Wind River and McAfee software, and suitable for quick implementation into your own IoT infrastructure — something no one else on the market can currently claim.


Built-In Security


Security is a key building block of IoT. Without security integrated at every level, an IoT deployment can go seriously awry. In addition, IoT will never achieve high levels of adoption if people don’t trust that their data transfers are secure.


Intel Gateway Solution, coupled with McAfee Embedded Control and Wind River Intelligent Device Platform XT 2.1, provides rich enterprise-grade security features — including secure boot, GRSecurity, and IMA, to name a few — for strong support and end-to-end security protection.


The Internet-connected, data-driven era has arrived. Embedded sensors and devices have the ability to transform the enterprise, allowing for greater intelligence, cost efficiency, and value — Intel Gateway Solution is an essential component for any IoT business model, providing endless potential for innovation.


If you’re interested in learning more about what I’ve discussed in this blog, tune in to the festivities and highlights from CeBit 2015.


To continue this conversation, use #ITCenter.


Giving Customers a Reason to Trust Digital Banking

There’s some amazing innovation happening in the financial services industry right now. From new mobile banking initiatives to peer-to-peer lending options, the landscape is changing fast. But fears about security continue to hover like a dark cloud. In my previous blog, I talked about how cybersecurity remains a massive threat, with an estimated $400 billion lost to cybercrime each year. And with the majority of customers accessing their bank through digital channels, security is a huge and growing concern.


Security and Convenience Are at the Heart of Trust


More than ever, financial institutions need to build trust and address the identity concerns of their customers. Building trust is about more than security — it’s also about the convenience of the overall experience. To truly provide a secure and frictionless digital banking experience, financial institutions want to offer strong authentication and convenient transaction authorization, so customers can perform transactions quickly and securely.



At Intel, we are helping our customers through the use of Intel Identity Protection Technology (IPT). Intel IPT is a hardware-based identity technology that embeds identity management directly into the customer’s device. With Intel IPT, banks issue a secure token that is stored in the security engine of a device’s Intel processor. For each banking service accessed, the bank generates and verifies a unique, one-time password, eliminating the need to verify identity using multiple factors for each premium service. Banks can also check the user’s presence through password verification.
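Conceptually, the one-time-password exchange resembles the standard HMAC-based OTP scheme (RFC 4226) sketched below. This is an illustration, not Intel IPT’s actual implementation; the shared secret here stands in for the token kept in the processor’s security engine.

```python
# Illustrative HMAC-based one-time password (HOTP, RFC 4226). NOT Intel
# IPT's implementation: the shared secret below stands in for the token
# provisioned into the device's security engine.

import hashlib, hmac, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"provisioned-device-secret"   # hypothetical token material
code1 = hotp(secret, counter=1)
code2 = hotp(secret, counter=2)
print(code1, code2)                      # two distinct 6-digit codes
```

Because both sides derive the code from the same secret and counter, the bank can recompute and verify each code without the user typing anything extra, which is the convenience Intel IPT delivers in hardware.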


Intel IPT Gains Traction in Turkey


Intel IPT has had success in Turkey, which is fast becoming a center of excellence in terms of innovative technology. Specifically, the country’s two largest banks have built their digital banking platform on top of Intel IPT and other technologies. Let’s take a closer look at how those banks are using Intel IPT to strengthen security for online and mobile banking.




Isbank


The first bank in the world to use Intel IPT in a mobile banking application, Isbank now has more than 1 million mobile customers. Like all financial institutions in Turkey, Isbank must comply with banking regulations that require the use of two-factor authentication. By using Intel IPT, Isbank customers don’t need to enter the OTPs sent by SMS or generated by hardware or software token generators to use each banking service. Additionally, the bank can verify users’ mobile devices through the hardware-based Intel IPT solution, which supports 2,000 different mobile phone models and all operating systems. To date, the bank has reported more than 30,000 Intel IPT users, a number that continues to grow by the day.


Garanti Bank


Garanti Bank has deployed Intel IPT into its transaction processes, combining the technology with a cloud-based authentication solution. To use the solution, Garanti customers download an applet from the bank’s website. The applet, which runs on Ultrabook laptops or other mobile devices, activates the cloud service and creates a unique secret code on the Intel processor using IPT. The code facilitates a safe connection to the bank’s Internet banking service every time a customer logs on from their device.


Trust-Banking.jpgFor both banks, the benefits of Intel IPT are clear: they can protect their customers’ identities with authentication deeply embedded in hardware, while also giving those customers a user-friendly way to do their banking. In short, Isbank and Garanti are using their solutions to build trust. Having strengthened its image as an innovator in its customers’ minds, Isbank is already planning to use identity and access management technologies from Intel for its mobile payments website, as well as additional cloud banking applications.


Intel IPT is just one part of our overall security roadmap. Our vision includes the integration of secure biometric techniques to further improve end-to-end identity protection solutions across all platforms, from the client to the cloud. By integrating identity and access management directly into hardware and software, we’re helping ensure our partners can provide the most trusted and convenient experiences possible for their customers.  


To continue the conversation, let’s connect on Twitter.


Mike Blalock

Global Sales Director

Financial Services Industry, Intel


This is the sixth installment of a seven-part series on Tech & Finance. Click here to read blog 1, blog 2, blog 3, blog 4, and blog 5.


HIMSS 15 Make it Personal: Devices, Analytics, and Consumers

The countdown to HIMSS 15 is on. Next month, the healthcare technology community gathers in Chicago April 12-16 for the world’s largest health IT event to see what devices, software, infrastructure and security architecture will be shaping the landscape in 2015 and beyond.


At Intel, we’re approaching HIMSS with a critical eye on three areas that we feel are focal points for CMIOs:


  • The right mobile device for the right decisions at any point of care
  • Clinical analytics
  • Consumer health (IoT, wearables) and the next generation of devices


To learn more about these pillars of healthcare technology, you’re invited to the Intel booth (#2525) to view the latest hardware and software that clinicians are beginning to utilize. We encourage you to sign up to take a guided tour, where you’ll see:


  • A simulated collaboration room with working technology
  • A device bar with applications and demonstrations
  • Server and analytics stations
  • An IoT/wearables table featuring Google Glass, sensors, headphones, and ultrasound technology


When you take a booth tour you’ll also have a chance to win a tablet computer in our HIMSS drawing.


Outside of the Intel booth, you will find our experts sharing their knowledge in a number of forums.


Finally, be sure to follow us on Twitter to keep up-to-date on all the happenings going on at the event. We’ll be live tweeting from the show floor and sharing pictures of cool health IT products/services that we discover.


HIMSS is always a great event and we are looking forward to seeing you in Chicago.


What questions about HIMSS do you have? What are you most looking forward to seeing during the show?


Intel Xeon Processors Power Video Game Creation

Did you know that some of today’s top-selling games have production budgets that rival or even exceed those of Hollywood movies? Young game developers have a bright future ahead of them, with the opportunity to create increasingly complex and exciting worlds that can have significant cultural impact as well as commercial success.


However, rising to the top of this highly competitive industry takes a lot of work and dedication. Those who want to break into the field can start early, with a number of universities now offering software development degrees that specialize in gaming.


Game Development at the Collegiate Level


When University Campus Suffolk (UCS) opened its doors for the first time in 2007, its bachelor’s degree course in Computer Game Design was a key offering in the prospectus. The course has been popular ever since, and UCS has built a name for itself through its high-quality teaching and the success of its students and graduates, such as the pair that won the Walking Dead game jam in October 2013.


It’s not just about the people, though; like any other craftsman, the game maker needs the right tools to create a masterpiece. Compute-intensive tasks, such as 3D modeling and real-time editing, demand workstations with strong performance. As an educational institution with budget constraints, UCS needed to achieve this performance while also being mindful of the energy consumption of its development lab and making it an effective learning environment.


Powerful Gamers Deserve Powerful Processors


When students started doing more coding at home than in the lab because their home PCs could better cope with the computing demands of their project work, UCS decided to invest in upgrading its resources. It replaced its aging consumer PC fleet with 47 Fujitsu CELSIUS M730 workstations powered by the Intel Xeon processor E5-1620 v2, consciously choosing the type of platform its students can expect to use in their professional careers. You can read more about the deployment here.


The new machines combine the performance that tomorrow’s star game developers need to hone their skills with the price performance that the university’s finance team demands. You can hear from the team themselves about how the cooler, quieter, higher-performance workstations have improved the classroom environment by watching this video.


Jane Williams
Online Sales Development Manager
Intel Corporation


Building Support for SDI with Intel Ethernet at Mobile World Congress

It seems ironic to be blogging about wired Ethernet in conjunction with the world’s largest wireless technology show, Mobile World Congress.


But it’s actually very relevant because the “wireless” network is mostly wired and Ethernet is becoming a bigger part of this infrastructure.


That’s why we teamed with Cisco for a joint demonstration of a new Ethernet technology at MWC that shows the potential for virtualized Ethernet as a key part of Intel’s software-defined infrastructure initiative.


SDI is Intel’s vision of the future of the data center – both for enterprises and service providers. In an SDI data center, the network, compute and storage infrastructure are virtualized and managed by orchestration software to enable IT – or applications – to dynamically define and assign resources.


Compute and storage virtualization have been ongoing for some time, but the physical network is just now being virtualized via network virtualization overlays (NVOs). NVOs are new packet encapsulation techniques that allow the industry to realize some of the same flexibility that we see today in virtualized compute and storage.


Intel has supported acceleration for early NVO protocols (Geneve, VXLAN, and NVGRE) in our Intel® Ethernet Controller XL710, codenamed Fortville. And now we’re supporting the latest NVO protocol – Network Service Header.
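To make the encapsulation idea concrete, here is what one of those overlay protocols adds on the wire. The sketch builds a VXLAN header per RFC 7348: 8 bytes carrying a 24-bit virtual network identifier (VNI), which then travels inside UDP/IP ahead of the original Ethernet frame. Illustrative Python, not hardware offload code:

```python
# Minimal VXLAN header construction (RFC 7348): 8 bytes total.
# Word 1: flags (0x08 = "VNI present") then 24 reserved bits.
# Word 2: 24-bit virtual network identifier (VNI) then 8 reserved bits.

import struct

def vxlan_header(vni: int) -> bytes:
    if not 0 <= vni < 2 ** 24:
        raise ValueError("VNI is a 24-bit field")
    return struct.pack("!II", 0x08 << 24, vni << 8)

hdr = vxlan_header(vni=5001)
assert len(hdr) == 8
# On the wire: outer Ethernet/IP/UDP + hdr + the original tenant frame.
print(hdr.hex())  # 0800000000138900
```

The VNI is what lets thousands of tenant networks share one physical fabric, and it is exactly the kind of field the XL710 parses and accelerates in hardware.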


NSH is a protocol being standardized in the IETF and originally developed by Cisco. It is an important advance in the ability to create service chains, simplifying network design by routing packets through specific network services (firewall, encryption, etc.) in a virtualized network, which makes complex services much easier to build.
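The service-chain idea can be modeled simply: a packet carries a service path identifier and an index, and forwarding follows the chain those values select rather than the network topology. A toy sketch (chain contents and names are made up; real NSH carries these fields in a packet header):

```python
# Toy model of NSH-style service chaining: the (service path ID, index)
# pair on the packet, not the topology, decides the next service hop.
# Chain definitions here are hypothetical.

CHAINS = {
    10: ["firewall", "ids", "encryptor"],   # service path 10
    20: ["load_balancer", "firewall"],      # service path 20
}

def next_service(spi: int, si: int) -> str:
    """Return the next service for a packet tagged (SPI=spi, SI=si)."""
    return CHAINS[spi][si]

path = [next_service(10, i) for i in range(len(CHAINS[10]))]
print(path)  # ['firewall', 'ids', 'encryptor']
```

Changing a service chain then becomes a table update rather than a rewiring job, which is the operational win behind service chaining.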


Which brings us to MWC where Intel and Cisco demonstrated an NFV security service-chaining application based on NSH using Intel’s Red Rock Canyon-based customer reference board.


Red Rock Canyon is a new breed of Ethernet product that integrates an Ethernet switch with high-speed network interface controllers. Red Rock Canyon includes PCIe 3.0 interfaces as a low latency way to connect Intel® Xeon-based servers to the network. Red Rock Canyon is now sampling to customers and will launch later this year.


It’s important to note that we’re supporting NSH in firmware at wire speed. This is an innovation that we originally developed for the XL710, which is now available in Red Rock Canyon too. This flexible protocol support is a critical precursor to Ethernet’s role as the interconnect for SDI.


The mission of my team is to provide Intel Ethernet network virtualization and switch innovations that enable the SDI vision.


This means that in addition to controllers, adapters and switches, we’ll be building our flexible Intel® Ethernet into Xeon CPUs and other devices to integrate it more completely with the compute node and with our SDI architecture.


The network may be the last component of the data center to be virtualized, but that is happening right now and it is a major milestone for the promise of SDI to be fully realized.


How Intel IT Generated US$351 Million in Value using Analytics

In contrast to a number of recent reports that many Big Data projects fail to deliver, Intel IT’s 2014-2015 Annual Business Review states that Intel’s Information Technology Group (IT) has generated $351 million of value for Intel through the use of Big Data, Business Intelligence (BI), and analytics tools.  That is a remarkably large number.  Intel CIO Kim Stevenson talks more about the report in this interview with the Wall Street Journal.


How has Intel IT managed to generate that much value?  Over the next few months, I am going to be blogging about some of the ways that Intel uses analytics and how Intel IT looks to provide analytics in the future.  I’ll talk about some Big Data/Analytics use cases, as well as some of our methods and techniques for maintaining our analytics infrastructure.  Among the use cases is an extension of my data de-identification work into Big Data.  I’ll also talk about some of the challenges that Intel IT has encountered on our Big Data and analytics path, how we solved those, and the issues that we still face.
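As a flavor of what de-identification involves, a common first step is replacing direct identifiers with keyed pseudonyms so records can still be joined and analyzed without exposing who they describe. A hedged sketch follows; the key and field names are hypothetical, and this is not Intel IT’s actual method:

```python
# Illustrative keyed pseudonymization: direct identifiers are replaced by
# HMAC-derived tokens; the same input maps to the same token, so joins and
# aggregation still work. Key and field names are hypothetical.

import hashlib, hmac

KEY = b"rotate-me-regularly"

def pseudonymize(record: dict, id_fields=("employee_id", "email")) -> dict:
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hmac.new(KEY, str(out[field]).encode(),
                              hashlib.sha256).hexdigest()
            out[field] = digest[:16]
    return out

rec = {"employee_id": "e12345", "email": "a@example.com", "hours": 42}
safe = pseudonymize(rec)
assert safe["hours"] == 42                        # analytic fields untouched
assert safe["employee_id"] != rec["employee_id"]  # identifier replaced
```

Using a keyed HMAC rather than a plain hash matters: without the key, anyone could hash a known identifier and match it against the de-identified data.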


Enterprise Social Networks: Build Communities in Your Business

“Connecting people to people and people to information and supporting users to surface information in real time is the ultimate goal of enterprise social software.” – Vanessa Thompson


According to IDC, revenue from worldwide enterprise social networks is predicted to grow from $1.24 billion in 2013 to $3.5 billion in 2018. There’s no doubt about the increasing demand for social tools in the workplace. Businesses are starting to recognize the value of business social networking, as shown by a recent case study by Dr. Chris Archer-Brown at the University of Bath: “There is a theme that engagement in the network acts as an indicator of the strength of the relationship between the staff and the organisation, which leads to benefits for the employee that can be observed by greater satisfaction, increased loyalty and longer tenure.”


Social business can drive collaboration, knowledge sharing, and partnerships while keeping remote and mobile employees better connected. Adoption, however, isn’t 100 percent about the technology employed. Through 2015, Gartner estimates that 80 percent of social business efforts will not be as effective because leadership will place too much emphasis on technology. Social tools platforms need a strong foundation in order to work — executives from the top down need to own social transformation and shifts in the cultural ecosystem for tools and processes to be successfully employed. Without contributor support and clear objectives, business social strategies will run the risk of outright failure.


The Driving Force of Consumerization


Well-known social sites allow users to access and post information quickly. Enterprise social tools need the same access. “Just as consumer social sites serve as central hubs to fulfill a range of needs and activities, enterprise social software has to be able to be a central platform from which to do work and communicate with stakeholders throughout an organization,” notes Bryant Harland from No Jitter. Through open APIs, businesses can deeply integrate social tools into established business platforms. This will allow the same seamless functionality employees are already getting in their personal lives, encouraging user engagement and breaking down information silos.


Collaboration Communities in the Enterprise


Intel IT recently implemented a single collaboration platform with social interactivity as a means to better connect sales and marketing teams. Based on extensive user research, the team designed a sales model for high-value collaboration ranging from specific tasks to general interaction. The model revolved around three types of collaboration communities: account and opportunity, work, and interest-based.




Before this platform, information was received primarily through email and filtered through different levels of management to sales staff. The use of collaboration communities eliminates the middleman. When information is posted, members of that community can respond immediately, and all comments and decisions are available in real time. Collaboration communities allow for highly effective communication techniques, ranging from crowdsourcing to networking to scaled sharing.


The experience was a huge success; the platform tapped into the collective knowledge of the sales teams, giving team members visibility and improved productivity. It’s a first step toward providing a collaborative platform across the entire enterprise.


To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter.
