ADVISOR DETAILS

RECENT BLOG POSTS

Riding and Taming Security’s Perfect Storm with Intel Core vPro

Cyberthreats, unfortunately, never take a holiday. In fact, with each passing day, attacks become more numerous, organized, powerful, and, with the explosion in smart devices and cloud-based systems, more opportunistic.

 

No wonder 50 percent of the 182 IT professionals who participated in Computerworld’s Forecast 2016 survey said they plan to increase spending on security technologies in the next 12 months. Security ran a close second after cloud computing as the most important technology project currently underway at their organizations.

 

Security’s ‘Perfect Storm’


Mike Seawright, director of security business development at Intel, discussed these challenges in Secure Your Business, our latest webinar in the Business Devices Webinar Series. Not only are IT security professionals facing increasing complexity from the growing number of devices and the shift to cloud computing, but they must also act quickly, since organizations can be compromised in mere minutes, all while working with limited staffing and budgets.

 

The latest devices with Intel vPro technology offer a solid first line of defense in preventing threats. Built on Intel’s security technologies, each successive generation delivers stronger security capabilities. Intel Core vPro processors feature remote management capabilities that allow scarce IT staff to protect compute devices across the enterprise more efficiently.


What Aspects of Security Are Most Important?


Unfortunately, there is no easy strategy to take in IT security. “Security is complicated—sorry folks!” Mike said. To be truly secure, he explained, IT departments need to defend all areas against modern attacks: identity, platform, data, and applications.


However, Mike explained, a whopping half of all security breaches stem from identity and authentication gaps, so stronger authentication is a key part of security. Fortunately, Intel and Microsoft work collaboratively to combat security threats with user-friendly features and technologies such as True Key by Intel Security, Microsoft Credential Guard, and Intel Identity Protection Technology Multifactor Authentication.


These and other multifaceted defensive tactics and tools were explained in the hour-long webinar, which included a Q&A session. Here is a sample of what webinar participants had on their minds:


Q: I have health care clients. Do you have a security checklist?

 

Mike: Our health care team has a presentation you could use for this. Send me a note at michael.seawright@intel.com.

 

Q: Does True Key update itself?

 

Mike: True Key is like most software in that some portions will update automatically if that setting is applied. But then as we have major releases, it will usually require a user update.

 

Q: Are there any encryption key “manager” apps available for SMBs or partners that are acting as the IT department for multiple SMBs?

 

Mike: The McAfee ePolicy Orchestrator does a nice job of this. Another vendor to look at would be Venafi.

 

If you missed the webinar, you can listen in to the on-demand version available now and hear other questions and answers as well as download the presentation slides.

 

Ask a Question, Win a Tablet


This month, the lucky winners of a new Intel-based tablet and a new set of SMS Audio BioSport Smart Earbuds are Ed Goad of MeteorComm and D. Komnick of Advanced Business Technology Services, respectively. Congrats to both! And, if you didn’t win this time, you’ll have another chance to ask questions and win at the next webinar, which is sure to be a popular one: Introducing 6th Gen Intel Core vPro.

 

If you’ve already registered for the Business Devices Webinar Series, you’re all set: just click on the link in the reminder email you’ll receive a day or two before the event. But if you need to register, you can join our next webinar by clicking here.

 

With the latest Intel Core vPro processor-based devices, businesses big and small can set and reach their New Year’s resolution to make their entire enterprise more secure.

Read more >

Reinventing the Retail Experience with Intel, SAP and the Internet of Things

’Tis the season to go shopping—at least if you are one of the millions of people buying gifts for the holidays. If so, perhaps you’ve noticed that your shopping strategies have changed in recent years. Fewer and fewer people go window shopping or wander the merchandise aisles seeking inspiration. Instead, the retail experience for many begins with researching products using web and mobile tools before going to a brick-and-mortar store to make further evaluations and a purchase decision. While more than 90 percent of retail transactions still occur in stores, over 60 percent of purchase research begins online. That means consumers come to retail locations better informed than ever, often with their minds already made up about what they wish to purchase.

 

This can be a challenge for retailers who rely on traditional marketing strategies, in which sales associates engage with customers and help guide purchase decisions, increasing the size of the sales basket in the process.

 

How can a retailer compete? Well, it turns out that consumers aren’t the only ones with access to new sources of data. Innovative retailers can use new technologies—including networks of smart sensors, Internet of Things (IoT) gateways, and cloud-based data analytics—to learn more about individual consumers and their preferences; to personalize the customer’s shopping experience and increase engagement and sales; and to deliver a unified retail experience from online to in-store via smart signage, loyalty programs, targeted sales assistance, mobile POS systems, and more.

 

Enabling these kinds of data-driven retail experiences requires a secure, end-to-end solution that extends seamlessly from sensor-based data capture at the store level, to transaction data collected at retail locations, all the way to analytics processing in the cloud. Such a solution helps retailers improve engagement with their customers, better understand what’s happening in their stores, and increase the size of individual transactions. To help retailers gain these data-driven insights, Intel and SAP are working together to create an end-to-end IoT solution that delivers actionable retail insights in near real time. This joint solution, which includes Intel® IoT Gateways to capture and secure sensor data, SAP SQL Anywhere* database software to enable intelligence at the edge, and the SAP HANA* cloud, will allow retailers to use real-time analytics to act faster, make smarter decisions, and know more about their customers than ever before. To learn more about the Intel and SAP IoT solution, watch this video:

 

 

When we talk about bringing improved analytics to the retail industry, I want to emphasize that this new solution enables two very different types of analytics. The first is conventional big data analytics, the kind of number crunching that typically takes place in corporate offices and allows high-level retail decision-makers to decide where to locate the next store and view sales patterns for the season’s hot products. This kind of analytics has traditionally been processed on its own siloed computing infrastructure, completely apart from the transactional infrastructure that runs the sales and transactional systems for daily retail operations, because you wouldn’t risk running a big analytics query on the same system that held your transactions log, for fear that it might crash the system. But SAP HANA, running on Intel® Xeon® processors, has the power to run both analytical and transactional workloads on the same infrastructure, and in memory. This not only offers the potential to lower infrastructure costs, it means that transactional information can be made available for analysis as soon as it is processed, enabling retailers to act on those insights in near real-time.
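The payoff of collapsing the analytics silo into the transactional store can be sketched in miniature. The following Python example uses SQLite’s in-memory mode purely as a stand-in (SAP HANA’s actual interfaces and scale are very different; the table and values are invented): rows written by the transactional side are visible to an analytical aggregate immediately, with no batch load in between.

```python
import sqlite3

# One in-memory store serves both roles: no ETL step and no separate
# analytics silo -- a rough stand-in for the HTAP idea described above.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (store TEXT, sku TEXT, amount REAL)")

# Transactional side: individual sales land as they happen.
db.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("NYC-01", "sku-42", 19.99),
     ("NYC-01", "sku-07", 5.49),
     ("BOS-02", "sku-42", 19.99)],
)

# Analytical side: the aggregate sees those rows immediately,
# with no batch load in between.
for store, total in db.execute(
        "SELECT store, SUM(amount) FROM sales GROUP BY store ORDER BY store"):
    print(store, round(total, 2))
```

The same pattern, run at scale against an in-memory column store, is what lets a retailer act on a transaction moments after it is processed.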

 

The Intel and SAP IoT solution doesn’t just enable analytics at the macro level; it also delivers immediate analytical insights at the store level. Using sensor-based tools such as smart mirrors and fixtures, traffic-flow detectors, interactive visual aids, inventory tags, and personal tracking technologies, Intel IoT Gateways can give retailers instant access to new sources of data that identify customers and their past purchases and preferences, and offer personalized products and services that differentiate the shopping experience from competitors’. These instantaneous analytics, drawn from the store itself, deliver insights not just about customers but also about how to offer a more engaging customer experience in the store. This can include store layout and traffic-flow information, suggestions for strategic pairings of products, identification of inventory that is not selling well, and alerts about up-to-the-second shopping trends. Collection and real-time analysis of data at the network edge give a store manager the ability to address in-store dynamics immediately; the IoT gateway also routes the data to the cloud for traditional analytics.

 

Intel products and technologies translate IoT into business value and differentiation for the retail industry. Intel provides the critical, foundational building blocks for building secure IoT networks, with a proven portfolio of compute, storage, network and software components that span from the edge to the cloud. Because the Intel platform is modular, retailers can build their infrastructure as their needs evolve; they can also unlock value from the data and infrastructure they already have.  Essential for retailers, Intel also offers a variety of security technologies built into its processors, including encryption and tokenization, to ensure protection of both customer and retail data. Intel is also at the center of a partner ecosystem that includes not only SAP but retail industry ISVs, and our technology is vendor neutral, standards based, and interoperable.

 

Intel may not be the first company that comes to mind when you think of retail, but if you peer behind the thin veneer that is the storefront, you’ll find that the technology and innovation powering retail solutions are similar to the platforms in other industries where Intel is a proven leader.

 

Interested in learning more about Intel’s retail solutions? Visit our website and join us at the National Retail Federation (NRF) annual conference, to be held January 17–20, 2016, in New York City. Stop by Intel booth #2543 to say hello and let us show you how Intel is revolutionizing retail.

 

Follow me at @TimIntel and #TechTim to keep up with the latest with Intel and SAP.

Read more >

Sensing the Simple Way to Better Healthcare

Technology can solve complex healthcare problems, whether that means analysing large volumes of genomic data or allowing a specialist to see a 3D image of a beating heart. But we often overlook the simple, day-to-day tasks where technology is having a meaningful impact for patients and healthcare professionals today.

 

Providing Efficiency in Clinician’s Workflow

I’m seeing a lot of interest and excitement here in China around the Internet of Things in healthcare. Sensors are increasingly being used to not only provide more efficiency in a clinician’s workflow in a hospital setting but also to help those patients who require care in the home to live more independent lives.

 

A great example in development that I’d like to share is the Intel Edison-based uSleepCare intelligent bed, which can record a patient’s vital signs, such as rate and depth of breathing, heart rate, and heart rate variability (HRV), without the need for nurse intervention. Movement sensors also help to identify causes for concern, such as pressure ulcers or the risk of a patient falling out of bed, either of which may prolong a hospital stay.

 

Early Identification of Abnormalities

The sensors not only collect data but also use WiFi to transmit it seamlessly to a cloud platform for analysis, where it can be used in a variety of meaningful ways. The most obvious and pressing use is early identification of abnormalities, which can alert nursing staff to the need for human intervention and reduce the need for nurses to be ‘doing the rounds’, a practice that is resource-intensive and costly for providers.
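As a rough sketch of that alerting step, the following Python fragment screens streamed readings against normal ranges. The field names and thresholds here are invented for illustration; they are not from the uSleepCare product or any clinical guideline.

```python
# Hypothetical threshold-based screening of streamed vital signs.
# Ranges below are illustrative only, not clinical guidance.
NORMAL_RANGES = {
    "heart_rate": (50, 110),   # beats per minute
    "resp_rate":  (10, 24),    # breaths per minute
}

def screen_reading(reading):
    """Return a list of alert strings for values outside the normal range."""
    alerts = []
    for signal, (low, high) in NORMAL_RANGES.items():
        value = reading.get(signal)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{reading['bed']}: {signal}={value} outside {low}-{high}")
    return alerts

# Simulated readings from two beds; only the abnormal one raises an alert.
stream = [
    {"bed": "ward3-bed12", "heart_rate": 72,  "resp_rate": 16},
    {"bed": "ward3-bed14", "heart_rate": 134, "resp_rate": 22},
]
for reading in stream:
    for alert in screen_reading(reading):
        print("ALERT ->", alert)
```

In a real deployment this logic would run in the cloud platform (or at the gateway) and push the alert to a nurse’s mobile device rather than print it.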

 

Additionally, the archive of data helps clinicians tackle chronic diseases at the patient level, spotting trends where a patient’s condition may be worsening or improving. This is particularly valuable as devices such as the uSleepCare intelligent bed become available in a homecare setting. Imagine a community nurse being able to prioritise visits to those patients showing abnormal signs as recorded by IoT sensors, alerted in real time on a mobile device. This is truly mobile healthcare: delivering the right care where and when it is needed, with the right information at the clinician’s fingertips.

 

Data Collection Brings Efficiencies

And as this sensor technology becomes more prevalent in both the hospital and homecare setting, the data becomes increasingly useful at a population level too. It will assist providers in spotting trends which will in turn help them to become more efficient and allocate resources where appropriate.

 

All of which ultimately benefits the patient, particularly those with chronic conditions. They will perhaps spend less time in hospital with an improved level of care and be able to spend more time at home, with the confidence that their condition is being monitored by a healthcare professional 24/7.

 

The Internet of Things is having a rapidly transformative effect on healthcare. Investment by providers in sensor technology such as the Intel Edison-based uSleepCare intelligent bed is helping to drive efficiency savings while also having a meaningful impact on patient care. In China we’re already pushing forward with implementation in this area, and I look forward to sharing the results in the future.

 

Read more >

Capitalizing on Non-Volatile Memory Technologies

I recently gave a talk at SNIA’s annual Storage Developer Conference (SDC), filling in for a colleague who was unable to travel to the event. The talk, “Big Data Analytics on Object Storage—Hadoop Over Ceph Object Storage with SSD Cache,” highlighted work Intel is doing in the open source community to enable end users to fully realize the benefits of non-volatile memory technologies employed in object storage platforms [1].

 

In this post, I will walk through some of the key themes from the SNIA talk, with one caveat: This discussion is inherently technical. It is meant for people who enjoy diving down into the details of storage architectures.

 

Let’s begin with the basics. Object storage systems, such as Amazon Web Service’s Simple Storage Service (S3) or the Hadoop Distributed File System (HDFS), consist of two components: an access model that adheres to a well-specified interface and a non-POSIX-compliant file system.

 

In the context of this latter component, storage tiering has emerged as a required capability. For the purpose of this discussion, I will refer to two tiers: hot and warm. The hot tier is used to service all end-user-initiated I/O requests while the warm tier is responsible for maximizing storage efficiency. This data center storage design is becoming increasingly popular.

 

Storage services in a data center are concerned with the placement of data on the increasingly diverse storage media options available to them. Such data placement is an economic decision based on the performance and capacity requirements of an organization’s workloads. On the dollars-per-unit-of-performance vector, there is often a significant advantage to placing data in DRAM or on a storage medium such as 3D XPoint that has near-DRAM speed. On the dollars-per-unit-of-capacity vector, there is great motivation to place infrequently accessed data on the least expensive media: typically rotational disk drives, though 3D NAND is increasingly an option.
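That economic trade-off can be illustrated with a toy placement function that picks the cheapest tier still meeting a workload’s latency requirement. All prices and latencies below are invented for illustration; they are not actual figures for these media.

```python
# Toy data-placement decision: pick the cheapest tier that still meets
# a workload's latency requirement. All numbers are invented.
TIERS = [
    # (name, $ per GB, typical access latency in microseconds)
    ("DRAM",       7.00,     0.1),
    ("3D XPoint",  2.00,     1.0),
    ("3D NAND",    0.30,   100.0),
    ("Rotational", 0.03, 10000.0),
]

def place(capacity_gb, max_latency_us):
    """Return (tier name, total cost) of the cheapest tier meeting the bound."""
    candidates = [(price * capacity_gb, name)
                  for name, price, lat in TIERS if lat <= max_latency_us]
    cost, name = min(candidates)          # tuples compare by cost first
    return name, cost

# A latency-sensitive working set lands on fast media...
print(place(100, 2.0))            # -> ('3D XPoint', 200.0)
# ...while cold archival data goes to the cheapest option.
print(place(100, 60000.0)[0])     # -> Rotational
```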

 

With such diverse media available (DRAM, 3D XPoint, NAND, and rotational disk), the dynamic of placing frequently accessed, so-called “hot” data on higher-performing media while moving less frequently accessed, “warm” data to less expensive media is increasingly important. How is data classified in terms of its access frequency? How are data placement actions carried out based on that classification? We’ll look more closely at the “how” in a subsequent post. The focus of this discussion is on data classification, specifically within the context of distributed block and file storage systems deployed in a data center.

 

Consider Google applications such as F1, a distributed relational database system built to support the AdWords business, and Megastore, a storage system developed to support online services such as Google App Engine and Compute Engine [2,3]. These applications are built on top of Spanner and BigTable, respectively. In turn, Spanner and BigTable store their data (b-tree-like files and write-ahead logs) in Colossus and the Google File System, respectively [4]. In the case of the Colossus File System (CFS), Google has written about using “Janus” to partition CFS’s flash storage tier for workloads that benefit from the higher-performing media [5]. The focus of that work is on characterizing workloads to differentiate them based on a “cacheability” metric that measures cache hit rates. More recently, companies such as Cloud Physics and Coho Data have published papers along similar lines, focusing on characterizations that efficiently produce a [cache] “miss ratio curve (MRC)” [6-8]. Like Google, the goal is to keep “hot” data on higher-performing media while moving less frequently accessed data to lower-cost media.
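For readers unfamiliar with miss ratio curves: the classic Mattson stack-distance algorithm computes one exactly, by recording each access’s LRU reuse distance and then counting, for every candidate cache size, the accesses whose distance exceeds it. The SHARDS work cited above approximates this far more cheaply; the Python sketch below is the naive textbook version, for illustration only.

```python
def miss_ratio_curve(trace, cache_sizes):
    """Naive LRU stack-distance MRC: miss ratio at each candidate cache size."""
    stack = []        # most-recently-used block at the end
    distances = []    # reuse distance per access (None = cold miss)
    for block in trace:
        if block in stack:
            # Distance 1 means the block was the MRU item; a cache of
            # size >= distance would have produced a hit.
            distances.append(len(stack) - stack.index(block))
            stack.remove(block)
        else:
            distances.append(None)
        stack.append(block)
    n = len(trace)
    return {size: sum(1 for d in distances if d is None or d > size) / n
            for size in cache_sizes}

# A trace that re-references A and B: a 2-block cache already captures
# much of the reuse, so the curve flattens quickly.
print(miss_ratio_curve(["A", "B", "A", "C", "B", "A", "B"], [1, 2, 3]))
```

The resulting size-to-miss-ratio mapping is exactly the kind of curve a tiering engine can consult to decide how much fast media a workload deserves.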

 

What feature of the data-center-wide storage architecture enables such characterization? In both Google’s and Coho’s approaches, the distinction between servicing incoming end-user I/O requests for storage services and accessing backend storage is fundamental. In Google’s case, applications such as F1 and Megastore layer indirectly on top of the distributed storage platform, and the Janus abstraction is transparently interposed between such applications and the file system. Similarly, Coho presents a storage service, such as NFS or HDFS mount points, to end-user applications via a network virtualization layer [9,10]. This approach allows a processing pipeline to be inserted between incoming end-user application I/O requests and Coho’s distributed storage backend.

 

One can imagine incoming I/O operation requests (create, delete, open, close, read, write, snapshot, record, append, and so on) encoded in a well-specified form. Distinguishing between, or classifying, incoming operations with regard to workload, operation type, and the like becomes analogous to request logging in an HTTP/web farm [11]. And as in such web farms, read/access requests are readily directed toward caching facilities, while write/mutation operations can be appended to a log.
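A minimal sketch of that classification step, with hypothetical operation names and in-memory stand-ins for the cache and the log (nothing here reflects a specific product’s API):

```python
# Hypothetical dispatcher in the spirit of the paragraph above: read-type
# operations are steered toward a cache, mutations are appended to a log.
READ_OPS = {"open", "read", "close"}
MUTATE_OPS = {"create", "write", "delete", "append", "snapshot"}

cache_hits, log = {}, []

def dispatch(op, path, payload=None):
    if op in READ_OPS:
        # Service from cache when possible (tracked here as hit/miss).
        hit = path in cache_hits
        cache_hits[path] = True
        return "cache-hit" if hit else "cache-miss"
    if op in MUTATE_OPS:
        # Mutations are simply appended to a write-ahead log.
        log.append((op, path, payload))
        return "logged"
    raise ValueError(f"unknown operation: {op}")

print(dispatch("write", "/vol0/obj1", b"data"))   # -> logged
print(dispatch("read", "/vol0/obj1"))             # -> cache-miss
print(dispatch("read", "/vol0/obj1"))             # -> cache-hit
```

The same split (reads toward caching, mutations toward an append-only log) is what makes the per-operation classification described above cheap enough to do inline.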

 

In other words, from the end user’s perspective storage is just another data-center-resident distributed service, one of many running over shared infrastructure.

 

And what about the requisite data management features of the storage platform? While the end user interacts with a storage service, the backend storage platform is no longer burdened with this type of processing. It is now free to focus on maximizing storage efficiency and providing stewardship over the life of an organization’s ever-growing stream of data.

 

 

References

 

  1. Zhou et al, “Big Data Analytics on Object Storage – Hadoop Over Ceph Object Storage with SSD Cache,” 2015.
  2. Shute et al, “F1: A Distributed SQL Database That Scales,” 2013 http://static.googleusercontent.com/media/research.google.com/en//pubs/archive/41344.pdf
  3. Baker et al, “Megastore: Providing Scalable, Highly Available Storage for Interactive Services,” 2011 http://www.cidrdb.org/cidr2011/Papers/CIDR11_Paper32.pdf
  4. Corbett et al, “Spanner: Google’s Globally-Distributed Database,” 2012 (see section 2.1 Spanserver Software Stack for a discussion on how Spanner uses Colossus) http://static.googleusercontent.com/media/research.google.com/en//archive/spanner-osdi2012.pdf
  5. Albrecht et al, “Janus: Optimal Flash Provisioning for Cloud Storage Workloads,” 2013.
  6. Waldspurger et al, “Efficient MRC Construction with SHARDS,” 2015 http://cdn1.virtualirfan.com/wp-content/uploads/2013/12/shards-cloudphysics-fast15.pdf
  7. Warfield, “Some experiences building a medium sized storage system,” 2015 (see specifically slide 21 “Efficient workload characterization”).
  8. Wires et al, “Characterizing Storage Workloads with Counter Stacks,” 2014.
  9. Cully et al, “Strata: High-Performance Scalable Storage on Virtualized Non-volatile Memory,” 2014.
  10. Warfield et al, “Converging Enterprise Storage and Business Intelligence: A Reference Architecture for Data-Centric Infrastructure,” 2015.
  11. Chen et al, “Client-aware Cloud Storage,” 2014 http://www.csc.lsu.edu/~fchen/publications/papers/msst14-cacs.pdf

Read more >

Wearables, Cultural Changes, and What’s Next

Read Part I: Transforming Healthcare with Patient-Generated Data

Read Part II: How Wearables are Impacting Healthcare

Read Part III: Challenges of User-Generated Data

Read Part IV: Wearables for Tracking Employee Movement

 

This blog series has been about how wearables have become more than a passing trend and are truly changing the way people and organizations think about managing health. I hear from many companies and customers who want to understand how the wearables market is impacting patient care as well as some of the changes taking place with providers, insurers, and employers. So far, I’ve shared some of their questions and my responses. The final question in this series is:

 

What kinds of organizational and cultural changes are driven by patient-generated data?

 

There is definitely a cultural shift, and you get different levels of adoption and excitement on a clinician-by-clinician basis. It is still early days. Some clinicians are championing patient-generated data while others aren’t buying into its significance.

Where I hope Intel can play a role both near term and going forward is with predictive analytics and using streaming wearable data to help inform predictive models and make them more accurate. As I mentioned in earlier posts, we want to make it easier for health systems and clinicians to adopt a data-driven approach, enabling better allocation of limited resources and, ultimately, improving patient outcomes.

 

I am most excited about the ability to monitor patients and members continuously rather than periodically, moving from episodic to real time. That’s the game changer. And it’s enabled by technologies with the combination of very low power consumption, very small form factor or package, and the ability to send sensor data (either directly or via the ubiquitous smartphone) to the cloud or to backend information systems.

 

As wearables become more pervasive it will be exciting to see the industry move beyond consumer-based wearable devices that were developed for fitness purposes to devices with more sophisticated sensing capabilities targeted for healthcare use cases. I feel these devices will have a significant impact on reducing costs and improving outcomes by monitoring conditions and patients 24×7.

 

What questions about wearables do you have?

Read more >

The Greater Manchester Health Devolution Plan: Using Technology to Drive Partnerships between Health and Social Care to Improve Health Outcomes

As much of the developed world faces the burden of escalating healthcare costs, ageing populations, and increased incidence of chronic diseases, countries and localities are experimenting with innovative approaches to address these challenges.

 

In a recent visit to Greater Manchester, England, I learned of an innovative, recently launched initiative that aims to use technology to create a civic partnership between the NHS and social care providers. The principal goals of the initiative are to 1) improve health outcomes, 2) reduce healthcare disparities, and 3) reduce income inequality. While many nations and municipalities are working toward these goals, Greater Manchester’s approach is multi-pronged and systematic, and will use technology to integrate social services into healthcare delivery.

 

Using Technology for Data Storage, Analysis, and Interoperability

Greater Manchester’s plan calls for the creation of “Connected Health Cities” (CHCs), which will be powered by health innovation centers. These centers will assemble the data from multiple sources. What is especially novel about these CHCs is that the collection, management, and analysis of both health and social care data will happen at a scale that until now has been impossible. Shared protocols for data analysis across the CHCs will allow for timelier and more powerful research studies, and ultimately better informed decision-making.

 

Interoperability and integration of myriad data sources will be a vital component for realizing these goals. Clinical, community, and patient-generated data will be used to inform decision-making not just among clinical providers, but also among public health policy makers, planners, social care providers, and researchers. Finally, technology will also support continuous evaluation of the program, and use of actionable measures to drive decision-making.

 

Decision Making at the Local Level

Another unique feature of this plan is that the UK government will provide Greater Manchester with £6 billion to use at its own discretion. Granting control of the budget to local municipalities allows decision making to happen at the lowest possible level.

 

The Greater Manchester Board will set strategies and priorities, but local boards will devise plans that will be tailored depending on the needs of the local environment. The budget will be used not just for healthcare, but also for social programs and public health activities. The technology and data will help to bridge health and social programs at this local level.

 

Tackling Priority Care Pathways

Greater Manchester will focus initially on optimizing four “care pathways.” One pathway will use support tools for self-care to reduce hospital admissions for patients with chronic conditions. Another will support schizophrenia patients by linking self-reported symptoms to responses from community psychiatric nurse visits. Studying the effects of the program on specific pathways will allow for evaluation and iteration to help ensure the program’s success.

 

Public and Patient Engagement

Involvement from the public and from patients is a central pillar of this devolution plan. A panel of 10 patient and public representatives will have a say in how the data is used. The program leaders believe such involvement will ensure sustainability and transparency of these CHCs, and ensure that citizens’ needs are met in this civic partnership.

 

Insights for other Countries and Cities

I am eager to observe the implementation and early results that come from this Greater Manchester initiative, as it is a test bed for the devolution of health and social care. The potential for this initiative to accelerate and scale innovation to reduce disparities and improve population health is exciting. Data will be used from multiple streams in ways that haven’t before been possible, and will be used in a way that is tailored to the local environment. With a growing body of research that highlights the impact that social determinants (e.g., housing, social services and support, access to care) have on healthcare, using technology and data to tackle these issues could help other nations and cities in their efforts to improve population health.

 

Contact Jennifer Esposito on LinkedIn

 

Read more >

5 Questions for Dr. Robert C. Green, Brigham and Women’s Hospital and Harvard Medical School

Dr. Robert Green is director of the Genomes2People Research Program, associate director for Research for Partners Personalized Medicine and associate professor of Medicine at Brigham and Women’s Hospital, a Harvard Medical School teaching hospital in Boston, Massachusetts. We recently sat down with him to discuss his research on personal genomics and where that research is headed in the future.

 

Intel: How has gene sequencing evolved over the last decade?


Green: Genetic sequencing is tremendously exciting and important for the future of medicine. We have all this technology that allows us to sequence, align, and call these variants so much better than we could before. The real question is, how do you integrate that into the practice of medicine? Our research at Brigham and Women’s Hospital is zeroed in on finding out how we can implement genomic information, particularly sequencing, into the practice of medicine for adults or even for newborns.

 

Intel: Are patients interested in consumer genetic testing? If so, why?


Green: Patients want to know if they are at increased risk for particular diseases. They want to know if there are drugs they should or shouldn’t take. Some of them want to know if they are carrying recessive carrier traits that would put them at risk for having a child with a recessive disease if their partner is also carrying a copy of a mutation. A lot of people are really curious about the diseases they already have.

 

Our research helps us understand why people want services like consumer genetic testing. In part, it’s perhaps to predict future illness, but it’s also to explain what they already have or what’s running in their family.

 

Intel: What kind of impact can genomic sequencing have on patients?


Green: Genome sequencing isn’t going to be relevant to everyone’s disease. It isn’t going to be relevant to everyone’s medication. But it is going to be relevant to people with rare diseases and those on some medications. It’s going to be relevant to a lot of people who have cancer. And it’s highly relevant to people who are planning their family and want to do preconception testing to avoid recessive conditions.

 

When it comes to individuals who are dealing with sepsis or bad infection, we’re going to not only sequence those patients, but also sequence the microbiomes of the bacteria that are infecting them.

 

One of the directions for sequencing is to have a file or a database of genomic information a patient can call upon to use when they face a situation where they need a new medication, are going to have a family, or have a mysterious illness. That information will be there. It can be brought up in an electronic health record at the point of care in a decision support manner.

 

Intel: What’s the biggest hurdle right now?


Green: One of the great challenges for the practice of medicine is to keep up with, educate ourselves about, and use research effectively in our practices.

With initiatives to accelerate the implementation of genomics, like the Intel All in One Day goal, it’s going to be important that clinicians and those who are in training—fellows, residents, medical students—become much more familiar with genomics than they are today.

 

Intel: What’s your vision for the future of genomic research application?


Green: I think five years from now, about 20 percent of the population is going to walk into their doctor’s office with their own genome, and doctors are going to have to decide how they want to use this information. I can envision that about 10 years from now everyone will use their genome as a daily part of their medical care.

Read more >

Big Data Webinar Series: 3. Big Data in Clinical Medicine: Bringing the Benefits of Genome-Aware Medicine to Cancer Patients – Available On-Demand

The third and final webinar in Frost & Sullivan’s series on Big Data in Healthcare took place recently and is now available to view on demand. The webinar is a must-watch for healthcare leaders and features some very practical advice on how to get the most value from the data your organization holds.

 

Greg Caressi, Senior VP of Transformational Health at Frost & Sullivan, opened with an overview of how providers are leveraging big data platforms to drive health, concluding that “the real bottleneck is in turning data into knowledge”.

 

David Delaney, MD, Chief Medical Officer at SAP Healthcare, took the conversation forward by sharing his thoughts on how SAP HANA® in-memory technology simplifies and accelerates data analysis for healthcare providers. Intel and SAP have a long-standing relationship and have worked together on SAP HANA® since 2005.

 

Finally, Kevin Fitzpatrick, CEO of ASCO CancerLinQ, gave a fantastic presentation on shaping the future of cancer care through rapid learning, offering a brilliant insight into really important work for the cancer community.

 

Listen to this webinar now on demand, and don’t forget that if you missed the first two webinars you can also view them today using the links below:

 

  • Watch Webinar 3: Big Data in Clinical Medicine: Bringing the Benefits of Genome-Aware Medicine to Cancer Patients
  • Watch Webinar 2: Predictive Healthcare Analytics in a Big Data World: Use Cases from the Field
  • Watch Webinar 1: Future of Healthcare is Today: Leveraging Big Data

Read more >

Healthcare Security is Increasingly About Survival

In the wild, no prey is completely safe from predators. However, it is clear to prey which members of the herd are weak or vulnerable to predators. They certainly also know which one is being pursued, or has been caught by a predator.

 

In healthcare, no organization is immune from breaches, regardless of how advanced its security is. There is always residual risk, for example from spear phishing. Healthcare organizations, even those that perform risk assessments well and address the deficiencies identified, don’t know how their security stands relative to other healthcare organizations. In a sense, they don’t know whether they are “low-hanging fruit” for predators such as cybercriminals. Exacerbating this, many healthcare organizations lack the intrusion detection capabilities to recognize when they are being actively pursued by hackers. Many cybercrime breaches go on for months or even years before they are detected and stopped, vastly increasing the number of patient records compromised and the business impact of the breach.


 

Intel Health and Life Sciences, together with Intel Security, just released an online Healthcare Security Survey that invites participants from healthcare organizations to securely and confidentially answer a few questions about their breach security safeguards and posture. Based on the input provided, their breach security is scored, and summary recommendations are made on possible next steps to improve. Participants are then offered the opportunity to engage in a subsequent one- to two-hour breach security assessment that confidentially analyzes their healthcare organization’s breach security posture in more detail, identifying any potential gaps and improvements. After the assessment, participants receive a report of their results, together with a comparison of how their organization’s breach security stands against the rest of the healthcare industry. Several of the breach security controls analyzed improve detection, enabling the organization to spot hacking, intrusions, or breaches sooner.

 

The Ponemon Institute’s 2015 Cost of a Data Breach research estimates the total average cost of a data breach at $6.53 million, or $398 per patient record. In fact, we have seen the business impact of recent cybercrime breaches exceed $100 million. With this kind of impact, and the alarming frequency of breaches, the need to address this issue has never been more urgent.

 

If you work for a healthcare organization, join us now in taking the online Healthcare Security Survey challenge and enrolling in our Healthcare Breach Security Assessment pilot program. You will see, confidentially, where your organization’s breach security posture stands relative to the healthcare industry, along with any potential gaps and opportunities to improve through a multi-year, incremental, layered approach that fits within your budget and resource constraints.

 

What questions do you have?

Read more >

Happily Ever After: Windows 10 and Intel Core vPro a Perfect Match for Better Productivity, Security, Manageability

By the time a couple is married 21 years, they’ve had their share of disagreements, unlocked the mysteries of the other, and, happily, come to the realization that they’re better together than not.

 

Such is the double-decade partnership between Intel and Microsoft, which has persevered through tech booms and busts. The “Better Together: Windows 10 and Intel Core vPro Processor-based Devices” webinar offered a glimpse into the future with the Intel Core vPro processor and Microsoft’s newest OS, Windows 10, showing how they work together to raise the bar in enterprise computing, to much excitement from end users, IT and business decision makers, and OEMs.

 

Windows 10 fully supports the Intel vPro pillars of strength (productivity, security, and manageability) with a familiar Windows 7-style user interface and numerous new features. For example, for better productivity, Microsoft host expert Stephen Rose explained how a device used as a PC with a keyboard and mouse can switch to optimal tablet use; Windows 10 responds automatically by adjusting window sizes for touch-based actions and biometrics.


Sixth-generation Intel Core vPro processor-based devices “are the most manageable, most productive, and most secure platform for enterprise,” said Intel technology expert Greg Reiff during the webinar. Intel Core vPro has enabled more streamlined form factors that are 50 percent thinner and 50 percent lighter than devices more than four years old, and that use much less power.


With the newest features in Windows 10, users and IT departments can build more security around their data and devices. Features such as Intel Virtualization Technology prevent unauthorized software from being loaded, and Intel SSD Pro Series Data Protection guards data off-network. These back-end features support the front-end mission to “kill the password,” according to Rose, by “moving away from what you know [passwords] to what you have; things like your face (detected via Intel RealSense and Microsoft Hello), fingerprints, and wearables.”


Webinar attendees were clamoring to know more, asking many questions during the interactive Q&A. Here’s a sample:


Q: Can you add biometric devices to older PCs that run Windows 10?


Steve Forsberg (Intel host expert): You could attach an external RealSense camera if your older hardware does not have an infrared camera integrated.


Q: Are the new Intel Q170 chipset machines shipping now?


Greg Reiff (Intel host expert): Some are shipping but not as enterprise Intel Core vPro platforms [those are scheduled for release soon].


Q: Is the Microsoft Surface Pro 4 tablet available through distribution?


Stephen Rose (Microsoft host expert): Yes. We have a wide variety of resellers including Dell, CDW, and others.


Q: Is the process/recommendation of upgrading the UEFI published somewhere?


Greg Reiff: Upgrading a platform’s BIOS to UEFI is OEM-specific. Each OEM should have an upgrade guide on their support site under drivers > firmware > download. If vPro is enabled, we have best practices documents on www.intel.com.


As with all webinars in the Business Devices Webinar Series, participants were entered into a drawing for an Intel-based tablet or a set of SMS Audio BioSport smart earbuds. Congratulations to tablet winner Kent Liu of Williams-Sonoma and to Andy Yu of American Portwell Technologies for scoring the cool earbuds!


Our next webinar is happening December 9, 10 a.m. PST. Be sure to attend, because it’s all about security: what the key risks are, how to manage them, and ways to prepare with the latest solutions from our top technology experts.


If you’ve already registered for the Business Devices Webinar Series, click on the link in the reminder email you’ll receive a day or two before the event. If you need to register, we’d love to have you join our next session by clicking here.


The “Better Together: Windows 10 and Intel Core vPro Processor-based Devices” webinar can be watched anytime on demand if you missed it. For more on how Windows 10 and the latest Intel technology can help businesses overcome their challenges, read this recent white paper.


It’s exciting to see how ongoing collaboration between Intel and Microsoft continues to advance better, more efficient, and more amazing experiences in the world of enterprise computing.

Read more >

Graphics Driver issue

I recently upgraded from Windows 7 to Windows 10, and now when my grandson tries to play Minecraft he gets an error message that the graphics driver needs to be updated.  I determined that it currently has an Intel driver and I ran the Intel Driver Utilit… Read more >

Multiple Alarms Feature

Hello, I am trying to remotely configure about 150 PCs with multiple AMT alarms.  I can see from this webpage that AMT 8.0 and later supports the Multiple Alarm feature, and all of our machines are 8.1 or newer.  I have successfully created ind… Read more >

Tightening up Intel SCS service account permissions for managing Intel AMT computer objects in Microsoft Active Directory

An enterprise customer wanted to enable Active Directory integration with Intel AMT across their large Intel vPro client estate. However, their security team wanted the permissions granted to the Intel SCS service account on the Organisational Unit (OU) where Intel AMT computer objects are stored (to support Kerberos) to be as restrictive as possible.

 

As defined in the Intel® Setup and Configuration Software User Guide, the permissions for the SCS service account on the OU container are “Create Computer objects”, “Delete Computer objects”, and “List contents” (the last of which appears to be granted by default), plus Full Control on descendant Computer objects. Full Control was not acceptable, so …

 

[Screenshots: SCS_AD_Perms_OU_Create_Delete.jpg, SCS_AD_Perms_OU_List.jpg]

… to support AMT maintenance tasks such as updating the password of the AD object representing the Intel AMT device and ensuring the Kerberos clock remains synchronised, the following explicit permissions are required on all descendant computer objects within the OU.

[Screenshots: SCS_AD_Perms_Descendant_Change_Password.jpg, SCS_AD_Perms_Descendant_Write_All_Properties.jpg]
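For illustration, grants of this shape could be scripted with the Windows `dsacls` utility rather than clicked through the Advanced Security dialog. This is a minimal sketch only: the OU distinguished name and service account name below are hypothetical placeholders, and the permission strings should be verified against the SCS User Guide and your own environment before use.

```shell
# Hypothetical OU and service account; substitute your own values.
# On the OU itself: Create/Delete Computer objects and List contents.
dsacls "OU=AMT Clients,DC=corp,DC=example,DC=com" /G "CORP\svc-intel-scs:CCDC;computer"
dsacls "OU=AMT Clients,DC=corp,DC=example,DC=com" /G "CORP\svc-intel-scs:LC"

# On descendant Computer objects only (/I:S = inherit to sub-objects):
# Reset Password control right and Write All Properties, instead of Full Control.
dsacls "OU=AMT Clients,DC=corp,DC=example,DC=com" /I:S /G "CORP\svc-intel-scs:CA;Reset Password;computer"
dsacls "OU=AMT Clients,DC=corp,DC=example,DC=com" /I:S /G "CORP\svc-intel-scs:WP;;computer"
```

Scripting the grants this way also makes the restricted permission set repeatable across additional OUs or domains.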

The customer’s security team was happier with these permissions, and the customer is now activating their Intel vPro systems to enable the powerful manageability and security capabilities that Intel Active Management Technology, available on Intel vPro platforms, provides.

Read more >