Recent Blog Posts

In a Twist, School Principals Teach the Value of Data Analytics


Elementary and secondary school principals must solve a challenging optimization problem. Faced with a deluge of applicants for teaching positions, demanding teaching environments, and very little time to spend on the applicant review process, school principals need a search algorithm with ranking analytics to help them find the right candidates. This is a classic data science problem.


Elsewhere, I have described the ideal data scientist: a balanced mix of math and statistics sophistication, programming chops, and business savvy. A rare combination, indeed. To solve the teacher applicant ranking problem, does every school in the country need to hire one of these “unicorn” data scientists to create a system to automatically identify the best teacher candidates?


I propose that the answer is “No!” and Big Data startup TeacherMatch agrees with me.  It is not a good use of resources for every school to hire a data scientist to help analyze teacher applications with advanced analytics, natural language processing and machine learning, yet the need to make teacher candidate selection more effective and efficient is huge.  The solution is to leverage the work of an expert who has already done that analysis.


TeacherMatch is such an expert. Based on a huge amount of historical data, TeacherMatch has developed a score for ranking teacher applicants, the EPI score, built on a prediction of how likely a candidate is to succeed in the position the principal is looking to fill. Suddenly, identifying the top handful of candidates out of a list of potentially hundreds of applicants is nearly instantaneous.
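The ranking step itself is simple once a predictive score exists; the hard part is the model that produces the score. A minimal sketch of the shortlisting a principal would see (the field names and scores here are invented for illustration; TeacherMatch's actual EPI model is proprietary):

```python
# Rank teacher applicants by a precomputed predictive score and
# return the top handful -- the part the principal actually reviews.
# The scores themselves would come from the vendor's model.

def top_candidates(applicants, k=5):
    """Return the k applicants with the highest predicted-success score."""
    return sorted(applicants, key=lambda a: a["epi_score"], reverse=True)[:k]

pool = [
    {"name": "Applicant A", "epi_score": 71.2},
    {"name": "Applicant B", "epi_score": 88.5},
    {"name": "Applicant C", "epi_score": 79.9},
]
shortlist = top_candidates(pool, k=2)  # Applicant B, then Applicant C
```

The value of analytics-as-a-service is exactly this division of labor: the school consumes a simple ranked list, while the data science stays with the vendor.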



I met Don Fraynd, CEO of TeacherMatch, last year when he joined a data science panel that I hosted. I was impressed with his deep understanding of the challenges of hiring good teachers and with his practical approach to analytics. He has created a big data analytics solution that any organization that needs to hire teachers can sensibly incorporate. See for yourself in this very interesting video about TeacherMatch.


Looking more broadly at the needs of all industry for more powerful analytics, the shortage of data scientists available to hire is a challenge. TeacherMatch’s model represents a real solution. In fact, I suspect that analytics-as-a-service will help drive a new era of advanced data analytics because it allows business users of analytics to leverage the output of a small number of data scientists who solve common problems across different organizations. In this regard, TeacherMatch represents the future of analytics.


Through the example of TeacherMatch, it appears that our principals and teachers are taking us to school on analytics.

Read more >

Cyber Threats are a Danger to Corporate Growth

Executives are beginning to understand that cyber-based threats are a potentially significant impediment to business success. The challenges extend far beyond the annoyance of rising security budgets. The recent PwC survey showed 61% of CEOs believe cyber threats pose a … Read more >

The post Cyber Threats are a Danger to Corporate Growth appeared first on Intel Software and Services.

Read more >

A 2016 Prediction: Prescriptive Analytics Will Take Flight


In May 2015, I wrote the first in a series of blog posts exploring the journey to software-defined infrastructure. Each blog in the series dives down into different stages of a maturity model that leads from where we are today in the typical enterprise data center to where we will be tomorrow and in the years beyond.  During that time, I also delved into the workloads that will run on the SDI platform.


As I noted in last month’s post, traditional analytics leads first to reactive analytics, then to predictive analytics, with prescriptive analytics as the ultimate destination. That final state is the nirvana of today’s big data analytics, and it is the topic we will take up today.


Prescriptive analytics extends beyond the predictive stage by defining the actions necessary to achieve outcomes and the interrelationship of those outcomes with the effects of each decision. It incorporates both structured and unstructured data and uses a combination of advanced analytics techniques and other scientific disciplines to help organizations predict, prescribe, and adapt to changes that occur. Essentially, we have moved from “why did this happen?” to “what will happen?” and we are now moving to “how do we make this happen?” as an analytics methodology.


Prescriptive analytics allows an organization to extract even more value and insight from big data than we are getting today. This highest level of analytics brings together varied data sources in real time and adjusts data and decisions on behalf of an organization. Prescriptive analytics is inherently real-time: it continually triggers these adjustments as new information arrives.



Let’s take a few simple examples to make this story more tangible.


  • In the oil and gas industry, it can be used to enable natural gas price prediction and identify decision options—such as term locks and hedges against downside risk—based on an analysis of variables like supply, demand, weather, pipeline transmission, and gas production. It might also help decide when and where to harvest the energy, perhaps even spinning up and shutting down sources based on a variety of environmental and market conditions.


  • In healthcare, it can increase the effectiveness of clinical care for providers and enhance patient satisfaction based on various factors across stakeholders as a function of healthcare business process changes.  It could predict patient outcomes and help alleviate issues before they would normally even be recognized by medical professionals.


  • In the travel industry, it can be used to sort through factors like demand curves and purchase timings to set seat prices that will optimize profits without deterring sales.  Weather and market conditions could better shape pricing and fill unused seats and rooms while relieving pressure in peak seasons.


  • In the shipping industry, it can be used to analyze data streams from diverse sources to enable better routing decisions without the involvement of people. In practice, this could be as simple as a system that automatically reroutes packages from air to ground shipment when weather data indicates that severe storms are likely to close airports on the usual air route.
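The shipping example above reduces to a simple prescriptive rule: when forecast data suggests a likely airport closure on a package's air route, switch it to ground shipment automatically. A sketch of that rule, with the threshold and field names invented for illustration:

```python
# Illustrative prescriptive routing rule: reroute packages from air to
# ground when weather data indicates the hub airport is likely to close.

CLOSURE_THRESHOLD = 0.6  # closure probability above which we avoid air

def choose_mode(package, forecast):
    """Return 'air' or 'ground' for a package given closure probabilities
    per airport (a mapping of airport code -> probability)."""
    risk = forecast.get(package["air_hub"], 0.0)
    return "ground" if risk >= CLOSURE_THRESHOLD else "air"

forecast = {"ORD": 0.8, "DFW": 0.1}  # severe storm likely at ORD
pkg = {"id": "PKG-1", "air_hub": "ORD"}
mode = choose_mode(pkg, forecast)  # -> "ground"
```

A production system would of course weigh delivery deadlines, cost, and capacity as well, but the decision structure is the same: data in, action out, no human in the loop.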


I could go on and on with the examples, because every industry can capitalize on prescriptive analytics. The big takeaway here is that prescriptive analytics has the potential to turn enormous amounts of data into enormous business value—and do it all in real time.


With the impending rise of prescriptive analytics, we are entering the era in which machine learning, coupled with automation and advanced analytics, will allow computers to capture new insights from massive amounts of data in diverse datasets and use that data to make informed decisions on our behalf on an ongoing basis.


At Intel, we are quite excited about the potential of prescriptive analytics. That’s one of the reasons why we are a big backer of the open source Trusted Analytics Platform (TAP) initiative, which is designed to accelerate the creation of cloud-native applications driven by big data analytics. TAP is an extensible open source platform designed to allow data scientists and application developers to deploy solutions without having to worry about infrastructure procurement or platform setup, making analytics and the adoption of machine learning easier. To learn about TAP, visit

Read more >

Blueprint: Tips for Avoiding a Data Center Blizzard

This article originally appeared on Converge Digest



We’re in the depths of winter and, yes, the snow can be delightful… until you have to move your car or walk a half block on icy streets. Inside the datacenter, the IT Wonderland may lack snowflakes, but everyday activities are even more challenging, and they last year round. Instead of snowdrifts and ice, tech teams face mountains of data.



So what are the datacenter equivalents of snowplows, shovels, and hard physical labor? The right management tools and strategies are essential for clearing data paths and allowing information to move freely and without disruption.


This winter, Intel gives a shout-out to the unsung datacenter heroes, and offers some advice about how to effectively avoid being buried under an avalanche of data. The latest tools and datacenter management methodologies can help technology teams overcome the hazardous conditions that might otherwise freeze up business processes.


Tip #1: Take Inventory


Just as the winter holiday season puts a strain on family budgets, the current economic conditions continue to put budget pressures on the datacenter. Expectations, however, remain high. Management expects to see costs go down while users want service improvements. IT and datacenter managers are being asked to do more with less.


The budget pressures make it important to fully assess and utilize the in-place datacenter management resources. IT can start with the foundational server and PDU hardware in the datacenter. Modern equipment vendors build in features that facilitate very cost-effective monitoring and management. For example, servers can be polled to gather real-time temperature and power consumption readings.


Middleware solutions are available to take care of collecting, aggregating, displaying, and logging this information, and when combined with a management dashboard can give datacenter managers insights into the energy and temperature patterns under various workloads.


Since the energy and temperature data is already available at the hardware level, introducing the right tools to leverage it is a practical step that can pay for itself in energy savings and in the ability to spot problems such as temperature spikes, so that proactive steps can be taken before equipment is damaged or services are interrupted.
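The monitoring loop described above is conceptually simple: poll per-server readings, compare against a threshold, and alert. A minimal sketch, where the readings stand in for whatever polling interface the hardware or middleware exposes (IPMI, Redfish, a vendor API), and the threshold is illustrative:

```python
# Sketch of temperature-spike detection over polled server readings.
# In practice the readings dict would be populated by the middleware
# layer that collects and aggregates hardware telemetry.

TEMP_LIMIT_C = 35.0  # inlet-temperature alert threshold; site-specific

def find_hot_servers(readings, limit=TEMP_LIMIT_C):
    """Return names of servers whose reported temperature exceeds the limit."""
    return [name for name, temp in readings.items() if temp > limit]

readings = {"rack1-srv01": 27.5, "rack1-srv02": 41.0, "rack2-srv01": 33.2}
alerts = find_hot_servers(readings)  # -> ["rack1-srv02"]
```

Feeding such alerts into a dashboard is what turns raw hardware telemetry into the proactive response the tip describes.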


Tip #2: Replace Worn-Out Equipment


While a snow shovel can last for years, datacenter resources are continually being enhanced, changed, and updated. IT needs tools that let it keep up with requests and deploy and configure software at a rapid pace.


Virtualization and cloud architectures, which evolved in response to the highly dynamic nature of the datacenter, have recently been applied to some of the most vital datacenter management tools. Traditional hardware keyboard, video, and mouse (KVM) solutions for remotely troubleshooting and supporting desktop systems are being replaced with all-software and virtualized KVM platforms. This means that datacenter managers can quickly resolve update issues and easily monitor software status across a large, dynamic infrastructure without having to continually manage and update KVM hardware.


Tip #3: Plan Ahead


It might not snow every day, even in Alaska or Antarctica. In the datacenter, however, data grows every day. A study by IDC, in fact, found that data is expected to double in size every two years, culminating in 44 zettabytes by 2020. An effective datacenter plan depends on accurate projections of data growth and the required server expansion for supporting that growth.
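The capacity-planning arithmetic behind "doubling every two years" is simple compound growth, and putting it in a function makes the projections concrete. The starting size and horizon below are illustrative, not IDC's figures:

```python
# Project data volume forward under a fixed doubling period, so that
# server and storage expansion can be budgeted ahead of demand.

def projected_size(current, years, doubling_period=2.0):
    """Size after `years`, assuming the volume doubles every
    `doubling_period` years."""
    return current * 2 ** (years / doubling_period)

# A 100 TB estate, doubling every two years, after six years:
future = projected_size(100.0, 6)  # -> 800.0 (TB)
```

Three doublings in six years means eight times the data: the kind of number that makes energy and floor-space planning mission critical, as the rest of the tip argues.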


The same tools that were previously mentioned for monitoring and analyzing energy and temperature patterns in the datacenter can help IT and datacenter architects better understand workload trends. Besides providing insights about growth trends, the tools promote a holistic approach for lowering the overall power budget for the datacenter and enable datacenter teams to operate within defined energy budget limits. Since many large datacenters already operate near the limits of the local utility companies, energy management has become mission critical for any fast-growing datacenter.


Tip #4: Stay Cool


Holiday shopping can be a budget buster, and the credit card bills can be quite a shock in January. In the datacenter, rising energy costs and green initiatives similarly strain energy budgets. Demand peaks in both summer and the depths of winter, and big seasonal storms can mean short-term outages that force operations over to a disaster recovery site.


With the right energy management tools, datacenter and facilities teams can come together to maximize the overall energy efficiency of the datacenter and its environmental-control systems (humidity control, cooling, etc.). For example, holistic energy management solutions can identify ghost servers: systems that are idle and yet still consuming power. Hot spots can be located and workloads shifted so that less cooling is required and equipment life is extended. The average datacenter saves between 15 and 20 percent on overall energy costs with the introduction of an energy management solution.
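Ghost-server detection is a straightforward filter over the same telemetry discussed earlier: a server that draws meaningful power while doing essentially no work is a candidate for consolidation or shutdown. A sketch, with thresholds and field names invented for illustration:

```python
# Flag "ghost" servers: idle machines that are still burning power.
# Real tools would use utilization histories, not a single sample.

IDLE_CPU_PCT = 5.0      # below this utilization we call the server idle
MIN_DRAW_WATTS = 100.0  # and it still draws at least this much power

def find_ghost_servers(servers):
    """Return names of servers that are idle yet still consuming power."""
    return [
        s["name"]
        for s in servers
        if s["cpu_pct"] < IDLE_CPU_PCT and s["watts"] >= MIN_DRAW_WATTS
    ]

fleet = [
    {"name": "app-01", "cpu_pct": 62.0, "watts": 310.0},
    {"name": "app-02", "cpu_pct": 1.5,  "watts": 180.0},  # ghost
    {"name": "app-03", "cpu_pct": 3.0,  "watts": 40.0},   # idle, low draw
]
ghosts = find_ghost_servers(fleet)  # -> ["app-02"]
```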


Tip #5: Read the Signs of the Times


During a blizzard, the local authorities direct the snowplows, police, and rescue teams to keep everyone safe. Signs and flashing lights remind everyone of the rules. In the datacenter, the walls may not be plastered with the rules, but government regulations and compliance guidelines are woven into the vital day-to-day business processes.


Based on historical trends, regulations will continue to increase, and datacenter managers should not expect any decrease in required compliance-related effort. Public awareness of energy resources, and of the environmental impact of energy exploration and production, also encourages regulators to act.


Fortunately, the energy management tools and approaches that help improve efficiencies and lower costs also enable overall visibility and historical logging that supports audits and other compliance-related activities.

When “politically correct” behavior and cost savings go hand in hand, momentum builds quickly. This effect is both driving demand for and promoting great advances in energy management technology, which bodes well for datacenter managers since positive results always depend on having the right tools. And when it comes to IT Wonderlands, energy management can be the equivalent of the whole toolshed.

Read more >

Software Will Drive “Sensification” of Computing ie Everything

The phrase “sensification” has become an earworm of sorts. I can’t shake it. Intel CEO Brian Krzanich discussed it at CES, where he explained that computing will move to things that sense their environment, are connected, and become an extension of the user. He … Read more >

The post Software Will Drive “Sensification” of Computing ie Everything appeared first on Intel Software and Services.

Read more >

Embracing A More Connected Vision Of Healthcare

One of the most promising areas of innovation and transformation in healthcare today is the move to distributed care, achieved through the creation of patient-centered networks of intelligent, connected devices that span the home, workplace, community, and the mobile spaces in between. Data capture and analysis, and communication between the patient and their care team, can all be enhanced and harnessed to deliver more effective healthcare to more people at lower cost.


Connected Care, Everywhere

In the home, this will be driven by new types of consumer medical devices and smart-home connectivity and features. In the workplace and the community, new mobile devices and services including kiosks will be available. And for persistent real-time data and connectivity, new purpose-built and general purpose devices will fill in critical gaps.


Community Care Impact

In the home, sensors are transforming the way we care for the elderly, helping them stay more independent and spend longer at home, thus improving general well-being and reducing costs to the provider. Mimocare’s sensor solution is a great example of just how the Internet of Things can help us move the focus towards prevention rather than cure.


For community nurses, this kind of distributed care is a win-win: they are alerted remotely to patients showing abnormal signs earlier, enabling speedier intervention and more rapid delivery of appropriate care, while also reducing the need for unnecessary monitoring visits.


Patient-Centered Connectivity

I’d highly recommend reading a recent blog by GPC on the use of the Intel® RealSense™ 3D Camera, which can help clinicians in a hospital setting make better-informed decisions in wound care management. It’s an exciting development, as wound care management accounts for high spend at most care providers; in the UK, for example, the NHS spends some £3 billion per year in this area.


RealSense™ is available across a range of mobile devices today so I see a future where patients are able to play a greater role in their wound care management in the home setting by recording the healing progress of wounds using the 3D camera and sharing the results with clinicians. This is undoubtedly more convenient for patients and more efficient for clinicians and providers.


Balancing the Demands of Modern Healthcare

These patient-centered networks of intelligent, connected devices generate significant volumes of data, which healthcare providers can analyze to help balance the demands of an ageing population with increased pressure on costs. Underlying this shift to distributed care is patients’ preference to stay, and be clinically managed, at home. The tools are available today, so let’s embrace a more connected vision of healthcare where we deliver even better care to patients.


Read more >

RealSense and Design Thinking Help Consumers More Easily Buy Shoes that Fit

The innovation team within the organization I manage, along with others in Intel, have been working to improve the retail shopping experience for consumers, and their work recently made headlines. Our project with Nordstrom garnered attention from almost 100 reporters, with more than 20 articles … Read more >

The post RealSense and Design Thinking Help Consumers More Easily Buy Shoes that Fit appeared first on Intel Software and Services.

Read more >

How End-To-End Network Transformation Fuels the Digital Service Economy

To see the challenge facing the network infrastructure industry, I have to look no farther than the Apple Watch I wear on my wrist.


That new device is a symbol of the change that is challenging the telecommunications industry. This wearable technology is an example of the leading edge of the next phase of the digital service economy, where information technology becomes the basis of innovation, services and new business models.


I had the opportunity to share a view on the end-to-end network transformation needed to support the digital service economy recently with an audience of communications and cloud service providers during my keynote speech at the Big Telecom Event.


These service providers are seeking to transform their network infrastructure to meet customer demand for information that can help grow their businesses, enhance productivity and enrich their day-to-day lives.  Compelling new services are being innovated at cloud pace, and the underlying network infrastructure must be agile, scalable, and dynamic to support these new services.


The operator’s challenge is that the current network architecture is anchored in purpose-built, fixed-function equipment that cannot be used for anything other than the function for which it was originally designed. The dynamic nature of the telecommunications industry means that the infrastructure must be more responsive to changing market needs. Operators must keep building out network capacity to meet customer requirements in a way that is more flexible and cost-effective, and that need is driving the commitment by service providers and the industry to transform these networks to a different architectural paradigm, one anchored in innovation from the data center industry.


Network operators have worked with Intel to find ways to leverage server, cloud, and virtualization technologies to build networks that give consumers and business users a great experience while costing less to deploy and operate.


Transformation starts with reimagining the network


This transformation starts with reimagining what the network can do and how it can be redesigned for new devices and applications, even including those that have not yet been invented. Intel is working with the industry to reimagine the network using Network Functions Virtualization (NFV) and Software Defined Networking (SDN).


For example, the evolution of the wireless access network from macro base stations to a heterogeneous network, or “HetNet,” using a mix of macro-cell and small-cell base stations, together with the addition of mobile edge computing (MEC), will dramatically improve network efficiency by providing more efficient use of spectrum and new radio-aware service capabilities. This transformation will intelligently couple mobile devices to the access network for greater innovation, improved ability to scale capacity, and better coverage.


In wireline access, virtual customer premises equipment moves service provisioning intelligence from the home or business to the provider edge to accelerate delivery of new services and to optimize operating expenses. And NFV and SDN are also being deployed in the wireless core and in cloud and enterprise data center networks.


This network transformation also makes possible new Internet of Things (IoT) services and revenue streams. As virtualized compute capabilities are added to every network node, operators have the opportunity to add sensing points throughout the network and tiered analytics to dynamically meet the needs of any IoT application.


One example of IoT innovation is safety cameras in “smart city” applications. With IoT, cities can deploy surveillance video cameras to collect video and process it at the edge to detect patterns that would indicate a security issue. When an issue occurs, the edge node can signal the camera to switch to high-resolution mode, flag an alert, and divert the video stream to a central command center in the cloud. The result is improved safety-personnel efficiency and citizen safety, all enabled by an efficient underlying network infrastructure.
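The camera flow just described can be sketched as simple edge-node logic: analyze low-resolution video locally, and on detecting a likely issue, switch the camera to high resolution, raise an alert, and divert the stream. The function and field names below are hypothetical, purely to show the control flow:

```python
# Sketch of the edge node's per-frame decision for a smart-city camera.
# Detection itself (the video analytics) is assumed to happen upstream
# and is represented here by the boolean `issue_detected`.

def handle_frame(camera, issue_detected):
    """Return the mode and actions the edge node takes for one frame."""
    if not issue_detected:
        return {"camera": camera, "mode": "low-res", "actions": []}
    return {
        "camera": camera,
        "mode": "high-res",
        "actions": ["raise_alert", "divert_stream_to_command_center"],
    }

normal = handle_frame("cam-17", issue_detected=False)
incident = handle_frame("cam-17", issue_detected=True)
```

The point of the architecture is that the cheap decision runs at the edge, and only the expensive high-resolution stream crosses the network when it matters.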


NFV and SDN deployment has begun in earnest, but broad-scale deployment will require even more innovation: standardized, commercial-grade solutions must be available; next-generation networks must be architected; and business processes must be transformed to consume this new paradigm. Intel is investing now to lead this transformation and is driving a four-pronged strategy anchored in technology leadership: support of industry consortia, delivery of open reference designs, collaboration on trials and deployments, and building an industry ecosystem.


The foundation of this strategy is Intel’s role as a technology innovator. Intel’s continued investment and development in manufacturing leadership, processor architecture, Ethernet controllers and switches, and optimized open source software provide a foundation for our network transformation strategy.


Open standards are critical to robust solutions, and Intel is engaged with all of the key industry consortia, including the European Telecommunications Standards Institute (ETSI), Open vSwitch, OpenDaylight, OpenStack, and others. Most recently, we dedicated significant engineering and lab investments to the Open Platform for NFV (OPNFV) project’s release of OPNFV Arno, the first carrier-grade, open source NFV platform.


The next step for these open source solutions is to be integrated with operating systems and other software into open reference software that provides an on-ramp for developers into NFV and SDN. That’s what Intel is doing with our Open Network Platform (ONP): a reference architecture that enables software developers to lower their development costs and shorten their time to market. The innovations in ONP form the basis of many of our contributions back to the open source community. In the future, ONP will be based on OPNFV releases, enhanced by additional optimizations and proofs-of-concept in which we continue to invest.


We also are working to bring real-world solutions to market and are active in collaborating on trials and deployments and deeply investing in building an ecosystem that brings companies together to create interoperable solutions.


As just one example, my team is working with Cisco Systems on a service chaining proof of concept that demonstrates how Intel Ethernet 40GbE and 100GbE controllers, working with a Cisco UCS network, can provide service chaining using network service header (NSH). This is one of dozens of PoCs that Intel has participated in this year alone, which collectively demonstrate the early momentum of NFV and SDN and their potential to transform service delivery.


A lot of our involvement in PoCs and trials comes from working with our ecosystem partners in the Intel Network Builders. I was very pleased to have had the opportunity to share the stage with Martin Bäckström and announce that Ericsson has joined Network Builders. Ericsson is an industry leader and innovator, and their presence in Network Builders demonstrates a commitment to a shared vision of end-to-end network transformation.


The companies in this ecosystem are passionate software and hardware vendors, and also end users, that work together to develop new solutions. There are more than 150 Network Builder members taking advantage of this program and driving forward with a shared vision to accelerate the availability of commercial grade solutions.


NFV and SDN are deploying now – but that is just the start of the end-to-end network transformation. There is still a great deal of technology and business innovation required to drive NFV and SDN to scale, and Intel will continue its commitment to drive this transformation.

I invited the BTE audience – and I invite you – to join us in this collaboration to create tomorrow’s user experiences and to lay the foundation for the next phase of the digital services economy.

Read more >

Save Lives, Prevent Equipment Failures, and Gain Insights at the Edge

Internet of Things (IoT) technologies from Intel and SAP enable innovative solutions far from the data center


Can a supervisor on an oil rig know immediately when critical equipment fails? Can a retail store manager provide customers with an up-to-date, customized experience without waiting for back-end analysis from the parent company’s data center? A few years ago, the answer would have been a clear “no.” But today, real-time, actionable data at the edge is a reality.


Innovative technologies from Intel and SAP can enable automated responses and provide critical insights at remote locations. These unique joint solutions enable companies to dramatically improve worker safety, equipment reliability, and customer engagement, all without an infrastructure overhaul. For example, technicians on a remote, deep-sea oil rig can be equipped with sensors that detect each technician’s location, heart rate, and exposure to harmful gases. Additionally, sensors powered by Intel Quark SoCs can be placed on equipment throughout the oil rig to monitor for leaks or fires. The collective data from these sensors is fed to an Intel IoT Gateway and processed to provide data visualization and a browser interface that is easily accessible from any device.


From any location on the rig with Wi-Fi access, supervisors can monitor worker health and safety data from an app running on a tablet device. In addition, automated alerts and alarms can signal when an employee is in danger or a critical malfunction has occurred. All of this processing can happen in real time, on-site, without depending on a reliable wide-area network (WAN) connection to a back-end server that might be hundreds or thousands of miles away.
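The alerting logic behind such a gateway can be as simple as threshold checks over the latest wearable readings. A sketch, with limits and field names invented for illustration (real deployments would use medically and regulatorily grounded limits):

```python
# Flag workers whose vitals or gas exposure exceed safe limits, using
# the per-worker readings aggregated at the edge gateway.

MAX_HEART_RATE = 150  # beats per minute (illustrative)
MAX_GAS_PPM = 35.0    # e.g. a CO exposure ceiling (illustrative)

def workers_at_risk(readings):
    """Return IDs of workers outside the configured safe limits."""
    return [
        r["worker_id"]
        for r in readings
        if r["heart_rate"] > MAX_HEART_RATE or r["gas_ppm"] > MAX_GAS_PPM
    ]

readings = [
    {"worker_id": "W-101", "heart_rate": 92,  "gas_ppm": 4.0},
    {"worker_id": "W-102", "heart_rate": 162, "gas_ppm": 2.0},   # elevated HR
    {"worker_id": "W-103", "heart_rate": 88,  "gas_ppm": 48.0},  # gas exposure
]
alerts = workers_at_risk(readings)  # -> ["W-102", "W-103"]
```

Because this runs on the gateway itself, the alarm fires even when the WAN link to the back-end data center is down, which is the whole point of processing at the edge.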


When the WAN connection is available, the SAP Remote Data Sync service synchronizes data with SAP HANA running in the cloud or in the data center. This synchronization provides cloud-based reporting and back-end SAP integration for long-range analysis.




With IoT sensors, Intel IoT Gateway, and SAP software, businesses can improve safety and gain real-time insights right at the edge.


To learn more about the joint Intel and SAP solution at the edge, read the solution brief Business Intelligence at the Edge.

Read more >

Intel IoT Platform Paving the Road to the Car of the Future

Telematics offerings expand daily, both in number and capabilities. Driven by consumer demand for vehicles that extend the connected lifestyle, provide enhanced safety, and reduce environmental impact, the market for in-vehicle telematics is surging. The automotive industry is now the … Read more >

The post Intel IoT Platform Paving the Road to the Car of the Future appeared first on IoT@Intel.

Read more >