Recent Blog Posts

Using 2 in 1s for Disruptive Innovation at Front Porch

In a time of rapid change, innovation is crucial for any enterprise. But I haven’t seen many organizations approach innovation as thoughtfully and systematically as Front Porch. This California-based nonprofit supports a family of companies offering assisted living, skilled nursing, retirement, and other communities across four states.

 

Front Porch has a Center for Innovation and Wellbeing as well as a commitment to disruptive, cause-based innovation called Humanly Possible℠. “We want everyone at every part of our organization to focus on what’s possible and what’s next—to look at how we can do what we do better, to bring new value to people we serve,” says Kari Olson, chief innovation and technology officer for Front Porch and president of its innovation center.


Olson and other Front Porch leaders were quick to see value in flexible 2 in 1 devices based on Intel® technologies and Windows.

 

“Two-thirds of our workforce are out and about, not sitting at a desk,” Olson says. “If we can give them portable devices that let them do their computing in a secure, reliable way, when and where they need to, we can have a big impact—both on their productivity and on our ability to meet the needs of the people we serve. If we can do that and stay consistent with our enterprise applications and tools—that’s huge.”



Front Porch staff saved time and increased patient engagement by using their 2 in 1 devices in members’ residential rooms, care centers, activity rooms, team meetings, and other settings.


But could 2 in 1 devices help deliver transformative value? And how would Front Porch’s people-focused helping professionals—who often have an “I’ll use it if I have to” attitude toward technology—feel about the new devices?

 

Intel just completed a case study that answers these questions. In it, Front Porch leaders describe surprises they encountered as employees ranging from nurses to activities coordinators began using 2 in 1s. Front Porch shares best practices for mobile technology adoption, and highlights the benefits they’re seeing for patient engagement, organizational efficiency, quality of care, and more.

 

I found their results fascinating. They’re relevant not just for healthcare, but for any organization that wants to empower a mobile workforce.


Read the case study and let me know your thoughts. Where might enterprise-capable 2 in 1s add value in your organization? Post a comment, or join and participate in the Intel Health and Life Sciences Community.

 

Learn more about Intel® Health & Life Sciences.

 

Read more about Front Porch and the Front Porch Center for Innovation and Wellbeing.

 

Stay in touch: @IntelHealth, @hankinjoan

Read more >

Intel Rack Scale Architecture 1.0 Ready for Developers

By Jay Kyathsandra, Intel



The first Intel Rack Scale Architecture Developer Summit kicks off a busy week at Intel Developer Forum 2015

 

Over the past months, as Intel has been preparing to roll out Intel Rack Scale Architecture, industry partners, software ISVs, and developers have not been waiting idly on the sidelines. With high interest from cloud service providers, telcos, and enterprises focused on next-generation software-defined infrastructure and big data implementations, Intel Rack Scale Architecture has been developing alongside a vibrant and growing ecosystem of supporting standards bodies, OEMs, and ISVs.

 

Why all the industry support for Intel Rack Scale Architecture? By defining a logical architecture that disaggregates and pools compute, storage, and network resources, rack scale architecture can greatly simplify the management of these resources. Even better, the rack scale approach enables data center operators to dynamically compose resources based on workload-specific demands, enabling user-defined performance, higher utilization, and interoperability—an essential capability for cloud deployments. And now Intel Rack Scale Architecture is ready for developers.
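To make the idea of composing resources from disaggregated pools concrete, here is a minimal toy sketch. It is my own illustration, not Intel Rack Scale Architecture code: all names (`ResourcePool`, `compose_node`, the demand dictionary) are hypothetical, and the real architecture exposes these operations through management APIs rather than in-process objects.

```python
# Toy model of rack-scale resource composition: compute, storage, and network
# capacity live in shared pools, and logical nodes are carved out on demand.
# Illustrative only -- not an Intel Rack Scale Architecture API.

class ResourcePool:
    """Tracks the free units of one disaggregated resource type in a rack."""
    def __init__(self, kind, capacity):
        self.kind = kind
        self.free = capacity

    def allocate(self, amount):
        if amount > self.free:
            raise RuntimeError(f"pool '{self.kind}' exhausted")
        self.free -= amount
        return amount

def compose_node(pools, demand):
    """Compose a logical node by drawing on pooled resources per workload demand."""
    return {kind: pools[kind].allocate(amount) for kind, amount in demand.items()}

pools = {
    "cpu_cores":  ResourcePool("cpu_cores", 256),
    "storage_tb": ResourcePool("storage_tb", 100),
    "nic_gbps":   ResourcePool("nic_gbps", 400),
}

# A compute-heavy node for one workload; the same pools could later yield a
# storage-heavy node for another -- that is the utilization win.
node = compose_node(pools, {"cpu_cores": 32, "storage_tb": 10, "nic_gbps": 40})
print(node)
print(pools["cpu_cores"].free)  # capacity left for the next workload
```

The point of the sketch is the second `print`: after composition, the leftover capacity stays in the shared pool instead of being stranded inside a fixed server.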

 

The architecture was officially presented to a very receptive audience at today’s Intel Rack Scale Architecture Developer Summit in San Francisco. Kicking off the event was Ryan Parker, Intel General Manager, who shared insights into next-generation cloud infrastructure trends. The agenda also featured OEM partners announcing upcoming products, technical experts presenting implementation and deployment scenarios, and early customer testimonials. A popular topic, data center security and trends, led by Intel VP Curt Aubley, wrapped up the presentations.

 

For those attending IDF, there will be several demos featuring Intel Rack Scale Architecture on the show floor in the Data Center and Software Defined Infrastructure Community. Look for more information to become available for solution developers in the coming weeks. Detailed specifications and APIs will soon be available for download on Intel.com/IntelRackScaleArchitecture.

 

If you would like to learn more about Intel Rack Scale Architecture and how it will re-architect the data center of today, check out this video.

 

Read more >

Intel & OHSU Announce Collaborative Cancer Cloud at Intel Developer Forum

Each year millions of people all over the world, including more than 1 million patients in the United States, receive a cancer diagnosis. Instead of going through painful chemotherapy that can kill healthy cells along with cancerous cells, what if those patients could be treated as individuals based on their specific genome sequencing, with a precision treatment plan tailored specifically for their disease? And what if it could happen within 24 hours?

 

Today, I announced at the Intel Developer Forum that we are setting our sights on making this scenario a reality through an ambitious, open Platform-as-a-Service solution called the Collaborative Cancer Cloud.

 

The Collaborative Cancer Cloud is a precision medicine analytics platform that allows institutions to securely share patient genomic, imaging and clinical data for potentially lifesaving discoveries. It will enable large amounts of data from sites all around the world to be analyzed in a distributed way, while preserving the privacy and security of that patient data at each site.
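To make the distributed-analysis idea concrete, here is a minimal sketch of the federated pattern: each site computes only a summary statistic locally, and only those aggregates leave the site. This is my own toy illustration of the general technique, not the Collaborative Cancer Cloud’s actual code, data model, or security protocol; the function names and record format are hypothetical.

```python
# Toy federated analysis: raw patient records stay at each site; only
# aggregate counts are shared with the coordinator. Illustrative only.

def local_variant_counts(records, variant):
    """Run at each site: count carriers of a variant. Records never leave."""
    carriers = sum(1 for r in records if variant in r["variants"])
    return {"n": len(records), "carriers": carriers}

def federated_frequency(site_summaries):
    """Run at the coordinator: combine per-site aggregates into one estimate."""
    total_n = sum(s["n"] for s in site_summaries)
    total_carriers = sum(s["carriers"] for s in site_summaries)
    return total_carriers / total_n

# Two hypothetical sites with their own private cohorts.
site_a = [{"variants": {"BRCA1"}}, {"variants": set()}]
site_b = [{"variants": {"BRCA1"}}, {"variants": {"TP53"}}, {"variants": set()}]

summaries = [local_variant_counts(site, "BRCA1") for site in (site_a, site_b)]
print(federated_frequency(summaries))  # 0.4 -- 2 carriers among 5 patients
```

A real deployment would add authentication, encryption, and protections against aggregates leaking individual information, but the shape is the same: computation moves to the data, and only derived results move between institutions.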

 

The end goal is to empower researchers and doctors to help patients receive a diagnosis based on their genome and potentially arm clinicians with the data needed for a targeted treatment plan. By 2020, we envision this happening in 24 hours — All in One Day. The focus is to help cancer centers worldwide—and eventually centers for other diseases—securely share their private clinical and research data with one another to generate larger datasets to benefit research and inform the specific treatment of their individual patients.

 

The Rise of Precision Medicine

Precision medicine – taking into account individual differences in people’s genes, environments, and lifestyles – is one of the biggest of the big data problems, and it puts medicine on the cusp of a remarkable transformation. We view genomics as the first wave of precision medicine, and we’re working with our partners to drive adoption of genomic sequencers, genomic appliances, and cloud-based genomic analytics. With the Collaborative Cancer Cloud, we are combining next-generation Intel technologies and bio-science advancements to enable solutions that make it easier, faster, and more affordable for developers, researchers, and clinicians to understand any disease that has a genetic component, starting with cancer.

 

Initially, Intel and the Knight Cancer Institute at Oregon Health & Science University (OHSU) will launch the Collaborative Cancer Cloud. We expect two new institutions to be on board by 2016, addressing the critical need for larger patient pools and practitioner awareness. From there, we can open up this federated, secure Collaborative Cancer Cloud network to dozens of other institutions, or let them create their own, to accelerate the science and the precision treatment options for clinicians to share with their patients. They can also apply it to advance personalized research in other diseases that are known to have a genetic component, including Alzheimer’s, diabetes, autism, and more.

 

In the same timeframe, we also intend to deliver open source code contributions to ensure the broadest developer base possible is working on delivering interoperable solutions. Open sourcing this code will drive both interoperability across different clouds, and allow analytics across a broader set of data – resulting in better insights for personalized care.

 

A Complementary Effort

You may be asking, “Haven’t we seen efforts like this before?” There have been numerous multi-institution partnerships formed to apply big data analytics to cancer and its treatment. Our focus on the federation and distribution of private datasets is complementary to the exciting work that’s happening to make public datasets more accessible to research. In the Collaborative Cancer Cloud, each partner will maintain control of its patients’ data while the shareable cancer treatment knowledgebase grows in availability and in impact. We want to help harness the power of that data in a way that benefits clinicians, researchers, and patients while preserving security and privacy. By securely sharing clinical and research data among many institutions while maintaining patient privacy, the entire research community can benefit from insights revealed in large data cohorts.

 

In the end, precision medicine will only be as precise as available data allows. To better understand complex diseases like cancer, the medical and technology industries must collaborate to make the growing wealth of insights resulting from secure analysis of public and private genetic datasets accessible for the patient’s benefit. And if we do, we can turn an agonizing and uncertain process for the patient into a personalized process that occurs all in one day.

 

We encourage you to view the links below to learn more about our work with OHSU:

OHSU’s Exacloud

Collaborative Analytics for Personalized Cancer Care

 

Learn more about precision medicine and genomic code research at these resources:

www.intel.com/healthcare/optimizecode

https://www.whitehouse.gov/precision-medicine

Read more >

Data Center at IDF 2015

This week is going to be a great week for all things data center and Intel at IDF2015. Two words: industry innovation. Technology innovation across the data center is moving at an unprecedented pace, and Intel is working with the broader data center community to drive new compute models from the cloud to next-gen data analytics and the network that connects us all. It’s a very exciting time for me personally, as IDF2015 offers a unique look at the cutting edge while providing an amazing networking opportunity to catch up with leaders from across the industry. Here’s a quick glimpse of what’s in store this week.

 

If visionary insight on the future of computing is on your mind, we start, of course, at the top. Intel CEO Brian Krzanich will kick off IDF2015 with what will surely be a fantastic keynote discussing the role of computing in shaping our experiences. To get to the heart of Intel’s vision for the data center, I’d recommend two IDF2015 mega sessions on critical topics shaping our industry focus. Intel’s Diane Bryant will be discussing the story data can tell us in her joint mega session with Intel’s IoT lead, Doug Davis. Not enough? Intel’s Sandra Rivera will be discussing 5G innovation and network transformation with Aicha Evans.

 

Looking for even more technology insight while you’re at IDF? Don’t miss the chance to learn about implementing Software Defined Infrastructure stacks to deliver cloud agility to the data center from Das Kamhout, see the latest on workload acceleration from Al Gara, hear Rob Crooke’s insights on Intel NVM technology, or catch Pradeep Dubey’s deep dive on machine learning and data analytics. Looking for the next great application for analytics? Don’t miss Alan Ross’s session on improving IT security with analytics. Interested in a topic I haven’t listed? Check out the rich selection of presentations in the IDF schedule to see if we’ve got something for you.

 

While you’re at IDF it’s not all about lectures and sessions, though – you’re there to see technology in action and connect with experts! Be sure to visit the technology showcase, where you’ll find demonstrations of the latest data center innovation. While you’re there, don’t miss out on the opportunity to participate in the Data Center Lock Down – a room escape experience that only Intel could deliver. When you’re done beating puzzles and winning prizes, don’t forget to check out the SDI challenge, where you can test your knowledge of software defined infrastructure. Before you head out Tuesday night, don’t miss the opportunity to meet technology experts and CTOs in the data center community from 5 to 7 pm.

 

Here’s a quick run-down on where to find Data Center topics at IDF:

 

 

Mega Sessions – Level 3, Keynote Hall

 

IoT and Big Data Insights: Data Has a Story to Tell with Diane Bryant and Doug Davis – Wednesday, August 19, 9:30-10:30 AM

 

In today’s connected world, every device, every decision and every ambition leaves a unique digital footprint. From smart grids to cars, retail to healthcare, the opportunities to create new experiences and extract breakthrough insights are endless. Join Diane Bryant, Senior Vice President and General Manager, Data Center Group, and Doug Davis, Senior Vice President and General Manager, Internet of Things Group to learn about the technologies, platforms, and tools that will enable your next digital story.

 

5G: Innovation from Client to Cloud with Sandra Rivera and Aicha Evans – Wednesday, August 19, 11:00-12:00

 

5G is foundational to global connected compute, and the communications industry is mobilizing now. Device and infrastructure developers must be ready to handle significant challenges in capacity, data rate, latency, and more. This creates exciting new opportunities in device-to-device communication, new network architectures, and intelligence everywhere. Join Sandra Rivera, Vice President, Data Center Group and General Manager, Network Platforms Group, and Aicha Evans, Corp. Vice President, Platform Engineering Group and General Manager, Intel Communication and Devices Group for a glimpse into the next-generation networks, devices, and consumer experiences that will collectively define the mobile industry’s “5G Era.”

 

Technology Insight Sessions – Level 3, Room 3016

 

Intel Non-Volatile Memory Inside. The Speed of Possibility Outside – Tuesday, August 18, 5:15-6:15

 

Join Rob Crooke, Senior VP of the Non-Volatile Memory Solutions Group, and Al Fazio, Intel Senior Fellow, as they discuss the latest non-volatile memory technology. Rapid growth in connected devices and digital services brings an explosion of data that must be stored and analyzed very quickly. The digital world is expected to create up to 44 zettabytes of data by 2020, and there is tremendous value in quickly turning that enormous amount of data into valuable information.

 

Workload Acceleration – Tuesday, August 18, 4:00-5:00

 

Datacenter workloads have become very diverse. Intel provides a portfolio of technologies, ranging from processors optimized for energy efficiency, throughput, or single-thread performance to an array of accelerators. This Technology Insight will review the criteria, benefits, and costs of using these technologies for performance gains. Hear how developers can take advantage of these technologies and the developer tools that Intel provides.

 

Data Analytics and Machine Learning – Wednesday, August 19, 11:00-12:00

 

This Technology Insight will demonstrate how to optimize data analytics and machine learning workloads for Intel® Architecture based data center platforms.

 

Data Center Technical Tracks

 

Optimizing for Data Center Workloads

As data centers evolve, enabling a highly responsive, high-performance infrastructure optimized for workloads becomes increasingly important. Join this track to learn how to optimize Intel® Architecture based infrastructure and solutions for data center workloads, from high performance computing to big data analytics to media streaming workloads and more.

 

Graphics and Visual Computing

 

Intel continues to enhance the Visual Computing experience with no graphics card required. Join sessions in this track to learn how to take advantage of our next generation graphics microarchitecture to address key user experience needs such as low power optimization, performance gaming, media, visual computing, Ultra HD and display technologies. The information provided, just like Intel® HD Graphics, Intel® Iris™ and Iris Pro graphics products, will scale across platforms from tablets to 2 in 1s to workstations and to servers. The session will include live demos and highlight case studies and results achieved using Intel® tools for workload balancing, performance, and low power tuning.

 

High Speed I/O Technologies

 

This track demonstrates how Intel® technologies combine to accelerate end-user innovation and scale performance and power as new generations of I/O solutions on Intel® architecture become available. Technology topics include PCI Express*, USB, and Thunderbolt™ technologies.

 

Making Software Defined Infrastructure a Reality

 

The digital service economy is driving the need for new services to be deployed quickly and with much greater responsiveness to business needs. Data centers must evolve and become more intelligent to keep up with growing demands on infrastructure. Software Defined Infrastructure (SDI) is the foundation for a more automated, efficient, and responsive infrastructure, yet developers are grappling with a diversity of existing and emerging tools, technologies, and processes to implement SDI. Attend the session in this track to learn more about harnessing compute, network, storage, and orchestration solutions optimized with Intel® technologies to accelerate the development and deployment of SDI.

 

Intel® Solid-State Drive Technology

 

Join the sessions in this track to find out where and why solid-state drives (SSDs) excel. Hear about key considerations when designing SSDs into data centers. This track also explores PCI Express*/NVM Express and the future of SSDs in client and data center applications.

 

Data Center Communities and Pavilions – First Floor, Technical Showcase

 

Data Center and Software Defined Infrastructure Community

 

The data center community showcases the latest Intel technologies and ecosystem solutions that enable more automated, agile, and efficient infrastructure for the digital services economy. At the heart of this transformation is SDI, which allows data center operators to take advantage of dynamic and cost effective resource pools built on industry standard solutions. See the latest technology in action – including storage optimization, network function virtualization, high performance computing, and cloud security. Intel technology experts will be on hand to discuss these and many other innovations for the data center of today and tomorrow.

 

Intel® Network Builders Community

 

The Intel® Network Builders Community showcases key ecosystem partners that are collaborating with Intel to accelerate the adoption and deployment of software-defined networking (SDN) and network function virtualization (NFV) solutions. The Intel Network Builders ecosystem is helping to drive the network transformation by promoting open standards and interoperability along with highly optimized virtual network functions running on Intel® architecture-based servers.

 

NVM Express* (NVMe) Community

 

PCI Express* has emerged as the storage interconnect for performance-demanding client and enterprise environments. With NVM Express* products now available, this community demonstrates the availability of these products and the breadth of industry investment driving this technology transition. NVM Express unlocks the potential of current and future NVM PCIe storage, delivering increased performance while providing support for security, end-to-end data protection, and other enterprise and client features users need. Visit the NVM Express community at IDF to see the ecosystem momentum around NVM Express technology.

 

 

What if you can’t make it to IDF this year? Don’t worry. While you might miss some of the fun and the valuable connections, you will still get as much data center news as we can share this week. Stay tuned to the Data Stack to get more updates from Intel’s experts. Join the conversation on Twitter with #IDF15. And make sure to tune in to Intel Chip Chat live from the show floor on Day 2 and Day 3 of IDF.

Read more >

IDF15 Preview: Sizzling Hot IoT Events at Intel’s Biggest Party of the Year

Yes, there will be robots. There may even be a dance off. Anything is possible. In less than 24 hours, visionary coders, extraordinary makers, and inspirational creators will join forces at IDF15 in San Francisco, Intel’s premier event for developers … Read more >

The post IDF15 Preview: Sizzling Hot IoT Events at Intel’s Biggest Party of the Year appeared first on IoT@Intel.

Read more >

The Mexican Appetite For Large-Scale Business Transformation

The Vortex of Change



In business, a new norm is upon us. Static industrial age models are being turned upside down, as the ability to ‘innovate with velocity’ has overtaken size as a key driver of success. Businesses need to be able to roll out new products and services in previously unimagined timeframes – we’re talking weeks, even days, as opposed to months and years. And if you don’t do it, someone else will!


An agile, modern, flexible IT infrastructure underpins the 21st Century business, but this alone does not guarantee success. IT deployments need to happen hand in hand with large-scale workplace transformation.

 

At Intel we refer to this whirling mass of business, technological and cultural transition as The Vortex of Change. Once the dust has settled only the innovative will emerge in one piece; the nervous and slow will undoubtedly struggle to stay relevant.

 


The SMAC stack


At Intel and beyond, the SMAC (Social, Mobile, Analytics, Cloud) stack is recognised as the Digital Platform model required as the starting point to drive this large-scale business transformation. I’ll recap quickly on what we mean by SMAC stack but for a longer and more detailed description I’d recommend you read this blog from my colleague and good friend Jim Henrys.


  • Social

      Democratizes ideas and options, eliminates traditional hierarchies for communication, sharing and connecting

 

  • Mobile

      Allows us to work any time, from anywhere

 

  • Analytics

      Enables filtering of information to create observations, predictions, and drive real-time decision making

 

  • Cloud

      Provides access to information for collaboration anywhere, any time and on any device

 

How is this manifesting in the real-world?

 

We know from our interactions with customers that the vast majority of organisations across the globe acknowledge the fact that they have to become more nimble in order to survive. We also know that the SMAC stack is recognised as the starting point to Digital Convergence, which is fusing the best attributes of traditional business with the agility of digital business. Nowhere is this more evident than in Mexico.

 

I was lucky enough to take a recent trip to Mexico City and witness a country on the move. Mexico has an awful lot going for it:

 

  • Great labour market

Above all else, it’s people that drive transformation and Mexico has a great labour market with high-quality manufacturing output. This makes it very attractive for companies looking to invest and use Mexico as a hub from which to expand into other areas of Latin America.

 

  • Huge potential for e-commerce

Traditionally, internet penetration in Mexico has been low as many families do not own a computer. However, analysts are now predicting an increase in internet users from 65 million in 2015 to 80 million in 2018, driven primarily by increased smartphone and tablet ownership together with falling data costs. By 2017 the number of smartphone users is expected to exceed 54.4 million, while in 2015 Mexico will have the highest tablet penetration in Latin America at 35 percent. Couple this with the fact that Mexico has 112 million inhabitants, many living in areas where physical access to goods is harder than in countries like the US where most small towns have a Walmart, and it’s easy to see the huge potential for e-commerce.

 

  • Growth in IT 

Analysts are predicting a CAGR of 7.2 percent in IT spending over the next four years, driven by a convergence of income growth, declining device prices, and Prosoft 3.0 – a supportive government ICT development policy. Cyber security software and services, cloud computing, retail hardware, demand for tablets and hybrid notebooks, and outsourcing will drive this trend.

 

  • Great business conditions

When compared with Brazil, for example, Mexico has 30 percent more GDP per capita and conditions for doing business are arguably more attractive. It is ranked number 39 in the World Bank’s ease of doing business index, while Brazil is ranked 120.

 

When it comes to individual companies what I saw in Mexico is consistent with what I’ve been seeing globally – businesses recognise that they need to change if they want to stay competitive and, in some instances, they are ahead of the curve. Many of the customers I spoke to were not only aware of the changes they needed to make to survive the Vortex of Change; they were already putting plans in place and taking concrete steps towards implementing them.


Turbulent times ahead

 

However, change is not easy. While many of the customers I met with in Mexico have transformation plans in place, the next steps will be less than straightforward. It was clear to me that many businesses are on the lookout for partners who can give them advice on the technological and cultural change required to get from A to B. I’d like to think that Intel can be a valued partner for many of them, offering an independent, global perspective on how transformation is being driven across industries.

 

Breaking it down, there are roughly five key considerations businesses should take into account when making the transformation from industrial age (static, slow and immovable) to digital age (nimble, fast and innovative):

 

  • Being data driven

      What data have we got, what can we do with it, and how can we monetise it?

 

  • Being on demand

      How can we economically and rapidly deliver value-add services to customers who want everything immediately?

 

  • Being secure

      How do we move from playing Whack-A-Mole to managing risk in a proactive and cost-effective manner?

 

  • Being customer centric

      How do we win and retain customers through properly connected experiences?

 

  • Being innovative

        How do we attract and retain the talent to drive innovation?

 

In Mexico these changes and considerations are not hypothetical; they are real and they are happening right now. My next blog takes a closer look at some of the Mexican businesses I met with on my trip and examines their progress down this path.


To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter

Read more >

Rethinking Enterprise Storage in the Cloud Era

By Nigel Cook, Intel Fellow, Chief Cloud Architect, Data Center Group, Innovation Pathfinding & Architecture Group at Intel

 

In today’s world, a business department can spin up a storage environment in a public cloud in a matter of minutes. To offer business users the same level of agility, enterprise IT teams need private cloud deployment models that enable the rapid self-service delivery of high-performance storage services. In other words, enterprise IT organizations now need to think and perform like cloud service providers.

 

This shift requires a rethinking of current processes and approaches to storage provisioning and management. In a conventional environment, storage is manually provisioned and matched to the specific needs of the application. When a business department wants to bring a new application online, it might take a month or two for the IT team to provision appropriate storage and carry out all the checks and balances to ensure end-to-end functionality. That doesn’t sound very cloud-like, does it? Lags of weeks or months are unacceptable in the cloud era.

 


To meet the rising expectations of end users, and to keep pace with an unprecedented influx of data, the IT organization now must find ways to deliver high-performance, scalable storage with the agility and economics of a public cloud. This new approach to the delivery of storage services requires a combination of innovative technologies that drive higher performance and automated management of the entire storage environment—so data is always stored at the right cost and the right level of performance.

 

To enable this cloud-like approach to storage, enterprises need to identify the right combinations of optimized hardware and software components. These combinations enable storage to be provisioned from a private cloud in a matter of seconds—all via a self-service portal and guided by backend management software that applies policies set by IT administrators.
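To illustrate the idea of backend software applying administrator-set policies to self-service requests, here is a minimal sketch. It is purely hypothetical: the tier names, IOPS thresholds, and request format are my own inventions, not any Intel, OEM, or Ceph API.

```python
# Toy policy engine: a self-service storage request is matched against
# administrator-defined policies and placed on the cheapest tier that still
# meets its performance needs. Illustrative only -- all names are made up.

POLICIES = [
    # (predicate over the request, storage tier to provision)
    (lambda req: req["iops"] >= 50_000, "nvme-ssd"),
    (lambda req: req["iops"] >= 5_000,  "sata-ssd"),
    (lambda req: True,                  "hdd-archive"),  # catch-all tier
]

def provision(request):
    """Pick the first tier whose policy matches the self-service request."""
    for predicate, tier in POLICIES:
        if predicate(request):
            return {"tier": tier, "capacity_gb": request["capacity_gb"]}
    raise RuntimeError("no policy matched")  # unreachable: catch-all above

# A business user fills in a portal form; no manual provisioning step.
print(provision({"iops": 80_000, "capacity_gb": 500}))    # lands on nvme-ssd
print(provision({"iops": 1_000,  "capacity_gb": 10_000})) # lands on hdd-archive
```

The design point is that administrators encode intent once (the policy table) and every subsequent request is satisfied in seconds, which is what collapses the weeks-long provisioning lag described above.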

 

At Intel, we understand these new requirements for enterprise storage in the cloud era. More specifically, we have a sharp focus on the technologies that will enable high-performance scalable storage for enterprise private clouds in an affordable manner—from next-generation Intel® Xeon® processors to new 3D XPoint technology for non-volatile memory and solid-state storage. Together with our OEM and open-source software partners, we are working actively to create a reference architecture that will bring the vision to life.

 

In one such collaboration, we are working with HP, Red Hat, and other partners to enable new advanced technologies and open source Ceph storage management software. In this particular project, we are optimizing Ceph to take full advantage of the capabilities of Intel® Solid State Drives (SSDs) and associated Intel technologies, including Intel® Cache Acceleration Software (Intel® CAS). This reference architecture integrates hardware with soon-to-be-released software in a new way to achieve the scalability, affordability, and performance results private clouds seek. A broader goal is to drive the adoption of Ceph and enable the efforts of the open source community.

 

This collaboration, which will be in the spotlight at the Intel Developer Forum (IDF) in San Francisco August 18-20, proves that with the right combination of SSDs, optimized storage software, and enterprise-class processors, you can create a system that delivers tremendous improvements in storage performance. We’re talking about gains measured in orders-of-magnitude over conventional systems built around hard disk drives (HDDs). Even better, we’re showing that you can achieve this top-tier performance while making sure that data is stored in a cost-effective manner.

 

If you have the opportunity to attend IDF, you can see a firsthand demo of an Intel-based high-performance, scale-out cloud storage solution with HP Helion OpenStack® and  high-density HP Apollo servers. Look for the demo in the Data Center SDI Community Booth. Or for a deeper dive, attend the session INFS004 – High Performance Scale-out Storage for the Enterprise Private Cloud.

 

For a closer look at the future of storage in general, visit intel.com/storage.

 

Intel, the Intel logo, Xeon, Intel Atom, and Intel Core are trademarks of Intel Corporation in the United States and other countries.
* Other names and brands may be claimed as the property of others.


My Top 5 Attractions for Networking Enthusiasts at IDF15

By Renu Navale, Director of Intel Network Builders

 

It is an exciting time to work in the networking industry! Networking technology is going through a massive transformation as software defined networking (SDN) and network functions virtualization (NFV) solutions open the door to new business opportunities. Network infrastructure is also gearing up for 5G, the next wave in mobile connectivity expected to ramp by the end of the decade. With so much innovation on display, the Intel Developer Forum is an excellent place to dig deeper into network transformation and take the pulse of the rapidly changing industry landscape.

 

 

Here are my top 5 highlights of IDF15 that networking enthusiasts can’t miss:

 

1. Intel Network Builders Community (http://www.intel.com/content/www/us/en/intel-developer-forum-idf/san-francisco/2015/idf-2015-san-francisco-technology-showcase.html): IDF15 marks the second anniversary of the Intel Network Builders ecosystem program (http://networkbuilders.intel.com), which is helping to drive network transformation by building a community of vendors and providers working together towards solutions optimization, integration, and deployment. The Intel Network Builders community at IDF features 24 ecosystem partners demonstrating the next generation SDN/NFV, Mobile Edge Computing, and Carrier Grade technologies and solutions. This is at the top of my list to see in the Technology Showcase because it provides direct access to a diverse set of technical experts and networking demonstrations.

 

2. 5G: Innovation from Client to Cloud Mega Session: This “Mega Session” features Sandra Rivera, Vice President, Data Center Group, and General Manager, Network Platforms Group, and Aicha Evans, Corporate Vice President, Platform Engineering Group, and General Manager, Intel Communication and Devices Group. They will take the stage to provide a glimpse into the next-generation networks, devices, and consumer experiences that will define the mobile industry’s “5G era.” I think this is a must-see for networking enthusiasts, and I hope to see you there on Wednesday, August 19, 11:00-12:00. If you can’t make it, you can catch my tweets @renunavale!

 

3. Networking Technical Sessions: All of the sessions I’ve chosen below will give you an opportunity to learn from Intel technology experts and discuss some of the hottest technologies impacting software defined infrastructure and data center networking. Use the session titles and session IDs to find more information from the IDF15 webpage:

    • Software Defined infrastructure: Tips, Tricks and Tools for Network Infrastructure Design and Optimization (INFS002)
    • Tech Chat: 100Gb Ethernet Service Function Chaining on the Intel® Open Network Platform (INFC002)
    • Tech Chat: Intel® Open Network Platform Addressing SDN/NFV Industry Needs Through Innovation (INFC001)
    • Programmable, Scalable and High Performance Data Plane for Network Functions Virtualization (INFS010)
    • Optimizing Video Processing and Delivery with Intel® Xeon® Processor E3 Solutions (DCWS003)
    • Hands-on Lab: Data Plane Development Kit 101 – Discover the Performance Secret Sauce (DCWL001 & DCWL001R)
    • Intel® Open Network Platform Solutions for Network Functions Virtualization (INFS003)
    • Implementing Software Defined Infrastructure for Hyper-Scale (INFS001)

     

4. 4.5/5G Wireless Pavilion and Data Center Community: As mobile continues to evolve beyond 4G to the fifth-generation (5G) mobile standard, the need for next-generation networks evolves with it. The 4.5/5G Wireless Pavilion (http://www.intel.com/content/www/us/en/intel-developer-forum-idf/san-francisco/2015/idf-2015-san-francisco-intel-pavilions.html) showcases cutting-edge wireless concepts designed to serve as the networking foundation for future mobile applications. I encourage you to come see what is driving the future of mobile networking and get an early glimpse into tomorrow’s mobile experiences. While you are in the area, I suggest you visit the Data Center Community, which is showcasing the latest Intel technologies and ecosystem solutions that enable more automated, agile, and efficient data center infrastructure, including several networking-related demonstrations.

     

    5. Data Center Escape Game: For the first time at IDF, the Data Center Group will host a live room-escape experience. While not directly related to networking technologies, this is a fun and competitive game that challenges your puzzle skills! With 30 minutes on the clock, teams of 5 are locked in a simulated control room and must work together to bring the data center back online. The championship will be on the line, and some amazing prizes too! I recommend you reserve your spot in advance at: https://infrastructurebuilders.intel.com/escaperoom or visit the game in the Data Center and SDI Community.

     

    There you go – five networking stops to add to your IDF15 agenda. I look forward to seeing you there.


    Using Data Value to Define Storage Requirements

    By Mike Ferron-Jones, Technology Marketing Manager, Data Center Group at Intel

     

Over the past five years, the volume of data stored in homes, businesses, and data centers has grown exponentially. Gigabytes have grown to terabytes and petabytes. Looking ahead, the future will move us into the era of exabytes and zettabytes. By itself, data volume is an artifact of human and machine activity.

     

With the rising tide of stored data have come new approaches to turning data into information. The most dramatic change in data management was pioneered in the last decade, when Hadoop and other tools enabled meaningful analysis of unstructured data. Released from the bounds of a structured database and subjected to cloud-scale analysis tools, data that was heretofore unused or silent suddenly started speaking and giving direction. Since then, data’s value has increased by informing better decisions. Big data analysis has changed the way we live, travel, guide medical care, make friends, communicate, buy goods and services, and express ourselves to our communities.

     

    Storage media devices have evolved to keep pace with exponential data growth. Given the current trend lines, however, incremental advances in legacy devices can no longer solve our data analysis challenges. We now need revolutionary changes in our approaches to data availability and storage, allowing larger datasets to guide better decisions in less time.

     

    So how did we get to this point? Let’s take a step back and look at today’s storage media pyramid. This price performance pyramid reflects the fact that every technology has trade-offs in terms of data storage capacity, speed, and cost. While the technologies within the pyramid have changed over time, the fundamental approach has remained the same for many years.

     

This approach puts frequently used data in a fast, but relatively small and expensive, hot tier, and less-frequently used data in a large cold tier that uses slower, less-expensive technology. Data that falls somewhere between the hot and cold layers is stored in a warm tier. Currently, the hot and warm tiers leverage DDR4 DRAM and NAND solid-state drives (SSDs), respectively, while the cold tier leverages spinning hard disk drives (HDDs).
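The placement logic described above can be sketched in a few lines. This is a minimal illustration of frequency-based tiering, not any vendor's implementation; the thresholds and tier-to-media mappings are assumptions chosen for the example.

```python
# Illustrative sketch of tiered data placement by access frequency.
# Thresholds and the media named in comments are hypothetical examples.

def assign_tier(accesses_per_day: int) -> str:
    """Map an object's access frequency to a storage tier."""
    if accesses_per_day >= 100:
        return "hot"    # e.g. DDR4 DRAM or fast SSD cache
    if accesses_per_day >= 1:
        return "warm"   # e.g. NAND SSD
    return "cold"       # e.g. spinning HDD

objects = {"orders.db": 500, "q2_report.pdf": 3, "2009_archive.tar": 0}
placement = {name: assign_tier(freq) for name, freq in objects.items()}
print(placement)  # prints {'orders.db': 'hot', 'q2_report.pdf': 'warm', '2009_archive.tar': 'cold'}
```

In a real system the thresholds would be tuned continuously, and objects would migrate between tiers as their access patterns change.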

Today’s storage media pyramid

     

    While it has served us well, the current storage media pyramid is starting to crack under the weight of today’s data loads. Datasets are swelling in size, the number of transactions is growing exponentially, and expected response times are shrinking. All the while, more and more processor cores and application containers are packed into each server, each contending to use that small and valuable HOT data layer. We need hot tier performance at cold tier cost efficiency and scale.

     

As things now stand, the layers below HOT cannot rise to the performance challenges posed by tomorrow’s processors and software; they are just too slow.


    In recognition of this reality, Intel is working with a broad ecosystem to deliver revolutionary new technologies that can better support the emerging requirements for data performance.

     

As we begin to solve one performance bottleneck, we must also look to the storage interface, where legacy SAS and SATA designs impose bottlenecks of their own. Intel is working with an industry consortium of more than 90 members to develop a standard storage interconnect called NVM Express (NVMe) to serve as the standard software interface for PCI Express* (PCIe*) SSDs.

     

    Legacy SATA and SAS interfaces were defined for mechanical hard drives. These legacy interfaces are slow in terms of throughput and latency. NVMe leaps over the limitations of these legacy technologies. It was designed from the ground up for non-volatile memory, low latency, and amazing storage media performance.

     

    For a closer look at this breakthrough technology, check out this video: Unlocking SSD Performance with NVM Express (NVMe) Technology

     

    On another front, Intel is delivering innovations to evolve and revolutionize storage media and further disrupt today’s mainstream storage architectures. These innovations include 3D NAND, which dramatically increases transistor densities in storage media. Unveiled earlier this year, 3D NAND is the world’s highest-density flash memory. It promises to greatly increase the capacity and reduce $-per-gigabyte of solid-state drives (SSDs).

     

    For a deep dive into 3D NAND, check out this webinar: Intel, Micron Discuss 3D NAND Technology. 3D fabrication methods increase data density and innovative materials science increases the speed at which data can be accessed by the processor.

     

These new technologies will redraw the storage media pyramid, enabling faster access to larger amounts of data for more accurate and complete analysis. Medical research, climate modelling, and solution finding will all benefit from these innovations. Data has become much more valuable to us, and it will become increasingly valuable as Intel scales non-volatile memory and the processing power that unleashes data’s value.

     

As we look to the future, we need even more revolutionary advances in data access architectures. We will take up this topic (you might have heard about 3D XPoint™ technology) in a follow-on post.

     

Intel, the Intel logo, Xeon, Intel Atom, and Intel Core are trademarks of Intel Corporation in the United States and other countries.
    * Other names and brands may be claimed as the property of others.


    Big Data Doesn’t Have to Be Big, If Done Right

    By Brian Womack, Director, IPAG Data Analytics & Machine Learning at Intel

     

    To meet demanding service-level agreements, data center operators need to aggregate and analyze telemetry data from many heterogeneous sources across the data center. These are essential capabilities in the effort to efficiently and adaptively manage SLA-related resources in a software-defined infrastructure environment.


     

This analytics work, of course, can’t be done with manual processes in a data center that has thousands, or tens of thousands, of servers with potentially millions of components that generate telemetry data. It requires automated tools that capture and leverage telemetry data from processors, memory subsystems, storage, and fabric resources. Telemetry data enables the analytics, visualization, classification, and prediction capabilities that, in turn, drive efficient and adaptive SLA management.

     

    That’s the way things should be, anyway. The reality in today’s data centers is something else. The use of telemetry data for SLA management is hindered by tools that are oriented toward the metrics of the past. For example, many telemetry APIs provide only a partial view of cluster resources without analytics for classification or prediction in mind. And today’s tools often provide insufficient metrics for SLA performance tracking and prediction.

     

In a related challenge, there are no set standards for the way different platforms present telemetry data. Telemetry data from different sources—such as the Microsoft Hyper-V or VMware vSphere ESXi hypervisors—is expressed in different units of measure, resolution, and naming. The result is the telemetry equivalent of an apples-and-oranges comparison—and a time-consuming challenge for analytics developers in a heterogeneous data center.

     

    These challenges create the need for a new approach that enables more efficient and adaptive service level agreement management. That’s the goal of Data Center Telemetry Analytics (DCTA).

     

    DCTA leverages sophisticated capabilities like hierarchical telemetry analytics and distributed compression to take SLA management to a new level. In simple words, we’re talking about moving primary analytics operations close to the source of the raw telemetry data, doing the initial analysis there, and then sending the results — in a greatly condensed summary of the raw data that preserves its key attributes — to a central DCTA system for analysis in the context of telemetry data gathered from across the data center.
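The "analyze near the source, ship a condensed summary" idea can be sketched as a tiny edge agent. This is an illustrative assumption of what such a summary might contain, not a description of Intel's DCTA implementation; the field names and chosen statistics are hypothetical.

```python
# Hedged sketch: an edge agent condenses raw telemetry samples into a few
# statistics that preserve key attributes before sending them upstream.
import statistics

def summarize(samples: list[float]) -> dict:
    """Condense raw telemetry into a compact summary preserving
    central tendency, spread, and extremes."""
    return {
        "count": len(samples),
        "mean": statistics.fmean(samples),
        "stdev": statistics.pstdev(samples),
        "min": min(samples),
        "max": max(samples),
    }

raw_cpu_util = [42.0, 44.5, 41.8, 90.2, 43.1]  # per-second readings
summary = summarize(raw_cpu_util)
# Five numbers cross the fabric instead of every raw sample;
# the 90.2 spike still shows up in "max" and "stdev".
print(summary["count"], summary["max"])  # prints 5 90.2
```

The compression ratio grows with the sampling rate: a day of one-second samples (86,400 values) still condenses to the same handful of numbers per window.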

     

    The ability to compact large amounts of raw data into a summary form is a hugely important capability. It’s the equivalent of condensing a room full of vapor into a coffee cup of liquid. This data condensation greatly reduces the overhead of processing and transmitting enormous volumes of telemetry data across a data center fabric that should be used for paying application customers. The ability to tame telemetry data at its source in a mission-driven manner proves the proposition that, when done right, big data doesn’t have to be all that big.

     

The key to DCTA lies in a normalized and hierarchical ontology for telemetry. Here we’re talking about establishing one data schema and one interface for analytics developers, sharing the same units of measure for time, space, and domain. This unified interface, which provides significantly more context over time, simplifies and accelerates the work of analytics developers while enabling apples-to-apples comparisons of activities across a data center, a large enterprise, or an entire nation.
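To make the normalization idea concrete, here is a sketch of adapters that map two hypervisors' telemetry into one schema. The source field names and unit conventions below are invented for illustration; real Hyper-V and ESXi counters differ in their own ways.

```python
# Sketch of normalizing heterogeneous telemetry into one schema.
# All source field names and units here are hypothetical assumptions.

def normalize_hyperv(sample: dict) -> dict:
    # Assume this source reports memory in MB and CPU as a percentage.
    return {"host": sample["node"],
            "mem_bytes": sample["mem_mb"] * 1024 * 1024,
            "cpu_util": sample["cpu_pct"] / 100.0}

def normalize_esxi(sample: dict) -> dict:
    # Assume this source reports memory in KB and CPU as a 0-1 fraction.
    return {"host": sample["hostname"],
            "mem_bytes": sample["mem_kb"] * 1024,
            "cpu_util": sample["cpu_frac"]}

# Both sources now land in one apples-to-apples record format.
records = [normalize_hyperv({"node": "hv01", "mem_mb": 2048, "cpu_pct": 75}),
           normalize_esxi({"hostname": "esx01", "mem_kb": 2097152, "cpu_frac": 0.75})]
assert records[0]["mem_bytes"] == records[1]["mem_bytes"]  # same 2 GiB
```

Once every source speaks this schema, an analytics developer writes one query instead of one per hypervisor vendor.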

     

    Collectively, these DCTA capabilities enable adaptive SLA monitoring and management. They allow data center operators to engage in accurate predictive capacity planning; automate root-cause determination and resolution of IT issues; monitor customer jobs over time to assess performance trends; proactively recommend processor, memory, and fabric upgrades or downgrades; and predict system or component failures.

     

    Capabilities like these are keys to meeting ever-more-aggressive SLAs in a manner that makes optimal use of IT assets and drives economies across the data center.

     

This isn’t a vision for a distant future. This is a path that Intel is on today in a research-driven initiative to bring DCTA to life. For a closer look at this initiative, see a demonstration in the Data Center SDI Community at IDF, or attend the session. Also, check back after 8/19 for a link to the session presentation, Data Center Analytics for Efficient and Adaptive SLA Management. In addition, we encourage you to share your thoughts on DCTA on Twitter @IntelITCenter.


Intel, the Intel logo, Xeon, Intel Atom, and Intel Core are trademarks of Intel Corporation in the United States and other countries.

    * Other names and brands may be claimed as the property of others.


    The Mexican Appetite For Large-Scale Business Transformation

    The Vortex of Change


In business, a new norm is upon us. Static industrial-age models are being turned upside down, as the ability to ‘innovate with velocity’ has overtaken size as a key driver of success. Businesses need to be able to roll out new products and services in previously unimagined timeframes. We’re talking weeks, even days, as opposed to months and years. And if you don’t do it, someone else will!

    An agile, modern, flexible IT infrastructure underpins the 21st Century business, but this alone does not guarantee success. IT deployments need to happen hand in hand with large-scale workplace transformation.

    At Intel we refer to this whirling mass of business, technological and cultural transition as The Vortex of Change. Once the dust has settled only the innovative will emerge in one piece; the nervous and slow will undoubtedly struggle to stay relevant.

     


    The SMAC stack


    At Intel and beyond, the SMAC (Social, Mobile, Analytics, Cloud) stack is recognised as the Digital Platform model required as the starting point to drive this large-scale business transformation. I’ll recap quickly on what we mean by SMAC stack but for a longer and more detailed description I’d recommend you read this blog from my colleague and good friend Jim Henrys. 

    • Social

            Democratizes ideas and options, eliminates traditional hierarchies for communication, sharing and connecting

     

    • Mobile

       Allows us to work at any time and from anywhere

     

    • Analytics

           Enables filtering of information to create observations, predictions, and drive real-time decision making

     

    • Cloud

            Provides access to information for collaboration anywhere, any time and on any device



    How is this manifesting in the real-world?

     

    We know from our interactions with customers that the vast majority of organisations across the globe acknowledge the fact that they have to become more nimble in order to survive. We also know that the SMAC stack is recognised as the starting point to Digital Convergence, which is fusing the best attributes of traditional business with the agility of digital business. Nowhere is this more evident than in Mexico.

    I was lucky enough to take a recent trip to Mexico City and witness a country on the move. Mexico has an awful lot going for it:

     

    • Great labour market

    Above all else, it’s people that drive transformation and Mexico has a great labour market with high-quality manufacturing output. This makes it very attractive for companies looking to invest and use Mexico as a hub from which to expand into other areas of Latin America

     

    • Huge potential for e-commerce

    Traditionally Internet penetration in Mexico has been low as many families do not own a computer. However, analysts are now predicting an increase in internet users from 65 million in 2015 to 80 million in 2018, driven primarily through increased smartphone and tablet ownership together with falling data costs. By 2017 the number of smartphone users is expected to exceed 54.4 million, while in 2015 Mexico will have the highest tablet penetration in Latin America at 35 percent. Couple this with the fact that Mexico has 112 million inhabitants, many living in areas where physical access to goods is harder than in countries like the US where most small towns have a Walmart, and it’s easy to see the huge potential for e-commerce

     

    • Growth in IT 

    Analysts are predicting a CAGR of 7.2 percent in IT spending over the next four years driven by a convergence of income growth, declining device prices and Prosoft 3.0 – a supportive government ICT development policy. Cyber security software and services, cloud computing, retail hardware, demand for tablets and hybrid notebooks and outsourcing will drive this trend

     

    • Great business conditions

    When compared with Brazil, for example, Mexico has 30 percent more GDP per capita and conditions for doing business are arguably more attractive. It is ranked number 39 in the World Bank’s ease of doing business index, while Brazil is ranked 120

    When it comes to individual companies what I saw in Mexico is consistent with what I’ve been seeing globally – businesses recognise that they need to change if they want to stay competitive and, in some instances, they are ahead of the curve. Many of the customers I spoke to were not only aware of the changes they needed to make to survive the Vortex of Change; they were already putting plans in place and taking concrete steps towards implementing them.

     


    Turbulent times ahead

     

    However, change is not easy. While many of the customers I met with in Mexico have transformation plans in place, the next steps will be less than straightforward. It was clear to me that many businesses are on the lookout for partners who can give them advice on the technological and cultural change required to get from A to B. I’d like to think that Intel can be a valued partner for many of them, offering an independent, global perspective on how transformation is being driven across industries.

    Breaking it down, there are roughly five key considerations businesses should take into account when making the transformation from industrial age (static, slow and immovable) to digital age (nimble, fast and innovative):

     

    • Being data driven

          What data have we got, what can we do with it, and how can we monetise it?

     

    • Being on demand

          How can we economically and rapidly deliver value-add services to customers who want everything immediately?

     

    • Being secure

           How do we move from playing Whack-A-Mole to managing risk in a proactive and cost-effective manner?

     

    • Being customer centric

           How do we win and retain customers through properly connected experiences?

     

    • Being innovative

            How do we attract and retain the talent to drive innovation?


     

    In Mexico these changes and considerations are not hypothetical; they are real and they are happening right now. My next blog takes a closer look at some of the Mexican businesses I met with on my trip and examines their progress down this path.

     

     

     

     

    To continue the conversation on Twitter, please follow us at @IntelITCenter or use #ITCenter


    Wound Care Management goes 3D with GPC and Intel® RealSense™

With rising healthcare costs making news on an (almost) daily basis, it’s essential that we push ahead with the drive for technology to have a meaningful impact. One area seeing brilliant innovation is the field of wound care, where we’ve been working closely with GPC, a UK-based company that is combining Intel® RealSense™ 3D Camera capabilities with a mobile app to deliver better patient outcomes, improved clinical management and lower organisational costs.

     

In the UK alone, wound care is costing the NHS some £3 billion per year. The challenges are clear to me: specialist wound care clinicians are a limited and costly resource, monitoring and managing healing can be subjective, and evidence-based decision making is hampered by a lack of consistent wound information. I’m pleased that a combination of talent and technology is meeting these problems head-on.

     

    Enhancing Mobility, Removing the Guesswork

GPC, with extensive knowledge of the healthcare sector, set about overcoming these challenges by designing an innovative wound care management solution. I asked Huw Morgan, Technical Director at GPC, to tell us more: “We wanted to provide an enhanced level of mobility for clinicians working in wound care by delivering applications which work across multiple devices including tablets, laptops and mobile phones. Additionally, taking away some of the guesswork by offering a standardised and consistent method of capturing images of wounds – which enables the clinician to determine changes in size and colour – enhances evidence-based decision making considerably. These two factors will lead to better patient outcomes through reduced healing times and fewer complications.”

     

    The team at GPC are utilising the Intel® RealSense™ 3D Camera which can be found in a range of mobile devices. The camera’s depth-sensing technology is a real win for healthcare across many scenarios and it truly excels in the field of wound care. Huw explained more: “With RealSense™ we’re able to not only record real-time data in 3D but the clinician can also rotate and interact with the wound image. This delivers a much greater understanding of a wound in respect of location, which more often than not will be sitting on an uneven surface on the body.”

     

    RealSense™ Enables Quantum Leap in Clinical Monitoring

    Dr. Ian Wiles, Medical Director at GPC, talked through some of the detailed clinical benefits of using RealSense™ in relation to wound management: “The widely adopted pressure ulcer classification system (four stages) is helpful to allow communication between clinicians and managers but it can be inconsistently applied. 3D cameras enable any carer to accurately assess and monitor an ulcer.”

     

    “ULNITS are the accurate measurement of an ulcer using a 3D camera – maximum width x maximum length x maximum depth – note this is not a cubic measurement of tissue loss because of the complex shapes of ulcers but it is an objective, repeatable gauge that can be accurately monitored. The 3D image will be of benefit when reviewing cases or for the tissue viability experts but the progress/deterioration of the ulcer by ULNIT is far more important.”

     

“It is the monitoring of the ulcer using the RealSense™ 3D Camera that is the most powerful development. After the initial assessment, the treatment can be started using locally agreed protocols, and the progress of the ulcer can be accurately predicted while removing the subjective element of previous classifications. Every clinician involved in ulcer care understands this is a powerful development in measurement. RealSense™ has enabled a quantum leap in clinical monitoring; it’s not 3D for the sake of 3D, but better care using 3D.”
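The ULNIT gauge quoted above (maximum width x maximum length x maximum depth) can be sketched as a simple bounding-box calculation over 3D camera samples. This is a hedged illustration only: the point format, units (mm), and the assumption that depth is measured from the skin surface at z = 0 are mine, not GPC's specification.

```python
# Hypothetical sketch of the ULNIT gauge: max width x max length x max depth
# from (x, y, depth) samples of a wound surface, all in mm (assumed units).

def ulnit(points: list[tuple[float, float, float]]) -> float:
    """Bounding-box gauge of a wound. Note: this is not a cubic measurement
    of tissue loss (ulcer shapes are complex); it is an objective,
    repeatable number that can be tracked over time."""
    xs, ys, zs = zip(*points)
    width = max(xs) - min(xs)
    length = max(ys) - min(ys)
    depth = max(zs)  # depth below the skin surface, assumed at z = 0
    return width * length * depth

scan = [(0.0, 0.0, 0.0), (12.0, 0.0, 1.5), (6.0, 20.0, 3.0), (12.0, 20.0, 0.5)]
print(ulnit(scan))  # prints 720.0  (12 mm x 20 mm x 3 mm)
```

The clinical value comes from repeating the same objective calculation at each visit, so the trend in the number tracks healing or deterioration.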

     

    Combining 3D Images with Analytical Expertise

    GPC take the data from the RealSense™ 3D Camera and apply their analytical expertise to provide wound care specialists with a view that is consistent, both in terms of visual changes across time and in respect of size and colour. Additionally, GPC have developed an algorithm to more accurately measure wound severity and consequently healing.

     

    The buzz from healthcare providers around these advances in wound care technology from GPC at HIMSS15 this year was fantastic and that has continued through to events in the UK in recent months. I’m really pleased to be able to share this use case with you which will allow clinicians to be more mobile, capture an enhanced level of data and benefit from innovative analytics. Healthcare providers will see reduced costs around wound management and, importantly, deliver better patient outcomes.

     

    Wound care is just one aspect where the Intel® RealSense™ family of software and depth cameras enables more natural and intuitive interaction with personal computing devices and for healthcare there are many more possibilities such as:

     

    • Gesture control, facial detection and tracking for use in secure login;
    • Video conferencing where the background can be excluded;
• Emotion detection that assesses facial expressions, helping to address empathy and social-emotional factors in recovery from stroke;
• Tracking of 22 joints in a hand, which could assist in post-operative treatment after hand surgery, for instance.

     

    These are just some of the areas that have been discussed so far where this technology could be applied. And we must not forget that RealSense™ 3D cameras can be found in a range of laptops and all-in-one devices which opens up access to the significant benefits of solutions such as GPC’s wound care management to healthcare providers across the world.

     

    GPC will be showcasing their innovative wound care technology which uses Intel® RealSense™ at IDF15.


    What’s Next for Smart Buildings?

    Our buildings are getting smarter. They can tell us all sorts of things about the way we work and live our lives – and building automation systems and energy management systems have been keeping an eye on their hosts’ performance for years. But the IoT is about to send the average building’s IQ into the stratosphere. Smart buildings are going to get very smart indeed.

     

    Energy Management Leads the Way

     

    We’re already starting to see the IoT make a difference in energy management. Having built its business on providing energy-efficiency products to the construction industry worldwide, Kingspan is using the Intel IoT Gateway built on the Intel® Quark™ SoC to super-charge its energy-management solutions – and is on its way to achieving net-zero energy status for its headquarters in Dublin.

     


     

    Kingspan is not alone: Daikin Applied is using the Intel IoT Gateway to support more proactive management of buildings’ performance. Inspired by the problem of people getting stuck in elevators during a regional power-cut, Rudin (the leading private manager of business and residential property in Manhattan) has had a smart building strategy since 2008. But now Rudin and Intel are exploring how the IoT and machine learning can improve productivity and efficiency, maximize operations and enhance day-to-day life for its tenants. Its operational efficiency tool, Di-BOSS, has already helped achieve seven percent savings in energy consumption – worth about $1 million a year.

     

    Sustainability and Moore’s Law

                                                                

    What’s driving all this? Partly it’s regulation. Across the EU, buildings are responsible for 40 percent of energy consumption and 36 percent of CO2 emissions. As a result, the EU’s Energy Performance of Buildings Directive requires all new buildings to be nearly zero-energy by the end of 2020. All new public buildings must be nearly zero-energy by 2018.

     

    But it’s also driven by technology development. Think about progress in the last five years alone: touch-screens were once niche, now they’re normal. We carry the internet in our pockets – and access it almost wherever we want.

     

And over the past 10 years, the cost of sensors has gone down by 50 percent, the cost of bandwidth is down by a factor of 40, and computing costs are 60 times lower. Moore’s Law – now in its 50th year – is as healthy as ever. The IoT is cost effective!

     

    Better Building and Facilities Management

     

    So where does this leave facilities and building management more generally? These are interesting times for those of us involved in the IoT. As we have seen, companies in the field of energy management are starting to embrace the IoT. But there’s still more to be done to persuade companies that there is an opportunity to take control and define what is needed from technology to improve the way we occupy and use buildings.

     

    This is even more true of facilities or general building management. As Yoga Systems is demonstrating, it is not just about energy use. Its Yoga PRO1 (based on the Intel IoT Gateway and using the Intel Quark SoC) can be used in commercial and industrial buildings to connect and control almost anything – wired and wireless security detectors, cameras, thermostats, smart plugs, lights, entertainment systems, locks, and appliances.


    That’s a lot of use cases. Facilities management service provider Coor is putting some of them into practice by using the IoT to simplify the office manager’s job. But generally speaking, facilities managers have yet to fully embrace the possibilities of the IoT and truly smart buildings.


    Security and Manageability


    However, as these examples all demonstrate, smarter buildings mean ever more connected devices for IT teams to worry about, and even more data to secure. Tight integration of hardware- and software-based security will be essential. IT professionals will also need solutions that enable them to configure, monitor, and securely manage all those end-point devices, and then remotely maintain them and diagnose any problems. Both security and manageability are key tenets of the Intel IoT Gateway (alongside scalability and interoperability) to help make this happen.
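    To make the monitoring side of that concrete, here is a minimal sketch of heartbeat tracking for gateway-attached endpoints. Everything here (`DeviceRegistry`, `HEARTBEAT_TIMEOUT`, the device names) is hypothetical and for illustration only – it is not the Intel IoT Gateway API:

    ```python
    import time
    from dataclasses import dataclass, field
    from typing import Dict, List, Optional

    HEARTBEAT_TIMEOUT = 30.0  # seconds of silence before a device is flagged

    @dataclass
    class DeviceRegistry:
        """Track last-seen times for endpoint devices behind a gateway."""
        last_seen: Dict[str, float] = field(default_factory=dict)

        def heartbeat(self, device_id: str, now: Optional[float] = None) -> None:
            """Record that a device checked in (``now`` overridable for testing)."""
            self.last_seen[device_id] = time.time() if now is None else now

        def stale_devices(self, now: Optional[float] = None) -> List[str]:
            """Devices that missed their heartbeat window and need diagnosis."""
            t = time.time() if now is None else now
            return [d for d, seen in self.last_seen.items()
                    if t - seen > HEARTBEAT_TIMEOUT]

    reg = DeviceRegistry()
    reg.heartbeat("thermostat-1", now=0.0)
    reg.heartbeat("camera-lobby", now=25.0)
    print(reg.stale_devices(now=40.0))  # thermostat-1 is 40 s silent, camera only 15 s
    ```

    A real deployment would layer authenticated transport and remote remediation on top, but the core manageability question is exactly this: which of my hundreds of endpoints has gone quiet, and when?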


    So What Can We Expect in the Near Future? Here Are a Few Predictions:


    • More small companies will get involved. The IoT creates an open and diverse ecosystem with plenty of opportunities for niche players. In facilities management in particular, change is likely to come from the small specialist players.
    • The IoT will drive development in Operational Technology (OT) – and the long-projected convergence of IT and OT will finally happen (although this will depend in part on the level of security and manageability in place).
    • Vertical solutions and proprietary infrastructure will not scale to meet the anticipated rate of change, so we’ll see more and more common infrastructure driven by open standards that can benefit all.
    • Facilities management will eventually embrace the IoT. IT departments recognize that change is coming; it’s merely a question of time – and of who gets to market first.


    What do you think? Who is embracing truly smart buildings? What are the business cases that will drive adoption? Have your say in the comments below.


    Rob Sheppard is IoT Product and Solutions Manager at Intel EMEA


    Keep up with him on Twitter (@sheppardi) or check out his other posts on IT Peer Network.
