Here at the Intel Developer Zone, we’ve launched an exciting platform called the “Share Your App Project”, a great way for interested developers to share what they’re working on with the greater… Read more
Recent Blog Posts
I’ve noticed that people like to use a variety of sensors in their Intel Edison projects, but not all sensors operate at the same voltages. The most common voltages are 5.0V and 3.3V, and the Intel Edison Arduino breakout board … Read more >
I’ve looked at many aspects of Bring Your Own Device in healthcare throughout this series of blogs, from the costs of getting it wrong to the upsides and downsides, and the effects on network and server security when implementing BYOD.
I thought it would be useful to distil my thoughts around how healthcare organisations can maximize the benefits of BYOD into 5 best practice tips. This is by no means an exhaustive list but provides a starting point for the no doubt lengthy conversations that need to take place when assessing the suitability of BYOD for an organisation.
If you’ve already implemented BYOD in your own healthcare organisation then do register and leave a comment below with your own tips – I know this community will appreciate your expertise.
Develop a Bring Your Own Device policy
It sounds like an obvious first step doesn’t it? However, I’d like to stress the importance of getting the policy right from day one. Do your research with clinical staff, understand their technology and process needs, identify their workarounds and ask how you can make their job of patient care easier. Development of a detailed and robust BYOD policy may take much longer than anticipated, and don’t forget that acceptance and inclusion of frontline staff is key to its success. Alongside the nuts and bolts of security it’s useful to explain the benefits to healthcare workers to get their trust, confidence and buy-in from the start.
Mobile Device Management
It’s likely that you have the network/server security aspect covered off under existing corporate IT governance. A key safeguard in implementing BYOD is Mobile Device Management (MDM), which should help meet your organisation’s specific security requirements. Some of these requirements may include restrictions on storing/downloading data onto the device, password authentication protocols and anti-virus/encryption software. Healthcare workers must also be given advice on what happens in the event of loss or theft of the mobile device, or when they leave the organisation in respect of remote deletion of data and apps. I encourage you to read our Case Study on Madrid Community Health Department on Managing Mobile for a great insight into how one healthcare organisation is assessing BYOD.
Make it Inclusive
For a healthcare organisation to fully enjoy the benefits of a more mobile and flexible workforce through BYOD, it needs to ensure that as many workers as possible (actually, I’d say all) can use their personal devices. It can be complex, but some simple stipulations in the BYOD policy, such as requiring users to keep the latest operating system and app updates installed at all times, can help to mitigate some of the risk. Also, be conscious of the level of support an IT department can give, both in terms of resources (people) and knowledge of mobile operating systems. Ultimately, the most effective BYOD policies are device agnostic.
Plan for a Security Breach
The best BYOD policies plan for the worst, so that if the worst does happen it can be managed efficiently and effectively, with as little impact as possible on the organisation and patients. This requires creation of a Security Incident Response Plan. Such a plan may prioritise fixing the weak link in the security chain, identifying the type and volume of data stolen, and reporting the breach to the relevant government body. For example, the Information Commissioner’s Office (ICO) in the UK advises that ‘although there is no legal obligation on data controllers to report breaches of security, we believe that serious breaches should be reported to the ICO.’
From a personal perspective, we all know how quickly technology is changing and improving our lives. Healthcare is no different, and it’s likely that the tablet carried by a nurse today has more computing power than the desktop of just a couple of years ago. With this rapid change comes the need to reassess a BYOD policy regularly to ensure it keeps pace with advances in hardware and software. The risk landscape is also constantly evolving as new apps are installed, new social media services become available, and healthcare workers innovate new ways of collaborating. Importantly, though, I stress that the BYOD policy must also take into account the advances in the working needs and practices of healthcare workers. We’re seeing some fantastic results from improved mobility, security and the ability to store and analyse large amounts of data across the healthcare spectrum. We cannot afford for this progress to be hindered by out-of-date policies. The policy is the foundation of the security and privacy practice, and a good privacy and security practice enables faster adoption, use, and realisation of the benefits of new technologies.
I hope these best practice tips have given you food for thought. We want to keep this conversation about the benefits of a more mobile healthcare workforce going so do follow us on Twitter and share our blogs amongst your personal networks.
BYOD in EMEA series: Read Part Three
Join the conversation: Intel Health and Life Sciences Community
Get in touch: Follow us via @intelhealth
David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.
Find him on LinkedIn
Keep up with him on Twitter (@davidhoulding)
Check out his previous posts
Today, many healthcare organizations are experimenting with and implementing the art of virtual care. Innovation in technology is finally able to address the need to go beyond brick and mortar and drive “care anywhere” when it is needed. While technology is enabling providers to drive virtual care initiatives to increase quality of care, provide patients with more access, and improve patient empowerment, therein lies the question: how secure is the ecosystem to which more and more personal health information is being exposed?
First, let’s look at where we are currently. Healthcare is one of the most exciting industries today, thanks to digital technology and the industry and governments coming together to address some major pain points that existed for many decades. We are finally at a point where many of the “what if we could” ideas that clinicians and patients worldwide had can be realized. For example, many providers are driving initiatives around virtual care, including telehealth, and remote patient monitoring leveraging technology that can reside in patients’ homes.
In the future, payers may be able to use HIT and device information to drive big data and provide the optimal plans for patients in different demographics given the geographic region where they live, family history, and life habits. Last, but not least, patients are empowered with tools, devices, and information to proactively manage their own health the way that really makes sense, outside the hospital.
Wearables and Mobility
Simple forms of home monitoring have existed for years; however, today, there is a big disruption in the market due to new form factors of clinical wearables and connectivity solutions, which are easier to use and have a greater ability to transfer and provide access to patient data. Smartphones and tablets have become an integral part of people’s lives and can serve as a tool for telehealth, as well as a hub for clinical patient information. This makes the implementation of virtual care much easier, allowing patients to have options to cost-effective solutions and allowing them to manage their health more proactively.
At the same time, this proliferation of devices and data also increases the risk of data attack. Any point at which the data is collected, used, or stored can be at risk and needs to be secured. If the wearable devices collecting the data are outside the U.S. and the data is being uploaded to the cloud inside the U.S., then the use of these wearables can represent a trans-border data flow, which can be a significant concern, especially for countries with strong data protection laws such as those in the EU. We need to be more responsible about how the data is captured, transmitted, and protected. At Intel, we provide security solutions, such as fast encryption, that integrate well into the user experience while reducing cost. We are working with our customers to develop the most effective solutions for data privacy and security.
Overall, it is wonderful to see so many healthcare institutions driving virtual care. Care is definitely moving outside the traditional venues to new more natural settings closer to what patients need. However, this also exposes more patient health information to be outside the hospital walls and outside the walls of patients’ homes.
As such, at Intel, when we design a solution, we enable security in our core HW technology. And this provides differentiation in how the users experience security. To have a great experience, the end user should not be subjected to data breaches or other security incidents, and solutions need to be smarter about detecting user context and risks, and guiding the user to safer alternatives. Devices need to function reliably and be free of malware.
In addition, we are focused on driving consistent security performance across the compute continuum of care.
That brings us back to the original question: How secure is the ecosystem? Security will play a key role in ensuring a safe solution that providers, payers, and patients can all rely on. Security will also be key to enabling faster adoption of virtual care. Depending on the types of patient information collected, used, retained, disclosed, or shared, and on how it is stored and disposed of, security can be designed to optimally protect privacy. It is a complex area to address, but given the value of health data, I am hopeful that organizations will start to design their virtual care solutions and ecosystems with security as one of the key pillars.
What questions do you have?
Kay Eron is General Manager Health IT & Medical Devices at Intel.
Next-generation sequencing (NGS), also known as high-throughput sequencing, is the catch-all term used to describe a number of different modern sequencing technologies, including Illumina (Solexa), Roche 454, Ion Torrent (Proton/PGM), and SOLiD. These have allowed us to sequence DNA and RNA much faster and more cheaply than the previously used Sanger sequencing, and have revolutionized the study of genomics and molecular biology.
The cost of genomic sequencing has also come a long way. After the first human genome cost $3 billion to sequence, the cost fell to about $100 million per genome in 2001, and as of January 2014 it is about $1,000. Compared with Moore’s law, which observes that computing power doubles every two years, the cost of sequencing a genome is falling five to 10 times annually.
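As a quick sanity check on those figures, here is a back-of-the-envelope sketch (the five-to-ten-fold annual drop describes the steepest recent years; averaged over the whole 2001–2014 span the pace is gentler, but still well ahead of Moore’s law):

```python
# Back out the average annual decline in genome-sequencing cost from the
# figures in the post ($100M per genome in 2001, ~$1,000 in January 2014)
# and compare it with a Moore's-law pace (cost halving every two years).

cost_2001 = 100_000_000  # dollars per genome, 2001
cost_2014 = 1_000        # dollars per genome, January 2014
years = 2014 - 2001      # 13 years

seq_annual = (cost_2001 / cost_2014) ** (1 / years)  # x-fold drop per year
moore_annual = 2 ** (1 / 2)                          # halving every 2 years

print(f"Sequencing cost fell ~{seq_annual:.1f}x per year on average")
print(f"A Moore's-law pace is ~{moore_annual:.2f}x per year")
```

Even averaged over the full period, sequencing cost fell roughly 2.4x per year, well ahead of a Moore’s-law pace.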
The issue now is computing power to analyze this data. Newer sequencers are now producing four times the data in half the time. Intel® technologies like Xeon® and Xeon® Phi®, SSDs, 10/40 GbE networking solutions, Omni-Path fabric interconnect, Intel Enterprise Edition for Lustre (IEEL), along with partners like Cloudera and Amazon Web Services, are helping to cut down the time for secondary analysis from weeks to hours.
Genomic information is now catalogued and used for advancing precision medicine. For example, genomic information from TCGA (The Cancer Genome Atlas) has led to developments and FDA approval for certain cancer treatments. Currently, there are about 34 FDA-approved targeted therapies like Gleevec, which treats gastrointestinal stromal tumors by blocking tyrosine kinase enzymes. Though Gleevec was first approved by the FDA in 2001, it was approved to treat 10 more types of cancer in 2011.
Sequencers are now producing four times more data in 50 percent less time, at about 0.5 TB/device/day. This is a lot of data. Newer modalities like 4-D imaging are now producing 2 TB/device/day. The majority of the software used for informatics and analytics is open source, and the market is very fragmented.
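To make that scale concrete, here is a rough storage estimate using the per-device rates above (the fleet sizes in the example are hypothetical, chosen only to illustrate the arithmetic):

```python
# Rough annual raw-data volume using the per-device rates quoted above:
# ~0.5 TB/device/day for sequencers and ~2 TB/device/day for 4-D imaging.
# The device counts below are made-up examples, not real deployments.

SEQ_TB_PER_DAY = 0.5
IMG_TB_PER_DAY = 2.0

def annual_storage_tb(sequencers, imagers, days=365):
    """Raw data volume in TB produced per year, before compression."""
    return days * (sequencers * SEQ_TB_PER_DAY + imagers * IMG_TB_PER_DAY)

# e.g. a lab with 4 sequencers and 2 imaging devices
total_tb = annual_storage_tb(4, 2)
print(f"~{total_tb:,.0f} TB/year (~{total_tb / 1024:.1f} PB)")
```

Even this small hypothetical lab produces on the order of two petabytes of raw data a year, before any analysis products are added.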
Once the data is generated, the burden of storing, managing, sharing, ingesting, and moving it has its own set of challenges.
Innovation in algorithms and techniques is outpacing what IT can support, thus requiring flexibility and agility in infrastructures.
Collaboration across international boundaries is an absolute necessity and that introduces challenges with security and access rights.
Finally, as genomics makes its way into clinics, regulations like HIPAA will apply.
At the clinical level, you have barriers around the conservation and validity of the sample, the validity and repeatability of laboratory results, the novelty and interpretation of biomarkers, merging genomic data with clinical data, actionability, and eventually changing the healthcare delivery paradigm.
There are too few clinical specialists and key healthcare professionals, like pharmacists, who are trained in clinical genomics. New clinical pathways and guidelines will have to be created. Systems will need to be put in place to increase transparency and accountability of different stakeholders of genomic data usage. Equality and justice need to be ensured and protection against discrimination needs to be put in place (GINA).
Reimbursement methods need to consider flexible pricing for tailored therapeutics responses along with standardization and harmonization (CPT codes).
Looking ahead, we need to develop standardized genetic terminology (HL7, GA4GH, eMERGE) and make sure EHRs support the ability to browse sequenced data. Current EHRs will need standards around communicating, querying, storing, and compressing large volumes of genomic data while interfacing with identifiable patient information.
Intel is partnering with Intermountain Health to create a new set of Clinical Decision Support (CDS) applications by combining clinical, genomic, and family health history data. The goal is to promote widespread use of CDS that will help clinicians/counselors in assessing risk and assist genetic counselors in ordering genetic tests.
The solution will be agnostic to data collection tools, scale to different clinical domains and other healthcare institutions, be standards based where they exist, work across all EHRs, leverage state-of-the-art technologies, and be flexible to incorporate other data sources (e.g., imaging data, personal device data).
What questions do you have?
Ketan Paranjape is the general manager of Life Sciences at Intel Corporation.
Intel and Spanish Society of Family Medicine & Community Collaborate to Create Tablet with App Store Exclusively for Doctors
Improving care for patients is a common goal for our healthcare team and partners, so I’m really excited to be able to share the outcome of a collaborative project we’ve been working on with the Spanish Society of Family Medicine and Community (semFYC).
Together we have created a tablet featuring an app store exclusively for doctors. Meeting the needs of healthcare professionals with an easy-to-use mobile device combined with medical applications that have the endorsement of a scientifically-recognised body in semFYC is incredibly exciting for all involved and a step-change for the way GPs and physicians access the latest clinical information.
Josep Basora, President of semFYC, spoke to me about the tablet and app store created in partnership with Intel: “When I started to drive this project I wanted to facilitate the right information, at the appropriate place and by the authorised time. Mobility is one of the keys that defines the work of the current healthcare professional.”
“For a physician, the possibility to use applications that have the endorsement of a scientific society such as semFYC has real significance, as it has the full assurance that the tool used is supported by rigorous governance. This has certainly had a positive effect on both resource optimisation and improvement of patient service.”
semFYC brings together more than 17 Societies of Family Medicine and Community in Spain covering a total of 19,500 GPs with a focus on improving the knowledge and skills of its members. The app store, which exclusively features medical applications, automatically updates installed apps with the latest information around procedures and drugs, thus reducing the time GPs require to update their knowledge and consequently increasing the quality of patient care.
Take a look at the video above to find out more about the tablet and health app store created by Intel with semFYC.
- Managing Mobile: Read our Madrid Community Health Department Case Study
- Join the conversation: Intel Health and Life Sciences Community
- Get in touch: Follow us via @intelhealth
In addition to the big data area, I have done and continue to do work with cloud computing. Intel is a member of the Open Data Center Alliance (ODCA), an independent consortium of global IT leaders building and recommending a unified customer vision for data center requirements. Late in 2014, the ODCA sent out a survey to its members on cloud adoption. I answered that survey, and recently the ODCA published the results. One finding is that there seems to be a strong preference for internal cloud solutions among ODCA members. Why is that? Is public cloud adoption really slowing? What are the key issues with both public and private cloud adoption?
Overall, cloud adoption for ODCA members is on the increase for both public and private cloud, although private cloud is increasing at a much faster rate. The ODCA survey highlighted the following top concerns: data security, regulatory issues, service reliability, and vendor lock-in. For the public cloud, data security and regulatory issues are probably highest in priority. Intel IT has created a cloud brokering function for deciding whether to land an application externally in the public cloud or in the internal cloud. This function makes a decision based on factors like security requirements, control, and location.
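Intel IT’s actual brokering logic isn’t published here, but the idea can be sketched as a simple rules-based placement function. The factor names, thresholds, and rules below are illustrative assumptions, not Intel IT’s real policy:

```python
# A toy cloud-brokering rule: decide whether a workload lands in the
# internal (private) cloud or an external (public) cloud, based on the
# kinds of factors the survey highlights. All rules here are illustrative
# assumptions, not Intel IT's actual policy.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    data_sensitivity: str    # "public", "internal", or "restricted"
    regulated: bool          # subject to regulatory/data-residency rules
    needs_low_latency: bool  # tightly coupled to on-premises systems

def place(w: Workload) -> str:
    # Sensitive or regulated data stays inside the firewall.
    if w.data_sensitivity == "restricted" or w.regulated:
        return "internal cloud"
    # Latency-critical workloads stay close to on-premises systems.
    if w.needs_low_latency:
        return "internal cloud"
    return "public cloud"

print(place(Workload("marketing site", "public", False, False)))  # public cloud
print(place(Workload("HR records", "restricted", True, False)))   # internal cloud
```

A real broker would weigh many more factors (cost, SLA, location) and could return hybrid placements, but the shape of the decision is the same.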
A co-worker pointed out to me that the report seems to be IaaS-centric and that SaaS in the public cloud is likely to grow. I would agree, and the report also mentions this. Opportunistically adopting SaaS solutions is in Intel IT’s original cloud strategy, and today I see that public SaaS adoption continues to move ahead within Intel. The survey also points out key areas of interest to ODCA members, such as software-defined networking and hybrid cloud. SDN is also an area of focus for Intel IT, while moving to hybrid cloud has been a strategic goal.
A few other highlights that I didn’t cover:
- Since 2012, the number of respondents who have greater than 60% of their operations in an internal cloud has increased from 10% to 24%.
- Organizations project both their internal and public cloud usage to double by 2016.
- More than 80% of survey respondents are using, or planning to use, hybrid cloud solutions at some point in the future.
You can see the details by downloading the report.
A perfect storm of market conditions is forming that will likely propel consumer health near the top of many enterprise priority lists and justify its estimated 40 percent CAGR in 2015.
Intel has been the driving force behind the global technology revolution for more than 40 years, and we’ve seen the dramatic impact of technology on healthcare. Looking ahead, here are the five drivers that we see fueling growth in consumer health:
One of the most important conditions is payment reform. As the basis for reimbursement shifts away from fee-for-service and toward quality-based outcomes in the U.S., providers will extend the continuum of care far beyond their hospitals to more accurately quantify value after discharge.
One of the best ways to optimize care and demonstrate effectiveness is to implement a holistic approach for understanding a person’s status by deriving actionable data about her individually and continuously from multiple sources — including consumer devices.
Consumer empowerment is also going to play a large role. It began with the shift from a business model that was traditionally B2B to one that was more B2C as commercial health insurers positioned themselves to personally engage millions of newly eligible customers. Now, consumer health solutions enable all payer organizations — private, public, employer — to promote healthy behaviors and timely preventative care that has been shown to reduce the occurrence of costly acute emergencies. Ultimately, consumers will have the ability to be more active in managing their own care, with the expectation of access to more of their health information anytime.
A demographic shift is also fueling this growth. Every day, 10,000 baby boomers celebrate their 65th birthday in the U.S., and that trend will continue until at least 2019. Unfortunately, 90 percent of them, with help from their family caregivers in some cases, are managing at least one chronic medical condition (860 million people worldwide). As telehealth becomes more widely adopted (and reimbursed), remote doctor consultations will increasingly rely on consumer health technologies to improve chronic disease management and ease the stress on a limited pool of primary care physicians.
Many fast-growing emerging global markets, like China and India, are exhibiting strong appetites for consumer health solutions that can add value while supplementing recent government efforts to provide more efficient virtual care to their significant aging and rural populations. As more technology vendors from the region offer innovative products at very competitive price points, access and adoption will continue to climb at a healthy pace, contributing to notable growth of the consumer health market segment regionally and worldwide.
Of course, one of the biggest hurdles to overcome is alignment of priorities for all major stakeholders. You need a consumer-centered design, an evaluation of clinical workflow integration, and a way to measure the business impact of the goals.
What questions do you have? What other drivers do you see impacting consumer health?
Michael Jackson is General Manager, Consumer Health at Intel Corporation.
At Intel Developer Forum in Shenzhen, I spoke about the role of software as the connective fabric empowering the China market and the global industry alike, to innovate together, working hand in hand, across all segments of computing. This year, … Read more >
The post China as the cornerstone of Software Innovation and Collaboration appeared first on Intel Software and Services.
The spectrum of smart, connected devices is rapidly expanding from phones, tablets and e-readers to phablets, smart watches, and even robotic drones. Tens of billions of devices are expected by the end of the decade, and manufacturers must bring new … Read more >
The post Barriers removed – simplified and accelerated firmware development appeared first on Intel Software and Services.
This week the Intel Developer Forum (IDF) comes to Shenzhen! As its name suggests, IDF is an event focused on product development based on Intel technology. Technical sessions, press events, demonstration kiosks, and the UX Zone give developers plenty … Read more >
On March 22, 2015 I attended the Virtual Reality Los Angeles Spring Expo along with about 1500 other people. These are my thoughts and observations on the current state of VR.
Only one year ago a… Read more
We are creating data at an exponential rate. Yet, the data growth rate is not the biggest challenge for IT. The biggest challenge is that the need for useful information is growing faster than the data itself — providing a perfect storm for IT professionals and a business imperative to turn raw data into smart insights. To understand this challenge, I’d like to explore the history and evolution of big data complexity.
In 2001, Gartner analyst Doug Laney explained the initial challenges of big data in his 3Vs model. As time progressed, others have embraced the 3Vs model and incorporated two more areas of emphasis for big data analytics: veracity and value.
Ben Rossi does a nice job of articulating the impact of these five terms into one: smart data. “The purpose of smart data (veracity and value) is to filter out the noise and retain only the valuable data, which can be effectively used by the enterprise decision makers to solve business problems.”
Today, big data technology unfortunately isn’t meeting the needs of most businesses, for two reasons. We should not be focused on the types of data, but on the use case — business insights. And we must look far enough ahead in our use cases — solving tomorrow’s challenges, not just yesterday’s. Michael Wu, chief scientist at Lithium Technologies, states that we are on a “maturity journey” when it comes to analytics and data visualizations. Understanding this evolution will help us better architect IT solutions today to extract information and develop actionable insights for business decision makers.
There are three levels of analytics maturity that describe this progression:
- Descriptive analytics (what happened): A summary report of historical data, usually seen in a dashboard. Most enterprise analytics today fall into this category. An example includes a report of business data offering insights into an organization’s financials, sales, or inventory.
- Predictive analytics (what will happen): Makes predictions based on information that’s already available. An example includes financial services more accurately predicting future stock performance (noting that historical performance is not an indicator of future results).
- Prescriptive analytics (what you should do today): Analytics that not only predict the future but also deliver insights that allow you to decide today what path you should take to optimize your results. Google’s self-driving car is an example of prescriptive analytics, since the car needs to make decisions based on predictions of future outcomes. This is the use case that business leaders in a variety of industries are seeking and what’s driving the need for big data analytics.
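The three levels can be illustrated on a toy sales series. The numbers, the naive trend model, and the stocking rule below are placeholders; real predictive and prescriptive systems use far richer models:

```python
# Descriptive, predictive, and prescriptive analytics on a toy weekly
# sales series. The data, trend model, and stocking rule are made up
# purely to illustrate the three maturity levels.

sales = [100, 110, 125, 138, 150]  # units sold per week (made-up data)

# Descriptive: summarize what happened.
average = sum(sales) / len(sales)

# Predictive: naive trend — extend the average week-over-week change.
avg_change = (sales[-1] - sales[0]) / (len(sales) - 1)
forecast = sales[-1] + avg_change

# Prescriptive: turn the prediction into an action (a made-up rule:
# order 20% above forecast as safety stock).
order_qty = round(forecast * 1.2)

print(f"descriptive : average weekly sales = {average:.1f}")
print(f"predictive  : next-week forecast   = {forecast:.1f}")
print(f"prescriptive: order {order_qty} units")
```

The dashboard number answers “what happened,” the forecast answers “what will happen,” and only the order quantity tells the decision maker what to do — which is the level most businesses are really asking for.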
Rossi’s concept of smart data enables intelligent insights when evaluated with a focus on prescriptive analytics. Wu summarizes nicely: “Big data technology won’t help you make bigger decisions … yet smart data can certainly help you make smarter decisions.”
Intel and Big Data Innovation
Extracting insights fast enough to support real-time business processes and decisions is critical, and companies are gathering, storing, and analyzing data they were never able to before. Intel understands the challenges and complexities facing IT professionals regarding the need to deliver high performance, cost-efficient big data solutions on a scalable, secure architecture.
As a result, Intel has joined forces with many industry leaders to enable enterprise solutions. SAP HANA, SAP Data Services, and SAP Business Objects provide solutions for real-time big data analytics using the Intel Distribution for Apache Hadoop software. Through these platforms, businesses can combine the performance of analytics with the scalability of Apache Hadoop, enabling a real-time analytics platform made to store, integrate, and analyze all business data.
Late last year, our CEO discussed a new Intel collaboration and equity investment with Cloudera aimed at bringing an enterprise-ready platform to the mainstream for impactful big data solutions.
The Big Data Maturity Journey
In summary, raw data is only useful when it’s used to add context-specific relevance, insights, and value to business operations. Smart data empowers the decision-making process by using analytics to achieve results that make sense to humans, not just machines. By making information actionable, we can make profitable decisions and solve problems in the process — and those are smart insights.
This guest post was written by Alex Sprayberry, a senior at Arizona State University’s Barrett Honors College, where she is earning her undergraduate degree in Business Management, a minor in Nonprofit Administration, and a certificate in International Business. As … Read more >
In the not-so-distant future, as smart grid and transactive energy technologies go mainstream, intelligent consumer devices will be able to lower energy bills and relieve grid congestion by powering up when intermittently-operating renewable energy sources like wind and solar are … Read more >
The post Storage Wars: Energy Storage Systems Opening New Frontiers appeared first on Energy.
I like to think of security as a chain, and like any other chain it is only as strong as its weakest link. In the case of security in healthcare the chain consists of the network, the server and the device. Often the focus is overwhelmingly placed on the security of the device but I argue that data is as equally, if not more, at risk when it’s in transit as it is when at rest. So, with that in mind I wanted to take a look at some of the wider security considerations around Bring Your Own Device (BYOD).
Whenever I speak at events about security and healthcare my starting point is often that we must remember that the priority for healthcare professionals is patient care. Security cannot, and must not, compromise usability as we know this drives workarounds. Often these workarounds mean using personal devices in conjunction with what is more commonly known as ‘Bring Your Own Cloud’.
Bring Your Own Cloud
Bring Your Own Cloud (BYOC) primarily refers to the use of clouds that are not authorized by the healthcare organization to convey sensitive data. This often occurs through an individual using an app they downloaded onto a personal device. Many such apps have backend clouds as part of their overall solution. When sensitive data is entered into the app, it gets synced to the cloud. Furthermore, this transfer can occur over networks that are not managed by the healthcare organization, making it invisible to the organization. Of course, sensitive data in an unauthorized cloud can constitute a breach. In many cases these third-party clouds can be in different countries, making the transfer a trans-border data flow that can raise further non-compliance issues with data protection laws.
For example, imagine a nurse taking patient notes that need to be sent to a specialist such as a cardiologist. This should be done using a secure device, a secure wireless network, and a secure solution approved by the organization for such a task. However, lack of usability, cumbersome security around such solutions, or a slow or overly restrictive IT department can drive the use of a BYOC approach instead. In a BYOC approach, the nurse uses a personal app on a personal mobile device together with unencrypted email, a file transfer app, or social media to send the notes for analysis by a specialist.
This introduces risks to both the confidentiality of the sensitive healthcare data and the integrity of the patient record, which is often not updated with information traveling in these “side clouds”, rendering it incomplete, inaccurate, or out of date. In the best case this can result in suboptimal healthcare; in the worst case it could be a patient safety issue. The consequences to both patient and organisation of such risks can be severe. Here at Intel we have security solutions available to healthcare organisations which ensure that data is always secure, whether at rest or in transit, on the device or the organisation’s network. Our security solutions also use hardware-enhanced security to maximize performance and usability, mitigating the risk of cumbersome security driving the healthcare worker to resort to workarounds and BYOC.
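Hardware-level protections obviously can’t be reproduced in a few lines, but the transit-integrity risk described above can be illustrated with a standard-library sketch: attach a keyed integrity tag to a record before it leaves the device, so tampering along an unmanaged route is detectable on arrival. This is a toy illustration of the principle, not a description of Intel’s products:

```python
# Toy integrity check for a record in transit, using only the Python
# standard library. It illustrates detecting tampering between sender
# and receiver; it is not a substitute for real encryption-in-transit
# such as TLS, and the key handling here is deliberately simplified.

import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)  # shared secret; real systems use managed keys

def seal(record: bytes, key: bytes) -> tuple[bytes, bytes]:
    """Return the record plus an HMAC-SHA256 tag computed over it."""
    return record, hmac.new(key, record, hashlib.sha256).digest()

def verify(record: bytes, tag: bytes, key: bytes) -> bool:
    """True only if the record was not modified since it was sealed."""
    expected = hmac.new(key, record, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

record, tag = seal(b"patient: A-123, bp: 120/80", key)
print(verify(record, tag, key))                         # True
print(verify(b"patient: A-123, bp: 190/80", tag, key))  # False
```

A record altered in a “side cloud” fails verification, which is exactly the class of silent corruption an unmanaged transfer path cannot rule out.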
Apps for Healthcare
One area where I’m seeing a lot of rapid change is the development of apps for healthcare. I recently spoke to the Apps Alliance on the security challenges for developers of healthcare apps, whether they are aimed at healthcare professionals or consumers. These apps often make recording and analysing health information very easy, and in some cases they can enhance the relationship between patient and clinician.
I’d also like to take a brief look at what is often referred to as ‘Stealth IT’, also called ‘Shadow IT’. As with any form of workaround, the use of Stealth IT can be driven by an unresponsive or overly restrictive corporate IT department. One obvious example would be a small team of researchers requiring additional server space to store data but perceiving the organisational process for providing such resources as slow and expensive. The consequence is the purchase of comparatively cheap and accessible server space from any number of easy-to-find companies on the web. I remind you of my earlier comments about knowing exactly how secure that server is and in which country or continent it sits.
I like to think that a healthcare organisation looking to put a Bring Your Own Device policy in place appreciates the benefits and risks, but starts with understanding why a healthcare professional uses their own device, logs on to an unsecured network or purchases unauthorised server space. Only then will the organisation, healthcare worker and patient truly reap the benefits of BYOD.
- BYOD in EMEA series: Read Part Two
- Join the conversation: Intel Health and Life Sciences Community
- Get in touch: Follow us via @intelhealth
David Houlding, MSc, CISSP, CIPP is a Healthcare Privacy and Security lead at Intel and a frequent blog contributor.
Find him on LinkedIn
Keep up with him on Twitter (@davidhoulding)
Check out his previous posts
Excuse the pun, but here in the UK, energy prices are hot news.
According to the Department of Energy and Climate Change, since 2007 the prices of combined domestic gas and electricity bills have increased by 33 per cent in real terms. And, despite wholesale gas prices falling rapidly over the past 12 months, energy providers have been criticised for being slow to pass these savings on to customers. Of course it’s not that simple. Energy providers have to buy their wholesale gas supplies months in advance and have their own costs to cover, meaning it’s not always possible to immediately pass on these savings in full. Both parties are feeling the squeeze – not only in the UK, but all across Europe.
So what can be done to shake up the energy market and improve the situation for all?
“We have only reached the tip of the iceberg when it comes to realizing the positive impact of the Internet on our digital home life, in particular with regard to energy,” explains Serge Subiron, CEO and co-founder at IJENKO. “The Internet of Energy (IoE), as it is known, has the potential to connect the activities of utility providers and consumers in real time, enabling much more dynamic energy provision and consumption.”
IJENKO’s Home Management Solution allows energy suppliers to empower customers to become more efficient in their use of energy, enabling them to save on their energy bills and collectively influence the demand curve.
Imagine being able to use your smartphone to check how much energy your heating system is consuming, and what this equates to in monetary terms. Imagine then that you’re able to remotely turn down your thermostat a degree or two from wherever you are. This greater visibility and control, made possible by the IoT, allows for a much more proactive, dynamic and efficient use of energy, not to mention lower bills if that is your end goal.
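The “what does this equate to in monetary terms” step is simple arithmetic an energy app performs behind the scenes. A minimal sketch, using a made-up tariff rather than any real UK rate:

```python
# Toy sketch of the consumption-to-cost calculation a home energy app
# might perform. The 15 p/kWh tariff below is an invented example.
def energy_cost(power_kw: float, hours: float, pence_per_kwh: float) -> float:
    """Cost in pence of running a load of power_kw kilowatts for `hours` hours."""
    return power_kw * hours * pence_per_kwh

# A 3 kW heating element running for 2 hours at 15 p/kWh:
# 3 kW x 2 h = 6 kWh; 6 kWh x 15 p/kWh = 90 p
cost = energy_cost(3.0, 2.0, 15.0)
```

Surfacing this figure in real time on a phone, rather than once a quarter on a bill, is what gives the customer the visibility to act.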
And these solutions are not the stuff of science fiction. They are possible today thanks to IJENKO’s Home Management Solution and the Intel® IoT Gateway, which extracts data from legacy systems around the home and securely connects them to next-generation intelligent infrastructure.
As well as improving the customer experience, the IJENKO solution presents utility providers with the opportunity to develop innovative smart services based on the interaction of a number of technologies, rather than one overarching standard. By adding real value for the customer, these services offer long-term stickiness and help utility providers stay in control of the customer experience.
All in all, it’s a win-win scenario.
Online Sales Development Manager
Demand for efficiency, flexibility, and scalability continues to increase, and the data center must keep pace with the move to digital business strategies. Diane Bryant, Intel’s senior vice president and general manager of the Data Center Group, recently stated, “We are in the midst of a bold industry transformation as IT evolves from supporting the business to being the business. This transformation and the move to cloud computing calls into question many of the fundamental principles of data center architecture.”
Those “fundamental principles of data center architecture” are on a collision course with the direction in which virtualization has led us. Virtualization, in conjunction with automation and orchestration, is leading us to the Software Defined Infrastructure (SDI). The demand for SDI is driving new hardware developments that will open a whole new world of possibilities for running a state-of-the-art data center, eventually leaving legacy infrastructure behind. While we’re not quite there yet, as different stages need to mature, the process has the power to transform the data center.
SDI rebuilds the data center into a landing zone for new business capabilities. Instead of comprising multiple highly specialized components, it’s a cohesive and comprehensive system that meets all the demands placed on it by highly scalable, diverse workloads, from traditional workloads to cloud-aware applications.
This movement to cloud-aware applications will drive the need for SDI: by virtualizing and automating the hardware that powers software platforms, infrastructure becomes more powerful, cost-effective, and efficient. The migration away from manual upkeep of individual resources will also allow systems, storage, and network administrators to shift their focus to more important tasks instead of acting as “middleware” connecting these platforms.
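The core of that automation idea can be sketched in a few lines: desired infrastructure state is declared as data, and a control loop computes the actions needed to reconcile actual state with it, instead of an administrator doing so by hand. A minimal illustration (the resource names are invented, and real SDI stacks are vastly more involved):

```python
# Minimal sketch of declarative, software-defined resource management:
# compare desired state against actual state and emit reconciliation actions.
def reconcile(desired: dict, actual: dict) -> list:
    """Return the provision/decommission actions needed to reach desired state."""
    actions = []
    for resource, want in desired.items():
        have = actual.get(resource, 0)
        if have < want:
            actions.append(("provision", resource, want - have))
        elif have > want:
            actions.append(("decommission", resource, have - want))
    return actions

# Hypothetical example: scale web VMs up and database VMs down.
desired = {"web_vm": 4, "db_vm": 2}
actual = {"web_vm": 2, "db_vm": 3}
plan = reconcile(desired, actual)
# -> [("provision", "web_vm", 2), ("decommission", "db_vm", 1)]
```

Running such a loop continuously is what lets administrators manage intent rather than individual machines.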
Organizations will be able to scale their infrastructure in support of the new business services and products, and bring them to market much more quickly with the power of SDI.
Hardware Still Matters
As the data center moves toward an SDI-driven future, CIOs should be cautious about thinking that hardware no longer counts. Hardware will be critical: it must work in conjunction with software to ensure that the security and reliability of workloads are fully managed, and it must provide the telemetry and extensibility that allow specific capabilities to be optimized and controlled within the hardware.
The Future of the Data Center Lies with SDI
Data centers must be agile, flexible, and efficient in this era of transformative IT. SDI allows us to achieve greater efficiency and agility by allocating resources according to our organizational needs, applications requirements, and infrastructure capabilities.
As Bryant concluded, “Anyone in our industry trying to cling to the legacy world will be left behind. We see the move to cloud services and software defined infrastructure as a tremendous opportunity and we are seizing this opportunity.”
In previous blog posts, we’ve built the Mosquitto MQTT broker on Edison and created a sensor node for sensing motion, temperature, and light level. In this article, we will… Read more
Ceph, The Future of Storage™, is a massively scalable, open source, software-defined storage system that runs on commodity hardware. Ceph has been developed from the ground up to… Read more