Our team at Intel gained a great insight this summer: If you want to introduce some bold new ideas into your organization, bring in some high schoolers and arm them with design thinking.
In a… Read more
In conversations with enterprise and cloud data center operators, hyper-converged infrastructure comes up as a very hot topic. This new approach to infrastructure brings together server, storage, and networking components in an appliance designed for quicker installation and easier management. Some industry observers say hyper-converged systems are likely to play a significant role in meeting the scalability and deployment requirements of tomorrow’s data centers.
One view, for example, comes from IDC analyst Eric Sheppard: “As businesses embark on a transformation to become data-driven entities, they will demand a data infrastructure that supports extreme scalability and flexible acquisition patterns and offers unprecedented economies of scale. Hyperconverged systems hold the promise and the potential to assist buyers along this data-driven journey.”
Today, Intel is helping fuel this hyper-converged infrastructure trend with a line of new server products announced at this week’s VMworld 2015 U.S. conference in San Francisco. Intel® Server Products for Hyper-Converged Infrastructure are designed to be high quality, unbranded, semi-integrated, and configure-to-order server building blocks optimized for the hyper-converged infrastructure solutions that enterprise IT and cloud environments have requested.
These new offerings, which provide certified hardware for VMware EVO:RAIL* solutions, combine storage, networking, and compute in an all-in-one system to support homogenous enterprise IT environments in a manner that reduces labor costs. OEMs and channel partners can now provide hyper-converged infrastructure solutions featuring Intel’s most innovative technologies, along with world-class validation, compatibility, certification, warranty, and support.
For OEMs and channel partners, these products pave a path to the rapidly growing and potentially lucrative market for hyper-converged solutions. Just how big a market are we talking about? According to IDC, workload and geographic expansion will help push global hyper-converged systems revenues past the $800 million mark this year, up 116 percent over 2014. Intel® Server Products for Hyper-Converged Infrastructure also bring together key pieces of the infrastructure puzzle, including Intel’s most innovative technologies designed for hyper-converged infrastructure enterprise workloads.
Intel® Server Products for Hyper-Converged Infrastructure include a 2U 4-Node chassis supporting up to 24 hot-swap hard disk drives, dual-socket compute modules offering dense performance and support for the Intel® Xeon® processor E5-2600 v3 product family, and eight high-speed NVMe* solid-state drives acting as cache to deliver high performance for VMware Virtual SAN* (VSAN*).
With all key server, storage, and networking components bundled together, OEMs and channel partners have what they need to accelerate the delivery of hyper-converged solutions that are easily tuned to the requirements of customer environments. Better still, they can provide their customers with the confidence that comes with Intel hardware that is fully validated and optimized for VMware EVO:RAIL and integrated into enterprise-class VSAN-certified solutions.
For a closer look at these groundbreaking new server products, visit the Intel hyper-converged infrastructure site.
1 IDC MarketScape: Worldwide Hyperconverged Systems 2014 Vendor Assessment. December 2014. Doc # 253267.
2 IDC news release. “Workload and Geographic Expansion Will Help Push Hyperconverged Systems Revenues Past $800 Million in 2015, According to IDC” April 30, 2015.
The Intel® Math Kernel Library (Intel® MKL), the high performance math library for x86 and x86-64, is available for free for everyone (click here now to register and download). Purchasing is only… Read more
As I mentioned in the first part of this blog series, wearables have become more than a passing trend and are truly changing the way people and organizations think about managing health. I hear from many companies and customers who want to understand how the wearables market is impacting patient care as well as some of the changes taking place with providers, insurers, and employers. In the next several blogs, I’ll share some of their questions and my responses. Today’s question is: How are providers, payers, and employers putting wearables to work today?
For providers, one example is a pilot that the Mayo Clinic did with Fitbit to track patients recovering from cardiac surgery. They were able to predict which of those patients would be discharged sooner than others based on their activity in the hospital. You can easily see how this use case could be extended outside of the hospital, where you might be able to use wearables to more accurately predict which patients are at the highest risk for hospital readmission. Readmission, of course, is a key quality metric that hospitals are incentivized to reduce.
On the payer side, organizations are using wearable devices to influence the behavior of their members, encourage a healthier lifestyle, and delay the onset of conditions like obesity and diabetes. Cigna has a program for their own employees where they identify individuals who may be at risk for diabetes. They created a wearables program that encouraged increased activity in those individuals’ daily lives, and it’s making a difference.
Gartner finds that over 2,000 corporate wellness programs have integrated wearables to track employees’ physical activity and incentivize them, sometimes financially, to lead a healthier lifestyle. BP rolled out a program with 14,000 employees. Those who were able to achieve 1 million steps (equivalent to roughly 500 miles for an average-size person) over the course of a one-year period received a health plan premium reduction the following year.
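For readers who like to check the math, here is a quick back-of-the-envelope sketch of that steps-to-miles conversion. The 2,000-steps-per-mile value is an assumed rule of thumb for an average stride, not a measured figure, and it varies from person to person.

```python
# Rough sanity check of the "1 million steps is roughly 500 miles" figure.
# steps_per_mile is an assumption (a common rule of thumb for an average stride).
steps_goal = 1_000_000
steps_per_mile = 2_000

print(steps_goal / steps_per_mile, "miles")  # -> 500.0 miles
```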
Now, has anybody been able to aggregate enough wearable data for some serious predictive analytics, or is that down the road? I think that’s still down the road; there is significant work to do before it becomes mainstream. It will entail major data integration and big data analytics. We’re looking to pull in multi-structured data from multiple distributed entities and repositories: data from electronic health records, health insurance claims, in some cases socioeconomic data, and all the new sensor data from wearables. If we can pull the continuous stream of patient-generated data into a repository and overlay the more traditional payer and provider data, I suspect the accuracy of predictive models will improve significantly. We’ll be much better able to identify the high-risk patients who will benefit most from additional outreach by a provider organization.
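To make the idea concrete, here is a minimal, purely illustrative sketch of what overlaying wearable activity data on traditional clinical data might look like. The column names, toy values, and the choice of a simple logistic regression are all assumptions for illustration, not a real payer or provider schema or model.

```python
# Illustrative sketch only: hypothetical columns and toy data showing the idea
# of overlaying wearable-derived activity features on traditional clinical data
# to score readmission risk.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy "traditional" data: one row per patient (claims/EHR summary).
clinical = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "age": [71, 64, 58, 80, 69, 75],
    "prior_admissions": [2, 0, 1, 3, 1, 2],
    "readmitted_30d": [1, 0, 0, 1, 0, 1],   # outcome we want to predict
})

# Toy wearable stream: daily step counts per patient.
steps = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3, 4, 5, 6],
    "daily_steps": [800, 950, 6200, 5800, 4100, 600, 3900, 1200],
})

# Aggregate the continuous sensor stream into a per-patient feature,
# then overlay it on the clinical record.
activity = (steps.groupby("patient_id", as_index=False)["daily_steps"]
            .mean()
            .rename(columns={"daily_steps": "avg_daily_steps"}))
merged = clinical.merge(activity, on="patient_id")

X = merged[["age", "prior_admissions", "avg_daily_steps"]]
y = merged["readmitted_30d"]

model = LogisticRegression().fit(X, y)
merged["readmission_risk"] = model.predict_proba(X)[:, 1]
print(merged[["patient_id", "readmission_risk"]])
```

In practice the interesting part is the data integration step, not the model: getting the wearable stream, claims, and EHR data into one place, keyed to the same patient, is what makes the prediction possible.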
What questions do you have?
In my next blog, I’ll look at the primary challenges companies are facing in collecting, analyzing, and sharing data generated by wearables.
15 Tech Toys Turn Play into Learning is a really cool story for anyone with tech-interested kids in their lives. It gives a well-researched list of toys that help kids learn to code, etc. – with insights from Intel’s Mark … Read more >
My last two blogs centered on the advantage of Crosswalk over the embedded webview that is built into Android devices. If you haven’t already read those posts, here are the links: Build High-Performance HTML5 Cordova Apps with Crosswalk Chromium Command-Line Options for … Read more >
We’re experiencing ever-increasing volumes of data within health and life sciences. If we were to sequence the ~14M new cancer patients worldwide just once (a tumor/normal pair each), it would require more than 5.6 exabytes of storage, and the reality is we need to be able to sequence them multiple times during the course of treatment using a variety of omics and analytics approaches. The technical challenges of big data are many, from managing and storing such large volumes of data to analysing hugely complex datasets. However, we must meet these challenges head-on, as the rewards are very real.
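As a rough check of that estimate, here is a back-of-the-envelope calculation. The ~200 GB per whole genome figure is an assumption (on the order of a 30x whole-genome BAM file); actual sizes vary with coverage, compression, and which intermediate files are retained.

```python
# Back-of-the-envelope check of the ~5.6 exabyte storage estimate above.
patients = 14_000_000              # ~14M new cancer patients per year
genomes_per_patient = 2            # tumor/normal (T/N) pair
bytes_per_genome = 200 * 10**9     # ~200 GB per whole genome (assumed)

total_bytes = patients * genomes_per_patient * bytes_per_genome
print(f"{total_bytes / 10**18:.1f} exabytes")   # -> 5.6 exabytes
```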
I’m pleased to tell you about a significant project that Intel is supporting to help overcome these types of challenges and assist in the drive to comprehensively analyse cancer genomes. Our HPC solutions are already helping organisations around the world deliver better healthcare and helping individuals overcome diseases such as cancer. And our relationship with the Pan-Cancer Analysis of Whole Genomes (PCAWG) project is helping scientists to access and share analysis of more than 2,600 cancer cases, comprising over 5,200 matched tumor and normal whole genomes.
Scientific discovery can no longer operate in isolation – there is an imperative to collaborate internationally, working across petabytes of data and statistically significant patient cohorts. The PCAWG project is turning to the cloud to enhance access for all, which will bring significant advances in healthcare through collaborative research.
By working directly with industry experts to accelerate cancer research and treatment, Intel is at the forefront of the emerging field of precision medicine. Advanced biomarkers, predictive analytics and patient stratification, and therapeutic treatments tailored to an individual’s molecular profile: these hallmarks of precision medicine are undergoing rapid translation from research into clinical practice. Intel HPC and big data analytics technologies support high-throughput genomics research while delivering low-latency clinical results. Clinicians, together with patients, formulate individualized treatment plans informed by the latest scientific understanding.
For example, Intel HPC technology will accelerate the work of bioinformaticists and biologists at the German Cancer Research Centre (DKFZ) and the European Molecular Biology Laboratory (EMBL), allowing these organisations to share complex datasets more efficiently. Intel, Fujitsu, and SAP are helping to build the infrastructure and provide expertise to turn this complex challenge into reality.
The PCAWG project is in its second phase which began with the uploading of genomic data to seven academic computer centres, creating what is in essence a super-cloud of genomic information. Currently, this ‘academic community cloud’ is analysing data to identify genetic variants, including cancer-specific mutations. And I’m really excited to see where the next phase takes us as our technology will help over 700 ICGC scientists worldwide to remotely access this huge dataset, performing secondary analysis to gain insight into their own specific cancer research projects.
This is truly ground-breaking work, made possible by a combination of great scientists and the latest high-performance big data technologies, delivering life-changing results. At Intel, it gives us great satisfaction to know that we are playing a part in furthering knowledge both in the wider genomics field and, specifically, in better understanding cancer, which will lead to more effective treatments for everyone.
Juan Lopez Marcano was an Intel Scholar with the Platform Engineering group during the summer of 2015. He is currently earning a Master’s Degree in Electrical and Electronics Engineering at Virginia Polytechnic Institute and State University. If I could describe … Read more >
The post In Their Own Words: Intel Intern Juan Lopez Marcano Shares His Story appeared first on Jobs@Intel Blog.
It always causes me exquisite pain to see someone laboriously copying down a long number from their computer screen, just to type it in to another window or application. Doesn’t it for you?
After… Read more
Last week I was in San Francisco attending IDF, and I must confess that I’m still thinking about all the cool things I saw. I thought I would share the technical sessions I enjoyed the… Read more
Intel Processor Graphics: Architecture and Programming
Organizers: David Blyth, Hong Jiang, Geoff Lowney, Ken Lueh, CK Luk (all from Intel)
Duration: Full Day
Intel Processor… Read more
This year’s Intel Developer Forum was a sold-out event full of inspiring demos, innovative proof of concepts, and pioneering examples of cutting-edge technology. More Intel® Software Innovators were… Read more
As the Internet of Things (IoT) expands across the globe, Intel IoT Ignition Labs are opening at a rapid pace. Intel IoT and Intel Labs Europe opened an Intel IoT Ignition Lab in Haifa, Israel, in May to encourage … Read more >
The post Intel IoT Ignition Lab in Israel Focuses on End-to-End Solutions appeared first on IoT@Intel.
I don’t know about you, but while I love being able to browse my favourite store’s latest range from the comfort of my sofa, the hands-on experience that I get from a visit to the store itself is also still very appealing. What’s great about today’s retail landscape is that we have the opportunity to do both. The way we try and buy items from our favourite brands is no longer dictated by the opening hours or stock levels in our local high street store.
While this is good news for the consumer, the battle is on for high street retailers. To entice as many shoppers as possible through their doors, retailers need to offer a totally unique shopping experience – something that will convince you and me to put down our tablets and head to the high street.
Digitopia, a digital retail solution provider in Belgium, is working with Intel to build devices and apps that retailers can use to create more compelling shopping experiences. By trialling different solutions in various retail environments on Antwerp’s most popular shopping street, Digitopia is helping retailers define which technologies work best in each store scenario.
On Innovation Boulevard, as Digitopia has dubbed it, shoppers can turn their phone into a remote control to browse holidays on a large screen in the travel agent’s window. They can use an interactive fitting room in a fashion boutique to check for alternative colors and sizes of the outfits they are trying on. It’s even possible to order and pay for their cafe refreshments with a smartphone app rather than queuing up in the store. A large number of the solutions are powered by Intel technologies.
For shoppers, the retail experience is smoother and more personalized. Importantly, the technologies are also helping retailers to increase sales, offer new services and continue to interact with their customers when the shops are closed.
You can read more about the exciting retail experience that Digitopia has created in our new case study. My personal favorite is the possibility to book a holiday while walking between shops – what’s yours?
Giselle Sholler is the Chair of the Neuroblastoma and Medulloblastoma Translational Research Consortium (NMTRC) and the Director of the Hayworth Innovative Therapeutic Clinic at Helen DeVos Children’s Hospital. The NMTRC is a group of 15 pediatric hospitals across the U.S., plus the American University in Beirut, Lebanon, and Hospital La Timone in Marseilles, France. We sat down recently with Dr. Sholler to talk about the role of precision medicine in her work and how it impacts patients.
Sholler: As a pediatric oncologist, one of the most challenging times is when we’re faced with a child who is not responding to standard therapy and we want to figure out how we can treat this patient. How can we bring hope to that family? A project that we are working on in collaboration with TGen, Dell and Intel has brought that hope to these families.
Sholler: When a child has an incurable pediatric cancer, we take a needle biopsy and send it to TGen, where the DNA and RNA sequencing occurs. When ready, that information comes back to the Consortium. Through a significant amount of analysis of the genomic information, we’re able to look at what drugs might target specific mutations or pathways. On a virtual tumor board, we have 15 hospitals across the U.S. and now two international hospitals, in Lebanon and France, that come together and discuss the patient’s case with the bioinformatics team from TGen. Everyone is trying to understand that patient and, with the help of pharmacists, create an individualized treatment plan so the patient can have a therapy available to them that might result in a response for their tumor.
Sholler: Precision medicine is about using the genomic data from a patient’s tumor to identify not only which drugs will work, but also which ones may not work on that patient’s specific cancer. With precision medicine, we can identify the right treatment for a patient. We’re not saying chemotherapy is bad, but for many of our patients chemotherapy attacks every rapidly dividing cell and leaves our children with a lot of long-term side effects. My hope for the future is that as we target patients more specifically with the correct medications, we can alleviate some of the side effects that we’re seeing in our patients. Half our children with neuroblastoma have hearing loss and need hearing aids for the rest of their lives. They have heart conditions, kidney conditions, and liver conditions that we’d like to see if we can avoid in the future.
Sholler: The collaboration with Dell and Intel has been critical to making this entire project possible. The grant from Dell to fund this entire program over the last four years has been unparalleled in pediatric cancer. The computing power has also been vital to the success. Three years ago we were doing only RNA expression profiling, and it took two months; now we’re doing complete RNA sequencing and DNA exomes, and it takes less than two weeks to get the answers for our patients. Data transfer and networking used to entail shipping hard drives a few years ago. Now we can send a tumor sample from Lebanon to TGen, complete the sequencing in a few days, and have a report for the tumor board a few days after that. It’s just been amazing to see the speed and accuracy of profiling improve.
Sholler: Another very critical piece that Dell has helped provide is the physician portal. Physicians are able to work together across the country, and across the world, and have access to patient records. The database now has grown and grown. When we do see patients, we can also pull up previous patients with similar sequencing or similar profiles, or treated with similar drugs, and see what was used in treatment. And how did they do? What was the outcome? We’re learning more and more with every patient and it doesn’t matter where we live anymore. Everything’s virtual online. It’s just been incredible.
Foundations of Digital Games is a summit of innovators and influencers in gaming-related academia as well as the games industry itself.
In what was originally “the premier educational conference… Read more
Until now development of applications with audio and video required background knowledge of different SDKs and APIs, whether you wanted to create a workload for Windows* or Android* or for either… Read more
What would happen if you were hauling a trailer down the road and suddenly realized that the trailer you thought you were pulling was passing you on the highway? (Which I guess can actually happen, I… Read more
When developing a mobile business intelligence (BI) strategy, you can’t ignore the role that business processes may play. In many cases, the introduction of BI content into the portfolio of mobile BI assets provides opportunities not only to eliminate gaps in your business operations, but also to improve existing processes.
Often, the impact is seen in two main ways. First, the current business processes may require you to change your mobile BI approach. Second, the mobile BI solution may highlight gaps that may require a redesign of your business processes to improve your mobile BI assets and your business operations.
Existing business processes will have a direct impact on the design of your mobile BI solution. I’m often amazed to discover that the lack of consideration given to identifying business processes stems not from a lack of insight but from wrong assumptions that are made during the requirements and design phases.
It’s true that the business processes may not be impacted if the scope of your mobile BI engagement is limited to mobilizing an existing BI asset (like a report or dashboard) without making any changes to the original end-product, including all underlying logic. But in many cases, the opposite is true—the mobile BI end product may be the driver for change, including the update of the existing BI asset as a result of a mobile BI design.
Mobile solutions may require different assumptions in many aspects of their design, ranging from source data updates to report layout and logic. Advanced capabilities, such as a write-back option, will further complicate things because integration with systems outside the BI platform will require closer scrutiny and much closer alignment with business processes.
Moreover, constraints that surround source data will have a direct influence on the mobile BI design. For example, if you’re dependent on feeds from external data sources, you may need to consider an additional buffer to account for possible delays or errors in the data feed. Or perhaps you have a new application that was just built to collect manually entered data from field operations. If this new application was introduced as part of your mobile BI solution, the process that governs this data collection system will have a direct impact on your design because of the immediate availability of its data. This wouldn’t have been as important before mobile BI, when the application was simply an operational tool with a limited audience.
As part of designing your strategy or developing your mobile BI solution, you may discover either gaps or areas for improvement. Don’t worry. This is a known side effect, and it’s often considered a welcome gift because it gives you a chance to kill two birds with one stone: improve your business operations and increase the value of your mobile BI solution. However, it’s critical here to ensure that your team stays focused on the end goal of delivering on schedule (unless the gaps turn out to be major showstoppers).
Typical examples are found in the areas of data quality and business rules. The design of a mobile BI asset—especially if it’s new—may highlight new or known data-quality issues. The visibility factor may be different with mobile. Adoption or visibility by executives may often force additional scrutiny. Moreover, adoption rates (actual users divided by total users of the mobile solution) may be higher because of the availability and convenience of mobile. As a result, mobile users may be less tolerant of a lack of quality assurance (QA) steps.
Business rules offer another example due to the same visibility factor. A proposed change in a business rule or process, which previously failed to get attention due to lack of support, may now have more backers when it’s associated with a mobile BI solution. Strong executive sponsorship may influence the outcome.
It’s easy to make the wrong assumptions when it comes to business processes. It happens not just in mobile BI but in other technology projects. You cannot take existing processes for granted. What may have worked before may not work for mobile BI. Let your business processes complement your overall mobile BI strategy, and let your mobile BI engagement become a conduit for opportunities to improve your operational efficiencies.
Not only will these opportunities improve your business operations, but they will lead to increased adoption by increasing the trust your customers/users have in your mobile BI content.
What do you see as the biggest challenge when it comes to business processes in your mobile BI strategy?
Stay tuned for my next blog in the Mobile BI Strategy series
This story originally appeared on the SAP Analytics Blog.